Uncommon Wisdom


January 6, 2011  2:45 PM

Cisco’s Videoscape: A good video-service-layer move, with caveats



Posted by: Tom Nolle
Cisco, Comcast, content monetization, TV everywhere, Video, Videoscape

Cisco took what could be a giant step for itself at CES with its new video ecosystem. Cisco’s Videoscape combines in-home tools and software to centralize the mediation and management of video relationships, creating what’s probably the most architected video service layer available to network operators today. Since Cisco was already doing well in the early content monetization project trials, Videoscape could be a real winner for the company.

But despite the positives, Videoscape still has some issues in my view. Paramount among them is that Cisco is developing a content strategy in the absence of an overall service-layer strategy, or at best is creating the latter by simply assembling pieces instead of designing an architecture. Most of the elements in Videoscape Conductor (the back end) could easily be helpful in other missions, but it’s not clear how they’d be applied outside the video context. There’s also a strong push for video sharing and uploading, which generates traffic for operators and has essentially no potential for monetization. That makes the product a bit of a risk in itself, but it also suggests that Cisco may pursue its own aspirations (which are to generate so much consumer video traffic that operators are essentially forced to buy tons of Big Iron to carry it) more than support the operators’ business cases.

Overall though, Videoscape is a strong achievement for Cisco because it plays to the company’s strength—breadth in the video market. The net effect of deployment could be a kind of “TV Everywhere,” and with Comcast pushing that very thing already, the timing couldn’t be better.

December 28, 2010  3:02 PM

The FCC net neutrality order and murky pay-for-priority issues



Posted by: Tom Nolle
FCC, net neutrality, pay-for-priority, regulation, Title II

I had a chance to review the full text of the FCC’s Net Neutrality Order (number 10-201, if you’re into the FCC’s numbering system). While the material held no real surprises versus the commentary provided at the public meeting, I’m still concerned that the FCC hasn’t created a solid legal foundation for the order, which means it would be at risk in an appeal.

There are plenty of people on both sides of the issue who say they might appeal the matter, but in truth it takes real financial resources to fund an appeal if you’re earnest about getting results. The FCC, as I’ve noted in the past, is holding open the docket on reclassifying broadband under Title II, perhaps as a threat to the ISPs (who have pockets deep enough to appeal).

Another tactic the text reveals is the Commission’s decision that many of the key issues, like what constitutes traffic management, are to be addressed on a case-by-case basis rather than through meticulous detail in the order. That means anyone who wants to dispute something will either have to file for a declaratory ruling or wait to get zapped by the FCC and then make their case, either to the FCC or on appeal. Continued »


December 21, 2010  7:47 PM

FCC issues net neutrality order, but don’t expect much change



Posted by: Tom Nolle
Broadband, Congress, Court of Appeals, FCC, Mobile, net neutrality, regulation, Title II, usage-based pricing, wireless

The FCC’s neutrality vote went as expected, with commentary by various people involved in the process, including the Commissioners. I found a lot that I agreed with, but I disagreed with at least some of what virtually everyone said. It’s not a disappointing order, though I’m sure that most will characterize it that way.

The only thing that’s disappointing, in my view, is that it doesn’t address the issue of the FCC’s authority to act. The previous neutrality doctrine was lost when the Court of Appeals overturned it precisely for lack of such authority. I don’t think the current order establishes a stronger position, and there will be no lack of players ready to appeal it.

The FCC’s position is pretty much as expected, based on prior comments by the Commissioners. The FCC will require that wireline broadband services be subject to handling rules that are transparent and non-discriminatory with respect to sites, devices, and traffic types. For mobile services, the transparency rules are in force, but non-discrimination is weakened a bit to reflect the special nature of wireless: blocking traffic that competes with the ISP’s own services is prohibited, but other blocking for traffic management may be allowed if the need can be proved. The “specialized services” that flow in parallel with the Internet will be reviewed, but nothing will bar either payment for priority or tiered pricing per se.

The jurisdiction issue here is going to seem trivial, but it’s really central. Continued »


December 21, 2010  6:14 PM

How quickly can Alcatel-Lucent’s Open API program adapt to the market?



Posted by: Tom Nolle
Alcatel-Lucent, APIs, application development, application enablement, application program interface, Open API, open APIs, OTT, over-the-top

Alcatel-Lucent continues to showcase the developer side of its Application Enablement approach, including its Open API program, which federates application services across multiple developers. There is no question that the company has started to gain some traction in the market with this, but there is still a question in our mind regarding how quickly the program can adapt to market conditions.

The thing that has made over-the-top players successful in the service layer is that they’ve dodged inertia. Because they don’t worry about standards beyond blowing a casual kiss here and there, they can expose features via APIs very quickly. If you wait for industry consensus on APIs, you’re putting yourself at the tail end of a multi-year process, and you can hardly then claim to be running at market speed. I’d like to see Alcatel-Lucent open up more about how it will create features in Application Enablement and how quickly it can expose them via RESTful APIs.
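To make the agility point concrete, here is a minimal sketch of what “exposing a feature via a RESTful API quickly” looks like in practice. The feature name (an SMS resource), the route, and the handler are all hypothetical illustrations, not part of any actual Alcatel-Lucent or OTT API:

```python
import json

# Toy route table: exposing a hypothetical network feature as a
# RESTful resource takes a handler and a registration, not a
# multi-year standards process.
ROUTES = {}

def route(method, path):
    """Register a handler for an HTTP method/path pair."""
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("POST", "/sms")
def send_sms(body):
    # A real service layer would invoke the network feature here;
    # this sketch just acknowledges the request.
    return {"status": "queued", "to": body["to"]}

def dispatch(method, path, payload):
    """Look up the handler and return a JSON response string."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return json.dumps({"error": "not found"})
    return json.dumps(handler(payload))

print(dispatch("POST", "/sms", {"to": "+15551230000", "text": "hi"}))
```

The point of the sketch is the turnaround time: adding a second feature is one more decorated function, which is exactly the pace the OTT players run at.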


December 21, 2010  5:30 PM

What’s behind Oracle moving its enterprise apps to Amazon’s EC2?



Posted by: Tom Nolle
Amazon, Cloud computing, cloud computing services, EC2, enterprise applications, IaaS, Infrastructure as a Service, JD Edwards, Oracle, PeopleSoft, SaaS, Software as a Service

Fresh off of ramping up its server/networking connections for the year ahead, Oracle announced that it will be supporting at least some of its PeopleSoft and JD Edwards applications on Amazon’s EC2. This seems like a reversal for the company, which had initially seemed to reject the cloud computing model. I think it’s worth looking at the move to see if it reveals some hidden truths.

First, the revenue impact of the decision isn’t significant on its face, because Oracle will treat EC2 virtual machines just like customer virtual machines; the same license terms and rules apply. So what we’re seeing here is a model to accept infrastructure as a cloud service rather than to promote public cloud-based enterprise apps in Software as a Service form. But why even do that? I think that Oracle is realizing that the hybrid cloud is its path to enterprise prominence, and in particular a path leading past HP in a competitive sense.

Another factor is that Oracle’s database appliances are selling strongly. These appliances provide database management systems (DBMS) as a service, and thus could make it much more practical to have a cloud application access an on-premises database with reasonable performance. Thus you could argue that the hybrid cloud model is perfect to socialize Oracle’s appliances in a market that already seems to be catching on to their value.


December 21, 2010  3:23 PM

NSN’s three vendors standing vision — but which ones?



Posted by: Tom Nolle
Alcatel-Lucent, application enablement, Cisco, Ericsson, Huawei, Juniper, NSN, procurement zones, service layer, telecom equipment vendors

NSN’s CEO has recently suggested that the telecom sector will consolidate until only three major players remain: Ericsson, Huawei, and NSN. I think that vision of the future is a tad self-serving in its choice of players, but it’s very clear that somewhere around three is what we could expect to see if the industry can’t find better feature differentiation. If Alcatel-Lucent wants to make the cut here, it definitely needs to make Application Enablement work, and it’s frustrating to me how close it is to that, and yet how far. But that’s not atypical of service-layer strategies in the big-vendor space. Nobody has it right there.

One might ask where this consolidation would leave Cisco and Juniper, the other big players at the IP layer at least. That’s another area where NSN’s comments oversimplify. In operators’ trend toward procurement-zone buying, we’re seeing an attempt to create a market where a single giant vendor with a full product line can’t dominate everything.

The operators would like innovation, particularly with respect to the service layer, and they can’t get it by having everything collapse into a single giant commoditized space. But if the specialty guys like Cisco and Juniper can’t make a case in the service layer, then they can’t defend their narrower position in a commoditizing market. Thus, we could see the NSN “vision of three” being right, even if the three turn out to be different names than NSN expects.


December 20, 2010  3:07 PM

Considering the dual logic behind the FCC’s net neutrality order



Posted by: Tom Nolle
Broadband, Comcast, Congress, Court of Appeals, FCC, Federal Communications Commission, net neutrality, regulation, Regulations

The FCC will be releasing its net neutrality order tomorrow, though it’s not fully baked at this point and might still be pulled from the agenda. The order appears to be a curious mixture of a logical application of net neutrality and an illogical legal foundation. I’ve reviewed the Court of Appeals ruling in the Comcast case, and it’s hard for me to see how this dodges the legal issues the court has already raised. The only avenue forward would be for the FCC to now assert (and justify) the view that Section 706 of the Telecom Act gave the FCC “new” powers to encourage broadband, and not just a specific justification to exercise the powers it already had. To this point, though, the FCC has consistently taken the opposite position.

Republicans in Congress are rattling their sabers, threatening to pass a bill that offers no funding for the FCC’s neutrality rules. Apart from whether this is even legal, it’s pretty obvious that it could never pass in the divisive political world of Washington. Similarly, it’s clear that net neutrality legislation more aggressive than the FCC proposes — mandating no traffic management, no premium handling except for free, and full wireless regulation — wouldn’t pass, either.  So whether either extreme is the right answer doesn’t matter.

What does matter is having a set of rules that will pass legal muster, and that’s where I’m concerned. The FCC’s “third way” was the right answer. It was clearly legal and it would have offered exactly what the situation needed. Some of the Democratic commissioners want it, and frankly, I’d rather they held out. I disagree that this order is better than no order — if it’s not enforceable, then it truly is “no order.”


December 16, 2010  2:47 PM

Consumer-driven public networking hits financial watershed



Posted by: Tom Nolle
Comcast, FCC, ISPs, net neutrality, Online advertising, peering, Quality of Service, regulation

The holiday season is always dominated by consumerism, but it should be pretty clear to everyone that networking itself is increasingly dominated by the consumer. We’re headed very quickly for a time when the consumer essentially funds all public networking and creates its design paradigms and economic trade-offs. Along the way, though, we’re facing some potentially significant hurdles and shifts in course.

The Internet has already made public IP infrastructure the basis for public networks, though of course that infrastructure tends to be less homogeneous than many assume. Ethernet is a smarter edge strategy, for example, because most consumer services will haul traffic to either a metro off-ramp or a metro cache/server farm, and you don’t need a lot of connectivity to get to one place. Still, the Internet has won IP a victory at the service layer, where the IP address space is the only framework we could expect to see in the network of the future.

This month, we’re heading to a kind of financial watershed with public network services. There’s been a surge of growth in online services funded by advertising, but advertising represents only a fraction of the money needed to fund a public network. Recent legal disputes (on Interclick’s history-tracking, for example) show that advertising-related sites are pushing the limits of public and judicial tolerance in a quest to tie up those limited dollars.

Ultimately people have to pay for stuff to fund a $3-trillion worldwide industry like networking. The FCC is likely to set the boundaries of where pay works and where it doesn’t in its December 21st order on net neutrality. But whatever they do, there’s no turning away from the fact that advertising isn’t ever going to fund the public network, so something else has to. Continued »


December 16, 2010  1:01 PM

Potential NSN minority stake sale has good news/bad news



Posted by: Tom Nolle
Alcatel-Lucent, Ericsson, managed services, NSN, outsourcing, service layer

There are renewed stories that NSN is looking to sell about a third of the company to a private equity consortium. The stories aren’t indicating at this point how the share would be divided among the buyers or where it would come from in terms of Nokia and Siemens. It’s a classic good news/bad news item no matter how it divides, though.

The good news side is that nobody buys something that’s worthless. NSN does, in fact, have strong assets, and certainly those assets could be leveraged to produce a good return on any private equity investment. The bad news is that if you’ve got good assets that could produce a good ROI, why aren’t they producing one for you? If they’re not, you’re messing up. Clearly neither Nokia nor Siemens would be looking to sell off a stellar activity.

But there are reports that the managed-services space that Alcatel-Lucent, Ericsson, and NSN all want to share is expanding. Ericsson won a 3 Italia deal to revamp its IT processes — not exactly a giant deal, and in any case not a broad endorsement of an outsource-based service-layer strategy. Operators tell us they’re happy to outsource stuff that’s a cost center, has no direct competitive impact, and depends on skills they don’t have and don’t want to develop. They’re less sanguine about outsourcing what makes them profitable. I think the question here is whether the private equity guys are drinking the PR Kool-Aid on managed services or whether they see that changes need to be made in NSN’s service-layer positioning and are confident they can make them.

We said in our 2009 vendors analysis that NSN needed to sing prettier at the strategy level to create strategic service-layer traction with buyers. We also said that such traction would be increasingly critical to success and to sustaining margins at lower layers in the network. The problem is that our surveys show NSN has lost credibility since that analysis. While its worst dip was from the fall of 2009 to the spring of 2010, it has gained little ground between spring and fall, and in some key areas (like the radio network in mobile infrastructure) it actually lost slightly. There’s absolutely nothing wrong with NSN’s product line or technical skills — the problem is purely marketing and positioning.

That’s the centerpiece of the dilemma confronting any organization that buys a piece of NSN. You can believe that the managed-services tide will lift all boats, NSN’s included, and that you see this great truth even though neither of the current partners does. Or you can believe that NSN’s service-layer problem can be solved by singing its song more effectively. Given that, I’d be looking at creating an NSN choir if I were senior management there. Otherwise a deal could go sour simply because current NSN trends continue in the face of a newly aggressive position by one of its competitors.


December 8, 2010  12:28 AM

Google’s Chrome OS: Framework for cloud client device and platform



Posted by: Tom Nolle
Chrome OS, Cloud computing, Comcast, Google, OTT, over-the-top, peering, usage-based pricing

Google let the industry have its first look at Chrome OS, which it sees as the framework for a “cloud client” device. It’s also a platform that combines Google’s desktop position with its smartphone position (Android) and its service-side position (Google’s cloud services) to create a new and complete (yes, and completely Google) solution for future computing and communication. I don’t think Google has grandiose visions of owning the computing/networking world, but I do think it’s thinking through the process of ecosystemic computing more seriously and effectively than most. The commercial launch of Chrome OS may be delayed, but some of the impacts may be visible even before then.

If you look at Chrome OS and Chrome (the browser on which the OS is based), what you see is a reflection of the fundamental truth that cloud computing is not ceding computing to the cloud but rebalancing computing activity between the cloud and the client. In effective cloud computing, the process-intensive tasks of information editing and display should be pushed outward to the client, to reduce the impact of those tasks on central resources and to ensure that the network connection to the client doesn’t become congested with a lot of unnecessary display-oriented babble. Continued »
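The rebalancing argument can be illustrated with a toy sketch (all data, markup, and function names here are invented for illustration, not anything in Chrome OS itself): the “cloud” ships compact structured data, and the “client” does the display-oriented work locally, so the network carries the facts rather than the formatting.

```python
import json

def cloud_response(stories):
    """Server side: return only the data, as compact JSON."""
    return json.dumps(stories, separators=(",", ":"))

def client_render(payload):
    """Client side: expand the data into display markup locally."""
    stories = json.loads(payload)
    items = "".join(
        f"<li class='story'><a href='{s['url']}'>{s['title']}</a></li>"
        for s in stories
    )
    return f"<ul id='headlines'>{items}</ul>"

stories = [{"title": "Chrome OS debuts", "url": "/chrome-os"}]
payload = cloud_response(stories)
rendered = client_render(payload)

# The wire payload stays smaller than the markup the client builds,
# which is the point of pushing display work outward.
print(len(payload), len(rendered))
```

The same division of labor is what a Chrome-style client does at much larger scale: the server’s response is data, and the babble of presentation never crosses the network.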
