Uncommon Wisdom


October 7, 2010  12:29 PM

Alcatel-Lucent sheds light on Open API, LTE & IMS strategy

Tom Nolle

Alcatel-Lucent held an invitation-only event for industry analysts this week, and since the group was small relative to its normal events, it was a good opportunity for discussion and engagement. The goal was to give us an idea of where Alcatel-Lucent was going in the near term and in a more strategic sense. I think it accomplished that goal overall.

It’s clear that Alcatel-Lucent is still having a bit of an identity crisis—several, in fact. It’s still apologizing for the aftermath of the merger, which looks like it’s finally finished, and not just in name. It is also having a bit of a confidence crisis, even though its articulation is strong and its strategic credibility numbers lead the network equipment vendor space by a pretty decent margin. Alcatel-Lucent has been battered a bit by Wall Street and by the internecine struggles of the past; it kind of needs a hug.

In a tangible sense, the big news out of the event was that Alcatel-Lucent has a much broader capability set in Open API than was first apparent. Yes, the program is linked to applications, developers and the smartphone universe, but it’s really more than that. Open API is a federation engine that absorbs multiple APIs, orchestrates unions among them and exposes the results. It could be used to federate content delivery networks (CDNs), something Alcatel-Lucent says it’s working on, though it didn’t say whether Open API is part of that work. It could also federate cloud computing and even multiprovider service provisioning of the type the TM Forum/IPsphere Forum (IPSF) has been involved in. How far the company will take this capability probably depends on operator traction, but watch the space for some action later this year as a possible signal.
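
Alcatel-Lucent didn’t share the internals of that federation engine, so the sketch below is purely illustrative: a minimal Python example of the general pattern, in which several provider APIs are registered behind a single exposed interface that merges their results. The provider classes, method names and CDN-capacity example are assumptions of ours, not anything Alcatel-Lucent has published.

    # Illustrative API federation pattern: several provider APIs are registered
    # behind one exposed interface that merges their results. Hypothetical names.

    class CdnProvider:
        """Stand-in for one provider's CDN API."""
        def __init__(self, name, capacity_by_region):
            self.name = name
            self.capacity_by_region = capacity_by_region

        def available_capacity(self, region):
            return self.capacity_by_region.get(region, 0)

    class FederationEngine:
        """Absorbs multiple provider APIs and exposes one combined result."""
        def __init__(self):
            self.providers = []

        def register(self, provider):
            self.providers.append(provider)

        def federated_capacity(self, region):
            # The "union" here is a simple aggregation across providers.
            return {p.name: p.available_capacity(region) for p in self.providers}

    engine = FederationEngine()
    engine.register(CdnProvider("cdn-a", {"eu-west": 40, "us-east": 10}))
    engine.register(CdnProvider("cdn-b", {"eu-west": 25}))
    print(engine.federated_capacity("eu-west"))  # {'cdn-a': 40, 'cdn-b': 25}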

October 4, 2010  11:59 PM

What does Verizon’s data center spending tell us?

Tom Nolle

Verizon announced it would be making a major investment in data centers for cloud computing, among other things. Specifically, it will be adding space for more than 5,000 servers and expanding to about 200 data centers worldwide, including sites in Australia and the UK.

While “the cloud” gets a lot of play in this deal, it’s really more about enhanced services and a shift in Verizon’s profit model from selling bits to selling experiences. A couple decades ago, new services meant new network equipment. These days, it means servers and software, and at the moment, it’s being driven by a rush to create a meaningful strategy for content delivery and monetization. While that’s the hottest issue in the market, it’s an example of the broader issue of generating revenue in an age where transport and connection matter a lot less.

The next generation of carrier “services” will be customer “experiences.” The foundation of experiences is software, running on a connected set of data centers — a cloud.  But media hype about the value of the cloud has been several miles wide and a lot less than an inch deep. Most operators would echo the Pacnet exec quoted in a recent article — he’s glad he’s not the only one working through the fog of the cloud.

For operators, in particular, the imprecision of “cloud” is a challenge because they want an architecture on which to build their infrastructure plans. They had that for networks, and now they need it for clouds. All those data centers need to be filled with gear, and how that will work and how it will make money are now the critical questions for operators — and vendors.


October 1, 2010  12:04 PM

Net neutrality dies in House; There’s always Title II

Tom Nolle

Rep. Henry Waxman (D-Calif.), the House sponsor of an attempt to pass legislation directing the FCC’s decisions on net neutrality, has withdrawn the bill for lack of support.

This ends, at least for the moment, another Congressional attempt to create telecom policy through explicit legislation. It’s not the first time such a bill has been dropped; since the Telecom Act of 1996, virtually every attempt to change policy has died without coming to a vote.

I still believe that a Title II telecom services classification (regulation under the Communications Act of 1934) with reasonable wholesale rates (the Canadian model, for example) is workable, and in fact might be more logical given the range of services that are likely to migrate to IP without being part of the Internet.

IPTV and carrier voice, as well as enterprise services, fit that model. The FCC has to weave a complicated ruling to protect both the Internet and the business model for IP-converged services.  Title II is the best way to do that.


September 28, 2010  3:54 PM

Does IBM’s Blade Network acquisition go far enough?

Tom Nolle

IBM is acquiring Blade Network Technologies, an Ethernet switch vendor that specializes in blade-server switching for the data center and supports both IBM’s and HP’s blade centers. The move seems targeted at closing a gap between IBM’s data center strategy and Cisco’s UCS, which includes blade switching. The deal gets IBM back into at least the enterprise part of networking, after having sold its networking business to (of all people) Cisco.

But so far it’s a narrow return. IBM didn’t announce any deal to buy a broader-based provider like Brocade, Juniper, F5, or Extreme, all of which make more general and larger Ethernet products. It didn’t make any move toward carrier-grade Ethernet switching either, although it might later.

IBM is an IT kingpin, perhaps the greatest IT player the world has ever known. It knows how important data center networks are to the future of IT, and how important cloud networking is as well. It would clearly prefer to maintain a series of partnerships with benign network industry players and focus on its own expertise, but the problem is that competitors like HP, Cisco, and perhaps Oracle are threatening to widen the IT space to envelop at least some of networking.

IBM doesn’t want to be caught without an asset that suddenly becomes a competitive focus, so it moves. Once movement starts, it becomes hard to say how far it will go, or how much of networking might become a target of IT acquisition.


September 23, 2010  5:10 PM

Verizon’s 4G LTE tiered pricing: The only answer for ROI?

Tom Nolle

Verizon’s announcement that it will shortly be delivering LTE services with tiered pricing is an important, if not unexpected, indicator of a harsh reality in networking. We have a vibrant industry filled with venture capital money tirelessly working to exploit Internet bandwidth and connectivity. At the same time, we largely ignore the question of how we’ll sustain the infrastructure that provides all of that good stuff.

The problem isn’t so much one of technology (though innovation could certainly help) as one of a sustainable business model. When all-you-can-eat pricing was introduced for the Internet, it worked because nobody was eating much. There was little content, no video, and no social networking explosion to fuel usage, so even giving a customer a faster connection at little incremental cost wasn’t likely to create much of a financial burden. That’s no longer true.

Network operators like Verizon have to earn a decent return on investment, and operators already earn far lower returns than public over-the-top players like Google. Future investment is likely to come at an even lower ROI unless pricing plans change. That’s part of the reason Verizon is focusing on 4G LTE services, but another reason is simply that it can.

4G hasn’t reached most users yet, so the decision to offer it with pricing tiers for data usage doesn’t create the kind of push-back that a similar move in 3G data would. That doesn’t mean there won’t be such a move; we think we’re facing usage pricing tiers even in wireline broadband.
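
The mechanics of a usage tier are simple enough to sketch. The numbers below are invented, not Verizon’s announced LTE prices; the point is only that the bill scales with consumption instead of staying flat.

    # Hypothetical two-part tariff: a flat base fee with a usage allowance,
    # then a per-gigabyte overage charge. All figures are invented.
    def monthly_bill(usage_gb, base_fee=30.0, included_gb=5.0, overage_per_gb=10.0):
        overage_gb = max(0.0, usage_gb - included_gb)
        return base_fee + overage_gb * overage_per_gb

    for usage in (2, 5, 8):
        print(f"{usage} GB -> ${monthly_bill(usage):.2f}")
    # 2 GB -> $30.00, 5 GB -> $30.00, 8 GB -> $60.00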

Things like net neutrality have also contributed to the problem. The FCC continues to debate whether IP-non-Internet services are violating a still-undefined net neutrality position. If they are, then there’s no escape from the Internet pricing model for incremental services like IPTV. That means that the bandwidth itself must be made to pay a fair return, and that will limit the growth of new applications by making their increased usage a pay-to-play proposition.

We could have avoided all of this with rational, ecosystemic thinking, but I think it’s now too late for that. Raising the dreaded specter of usage-based pricing is the big risk, and one that’s already being taken. Having taken that step, operators have little to gain from later removing the pricing tiers and supplementing their revenues elsewhere, even if a plan to do so is developed.

We see operators worldwide focusing on content delivery, and in particular on content monetization. Much of this activity looks not at the network but at hosted assets related perhaps to identity, location, metadata, and other customer and content attributes. They’ll still need the network to deliver, but the best hope that the delivery network will remain innovative and feature-rich may now be pricing tiers, and that’s a sad outcome for us all.


September 22, 2010  1:20 PM

Will tech M&A activity fill need for broad-based product strategy?

Tom Nolle

Companies are now actively pursuing mergers and acquisitions to consolidate and improve economies of scale. Even when mergers and acquisitions don’t directly consolidate the companies (IBM’s deal with Netezza is an example), they are likely to result in some job loss overall. And even where companies aren’t involved in M&A in any way, they’re pressured to produce better profits in a world of nearly static sales. That means cutting costs, and probably jobs, or at best not hiring much. The phenomenon is market-wide and includes tech.

In tech, M&A is still fueled mostly by a combination of competitive pressure to offer a broad-based product strategy and a desire to “mine” revenue from customer relationships. The key point here is that revenue mining works only when there’s revenue to mine. Where markets are commoditizing, you can improve efficiency and cut costs, but it’s a race to the bottom that U.S. companies will lose to offshore competitors, particularly in China.

In IT, where most of the M&A has been happening, software is a key differentiator, and technology is directly linked through software to the productivity value proposition. In networking, the underlying problem is price commoditization at the network layer and below, and that problem isn’t going to be solved. Thus the major players in the space will ultimately fail, specialize, or seek differentiation elsewhere. Alcatel-Lucent, Ericsson, and NSN are all vulnerable here because they have broad product portfolios that can be attacked piecemeal by Huawei and ZTE.

Will they go the Nortel route? It’s possible in our view, but it’s more likely that there will be a musical-chairs shifting of business elements among these players as each one strives to find a niche it can defend. In effect, these companies have to get smaller in product footprint and revenue to survive.

The challenges of the major network players may create problems for smaller vendors, many of whom are relying on OEM relationships with giants in networking or IT, or are looking to these firms as buyers down the road. There are surely going to be network deals done (Juniper is said to be preparing to announce it is buying Trapeze from Belden, an example of musical-chairs product elements), but we think these will be fewer, smaller and harder to monetize effectively than those of the IT space. Network vendors are losing strategic engagement, as our spring study showed, and without that it’s hard to leverage even good assets.


September 3, 2010  6:48 PM

FCC revises net neutrality NPRM with eye toward Google/Verizon plan

Tom Nolle

The Federal Communications Commission (FCC) has released a revision/expansion of its net neutrality notice of proposed rulemaking (NPRM), targeting the two issues specifically raised by the Google/Verizon proposal — broadband wireless and “managed services.” The latter is what we’ve always believed to be the key issue.

In the Google/Verizon accord, that term was used for what we’ve called “IP-non-Internet” services, meaning services that are delivered over the same access conduit as the Internet but which are structurally separate. Some voice over IP (VoIP) and Internet Protocol television (IPTV) services already fit this model. An excessive neutrality position would encourage operators to move more services to this category; as such, it could represent both a path to continue to encourage broadband investment and a way to marginalize the Internet. We think that it could well be both, depending on just how far things go in what’s clearly a face-off developing between operators and regulators.

We already have many communications services that are delivered on IP and not subject to neutrality provisions — business services are the best example. We also have non-IP services that are delivered over the same conduit as Internet access, including phone services using time-division multiplexing (TDM). While our definition excludes these services from consideration, it’s less clear whether the FCC debate might branch out there, if for no other reason than that extending neutrality provisions to non-Internet IP would raise the question, “Why then not use another protocol, like Ethernet?” That would further muddy the waters.

In our view, all of this debate could be ended decisively by simply declaring that broadband access was a telecommunications (Title II) service fully subject to the Telecom Act requirements for wholesaling, and then setting wholesale rates at retail prices, less avoidable sales/marketing costs.
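
The “retail minus” arithmetic behind that proposal is easy to illustrate; the dollar figures below are invented for illustration, not actual tariff numbers.

    # Retail-minus wholesale pricing: the wholesale rate equals the retail price
    # less the sales/marketing costs the seller avoids on a wholesale sale.
    # All figures are invented for illustration.
    def wholesale_rate(retail_price, avoidable_sales_marketing_cost):
        return retail_price - avoidable_sales_marketing_cost

    print(wholesale_rate(40.00, 7.50))  # 32.5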


September 3, 2010  6:26 PM

Rumored Skype buy could be monster service layer move for Cisco

Tom Nolle

If it’s true that Cisco has offered to buy Skype, it would be a truly revolutionary development. We can’t get any specifics on this particular story, other than that Cisco is indeed looking to make some big M&A moves. The linkage with Skype could be highly favorable to Cisco because it could be used to promote both online video calling and intercalling with telepresence systems (which would be good for Cisco both in pushing telepresence and in generating network traffic). How Skype matches up with Google Voice is the competitive dynamic most likely to reshape carrier voice.

Could Cisco then offer carriers a kind of white-box Skype service? It’s an interesting speculation. A move like this would also create a peer-to-peer (P2P) based service-layer framework for Cisco, a company that’s already building strong service tools upward from its UCS strategy. The downside, of course, would be that Cisco could be seen as competing with the telcos to which it tries to sell gear. We don’t think that’s a major problem because most telcos already think that traditional voice services are becoming an albatross.

Recall that some operators have already accepted Skype apps on smartphones, and we’ve been hearing for almost 18 months that operators were exploring P2P alternatives to next-gen IP voice, which is still seen as too infrastructure-heavy for plummeting voice prices and margins.


September 1, 2010  10:54 PM

Alcatel-Lucent OpenPlug move reinforces app-development strategy

Tom Nolle

Alcatel-Lucent announced its second developer-oriented acquisition in its Applications Enablement strategy expansion: OpenPlug. The company provides a virtual appliance client framework that can be programmed with a Flex-based development system and then cross-compiled to run on virtually any device, from smartphones and “brightphones” with basic web capability to set-top boxes and even auto electronics.

More significantly, Alcatel-Lucent plans to integrate the OpenPlug tools with its Open API, which would make operator-exposed service features directly available to developers and let them build applications that are stickier for users and make more direct use of network capabilities. That would differentiate them from simple OTT applications, which is what most smartphone apps really are.
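
Neither company has published what a combined OpenPlug/Open API call would look like, but the developer-side idea is easy to sketch: an application asking the operator’s exposed API for a network-derived capability (here, subscriber location) instead of recreating it over the top. The gateway URL, endpoint and parameters below are hypothetical.

    # Hypothetical sketch of an app using an operator-exposed network feature.
    # The gateway, endpoint, and parameters are invented for illustration.
    import json
    from urllib import parse, request

    OPERATOR_GATEWAY = "https://api.example-operator.com"  # hypothetical

    def network_location(subscriber_id, api_key):
        """Ask the operator's exposed API where a subscriber's device is."""
        query = parse.urlencode({"subscriber": subscriber_id, "key": api_key})
        url = f"{OPERATOR_GATEWAY}/v1/location?{query}"
        with request.urlopen(url) as resp:
            return json.load(resp)  # e.g. {"lat": ..., "lon": ..., "accuracy_m": ...}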

This move by Alcatel-Lucent is critical for the company, but also for network equipment’s role in the service layer. Our surveys have shown that operators are visualizing their service-layer expansion more and more in IT terms, largely because network vendors have yet to provide an understandable and cohesive positioning in the service layer. The OpenPlug deal offers Alcatel-Lucent an exceptionally strong position on which to build a more complete service-layer story.

Interestingly, the fact that Alcatel-Lucent is coming at the service layer from the appliance/developer side means that, for the moment, it isn’t risking collision with Cisco’s efforts here, which are developing from the content side.

The combination of Cisco’s and Alcatel-Lucent’s moves in the service layer will put further pressure on the other equipment vendors because it’s now virtually impossible for any of them to duplicate either approach. That confines their positioning to true service-layer middleware, which is harder to explain and position.

But that’s where everyone has to go, and so the biggest question with this deal may be what Alcatel-Lucent has planned for the actual foundation technology within its Open API framework. If it’s strong, we’ll have a network battle of the titans, with every IT player also looking for its best strategy, and the service layer will become a major competitive focus in carrier infrastructure.


August 30, 2010  5:30 PM

Next up for Google: Streaming full movies (for a fee) on YouTube?

Tom Nolle

The Financial Times reports that Google is negotiating with most major studios on a plan to offer streaming movie rentals via YouTube, though we’re hearing that the studios would like to see a different brand involved in the deal to shed the free amateur video image.

According to the story, the movies would be available concurrently with their DVD release and be offered at about $5. In the present deal, the movies will be streaming only; downloading will be disabled by the digital rights management (DRM) process, but we also hear that this restriction may be removed at some point, once studios are happy with the DRM process and a price is set. We think the rumored price is likely a bit higher than Google wants, but the studios can’t afford to kill the DVD revenue stream. We also hear that a number of older movies will be available at lower prices, perhaps with fewer restrictions on downloading.

