Verizon is taking “TV Everywhere” to more places, or at least to more places under more conditions. Its new FiOS Flex View is a Netflix-like approach to content, except that instead of streaming it, you download it subject to digital rights management (DRM), with 30 days to watch the material. Because the video is downloaded, users can view it even when no Internet access is available or connection quality is very poor. For most users, the Verizon Media Manager is the portal for organizing Flex View content on mobile devices.
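The 30-day viewing window is the heart of the model, and it can be sketched as a simple license check. This is a hypothetical illustration of the idea, not Verizon's actual DRM scheme; all names here are invented.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: a Flex View-style download carries a license that
# expires a fixed number of days after download. The structure and names
# are illustrative only, not Verizon's real implementation.
VIEWING_WINDOW = timedelta(days=30)

def is_playable(downloaded_at: datetime, now: datetime) -> bool:
    """True while the download is still inside its viewing window."""
    return now - downloaded_at <= VIEWING_WINDOW

downloaded = datetime(2010, 11, 1)
print(is_playable(downloaded, datetime(2010, 11, 20)))  # inside the window
print(is_playable(downloaded, datetime(2010, 12, 15)))  # window expired
```

The point of the model is that this check needs no network at all, which is exactly what makes downloaded-with-DRM different from streaming.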
Obviously this is another of the growing number of ways in which operators are looking to monetize content. Like TV Everywhere, which is essentially a notion of “subscription equals rights,” Flex View basically says that having content rights means the right to view it wherever it can be delivered. That approach empowers firms that can both offer content on a syndicated or subscription basis and deliver it and maintain DRM credibility.
One interesting thing about the Flex View model is that it seems to disconnect the user from the network, at least as a requirement for viewing content. That’s consistent with other Verizon moves, one of which is the iPad bundled with a WiFi-to-3G hub partner device. Flex View could make Verizon’s content strategy consume less 3G (and, eventually, 4G) bandwidth and promote a usage model for migratory devices where permanent connectivity isn’t critical.
This doesn’t mean that Verizon is de-emphasizing the mobile network, but instead that it’s trying to uncouple content leadership from dependence on that network, and also reduce the risk that content success could come only by destructive price decreases for mobile capacity.
Verizon has started to advertise its iPad deal, and it’s a bit more complicated than the rumors had suggested. What the big telco is doing is bundling an iPad with a mobile WiFi hub, a gadget that links to a 3G service and creates a mini-hotspot to which WiFi devices can attach. This finesses the exclusivity AT&T has for the original iPad, and it’s similar to strategies used by other carriers to end-run Apple’s restrictions.
What makes the deal confusing or complicated is that the data plan that comes with the iPad bundle has a much smaller charge for incremental gigabytes over the base plan than other mobile broadband plans from Verizon. Most pundits expect that Verizon will be normalizing the pricing over time, but it’s also possible that Verizon intends to promote the use of wireless trans-connectors to link smart WiFi devices to 3G/4G networks instead of relying on specific 3G/4G receivers. Such a move would have some interesting consequences.
- First, it would promote customers’ use of multiple devices through a single data connection and on a single data plan. We’re told that many of Verizon’s iPad prospects are also laptop users and that many also have smartphones. You can see the value down the line in having one 3G/4G device and a set of WiFi slaves.
- Second, the trans-connect model would facilitate 3G/4G offload with WiFi and provide in-home use of smart devices using the existing home wireless network, rather than perhaps requiring femtocell deployment. This could be especially valuable if Verizon intends to offer special services to FiOS customers, who already have FiOS-linked wireless hubs in home.
- Third, the model could reduce the issues associated with migrating smart devices to 4G by making it necessary only to migrate the trans-connect hub and not the devices. It’s also probably easier to make a high-quality 3G/4G/WiFi hub than to stick all those RFs into one phone or tablet.
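To see why the overage-pricing differential behind all of this matters, a quick back-of-envelope sketch helps. The post quotes no actual rates, so every figure below is hypothetical; the shape of the arithmetic, not the numbers, is the point.

```python
def monthly_cost(base_fee, included_gb, overage_per_gb, usage_gb):
    """Total monthly charge: base fee plus any overage beyond the cap."""
    overage_gb = max(0.0, usage_gb - included_gb)
    return base_fee + overage_gb * overage_per_gb

# All rates are invented for illustration; Verizon's real plans differ.
usage = 3.0  # GB consumed this month across all WiFi-attached devices
bundle = monthly_cost(base_fee=20.0, included_gb=1.0,
                      overage_per_gb=10.0, usage_gb=usage)
standard = monthly_cost(base_fee=35.0, included_gb=1.0,
                        overage_per_gb=25.0, usage_gb=usage)
print(bundle, standard)  # the cheaper incremental gigabyte dominates as usage grows
```

With a single hub feeding several WiFi devices, usage per account rises, so the incremental-gigabyte rate quickly matters more than the base fee, which is why normalizing the pricing later would be the safe bet.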
All of this makes sense for non-phone devices, but if Verizon were really serious about the hub approach, it would also be logical to think about releasing a hub-linked smartphone. That would also play into rumored plans for a WiFi-based, FiOS-linked VoIP service that have been floating around.
The latest political wisdom, arising from the results of the mid-terms, is that net neutrality as an issue hurt those who supported it. That combines with the Democratic loss in the House to create a loss of momentum—at least according to popular wisdom.
In truth, net neutrality never had any momentum. Congress doesn’t like to intervene in telecom because problems there are too easily created and too hard to fix. That’s why we have an FCC. When there’s a problem with FCC authority, Congress is inclined to stand by and let the FCC do its best, as I think it intended here all along. That the problem is getting more complicated by things like the Fox/Cablevision brouhaha is only making “becoming a tree” a more attractive option to our leaders on the Hill. The FCC closed comments on this, and it will no doubt issue a Notice of Proposed Rulemaking at some point, but don’t expect magic.
One of the complications is illustrated in Australia, where the boundaries are blurring between the new NBN, a public access network intended to provide good broadband by bypassing the commercial process, and the national carrier, Telstra. The NBN now wants to get into inter-city transport, creating a backhaul network that would link metro broadband customers to the Internet by hauling them between cities to get to a big on-ramp. At one level it’s not a serious issue because that type of traffic is often not particularly profitable. On the other hand, it illustrates that when politics get involved in broadband, the normal political tendency to build one’s own empire rolls over into the broadband space.
Does NBN’s head (Quigley, formerly of Alcatel-Lucent) see himself as the czar of Australian broadband, and perhaps even of networking? Minister of Networks? Aside from the fact that this sort of thing would devalue the investment of millions of Australians who bought into Telstra’s privatization with the encouragement of the government, it raises the question of how far NBN will really go, and how much taxpayers will have to kick in.
Worldwide, the tension between consumers/voters who want everything for nothing and businesses who want something for everything isn’t going to be resolved through government ownership. If transport/connection isn’t profitable, we need to figure out how to achieve public policy goals within the framework of networking today, because dismantling that framework at this point is simply not possible. If broadband were a good business, VCs would be fighting over the carcass as we speak. Instead, they’ve long since gotten out of Dodge. Think about it.
Broadband continues to be in the news, both in terms of policy and in terms of business model. Of course, the two should have some relation to each other, but it’s increasingly clear that’s not going to be the case in many world markets.
The U.S. elections and the Clearwire announcement that it would be cutting staff illustrate the issue perfectly. The loss of the House for the Democrats means that Republicans will now chair the House committee that oversees the FCC, and this has already stimulated predictions that net neutrality is dead in Congress. Not true; it’s been dead in Congress all along. We’ve never believed there was much of a chance that Congress would step up on the issue, and frankly we’d be just as happy if it didn’t.
The reason we have a federal commission in charge of communications is that Congress isn’t likely to be able to address complex technology issues well. We saw how complex the issues here are with the Fox/Cablevision standoff. Fox cut access to some of its websites for Cablevision customers, at least for a time, and also cut off its TV programming as the companies battled over how much Cablevision would pay for carriage rights. The FCC admitted it had no authority to act here, even though most would say that non-neutral behavior by content providers has the same effect as non-neutral behavior by access providers. That means that meaningful net neutrality rules would have to come from Congress, and they’d have to break totally new ground. That alone would likely send Congress into a tizzy; combine it with heavy industry lobbying revolving around wireless impacts and consumer desires to have everything both neutral and free, and you have a political minefield with no big incentive to address it. The election, after all, is over. The Clearwire dilemma shows that some attention is needed here, though.
Broadband services of any sort create a classical S-curve of cash flow, with “first cost” driving a provider far into the red as they build out infrastructure to credible levels and fund marketing campaigns. The hope is that success will turn this around, but the problem in broadband is that even “success” looks a lot like failure when it’s time to add up the numbers. There’s not much margin, and that means it takes a long time to recover early costs. Clearwire needs to go back to the well, and neutrality issues with wireless aren’t going to make it any easier to do that.
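That S-curve can be made concrete with a toy cumulative cash-flow model: heavy capex during build-out drives the provider into the red, then thin margins claw it back slowly. All figures are illustrative, not Clearwire's actual numbers.

```python
# Minimal sketch of the "first cost" S-curve: cumulative cash flow dips
# deep into the red during build-out, then climbs slowly because broadband
# margins are thin. Every figure here is invented for illustration.
def cumulative_cash_flow(months, buildout_months=24,
                         capex_per_month=10.0, margin_per_month=1.5):
    flows, total = [], 0.0
    for m in range(months):
        if m < buildout_months:
            total -= capex_per_month   # build-out: pure spend
        else:
            total += margin_per_month  # operation: thin margin
        flows.append(total)
    return flows

curve = cumulative_cash_flow(200)
breakeven = next(m for m, v in enumerate(curve) if v >= 0)
print(min(curve), breakeven)  # depth of the red, months to recover it
```

Even in this crude sketch, two years of spending takes over thirteen years of thin-margin operation to recover, which is why "success" in broadband can still look like failure and why a provider like Clearwire has to keep going back to the well.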
The decision by T-Mobile to tout its HSPA+ offering as “4G” is another indicator. Most operators would agree that HSPA+ is a less costly transition than a full migration to LTE, and if one accepted the 3GPP definition of 4G, none of the current LTE offerings could qualify either. But a marketing slogan may help sales, and sales could help turn that S-curve of cost around quicker. Wireline services are already less than marginal in terms of profit, and wireless could easily move into negative territory as well. We may see wireless capex slip in 2011 and beyond if we don’t get clarity on this issue.
One of the prime areas of focus for tech recently has been the tablet PC space. Tablets are far from new — in fact, some of the “new” models are more like reprises of earlier tablets, in that they’re little more than a keyboard-less notebook. The iPad, of course, created an alternative vision of a tablet as a kind of overfed smartphone, an all-display device designed to be a conduit of information to the user with a relatively sparse capability to move data the other way. Some see tablets as consumer devices, while some like the model of enterprise tablet use. The vendors are struggling with which model to support; ViewSonic expects to offer both 7-inch and 10-inch forms, and both Android and Windows 7 (even dual-boot, so the rumor goes).
However the tablet goes, the big news will be the network and the impact of tablets on user behavior. Movement to tablets on a large scale means movement to ubiquitous wireless, but we’ll need to look hard at just what “ubiquitous” means. As I’ve noted, there’s an opportunity for hospitality-Fi networks to play an enormous role in future tablet networking. I think wireless providers and equipment vendors realize that and are trying to figure out how to promote a truly compelling case for 3G/4G wireless versus Wi-Fi. The problem is that it’s going to be an uphill battle, because device vendors have everything to gain by pushing Wi-Fi versions of their devices to get a larger near-term market share.
Behavior, mobility and devices all create interdependence. Consumers aren’t set on tablet use, wireless models or behavioral patterns at this point. That means giving them support for a specific usage model can condition them to consume that model, whatever form that support may take. An explosion in tablet competition could empower a host of competitors, create a hospitality-Fi wave and erode the business model for 4G. It could foster a different model for mobility that focuses on roaming data sessions between Wi-Fi hot spots, independent of traditional mobility tools of the past and of IMS. It could even erode the operators’ positions in the service layer, because Wi-Fi is traditionally an over-the-top framework tied to no operator in a technical sense.
Alcatel-Lucent, whose quarter showed some real promise for growth, seems to recognize that. They announced a program with Eurozone provider KPN that demonstrates the exposure of provider network assets through Alcatel-Lucent’s Application Enablement and Open API program. This is the first large-scale success of a provider API program to deliver premium network features up the stack to the service layer. The application itself is still a bit simplistic, focusing again on QoS and bandwidth rather than on the more complex areas of identity, federation, content delivery networks and application-service feature creation, but it’s a convincing demonstration that operators do have a path to monetize their underlying network assets either by offering high-level services that exploit them or by wholesaling them to somebody else. This kind of capability may be critical if things like tablets and hospitality-Fi start to erode the traditional mobile opportunity.
The IT world has provided us with a number of interesting developments this week, starting with a Google suit filed over a proposed Department of the Interior (DoI) messaging system award to Microsoft. Google feels that its own Google Apps could have been used for this and that it should have been given the opportunity to demonstrate compliance with federal security requirements and bid on the contract. Thus, the lawsuit.
Some in the DoI have suggested to us that the problem with Google Apps is the same one that’s a problem for users of Google’s online competitors to Microsoft Office: The features Google provides are a subset of those already in use rather than the full set. What’s not totally clear is whether the missing features are actually used at DoI. But in some ways, you have to be sympathetic to the department. How easily could DoI find out whether all or some features were used?
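At its core, that audit question is a set comparison: inventory the features actually exercised in the document base, then diff against what the cloud suite supports. A hypothetical sketch, with feature names invented purely for illustration:

```python
# Hypothetical feature audit: which installed-suite features in actual use
# are absent from a cloud alternative? All feature names are invented.
features_in_use = {"mail_merge", "pivot_tables", "macros", "track_changes"}
cloud_supported = {"track_changes", "pivot_tables", "comments"}

missing = features_in_use - cloud_supported  # set difference
print(sorted(missing))  # features that would block a migration as-is
```

The hard part, of course, isn't the diff; it's building the `features_in_use` set, which means scanning thousands of real documents, and that's exactly why the department's uncertainty is understandable.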
As a result, the suit may be an important one for cloud computing services in general. Many (probably most) cloud-based alternatives to popular installed software tools are functionally more limited than the products they’re intended to replace. That’s also true of most open-source tools. I’ve tried Google’s document tools, and they won’t properly process either our spreadsheets or our presentations; they create problems with some publication/paper styles as well. Same for OpenOffice. But there’s no question that you could do most of what I do in either Apps or OpenOffice if you started from scratch. So a ruling here could promote cloud applications if it deemed them acceptable when they offer relatively full functionality, even if they deliver it differently, or when they offer at least some way of doing what buyers actually do rather than everything they could do. Without that kind of ruling, it may be hard to promote the cloud version of many apps unless cloud providers step up and fully duplicate capabilities. Frankly, that’s what they should do. You can’t sue your buyer into submission as a long-term business strategy.
Anyone who’s followed my writing knows that I’m no fan of the National Broadband Plan. My main issue is with the data that’s been presented to back that plan, and some recent work I’ve been doing is making me even more skeptical — if that’s possible.
What started me off was a comment by a White House science type. He was sure there were billions to be gained in productivity and jobs if broadband were more available, though he admitted he didn’t know exactly how those benefits were calculated or realized. OK, I said, let’s take a look at broadband versus economics and see if there’s a correlation. The FCC has data by ZIP code that shows where there are a lot of broadband providers available. Other agencies provide household income data, also by ZIP code. Suppose we correlated the two?
If broadband availability is in fact an economic benefit, we should see some correlation between the number of providers and the household income of consumers. We do, but it’s the wrong kind. The data shows the correlation runs overwhelmingly in reverse: the areas with the most broadband providers available are the areas with the lowest household income.
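The exercise itself is just a Pearson correlation over two ZIP-indexed series. A sketch on made-up data (the real inputs would be the FCC provider counts and agency income figures by ZIP code):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative data only; each pair stands in for one ZIP code.
providers = [8, 7, 6, 5, 4, 3, 2]        # broadband providers available
income = [38, 42, 51, 55, 63, 72, 85]    # median household income, $k
print(round(pearson(providers, income), 3))  # strongly negative
```

A result near -1 on the real data is exactly the "wrong kind" of correlation described above: provider count tracks population density, and density correlates with lower household income, not higher.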
Last week saw what has become the usual push and pull of supply-side and demand-side issues, and perhaps a bit more than the usual confusion in the markets (financial, enterprise and consumer) about the net outcome. It wasn’t the wild week of stock swings that could have happened if economic news had been bad. At the same time, there wasn’t much that could be called a big upside of hope either. In all, “tepid” probably says it best.
I’ve commented several times recently on broadband issues, many arising out of what it has become increasingly clear are misleading or bad numbers about broadband deployment. It’s not surprising that broadband would become a political football in this most political of all recent election years, but it’s bad for the industry because it’s pulling everyone’s eyes off of the ball. Despite continuous evidence that economic density is the most decisive factor in broadband market effectiveness, we continue to ignore it. Despite the fact that there’s no clear indication that broadband has any societal value whatsoever, we continue to assert that it does. A real plan, based on exploiting what we know and objectively studying what we don’t, could get the market moving.
Meanwhile, the mobile space is showing us the shape of the future.
Speaking at the Broadband World Forum, Alcatel-Lucent chief marketing and strategy officer Stephen Carter talked about the need to create a “European approach” to 4G broadband. Some of the specific points in the talk weren’t new: we need to move beyond all-you-can-eat pricing; we need to add some specific partnership and settlement processes; we need to recognize the intrinsic differences in the major markets.
What is interesting to me is that all of this is coming to a head right now, and why might be the most interesting thing of all.
Juniper announced a mobile security suite, building on its Junos Pulse agent/client software that operates across a wide variety of mobile and PC platforms. The elements of the suite (the antivirus, firewall, etc., that are common to most PC suites) are less news than the framework in which it’s being provided. Juniper is binding security as an element in a device agent, then coordinating it through central management of that agent so it’s effectively a part of a collective network- or organization-wide security program.
The newest problem facing both enterprises and operators these days comes from the fact that a single user is extended across multiple appliances and increasingly uses those appliances as facets of a virtual personality. This is true with social-driven consumers, as well as increasingly with productivity-driven enterprises. Point-solution security not only doesn’t secure the range of devices, it forces those who want security to integrate disparate policies and processes to create a secure framework. One miss destroys collective security and also risks cross-contamination of the other channels to the user.
I like the Juniper approach here, not primarily because of its capabilities or because of Juniper’s need to validate the research it sponsored. We have security on devices, and we’ll have it on all of them eventually, and the problems of mobile device security are hardly a surprise, even without new research. What I like is that Junos Pulse extends “the network” to the device itself and makes it an agent of network policy and services. That seems to be the only long-term solution to both security issues and creating service value-add. Plus, the multiple device faces of the user are going to pop up in a lot of future service missions, and they will be problematic for those without a device-integrated approach.
It’s hard to pull this story out of the Juniper talk, in part because it’s focused so much on security needs and the point-solution remedy, but the real story is the ecosystem.