The Full Spectrum


August 1, 2016  10:38 AM

ISPs should prepare to fight to own home Wi-Fi services

Alex Scroxton

For many consumers, a Wi-Fi router is no more than an ugly but necessary evil to ensure they have untethered access to the internet on all of their devices, says XCellAir co-founder Todd Mersch.

For that reason, the majority – at least 60% according to an IHS Report – are more than happy to let service providers bundle a router with their service. It means one less trip to an overcrowded electronics store to buy an alien-looking, antennae-laden device, boasting baffling features such as “Turbo QAM”.

Have you been to a Currys lately? There are huge displays right as you walk in the door of – drum roll – Wi-Fi routers. These used to be tucked away on some back corner shelf, next to the cables and PC components. So what’s going on?

Consumers are getting smarter. They have realised Wi-Fi is the performance driver for their home network – a network that increasingly must cope with not only multiple devices per resident, but also video streaming and Internet of Things (IoT) applications.

This, combined with a largely unmanaged, inconsistent and therefore frustrating Wi-Fi service from their operator, has them looking elsewhere for a better experience – and potentially switching internet service providers as evidenced in a recent Consumer Reports survey.

In turn, this burgeoning demand means there are growing numbers of new entrants providing direct-to-consumer smart Wi-Fi equipment. Companies like Eero and Luma have been launched with the very aim of fixing Wi-Fi in the home. Most focus on covering the whole home (in Eero's case with a $500 price tag) and, combined with cloud-based management tools, they empower consumers to manage their own networks.


Service providers can go from zero to hero by offering a managed, consistent, and high performance home Wi-Fi service, says Mersch.

Good for consumers?

Sounds cool – right? As a consumer you look like a tech savvy wireless guru and as a service provider your customers stop calling when their Wi-Fi does not work.

But this is a short-sighted view. For the individual, it's great if you know something about Wi-Fi, want to own your service problems and are happy to shell out a lot of money for the privilege.

However, the service provider quietly forfeits the most critical component of the customer relationship and guarantees its relegation to bit-pipe status.

On the flip side, operators have an opportunity to not only save this relationship but deepen it by offering a managed, consistent, and high performance home Wi-Fi service.

To understand how an operator can move from zero to hero, you first have to have a picture of what drives the erratic and frustrating performance of today’s status quo. Wi-Fi issues can be put into a few categories.

Wi-Fi interference, congestion and coverage

First off is interference and congestion. This is where too many devices and routers are trying to use the same Wi-Fi spectrum at the same time. That challenge is exacerbated by consumers buying a more powerful router or adding unmanaged extenders.

It becomes a bit like trying to shout above the noise – all it encourages is for everyone to raise their voices, and once everyone is shouting, no one is any better off. A whole bunch of unmanaged APs near each other means channels are used inefficiently. Research we carried out last year found that in an average area, the capacity lost to these inefficiencies was enough to stream another 25 high-definition videos.

Through intelligent and automated use of unlicensed spectrum, the operator can tap into this latent capacity and deliver better, more reliable performance – something that is not possible if consumers are using their own routers.
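To make that idea concrete, here is a minimal sketch – in Python, with entirely hypothetical scan data and scoring – of the kind of automated channel selection a managed Wi-Fi service might perform for each access point it controls.

# Minimal sketch of automated channel selection for a managed AP.
# The neighbour scan data and the scoring are hypothetical; a real
# controller would pull measurements from the access points it manages.

NEIGHBOURS = {           # channel -> signal strengths (dBm) of nearby APs heard on it
    1: [-42, -55, -60],
    6: [-48, -51],
    11: [-70],
}

def congestion_score(candidate, neighbours):
    # In the 2.4GHz band, channels fewer than 5 apart overlap, so a busy
    # neighbouring channel still interferes with the candidate channel.
    score = 0.0
    for channel, rssis in neighbours.items():
        if abs(channel - candidate) < 5:
            # Stronger (less negative) signals contribute more interference.
            score += sum(100 + rssi for rssi in rssis)
    return score

def pick_channel(neighbours, candidates=(1, 6, 11)):
    return min(candidates, key=lambda ch: congestion_score(ch, neighbours))

print(pick_channel(NEIGHBOURS))   # -> 11 with the sample data above

Applied across every router an operator manages, even a crude policy like this starts to reclaim some of the channel capacity that unmanaged, shout-louder routers waste.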

The second big problem is coverage. In many larger homes, it is increasingly difficult to deliver whole home coverage with one router. Additionally, the actual placement of the Wi-Fi access point (AP) is often not ideal.

This does not mean every home needs multiple access points. The key is for the service provider to be able to identify what is driving the coverage issue – placement or the size of the area the AP is trying to cover – and proactively solve the problem. This can happen at installation as well as during operation.

Finally, there is the inherent fragility of the Wi-Fi hardware itself. In most cases, these products are mass-produced at low cost and are not designed to take the punishment we dole out. An operator has a unique skill set developed over decades of delivering highly reliable services. By automating basic fault avoidance techniques – like resetting the router before the customer's service is impacted – it can provide reliability not currently available.
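As an illustration only, that kind of automated fault avoidance might look something like the watchdog below, which reboots a customer's router in the small hours if its health metrics drift. The metrics, thresholds and management hooks are hypothetical.

import time

REBOOT_HOUR = 3   # act overnight, before the customer notices degraded service

def needs_reset(health):
    # 'health' is a dict of metrics the operator's management agent reports.
    return (
        health["free_memory_pct"] < 10          # slow memory leak building up
        or health["uptime_days"] > 60           # long uptimes correlate with flakiness
        or health["wifi_retry_rate_pct"] > 40   # radio struggling to deliver frames
    )

def watchdog(get_health, reboot):
    # get_health/reboot stand in for whatever remote-management channel
    # (TR-069 or similar) the operator actually uses.
    if needs_reset(get_health()) and time.localtime().tm_hour == REBOOT_HOUR:
        reboot()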

By delivering reliable, high-performance, broad coverage and, importantly, a managed service, service providers can actively monetise Wi-Fi with services such as Wi-Fi calling, wireless video distribution and in-home IoT.

But if they do not deliver, this new breed of provider will take over the customer relationship, and rob the service provider of this opportunity.


Todd Mersch is co-founder and EVP of sales and marketing at California-based XCellAir

July 12, 2016  10:03 AM

Consumer broadband habits versus next-gen services

Alex Scroxton
Bandwidth, Broadband, streaming

G-PON as we know it today is rapidly approaching the end of the growth phase in its technology lifecycle, writes Adtran’s Ronan Kelly. There is a surge in gigabit broadband service offerings that began in the US and is becoming increasingly prevalent across Europe. From the consumer perspective, there is a wave of new technologies such as 4K and virtual reality, which while still in their early stages, are expected to gain significant traction in the coming years.

Whilst today's gigabit-enabled consumer is not yet utilising the service to its full capacity, this will change in the next three to five years as applications emerge that take advantage of the capacity on offer. When that happens, we will reach the tipping point where G-PON deployments are no longer prudent, and next-generation PON technologies such as NG-PON2 and XGS-PON become the de facto standard.

If an average consumer's available broadband bandwidth were quadrupled today, their usage patterns would not change for three to four months. This is because their experience of online services is largely shaped by the capabilities of the service they have, and once a better service is available it takes time to discover what they can do with it.

As upgrades are rolled out, consumers tend to continue using their bandwidth as before, albeit with a better experience, whilst slowly exploring services that were not previously usable. With that in mind, everything we are using today is largely created for the bandwidth that has been available for the last four or five years.

What’s behind the change?

Historically, bandwidth has always come before the application; nobody develops an app that needs more bandwidth than is available to the mass market, otherwise it would be useless. This is no great secret.

Look at the technologies we take for granted today, such as FaceTime, video streaming and massive attachments like photographs. We would have a very different experience trying to use them on the 1Mbps connections we had in 2011. But with exponentially faster bandwidth on the horizon, we are only just beginning to see what developers can create. With the emergence of cloud-first services, entertainment such as virtual reality, and high-quality video services like Netflix and Amazon, 4K and higher resolutions will become the norm on all screens.

If you walked into Currys two years ago you would have struggled to find more than a handful of 4K televisions available in a small premium suite. Today, however, you're spoilt for choice on what 4K screens you can purchase, and HD TVs are tucked away in a corner. With PC displays now matching 4K resolutions, and tablets set to follow, it's obvious that consumer electronics manufacturers have historically driven uptake in the consumer space.

It's true to say that it's always a race in the consumer electronics space. The moment one manufacturer releases a tablet with 4K capabilities, the others quickly follow suit. Before you know it, the market has shifted, leaving the vast majority of consumers with 4K-ready technology. Similarly, when a consumer upgrades to the iPhone 6S and starts recording video, that video is now captured in 4K. Other consumers then move to the next phone, start recording and, without any conscious thought, drive more change.

With screen resolutions constantly increasing, and 4K now the benchmark for those screens, people who stream content via services like Netflix or Amazon already have a 4K offering waiting for them. Consumers start using these services on these devices and, by default, again start to change and influence their broadband usage behaviour.

It took HD 10 years from when it first appeared to reach the point where a decent amount of content was available to consumers, because satellite and cable TV companies dictated its proliferation. Until they upgraded their background infrastructure, users couldn't watch HD content on anything but a Blu-ray player. 4K is at this early stage; there is next to no live broadcast content available and all 4K offerings come via streaming.

Back when HD launched, streaming services didn't exist. Coupled with the flat screen TV revolution, the HD capability became a common feature in lounges across the country faster than HD broadcast content was made available. Due to the absence of competing streaming services, there was no major pressure for content providers to start broadcasting in HD. It's different now – major pressure is coming from the over-the-top (OTT) providers, which is why we're starting to see a more rapid push from some of the dominant players, who are gearing up to have 4K content ready even though consumers are not adopting it as quickly as they did when flatscreens first became available.

The agent of change is now different. Me getting rid of the huge TV taking up space in my living room was a compelling argument to move to flatscreen, and having HD as part of that transaction was a nice bonus, but it wasn’t the primary driver for change. Now, the consumers have the flatscreen and a 4K flatscreen coming along isn’t quite as compelling, so the rate of change, unless pushed by the electronics manufacturers, will be slower. Typically TVs have a 10-15 year lifespan before they’re replaced. If we reflect on the timeline of the flatscreen revolution, the early adopters are now well in that window. Will this serve as the catalyst that accelerates 4K adoption?

While some of these factors have long been the cause of change in bandwidth speed, the surge in adoption of OTT services is driving demand for even higher speeds. As consumers move to 4K, they will be met with a very different experience; every TV is now smart, so more streaming content will be accessible than ever. Similarly, the gaming industry now pushes consumers to download or stream games, so its distribution model has shifted entirely to a cloud-based one, much like the music industry's.

Looking ahead

Looking ahead there are technologies coming that will put a huge strain on the network, the connected car being at the forefront of that charge. Autonomous vehicles offer countless benefits, the most obvious being the socioeconomic impacts: with cars essentially driving themselves, the ‘driver’ takes back the productive or leisure time which had previously been lost to travel.

However, we still have a long way to go to achieve this. The capacity requirement is typically up to 27Mbps per car. Take 100 square metres of London on any given day and imagine the number of cars using broadband, and you'll quickly get a sense of the capacity we'll need.
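A rough back-of-the-envelope calculation shows why. Only the 27Mbps-per-car figure comes from the discussion above; the number of cars and the contention assumption are purely illustrative.

PER_CAR_MBPS = 27
cars_in_view = 500      # hypothetical: connected cars within one busy urban cell
contention = 0.25       # hypothetical: share of cars drawing peak bandwidth at once

aggregate_gbps = PER_CAR_MBPS * cars_in_view * contention / 1000
print(f"~{aggregate_gbps:.1f} Gbps of access capacity needed")   # ~3.4 Gbps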

With these factors in mind, it's vital to continue to develop next-generation broadband technologies, such as NG-PON2, XGS-PON and G.fast, that use existing infrastructure and enable gigabit speeds in a timeframe that matches demand. As technology advances, so too must our available broadband offering. If the two cannot progress at similar speeds, or if broadband cannot keep up with the digital consumer, then society will suffer.


Ronan Kelly is CTO at Adtran and a frequent commentator on broadband issues.


June 28, 2016  8:58 AM

Broadband must not be forgotten after Brexit

Alex Scroxton
Brexit, Broadband, fibre, G.fast

In the wake of last week’s vote to leave the European Union (EU), if we are indeed going to invoke Article 50 of the Lisbon Treaty later in the year – and despite the lingering hope that there might be some sort of loophole to wiggle through, Brexit now seems very likely – there must be a concerted effort to make the best of it.

As our economy crashes and burns, our banks look to relocate and the last of our manufacturing base flees the country, we are going to need to think strategically in order to remain competitive against a 27-member trading bloc that our country has gravely offended, and that will want to see us fail.

And make no mistake, the EU will take us to the cleaners in the negotiations to come… who can really blame them?

One way we can compete on a level footing is to ensure we are light years – and light speeds – ahead of them when it comes to connectivity. We are going to have to have, hands down, the very best networks possible. South Korean-style, if possible.

Yes, ultimately this means universal fibre-to-the-premises (FTTP) to every home, office and factory in the country, regardless of location. But since we now have even less money than we thought we did, this possibility is now even more remote than it was before the referendum. For now, it’s not realistic to push for it.

Yet the technology already exists to go ultrafast at low cost. G.fast technology is not true fibre, but it can deliver speeds of over 100Mbps, in the field, right now, with just a little upgrade in the cabinet. We don’t need to dig the roads up, we don’t need to wait for wayleaves. As the post-Brexit squeeze tightens its grip, G.fast suddenly begins to look much more compelling. I believe it is time for us to get behind it.

We face obstacles, for sure. Just a week before the decisive vote, Computer Weekly met with Openreach’s CEO Clive Selley, who said Brexit would materially damage his ability to invest in new engineers and commercial roll-out of broadband. This is a significant concern, and something that must not be allowed to happen. The onus is on Selley to stick up for Openreach and protect it at all costs.

Equally, whoever inherits the government in the next few months will have to take charge of broadband policy. Ed Vaizey, the current minister in charge, is a true supporter of the digital economy and connectivity, but we have to acknowledge that he may be replaced under new leadership. If a new minister is put in charge at DCMS, they must be a powerful and credible voice for our sector.

Britain faces an uphill battle to stay competitive outside the EU bloc. Ultrafast broadband networks will help to keep the British economy competitive in the post Brexit world. We cannot lose our focus now.


June 22, 2016  10:33 AM

Telematics, insurance and the driverless car evolution

Alex Scroxton
Insurance, Internet of Things, telematics

The political momentum behind the connected future enabled by the internet of things (IoT) is clear, writes Jonathan Hewett of Octo Telematics. Increasingly we see insurance positioned at the frontline of this technology shift as players across the auto industry recognise the key role it will play as we transition to the world of the driverless car.

In the Queen’s Speech, the UK became the first country in the world to announce its intention to legislate on insurance requirements for driverless cars. The Modern Transport Bill will “ensure the UK is at the forefront of technology for new forms of transport, including autonomous and electric vehicles”.

While some commentators contend that driverless cars will make motor insurance unnecessary, the suggestion, as roads minister Andrew Jones put it, "is a lot of pie in the sky". Insurance will remain fundamental, but policies of the future will look very different to those we buy today.


By harnessing driver data and applying data analytics, driverless car insurers can benefit from the insight they need to forecast risk, says Hewett.

Currently, insurers have to price their policies based on proxies and assumptions. The risk premium for car insurance is determined by factors including the driver's age, postcode or the model of car they own, regardless of their actual driving behaviour.

However, by harnessing driver data and applying sophisticated data analytics, insurers can benefit from the crucial insight they need to forecast individual risk. This is an increasingly essential capability in the shift towards connected and driverless cars.
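To illustrate the sort of analytics involved – and this is a toy sketch, not Octo's actual model – a usage-based risk score might weight a handful of per-trip telematics features reported by the in-car device:

def trip_risk(trip):
    # 'trip' is a dict of hypothetical per-trip features; the weights are illustrative.
    return (
        0.4 * trip["harsh_brakes_per_100km"]
        + 0.3 * trip["pct_km_over_speed_limit"]
        + 0.2 * trip["pct_km_at_night"]
        + 0.1 * trip["phone_use_minutes"]
    )

def driver_score(trips):
    # Average per-trip risk, capped to a 0-100 scale (lower means safer).
    raw = sum(trip_risk(t) for t in trips) / max(len(trips), 1)
    return min(100.0, raw)

A score built on observed behaviour like this, rather than on proxies such as age or postcode, is the kind of input from which an individual premium can be priced.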

Insight into government thinking around driverless car insurance suggests that future policies would go even further than this, essentially holding vehicles, and by extension their manufacturers, liable for accidents. This is a very different proposition for insurers to consider. The focus is taken away from the driver and their involvement. Instead it falls on the robustness of the algorithms and technology provided by manufacturers that enable vehicles to make decisions and drive.

Accordingly, as autonomous vehicles become more widely adopted, the driverless car industry will undoubtedly require third parties to verify potential insurance claims and determine liability. In these situations, on-board telematics technology will be crucial in enabling an accurate, data-informed reconstruction of any incident. For instance, should a Google car and a Tesla crash into one another as they simultaneously avoid a pedestrian, the data gathered will be essential to determine which car is at fault and which manufacturer is liable. However, for reasons of impartiality, a third party will be necessary to hold that data and provide the analytics.

The momentum behind autonomous vehicles has the power to revolutionise the insurance industry, but it could be decades before regulators allow vehicles to be built without manual controls and the fact remains that we are still far from a driverless reality. Nonetheless, some of the technologies integral to the autonomous future are already enabling more equitable insurance and helping to shape better, more self-aware drivers.

Much of the technology that will take the driverless car from concept to reality already exists and is being used to help insurers process claims faster and more accurately. And the market is growing fast: consultancy Ptolemus estimates that by 2020 nearly 100 million vehicles will be insured with telematics policies, growing to nearly 50% of the world's vehicles by 2030.

Telematics is providing both manufacturers and insurers with the opportunity to innovate and to develop new products, partnerships and approaches that are grounded in actual data. Not only will data-driven insurance be a necessary norm for determining crash liability in the driverless future, it will also be an essential step in making the transition to that future a safe reality.


Jonathan Hewett is global chief marketing officer at Octo Telematics


June 17, 2016  11:14 AM

Total smartphone service: put public Wi-Fi in the mix

Alex Scroxton
Mobile, Wi-Fi

The recent announcement that UK regulator Ofcom plans to make additional spectrum available for Wi-Fi is great news for consumers, writes DeviceScape’s Dave Fraser. Because Wi-Fi operates in unlicensed spectrum it is very much the people’s network; anyone with a broadband connection is able to get good quality wireless access to the content and services they love.

It’s the perfect complement to cellular services operating in licensed spectrum, not least because Wi-Fi has become established as the de facto access technology for indoor use. We spend most of our time and consume the vast majority of our smartphone data indoors, in spaces where it is often difficult to get a good quality cellular data signal.

Another uniquely populist characteristic of Wi-Fi is that it enables people and entities to share connectivity with one another. If you want a measure of the importance of Wi-Fi sharing, look no further than the fact that etiquette bible Debretts has expressed a position on it.

So, if you were to design a connectivity service from scratch in 2016, based on an understanding of how consumers use their smartphones, shared public Wi-Fi would be a necessary component of that service.


Shared Wi-Fi is most valuable to consumers in public indoor locations, because they visit many more of these than they do other people’s homes or offices. That it is generally available to all consumers, and not just those who subscribe to one particular operator, emphasises its egalitarian nature.

No surprise, then, that it has become a staple of the customer experience offered by numerous retail, service and hospitality businesses, from independent corner cafés right up to the UK’s largest retailers. These businesses understand their customers’ desire for unbroken connectivity and cater to it just as they cater to other basic needs.


The reality today, though, is that almost all smartphone connectivity services are legacies of a time before wireless and smartphones, when the primary functional requirement of a mobile phone was that it connected to the mobile network. Then, as now, a licence to operate a mobile network assured membership of a club whose exclusivity was guaranteed by the high costs of entry and operation, and was crucial to recouping those costs.

A companion legacy is the mobile sector’s inclination towards a divided, network-centric view of the world, in which carrier-grade cellular trumps the Wild West environment of Wi-Fi every time. It’s certainly well understood that there can be significant variations in QoS in the public Wi-Fi arena. And while it has phenomenal indoor coverage to its credit, the process of connecting to these Wi-Fi networks can be frustrating and off-putting for consumers.

But instead of viewing these issues as grounds to discriminate against public Wi-Fi, mobile operators should seize the opportunity to improve their customers' experience of this hugely valuable, if sometimes variable, resource. The issues of quality and ease of access can be managed to ensure quality of experience across a service that combines Wi-Fi and cellular to play to both sets of strengths.

Meanwhile, you only have to look to the surge of enthusiasm among operators for Wi-Fi calling over the last two years for evidence (if your own experience is not enough) that cellular itself is often found wanting.

The truth is that consumers are reliant on both wireless and cellular, rendering the “competing networks” mentality outdated. In the smartphone era the ability to connect to the mobile network is simply one of a number of important underlying enablers required of the device and — judged in terms of time spent connected — it may not even be first among equals. What end users need is a service tailored to their movements and habits, and that has to be based upon a blend of cellular and Wi-Fi.
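A blended service of that kind ultimately comes down to a policy, on the device or in the network, that keeps picking the better-performing bearer. The sketch below is a toy version of the idea; the metrics, weights and hysteresis value are hypothetical, not any operator's actual policy.

from dataclasses import dataclass

@dataclass
class Link:
    name: str
    rssi_dbm: float     # signal strength
    loss_pct: float     # recent packet loss
    latency_ms: float   # recent round-trip time

def score(link):
    # Higher is better: reward a strong signal, penalise loss and latency.
    return link.rssi_dbm - 2 * link.loss_pct - 0.5 * link.latency_ms

def choose_bearer(wifi, cellular, hysteresis=5.0):
    # Hysteresis stops the device flapping between bearers on small changes.
    return wifi if score(wifi) + hysteresis >= score(cellular) else cellular

home_wifi = Link("wifi", rssi_dbm=-55, loss_pct=1.0, latency_ms=20)
lte = Link("cellular", rssi_dbm=-95, loss_pct=0.5, latency_ms=45)
print(choose_bearer(home_wifi, lte).name)   # -> wifi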

Caroline Gabriel, research director and co-founder at Rethink Technology Research, recently observed that: “The more ubiquitous and carrier-class public Wi-Fi becomes, the more the battle heats up between mobile operators and their non-cellular challengers, to see which can harness it more effectively.”

Gabriel was on the money. Recent moves by big cable players to integrate public Wi-Fi into their service offerings — as well as high profile forays into the space by Google — suggest that, openly at least, the battle is being brought by the non-cellular challengers. It is important to note that connectivity is not the product for these providers, it is simply the means by which they bring the product and the smartphone user together. They are free from any vested interest in one particular bearer.

What now remains to be seen is how mobile operators will respond. Will they embrace Wi-Fi — in particular public Wi-Fi — to enhance their customer experience, and increase the stickiness of their service? Or will they continue to abide by the tenets of network separation and cellular supremacy, managing the experience only where it relates to the mobile network?

Old habits are certainly hard to shake. But the greater portion of a customer’s connectivity experience any operator is able to manage, influence, and improve, the better chance they have of securing that customer’s loyalty. For this reason, as a more diverse field of competitors bid to own the customer, services based on a blend of cellular and Wi-Fi are taking root: If you want to serve the people, you need the people’s network.


Dave Fraser is CEO of DeviceScape


May 6, 2016  2:50 PM

Attack the government over broadband, but do it properly

Alex Scroxton
Broadband

This morning I came close to running a story that would have claimed the government had u-turned on a pledge to provide superfast – 24Mbps and above – broadband to every home in the country by 2020. This was after stories appeared in both The Times and The Telegraph saying that the government had given up on the idea of automatically meeting the needs of the final 5%, those left out of the commercial and Broadband Delivery UK (BDUK) roll-outs.

These stories, and a number of others, were based on a month-and-a-half-old consultation document put out by the government on plans for a universal service obligation (USO) of 10Mbps, which I covered at the time.

In the consultation document, the government did indeed say that an additional broadband roll-out to the final 5% was not proportionate and would not represent value, because on the evidence available it was unlikely that every single one of those premises would want it. Hence the idea of a 10Mbps USO which, should it go ahead, those who want it will be able to request and receive.

The thing is, this is not a u-turn as such, because while the needs of the final 5% have been disgracefully neglected by the government, the document merely notes that given the cost to the taxpayer in reaching those remote places, the government believes it makes more sense to establish a request-based scheme to reach them.

Furthermore, it may be that eventual technological advances will bring down the cost of deployment and make a universal roll-out more cost-effective; we simply don't know yet. Researchers at UCL reckon they've hit on a way to reduce the costs with a new type of optical receiver, so work is going on here.

And all this is without considering the money that BT is putting back into the BDUK roll-out as a result of hitting its take-up targets. It hopes this will extend the programme beyond 95%.

In essence, nothing has yet been decided, let alone u-turned upon.

Look, it is right that the government is held to account over the state of the rural broadband roll-out, and 10Mbps is absolutely not sufficient. Actually, I think 10Mbps is laughable and David Cameron and Ed Vaizey should be ashamed of themselves for even considering something so unambitious.

This is an emotive issue, particularly for those that want and cannot receive a broadband connection, but I have always believed it does one’s cause no good at all to base arguments on provable inaccuracies.


March 2, 2016  10:51 AM

Why Europe is coming round to BT’s broadband thinking

Alex Scroxton
Broadband, FTTP, Ofcom, Openreach

by Ronan Kelly

BT has been in the UK headlines quite a lot these past few weeks. News of a rare service interruption aside, BT’s recent press cuttings file is par for the course. A little background chatter about potential regulatory tightening here; some more calls to accelerate faster and cheaper broadband services there…. Whatever they do for the UK’s broadband infrastructure never seems to be enough, and there’s a big problem with that narrative. It just isn’t true.

I’m not saying the media is manufacturing negative perceptions. Maybe it’s just the price you pay for having customers with high expectations and a reputation as one of the most respected telecom operators in the world.

No – BT isn’t religiously fanatical about FTTP

People who believe that BT is somehow opposed to fibre-to-the-premises (FTTP) should pay close attention to the new deal struck between Openreach and the UK’s Home Builders Federation to ensure superfast/ultrafast broadband connections to newly-built homes. The move is proof of BT’s willingness to take advantage of the broadband economics of greenfield sites, and support FTTP where it is commercially viable. Even more tellingly, it adds to the evidence of BT as arch-pragmatist; recognising the importance of being prudent with new technology when it comes to mass-market deployment.

Everybody knows that the UK lags behind the other G8 nations in FTTP penetration, and there are numerous historic and ongoing reasons for this. I for one advocate FTTP everywhere (I wouldn't last long on the board of the FTTH Council Europe if I didn't) but even I'm pragmatic enough to appreciate that such progress takes time.

Being responsible and pragmatic about broadband evolution can mean faster – not slower – progress for subscribers

National competitiveness is a recurring theme in the UK broadband debate, and earlier this month a prominent manufacturing lobby group added its voice to calls for ‘better connectivity’ to support this aim.

Working towards rather than against this aim, BT's position appears to be that – other than in those local cases where commercial viability gives the green light to immediate roll-out – progress toward the eventual goal of FTTP should be made through sustainable, incremental performance improvements. Such an approach, leveraging existing infrastructure where possible alongside innovative new technologies like G.fast, is infinitely preferable to telling subscribers they must tread water for the decade or more it could take an operator to deliver fully-fledged FTTP in their area.

Differing broadband views from abroad

Taking national competitiveness from a different perspective are those armchair experts who routinely like to contrast the broadband fortunes of the UK with those of its closest neighbour, France. France has experienced something of a broadband renaissance in the last 12 months, with government support and the action of incumbent operators driving an upswing in the deployment of FTTP.

What a lot of people don't know is that – until very recently – the French regulator (ARCEP) had essentially outlawed the use of VDSL technology. Why does this matter? Well, in the glaring absence of an established VDSL estate, French operators are only now presented with the opportunity to build one from scratch and extend the utility of their copper networks. Creating one could bring benefits but would take perhaps three or four years to mature, by which time operators would be presiding over a 10-year-old technology. This makes it far more logical for France to put greater impetus behind faster FTTx penetration than the UK. It also hammers home the argument that more flexible and faster-to-market broadband options are more readily available to environments with advanced copper infrastructure.

Among the rest of Europe (with the notable exception of Spain, which has other obstacles in the way of fully leveraging its copper network, as well as higher urban concentrations of MTUs) network topologies and regulatory climates are all far more sympathetic to the BT thinking around broadband.

BT has played it smart with G.fast – and the world waits with bated breath

When G.fast was first introduced a couple of years ago, the most popular application proposed was for Gigabit connectivity over very short loops. We’ve seen this vision realised to great effect in numerous markets and scenarios, but the truth is that this approach isn’t going to work for everyone.

What BT has done is pull those capabilities back to ‘sub-Gigabit’ levels to deliver highly competitive services over longer distances, with the net result delivering many times better performance than currently available over the same infrastructure. That’s far-sighted, innovative and – some might say – courageous. What’s more, it’s making an awful lot of other operators in other parts of the world start thinking differently about their journey to FTTP.

BT’s success has yet to be proven, and there are plenty of nay-sayers who’ll continue to kick up bad headlines until hard, long-term evidence proves them wrong.

BT doesn’t have all the answers, but I can’t really see any obstacles to them achieving what they’ve set out to achieve, and within the pretty aggressive timescales they’ve set themselves.

Ronan Kelly is CTO for EMEA and APAC at Adtran


February 3, 2016  3:28 PM

How DNS information can help cut millions from your infrastructure costs

Alex Scroxton
CIOs, Load balancing, Network capacity, Nominet, traffic

A guest post by Chris Griffiths, director of new products and business development at Nominet.

One of the hardest things about infrastructure planning in our web-enabled world is estimating capacity needs, especially from a network perspective. Having a piece of content go viral can mean the difference between having a functioning set of web infrastructure and a completely broken one.

We've all seen what happens when web platforms get overloaded in the old media world – the launch of a new blockbuster stressing the online booking systems of a cinema chain, the release of tickets for a major festival causing the website of its ticketing provider to crash, a funny cat video propagating on email grinding a corporate network to a halt – and so on. In the world of social media and streaming 4K content, these sorts of phenomena spread even more rapidly, even more virally, and can have dramatic effects.

As such, the enterprises, service providers and channel businesses delivering IT services need to get smarter about capacity planning to meet the traffic forecast.

The limitations of retrospective analysis

Typically, traffic reports for a service provider, content delivery network or large enterprise might come quarterly, or monthly at best. They will also include retrospective analysis showing the loads the infrastructure has been under over a period of time. In a given month, the network might have been under an average load of 20% but spiked for a day or two at 80-90%… this in itself would trigger warning signals, and perhaps suggest the need for capacity investments for the following year. Indeed, most infrastructure investments are booked at least a year in advance.

The cost of overcoming uncertainty with over-capacity

This in turn means that massive assumptions need to be made about possible needs in the year ahead. Yes, a 20% run rate means we're fine for now, but that 90% spike means we might lack 'burst' resource in the event something goes viral or requires additional capacity for any reason. Typically, infrastructure managers will have over-capacity plans representing 50% or more resource than is required. This is down to the length of time it can take to get hardware out in the field; and, after all, the worst thing that can happen to an IT team in this context is to under-resource – the productivity cost, never mind the potential reputational impact, is just too dramatic.

If you're a large enterprise with business-critical services, a service provider delivering into sectors with high service level agreements, or a channel business providing hosted or managed services, you can't afford the risk. It would critically damage your credibility with your customers. This is an industry-wide problem, as businesses and service providers find the resource costs of continued capacity planning and deployments needed to stay ahead of demand are escalating year on year.

Predicting future needs with real-time DNS insights

Real-time analysis of DNS requests, coupled with an exploration and analysis of historical traffic growth patterns via DNS data, can give a much more granular assessment of what's going on, allowing you to model growth much more effectively. Historically this wasn't possible due to the volume of data in play, but modern, real-time DNS intelligence can play a critical role in capacity planning and in dynamic resourcing. After all, if the patterns of requests indicate a spike is imminent and it is flagged in real time, infrastructure teams can spin up temporary burst capacity, adjust load balancing, or otherwise refine their infrastructure provision to withstand the onslaught. And this can be done without quite the same scale of over-capacity investment; even a 10% improvement in capacity planning could translate to millions, or tens of millions, in savings across a year.
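As a simple sketch of what "flagged in real time" might mean in practice – illustrative only, not a description of Nominet's product – a monitor could compare each minute's DNS query count for a domain against a rolling baseline and trigger a scaling action when it breaks well clear of it. The window and threshold below are hypothetical.

from collections import deque
from statistics import mean, stdev

WINDOW = 60           # minutes of history used as the baseline
THRESHOLD_SIGMA = 4   # how far above normal a minute must be to count as a spike

history = deque(maxlen=WINDOW)

def observe(queries_this_minute, scale_up):
    # Call once per minute with the DNS query count for a monitored domain.
    if len(history) >= 10:
        baseline, spread = mean(history), stdev(history)
        if queries_this_minute > baseline + THRESHOLD_SIGMA * max(spread, 1.0):
            scale_up()   # e.g. spin up burst capacity or adjust load balancing
    history.append(queries_this_minute)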

As CTOs and network teams continue to be challenged to do more with less, even while digital and web technologies become more central to the operations of a business, the marginal improvements this kind of insight can give will be key to delivering a competitive edge. CIOs must be creative in their use of DNS data to deliver the kind of value that’s expected of them.


January 27, 2016  11:31 AM

No, nobody is going to nationalise Openreach

Alex Scroxton
Broadband, ISP, Ofcom, Openreach

It’s always been reasonably clear – provided you don’t work for BT of course – that the Broadband Delivery UK (BDUK) procurement and delivery model was not fit for purpose, lacked ambition, and hindered the work of small altnets and community projects.

So now that BDUK is pretty well-advanced, and we’re well on our way to 90%* fibre** coverage, it’s really welcome to see that the Department for Culture, Media and Sport (DCMS) is consulting on fresh approaches to BDUK as it approaches that 95% milestone, and the issue of touching those remaining premises, down those narrow lanes that seem to be getting longer and more rutted, or in those valleys that seem to be getting deeper and more isolated, can be ducked no longer.

You can read the consultation document here, and if you're a stakeholder in any way I'd urge you to give it some time – take an hour or two in the next month to formulate some responses – because you genuinely can influence the future of BDUK here, and help take superfast*** broadband connectivity to everyone.

For me, one of the most intriguing points raised, and the one I have focused on in my story for Computer Weekly, was the idea of setting up publicly-owned broadband providers as part of a new delivery and funding model for BDUK:

Public sector owned supplier: Under this approach, an arms-length company, owned by one or more Implementing Bodies, would invest in, and provide, broadband infrastructure services to end customers through service contracts.

The thought of buying broadband services from an asset owned by local councils interests me greatly. How would it work? Who would pay whom? Council-owned ISPs are unlikely to be on the table, thank God; we're obviously talking about council-owned assets supplying private sector suppliers, in this case ISPs.

So could we see the emergence of a model similar to that used by local authorities for contracting out rubbish collection to the likes of Serco, which claims to have a £1.5bn order-book of rubbish?

Giving the idea of public-ownership of national assets some further thought, it then occurred to me that one could theoretically bring the network under government control.

Which would surely mean nationalising Openreach and bringing BT’s infrastructure arm into public ownership.

Can it be done? There is certainly precedent. Just consider the state of the country’s railways, the vocal and influential movement for re-nationalisation, the extremely successful temporary running of the East Coast mainline franchise by the government, and the recent news that Transport for London (TfL) would like to take over the running of some of London’s failing surface rail franchises.

Actually, broadband is a lot like the railways, and BT is (or was) a lot like British Rail. And when you really start looking for parallels, Openreach is a lot like Network Rail – both run the infrastructure over which other companies, such as TalkTalk or Great Western, run the traffic.

So yes, I’m sure a lot of purists and hardcore Corbynites would love to see Openreach brought into state hands, like Network Rail.

Yes, it could be done. Will it be? No.

For starters, it would require the state to compensate BT shareholders to the tune of a lot of money indeed.

Secondly, Network Rail is hardly a picture of success. Just listen to the Evening Standard’s Nick Goodway, who wrote on exactly this topic when defending BT after Grant Shapps’ ‘Broadbad’ report laid into the telco.

Goodway argued that ever since British Rail was privatised under John Major, its infrastructure successor – now Network Rail – has been responsible for a number of major failures, such as the 2007 Grayrigg train crash, and has sucked up millions of pounds of taxpayer money. It would be unwise, he contends, to go down that route a second time.

I can't say I disagree with him. Didn't it once take months on end to get the GPO to install a phone line? Given it still often seems to take Openreach a similar length of time, we hardly need the government getting involved. At least under the current model we have the illusion that Openreach isn't a monopoly.

So stand easy, though the idea is intriguing, nobody is going to be nationalising Openreach any time soon.

*Yes, yes, I know, but it’s a borderline accurate headline stat so we’re running with it.
**We all know they mean fibre-to-the-cabinet.
***Such as it is.


January 25, 2016  10:30 AM

Fudged broadband report damages case for independent Openreach

Alex Scroxton
MPs, Ofcom, Openreach

I wanted to put pen to paper today and explain a few things about the tone of our coverage of the British Infrastructure Group's 'Broadbad' report, which was released over the weekend of 23-24 January.

In the report, put together by the Conservatives' Grant Shapps, a group of over 100 MPs renewed calls for the forcible split of Openreach from BT, a question that is set to be decided upon very soon.

Some of our regular readers and commenters may feel that the story is critical not of BT, but of BT’s critics. In this case they would be right to feel that.

I have spent a long time deliberating over whether or not BT and Openreach should be forcibly prised apart by the regulator, because I genuinely think that on the whole, when it comes to the national broadband roll-out, BT has tried to make the best it could of a bad situation.

However, there is much to criticise – a blatant and appalling lack of ambition around fibre, the utter fiasco of the Devon and Somerset BDUK contracts, and the often shoddy treatment of community projects, to name just three. All this reflects badly on BT. There is no doubt regulatory change is needed and I hope it will happen.

In general I think there are strong arguments for splitting off Openreach, and regard BT’s occasional hints that it would hinder investment as smelling a bit like blackmail.

But obfuscation and misrepresentation are damaging to public discourse, and that is why our piece today on Computer Weekly openly discusses some of the criticisms made in the wake of the report's release.

For instance, the 'Broadbad' report has it that BT has taken £1.7bn of taxpayers' money, which is absolute nonsense. The figure of £1.7bn is the total amount of funding backing the BDUK scheme; this money has not all been spent yet and, following the award of some second-phase contracts to other suppliers, will not all go to BT.

It also makes similarly dubious claims about a universal service obligation of 10Mbps, and presents data that is close to a year out of date!

This excellent blog by a long-time BT critic expands in-depth on a number of the other faults and misrepresentations contained within the British Infrastructure Group’s report.

I cannot in good conscience tell Computer Weekly readers that this report is an accurate reflection of the facts surrounding the national broadband roll-out. If we are going to criticise BT, we have to get our act together!

The report was rushed, it was fudged, it poorly presents good arguments, and ultimately it damages the case for an independent Openreach.

