A guest post by Chris Griffiths, director of new products and business development at Nominet.
One of the hardest things about infrastructure planning in our web-enabled world is estimating capacity needs, especially from a network perspective. Having a piece of content go viral can mean the difference between a functioning set of web infrastructure and a completely broken one. We’ve all seen what happens when the web platforms of old-media businesses get overloaded – the launch of a new blockbuster stressing the online booking systems of a cinema chain, the release of tickets for a major festival crashing the website of its ticketing provider, a funny cat video propagating by email grinding a corporate network to a halt, and so on. In the world of social media and streaming 4K content, these sorts of phenomena spread even more rapidly, even more virally, and can have dramatic effects. As such, the enterprises, service providers and channel businesses delivering IT services need to get smarter about capacity planning to meet the traffic forecast.
The limitations of retrospective analysis
Typically, traffic reports for a service provider, content delivery network or large enterprise might come quarterly, or monthly at best. These reports offer retrospective analysis showing the loads the infrastructure has been under over a period of time. In a given month, the network might have been under an average load of 20% but spiked for a day or two at 80-90% – this in itself would trigger warning signals, and perhaps suggest the need for capacity investments for the following year. Indeed, most infrastructure investments are booked at least a year in advance.
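As a back-of-the-envelope illustration – with entirely made-up figures – the kind of summary such a retrospective report boils down to can be sketched in a few lines:

```python
# Illustrative sketch (hypothetical numbers): summarising a month of daily
# utilisation samples the way a retrospective report would, surfacing the
# average load and the spike days that trigger capacity-planning warnings.

def summarise_utilisation(samples, warn_threshold=0.8):
    """Return average load, peak load, and the indices of samples at or above threshold."""
    average = sum(samples) / len(samples)
    peak = max(samples)
    warnings = [i for i, load in enumerate(samples) if load >= warn_threshold]
    return average, peak, warnings

# Daily average loads for a hypothetical 30-day month: mostly ~20%, two spike days.
month = [0.2] * 28 + [0.85, 0.9]
avg, peak, warn_days = summarise_utilisation(month)
print(f"average {avg:.1%}, peak {peak:.1%}, warning days: {warn_days}")
```

Note how the monthly average stays comfortably under 25% even while two days breach the warning threshold – which is exactly why a single averaged figure in a quarterly report hides so much.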
The cost of overcoming uncertainty with over-capacity
This in turn means that massive assumptions need to be made about possible needs in the year ahead. Yes, a 20% run rate means we’re fine for now, but that 90% spike means we might lack ‘burst’ resource in the event something goes viral or requires additional capacity for any reason. Typically, infrastructure managers will plan for over-capacity of 50% or more beyond what is actually required. This is down to the length of time it can take to get hardware out in the field; and, after all, the worst thing that can happen for an IT team in this context is to under-resource – the productivity cost, never mind the potential reputational impact, is just too dramatic. If you’re a large enterprise with business-critical services, a service provider delivering into sectors with high service level agreements, or a channel business providing hosted or managed services, you can’t afford the risk. It would critically damage your credibility with your customers. This is an industry-wide problem, as businesses and service providers find the resource costs of the continued capacity planning and deployments needed to stay ahead of demand are escalating year on year.
Predicting future needs with real-time DNS insights
Real-time analysis of DNS requests, coupled with exploration and analysis of historical traffic growth patterns via DNS data, can give a much more granular assessment of what’s going on, allowing you to model growth much more effectively. Historically this wasn’t possible due to the volume of data in play, but modern, real-time DNS intelligence can play a critical role in capacity planning and in dynamic resourcing. After all, if the pattern of requests indicates a spike is imminent and it is flagged in real time, infrastructure teams can spin up temporary burst capacity, adjust load balancing, or otherwise refine their infrastructure provision to withstand the onslaught. And this can be done without quite the same scale of over-capacity investment – even a 10% improvement in capacity planning could translate to millions, or tens of millions, of pounds in savings across a year.
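As an illustrative sketch only – not a real DNS analytics product, and with invented window sizes and thresholds – a spike detector of the kind described might compare the short-term request rate against a longer-term rolling baseline:

```python
# Hypothetical sketch: flag an imminent traffic spike when the short-term DNS
# request rate runs well ahead of the longer-term baseline - the kind of
# real-time signal that could trigger temporary burst capacity.

from collections import deque

def make_spike_detector(baseline_window=60, recent_window=5, ratio=3.0):
    """Return a callable fed one requests-per-second sample at a time."""
    history = deque(maxlen=baseline_window)  # rolling baseline of recent samples

    def observe(rps):
        history.append(rps)
        if len(history) < baseline_window:
            return False  # not enough data for a stable baseline yet
        recent = list(history)[-recent_window:]
        baseline = sum(history) / len(history)
        # Spike: recent average rate far exceeds the rolling baseline.
        return (sum(recent) / recent_window) > ratio * baseline

    return observe

detect = make_spike_detector(baseline_window=10, recent_window=2, ratio=2.0)
readings = [100] * 9 + [900, 1200]  # steady traffic, then a sudden surge
flags = [detect(r) for r in readings]
```

In this toy run the detector stays quiet through the steady readings and fires on the surge; a production system would of course work over far larger request volumes and tune the windows empirically.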
As CTOs and network teams continue to be challenged to do more with less, even while digital and web technologies become more central to the operations of a business, the marginal improvements this kind of insight can give will be key to delivering a competitive edge. CIOs must be creative in their use of DNS data to deliver the kind of value that’s expected of them.
It’s always been reasonably clear – provided you don’t work for BT of course – that the Broadband Delivery UK (BDUK) procurement and delivery model was not fit for purpose, lacked ambition, and hindered the work of small altnets and community projects.
So now that BDUK is pretty well-advanced, and we’re well on our way to 90%* fibre** coverage, it’s really welcome to see that the Department for Culture, Media and Sport (DCMS) is consulting on fresh approaches to BDUK as it approaches the 95% milestone. The issue of reaching those remaining premises – down those narrow lanes that seem to be getting longer and more rutted, or in those valleys that seem to be getting deeper and more isolated – can be ducked no longer.
You can read the consultation document here, and if you’re a stakeholder in any way I’d urge you to take an hour or two in the next month to formulate some responses, because you genuinely can influence the future of BDUK here, and help take superfast*** broadband connectivity to everyone.
For me, one of the most intriguing points raised, and the one I have focused on in my story for Computer Weekly, was the idea of setting up publicly-owned broadband providers as part of a new delivery and funding model for BDUK:
Public sector owned supplier: Under this approach, an arms-length company, owned by one or more Implementing Bodies, would invest in, and provide, broadband infrastructure services to end customers through service contracts.
The thought of buying broadband services from an asset owned by local councils interests me greatly. How would it work? Who would pay whom? Council-owned ISPs are unlikely to be on the table, thank God; we’re obviously talking about council-owned assets supplying private sector suppliers, in this case ISPs.
So could we see the emergence of a model similar to that used by local authorities for contracting out rubbish collection to the likes of Serco, which claims to have a £1.5bn order-book of rubbish?
Giving the idea of public-ownership of national assets some further thought, it then occurred to me that one could theoretically bring the network under government control.
Which would surely mean nationalising Openreach and bringing BT’s infrastructure arm into public ownership.
Can it be done? There is certainly precedent. Just consider the state of the country’s railways, the vocal and influential movement for re-nationalisation, the extremely successful temporary running of the East Coast mainline franchise by the government, and the recent news that Transport for London (TfL) would like to take over the running of some of London’s failing surface rail franchises.
Actually, broadband is a lot like the railways, and BT is (or was) a lot like British Rail. And when you really start looking for parallels, Openreach is a lot like Network Rail – both run the infrastructure over which other companies, such as TalkTalk or Great Western, run the traffic.
So yes, I’m sure a lot of purists and hardcore Corbynites would love to see Openreach brought into state hands, like Network Rail.
Yes, it could be done. Will it be? No.
For starters, it would require the state to compensate BT shareholders to the tune of a lot of money indeed.
Secondly, Network Rail is hardly a picture of success. Just listen to the Evening Standard’s Nick Goodway, who wrote on exactly this topic when defending BT after Grant Shapps’ ‘Broadbad’ report laid into the telco.
Goodway argued that ever since British Rail was privatised under John Major, its infrastructure successors – Railtrack and then Network Rail – have been responsible for a number of major failures, such as the 2007 Grayrigg train crash, and have sucked up millions of pounds of taxpayer money. It would be unwise, he contends, to go down that route a second time.
I can’t say I disagree with him. Didn’t it use to take months on end to get the GPO to install a phone line? Given it still often seems to take Openreach a similar length of time, we hardly need the government getting involved. At least under the current model we have the illusion Openreach isn’t a monopoly.
So stand easy, though the idea is intriguing, nobody is going to be nationalising Openreach any time soon.
*Yes, yes, I know, but it’s a borderline accurate headline stat so we’re running with it.
**We all know they mean fibre-to-the-cabinet.
***Such as it is.
I wanted to put pen to paper today and explain a few things about the tone of our coverage of the British Infrastructure Group’s ‘Broadbad’ report, which was released over the weekend of 23-24 January.
In the report, put together by the Conservatives’ Grant Shapps, a group of over 100 MPs renewed calls for the forcible separation of Openreach from BT, a question that is set to be decided very soon.
Some of our regular readers and commenters may feel that the story is critical not of BT, but of BT’s critics. In this case they would be right to feel that.
I have spent a long time deliberating over whether or not BT and Openreach should be forcibly prised apart by the regulator, because I genuinely think that on the whole, when it comes to the national broadband roll-out, BT has tried to make the best it could of a bad situation.
However, there is much to criticise – a blatant and appalling lack of ambition around fibre, the utter fiasco of the Devon and Somerset BDUK contracts, and the often shoddy treatment of community projects, to name just three. All this reflects badly on BT. There is no doubt regulatory change is needed and I hope it will happen.
In general I think there are strong arguments for splitting off Openreach, and regard BT’s occasional hints that a split would hinder investment as smelling a bit like blackmail.
But obfuscation and misrepresentation are damaging to public discourse, and that is why our piece today on Computer Weekly openly discusses some of the criticisms made in the wake of the report’s release.
For instance, the ‘Broadbad’ report has it that BT has taken £1.7bn of taxpayers’ money, which is absolute nonsense. The figure of £1.7bn is the total amount of funding backing the BDUK scheme; this money has not all been spent yet and, following the award of some second-phase contracts to other suppliers, will not all go to BT.
It also makes similarly dubious claims about a universal service obligation of 10Mbps, and presents data that is close to a year out of date!
This excellent blog by a long-time BT critic expands in-depth on a number of the other faults and misrepresentations contained within the British Infrastructure Group’s report.
I cannot in good conscience tell Computer Weekly readers that this report is an accurate reflection of the facts surrounding the national broadband roll-out. If we are going to criticise BT, we have to get our act together!
The report was rushed, it was fudged, it poorly presents good arguments, and ultimately it damages the case for an independent Openreach.
Last Friday, just after 5 o’clock in the afternoon, the government published a series of responses to a number of European Commission (EC) consultations around the Digital Single Market.
Personally, I can’t imagine why a government as committed to openness and frank discussion as Prime Minister Cameron’s would want to put out important announcements just after the country’s journalists have headed off to the pub, but that’s just me. I’m sure he must have an excellent reason for it. Either that or he’s a fan of the West Wing.
Anyway, one of these consultations centred on future needs around broadband speed and quality, issues of vital importance to the digital economy as regular readers will know.
The Commission sought input on the following points:
- Network access regulation: The review will assess whether the regulatory objectives are still fit for purpose or whether they should be complemented with a stronger emphasis on availability and take-up of high-quality connectivity as a policy objective. It will ask whether the operators who are investing significant amounts of money in the very highest capacity networks need greater assurances of a long-term return on investment. The difficulty in relying on infrastructure competition to drive network investment in more rural areas points to a possible need to reassess the appropriate degree of complementarity between sector-specific access regulation and other measures which could enable efficient public intervention.
- Spectrum management: to promote the deployment of high speed wireless networks and the further development of electronic communications and innovation, the review should focus on how greater consistency could be achieved by different means and through different levels of harmonisation or coordination (more efficient technical harmonisation; more convergent assignment conditions and timing to support investment).
- Communication Services: to look at ways of updating sector-specific rules if they are still needed, while ensuring a level regulatory playing field for all players to the extent that they provide comparable services.
- Universal service: the review will evaluate whether the current scope of mandatory services is consistent with market and technological developments. In particular, the role of broadband as part of universal service and its implications for the financing mechanism will have to be carefully assessed.
- Institutional set-up and governance: this covers the need to enhance regulatory consistency across the Member States and to deliver convergent market outcomes while taking account of different local and national conditions. The review will explore more efficient and simpler arrangements for co-operation between regulators at EU and national level.
Our own government, however, decided it was going to set its own homework, and duly turned in the year-old results of a UK-specific consultation that formed the basis of a digital communications strategy first announced in March 2015.
We will now summarise the UK government’s contribution:
- We used to have dial-up. Now, not so much.
- The Internet of Things. That’s a thing.
- Demand is growing.
- Here are statistics from a friendly analyst that says this with numbers.
- Nobody we bothered to ask cares about speed.
- Everybody we bothered to ask has a vested interest in not delivering FTTP.
- Moaners will get stuck with satellite.
- Private investment good. Austerity good.
Which is exactly the same thing they’ve been saying all along, and brings nothing new to the conversation. One more thing: even BT Openreach execs will agree that FTTP is the right solution if asked the right question – it’s true, I’ve seen them do it. No consensus my a***!
Frankly, it’s an embarrassment to the country, and the ministers responsible at BIS and DCMS should feel ashamed for trying to pass off last year’s science project as something new.
As the tragic news reached us of the death of rock icon David Bowie at the age of 69, we were reminded that, ever the talented and fearless innovator when it came to music, Bowie also had an eye for technological innovation.
And eighteen years ago, what better way was there to innovate than to set up one’s very own internet service provider (ISP)?
Few people seem to remember it now, but back in 1998, Bowie did indeed set up his very own ISP, BowieNet – the world’s first and, so far, probably only ISP ever to be run by a pop genius, unless Adele is working on something.
The ISP launched with an ambitious – for the time – webcast that featured performances from Ani DiFranco, the Jayhawks, Jesus and Mary Chain, Spacehog and the Specials, as well as highlights from Bowie’s 50th birthday bash at Madison Square Garden in New York.
It was to offer high-speed internet access across the world, offering “uncensored” internet access and naturally, an online community and exclusive content curated by Rolling Stone for fans, as well as access to Bowie himself through live chats and video feeds direct from the studio using FullView, a webcam service designed by Lucent’s Bell Labs.
“Initial applications call for the camera to be used for in-studio question-and-answer sessions with Bowie, as well as live ‘you are there’ rehearsal sessions with Bowie and his band,” said the press release, which incredibly is still available.
For $19.95 a month, users also got a CD-ROM with two classic live audio and video tracks never before released, their own customisable homepage with a generous 20MB allowance, and a firstname.lastname@example.org email address. The service supported both Internet Explorer and Netscape, at the time the powerhouse of browsers.
“The move would put Bowie near the front of the race to offer the kind of specialised, boutique access to the internet that is expected to challenge larger, broader ISPs,” opined an MTV journalist.
“I wanted to create an environment where not just my fans, but all music fans could be part of a single community where vast archives of music and information could be accessed, views stated and ideas exchanged,” said Bowie at the time.
All things considered, BowieNet – which came to the UK a few months later – had a good run of it, surviving until 2012, when the shutters were finally brought down on the service.
“And one bloke said: ‘Your big end’s gone, mate, the whole thing’s a write-off.’”
Rest in peace, David. You will be greatly missed by the Computer Weekly team.
After BT Openreach lost Joe Garner to the Nationwide Building Society at a critical juncture in its history, as I have previously written, it is entirely unsurprising that BT should turn to Clive Selley, a man with insider knowledge, to take the wheel.
Selley is a safe pair of hands to steer Openreach through the choppy waters that lie ahead over the next couple of months, and an eminently sensible choice from a BT perspective.
My predictions for Clive’s tenure are as follows.
- BT will close ranks around an insider and we will hear less candour from Openreach.
- BT will continue to insist it is delivering measurable service improvements with convincing top line statistics that ignore the voices of those left dangling.
- Actually, nothing will change.
Oh, and …
- If Ofcom forces the separation of BT and Openreach next month, Mr Selley will quit faster than you can say Gigaclear.
A guest blog by Neil Fraser, communications and information provider and leader at ViaSat UK
In March’s budget, George Osborne restated the government’s drive to push high-speed broadband across the UK: including testing satellite connections for remote communities and pledging ultra-fast, 100Mbps broadband to nearly every home in Britain.
However, broadband home internet is only one part of the equation. Last December, the government pledged mobile phone coverage for 85% of the UK’s land mass by 2017 in a £5 billion deal. While this will be more than the 69% currently covered, it doesn’t account for signal blackspots in areas that should be more than adequately serviced.
Even in London, a journey from the centre of town to the M25 can be accompanied by a wavering or even dropped signal and a consequent inability to use data services. For a nation aiming to sell itself to the world on the strength of its technology and communications, and with 4G and beyond a key part of that platform, the inability to guarantee high speed mobile broadband in the capital itself does not inspire confidence.
Who is missing out?
There are several implications of these blackspots. For the consumer, they mean interruptions to increasingly demanding online services. While five years ago, a mobile phone might have been used to check email on the move, the advent of 4G means that video and other data-intensive applications are an expected part of mobile life. If these expectations are disrupted for any reason, then confidence in mobile services, and the willingness to pay for higher speeds and more data, will fall.
For the economy, the fact that a high-speed service will be limited to only certain locations, whether the home, office or coffee shop, can be a barrier to investment. For instance, a business looking to locate within the UK will most likely make the level of connectivity available a key part of its decision process, meaning areas plagued by blackspots will regularly lose out compared to their more connected competition.
For a government that has stated the importance of investment in the regions beyond London, and is pushing the Digital By Default agenda for services to be always available online, ensuring that these spots are eliminated simply makes sound business sense.
Furthermore, the importance of removing these blackspots will increase as emergency services review alternatives to the existing Airwave national public safety network. If services do move from the existing Tetra-based network to a system such as cellular which provides greater access to broadband data, any blackspots will prove not just inconvenient and costly, but also dangerous.
Filling the gaps
The continued existence of blackspots is largely down to a very simple issue: not enough cellular base stations. Whilst increasing the capacity of these stations to provide 4G and beyond is one thing, increasing their range is entirely another. Similarly, placing base stations in every conceivable location to ensure continuous coverage is often impossible due to issues of cost and access. There is also the issue of what happens if a base station is damaged or otherwise inoperable, at which point a new, albeit temporary, blackspot is created.
To avoid this, other technologies must be used to supplement the existing signal and ensure that users enjoy consistent and uninterrupted speeds. For instance, Wi-Fi on the London Underground network has allowed mobile users to stay in touch both above and below the surface, regardless of local blackspots. This alone does not address every issue; anyone using the tube can testify that there are uncomfortable periods of switching between cellular coverage and local Wi-Fi when moving between coverage areas. However, it does help to provide the start of more comprehensive coverage.
Another way to cover coverage blackspots is with satellite broadband. Once seen as a choice of last resort for users too remote to use any other form of communication, satellite has come on in leaps and bounds. The advent of higher-speed Ka-band satellites, such as Eutelsat’s over Europe, has increased network capacity from single digits to hundreds of gigabits per second, meaning data bandwidth for services is measured in tens of megabits per second, rather than kilobits. Thanks to this, a satellite service can now guarantee a high-speed connection to any area within its coverage, eliminating the last few blackspots. At the same time, receivers are constantly shrinking in size, and can be placed in blackspot areas at little cost or direct impact in order to broadcast a signal.
What this means is that the image of satellite as an expensive, slow service that needs a vast antenna to be of use is woefully out of date. Indeed, costs to provide high-speed satellite broadband are now comparable to other high-speed services, with the added benefit of removing access restrictions.
This isn’t to say that cellular should completely give way to satellite or Wi-Fi. Essentially, there is currently no single technology that can guarantee the complete, consistent, high-speed and cost-effective mobile coverage that the 21st century demands. Instead, a combination of technologies will help ensure that all potential areas are covered and that there is redundancy in case of one or more parts of the system failing.
Using a mix of technologies means that, whether in the Orkneys or Oxford Street, mobile users will have the high-speed connection they demand.
When I saw the email from BT’s press office telling me that Joe Garner was leaving Openreach to become CEO of Nationwide, I nearly hit the roof, and I expect Gavin Patterson did as well.
Joe Garner is an enormously personable, open and friendly guy with whom I’ve got on well on the handful of occasions we’ve met, and to be fair he’s done a fantastic job at turning the Openreach ship – such as it is – around but I have to ask the question, why now?
Openreach is facing a rough few months. Depending on how the Ofcom review falls out, we may be heading into a time of legal uncertainty over the organisation’s future as part of the BT Group. I have not yet made up my mind if I would prefer to see it separated from BT or not, but the weight of opinion in favour of a split would seem to suggest that Ofcom will take the possibility very seriously indeed.
Furthermore, the national broadband network is a critical element in building the future digital economy of the UK. It needs ambition, commitment, and a relentless focus on bringing the very best possible solution to the greatest number of people. Candidly, we deserve and should expect long-term strategic leadership on this.
My hope for Openreach would be for the appointment of a successor who is ready to lead the organisation through a potentially tumultuous few years.
I am fed up with Virgin Media.
This morning they announced a £3bn investment to bring fibre broadband to 4 million new homes.
But every day Virgin Media is neglecting its existing customers.
For a little while now, Virgin Media has been telling me I will be upgraded for free* to ‘supercharged’ 50Mbps speeds by the end of February 2015. Today, on the day of Virgin Media’s big announcement, I logged in to find that this has slipped to ‘July to December 2015’. Suitably vague.
I imagine their oh-so-amusingly named vans (stop doing that Virgin, it’s just awful when brands try to be cute) are going to be driving round some new neighbourhoods with new customers!
Since joining Virgin Media in 2010 my prices have risen by an average of 9% a year, substantially above the CPI average of under 3% a year.
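To see what that gap compounds to over a contract’s life, here is a back-of-the-envelope sketch using a hypothetical £30 monthly bill (the 9% and 3% rates are from the figures above; the starting bill is invented for illustration):

```python
# Back-of-the-envelope compounding, with an invented starting bill:
# what a 9% annual rise does over five years versus roughly 3% inflation.

def compound(start, rate, years):
    """Value of `start` after `years` annual increases at `rate`."""
    return start * (1 + rate) ** years

bill_2010 = 30.0  # hypothetical monthly bill in pounds
at_9pc = compound(bill_2010, 0.09, 5)   # price rises at 9% a year
at_cpi = compound(bill_2010, 0.03, 5)   # the same bill tracking ~3% CPI
print(f"after 5 years: £{at_9pc:.2f} at 9% vs £{at_cpi:.2f} at CPI")
```

On those assumed numbers the 9% path lands more than £11 a month above the CPI path after just five years, which is the quiet power of compounding price hikes.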
All this time, an ongoing ‘overutilisation fault’ that never seems to be fixed means my broadband service has degraded.
Overutilisation. Hang on. Sounds like that means Virgin Media has too many customers on its network! But it wants 4 million new ones? Hang on. That doesn’t compute…
And all this on the day Virgin Media reveals a £3bn investment to extend the network to 4 million new households.
Well, that certainly explains why increased prices for existing customers haven’t actually made the existing service any better.
*With a 9% annual price hike it isn’t actually free is it?
The FCC wants to change the actual definition of broadband from 4Mbps to 25Mbps, which would leave millions of Americans who thought they had broadband with, well, not dial-up obviously, but not quite broadband, either.
The big hope is that this move will make the US broadband market more competitive, and force the big providers to improve their average speeds, for which read, put fibre in the ground.
Wouldn’t it be nice if Ofcom would consider following suit?
Ofcom’s new chief exec Sharon White will formally take over at the comms regulator very soon. Forcing the issue of broadband speed early on would be a good way for her to demonstrate that Ofcom means business.
Oh, who am I kidding? Upping the UK definition of basic broadband from 2Mbps to 24Mbps – which is conveniently BDUK’s current target for superfast – would upset BT and make it harder for it to dismiss complaints over speed (and lack of fibre).
We can’t have that, can we?