The Full Spectrum

June 17, 2016  11:14 AM

Total smartphone service: put public Wi-Fi in the mix

Alex Scroxton
Mobile, Wi-Fi

The recent announcement that UK regulator Ofcom plans to make additional spectrum available for Wi-Fi is great news for consumers, writes DeviceScape’s Dave Fraser. Because Wi-Fi operates in unlicensed spectrum, it is very much the people’s network; anyone with a broadband connection is able to get good quality wireless access to the content and services they love.

It’s the perfect complement to cellular services operating in licensed spectrum, not least because Wi-Fi has become established as the de facto access technology for indoor use. We spend most of our time and consume the vast majority of our smartphone data indoors, in spaces where it is often difficult to get a good quality cellular data signal.

Another uniquely populist characteristic of Wi-Fi is that it enables people and entities to share connectivity with one another. If you want a measure of the importance of Wi-Fi sharing, look no further than the fact that etiquette bible Debrett’s has expressed a position on it.

Shared Wi-Fi is most valuable to consumers in public indoor locations, because they visit many more of these than they do other people’s homes or offices. That it is generally available to all consumers, and not just those who subscribe to one particular operator, emphasises its egalitarian nature.

No surprise, then, that it has become a staple of the customer experience offered by numerous retail, service and hospitality businesses, from independent corner cafés right up to the UK’s largest retailers. These businesses understand their customers’ desire for unbroken connectivity and cater to it just as they cater to other basic needs.

So, if you were to design a connectivity service from scratch in 2016, based on an understanding of how consumers use their smartphones, shared public Wi-Fi would be a necessary component of that service.

The reality today, though, is that almost all smartphone connectivity services are legacies of a time before Wi-Fi and smartphones, when the primary functional requirement of a mobile phone was that it connected to the mobile network. Then, as now, a licence to operate a mobile network assured membership of a club whose exclusivity was guaranteed by the high costs of entry and operation, and was crucial to recouping those costs.

A companion legacy is the mobile sector’s inclination towards a divided, network-centric view of the world, in which carrier-grade cellular trumps the Wild West environment of Wi-Fi every time. It’s certainly well understood that there can be significant variations in QoS in the public Wi-Fi arena. And while it has phenomenal indoor coverage to its credit, the process of connecting to these Wi-Fi networks can be frustrating and off-putting for consumers.

But instead of viewing these issues as grounds to discriminate against public Wi-Fi, mobile operators should seize the opportunity to improve their customers’ experience of this hugely valuable, if sometimes variable, resource. The issues of quality and ease of access can be managed to ensure quality of experience across a service that combines Wi-Fi and cellular to play to both sets of strengths.

Meanwhile, you only have to look to the surge of enthusiasm among operators for Wi-Fi calling over the last two years for evidence (if your own experience is not enough) that cellular itself is often found wanting.

The truth is that consumers are reliant on both Wi-Fi and cellular, rendering the “competing networks” mentality outdated. In the smartphone era the ability to connect to the mobile network is simply one of a number of important underlying enablers required of the device and — judged in terms of time spent connected — it may not even be first among equals. What end users need is a service tailored to their movements and habits, and that has to be based upon a blend of cellular and Wi-Fi.

Caroline Gabriel, research director and co-founder at Rethink Technology Research, recently observed that: “The more ubiquitous and carrier-class public Wi-Fi becomes, the more the battle heats up between mobile operators and their non-cellular challengers, to see which can harness it more effectively.”

Gabriel was on the money. Recent moves by big cable players to integrate public Wi-Fi into their service offerings — as well as high profile forays into the space by Google — suggest that, openly at least, the battle is being brought by the non-cellular challengers. It is important to note that connectivity is not the product for these providers, it is simply the means by which they bring the product and the smartphone user together. They are free from any vested interest in one particular bearer.

What now remains to be seen is how mobile operators will respond. Will they embrace Wi-Fi — in particular public Wi-Fi — to enhance their customer experience, and increase the stickiness of their service? Or will they continue to abide by the tenets of network separation and cellular supremacy, managing the experience only where it relates to the mobile network?

Old habits are certainly hard to shake. But the greater portion of a customer’s connectivity experience any operator is able to manage, influence, and improve, the better chance they have of securing that customer’s loyalty. For this reason, as a more diverse field of competitors bid to own the customer, services based on a blend of cellular and Wi-Fi are taking root: If you want to serve the people, you need the people’s network.

Dave Fraser is CEO of DeviceScape

May 6, 2016  2:50 PM

Attack the government over broadband, but do it properly

Alex Scroxton

This morning I came close to running a story that would have claimed the government had u-turned on a pledge to provide superfast – 24Mbps and above – broadband to every home in the country by 2020. This was after stories appeared in both The Times and The Telegraph saying that the government had given up on the idea of automatically meeting the needs of the final 5%, those left out of the commercial and Broadband Delivery UK (BDUK) roll-outs.

These stories, and a number of others, were based on a month-and-a-half-old consultation document put out by the government on plans for a universal service obligation (USO) of 10Mbps, which I covered at the time.

In the consultation document, the government did indeed say that an additional broadband roll-out to the final 5% was not proportionate and would not represent value, because it was unlikely, on the evidence available, that every single one of them would want it. Hence the idea of a 10Mbps USO which, should it go ahead, those who want it will be able to request and receive.

The thing is, this is not a u-turn as such, because while the needs of the final 5% have been disgracefully neglected by the government, the document merely notes that given the cost to the taxpayer in reaching those remote places, the government believes it makes more sense to establish a request-based scheme to reach them.

Furthermore, it may be that eventual technological advances will bring down the cost of deployment and make a universal roll-out more cost-effective; we simply don’t know yet. Researchers at UCL reckon they’ve hit on a way to reduce the costs with a new type of optical receiver, so work is going on here.

And all this is without considering the money that BT is putting back into the BDUK roll-out as a result of hitting its take-up targets. It hopes this will extend the programme beyond 95%.

In essence, nothing has yet been decided, let alone u-turned upon.

Look, it is right that the government is held to account over the state of the rural broadband roll-out, and 10Mbps absolutely will not be sufficient. Actually, I think 10Mbps is laughable and David Cameron and Ed Vaizey should be ashamed of themselves for even considering something so unambitious.

This is an emotive issue, particularly for those who want but cannot receive a broadband connection, but I have always believed it does one’s cause no good at all to base arguments on provable inaccuracies.

March 2, 2016  10:51 AM

Why Europe is coming round to BT’s broadband thinking

Alex Scroxton
Broadband, FTTP, Ofcom, Openreach

by Ronan Kelly

BT has been in the UK headlines quite a lot these past few weeks. News of a rare service interruption aside, BT’s recent press cuttings file is par for the course. A little background chatter about potential regulatory tightening here; some more calls to accelerate faster and cheaper broadband services there…. Whatever they do for the UK’s broadband infrastructure never seems to be enough, and there’s a big problem with that narrative. It just isn’t true.

I’m not saying the media is manufacturing negative perceptions. Maybe it’s just the price you pay for having customers with high expectations and a reputation as one of the most respected telecom operators in the world.

No – BT isn’t religiously fanatical about FTTP

People who believe that BT is somehow opposed to fibre-to-the-premises (FTTP) should pay close attention to the new deal struck between Openreach and the UK’s Home Builders Federation to ensure superfast/ultrafast broadband connections to newly-built homes. The move is proof of BT’s willingness to take advantage of the broadband economics of greenfield sites, and support FTTP where it is commercially viable. Even more tellingly, it adds to the evidence of BT as arch-pragmatist; recognising the importance of being prudent with new technology when it comes to mass-market deployment.

Everybody knows that the UK lags behind the other G8 nations in FTTP penetration, and there are numerous historic and on-going reasons for this. I for one advocate FTTP everywhere (I wouldn’t last long on the board of the FTTH Council Europe if I didn’t) but even I’m pragmatic enough to appreciate that such progress takes time.

Being responsible and pragmatic about broadband evolution can mean faster – not slower – progress for subscribers

National competitiveness is a recurring theme in the UK broadband debate, and earlier this month a prominent manufacturing lobby group added its voice to calls for ‘better connectivity’ to support this aim.

Working towards rather than against this aim, BT’s position appears to be that – other than in those local cases where commercial viability gives the green light to immediate roll-out – progress toward the eventual goal of FTTP should be made through sustainable, incremental performance improvements. Such an approach, leveraging existing infrastructure where possible alongside innovative new technologies like G.fast, is infinitely preferable to telling subscribers they must tread water for the decade or more it could take an operator to deliver fully-fledged FTTP in their area.

Differing broadband views from abroad

Taking national competitiveness from a different perspective are those armchair experts who routinely like to contrast the broadband fortunes of the UK with those of its closest neighbour, France. France has experienced something of a broadband renaissance in the last 12 months, with government support and the action of incumbent operators driving an upswing in the deployment of FTTP.

What a lot of people don’t know is that – until very recently – the French regulator (ARCEP) had essentially outlawed the use of VDSL technology. Why does this matter? Well, in the glaring absence of an established VDSL estate, French operators are only now presented with the opportunity to build one from scratch and extend the utility of their copper networks. Creating one could bring benefits but would take perhaps three or four years to mature, by which time operators would be presiding over a 10-year-old technology. This makes it far more logical for France to put greater impetus behind faster FTTx penetration than the UK. It also hammers home the argument that more flexible and faster-to-market broadband options are more readily available to environments with advanced copper infrastructure.

Among the rest of Europe (with the notable exception of Spain, which has other obstacles in the way of fully leveraging its copper network, as well as higher urban concentrations of MTUs) network topologies and regulatory climates are all far more sympathetic to the BT thinking around broadband.

BT has played it smart with G.fast – and the world waits with bated breath

When G.fast was first introduced a couple of years ago, the most popular application proposed was Gigabit connectivity over very short loops. We’ve seen this vision realised to great effect in numerous markets and scenarios, but the truth is that this approach isn’t going to work for everyone.

What BT has done is pull those capabilities back to ‘sub-Gigabit’ levels to deliver highly competitive services over longer distances, with the net result delivering many times better performance than currently available over the same infrastructure. That’s far-sighted, innovative and – some might say – courageous. What’s more, it’s making an awful lot of other operators in other parts of the world start thinking differently about their journey to FTTP.

BT’s success has yet to be proven, and there are plenty of nay-sayers who’ll continue to kick up bad headlines until hard, long-term evidence proves them wrong.

BT doesn’t have all the answers, but I can’t really see any obstacles to them achieving what they’ve set out to achieve, and within the pretty aggressive timescales they’ve set themselves.

Ronan Kelly is CTO for EMEA and APAC at Adtran

February 3, 2016  3:28 PM

How DNS information can help cut millions from your infrastructure costs

Alex Scroxton
CIOs, Load balancing, Network capacity, Nominet, traffic

A guest post by Chris Griffiths, director of new products and business development at Nominet.

One of the hardest things about infrastructure planning in our web-enabled world is estimating capacity needs, especially from a network perspective. Having a piece of content go viral can mean the difference between having a functioning set of web infrastructure and a completely broken one. We’ve all seen what happens when web platforms get overloaded in the old media world – the launch of a new blockbuster stressing the online booking systems of a cinema chain, the release of tickets for a major festival causing the website of its ticketing provider to crash, a funny cat video propagating on email grinding a corporate network to a halt – and so on. In the world of social media and streaming 4K content, these sorts of phenomena spread even more rapidly, even more virally, and can have dramatic effects. As such, the enterprises, service providers and channel businesses delivering IT services need to get smarter about capacity planning to meet the traffic forecast.

The limitations of retrospective analysis

Typically, traffic reports for a service provider, content delivery network or large enterprise might come quarterly, or monthly at best. They will also include retrospective analysis showing the loads the infrastructure has been under over a period of time. In a given month, the network might have been under an average load of 20% but spiked for a day or two at 80-90%… this in itself would trigger warning signals, and perhaps suggest the need for capacity investments for the following year. Indeed, most infrastructure investments are booked at least a year in advance.

The cost of overcoming uncertainty with over-capacity

This in turn means that massive assumptions need to be made about possible needs in the year ahead. Yes, a 20% run rate means we’re fine for now, but that 90% spike means we might lack ‘burst’ resource in the event something goes viral or requires additional capacity for any reason. Typically, infrastructure managers will have over-capacity plans representing 50% or more resource than is required. This is down to the length of time it can take to get hardware out in the field; and, after all, the worst thing that can happen to an IT team in this context is to under-resource – the productivity cost, never mind the potential reputational impact, is just too dramatic. If you’re a large enterprise with business-critical services, a service provider delivering into sectors with high service level agreements, or a channel business providing hosted or managed services, you can’t afford the risk. It would critically damage your credibility with your customers. This is an industry-wide problem, as businesses and service providers find the resource costs of the continued capacity planning and deployments needed to stay ahead of demand are escalating year on year.
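To make the over-capacity arithmetic concrete, here is a toy calculation. The traffic figure, the 50% buffer and the per-Gbps cost are invented for illustration; they are not taken from any real operator:

```python
# Toy illustration of the over-provisioning economics described above.
# All figures are hypothetical.

peak_demand_gbps = 36      # worst observed spike, e.g. 90% load on a 40 Gbps link
headroom = 0.50            # typical buffer: 50% more resource than required
cost_per_gbps = 120_000    # assumed annual cost per provisioned Gbps (GBP)

provisioned = peak_demand_gbps * (1 + headroom)   # capacity actually bought
idle_at_peak = provisioned - peak_demand_gbps     # never used, even at the spike

print(f"Provisioned: {provisioned:.0f} Gbps at £{provisioned * cost_per_gbps:,.0f}/yr")
print(f"Idle even at peak: {idle_at_peak:.0f} Gbps, costing £{idle_at_peak * cost_per_gbps:,.0f}/yr")
```

On these made-up numbers, 18 of the 54 Gbps provisioned are never touched even during the worst spike, which is the sort of standing cost that better forecasting chips away at.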

Predicting future needs with real-time DNS insights

Real-time analysis of DNS requests, coupled with an exploration and analysis of historical traffic growth patterns via DNS data, can give much more granular assessment of what’s going on, allowing you to model growth much more effectively. Historically this wasn’t possible due to the volume of data in play, but modern, real-time DNS intelligence can play a critical role in capacity planning and in dynamic resourcing. After all, if the patterns of requests indicate a spike is imminent and it is flagged in real time, infrastructure teams can spin up temporary burst capacity, adjust load balancing, or otherwise refine their infrastructure provision to withstand the onslaught. And this can be done without quite the same scale of over-capacity investment… even a 10% improvement in capacity planning could translate to millions or tens of millions of savings across a year.
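As a sketch of how such real-time flagging might work, the snippet below watches a stream of per-minute DNS query counts and flags any minute that sits far outside recent history. The window size and threshold are illustrative assumptions; a production system would feed the flag into autoscaling or load-balancing logic rather than print it:

```python
# Minimal sketch of real-time spike flagging over a stream of per-minute
# DNS query counts. A rolling window gives a baseline mean and standard
# deviation; a minute whose count sits far above that baseline is flagged.
# WINDOW and THRESHOLD are illustrative assumptions, not tuned values.

from collections import deque
from statistics import mean, stdev

WINDOW = 60        # minutes of history to compare against
THRESHOLD = 3.0    # flag counts more than 3 standard deviations above the mean

def spike_detector(window=WINDOW, threshold=THRESHOLD):
    history = deque(maxlen=window)

    def observe(count):
        """Record one minute's query count; return True if it looks like a spike."""
        spike = False
        if len(history) >= 10:  # wait for some history before judging
            mu, sigma = mean(history), stdev(history)
            spike = sigma > 0 and (count - mu) / sigma > threshold
        history.append(count)
        return spike

    return observe

# Usage: steady traffic around 1,000 queries/min, then a viral surge.
observe = spike_detector()
quiet = [observe(1000 + (i % 7) * 10) for i in range(30)]
print(any(quiet))      # False – normal variation is not flagged
print(observe(5000))   # True – a sudden surge is flagged
```

The appeal of keeping the logic this cheap is that it can run inline on the resolver feed itself, with the heavier historical growth modelling done offline.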

As CTOs and network teams continue to be challenged to do more with less, even while digital and web technologies become more central to the operations of a business, the marginal improvements this kind of insight can give will be key to delivering a competitive edge. CIOs must be creative in their use of DNS data to deliver the kind of value that’s expected of them.

January 27, 2016  11:31 AM

No, nobody is going to nationalise Openreach

Alex Scroxton
Broadband, ISP, Ofcom, Openreach

It’s always been reasonably clear – provided you don’t work for BT of course – that the Broadband Delivery UK (BDUK) procurement and delivery model was not fit for purpose, lacked ambition, and hindered the work of small altnets and community projects.

So now that BDUK is pretty well-advanced, and we’re well on our way to 90%* fibre** coverage, it’s really welcome to see that the Department for Culture, Media and Sport (DCMS) is consulting on fresh approaches to BDUK as it approaches that 95% milestone, and the issue of touching those remaining premises, down those narrow lanes that seem to be getting longer and more rutted, or in those valleys that seem to be getting deeper and more isolated, can be ducked no longer.

You can read the consultation document here, and if you’re a stakeholder in any way I’d urge you to give this some time and take an hour or two in the next month to formulate some responses, because you genuinely can influence the future of BDUK here, and help take superfast*** broadband connectivity to everyone.

For me, one of the most intriguing points raised, and the one I have focused on in my story for Computer Weekly, was the idea of setting up publicly-owned broadband providers as part of a new delivery and funding model for BDUK:

Public sector owned supplier: Under this approach, an arms-length company, owned by one or more Implementing Bodies, would invest in, and provide, broadband infrastructure services to end customers through service contracts.

The thought of buying broadband services from an asset owned by local councils interests me greatly. How would it work? Who would pay whom? Council-owned ISPs are unlikely to be on the table, thank God; we’re obviously talking about council-owned assets supplying private sector suppliers, in this case ISPs.

So could we see the emergence of a model similar to that used by local authorities for contracting out rubbish collection to the likes of Serco, which claims to have a £1.5bn order-book of rubbish?

Giving the idea of public-ownership of national assets some further thought, it then occurred to me that one could theoretically bring the network under government control.

Which would surely mean nationalising Openreach and bringing BT’s infrastructure arm into public ownership.

Can it be done? There is certainly precedent. Just consider the state of the country’s railways, the vocal and influential movement for re-nationalisation, the extremely successful temporary running of the East Coast mainline franchise by the government, and the recent news that Transport for London (TfL) would like to take over the running of some of London’s failing surface rail franchises.

Actually, broadband is a lot like the railways, and BT is (or was) a lot like British Rail. And when you really start looking for parallels, Openreach is a lot like Network Rail – both run the infrastructure over which other companies, such as TalkTalk or Great Western, run the traffic.

So yes, I’m sure a lot of purists and hardcore Corbynites would love to see Openreach brought into state hands, like Network Rail.

Yes, it could be done. Will it be? No.

For starters, it would require the state to compensate BT shareholders to the tune of a lot of money indeed.

Secondly, Network Rail is hardly a picture of success. Just listen to the Evening Standard’s Nick Goodway, who wrote on exactly this topic when defending BT after Grant Shapps’ ‘Broadbad’ report laid into the telco.

Goodway argued that ever since British Rail was privatised under John Major, Network Rail has been responsible for a number of major failures – such as the 2007 Grayrigg train crash – and has sucked up millions of pounds of taxpayer money. It would be unwise, he contends, to go down that route a second time.

I can’t say I disagree with him. Didn’t it used to take months on end to get the GPO to install a phone line? Given it still often seems to take Openreach a similar length of time, we hardly need the government getting involved. At least under the current model we have the illusion Openreach isn’t a monopoly.

So stand easy, though the idea is intriguing, nobody is going to be nationalising Openreach any time soon.

*Yes, yes, I know, but it’s a borderline accurate headline stat so we’re running with it.
**We all know they mean fibre-to-the-cabinet.
***Such as it is.

January 25, 2016  10:30 AM

Fudged broadband report damages case for independent Openreach

Alex Scroxton
MPs, Ofcom, Openreach

I wanted to put pen to paper today and explain a few things about the tone of our coverage of the British Infrastructure Group’s ‘Broadbad’ report, which was released over the weekend of 23-24 January.

In the report, put together by the Conservatives’ Grant Shapps, a group of over 100 MPs renewed calls for the forcible split of Openreach from BT, which is set to be decided upon very soon now.

Some of our regular readers and commenters may feel that the story is critical not of BT, but of BT’s critics. In this case they would be right to feel that.

I have spent a long time deliberating over whether or not BT and Openreach should be forcibly prised apart by the regulator, because I genuinely think that on the whole, when it comes to the national broadband roll-out, BT has tried to make the best it could of a bad situation.

However, there is much to criticise – a blatant and appalling lack of ambition around fibre, the utter fiasco of the Devon and Somerset BDUK contracts, and the often shoddy treatment of community projects, to name just three. All this reflects badly on BT. There is no doubt regulatory change is needed and I hope it will happen.

In general I think there are strong arguments for splitting off Openreach, and regard BT’s occasional hints that it would hinder investment as smelling a bit like blackmail.

But obfuscation and misrepresentation is damaging to public discourse and that is why our piece today on Computer Weekly openly discusses some of the criticisms made in the wake of the report’s release.

For instance, the ‘Broadbad’ report has it that BT has taken £1.7bn of taxpayers’ money, which is absolute nonsense. The figure of £1.7bn is the total amount of funding backing the BDUK scheme; this money has not yet been spent and, following the award of some second-phase contracts to other suppliers, will not all go to BT.

It also makes similarly dubious claims about a universal service obligation of 10Mbps, and presents data that is close to a year out of date!

This excellent blog by a long-time BT critic expands in-depth on a number of the other faults and misrepresentations contained within the British Infrastructure Group’s report.

I cannot in good conscience tell Computer Weekly readers that this report is an accurate reflection of the facts surrounding the national broadband roll-out. If we are going to criticise BT, we have to get our act together!

The report was rushed, it was fudged, it poorly presents good arguments, and ultimately it damages the case for an independent Openreach.

January 18, 2016  1:30 PM

The government’s EC broadband consultation response is truly pathetic

Alex Scroxton
Broadband, European Commission, fibre, FTTP

Last Friday, just after 5 o’clock in the afternoon, the government published a series of responses to a number of European Commission (EC) consultations around the Digital Single Market.

Personally, I can’t imagine why a government as committed to openness and frank discussion as Prime Minister Cameron’s would want to put out important announcements just after the country’s journalists have headed off to the pub, but that’s just me. I’m sure he must have an excellent reason for it. Either that or he’s a fan of the West Wing.

Anyway, one of these consultations centred on the future needs around broadband speed and quality, issues of vital importance to the digital economy as regular readers will know.

The Commission sought input on the following points:

  • Network access regulation: The review will assess whether the regulatory objectives are still fit for purpose or whether they should be complemented with a stronger emphasis on availability and take-up of high-quality connectivity as a policy objective. It will ask whether the operators who are investing significant amounts of money in the very highest capacity networks need greater assurances of a long-term return on investment. The difficulty in relying on infrastructure competition to drive network investment in more rural areas points to a possible need to reassess the appropriate degree of complementarity between sector-specific access regulation and other measures which could enable efficient public intervention.
  • Spectrum management: to promote the deployment of high speed wireless networks and the further development of electronic communications and innovation, the review should focus on how greater consistency could be achieved by different means and through different levels of harmonisation or coordination (more efficient technical harmonisation; more convergent assignment conditions and timing to support investment);
  • Communication Services: to look at ways of updating sector-specific rules if they are still needed, while ensuring a level regulatory playing field for all players to the extent that they provide comparable services.
  • Universal service: the review will evaluate whether the current scope of mandatory services is consistent with market and technological developments. In particular, the role of broadband as part of universal service and its implications for the financing mechanism will have to be carefully assessed.
  • Institutional set-up and governance: this covers the need to enhance regulatory consistency across the Member States and to deliver convergent market outcomes while taking account of different local and national conditions. The review will explore more efficient and simpler arrangements for co-operation between regulators at EU and national level.

Our own government, however, decided it was going to set its own homework, and duly turned in the year-old results of a UK-specific consultation that formed the basis of a digital communications strategy first announced in March 2015.

We will now summarise the UK government’s contribution:

  • We used to have dial-up. Now, not so much.
  • The Internet of Things. That’s a thing.
  • Demand is growing.
  • Here are statistics from a friendly analyst that says this with numbers.
  • Nobody we bothered to ask cares about speed.
  • Everybody we bothered to ask has a vested interest in not delivering FTTP.
  • Moaners will get stuck with satellite.
  • Private investment good. Austerity good.

This is exactly the same thing they’ve been saying all along, and it brings nothing new to the conversation. And one more thing: even BT Openreach execs will agree that FTTP is the right solution if asked the right question – it’s true, I’ve seen them do it. No consensus my a***!

Frankly, it’s an embarrassment to the country, and the ministers responsible at BIS and DCMS should feel ashamed for trying to pass off last year’s science project as something new.

January 11, 2016  11:23 AM

Remembering David Bowie’s very own ISP, BowieNet

Alex Scroxton
BowieNet, innovation, Internet, ISP

As the tragic news of the death of rock icon David Bowie at the age of 69 reached us, we were reminded that, ever the talented and fearless innovator when it came to music, Bowie also had an eye for technological innovation.

And eighteen years ago, what better way was there to innovate than to set up one’s very own internet service provider (ISP)?

Few people seem to remember it now, but back in 1998, Bowie did indeed set up his very own ISP, BowieNet, the world’s first and, so far, probably only ISP ever to be run by a pop genius, unless Adele is working on something.

The ISP launched with an ambitious – for the time – webcast that featured performances from Ani DiFranco, the Jayhawks, Jesus and Mary Chain, Spacehog and the Specials, as well as highlights from Bowie’s 50th birthday bash at Madison Square Garden in New York.

It was to offer high-speed, “uncensored” internet access across the world and, naturally, an online community and exclusive content curated by Rolling Stone for fans, as well as access to Bowie himself through live chats and video feeds direct from the studio using FullView, a webcam service designed by Lucent’s Bell Labs.

“Initial applications call for the camera to be used for in-studio question-and-answer sessions with Bowie, as well as live ‘you are there’ rehearsal sessions with Bowie and his band,” said the press release, which incredibly is still available.

For $19.95 a month, users also got a CD-ROM with two classic live audio and video tracks never before released, their own customisable homepage with a generous 20MB allowance, and their own email address. The service supported both Internet Explorer and Netscape, at the time the powerhouse of browsers.

“The move would put Bowie near the front of the race to offer the kind of specialised, boutique access to the internet that is expected to challenge larger, broader ISPs,” opined an MTV journalist.

“I wanted to create an environment where not just my fans, but all music fans could be part of a single community where vast archives of music and information could be accessed, views stated and ideas exchanged,” said Bowie at the time.

All things considered, BowieNet – which came to the UK a few months later – had a good run of it, surviving until 2012, when the shutters were finally brought down on the service.

“Some say that it was a Mayan prophecy for 2012 and was forever thus. Others suggest it may have been deliberate sabotage by gremlins, finally successful after a sustained attack over many years,” said a statement on Bowie’s Facebook page at the time.

“And one bloke said: ‘Your big end’s gone, mate, the whole thing’s a write-off.’”

Rest in peace, David. You will be greatly missed by the Computer Weekly team.

January 11, 2016  10:04 AM

No surprise whatsoever as Openreach names new CEO

Alex Scroxton
Ofcom, Openreach

After BT Openreach lost Joe Garner to the Nationwide Building Society at a critical juncture in its history, as I have previously written, it is entirely unsurprising that BT should turn to Clive Selley, a man with insider knowledge, to take the wheel.

Selley is a safe pair of hands to steer Openreach through the choppy waters that lie ahead over the next couple of months, and an eminently sensible choice from a BT perspective.

My predictions for Clive’s tenure are as follows.

  • BT will close ranks around an insider and we will hear less candour from Openreach.
  • BT will continue to insist it is delivering measurable service improvements with convincing top line statistics that ignore the voices of those left dangling.
  • Actually, nothing will change.

Oh, and …

  • If Ofcom forces the separation of BT and Openreach next month, Mr Selley will quit faster than you can say Gigaclear.

December 2, 2015  3:08 PM

Will satellite help us escape the dreaded blackspots?

Alex Scroxton
4G, Broadband, Internet access, ViaSat, Wi-Fi

A guest blog by Neil Fraser, communications and information provider and leader at ViaSat UK

In March’s budget, George Osborne restated the government’s drive to push high-speed broadband across the UK, including testing satellite connections for remote communities and pledging ultra-fast, 100Mbps broadband to nearly every home in Britain.

However, home broadband is only one part of the equation. Last December, the government pledged mobile phone coverage for 85% of the UK’s land mass by 2017 in a £5 billion deal. While this will be more than the 69% currently covered, it doesn’t account for signal blackspots in areas that should be more than adequately serviced.

Even in London, a journey from the centre of town to the M25 can be accompanied by a wavering or even dropped signal and a consequent inability to use data services. For a nation aiming to sell itself to the world on the strength of its technology and communications, and with 4G and beyond a key part of that platform, the inability to guarantee high speed mobile broadband in the capital itself does not inspire confidence.

Who is missing out?

There are several implications of these blackspots. For the consumer, they mean interruptions to increasingly demanding online services. While five years ago, a mobile phone might have been used to check email on the move, the advent of 4G means that video and other data-intensive applications are an expected part of mobile life. If these expectations are disrupted for any reason, then confidence in mobile services, and the willingness to pay for higher speeds and more data, will fall.

For the economy, the fact that a high-speed service will be limited to only certain locations, whether the home, office or coffee shop, can be a barrier to investment. For instance, a business looking to locate within the UK will most likely make the level of connectivity available a key part of its decision process, meaning areas plagued by blackspots will regularly lose out compared to their more connected competition.

For a government that has stated the importance of investment in the regions beyond London, and is pushing the Digital By Default agenda for services to be always available online, ensuring that these blackspots are eliminated simply makes sound business sense.

Furthermore, the importance of removing these blackspots will increase as emergency services review alternatives to the existing Airwave national public safety network. If services do move from the existing Tetra-based network to a system such as cellular which provides greater access to broadband data, any blackspots will prove not just inconvenient and costly, but also dangerous.

Filling the gaps

The continued existence of blackspots is largely down to a very simple issue: not enough cellular base stations. Whilst increasing the capacity of these stations to provide 4G and beyond is one thing, increasing their range is entirely another. Similarly, placing base stations in every conceivable location to ensure continuous coverage is often impossible due to issues of cost and access. There is also the issue of what happens if a base station is damaged or otherwise inoperable, at which point a new, albeit temporary, blackspot is created.

To avoid this, other technologies must be used to supplement the existing signal and ensure that users enjoy consistent and uninterrupted speeds. For instance, Wi-Fi on the London Underground network has allowed mobile users to stay in touch both above and below the surface, regardless of local blackspots. This alone does not address every issue; anyone using the tube can testify that there will be uncomfortable periods of switching between cellular coverage and local Wi-Fi when passing between coverage areas. However, it does provide the start of more comprehensive coverage.

Another way to cover off coverage blackspots is with satellite broadband. Once seen as a choice of last resort for users too remote to use any other form of communication, satellite has come on in leaps and bounds. The advent of higher speed Ka-band satellites, such as Eutelsat’s over Europe, has increased network capacity from single digits to hundreds of gigabits per second, meaning data bandwidth for services is measured in tens of megabits per second rather than kilobits. Thanks to this, a satellite service can now guarantee a high-speed connection to any area within its coverage, eliminating the last few blackspots. At the same time, receivers are constantly shrinking in size, and can be placed in blackspot areas at little cost or direct impact in order to broadcast a signal.
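The jump from network capacity in gigabits to per-user rates in megabits follows from simple sharing arithmetic. As a rough sketch (all figures here are illustrative assumptions, not any operator’s actual numbers), suppose a Ka-band satellite offers 100Gbps of total capacity, shared among subscribers at a typical broadband contention ratio:

```python
# Back-of-envelope sketch of shared satellite broadband capacity.
# All figures are illustrative assumptions, not real network data.

def per_user_mbps(total_capacity_gbps, subscribers, contention_ratio):
    """Approximate per-user rate when total capacity is shared, assuming
    roughly 1 in contention_ratio subscribers is active at peak time."""
    total_mbps = total_capacity_gbps * 1000
    concurrent_users = subscribers / contention_ratio
    return total_mbps / concurrent_users

# e.g. 100 Gbps shared by 100,000 subscribers at 20:1 contention
rate = per_user_mbps(100, 100_000, 20)
print(f"{rate:.0f} Mbps per active user")  # prints "20 Mbps per active user"
```

On these assumed figures each active user sees a rate in the tens of megabits per second, which is the order of magnitude the newer Ka-band services are credited with above.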

What this means is that the image of satellite as an expensive, slow service that needs a vast antenna to be of use is woefully out of date. Indeed, costs to provide high-speed satellite broadband are now comparable to other high-speed services, with the added benefit of removing access restrictions.

Mixing technologies

This isn’t to say that cellular should completely give way to satellite or Wi-Fi. Essentially, there is currently no single technology that can guarantee the complete, consistent, high-speed and cost-effective mobile coverage that the 21st century demands. Instead, a combination of technologies will help ensure that all potential areas are covered and that there is redundancy in case of one or more parts of the system failing.

Using a mix of technologies means that, whether in the Orkneys or Oxford Street, mobile users will have the high-speed connection they demand.
