The Full Spectrum


November 11, 2016  2:45 PM

Advanced public Wi-Fi is key, but security is imperative

Alex Scroxton
Security, Wi-Fi, Wireless

In this guest blog by Cloud4Wi’s Jeff Abramowitz, we explore the need to balance public Wi-Fi network installations with advanced security measures to protect users.

Connecting to the internet in public spaces is second nature to most of us. Whether it’s in stores, restaurants, shopping centres, or even railway stations, we take to our mobile devices to compare prices, share our latest purchases via social media, find promotions, or just to pass the time.

The challenge that tends to confront us, of course, is which Wi-Fi network we’ll be able to access with greatest ease. Our phones instantly show us the choice of available networks, and research shows that people increasingly prioritise speed over safety, with more than 70% ranking connection speed above the security of the connection.

Security experts recently reported that two-thirds of people don’t know whether the public Wi-Fi they are using is secure, despite 80% understanding the dangers of sharing private data over unsecured networks. But in the event that data from a phone is compromised, how many organisations would genuinely accept responsibility for the breach?

The reality is that the public Wi-Fi provider – be it retailer, hotelier, restaurateur, etc. – is the de facto internet security guard and will inevitably assume the role of villain if anything goes wrong while people are using its network to get connected.

Unfortunately, many of these organisations simply do not have sufficient security measures in place, leaving the public exposed to the possibility of device infection, data theft, unauthorised data sharing, or even financial loss.

Wi-Fi education is vital

Furthermore, those that do have adequate security and compliance measures in place don’t always do enough to promote safe public Wi-Fi usage to their customers. Even if a customer unwittingly compromises their own data, the owner of the Wi-Fi network may still be blamed for the incident.

High street brands offering Wi-Fi need to do more to promote secure usage to their customers. Having the capacity to provide guests with Wi-Fi is important, but without appropriate security measures in place, businesses may instead harm their reputation, brand recognition and, ultimately, revenue.

End users need clear instructions on how to log on to the secured Wi-Fi network, and once online, they need to be notified about how aggregated and personal data will be handled and stored. In the case of personal data, customers should be informed about the type of data collected, how it will be used, with whom it will be shared, and where and for how long it will be stored.

With regards to device tracking, Wi-Fi providers that have agreed to the Future of Privacy Forum’s Mobile Location Analytics Code of Conduct will honour requests from consumers wanting to opt-out of having their location linked to their mobile device.

Equally, public Wi-Fi providers must ensure they are doing all they can to keep their public Wi-Fi secure. Encrypting passwords and verifying forgotten log-in requests ensures only legitimate users can access the network.

Public Wi-Fi providers can also include an idle timeout, logging the user out if their device has been left unused for a certain amount of time. And since the majority of cyber-attacks are the result of uneducated end-user choices, companies should take advantage of apps or the login portal to highlight security dos and don’ts to customers.
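
To make the idle-timeout idea concrete, here is a minimal sketch in Python with hypothetical names (no particular captive-portal product is assumed) of how a guest Wi-Fi gateway might expire sessions left inactive for too long.

    import time

    IDLE_TIMEOUT_SECONDS = 15 * 60  # example value: log guests out after 15 minutes of inactivity

    # Active guest sessions, keyed by device MAC address; each entry records
    # the time the gateway last saw traffic from that device.
    sessions = {}

    def record_activity(mac_address):
        """Called whenever the gateway sees traffic from a connected device."""
        sessions[mac_address] = time.time()

    def expire_idle_sessions():
        """Log out any device that has been idle for longer than the timeout."""
        now = time.time()
        for mac, last_seen in list(sessions.items()):
            if now - last_seen > IDLE_TIMEOUT_SECONDS:
                del sessions[mac]  # in practice the gateway would also revoke network access here
                print("Logged out idle device", mac)

    # A real gateway would call expire_idle_sessions() on a schedule, e.g. once a minute.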

Ultimately, businesses need to maintain their credibility by disclosing as much information to their customers as possible. Providing secure guest Wi-Fi is only one aspect of the solution, and brands offering it have a responsibility to help educate their customers on how to use public Wi-Fi sensibly and securely.


Jeff Abramowitz is president of Cloud4Wi and previously founded cloud network management firm PowerCloud.

November 3, 2016  10:13 AM

ISDN’s days are numbered: What should you do?

Alex Scroxton
BT, ISDN, PSTN

With BT Wholesale having announced that from 2020 you will no longer be able to purchase integrated services digital network (ISDN) and public switched telephone network (PSTN) circuits as it targets a 2025 switch off date, questions are naturally being asked. Will BT really flick the switch in 2025? What needs to be in place before that can happen, and what are the options for those currently on ISDN/PSTN circuits? In this guest blog post, Bamboo’s Lorrin White explores some of the next steps for customers.

Last year, BT boldly announced its intention to switch off its PSTN and ISDN networks by 2025. This was a smart move. In a world that is fast embracing IP as the standard protocol for all communications services, it was important for BT to declare its intentions to remove the legacy from its network, while giving customers a whole decade to make the switch (if they haven’t done so already).

What does this announcement really mean?

Let’s start by looking at what PSTN and ISDN really are. PSTN is the same phone line most people have at home, whereby analogue voice data flows over circuit-switched copper phone lines. While it may have evolved over the years, PSTN is a very, very old technology, operating on the same fundamental principles as the very first public phone networks of the late 19th Century. It is worth noting that PSTN does not just power voice: asymmetric digital subscriber line (ADSL) and fibre-to-the-cabinet (FTTC) both operate on it. As yet, BT has not suggested any replacement technology for these, so one can assume that its planned obsolescence of PSTN applies to voice only.

ISDN, by contrast, is a sprightly young thing from the late 1980s. ISDN allows both voice and data services to be delivered over digital lines simultaneously. When it launched, ISDN was well-suited to businesses, as it could support early video-conferencing systems at the same time as an analogue phone line. For a time, it could also offer the fastest internet access available (128 kbps). Naturally, since ISDN is no longer the place to go for video-conferencing or a fast internet connection, its USP has quickly been eroded.

So, BT is killing old tech. What is the ‘new’ tech that is replacing it?

In a nutshell, BT is moving its entire voice network to voice over IP (VoIP). VoIP is hardly ‘new’. But this is a good thing. VoIP has been a proven platform for voice for some time now. It works. If your business has renewed its telephony sometime in the last few years, you may have been told about it (but don’t be surprised if you haven’t, since IP is a whole new game that has been growing steadily in the background, with more and more businesses realising the benefits demonstrated by the early adopters).

VoIP has many advantages over PSTN and ISDN too: it is much quicker to provision new lines, you can reduce your line rental by needing fewer physical lines, and it is vastly scalable and flexible – for example, you can redirect calls to different parts of the country at the flick of a switch, or have a single phone number follow you around the world, irrespective of where you’re working.

Why is BT flicking the switch?

Why do you no longer use your Nokia 3210? Same reason, but on a much bigger scale. Also, maintaining multiple legacy networks is very expensive for BT. By converging all services – voice, data, video, and even broadcasting – to the IP protocol, BT only has to maintain one network, not several.

It is also worth bearing in mind that 2025 might not be Doomsday for ISDN. The date is not set in stone. It is BT’s intention to stop selling PSTN and ISDN by 2020 and shut them down completely by 2025 – but this assumes it has managed to switch all customers over to IP services before then. This means that a viable alternative must be available to everyone well before 2025. For many businesses today, ISDN is still the best they can get. According to Ofcom, there are 33.2 million fixed landlines in the UK (including ISDN), and approximately 7.6 million of these belong to businesses. BT will not turn them off before an alternative is firmly in place.

What should you do?

From 2020, businesses will no longer be able to buy any systems that use PSTN or ISDN. While 2025 may seem a long way off, 2020 is only just over three years away. If your current traditional telephony contract is up for renewal within the next few years, now is the time to start exploring the benefits of VoIP and SIP technologies.

Assuming VoIP is available in your area, there are two things you need to consider:

  1. Is your internet connection good enough to deliver VoIP?

While VoIP does not use very much data compared with other services like video, you must ensure you have enough bandwidth to deliver voice on top of everything else your office does. Some say you need 5Mbps down and 2Mbps up as a bare minimum for a small office, but really the bandwidth you need depends on your individual needs and Quality of Service (QoS) priorities. Bottom line: if you don’t have enough bandwidth or a QoS commitment you could experience poor audio quality or intermittent service and miss out on the full benefits. (A rough sizing sketch follows after these two questions.)

  2. Does your office phone system support VoIP?

Most new office phone systems already support VoIP, but if yours doesn’t, you can either replace your entire phone system with an IP one (worthwhile if your handsets are looking tired), or just invest in an IP-enabled on-premises PBX (the box that connects your internal phone system to the external phone network). A hosted telephony system is also a good way to make the switch.
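
Returning to the bandwidth question, here is the rough sizing sketch promised above: a back-of-the-envelope estimate in Python. The per-call figures are commonly quoted approximations for calls carried over Ethernet, not vendor guarantees, and the 20% headroom is an assumption.

    # Rough VoIP uplink sizing. Per-call figures are commonly quoted
    # approximations for calls carried over Ethernet, not vendor guarantees.
    KBPS_PER_CALL = {
        "G.711": 87,  # ~64kbps voice payload plus IP/UDP/RTP/Ethernet overhead
        "G.729": 32,  # compressed codec: lower bandwidth, slightly lower quality
    }

    def uplink_needed_kbps(concurrent_calls, codec="G.711", headroom=1.2):
        """Estimate uplink bandwidth for simultaneous calls, with 20% headroom
        so voice is not squeezed by everything else the office is doing."""
        return concurrent_calls * KBPS_PER_CALL[codec] * headroom

    # Example: ten simultaneous G.711 calls need roughly 1Mbps of uplink.
    print(round(uplink_needed_kbps(10) / 1000, 1), "Mbps")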

Is it ever worth buying ISDN today?

Given all of the advantages of VoIP over ISDN, in most cases we would recommend investigating whether VoIP is right for your business, and if not now, considering how you will make the move in a few years’ time. That said, there are still some circumstances where ISDN is a good solution, for example as a disaster recovery or failover option.

Whether the 2025 date will stick remains to be seen. The final date depends on how successful UK-wide fibre rollouts are, as without the connectivity to run VoIP over there is no real alternative to ISDN. Connectivity in the UK is getting faster and faster, so, who knows, it may even happen sooner than 2025. But while the date may move by a few years here or there, the one certainty is this: ISDN and PSTN are outdated technologies that are simply not as good as modern VoIP. So don’t stay in the past.

Lorrin White is managing director of Bamboo Technology Group


October 4, 2016  3:26 PM

Can the EC telecoms directive make it into UK law before Brexit?

Alex Scroxton
Brexit, Broadband, EU, telecoms

Last month, European Commission (EC) president Jean-Claude Juncker laid out three key objectives for a new telecoms framework, to be met by 2025.

These are: to give schools, universities, research centres, transport hubs, public services and digital enterprises access to ultrafast broadband capable of delivering speeds of at least 1Gbps; to give every household in the European Union (EU) access to broadband capable of delivering speeds of at least 100Mbps, that can be upgraded to gigabit connectivity later; and to give all urban areas, major roads and railways 5G coverage, with a 5G network to be made available in at least one major city in each EU state by 2020.

This is one of the largest reforms of the European telecoms framework in years. It sets ambitious targets. It is a key signal that the EC sees access to ultrafast, fibre-to-the-premises (FTTP) broadband as a necessity as we move towards the connected future. In a way, the reforms mirror the signals sent out by Ofcom in its market review earlier this year.

Speaking at an Adtran event earlier this week, consultant Tony Shortall, an expert on telecoms policy, said that the EC was clearly pushing regulators towards very high capacity broadband networks. At the same time, he said, it was narrowing the range of potential technology solutions. In Shortall’s words, “they don’t say it’s fibre, but they do say it’s not VDSL”.

So what is the process from here on out? The EC’s proposals will go before the European parliament this month, which will then review and develop a position. The Council of the European Union will also adopt a common position, and negotiations will go from there. With a favourable wind, the adoption of the proposals into European law could take place in early 2018.

The Brexit problem

But there is a problem: we might finally have a set date for Brexit. Over the weekend prime minister Theresa May laid out plans to trigger Article 50 in March 2017, which means Brexit will become reality in March 2019.

At the same time, May announced key legislation, dubbed the Great Repeal Bill, that will see all EU legislation transposed into UK law. This means that future governments will be able to keep the good laws, and get rid of the bad laws.

Make no mistake, the EC’s telecoms proposals are good laws. They are by no means examples of the sort of Brussels bureaucracy that 52% of us voted to reject. Far from it. They are an excellent example of the sort of proactive legislation that the EC was designed for, and could bring benefits to millions of EU citizens.

Many business leaders expect that the UK economy will take one hell of a beating once Brexit actually takes place. So as Computer Weekly has argued more than once, it is absolutely vital the government takes action to bolster Britain’s connectivity. We must act now to give our hard-working businesses, the lifeblood of the economy, as competitive an advantage as possible.

Pushing to adopt the EC’s proposals into European law in time for them to be transposed into British law by March 2019 must now be a key objective for this government.


August 1, 2016  10:38 AM

ISPs should prepare to fight to own home Wi-Fi services

Alex Scroxton

For many consumers, a Wi-Fi router is no more than an ugly but necessary evil to ensure they have untethered access to the internet on all of their devices, says XCellAir co-founder Todd Mersch.

For that reason, the majority – at least 60% according to an IHS Report – are more than happy to let service providers bundle a router with their service. It means one less trip to an overcrowded electronics store to buy an alien-looking, antennae-laden device, boasting baffling features such as “Turbo QAM”.

Have you been to a Currys lately? There are huge displays right as you walk in the door of – drum roll – Wi-Fi routers. These used to be tucked away on some back corner shelf, next to the cables and PC components. So what’s going on?

Consumers are getting smarter. They have realised Wi-Fi is the performance driver for their home network – a network that increasingly must cope with not only multiple devices per resident, but also video streaming and Internet of Things (IoT) applications.

This, combined with a largely unmanaged, inconsistent and therefore frustrating Wi-Fi service from their operator, has them looking elsewhere for a better experience – and potentially switching internet service providers as evidenced in a recent Consumer Reports survey.

In turn, this burgeoning demand means there are growing numbers of new entrants providing direct-to-consumer smart Wi-Fi equipment. Companies like Eero and Luma have been launched with the very aim of fixing Wi-Fi in the home. Most of them focus on covering the whole home (in Eero’s case, with a $500 price tag) and, combined with cloud-based management tools, they empower consumers to manage their own networks.

Service providers can go from zero to hero by offering a managed, consistent, and high performance home Wi-Fi service, says Mersch.

Good for consumers?

Sounds cool – right? As a consumer you look like a tech savvy wireless guru and as a service provider your customers stop calling when their Wi-Fi does not work.

But this is a short-sighted view. For the individual it’s great if you know anything about Wi-Fi, want to own service problems and are happy to shell out a lot of money for the right to do it.

However, the service provider quietly forfeits the most critical component of the customer relationship and guarantees its relegation to bit-pipe status.

On the flip side, operators have an opportunity to not only save this relationship but deepen it by offering a managed, consistent, and high performance home Wi-Fi service.

To understand how an operator can move from zero to hero, you first have to have a picture of what drives the erratic and frustrating performance of today’s status quo. Wi-Fi issues can be put into a few categories.

Wi-Fi interference, congestion and coverage

First off is interference and congestion. This is where too many devices and routers are trying to use the same Wi-Fi spectrum at the same time. That challenge is exacerbated by consumers buying a more powerful router or adding unmanaged extenders.

It becomes a bit like a ‘shouting above the noise’ scenario – all it encourages is everyone raising their voices, and once everyone is shouting, no one is any better off. A whole bunch of unmanaged APs near each other means that channels are used inefficiently. Research we carried out last year found that in an average area, the capacity lost to these inefficiencies was enough to stream another 25 high-definition videos.

Through intelligent and automated use of unlicensed spectrum, the operator can tap into this latent capacity and deliver better, more reliable performance – something that won’t be possible if consumers are using their own routers.
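
To put that “intelligent and automated use of unlicensed spectrum” in concrete terms, here is a deliberately simplified sketch in Python of one such decision: moving a managed access point to the least-congested of the non-overlapping 2.4GHz channels. The scan data is an illustrative assumption; real systems weigh far more, such as signal strength, airtime utilisation and 5GHz options.

    # Simplified channel selection: count the networks heard on each of the
    # non-overlapping 2.4GHz channels (1, 6 and 11) and move to the quietest.
    def least_congested_channel(scan_results):
        """scan_results: list of (ssid, channel) pairs heard during a scan."""
        counts = {1: 0, 6: 0, 11: 0}
        for _ssid, channel in scan_results:
            if channel in counts:
                counts[channel] += 1
        return min(counts, key=counts.get)

    # Hypothetical scan of a crowded block of flats:
    scan = [("CafeWiFi", 6), ("Home-1234", 6), ("BTHub-99", 1), ("SKY-ABCD", 6)]
    print(least_congested_channel(scan))  # -> 11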

The second big problem is coverage. In many larger homes, it is increasingly difficult to deliver whole home coverage with one router. Additionally, the actual placement of the Wi-Fi access point (AP) is often not ideal.

This does not mean every home needs multiple access points. The key is for the service provider to be able to identify what is driving the coverage issue – placement or the size of the area the AP is trying to cover – and proactively solve the problem. This can happen at installation as well as during operation.

Finally, there is the inherent fragility of the Wi-Fi hardware itself. In most cases, these products are mass produced at low-cost and are not designed to take the punishment we dole out. An operator has a unique skill set developed over decades of delivering highly-reliable service. By automating basic fault avoidance techniques – like resetting the router before your service is impacted – they can provide reliability not currently available.
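
As a minimal sketch of the “reset the router before the customer notices” idea – assuming, purely for illustration, that the operator’s remote-management platform (for example via a protocol such as TR-069) exposes a health probe and a remote reboot – the logic might look like this in Python.

    import time

    FAILURES_BEFORE_RESET = 3  # tolerate brief blips, act on a sustained fault

    def watch(check_health, reset_router, poll_interval=60, max_cycles=None):
        """Poll a health probe and trigger a remote reset after repeated failures.
        check_health and reset_router are supplied by the management platform;
        they are passed in as callables here to keep the sketch self-contained."""
        failures, cycles = 0, 0
        while max_cycles is None or cycles < max_cycles:
            if check_health():
                failures = 0
            else:
                failures += 1
                if failures >= FAILURES_BEFORE_RESET:
                    reset_router()  # act before the customer feels the need to call
                    failures = 0
            time.sleep(poll_interval)
            cycles += 1

    # Toy run: a probe that always fails triggers a reset on the third poll.
    watch(lambda: False, lambda: print("rebooting CPE"), poll_interval=0, max_cycles=3)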

By delivering a reliable, high-performance, broad-coverage and, importantly, managed service, service providers can actively monetise Wi-Fi with services such as Wi-Fi calling, wireless video distribution and in-home IoT.

But if they do not deliver, this new breed of provider will take over the customer relationship, and rob the service provider of this opportunity.


Todd Mersch is co-founder and EVP of sales and marketing at California-based XCellAir


July 12, 2016  10:03 AM

Consumer broadband habits versus next-gen services

Alex Scroxton
Bandwidth, Broadband, streaming

G-PON as we know it today is rapidly approaching the end of the growth phase in its technology lifecycle, writes Adtran’s Ronan Kelly. There is a surge in gigabit broadband service offerings that began in the US and is becoming increasingly prevalent across Europe. From the consumer perspective, there is a wave of new technologies such as 4K and virtual reality, which while still in their early stages, are expected to gain significant traction in the coming years.

Whilst today’s gigabit-enabled consumer is not yet utilising the service to its full capacity, this will change in the next three to five years as applications begin to emerge that take advantage of the capacity on offer. When that happens, we will reach the tipping point where G-PON deployments are no longer prudent, and next-generation PON technologies such as NG-PON2 and XGS-PON become the de facto standard.

If an average consumer’s available broadband bandwidth were quadrupled today, their usage patterns would not change for three to four months. This is because their experience of online services is largely shaped by what the service can deliver, and once a better service is available it takes time to find out what they can do with it.

As upgrades are rolled out, consumers tend to continue using their bandwidth as before, albeit with a better experience, whilst slowly exploring services that were not previously usable. With that in mind, everything we are using today is largely created for the bandwidth that has been available for the last four or five years.

What’s behind the change?

Historically, bandwidth always comes before an application; nobody develops an app that needs more bandwidth than is available to the mass market, otherwise it would be useless. This is no great secret.

Look at the technologies we take for granted today, such as FaceTime, video streaming and massive attachments like photographs. We would have a very different experience trying to use them on the 1Mbps connections we had in 2011. But with exponentially faster bandwidth on the horizon, we are only just beginning to see what developers can create. With the emergence of cloud-first services, entertainment such as virtual reality, and high-quality video services like Netflix and Amazon, 4K and higher resolutions will become the norm on all screens.

If you walked into Currys two years ago you would have struggled to find more than a handful of 4K televisions available in a small premium suite. Today, however, you’re spoilt for choice on what 4K screens you can purchase, and HD TVs are tucked away in a corner. With PC displays now matching 4K resolutions, and tablets set to follow, it’s obvious that consumer electronics manufacturers have historically driven uptake in the consumer space.

It’s true to say that it’s always a race in the consumer electronics space. The moment one manufacturer releases a tablet with 4K capabilities, the others quickly follow suit. Before you know it, the market has shifted, leaving the vast majority of consumers with 4K-ready technology. Similarly, when a consumer upgrades to the iPhone 6S and starts recording video, that video is now recorded in 4K. Other consumers then move to the next phone, start recording and, without any conscious thought, drive more change.

With screen resolutions constantly increasing, and 4K now the benchmark for those screens, when people stream content via services like Netflix or Amazon a 4K offering is already there. Consumers start using these services on these devices and, by default, again start to change and influence their broadband usage behaviour.

It took HD 10 years from when it first appeared to get to the point where there was a decent amount of content available for consumers, because satellite and cable TV companies dictated its proliferation. Until they upgraded their background infrastructure, users couldn’t watch HD content on anything but a Blu-ray player. 4K is at this early stage; there is currently next to no live broadcast content available and all 4K offerings come via streaming.

Back when HD launched, streaming services didn’t exist. Coupled with the flat screen TV revolution, the HD capability became a common feature in lounges across the country faster than HD broadcast content was made available. Due to the absence of competing streaming services, there was no major pressure for content providers to start broadcasting in HD. It’s different now – major pressure is coming from the over-the-top (OTT) providers, which is why we’re starting to see a more rapid push from some of the dominant players, who are gearing up to have 4K content ready even though the consumers are not adopting as quickly as they did when flatscreens first became available.

The agent of change is now different. Me getting rid of the huge TV taking up space in my living room was a compelling argument to move to flatscreen, and having HD as part of that transaction was a nice bonus, but it wasn’t the primary driver for change. Now, the consumers have the flatscreen and a 4K flatscreen coming along isn’t quite as compelling, so the rate of change, unless pushed by the electronics manufacturers, will be slower. Typically TVs have a 10-15 year lifespan before they’re replaced. If we reflect on the timeline of the flatscreen revolution, the early adopters are now well in that window. Will this serve as the catalyst that accelerates 4K adoption?

While some of these factors have long been the cause of change in bandwidth speed, the surge in adoption of OTT services is driving demand for even higher speeds. As consumers move to 4K, they will be met with a very different experience; every TV is now smart, so more streaming content will be accessible than ever. Similarly, the gaming industry now pushes consumers to download or stream games, so the distribution model for that industry has totally shifted to a cloud-based model, much like the music industry.

Looking ahead

Looking ahead there are technologies coming that will put a huge strain on the network, the connected car being at the forefront of that charge. Autonomous vehicles offer countless benefits, the most obvious being the socioeconomic impacts: with cars essentially driving themselves, the ‘driver’ takes back the productive or leisure time which had previously been lost to travel.

However, we still have a long way to go to achieve this. The capacity requirement to implement this is typically up to 27Mbps per car. Take 100 square metres of London on any given day and imagine the number of cars using broadband, and you’ll quickly get a sense of the capacity that we’ll need.
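
As a back-of-the-envelope illustration in Python (the per-car figure is the 27Mbps quoted above; the car counts are purely illustrative), the aggregate numbers mount up quickly.

    MBPS_PER_CAR = 27  # per-car figure quoted above

    def aggregate_demand_gbps(cars):
        return cars * MBPS_PER_CAR / 1000

    for cars in (100, 1000, 10000):
        print(cars, "cars ->", aggregate_demand_gbps(cars), "Gbps")
    # 100 cars already need 2.7Gbps; 10,000 need 270Gbps.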

With these factors in mind, it’s vital to continue to develop next-generation broadband technologies, such as NG-PON2, XGS-PON and G.fast, which use existing infrastructure and enable gigabit speeds in a timeframe that matches demand. As technology advances, so too must our available broadband offering. If the two cannot progress at similar speeds, or if broadband cannot keep up with the digital consumer, then society will suffer.


Ronan Kelly is CTO at Adtran and a frequent commentator on broadband issues.


June 28, 2016  8:58 AM

Broadband must not be forgotten after Brexit

Alex Scroxton
Brexit, Broadband, fibre, G.fast

In the wake of last week’s vote to leave the European Union (EU), if we are indeed going to invoke Article 50 of the Lisbon Treaty later in the year – and despite the lingering hope that there might be some sort of loophole to wiggle through, Brexit now seems very likely – there must be a concerted effort to make the best of it.

As our economy crashes and burns, our banks look to relocate and the last of our manufacturing base flees the country, we are going to need to think strategically in order to remain competitive against a 27 member trading bloc that our country has gravely offended, and that will want to see us fail.

And make no mistake, the EU will take us to the cleaners in the negotiations to come… who can really blame them?

One way we can compete on a level footing is to ensure we are light years ahead of them when it comes to connectivity. We are going to have to have, hands down, the very best connectivity possible – South Korean-style, if possible.

Yes, ultimately this means universal fibre-to-the-premises (FTTP) to every home, office and factory in the country, regardless of location. But since we now have even less money than we thought we did, this possibility is now even more remote than it was before the referendum. For now, it’s not realistic to push for it.

Yet the technology already exists to go ultrafast at low cost. G.fast technology is not true fibre, but it can deliver speeds of over 100Mbps, in the field, right now, with just a little upgrade in the cabinet. We don’t need to dig the roads up, we don’t need to wait for wayleaves. As the post-Brexit squeeze tightens its grip, G.fast suddenly begins to look much more compelling. I believe it is time for us to get behind it.

We face obstacles, for sure. Just a week before the decisive vote, Computer Weekly met with Openreach’s CEO Clive Selley, who said Brexit would materially damage his ability to invest in new engineers and commercial roll-out of broadband. This is a significant concern, and something that must not be allowed to happen. The onus is on Selley to stick up for Openreach and protect it at all costs.

Equally, whoever inherits the government in the next few months will have to take charge of broadband policy. Ed Vaizey, the current minister in charge, is a true supporter of the digital economy and connectivity, but we have to acknowledge that he may be replaced under new leadership. If a new minister is put in charge at DCMS, they must be a powerful and credible voice for our sector.

Britain faces an uphill battle to stay competitive outside the EU bloc. Ultrafast broadband networks will help to keep the British economy competitive in the post Brexit world. We cannot lose our focus now.


June 22, 2016  10:33 AM

Telematics, insurance and the driverless car evolution

Alex Scroxton
Insurance, Internet of Things, telematics

The political momentum behind the connected future enabled by the internet of things (IoT) is clear, writes Jonathan Hewett of Octo Telematics. Increasingly we see insurance positioned at the frontline of this technology shift as players across the auto industry recognise the key role it will play as we transition to the world of the driverless car.

In the Queen’s Speech, the UK became the first country in the world to announce its intention to legislate on insurance requirements for driverless cars. The Modern Transport Bill will “ensure the UK is at the forefront of technology for new forms of transport, including autonomous and electric vehicles”.

While some commentators contend that driverless cars will make motor insurance unnecessary, as roads minister Andrew Jones put it, the suggestion “is a lot of pie in the sky”. Insurance will remain fundamental, but policies of the future will look very different to those we buy today.

By harnessing driver data and applying data analytics, driverless car insurers can benefit from the insight they need to forecast risk, says Hewett.

Currently, insurers have to price their policies based on proxies and assumptions. The risk premium for car insurance is determined by factors including the driver’s age, postcode or the model of car they own, regardless of their actual driving behaviour.

However, by harnessing driver data and applying sophisticated data analytics, insurers can benefit from the crucial insight they need to forecast individual risk. This is an increasingly essential capability in the shift towards connected and driverless cars.

Insight into government thinking around driverless car insurance suggests that future policies would go even further than this, essentially holding vehicles, and by extension their manufacturers, liable for accidents. This is a very different proposition for insurers to consider. The focus is taken away from the driver and their involvement. Instead it falls on the robustness of the algorithms and technology provided by manufacturers that enable vehicles to make decisions and drive.

Accordingly, as autonomous vehicles become more widely adopted, the driverless car industry will undoubtedly require third parties to verify potential insurance claims and determine liability. In these situations, on-board telematics technology will be crucial in enabling an accurate data-informed reconstruction of any incident. For instance, should a Google and Tesla car crash into one another as they simultaneously avoid a pedestrian, the data gathered will be essential to determine which car is at fault and which manufacturer is liable. However for reasons of impartiality a third party will be necessary to hold that data and provide the analytics.
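
Purely as an illustration of the kind of evidence involved – the field names and the comparison are hypothetical, and no real insurer’s or manufacturer’s data format is implied – a reconstruction from time-stamped telematics data might look something like this sketch in Python.

    from dataclasses import dataclass

    @dataclass
    class TelemetrySample:
        """One time-stamped reading from a vehicle's on-board telematics unit."""
        timestamp: float  # seconds since a shared epoch
        speed_kmh: float
        braking: bool

    def later_braker(vehicle_a, vehicle_b):
        """Toy comparison: which vehicle braked later before the impact?
        Real liability analysis involves far richer data, held by a neutral
        third party as argued above; this only shows the idea of
        reconstructing events from time-stamped telemetry."""
        def first_brake(samples):
            return next((s.timestamp for s in samples if s.braking), None)
        a, b = first_brake(vehicle_a), first_brake(vehicle_b)
        if a is None or b is None:
            return "insufficient data"
        return "vehicle A" if a > b else "vehicle B"

    a = [TelemetrySample(0.0, 48, False), TelemetrySample(0.5, 45, True)]
    b = [TelemetrySample(0.0, 50, False), TelemetrySample(0.8, 49, True)]
    print(later_braker(a, b))  # -> vehicle B braked later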

The momentum behind autonomous vehicles has the power to revolutionise the insurance industry, but it could be decades before regulators allow vehicles to be built without manual controls and the fact remains that we are still far from a driverless reality. Nonetheless, some of the technologies integral to the autonomous future are already enabling more equitable insurance and helping to shape better, more self-aware drivers.

Much of the technology that will take the driverless car from concept to reality already exists and is being utilised to assist insurers in processing claims faster and more accurately. And the market is growing fast: by 2020, consultancy Ptolemus estimates that nearly 100 million vehicles will be insured with telematics policies, growing to nearly 50% of the world’s vehicles by 2030.

Telematics is providing both manufacturers and insurers with the opportunity to innovate and to develop new products, partnerships and approaches that are grounded in actual data. Not only will data-driven insurance be a necessary norm for determining crash liability in the driverless future, but it will also be an essential step in making the transition to that future a safe reality.


Jonathan Hewett is global chief marketing officer at Octo Telematics


June 17, 2016  11:14 AM

Total smartphone service: put public Wi-Fi in the mix

Alex Scroxton
Mobile, Wi-Fi

The recent announcement that UK regulator Ofcom plans to make additional spectrum available for Wi-Fi is great news for consumers, writes DeviceScape’s Dave Fraser. Because Wi-Fi operates in unlicensed spectrum it is very much the people’s network; anyone with a broadband connection is able to get good quality wireless access to the content and services they love.

It’s the perfect complement to cellular services operating in licensed spectrum, not least because Wi-Fi has become established as the de facto access technology for indoor use. We spend most of our time and consume the vast majority of our smartphone data indoors, in spaces where it is often difficult to get a good quality cellular data signal.

Another uniquely populist characteristic of Wi-Fi is that it enables people and entities to share connectivity with one another. If you want a measure of the importance of Wi-Fi sharing, look no further than the fact that etiquette bible Debrett’s has expressed a position on it.

So, if you were to design a connectivity service from scratch in 2016, based on an understanding of how consumers use their smartphones, shared public Wi-Fi would be a necessary component of that service.

If you were to design a connectivity service from scratch in 2016, shared public Wi-Fi would be a necessary component of that service.

Shared Wi-Fi is most valuable to consumers in public indoor locations, because they visit many more of these than they do other people’s homes or offices. That it is generally available to all consumers, and not just those who subscribe to one particular operator, emphasises its egalitarian nature.

No surprise, then, that it has become a staple of the customer experience offered by numerous retail, service and hospitality businesses, from independent corner cafés right up to the UK’s largest retailers. These businesses understand their customers’ desire for unbroken connectivity and cater to it just as they cater to other basic needs.

The reality today, though, is that almost all smartphone connectivity services are legacies of a time before Wi-Fi and smartphones, when the primary functional requirement of a mobile phone was that it connected to the mobile network. Then, as now, a licence to operate a mobile network assured membership of a club whose exclusivity was guaranteed by the high costs of entry and operation, and crucial to the retrieval of those costs.

A companion legacy is the mobile sector’s inclination towards a divided, network-centric view of the world, in which carrier-grade cellular trumps the Wild West environment of Wi-Fi every time. It’s certainly well understood that there can be significant variations in QoS in the public Wi-Fi arena. And while it has phenomenal indoor coverage to its credit, the process of connecting to these Wi-Fi networks can be frustrating and off-putting for consumers.

But instead of viewing these issues as grounds to discriminate against public Wi-Fi, mobile operators should seize the opportunity to improve their customers’ experience of this hugely valuable, if sometimes variable, resource. The issues of quality and ease of access can be managed to ensure quality of experience across a service that combines Wi-Fi and cellular to play to both sets of strengths.

Meanwhile, you only have to look to the surge of enthusiasm among operators for Wi-Fi calling over the last two years for evidence (if your own experience is not enough) that cellular itself is often found wanting.

The truth is that consumers are reliant on both Wi-Fi and cellular, rendering the “competing networks” mentality outdated. In the smartphone era the ability to connect to the mobile network is simply one of a number of important underlying enablers required of the device and — judged in terms of time spent connected — it may not even be first among equals. What end users need is a service tailored to their movements and habits, and that has to be based upon a blend of cellular and Wi-Fi.
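
At the device end, such a blended service ultimately rests on a connection-management policy. The sketch below, in Python with illustrative thresholds, shows the shape of one: prefer Wi-Fi when its measured quality clears a bar, and fall back to cellular otherwise.

    # Illustrative bearer selection for a blended Wi-Fi/cellular service.
    # Thresholds are examples only; a real connection manager would also
    # weigh cost, user preference and per-application requirements.
    def choose_bearer(wifi_signal_dbm, wifi_loss_pct, cellular_available):
        wifi_good = (wifi_signal_dbm is not None
                     and wifi_signal_dbm > -75
                     and wifi_loss_pct < 2.0)
        if wifi_good:
            return "wifi"
        return "cellular" if cellular_available else "wifi-best-effort"

    print(choose_bearer(-60, 0.5, True))     # strong, clean Wi-Fi -> wifi
    print(choose_bearer(-82, 0.5, True))     # weak Wi-Fi indoors  -> cellular
    print(choose_bearer(None, 100.0, True))  # no Wi-Fi at all     -> cellular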

Caroline Gabriel, research director and co-founder at Rethink Technology Research, recently observed: “The more ubiquitous and carrier-class public Wi-Fi becomes, the more the battle heats up between mobile operators and their non-cellular challengers, to see which can harness it more effectively.”

Gabriel was on the money. Recent moves by big cable players to integrate public Wi-Fi into their service offerings — as well as high profile forays into the space by Google — suggest that, openly at least, the battle is being brought by the non-cellular challengers. It is important to note that connectivity is not the product for these providers, it is simply the means by which they bring the product and the smartphone user together. They are free from any vested interest in one particular bearer.

What now remains to be seen is how mobile operators will respond. Will they embrace Wi-Fi — in particular public Wi-Fi — to enhance their customer experience, and increase the stickiness of their service? Or will they continue to abide by the tenets of network separation and cellular supremacy, managing the experience only where it relates to the mobile network?

Old habits are certainly hard to shake. But the greater the portion of a customer’s connectivity experience an operator is able to manage, influence and improve, the better chance it has of securing that customer’s loyalty. For this reason, as a more diverse field of competitors bid to own the customer, services based on a blend of cellular and Wi-Fi are taking root: if you want to serve the people, you need the people’s network.


Dave Fraser is CEO of DeviceScape


May 6, 2016  2:50 PM

Attack the government over broadband, but do it properly

Alex Scroxton
Broadband

This morning I came close to running a story that would have claimed the government had u-turned on a pledge to provide superfast – 24Mbps and above – broadband to every home in the country by 2020. This was after stories appeared in both The Times and The Telegraph saying that the government had given up on the idea of automatically meeting the needs of the final 5%, those left out of the commercial and Broadband Delivery UK (BDUK) roll-outs.

These stories, and a number of others, were based on a month-and-a-half-old consultation document put out by the government on plans for a universal service obligation (USO) of 10Mbps, which I covered at the time.

In the consultation document, the government did indeed say that an additional broadband roll-out to the final 5% was not proportionate and would not represent value. This is because it was unlikely, on the evidence available, that every single one of them would want it. Hence the idea of a 10Mbps USO which, should it go ahead, those who want it will be able to request and receive.

The thing is, this is not a u-turn as such, because while the needs of the final 5% have been disgracefully neglected by the government, the document merely notes that given the cost to the taxpayer in reaching those remote places, the government believes it makes more sense to establish a request-based scheme to reach them.

Furthermore, it may be that eventual technological advances will bring down the cost of deployment and make a universal roll-out more cost-effective; we simply don’t know yet. Researchers at UCL reckon they’ve hit on a way to reduce the costs with a new type of optical receiver, so work is going on here.

And all this is without considering the money that BT is putting back into the BDUK roll-out as a result of hitting its take-up targets. It hopes this will extend the programme beyond 95%.

In essence, nothing has yet been decided, let alone u-turned upon.

Look, it is right that the government is held to account over the state of the rural broadband roll-out, and it is absolutely not right that 10Mbps will be sufficient. Actually, I think 10Mbps is laughable and David Cameron and Ed Vaizey should be ashamed of themselves for even considering something so unambitious.

This is an emotive issue, particularly for those that want and cannot receive a broadband connection, but I have always believed it does one’s cause no good at all to base arguments on provable inaccuracies.


March 2, 2016  10:51 AM

Why Europe is coming round to BT’s broadband thinking

Alex Scroxton
Broadband, FTTP, Ofcom, Openreach

by Ronan Kelly

BT has been in the UK headlines quite a lot these past few weeks. News of a rare service interruption aside, BT’s recent press cuttings file is par for the course. A little background chatter about potential regulatory tightening here; some more calls to accelerate faster and cheaper broadband services there… Whatever it does for the UK’s broadband infrastructure never seems to be enough, and there’s a big problem with that narrative. It just isn’t true.

I’m not saying the media is manufacturing negative perceptions. Maybe it’s just the price you pay for having customers with high expectations and a reputation as one of the most respected telecom operators in the world.

No – BT isn’t religiously fanatical about FTTP

People who believe that BT is somehow opposed to fibre-to-the-premises (FTTP) should pay close attention to the new deal struck between Openreach and the UK’s Home Builders Federation to ensure superfast/ultrafast broadband connections to newly-built homes. The move is proof of BT’s willingness to take advantage of the broadband economics of greenfield sites, and support FTTP where it is commercially viable. Even more tellingly, it adds to the evidence of BT as arch-pragmatist; recognising the importance of being prudent with new technology when it comes to mass-market deployment.

Everybody knows that the UK lags behind the other G8 nations in FTTP penetration, and there are numerous historic and ongoing reasons for this. I for one advocate FTTP everywhere (I wouldn’t last long on the board of the FTTH Council Europe if I didn’t), but even I’m pragmatic enough to appreciate that such progress takes time.

Being responsible and pragmatic about broadband evolution can mean faster – not slower – progress for subscribers

National competitiveness is a recurring theme in the UK broadband debate, and earlier this month a prominent manufacturing lobby group added its voice to calls for ‘better connectivity’ to support this aim.

Working towards rather than against this aim, BT’s position appears to be that – other than in those local cases where commercial viability gives the green light to immediate roll-out – progress toward the eventual goal of FTTP should deliver sustainable, incremental performance improvements. Such an approach, leveraging existing infrastructure where possible alongside innovative new technologies like G.fast, is infinitely preferable to telling subscribers they must tread water for the decade or more it could take an operator to deliver fully-fledged FTTP in their area.

Differing broadband views from abroad

Taking national competitiveness from a different perspective are those armchair experts who routinely like to contrast the broadband fortunes of the UK with those of its closest neighbour, France. France has experienced something of a broadband renaissance in the last 12 months, with government support and the action of incumbent operators driving an upswing in the deployment of FTTP.

What a lot of people don’t know is that – until very recently – the French regulator (ARCEP) had essentially outlawed the use of VDSL technology. Why does this matter? Well, in the glaring absence of an established VDSL estate, French operators are only now presented with the opportunity to build one from scratch and extend the utility of their copper networks. Creating one could bring benefits, but would take perhaps three or four years to mature, by which time operators would be presiding over a 10-year-old technology. This makes it far more logical for France to put greater impetus behind faster FTTx penetration than the UK. It also hammers home the argument that more flexible and faster-to-market broadband options are more readily available in environments with advanced copper infrastructure.

Among the rest of Europe (with the notable exception of Spain, which has other obstacles in the way of fully leveraging its copper network, as well as higher urban concentrations of MTUs) network topologies and regulatory climates are all far more sympathetic to the BT thinking around broadband.

BT has played it smart with G.fast – and the world waits with bated breath

When G.fast was first introduced a couple of years ago, the most popular application proposed was for Gigabit connectivity over very short loops. We’ve seen this vision realised to great effect in numerous markets and scenarios, but the truth is that this approach isn’t going to work for everyone.

What BT has done is pull those capabilities back to ‘sub-Gigabit’ levels to deliver highly competitive services over longer distances, with the net result delivering many times better performance than currently available over the same infrastructure. That’s far-sighted, innovative and – some might say – courageous. What’s more, it’s making an awful lot of other operators in other parts of the world start thinking differently about their journey to FTTP.

BT’s success has yet to be proven, and there are plenty of nay-sayers who’ll continue to kick up bad headlines until hard, long-term evidence proves them wrong.

BT doesn’t have all the answers, but I can’t really see any obstacles to them achieving what they’ve set out to achieve, and within the pretty aggressive timescales they’ve set themselves.

Ronan Kelly is CTO for EMEA and APAC at Adtran

