With eight days until the UK’s scheduled exit from the European Union, a prime minister who has lost control, a paralysed political system, and Britain reduced to a laughing stock on the world stage as it limps into an entirely avoidable humiliation, the country feels like it’s on the edge of total breakdown.
Indeed, over the past 18 hours, more than 750,000 angry Remainers – the sort of people who unaccountably insist on indulgences such as supermarkets with things to buy in them and access to life-saving drugs – have signed a last-ditch petition on Parliament’s website to revoke Article 50 and forget about the whole thing.
Unfortunately for them, thousands of people logging onto the same website at once has had an unintended side effect: an abnormal spike in traffic to the government’s site. The servers that host it are repeatedly crashing, serving visitors a 502 Bad Gateway error (which means one server has received an invalid response from another upstream – i.e. it’s offline). In effect, an accidental distributed denial-of-service (DDoS) attack is taking place.
Our sister site WhatIs defines a DDoS attack as one in which multiple compromised computer systems attack a target, such as a website, and cause a denial of service for users by overwhelming the target with incoming messages, connection requests or malformed packets, causing it to slow down or crash.
DDoS attacks are more usually associated with bad actors, state-sponsored hackers, cybercriminals and so on – the massive Mirai IoT botnet attack in late 2016 was a DDoS attack – but it isn’t unheard of for sudden spikes in legitimate traffic to websites to have a similar effect.
Ruby on Rails expert and Unboxed (the firm behind the petitions website) CTO Andrew White explained on Twitter that the site had gone down because calculating the trending count became too much of a load on the database – Unboxed uses an AWS-hosted relational database service (RDS).
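A standard mitigation for that kind of load is to stop recomputing the expensive aggregate on every page view and cache it for a short interval instead. Here is a minimal illustrative sketch (not Unboxed’s actual code – the “trending” query is simulated), in which 10,000 page views cost a single database hit:

```python
import time

class CachedCount:
    """Serve an expensive count from memory, recomputing at most once per TTL."""

    def __init__(self, compute, ttl_seconds=60):
        self.compute = compute        # the expensive query to wrap
        self.ttl = ttl_seconds
        self.value = None
        self.expires = 0.0

    def get(self):
        now = time.monotonic()
        if now >= self.expires:       # stale: hit the database once
            self.value = self.compute()
            self.expires = now + self.ttl
        return self.value             # fresh: serve from memory

calls = 0

def compute_trending():
    """Stand-in for the real SQL aggregate that overloaded the RDS instance."""
    global calls
    calls += 1
    return 750_000

cache = CachedCount(compute_trending, ttl_seconds=60)
results = [cache.get() for _ in range(10_000)]
# 10,000 page views, but only one call to the expensive query
```

With the count refreshed once a minute rather than once per request, the database load becomes independent of how many people are hammering the page.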
There is one way you can help: if you are signing the petition today, please consider others, resist the temptation to sit there refreshing the page, and trust that GDS will have things back up and running in good time.
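For anyone polling the site from a script, the same courtesy can be applied in code: on a 502, back off exponentially rather than retrying immediately. A hypothetical sketch, with all timings and limits illustrative:

```python
import random
import time
import urllib.error
import urllib.request

def backoff_delay(attempt, base=1.0):
    """Delay before retry N: exponential (1s, 2s, 4s, ...) plus random jitter,
    so a crowd of retrying clients doesn't stampede the server in lockstep."""
    return base * (2 ** attempt) + random.uniform(0, 1)

def fetch_with_backoff(url, max_attempts=5):
    """Fetch a URL, retrying politely on 502 Bad Gateway."""
    for attempt in range(max_attempts):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            if err.code != 502:   # only Bad Gateway is worth retrying
                raise
        except urllib.error.URLError:
            pass                  # server unreachable; wait and retry
        time.sleep(backoff_delay(attempt))
    return None                   # gave up without adding to the pile-on
```

Each failed attempt roughly doubles the wait, which gives a struggling server room to recover instead of adding to the accidental DDoS.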
All the world loves robots, and when they are robots holding something a human might hold, doing an activity a human might do, and preferably doing it while looking a bit like a human (two legs, arms, a smiley cartoon face, etc) the world loves them even more.
And at Mobile World Congress in Barcelona last week, it felt like you couldn’t move without bumping into a robot doing something a human can do. There were dancing robots, drumming and piano-playing robots, packing robots and even a barista robot.
What’s 5G got to do with it?
The idea behind all these robotic demos is that the advent of 5G mobile networks in the not-too-distant future will really kick-start the robotics revolution.
Operators promise 5G will bring vastly increased speeds, vastly reduced latency, and vastly improved possibilities around edge computing. This will create a perfect storm (if you’re a robot) for robots, enabling them to accomplish more tasks, faster, and with greater precision than was previously possible.
That’s the claim, anyway. The reality seems to be a little more clunky – let’s just say the dancing robot posed absolutely no threat to Anton du Beke, the packing robot made more mistakes than a tired Amazon warehouse worker who hasn’t been to the loo for eight hours, and the drumming robot … well … it might have been the best drummer in the Beatles.
Coffee to go
But one robot in particular caught the eyes of the thousands and thousands of Congress-goers feeling the strain the morning after an intense ‘networking’ session: Beat, a robot barista.
Using a smartphone application, users place their order in advance, then watch as the robot swings into action, whirling around, grinding, pouring, frothing and icing. When your coffee is ready, you simply approach the booth, type in a unique pin, and Beat delivers your drink through a self-service hatch.
Backed by KT’s 5G base stations, Beat’s human controllers monitor the robot in as near to real-time as makes no difference, checking in to see if something needs repairing, or if the bean supply is running low.
Unlike human baristas, who generally do not connect themselves to networks, Beat draws on edge-based compute power to prepare three coffees at once – up to 90 per hour – and if left idle with nobody looking at it, it even becomes its own marketing department, waving its arm to attract attention and flashing happy emojis at passers-by. Beat that, underpaid service industry workers!
Humans under threat?
So do robots have a future in the high-stakes world of coffee preparation? I’m not so sure – for while my iced Americano was a much-needed treat on a busy afternoon, Beat completely forgot to write my name on the side of the cup in black marker pen, and that’s not easily forgivable. Baristas of the world, I think your jobs are safe for now.
This is Alice Scotts Town, reporting from Barcelona, for Computer Weekly…
This year, my flight from London’s Stansted Airport to Barcelona for Mobile World Congress, plus two days of hour-and-a-half commutes through horrendous traffic between my hotel and the conference venue on a half-empty fume-belching diesel bus, generated at least 0.24 tonnes of carbon dioxide.
But it was probably more than that. In order for me to spend time in Barcelona this week, a vast, continent-spanning logistics operation was pressed into service, seamlessly and efficiently, but at such environmental cost! Technology companies spent hundreds of thousands of dollars on building materials for stands – some as big as a decent-sized family home – that otherwise would not have existed and will be torn down at the end of the week.
Then there’s my paper metro ticket, my receipts for lunch, the countless cardboard coffee cups, the plastics in my badge, the unwanted extra fabric lanyard (handed to me by Ericsson when I walked onto their stand covered in Huawei branding), the free pens and little presents tech vendors like to foist on the press corps, all consumed resources that did not need to be consumed.
Oh, I ate a lot of tapas, and I do love pa amb tomàquet – the Catalan speciality of bread smeared with tomatoes and often served with ham – so several pigs died for me, too. They were delicious.
I am a technology reporter at Mobile World Congress, and I am also an environmental villain.
Of course, you can always make the argument that that plastic would have been used and consumed anyway, those pigs would still have died, Ryanair flight FR9810 would still have taken off without me, and my hosts would still have rented a half-empty coach for 12 journalists, and a driver with an apparently magnetic attraction to traffic jams.
So why am I bothered?
A carbon neutral MWC?
The GSMA – the mobile trade body that runs and organises the show – makes a virtue out of Mobile World Congress being a sustainable, carbon neutral event. It enables attendees to offset their carbon emissions by donating to green projects. It also provides a number of recommendations to enable attendees to minimise their environmental footprint, and it can boast a number of laudable achievements. The GSMA is taking sensible, rational steps, and I don’t want to suggest for a moment that they are not trying very hard.
But the GSMA’s actions don’t go far enough. There is no compulsion for attendees to offset their carbon emissions (and carbon offsetting is, in my opinion, pointless – the carbon was still emitted), it is not a requirement to take public transport to the venue, you can still print out your schedules rather than download the app, you don’t have to choose locally grown or sourced food options, and unless you’re coming from elsewhere in Spain, or possibly parts of France, you’re still going to fly.
The GSMA claims that Mobile World Congress was carbon neutral in 2015 and 2016. But was it? Really? When every variable is accounted for, I’d say that was highly, highly improbable.
What’s the point?
So I’d like us to at least begin to discuss the possibility that Mobile World Congress simply isn’t necessary – and that, as the increasing speed, capacity and capability of our networks prove, travel for any reason other than personal growth and enrichment (that is to say, business travel) is becoming less necessary.
Okay, you’re right, Mobile World Congress is a fun couple of days. We meet, greet, catch up, learn, drink into the small hours… These are good things (mostly) and their loss will be keenly felt. But nobody said the transition to a post-carbon economy would be without pain and loss. Nobody said that saving our home from catastrophic climate meltdown and ecological breakdown would not require radical actions.
And with temperatures back home in the UK hitting 20°C in February, ice melt speeding up by every measure, and the biosphere quite literally dying around us, I’d argue the time to take radical actions is now.
Which side are you on?
When you’re weighing up seven billion human lives and the preservation of a habitable planet for future generations, radical choices don’t seem so radical. The solutions to climate meltdown are right in front of us – we just need the popular will and the political backing to implement them. Would you rather your descendants lived in the utopian Star Trek future, or the barbaric Mad Max future?
So I’d like to ask you to ask yourself a few things. Why do we need Mobile World Congress? What social good does it serve? What ecological benefit does it bring? The GSMA talks so much about the power of social good and sustainability, and how mobile will solve these problems.
And yet, and yet, every February 100,000 people descend on Barcelona, snarling up the streets, choking the azure Mediterranean sky, and consuming, consuming, consuming.
As the UK’s full-fibre broadband roll-out gathers pace through 2018 and into 2019, many people continue to believe gigabit services are unnecessary overkill. CityFibre’s Caroline Hughes reckons they’re mistaken.
As very young children, my brother and I worshipped our now-vintage Dragon 32 computer. It was no Sinclair ZX Spectrum or BBC Micro, but I’ve never forgotten the fun and enlightenment brought to my childhood by that space-age typewriter!
It came with a few cassette-style games that loaded slowly, loudly and often fifth time lucky, and the only connectivity it had was to the national grid via a power socket. Yet, my brother and I waited patiently, played happily and even spent hours typing in page upon page of computer-magazine-published code to make it draw the simplest of pictures.
In time, our beloved Dragon 32 was superseded in our home by an even more revered Amstrad CPC464. And since those days, countless computers and devices have come and gone in our lives – each more powerful, transformational and immersive than the last.
Little did I know back then that I would spend my career in the heart of the telecommunications industry, or that almost 40 years later I would be sat here reflecting on how far our use of computing technology has come and how dependent we all are on the speed and quality of the connectivity it now demands.
There was also no way I could ever have imagined that by now I would’ve worked for a company like CityFibre for almost three years, helping to realise its founders’ mission to roll out whole-city full-fibre broadband networks across the UK and celebrating the gigabit-speed services that are now setting new connectivity standards for homes and businesses across Britain.
Anyone who takes time to reflect on how far our industry has come in the last 10 years – let alone since the early home computing days of the 1980s – would be foolish to underestimate where the next 10 to 40 years might catapult us. Yet as CityFibre continues to march ahead, executing its plans to deliver full fibre connections to over 5 million homes by 2025, I see some astonishing comments, from some unexpected naysayers!
In most cases, the announcement that our fibre-to-the-premises (FTTP) services are coming to an area elicits a widespread, heartfelt welcome, huge sighs of relief, or both! But mixed among the overwhelmingly positive reactions are occasional comments that “homes simply don’t ‘need’ gigabit speeds” and that what we are doing is “overkill” or “just for publicity”. And while the average person can be forgiven for this, shockingly, these comments usually come from individuals working within the telecoms industry!
Surveys, like the one we carried out among UK gamers last year, reinforce the short-sightedness of such comments by putting a magnifying glass on the impact that poor connectivity has on this sub-segment of the consumer market. Among other facts, the survey data highlighted that, at today’s average UK connection speeds, downloading recent game releases such as Call of Duty: Black Ops 4 actually takes longer than it would to fly from the UK to the game developer’s headquarters in California!
But my main sympathy doesn’t actually sit with the poor soul waiting over 12 hours for their download. It sits firmly with the tortured mother, brother or flatmate who’s trying to do what is now relatively ‘normal stuff’ online at the same time, i.e. working from home, video chatting with family overseas or streaming a movie.
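The arithmetic behind that 12-hour wait is easy to check. A quick sketch, treating the game size and connection speeds as illustrative assumptions rather than figures from the survey:

```python
def download_hours(size_gb, speed_mbps):
    """Hours to download size_gb gigabytes at speed_mbps megabits per second."""
    size_megabits = size_gb * 8 * 1000   # 1 GB = 8,000 megabits (decimal units)
    return size_megabits / speed_mbps / 3600

# Illustrative figures: a ~100GB game over an ~18Mbps average connection
# versus a gigabit (1,000Mbps) full-fibre line.
slow = download_hours(100, 18)     # over 12 hours
fast = download_hours(100, 1000)   # under 15 minutes
```

The gap between half a day and a coffee break is the whole argument for gigabit in one line of maths.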
Need or want? Should it make a difference?
What fascinates me most about the survey though, is how it highlights that not all people who live in UK homes define what they ‘need’ from connectivity in the same way. In many spheres of our lives, what we ‘need’, what we ‘want’ and what ‘puts a great big grin on our face’ are often very different things. I never ‘needed’ that Dragon 32, but it inspired me, it gave me skills and insight, and it definitely made me smile! How we relax in the evening or spend our precious downtime on the weekend is personal to each of us, and, as time progresses, connectivity is unarguably evolving and enriching many of these experiences.
Connected devices are already rife within our homes and even if we can’t predict exactly what our personal digital environments will look like in 20 years’ time, past experience tells us that technology won’t stand still. And, without gigabit-capable fibre connections ready to underpin those inevitable advances, woe betide any ‘regular’ user! Let alone one who finds themselves living with someone whose spare time is spent immersing themselves in the latest gaming experience.
Well-connected homes enable so much more…
Of course, it’s not just about being able to support the entertainment, communication and technology desires of home owners. Gigabit speed networks also have a vital role to play in supporting more basic human needs. Two such examples are digital health and social care and remote learning. Both are areas that local governments are now targeting as part of their smart city and digital agendas. As the technologies in these areas advance, the services deployed will become wholly dependent on ubiquitous high-speed fixed and wireless connectivity. And any remaining digital divide at a community level will hinder the efficiencies and cost savings that are driving their inception.
Investment in ubiquitous gigabit-capable networks also radically simplifies and reduces the cost of connecting the small cells that will need to be delivered at density across whole cities in support of enhanced 4G mobile, 5G and smart city IoT. And with that, it comes full circle back to the individual again! Consider the enhanced mobile data demands of you, me and countless others sat in our self-driving cars as we use the valuable downtime to catch up on work, communicate with family and even stream entertainment and news.
The motivation for broadband is simple
Our industry has a duty to serve all – from the smallest child with their first tablet device to the early adopters of tomorrow. If, as part of this, we fail to look to the future, then we don’t deserve that responsibility.
Committed gamers, the ever-increasing number of connectivity-dependent homeworkers, ‘garage innovators’ and home-based business owners are just a few examples at one end of the consumer spectrum. Yes, they are sub-segments of a large and diverse market sector. But, for as long as I’ve been in this industry, these segments have shone a light on the future, highlighting connectivity requirements that, over time, have come to be considered a human right by almost everyone else.
If we cut back to where we are today, it’s clear that investing in and building a digital infrastructure capable of serving everyone and everything in an area with gigabit speeds is neither wasteful nor something to be questioned. We are way beyond that! It’s common sense, enabling for the future and truly exciting, especially to anyone who knows what a full-fibre connection is theoretically capable of delivering – right here, right now!
Our duty is to deliver choice
Comparatively speaking, low- and mid-speed services will always be available! For those who only need to do tomorrow’s equivalent of making phone calls, downloading e-books, paying bills or sending emails, service providers will still be able to provide entry-level services over a modern high-speed network. But for many, the desire for those higher speeds is already here, and for a large percentage of the rest, technology will drive them there in the blink of an eye.
Even if technology evolves slower than we anticipate, it’s no excuse not to ensure we have the infrastructure and services ready for the future. As an industry our job is to predict, prepare for and embrace the future, feed the hunger and leave no one behind.
Caroline Hughes is head of marketing – portfolio and engagement at CityFibre.
The government recently announced that – despite delays and setbacks – a new Emergency Services Network (ESN) will be phased in from the beginning of 2019. It will replace the legacy infrastructure, Airwave, a terrestrial trunked radio (Tetra) network.
Now, Tetra and similar professional mobile radio (PMR) networks are a common approach to public safety comms the world over, delivering reliable voice services and basic messaging, and providing effective coverage over great distances. Tetra can carry data traffic, but is not appropriate for large data packet transmission, as it supports only narrowband connectivity and kilobit throughput rates.
ESN, on the other hand, can provide multi-megabit per second data rates and multimedia capabilities as well as traditional voice and messaging services. Its completion (date TBC) will make the UK the first country in the world to deliver critical voice and data for emergency services over a 4G network. Police, fire and rescue, and ambulance services will be able to take advantage of these services from early next year, with voice capability available at a later stage.
This will also be of far higher quality than the legacy system, as LTE can adapt its modulation to the link quality, ensuring that even in poor link conditions the network will continue to provide a voice or low-data connection. Emergency service personnel will be equipped with a multi-feature handset – the result of a £210 million government contract with Samsung – which will also include a ‘push-to-talk’ function, effectively turning them into enhanced radios.
The introduction of data services to the consumer and enterprise markets has been transformative – just think about how often you use WhatsApp, share images and videos, send voice notes and conference call over IP, for example.
In that light, it is hoped that digitalised communication and data services will also be a game-changer for the emergency services. Paramedics will be able to send videos and images to A&E staff to allow them to more effectively prepare for patients’ arrival. Police officers – many of whom are now equipped with body-worn cameras – will be able to live stream video. Real-time content transmission will be used to aid satellite surveillance.
A challenging transition
This sounds great, but the roll-out has not been without its challenges. Work has begun to build the first mobile mast for the network, near Lockerbie in Scotland, but the entire network is a while off completion and an exact end date hasn’t been specified.
Behind schedule and over budget, completing the build of the standalone public safety network will likely take between five and 10 years. Network operator EE is developing a dedicated core system to support ESN, including hundreds of 4G sites to expand coverage in rural areas, while 800MHz spectrum will be deployed across thousands of other locations. Additional functionality, such as push-to-talk and group chat, will require new core, signal processing and radio interface protocols to be developed, tested and deployed. It’s therefore little surprise that the government has been hesitant to confirm a date for the voice services element of ESN to be fully up and running.
A second challenge concerns coverage. As a public safety comms network, coverage must be ubiquitous, allowing emergency service personnel to communicate wherever they are. This is less of an issue for the legacy Tetra system, which uses low frequency, narrowband 400MHz spectrum, which delivers effective coverage and in-building penetration in almost any environment.
ESN’s 700MHz, on the other hand, has less range and penetration, which could compromise communication in areas like basements, tunnels, caves, and remote rural locations – just the kind of areas where the public are potentially at greatest risk of an incident!
Coverage is also a concern for those who’ll actually be using ESN – emergency services personnel. In a survey conducted in 2017, individuals from police forces, fire authorities and ambulance trusts ranked network coverage as number one in their list of concerns regarding the transition to the new network.
Rapid response required
Networks must be densified to provide coverage in hard-to-reach areas, and building owners – modern building infrastructure can itself impair coverage – must invest in in-building coverage solutions that deliver scalable, affordable LTE public safety communications in these challenging environments. The same goes for road and rail tunnels. Public safety comms solutions must be flexible and straightforward to adapt and upgrade, depending on regional requirements.
Pushing back the original date of deployment was a wise – if unavoidable – move by the government. In the meantime, we’ll see a phased deployment, which will involve a hybrid Tetra and LTE network – an approach which Britain is not alone in taking. South Korea’s LTE public safety network – which is mid-deployment – is interoperable with legacy Tetra equipment, and in France, an LTE network is being built which will share Tetra’s current infrastructure.
The project is an ambitious one and challenges are to be expected. However, these can be overcome with time and through considered investment in innovative new network coverage solutions. The government has estimated that ESN will result in annual savings of £200 million, yet these will only be realised once Airwave has been fully replaced.
Time is ticking, and although many members of the public won’t have heard as much about LTE ESN as they have 5G, this is still public money and government time that many will be keeping a close eye on. The government and private sector partners must work hard in 2019 to ensure that the ESN can deliver on the price point and performance promised.
This is a guest blog post by Ingo Flomer, vice president of business development and technology at Cobham Wireless
This is a guest blog by Manish Jethwa, CTO at IoT specialist Yotta
Our national infrastructure assets are under more strain than ever. According to the Office for National Statistics (ONS), the UK population was just over 66 million in June 2017. By 2041, the ONS projects it will reach almost 73 million. In line with this, road usage is increasing. According to the Department for Transport (DfT), 327.1 billion miles were driven on Great Britain’s roads in 2017, a 1.3% increase from the previous year, and up nearly 17% on the corresponding figure for 1997.
At the same time, environmental conditions are becoming more challenging as climate change accelerates. In 2018 alone, we have seen the ravages of the ‘Beast from the East’, followed by heatwave conditions as the country experienced one of the hottest summers on record.
In response to these pressures, councils clearly need to put in place a maintenance strategy that protects their infrastructure assets. Given ongoing budget cuts, effective asset management will be required to better identify and target the most critical and vulnerable assets with available funds.
The latest Internet of Things (IoT) technology and sensors can be beneficial here, but if councils want to deliver an operationally efficient and environmentally friendly approach, they need to use them selectively. Councils should be aware that the manufacture and energy usage of these devices have an environmental cost, and make sure they balance that impact against the benefits gained from using them.
That said, there are a range of applications where benefits can be achieved. Selective use of sensors can, for example, be crucial in helping councils develop a maintenance strategy that protects against the dangers of flooding. The increased surface water from storms can put a strain on infrastructure that only regular maintenance can help alleviate. Water and silt level sensors mounted in drains can be the key to identifying those assets that are more likely to get blocked, or that are likely to have the biggest impact if they are blocked, allowing these drains to be maintained more frequently. If these drains and sluices are kept clear, the impact of storms is lessened and the potential for flooding reduced.
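To make that concrete, here is a purely hypothetical sketch of how such readings might be turned into a maintenance schedule – the field names, threshold and impact scores are illustrative, not any real council’s schema:

```python
def maintenance_priority(drains, silt_threshold=0.6):
    """Return drains whose silt-level reading exceeds the threshold,
    ranked by silt level weighted by a flood-impact score, worst first."""
    at_risk = [d for d in drains if d["silt_level"] > silt_threshold]
    return sorted(at_risk,
                  key=lambda d: d["silt_level"] * d["impact"],
                  reverse=True)

# Illustrative sensor readings: silt_level is a 0-1 fill reading,
# impact a hand-assigned flood-consequence score.
readings = [
    {"id": "D1", "silt_level": 0.9, "impact": 2},  # badly silted, low impact
    {"id": "D2", "silt_level": 0.4, "impact": 5},  # clear enough to skip
    {"id": "D3", "silt_level": 0.7, "impact": 4},  # silted AND high impact
]
schedule = maintenance_priority(readings)
# D3 outranks D1 (0.7 x 4 beats 0.9 x 2); D2 is below the threshold
```

The point of the weighting is exactly the one in the text: scarce maintenance budget goes first to the assets that are both likely to block and costly if they do.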
Another area where judicious use of IoT solutions can improve public safety is in the development of smart tree technology. Sensors are now emerging capable of measuring the movement of trees to assess their condition and stability.
Once again though, the sensors need to be distributed sparingly to ensure the environmental benefit of trees is not offset by the cost and ongoing energy use of monitoring them. Councils again need to find the most vulnerable trees, typically using data about age and condition together with the expertise of internal arboreal teams, and only mount sensors where required.
Using sensors to target areas of vulnerability within these two different asset classes could provide the authorities with the potential to start building an intelligent combined asset management approach. By pinpointing where trees and the most vulnerable drains are located, they can start to establish connections between the two. That might help them see, for example, that they have a large number of deciduous trees located next to a high number of drains in a low spot. Leaf fall is therefore likely to be high, and the fallen leaves tend to wash together and block the drains as a result. That insight can be a significant benefit to councils, who can then better understand how different asset classes link together and how maintaining one can positively impact the status of another.
As we look to the future, councils should be carefully monitoring the latest advances in IoT and sensor technology. The use of sensors can bring significant benefits. However, a note of caution needs to be sounded. There are costs to using sensors, both financial and environmental, and therefore they should never be used in a scattergun manner. Instead, their deployment should be precisely targeted as part of a carefully planned, coordinated and, ideally, connected asset management strategy. If used in that way, they can be a real boon for councils looking to mitigate the worst impacts of ongoing wear and tear, and severe climatic events like storms and flooding, on their infrastructure assets.
This is a guest blog post by Tony Judd, MD UKI and Benelux at Verizon
With new technologies such as artificial intelligence (AI), the Internet of Things (IoT) and software-defined networking (SDN) impacting almost every aspect of modern business, organisations are having to transform their networks in order to take advantage of these technology developments. However, this change is being used by some to falsely prophesy the end of multiprotocol label switching (MPLS) – a claim that couldn’t be further from the truth.
Although SDN and other networking techniques are transforming how networks are architected and operated, they do not actually replace the functionality that MPLS provides. It is true that SDN has helped drive opportunities to augment network architectures with lower-cost broadband and public internet connections to enable hybrid networking. However, SDN does not actually replace the need for higher-quality MPLS connections for critical applications, as some over-the-top (OTTP) network providers might have you believe. Both technologies will coexist and, in fact, SDN will depend on MPLS for traffic management and security – the attributes that made MPLS networks reliable and desirable in the first place.
Recent technology advances such as media streaming, social media and mobility have generated massive amounts of data that flow into networks from a myriad of devices. Now, as IoT, AI and edge computing environments start to go live, data volumes will grow even more astronomical. Currently, 2.5 exabytes of data are generated daily, and Cisco estimates that data volume is growing at an annual rate of 24% through 2021.
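That 24% compound rate adds up quickly, as a quick illustrative calculation shows:

```python
def projected_daily_exabytes(current=2.5, annual_growth=0.24, years=0):
    """Compound today's daily data volume forward at a fixed annual rate."""
    return current * (1 + annual_growth) ** years

# At 24% a year, 2.5EB/day becomes roughly 4.8EB after three years
# and roughly 5.9EB after four - well over double the starting volume.
three_years = projected_daily_exabytes(years=3)
four_years = projected_daily_exabytes(years=4)
```

In other words, at Cisco’s quoted rate the world’s daily data output more than doubles within four years, which is the pressure driving the WAN changes described here.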
Combined, all of these recent and ongoing technology developments – cloud streaming, IoT, mobility – have changed how enterprises consume applications and, as a result, have also changed bandwidth demands and wide area network (WAN) traffic patterns. As such, enterprises face serious challenges related to scalability, security and network performance. Network traffic is unpredictable and much of it flows from multiple sources dispersed throughout private and public cloud infrastructures as well as data centres.
Scalability limitations and security concerns are more pronounced for enterprises that use multiple vendors to run their networks. Like the reliability of the network itself, security policies and solutions vary from vendor to vendor. For instance, OTTP service providers deliver security at the application layer because they don’t own the underlying network, so the data they handle can become more vulnerable when crossing network boundaries. That’s because elements of the underlying networks are managed by multiple service providers that don’t always communicate or collaborate with each other. In contrast, a provider that owns the underlying network infrastructure can design a secure network to meet enterprise needs.
Easing traffic congestion
To get the most out of their SDN investments, enterprises should use MPLS for critical applications and locations and simply supplement with broadband for less critical traffic. MPLS is designed with the built-in security and scalability that modern businesses demand. Network providers that own the underlying network can deliver strong protection against increasingly common types of cyber-attacks – DDoS (distributed denial of service), ransomware and zero-day threats.
Today’s enterprises also need smart networks that prioritise traffic based on the applications they use, both at the point of entry to and exit from the network. Intelligent networks prioritise each application and allocate the proper amount of bandwidth. For instance, the network distinguishes audio and video applications, which require higher priority, from casual internet browsing.
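This kind of application-aware prioritisation can be sketched with a simple priority queue – the traffic classes and ordering below are illustrative, not any particular vendor’s QoS scheme:

```python
import heapq

# Lower number = sent first; real-time media outranks casual browsing.
PRIORITY = {"voice": 0, "video": 1, "browsing": 2}

def schedule(packets):
    """Drain packets in priority order, preserving arrival order within
    a class (the arrival sequence number breaks ties)."""
    queue = []
    for seq, (app, payload) in enumerate(packets):
        heapq.heappush(queue, (PRIORITY[app], seq, payload))
    out = []
    while queue:
        _, _, payload = heapq.heappop(queue)
        out.append(payload)
    return out

arrivals = [("browsing", "cat pics"), ("voice", "call frame"),
            ("video", "stream chunk"), ("voice", "call frame 2")]
# voice frames go out first, browsing traffic last
```

However mixed the arrival order, the latency-sensitive voice frames jump the queue while the web browsing waits its turn – which is the behaviour the smart network promises.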
This refined approach to traffic balancing isn’t available through public internet connections, but there are providers that offer private MPLS connections and monitor those connections around the clock to maintain performance, scalability and security.
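The application-aware prioritisation described above can be sketched in miniature. This is an illustrative toy model only – the application classes and priority values below are invented for the example, not drawn from any vendor’s actual QoS configuration:

```python
# Toy model of application-aware traffic prioritisation, as an
# intelligent network might apply at the edge. Class names and
# priority values are hypothetical.

import heapq

# Lower number = higher priority (dispatched first)
APP_PRIORITY = {
    "voice": 0,      # real-time audio: most latency-sensitive
    "video": 1,      # streaming/conferencing video
    "browsing": 2,   # casual internet browsing
    "bulk": 3,       # backups, software updates
}

def dispatch_order(packets):
    """Return payloads in the order a priority queue would release them."""
    queue = []
    for seq, (app, payload) in enumerate(packets):
        # seq breaks ties so equal-priority packets keep arrival order
        heapq.heappush(queue, (APP_PRIORITY[app], seq, payload))
    return [payload for _, _, payload in
            (heapq.heappop(queue) for _ in range(len(queue)))]

arrivals = [("bulk", "backup-chunk"), ("voice", "call-frame"),
            ("browsing", "page-request"), ("video", "stream-segment")]
print(dispatch_order(arrivals))
# → ['call-frame', 'stream-segment', 'page-request', 'backup-chunk']
```

Even though the bulk transfer arrived first, the latency-sensitive voice frame jumps the queue – the essence of the prioritisation the article describes.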
Advanced security through MPLS
A further benefit of MPLS is that it can help to deliver strong security through the design of the network itself. Over private connections, MPLS forwards traffic on labels rather than IP addresses, which hides the internal structure of the core network from the outside.
In addition, MPLS can be used to put in place additional controls customised to an organisation’s specific needs. These controls can typically support an organisation’s compliance with industry-specific regulations or standards such as HIPAA (Health Insurance Portability and Accountability Act) for healthcare and PCI DSS (Payment Card Industry Data Security Standard) for retailers and other businesses that process credit card information.
MPLS and SDN: working together
Without a doubt, SDN is changing how networks are managed, bringing increased flexibility and scalability to enterprises and allowing them to dial services up and down as required. However, SDN will not mean the end of MPLS; instead, SDN will rely on MPLS to strengthen security and manage traffic effectively. With this in mind, businesses that want to put in place the latest and greatest digital capabilities should adopt a network strategy that delivers the best of both worlds: SDN controls combined with MPLS capabilities.
This is a guest blog post by Brendan O’Rourke, head of design at BriteBill, an Amdocs company.
It’s no secret that today’s customers are more demanding than previous generations. They share their experiences online, they expect instant gratification 24-7, and the competition is just a mouse click away.
The communications and media sector hasn’t traditionally been one that impresses customers with great service, but just how badly does it fare compared with other industry sectors? The latest UK Customer Satisfaction Index, published in July 2018 by the Institute of Customer Service, has the answer. It found that satisfaction among UK consumers, across all verticals, stands at 77.9 out of 100. Telecoms scored 74.3, making it the second lowest-scoring vertical – only the transport sector fared worse (72.5).
Unsurprisingly, this low level of customer satisfaction translates into high levels of customer churn. A new TM Forum Quick Insight Report, titled ‘Inspire loyalty with customer lifecycle management’ and sponsored by BriteBill, found that postpaid churn currently ranges from 5% to 32% per year.
While it’s difficult to put an exact figure on the cost of churn, consider this: The average mobile operator in a mature market spends 15-20% of service revenues on acquisition and retention, compared with the average Capex spend on infrastructure (networks and IT) of just 15% of revenues.
Canada’s BCE and Telus revealed in 2017 that it cost them almost 50 times less to keep an existing mobile customer than to acquire a new one, with retention costs of CAD11.04 and CAD11.74 respectively, while the average subscriber acquisition cost weighed in at an eye-watering CAD521. In a saturated market, it would seem sensible for service providers to focus on keeping existing customers, rather than trying to lure new ones away from competitors.
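The “almost 50 times” claim is easy to sanity-check from the figures cited. A quick back-of-the-envelope calculation (using only the CAD numbers reported above):

```python
# Back-of-the-envelope check of the retention vs acquisition figures
# reported for BCE and Telus (2017, CAD).

acquisition_cost = 521.00      # average subscriber acquisition cost
retention_cost_bce = 11.04     # BCE's cost to keep an existing customer
retention_cost_telus = 11.74   # Telus's equivalent figure

print(round(acquisition_cost / retention_cost_bce, 1))    # → 47.2
print(round(acquisition_cost / retention_cost_telus, 1))  # → 44.4
```

So acquisition runs roughly 44–47 times the cost of retention, consistent with the “almost 50 times” characterisation.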
There is, of course, a direct link between customer experience and churn rates. It’s obvious that a positive customer experience aids in retention, but can it be quantified? According to the TM Forum report’s main author and senior analyst, Catherine Haslam, yes it can. Australian service provider Optus achieved a 1.4% reduction in churn amongst its retail post-pay customers by raising its net promoter score (NPS) by six points, while another large service provider saw a 3% decrease in churn following a 25-point boost in NPS.
So, how do you boost NPS?
“It’s so simple yet easy to forget: Nothing is more important than engaging with your customers in a proactive and positive way,” says Haslam. “All too often, communication between service providers and their customers is reduced to the monthly bill – hardly a positive experience for most – and occasional calls to customer care when there is a problem. Service providers are missing a trick by not using opportunities to interact in positive ways, throughout the customer lifecycle.”
Billing, for example, should be a retention tool rather than a churn agent. The truth is that customers find bills boring and difficult to understand. According to a study by the UK’s USwitch (June 2018), one in six mobile users hasn’t even checked their bill in the last six months. When asked why, 18% (or 1.3 million mobile users) said they simply couldn’t be bothered.
Sadly, while service providers may have spent millions upgrading their IT systems to support the digital customer experience, they tend to overlook simple outputs such as the bill. As Haslam puts it, “the first bill is so important, but often it’s very different from a customer’s expectations.” Even if the charges shown on a communications bill are correct, they can be confusing. Factors such as device leases, proration (partial month billing), billing in advance for some services and in arrears for others, overages and vague descriptions all contribute to the complexity, leaving the user utterly befuddled.
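Proration is a good illustration of why that first bill can defy expectations. The sketch below uses entirely invented plan prices to show how a mid-month plan change produces two partial charges plus a month billed in advance:

```python
# Hypothetical example of proration (partial-month billing).
# All fees and dates are invented for illustration.

def prorate(monthly_fee, days_used, days_in_month=30):
    """Charge for the fraction of the month a plan was active."""
    return round(monthly_fee * days_used / days_in_month, 2)

old_plan, new_plan = 20.00, 35.00  # monthly fees before/after an upgrade
switch_day = 10                    # customer upgrades on day 10 of 30

first_bill_after_upgrade = round(
    prorate(old_plan, switch_day)          # 10 days on the old plan: 6.67
    + prorate(new_plan, 30 - switch_day)   # 20 days on the new plan: 23.33
    + new_plan,                            # next month billed in advance: 35.00
    2,
)
print(first_bill_after_upgrade)  # → 65.0
```

A customer expecting a £35 bill instead sees £65, with three separate line items – correct, but befuddling without a clear explanation.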
It’s high time service providers took a fresh approach to bills – one that can pay measurable dividends. Cricket Wireless, for example, a subsidiary of AT&T, ran a campaign called ‘Let’s Look Inside Your Bucket’. This inventive campaign used a video-based approach not only to communicate information, but also to combine offers with a healthy dose of humour. The campaign was incredibly successful and led to a whopping 37% reduction in early customer churn.
For service providers looking to transform their bill, the TM Forum recommends the following: communicating billing information accurately, clearly and concisely, demonstrating value, and including new information – such as making customers aware of new products and services that are relevant to them. If a service provider can do this, and do it well, they can change their bill from a churn driver into a valuable retention tool. And quite possibly put an end to the era of boring bills.
With BT facing negative PR all around after it announced a strategic realignment that will see 13,000 employees made redundant and the closure of its central London HQ of more than 100 years, even the good work (such as pivoting the organisation towards full-fibre broadband at long last) that CEO Gavin Patterson has done in his five years at the top wasn’t enough to save him from jumping before he was pushed.
The news of Patterson’s resignation broke today after it emerged early this week that angry BT shareholders were mobilising against him. In meetings with chairman Jan du Plessis, it appears that both agreed that even though the strategy changes were the right move, he was not the right man to oversee them.
With a long history at the business, Patterson seemed like the sort of safe pair of hands that befits monolithic organisations like BT. He was never a Bill Gates or Steve Jobs-style techno visionary, but that sort of leadership would have been out of place at BT.
On the handful of occasions I met him, he struck me as a generally likeable man, with a slightly roguish demeanour that reminded me a little of the sort of lads you find in IT sales organisations. If you transported him back to the 1980s and dropped him in the City of London, he’d fit right in, and would probably drive a white Porsche cabriolet.
With BT facing tough choices over the next few months, I would imagine the organisation will go for another safe pair of hands, someone who knows the business and is ready and able to steer it through the choppy waters ahead.
For me, this suggests BT will look within for its next leader, so it may be worth keeping an eye on some of the likely internal candidates. Who are they, then?
Twice in recent years BT called on the head of its Retail business to step up, and both Ian Livingston and Patterson answered the call. Retail is now BT Consumer, led by EE’s man Marc Allera, but I reckon he’s not steeped enough in the organisation’s culture yet.
The CEO of Global Services, Bas Burger, is probably right out, given its troubles, and for my money, so is the head of Technology, Service and Operations, Howard Watson, who is more of a tech specialist. But maybe BT would turn to its Business and Public Sector organisation, led by Graham Sutherland, who oversaw a healthy sales bump and some tasty contract wins last year.
Another name in the hat could be Gerry McQuade, CEO at BT Wholesale and Ventures, a declining business unit as traditional voice revenues wither, but would that give him the oomph to drive the wider organisation?
BT could even consider Openreach boss Clive Selley – a tricky proposition given the organisation’s quasi-independent status, and Selley is, again, a very technical man – but might he be able to bring a new perspective to the wider group? Based on my acquaintance with Selley, I have no doubt he would do his utmost to keep up the much-needed pressure to build next-generation networks.
And of course, BT could still surprise us all and tap a complete outsider. One thing is for sure, whoever steps up is going to have one hell of a job.
After yet another set of lacklustre earnings and with over 10,000 staff facing the axe, embattled BT CEO Gavin Patterson needed a quick win. Today’s launch of new consumer offerings from both BT and EE, along with a plan to converge its broadband and 4G networks seem, at face value, to give him that.
At an event in London today BT bet big on its Consumer unit – which is made up of its broadband retail business, EE, and ‘cheap-n-cheerful’ ISP Plusnet. It made over 20 announcements, of which a tie-up with Amazon Prime Video, more content for BT Sport, a managed smart home ecosystem, and the repatriation of its dreaded outsourced call centres to UK shores probably have the most kerb appeal for consumer service buyers.
I think a big part of this strategy is to convince consumers that BT is a natural home for them. Here, BT has a clear advantage as the incumbent (and up until the ’80s the state-run monopoly), and can draw on this to demonstrate that it’s a safe bet for the average user, being the only provider with the size, scale and money to bring together its converged network and customer services vision and wrap it together with lots of nice little perks, such as free 4G Wi-Fi routers if your broadband goes down, or access to Amazon exclusives such as The Grand Tour. No argument there.
But what struck me at today’s press conference was that BT made scant mention of full-fibre, or Openreach’s pivot towards the so-called gold standard of broadband. But then, I wondered, why would it need to? The converged network offering offers a nice speed boost almost right out of the gate in the form of a new router that bonds together a fixed and mobile connection, and to be scrupulously fair, the content streaming experience on a superfast connection is generally as good as on an ultrafast one.
Everything announced today was predicated on making the online experience easier for consumers, not on expanding access to full-fibre. Clever BT has decided that this sort of thing is what consumers want, and in many ways it’s got that right. If the tone of coverage in the mainstream press is anything to go by, this strategy will work out well for it and I expect its customer acquisitions will duly spike a little in the next few months.
Ask yourself this: how can rapidly expanding full-fibre suppliers such as CityFibre compete with that? Sure, you can pay an altnet for an ultrafast connection and it’ll be great, no question, but after that you’re on your own. And let’s be frank here, nobody else building pure full-fibre networks in this country really has a hope of being able to afford Premier League football rights, or to tie up with content producers like Amazon and Netflix.
Yes, in terms of competition, this is great for BT. But I can’t help but think that to some extent, BT is tinkering with easy fixes and consumer-pleasing add-ons when it ought to be pulling out all the stops on full-fibre. I think it’s in danger of falling back into bad habits and leading on broadband that is, well, just good enough, and I don’t want to see that.
As we’ve been saying here for years, good enough broadband isn’t good enough for Britain’s digital future, and we have a responsibility not to let BT take the easy way out and make good enough out to be desirable. Today’s announcements are good news, but they also show how important it is to keep holding BT’s feet to the fire, and to keep talking about full-fibre as a priority.