As the UK’s full-fibre broadband roll-out gathers pace through 2018 and into 2019, many people continue to believe gigabit services are unnecessary overkill. CityFibre’s Caroline Hughes reckons they’re mistaken.
As very young children, my brother and I worshipped our now-vintage Dragon 32 computer. It was no Sinclair ZX Spectrum or BBC Micro, but I’ve never forgotten the fun and enlightenment brought to my childhood by that space-age typewriter!
It came with a few cassette-style games that loaded slowly, loudly and often fifth time lucky, and the only connectivity it had was to the national grid via a power socket. Yet, my brother and I waited patiently, played happily and even spent hours typing in page upon page of computer-magazine-published code to make it draw the simplest of pictures.
In time, our beloved Dragon 32 was superseded in our home by an even more revered Amstrad CPC464. And since those days, countless computers and devices have come and gone in our lives – each more powerful, transformational and immersive than the last.
Little did I know back then that I would spend my career in the heart of the telecommunications industry, or that almost 40 years later I would be sat here reflecting on how far our use of computing technology has come and how dependent we all are on the speed and quality of the connectivity it now demands.
There was also no way I could ever have imagined that by now I would’ve worked for a company like CityFibre for almost three years; helping to realise its founders’ mission to roll out whole-city full-fibre broadband networks across the UK and celebrating the gigabit-speed services that are now setting new connectivity standards for homes and businesses across Britain.
Anyone who takes time to reflect on how far our industry has come in the last 10 years – let alone since the early home computing days of the 1980s – would be foolish to underestimate where the next 10 to 40 years might catapult us. Yet as CityFibre continues to march ahead, executing its plans to deliver full-fibre connections to over 5 million homes by 2025, I see some astonishing comments from some unexpected naysayers!
In most cases, the announcement that our fibre-to-the-premises (FTTP) services are coming to an area elicits a widespread, heartfelt welcome, huge sighs of relief, or both! But mixed among the overwhelmingly positive reactions are occasional comments that “homes simply don’t ‘need’ gigabit speeds” and that what we are doing is “overkill” or “just for publicity”. And while the average person can be forgiven for this, shockingly, these comments usually come from individuals working within the telecoms industry!
Surveys, like the one we carried out among UK gamers last year, reinforce the short-sightedness of such comments by putting a magnifying glass on the impact that poor connectivity has on this sub-segment of the consumer market. Among other findings, the survey data highlighted that, at today’s average UK connection speeds, downloading recent game releases such as Call of Duty: Black Ops 4 actually takes longer than it would to fly from the UK to the game developer’s headquarters in California!
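To put rough numbers on that claim, here’s a minimal back-of-the-envelope sketch; the game size and connection speeds are illustrative assumptions, not figures from the survey.

```python
# Back-of-the-envelope download times (illustrative figures, not survey data)
GAME_SIZE_GB = 100  # assumed install size for a modern AAA title
SPEEDS_MBPS = {
    "average UK connection (assumed)": 18.5,
    "full-fibre gigabit": 1000,
}

for label, mbps in SPEEDS_MBPS.items():
    seconds = GAME_SIZE_GB * 8 * 1000 / mbps  # GB -> megabits, divided by Mbps
    print(f"{label}: {seconds / 3600:.1f} hours")

# average UK connection (assumed): 12.0 hours
# full-fibre gigabit: 0.2 hours (roughly 13 minutes)
```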
But my main sympathy doesn’t actually sit with the poor soul waiting over 12 hours for their download. It sits firmly with the tortured mother, brother or flatmate who’s trying to do what is now relatively ‘normal stuff’ online at the same time, i.e. working from home, video chatting with family overseas or streaming a movie.
Need or want? Should it make a difference?
What fascinates me most about the survey though, is how it highlights that not all people who live in UK homes define what they ‘need’ from connectivity in the same way. In many spheres of our lives, what we ‘need’, what we ‘want’ and what ‘puts a great big grin on our face’ are often very different things. I never ‘needed’ that Dragon 32, but it inspired me, it gave me skills and insight, and it definitely made me smile! How we relax in the evening or spend our precious downtime on the weekend is personal to each of us, and, as time progresses, connectivity is unarguably evolving and enriching many of these experiences.
Connected devices are already rife within our homes and even if we can’t predict exactly what our personal digital environments will look like in 20 years’ time, past experience tells us that technology won’t stand still. And, without gigabit-capable fibre connections ready to underpin those inevitable advances, woe betide any ‘regular’ user! Let alone one who finds themselves living with someone whose spare time is spent immersing themselves in the latest gaming experience.
Well-connected homes enable so much more…
Of course, it’s not just about being able to support the entertainment, communication and technology desires of homeowners. Gigabit-speed networks also have a vital role to play in supporting more basic human needs. Two such examples are digital health and social care, and remote learning. Both are areas that local governments are now targeting as part of their smart city and digital agendas. As the technologies in these areas advance, the services deployed will become wholly dependent on ubiquitous high-speed fixed and wireless connectivity. And any remaining digital divide at a community level will hinder the efficiencies and cost savings that are driving their inception.
Investment in ubiquitous gigabit-capable networks also radically simplifies and reduces the cost of connecting the small cells that will need to be delivered at density across whole cities in support of enhanced 4G mobile, 5G and smart city IoT. And with that, it comes full circle back to the individual again! Consider the enhanced mobile data demands of you, me and countless others sat in our self-driving cars as we use the valuable downtime to catch up on work, communicate with family and even stream entertainment and news.
The motivation for broadband is simple
Our industry has a duty to serve all – from the smallest child with their first tablet device to the early adopters of tomorrow. If, as part of this, we fail to look to the future, then we don’t deserve that responsibility.
Committed gamers, the ever-increasing number of connectivity-dependent homeworkers, ‘garage innovators’ and home-based business owners are just a few examples at one end of the consumer spectrum. Yes, they are sub-segments of a large and diverse market sector. But, for as long as I’ve been in this industry, these segments have shone a light on the future, highlighting connectivity requirements that have, over time, come to be considered a human right by almost everyone else.
If we cut back to where we are today, it’s clear that investing in and building a digital infrastructure capable of serving everyone and everything in an area with gigabit speeds is neither wasteful nor something to be questioned. We are way beyond that! It’s common sense, enabling for the future and truly exciting, especially to anyone who knows what a full-fibre connection is theoretically capable of delivering – right here, right now!
Our duty is to deliver choice
Comparatively speaking, low- and mid-speed services will always be available! For those who only need to do tomorrow’s equivalent of making phone calls, downloading e-books, paying bills or sending emails, service providers will still be able to provide entry level services over a modern high-speed network. But for many, the desire for those higher speeds is already here and for a large percentage of the rest, technology will drive them there in the blink of an eye.
Even if technology evolves slower than we anticipate, it’s no excuse not to ensure we have the infrastructure and services ready for the future. As an industry our job is to predict, prepare for and embrace the future, feed the hunger and leave no one behind.
Caroline Hughes is head of marketing – portfolio and engagement at CityFibre.
The government recently announced that – despite delays and setbacks – a new Emergency Services Network (ESN) will be phased in from the beginning of 2019. It will replace the legacy infrastructure, Airwave, a terrestrial trunked radio (Tetra) network.
Now, Tetra and similar professional mobile radio (PMR) networks are a common approach to public safety comms the world over, delivering reliable voice services and basic messaging, and providing effective coverage over great distances. Tetra can carry data traffic, but it is not suited to transmitting large data packets, as it supports only narrowband connectivity and kilobit throughput rates.
ESN, on the other hand, can provide multi-megabit-per-second data rates and multimedia capabilities as well as traditional voice and messaging services. Its completion (date TBC) will make the UK the first country in the world to deliver critical voice and data for emergency services over a 4G network. Police, fire and rescue, and ambulance services will be able to take advantage of the data services from early next year, with voice capability available at a later stage.
Quality will also be far higher than on the legacy system, as LTE can adapt its modulation to the quality of the signal link, ensuring that even in poor link conditions the network will continue to provide a voice or low-rate data connection. Emergency service personnel will be equipped with a multi-feature handset – the result of a £210 million government contract with Samsung – which will also include a ‘push-to-talk’ function, effectively turning the handsets into enhanced radios.
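As a rough illustration of how that link adaptation works, the sketch below maps a measured signal-to-noise ratio to a modulation choice and the services it can sustain; the thresholds and scheme names are simplified assumptions, not ESN’s actual parameters.

```python
# Simplified LTE-style link adaptation (illustrative thresholds, not ESN's real parameters)
def select_service(snr_db: float) -> str:
    """Pick the richest service the current link quality can sustain."""
    if snr_db >= 20:
        return "64-QAM: full multimedia (video, images, high-rate data)"
    if snr_db >= 10:
        return "16-QAM: medium-rate data and voice"
    if snr_db >= 0:
        return "QPSK: voice and low-rate data only"
    return "no usable link: re-attach or fall back"

for snr in (25, 12, 3, -5):
    print(f"SNR {snr:>3} dB -> {select_service(snr)}")
```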
The introduction of data services to the consumer and enterprise markets has been transformative – just think about how often you use WhatsApp, share images and videos, send voice notes and conference call over IP, for example.
In that light, it is hoped that digitalised communication and data services will also be a game-changer for the emergency services. Paramedics will be able to send videos and images to A&E staff to allow them to more effectively prepare for patients’ arrival. Police officers – many of whom are now equipped with body-worn cameras – will be able to live stream video. Real-time content transmission will be used to aid satellite surveillance.
A challenging transition
This sounds great, but the roll-out has not been without its challenges. Work has begun to build the first mobile mast for the network, near Lockerbie in Scotland, but the entire network is a while off completion and an exact end date hasn’t been specified.
Behind schedule and over budget, completing the build of the standalone public safety network will likely take between five and 10 years. Network operator EE is developing a dedicated core system to support ESN, including hundreds of 4G sites to expand coverage in rural areas, while 800MHz spectrum will be deployed across thousands of other locations. Additional functionality, such as push-to-talk and group chat, will require new core, signal processing and radio interface protocols to be developed, tested and deployed. It’s therefore little surprise that the government has been hesitant to confirm a date for the voice services element of ESN to be fully up and running.
A second challenge concerns coverage. As a public safety comms network, coverage must be ubiquitous, allowing emergency service personnel to communicate wherever they are. This is less of an issue for the legacy Tetra system, which uses low-frequency, narrowband 400MHz spectrum, delivering effective coverage and in-building penetration in almost any environment.
ESN’s 800MHz spectrum, on the other hand, has less range and penetration, which could compromise communication in areas like basements, tunnels, caves and remote rural locations – just the kind of places where the public are potentially at greatest risk in an incident!
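A simple free-space path-loss calculation illustrates why the higher band is harder work; real-world propagation is far more complicated, so treat this purely as a sketch.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for freq in (400, 800):
    print(f"{freq} MHz at 10 km: {fspl_db(10, freq):.1f} dB")

# Doubling the frequency adds ~6 dB of loss at any distance,
# before walls, terrain and body loss make matters worse.
```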
Coverage is also a concern for those who’ll actually be using ESN – emergency services personnel. In a survey conducted in 2017, individuals from police forces, fire authorities and ambulance trusts ranked network coverage as the number one concern regarding the transition to the new network.
Rapid response required
Networks must be densified in order to provide coverage in hard-to-reach areas, and building owners (modern building infrastructure can also impair coverage) must ensure that they invest in in-building coverage solutions which deliver scalable, affordable LTE public safety communications in these challenging environments. The same goes for road and rail tunnels. Public safety comms solutions must be flexible and straightforward to adapt and upgrade, depending on regional requirements.
Pushing back the original date of deployment was a wise – if unavoidable – move by the government. In the meantime, we’ll see a phased deployment, which will involve a hybrid Tetra and LTE network – an approach which Britain is not alone in taking. South Korea’s LTE public safety network – which is mid-deployment – is interoperable with legacy Tetra equipment, and in France, an LTE network is being built which will share Tetra’s current infrastructure.
The project is an ambitious one and challenges are to be expected. However, these can be overcome with time and through considered investment in innovative new network coverage solutions. The government has estimated that ESN will result in annual savings of £200 million, yet these will only be realised once Airwave has been fully replaced.
The clock is ticking, and although many members of the public won’t have heard as much about the LTE-based ESN as they have about 5G, this is still public money and government time that many will be keeping a close eye on. The government and its private sector partners must work hard in 2019 to ensure that ESN can deliver on the price point and performance promised.
This is a guest blog post by Ingo Flomer, vice president of business development and technology at Cobham Wireless
This is a guest blog by Manish Jethwa, CTO at IoT specialist Yotta
Our national infrastructure assets are under more strain than ever. According to the Office for National Statistics (ONS), the UK population was just over 66 million in June 2017. By 2041, the ONS projects it will reach almost 73 million. In line with this, road usage is increasing. According to the Department for Transport (DfT), 327.1 billion miles were driven on Great Britain’s roads in 2017, a 1.3% increase from the previous year, and up nearly 17% on the corresponding figure for 1997.
At the same time, environmental conditions are becoming more challenging as climate change accelerates. In 2018 alone, we have seen the ravages of the ‘Beast from the East’, followed by heatwave conditions as the country experienced one of the hottest summers on record.
In response to these pressures, councils clearly need to put in place a maintenance strategy that protects their infrastructure assets. Given ongoing budget cuts, effective asset management will be required to better identify and target the most critical and vulnerable assets with available funds.
The latest Internet of Things (IoT) technology and sensors can be beneficial here, but if councils want to deliver an operationally efficient and environmentally friendly approach, they need to use them selectively. Councils should be aware that the manufacture and energy usage of sensors do carry an environmental cost, and make sure they balance that impact against the benefits they gain from using them.
That said, there are a range of applications where benefits can be achieved. Selective use of sensors can, for example, be crucial in helping councils develop a maintenance strategy that protects against the dangers of flooding. The increased surface water from storms can put a strain on infrastructure that only regular maintenance can help alleviate. Water and silt level sensors mounted in drains can be the key to identifying those assets that are most likely to get blocked, or whose blockage would have the biggest impact, allowing these drains to be maintained more frequently. If these drains and sluices are kept clear, the impact of storms is lessened and the potential for flooding reduced.
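A minimal sketch of how such sensor readings might feed a maintenance priority list is shown below; the field names, values and weightings are hypothetical, invented purely for illustration.

```python
# Hypothetical drain-sensor triage; field names, values and weights invented for illustration
drains = [
    {"id": "D-101", "silt_pct": 85, "water_pct": 70, "floods_road": True},
    {"id": "D-102", "silt_pct": 20, "water_pct": 10, "floods_road": False},
    {"id": "D-103", "silt_pct": 60, "water_pct": 90, "floods_road": True},
]

def risk_score(drain: dict) -> float:
    """Weight blockage likelihood (sensor levels) by the impact of a blockage."""
    likelihood = max(drain["silt_pct"], drain["water_pct"]) / 100
    impact = 2.0 if drain["floods_road"] else 1.0
    return likelihood * impact

# Maintain the highest-risk drains first
for drain in sorted(drains, key=risk_score, reverse=True):
    print(drain["id"], round(risk_score(drain), 2))
```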
Another area where judicious use of IoT solutions can improve public safety is in the development of smart tree technology. Sensors capable of measuring the movement of trees to assess their condition and stability are now emerging.
Once again though, the sensors need to be distributed sparingly to ensure the environmental benefit of trees is not offset by the cost and ongoing energy use of monitoring them. Councils again need to find the most vulnerable trees, typically using data about age and condition together with the expertise of internal arboreal teams, and only mount sensors where required.
Using sensors to target areas of vulnerability within these two different asset classes could give authorities the potential to start building an intelligent, combined asset management approach. By pinpointing where the most vulnerable trees and drains are located, they can start to establish connections between the two. That might help them see, for example, that they have a large number of deciduous trees located next to a high number of drains in a low spot. Leaf fall is therefore likely to be heavy, and the fallen leaves tend to wash together and block the drains as a result. That insight can be a significant benefit to councils, who can then better understand how different asset classes link together and how maintaining one can positively affect the status of another.
As we look to the future, councils should be carefully monitoring the latest advances in IoT and sensor technology. The use of sensors can bring significant benefits. However, a note of caution needs to be sounded. There are costs to using sensors, both financial and environmental, and they should therefore never be used in a scattergun manner. Instead, their deployment should be precisely targeted as part of a carefully planned, coordinated and, ideally, connected asset management strategy. If used in that way, they can be a real boon for councils looking to mitigate the worst impacts of ongoing wear and tear, and of severe climatic events like storms and flooding, on their infrastructure assets.
This is a guest blog post by Tony Judd, MD UKI and Benelux at Verizon
With new technologies such as artificial intelligence (AI), the Internet of Things (IoT) and software-defined networking (SDN) impacting almost every aspect of modern business, organisations are having to transform their networks in order to take advantage of these developments. However, this change is being used by some to falsely prophesy the end of multiprotocol label switching (MPLS) – a claim that couldn’t be further from the truth.
Although SDN and other networking techniques are transforming how networks are architected and operated, they do not actually replace the functionality that MPLS provides. It is true that SDN has helped drive opportunities to augment network architectures with lower-cost broadband and public internet connections to enable hybrid networking. However, SDN does not remove the need for higher-quality MPLS connections for critical applications, as some over-the-top (OTT) network providers might have you believe. Both technologies will coexist and, in fact, SDN will depend on MPLS for traffic management and security – the attributes that made MPLS networks reliable and desirable in the first place.
Recent technology advances such as media streaming, social media and mobility have generated massive amounts of data that flow into networks from a myriad of devices. Now, as IoT, AI and edge computing environments start to go live, data volumes will grow to astronomical levels. Currently, 2.5 exabytes of data are generated daily, and Cisco estimates that data volume is growing at an annual rate of 24% through 2021.
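Compounding that growth rate shows how quickly volumes escalate; this is a simple sketch assuming the 24% rate compounds annually from a 2017 baseline of 2.5 exabytes per day.

```python
# Projecting daily data volume at 24% annual growth
# (assumes the 2.5 EB/day figure is a 2017 baseline and growth compounds annually)
daily_eb = 2.5
for year in range(2017, 2022):
    print(f"{year}: {daily_eb:.1f} EB/day")
    daily_eb *= 1.24
# By 2021, daily volume has roughly doubled to ~5.9 EB
```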
Combined, all of these recent and ongoing technology developments – cloud streaming, IoT, mobility – have changed how enterprises consume applications and, as a result, have also changed bandwidth demands and wide area network (WAN) traffic patterns. As such, enterprises face serious challenges related to scalability, security and network performance. Network traffic is unpredictable, and much of it flows from multiple sources dispersed throughout private and public cloud infrastructures as well as data centres.
Scalability limitations and security concerns are more pronounced for enterprises that use multiple vendors to run their networks. Like the reliability of the network itself, security policies and solutions vary from vendor to vendor. For instance, OTT service providers deliver security at the application layer because they don’t own the underlying network, so the data they handle can become more vulnerable when crossing network boundaries. That’s because elements of the underlying networks are managed by multiple service providers that don’t always communicate or collaborate with each other. In contrast, a provider that owns the underlying network infrastructure can design a secure network to meet enterprise needs.
Easing traffic congestion
To get the most out of their SDN investments, enterprises should use MPLS for critical applications and locations and simply supplement with broadband for less critical traffic. MPLS is designed with the built-in security and scalability that modern businesses demand. Network providers that own the underlying network can deliver strong protection against increasingly common types of cyber-attacks – DDoS (distributed denial of service), ransomware and zero-day threats.
Today’s enterprises also need smart networks that prioritise traffic based on the applications in use, both at the point of entry to and exit from the network. Intelligent networks prioritise each application and allocate the proper amount of bandwidth. For instance, the network distinguishes audio and video applications, which require higher priority, from casual internet browsing.
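A toy sketch of that kind of application-aware prioritisation follows; the application classes and priority values are illustrative, not any provider’s actual QoS scheme.

```python
# Toy application-aware prioritisation; classes and weights are illustrative only
PRIORITY = {
    "voice": 1,         # highest: latency-sensitive, low bandwidth
    "video": 2,         # high: jitter-sensitive, high bandwidth
    "business_app": 3,  # medium: transactional traffic
    "browsing": 4,      # lowest: best-effort
}

flows = [
    ("news site", "browsing"),
    ("VoIP call", "voice"),
    ("CRM sync", "business_app"),
    ("video conference", "video"),
]

# Dispatch flows in priority order, as an intelligent network edge might
for name, app_class in sorted(flows, key=lambda flow: PRIORITY[flow[1]]):
    print(f"priority {PRIORITY[app_class]}: {name}")
```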
This refined approach to traffic balancing isn’t available through public internet connections, but there are providers that offer private MPLS connections and monitor those connections around the clock to maintain performance, scalability and security.
Advanced security through MPLS
A further benefit of using MPLS is that it can help to deliver strong security through the design of the network. Through private connections, MPLS can be used to separate IP addresses from routers and hide the internal structure of the core network from the outside.
In addition, MPLS can be used to put in place additional controls customised to an organisation’s specific needs. These controls can typically support an organisation’s compliance with industry-specific regulations or standards such as HIPAA (Health Insurance Portability and Accountability Act) for healthcare and PCI DSS (Payment Card Industry Data Security Standard) for retailers and other businesses that process credit card information.
MPLS and SDN: working together
Without a doubt, SDN is changing how networks are managed, bringing increased flexibility and scalability to enterprises and allowing them to dial services up and down as required. However, SDN will not mean the end of MPLS; instead, SDN will rely on MPLS to increase security and manage traffic effectively. With this in mind, businesses that want to put in place the latest and greatest digital capabilities should adopt a network strategy that delivers the best of both worlds: SDN controls combined with MPLS capabilities.
This is a guest blog post by Brendan O’Rourke, head of design at BriteBill, an Amdocs company.
It’s no secret that today’s customers are more demanding than previous generations. They share their experiences online, they expect instant gratification 24-7, and the competition is just a mouse click away.
The communications and media sector hasn’t traditionally been one that impresses customers with great service, but just how badly does it fare compared with other industry sectors? The latest UK Customer Satisfaction Index, published in July 2018 by the Institute of Customer Service, has the answer. It found that satisfaction among UK consumers, across all verticals, rates at 77.9 out of 100. Telecoms scored 74.3, making it the second-lowest-scoring vertical – only the transport sector fared worse (72.5).
Unsurprisingly, this low level of customer satisfaction translates into high levels of customer churn. A new TM Forum Quick Insight Report, titled ‘Inspire loyalty with customer lifecycle management’ sponsored by BriteBill, found that postpaid churn currently ranges from 5% to 32% per year.
While it’s difficult to put an exact figure on the cost of churn, consider this: The average mobile operator in a mature market spends 15-20% of service revenues on acquisition and retention, compared with the average Capex spend on infrastructure (networks and IT) of just 15% of revenues.
Canada’s BCE and Telus revealed in 2017 that it cost them almost 50 times less to keep an existing mobile customer than to acquire a new one, with retention costs of CAD11.04 and CAD11.74 respectively, while the average subscriber acquisition cost weighed in at an eye-watering CAD521. In a saturated market, it would seem sensible for service providers to focus on keeping existing customers rather than trying to lure new ones away from competitors.
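The arithmetic behind that ‘almost 50 times’ figure is easy to check against the published numbers:

```python
# Retention vs acquisition, using the figures reported by BCE and Telus
acquisition_cad = 521.0
retention_cad = {"BCE": 11.04, "Telus": 11.74}

for operator, cost in retention_cad.items():
    ratio = acquisition_cad / cost
    print(f"{operator}: acquisition costs {ratio:.0f}x retention")
# BCE: ~47x, Telus: ~44x - hence "almost 50 times less"
```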
There is, of course, a direct link between customer experience and churn rates. It’s obvious that a positive customer experience aids in retention, but can it be quantified? According to the TM Forum report’s main author and senior analyst, Catherine Haslam, yes it can. Australian service provider Optus achieved a 1.4% reduction in churn amongst its retail post-pay customers by raising its net promoter score (NPS) by six points, while another large service provider saw a 3% decrease in churn following a 25-point boost in NPS.
So, how do you boost NPS?
“It’s so simple yet easy to forget: Nothing is more important than engaging with your customers in a proactive and positive way,” says Haslam. “All too often, communication between service providers and their customers is reduced to the monthly bill – hardly a positive experience for most – and occasional calls to customer care when there is a problem. Service providers are missing a trick by not using opportunities to interact in positive ways, throughout the customer lifecycle.”
Billing, for example, should be a retention tool rather than a churn agent. The truth is that customers find bills boring and difficult to understand. According to a study by the UK’s uSwitch (June 2018), one in six mobile users hasn’t even checked their bill in the last six months. When asked why, 18% (or 1.3 million mobile users) said they simply couldn’t be bothered.
Sadly, while service providers may have spent millions upgrading their IT systems to support the digital customer experience, they tend to overlook simple outputs such as the bill. As Haslam puts it, “the first bill is so important, but often it’s very different from a customer’s expectations.” Even if the charges shown on a communications bill are correct, they can be confusing. Factors such as device leases, proration (partial month billing), billing in advance for some services and in arrears for others, overages and vague descriptions all contribute to the complexity, leaving the user utterly befuddled.
It’s high time service providers took a fresh approach to bills – one that can pay measurable dividends. Cricket Wireless, for example, a subsidiary of AT&T, ran a campaign called ‘Let’s Look Inside Your Bucket’. This inventive campaign used a video-based approach that not only communicated information but also combined offers with a healthy dose of humour. The campaign was incredibly successful and led to a whopping 37% reduction in early customer churn.
For service providers looking to transform their bill, the TM Forum recommends the following: communicating billing information accurately, clearly and concisely, demonstrating value, and including new information – such as making customers aware of new products and services that are relevant to them. If a service provider can do this, and do it well, they can change their bill from a churn driver into a valuable retention tool. And quite possibly put an end to the era of boring bills.
BT CEO Gavin Patterson has faced negative PR all around since announcing a strategic realignment that will see 13,000 employees made redundant and the closure of the company’s central London HQ of over 100 years. Even the good things he has done in his five years at the top – such as pivoting the organisation towards full-fibre broadband at long last – weren’t enough to save him from jumping before he was pushed.
The news of Patterson’s resignation broke today, after it emerged earlier this week that angry BT shareholders were mobilising against him. It appears that in meetings with chairman Jan du Plessis, both men agreed that even though the strategy changes were the right move, Patterson was not the right man to oversee them.
With a long history at the business, Patterson seemed like the sort of safe pair of hands that befits monolithic organisations like BT. He was never a Bill Gates or Steve Jobs-style techno visionary, but that sort of leadership would have been out of place at BT.
On the handful of occasions I met him, he struck me as a generally likeable man, with a slightly roguish demeanour that reminded me a little of the sort of lads you find in IT sales organisations. If you transported him back to the 1980s and dropped him in the City of London, he’d fit right in, and would probably drive a white Porsche cabriolet.
With BT facing tough choices over the next few months, I would imagine the organisation will go for another safe pair of hands, someone who knows the business and is ready and able to steer it through the choppy waters ahead.
For me, this suggests BT will look within for its next leader, so it may be worth keeping an eye on some of the likely internal candidates. Who are they, then?
Twice in recent years BT called on the head of its Retail business to step up, and both Ian Livingston and Patterson answered the call. Retail is now BT Consumer, led by EE’s man Marc Allera, but I reckon he’s not steeped enough in the organisation’s culture yet.
The CEO of Global Services, Bas Burger, is probably right out, given that unit’s troubles. For my money, so is the head of Technology, Service and Operations, Howard Watson, who is more of a tech specialist. But maybe BT will turn to its Business and Public Sector organisation, led by Graham Sutherland, who oversaw a healthy sales bump and some tasty contract wins last year.
Another name in the hat could be Gerry McQuade, CEO of BT Wholesale and Ventures – a declining business unit as traditional voice revenues wither – but would running it give him the oomph needed to drive the wider organisation?
BT could even consider Openreach boss Clive Selley – a tricky proposition given that organisation’s quasi-independent status, and Selley is, again, a very technical man – but might he be able to bring a new perspective to the wider group? Based on my acquaintance with Selley, I have no doubt he would do his utmost to keep up the much-needed pressure to build next-generation networks.
And of course, BT could still surprise us all and tap a complete outsider. One thing is for sure, whoever steps up is going to have one hell of a job.
After yet another set of lacklustre earnings, and with over 10,000 staff facing the axe, embattled BT CEO Gavin Patterson needed a quick win. Today’s launch of new consumer offerings from both BT and EE, along with a plan to converge its broadband and 4G networks, seems, at face value, to give him that.
At an event in London today BT bet big on its Consumer unit – which is made up of its broadband retail business, EE, and ‘cheap-n-cheerful’ ISP Plusnet. It made over 20 announcements, of which a tie-up with Amazon Prime Video, more content for BT Sport, a managed smart home ecosystem, and the repatriation of its dreaded outsourced call centres to UK shores probably have the most kerb appeal for consumer service buyers.
I think a big part of this strategy is to convince consumers that BT is a natural home for them. Here, BT has a clear advantage as the incumbent (and, up until the ’80s, the state-run monopoly). It can draw on this to demonstrate that it’s a safe bet for the average user, being the only provider with the size, scale and money to bring together its converged network and customer services vision and wrap it up with lots of nice little perks, such as free 4G Wi-Fi routers if your broadband goes down, or access to Amazon exclusives such as The Grand Tour. No argument there.
But what struck me at today’s press conference was that BT made scant mention of full-fibre, or of Openreach’s pivot towards the so-called gold standard of broadband. But then, I wondered, why would it need to? The converged offering delivers a nice speed boost almost right out of the gate, in the form of a new router that bonds together a fixed and a mobile connection, and, to be scrupulously fair, the content streaming experience on a superfast connection is generally as good as on an ultrafast one.
Everything announced today was predicated on making the online experience easier for consumers, not on expanding access to full-fibre. Clever BT has decided that this sort of thing is what consumers want, and in many ways it’s got that right. If the tone of coverage in the mainstream press is anything to go by, this strategy will work out well for it and I expect its customer acquisitions will duly spike a little in the next few months.
Ask yourself this: how can rapidly expanding full-fibre suppliers such as CityFibre compete with that? Sure, you can pay an altnet for an ultrafast connection and it’ll be great, no question, but after that you’re on your own. And let’s be frank here: nobody else building pure full-fibre networks in this country has any real hope of being able to afford Premier League football rights, or of tying up with content producers like Amazon and Netflix.
Yes, in terms of competition, this is great for BT. But I can’t help but think that to some extent, BT is tinkering with easy fixes and consumer-pleasing add-ons when it ought to be pulling out all the stops on full-fibre. I think it’s in danger of falling back into bad habits and leading on broadband that is, well, just good enough, and I don’t want to see that.
As we’ve been saying here for years, good enough broadband isn’t good enough for Britain’s digital future, and we have a responsibility not to let BT take the easy way out and make good enough out to be desirable. Today’s announcements are good news, but they also show how important it is to keep holding BT’s feet to the fire, and to keep talking about full-fibre as a priority.
Are Britain’s internet service providers (ISPs) coming up short when it comes to helping their less tech-savvy users protect themselves against the scourge of telephone scams and online fraud?
In a shocking breach of Betteridge’s Law of Headlines, the answer is actually yes. But keep reading anyway.
Now we’ve cleared that up, some explanation: I pose the question because TalkTalk, which runs its own anti-fraud campaign called Beat the Scammers, has just published a set of stats, collated in partnership with Action Fraud, shedding some light on the extent of the problem.
TalkTalk’s data show that in the two-year period between October 2015 and September 2017, the five most common online scams in the UK hit over 130,000 people – and those are just the cases that were reported to the police.
Online shopping and auction fraud, where products are misrepresented or never arrive, while the merchants vanish without a trace, was the most prevalent type, with 66,874 cases reported during the monitored period.
Computer service fraud – calls from bogus tech support teams – hit 45,713 people, while email and social media hacks hit 9,473, personal computer hacks, often carried out through phishing emails, hit 6,004, and extortion, where personal data is effectively held to ransom, affected 1,850.
People in London were the most frequently targeted marks for online fraudsters, with the Met police reporting over 20,000 cases – way ahead of their colleagues in West Mercia (Herefordshire, Shropshire and Worcestershire), who logged 9,043 reports.
Meanwhile, the people of Essex and West Yorkshire emerged as the least easily fooled, with only 3,956 and 3,894 cases respectively being reported in those jurisdictions – although a cynic (and I am one) might point out that because unreported cases obviously weren’t taken into account, the good folk of Basildon and Bradford might just be too proud to admit to being scammed.
Donna Moore, who, happily, is TalkTalk’s head of scam prevention, believes it is the ISP’s responsibility to take on the role of educating users – which is just as well, otherwise I wouldn’t fancy her chances in her next appraisal.
“We launched our Beat the Scammers education and awareness campaign in 2016 and have continuously improved our service, encouraging our customers to activate our protection tools, completely free of charge,” she said.
“Such tools include CallSafe, which provides customers with a simple way to avoid unwanted calls and enhance their call security. Furthermore, we’re proactively blocking over 700 million unwanted calls a year, and we continue to safeguard customers with the TalkTalk Nevers – a set of guidelines outlining information we will never ask customers for.”
But of course, every good ISP story has to have ISPs throwing shade at other ISPs, so TalkTalk offered some helpful (to TalkTalk) comparisons. Its own CallSafe service includes a number of features that rivals BT and Sky are missing, including unlimited number blocking and whitelisting, feature activation through handsets, and automatic addition of frequently called numbers to an approved list. BT also lacks a screening service and options to accept or reject callers, while Virgin Media, claimed TalkTalk, offers no call blocking features at all.
This is a guest post co-authored by Zach Katsof, director of intelligent communications at Arkadin; Holger Reisinger, SVP of large enterprise solutions at Jabra; and Alan Shen, VP of consulting services at Unify Square.
Artificial intelligence (AI) and machine learning (ML) are evergreen buzzwords. Even within the unified communications ecosystem, AI and ML are popping up more and more frequently, from Cortana voice assistants in Teams and information overload reduction technology in Slack, to call quality troubleshooting algorithms in UC monitoring software.
When it comes to the unified communications and collaboration market, the potential for AI applications across enterprise messaging, presence technology, online meetings, team collaboration, smart headsets and room systems, telephony and video conferencing is endless. But this begs the question: within the UC ecosystem should we think of AI as still very experimental or as having already crossed the chasm? And, if the latter, which of the AI applications and solutions are over-hyped and what’s the real deal?
The AI potential in UC extends both forwards into the realm of the end user and backwards into the domain of IT. For the end user, AI can automate a series of actions to improve human-to-human collaboration. AI can sort through data (emails, chats, speech recognition) and identify keywords and patterns to then provide feedback on the best way to communicate based on the audience and topic. The more indexable user data becomes, the greater the ability to compare it with keywords from chats and create automated responses to instant messages based on user communication patterns.
AI can also navigate through data and categorise whether people are using their time efficiently and productively. For example, by logging the meetings that take place in a company, AI can determine how many of those meetings had agendas, who the participants were, what the minutes included and how much time was spent on each topic. In a similar way, setting up a meeting using AI allows for better resource management. It can evaluate who is attending and recommend the best possible meeting space based on the number of people, the name or topic of the meeting and the tools that might be needed. Additionally, it can determine whether the participants are in the same office or require a Skype for Business dial-in, and what hardware is needed – like a speakerphone or whiteboard – based on whether it’s a brainstorming session or a catch-up meeting.
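A minimal sketch of the kind of rule-based recommendation such an assistant might start from is shown below; the room data, field names and selection rules are all hypothetical.

```python
# Hypothetical meeting-room recommender; rooms, fields and rules invented for illustration
ROOMS = [
    {"name": "Huddle 1",  "capacity": 4,  "whiteboard": True,  "video": False},
    {"name": "Boardroom", "capacity": 14, "whiteboard": True,  "video": True},
    {"name": "Focus Pod", "capacity": 2,  "whiteboard": False, "video": True},
]

def recommend(attendees: int, needs_whiteboard: bool, has_remote: bool) -> str:
    """Return the smallest suitable room, freeing larger spaces for bigger meetings."""
    candidates = [
        room for room in ROOMS
        if room["capacity"] >= attendees
        and (room["whiteboard"] or not needs_whiteboard)
        and (room["video"] or not has_remote)
    ]
    if not candidates:
        return "no suitable room: suggest dial-in only"
    return min(candidates, key=lambda room: room["capacity"])["name"]

# A six-person brainstorm with remote attendees needs space, a whiteboard and video
print(recommend(attendees=6, needs_whiteboard=True, has_remote=True))  # Boardroom
```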
On the IT side, AI can analyse the vast amounts of data and UC logs that are available for use in troubleshooting and specific problem solving. Instead of IT having to be reactive in its response to either individual user or systemic UC issues, AI allows for extrapolated insights regarding how the individual, team or company is performing. Using this learning, AI can then issue proactive guidance to IT on everything from changes to server configurations to recommendations for a new or different UC headset for a specific end user.
AI in action
AI is regularly applied to enterprise communications to increase efficiency and reduce the time humans spend on tedious work that a machine could take care of instead. The present and future of AI in UC, along with a rating of hype versus reality, can be seen in the following areas:
- AI-based gesture recognition on devices: On a conference call, gestures can improve the conferencing experience. For example, on a video conference the system can measure expressions. Cameras can provide details regarding the body language of the participants and provide real-time feedback to improve presentation skills and/or responses. Rating: Early stage.
- Completely automated conference calls: There is ML in transcription, but it is not yet advanced enough for voice transcription to match human comprehension – although speech recognition has come a long way in the past 10 years, and nowadays Amazon’s Alexa can understand everyday speech remarkably well. Rating: Early stage.
- Meeting Management and Follow-Up: AI-enabled devices learn who is speaking, identify key points and then automatically assist people with tasks and send notifications or meeting summaries to all attendees. Rating: Early stage.
- Conference rooms and room systems management: AI-systems drive the entire process of scheduling and setting up meetings. Rating: Nascent stage.
- Smart devices: Meetings are made more efficient and productive by augmenting the conversation with information/insights that currently take hours or days of additional post-meeting work to realise. Rating: Early stage.
- UC systems/platforms (e.g. Cisco, Skype for Business, etc.): IT departments can monitor entire UC systems as well as room systems and identify, for example, when a specific audio/video system in a specific conference room may require a maintenance check-in by IT to reduce possibility of down-time before users are impacted. Rating: Early stage (mature via third party apps).
- Web-chat systems/platforms (e.g. Slack, Teams, etc.): Based on ML from all conversations on the platform, web-chat systems can think in real-time and adjust the questions/suggestions to cater to a specific situation based on prior history/database of similar discussions. Rating: Nascent stage.
- End-user productivity enhancing bots: Personal assistants built into UC apps can simplify actions (e.g. search for information in real-time), and interactive bots can improve customer service interactions (e.g. IVRs driven by bots). Rating: Early stage.
AI risks and considerations
There are risks to consider with AI when analysing data input and output. If we reach a level where software is actually taking action – either self-healing the UC systems or self-scheduling new meetings – we open the door to the software potentially taking the wrong action. Once the algorithm is able to come up with a better conclusion than a human could, it issues a recommendation. The model behind ML algorithms can be so complex that, if the user or IT wants to ask “why”, there may not always be an answer. There are third, fourth and fifth-level elements in this massive, complex algorithm, with a huge data model and structure. The human element must arrive at a decision point: either simply trust the output (because they believe the outcome will be better), or remain in constant oversight mode.
This is perhaps the ongoing AI dilemma – can an algorithm spit out a decision that would force IT or the end-user to think the machine is doing a better job at managing UC than the human? AI won’t replace our need to think and react. No matter how good a ML platform or AI solution is, people will always need to exercise judgment and validate actions prior to proceeding. The more AI is integrated into UC, the more dependent users will become on it. This could lead to an increased expectation of “perfect” meetings, chats and calls before, during and after the event. If AI systems are not able to keep up or deliver as expected, there will be limited patience and tolerance for poor performance and users will likely stop using it.
As for hype versus reality, the irrefutable point is that we have not hit the peak of AI – we’ve only begun to scratch the surface. At the current stage, we can still make AI work for us by improving efficiencies, and, as it becomes more complex and developed, hope for an automated state of perfection. In short, AI is the real deal, but there are still several miles to cover before we cross the chasm to peak performance.
I was hugely pleased today to see that despite the possibility of another legal roadblock in the Court of Appeal, telecoms regulator Ofcom is going to move ahead and lay the groundwork for the long-delayed auction of two massive slices of radio spectrum, one to support enhanced 4G mobile networks, and the other to form the basis for future 5G mobile networks.
But I also detected a definite note of frustration in Ofcom’s statement, which said “the litigation by Three is continuing to delay access to the spectrum and the benefits to consumers and businesses that can flow from it.”
And to be perfectly honest, I can’t say I blame the regulator for being a tad annoyed. Actually, I don’t think the regulator’s statement goes far enough.
Three is challenging the auction process because it believes that the way spectrum holdings in the UK are structured is unfair to smaller operators, that the combined BT-EE entity owns too much spectrum, that its holdings should be capped, and that its ability to bid in the upcoming auction should be restricted.
And I don’t argue with any of this. Yes, it is self-evidently correct that the way spectrum holdings in the UK are structured is unfair on smaller operators such as Three, but it is also true that Three has been able to buy up two major slices of spectrum in the past three years through its acquisition of UK Broadband in early 2017, and a 2015 deal with Qualcomm.
But I now have to ask myself: what is more important? That everything is perfectly fair? Or that the UK is able to compete on the global stage?
The spectrum that Ofcom proposes to sell off could have been in use nearly two years ago. Data use on 4G networks shows no signs of slowing. And the first 5G networks will probably be rolled out in this country two years from now.
The UK needs this spectrum in use as soon as possible, and I find Three’s attitude increasingly at odds with the pressing national need to both grow and exploit the potential of our digital economy. We must have more network capacity!
It’s time for Three either to get over itself, or get its rich parent – which made £1.47bn in profit in the first six months of 2017 – to put its hand in its pocket and help keep its UK operation competitive.