BT’s new chief executive, Philip Jansen, is to revise Openreach’s targets for its full-fibre broadband network roll-out upwards when he reveals BT’s full-year financial results on Thursday 9 May, according to reports.
According to the Financial Times, which first reported the story citing sources familiar with the plans, Jansen may increase Openreach’s target for homes passed with a full-fibre – also known as fibre-to-the-premises or FTTP – connection by 500,000 to 3.5 million properties by 2020, and by five million to 15 million by 2025.
The original targets were first set by Openreach CEO Clive Selley in February 2018, when he launched the firm’s Fibre First network build programme, which is now active in almost 40 large towns and cities in the UK, passing 14,000 homes every week.
If true, the revised plans are significant because while Openreach is essentially functionally independent, with its own leadership and board, it is still joined at the hip to BT, which has final sign-off on the network builder’s annual budget.
This means Jansen may have to move cash from elsewhere in the BT Group to fund the additional works, and, ever with an eye for the shareholder angle, the FT reckons this is stoking concerns that the Group’s dividend to its investors will be cut.
That said, if investors can be, or have been, convinced that the FTTP roll-out is going to be some kind of golden egg-laying goose for the BT Group, that might not be a problem.
For Openreach to increase its targets could also be very good news for the overall regulatory environment surrounding the national full-fibre roll-out, as pointed out by ISP Review, which hinted that some of Openreach’s conditions for achieving its goals – such as easier access to land for works, changes to wholesale and business rates, and agreement with rival network builders on how to encourage mass consumer migration to FTTP – are being met.
The Full Spectrum approached Openreach for comment but was politely rebuffed. Never mind, we’ll find out tomorrow.
Many people using the London Underground today will likely have noticed that the usual Virgin Media-run Wi-Fi service is not working. Some might even have been mildly inconvenienced by not being able to access the internet on the platform like normal.
The network is down not thanks to some technical snafu, but because of the ongoing protests in London being conducted by Extinction Rebellion – a group of climate change activists who believe that direct action and non-violent acts of civil disobedience are now morally necessary in order to force governments to take action.
At the time of writing, the group was targeting London’s public transport network, and activists have glued themselves to Docklands Light Railway (DLR) rolling stock at Canary Wharf.
Because of this, the British Transport Police (BTP) took the decision to instruct Transport for London (TfL) and Virgin Media to turn off the Wi-Fi service in an attempt to disrupt the climate change protests by making it harder for activists to communicate and organise.
In the interests of safety?
While the BTP has attempted to portray the shutdown as being undertaken in the “interests of safety and to prevent and deter serious disruption” the underlying truth of the matter is that its actions curtail freedom to access the internet, freedom to express oneself using the internet, and freedom to organise using the internet.
Whether or not you agree with the aims of Extinction Rebellion – and causing disruption to environmentally-friendly mass transit systems is arguably not a positive step – the BTP has overreached itself.
It is therefore hard to escape the logical conclusion that the UK authorities are acting in a manner more befitting of an authoritarian regime.
Remember that Egypt famously cut off its internet connections in 2011 at the height of the Arab Spring protests. More recently, the likes of Venezuela and Sudan have taken similar steps at times of national crisis, restricting the ability of their citizens to communicate freely. In other countries, like China, the government prevents all access to western social media platforms (except from within a few luxury hotels), and virtually all social media activity is directed through government-approved and monitored platforms such as the nearly-ubiquitous WeChat.
A dangerous precedent
Obviously the shutdown of a single wireless network in limited locations does not amount to a concerted attempt to stifle freedom of expression for everybody in the UK, and it would be hyperbolic to suggest it does – otherwise we could not post this article.
However, in a time of increased political turmoil and social discord it sets an extremely dangerous precedent for a country that prides itself on fundamental freedoms to allow law enforcement agencies to act in this manner. The optics are, quite frankly, terrible.
All over the world, freedom of access to and expression on the internet is under growing threat. Freedom House’s 2018 Freedom on the Net report recorded an eighth consecutive year of declines in global internet freedom. It said 17 governments around the world approved or proposed laws restricting online media in the name of fighting fake news and online manipulation, and 18 governments increased surveillance, often eschewing independent oversight and weakening encryption in the process.
Reporters Without Borders already lists the British government as an enemy of the internet alongside some of the world’s most oppressive regimes, such as Belarus, China, Iran, North Korea, Russia, Saudi Arabia and the UAE.
Coming on the same day that the Department for Digital, Culture, Media and Sport (DCMS) confirmed the delayed and, to all intents and purposes, largely unenforceable UK porn block will come into effect on 15 July 2019 – in an email openly copied to every technology journalist in the country (a GDPR breach) – this will make many who argue for freedom of access and expression online very nervous indeed.
The British government is walking a dangerous path when it comes to online freedoms, and we all have a responsibility to challenge it, or risk our liberties being slowly but surely eroded.
The definition of insanity is doing the same thing over and over again, but expecting different results.
It’s a handy quote for all occasions – whether or not you believe Albert Einstein ever really said it (spoiler: he did not). Just take our dearly beloved prime minister, Theresa May, and her repeated attempts to get her increasingly meaningless meaningful vote through Parliament.
So I was interested to see that my old friend A Source Familiar With The Situation has been on the blower to Reuters again, this time with a choice bit of goss about unified comms firm Avaya. Apparently it is giving serious consideration to an approach from an unnamed private equity firm that values it at $5bn.
Now, maybe I’ve been doing this for far too long, but I have a vague sense of déjà vu here.
But this time it’s not just a funny feeling, or a glitch in the Matrix – it definitely happened.
The $8.3bn acquisition of Avaya in 2007 by private equity backers made waves at the time, but to put it charitably, Silver Lake and TPG did not run Avaya particularly well.
The firm was saddled with billions of dollars’ worth of debt and was allowed to massively over-reach itself through the ill-advised purchase of defunct Nortel’s core networking hardware business (although I have to confess I clearly didn’t think so at the time). Ultimately, it all proved a bit too much and Avaya was forced to declare bankruptcy in 2017.
Of course, there’s no certainty a deal will happen, but it’s not controversial by any means to observe that private equity companies do seem to be all too frequent common denominators in bankruptcies the world over (even Bloomberg, hardly a bastion of Marxist thought, is prepared to publish pieces hinting at this).
The entire private equity business model hinges on acquiring companies, generating massive efficiencies and streamlining operations (job cuts!), increasing margins and then selling them on. But for this to be successful you need market stability and that is something that has been in short supply for some time.
The last time Avaya was acquired in 2007, the US subprime mortgage crisis (that ultimately developed into a full-blown crash) was just beginning to bite, and the resulting recession was so severe that it is perfectly possible to argue we have still not quite recovered from it. Economic booms and economic busts go in cycles, and the next recession is now widely considered overdue.
Coming just 15 months after a revitalised Avaya emerged from bankruptcy protection as the direct result of its mismanagement in private hands, entertaining thoughts of inviting private equity back in… Well, if that ain’t the definition of doing the same thing again but expecting different results…
With eight days until the UK’s scheduled exit from the European Union, a prime minister who has lost control, a paralysed political system, and Britain reduced to a laughing stock on the world stage as it limps into an entirely avoidable humiliation, the country feels like it’s on the edge of total breakdown.
Indeed, over the past 18 hours, over 750,000 angry Remainers – the sort of people who unaccountably insist on indulgences such as supermarkets with things to buy in them and access to life-saving drugs – have signed a last-ditch petition on Parliament’s website to revoke Article 50 and forget about the whole thing.
Unfortunately for them, the unintentional side effect of so many people logging onto the same website at once is an abnormal spike in traffic to the government’s site, with the result that the servers that host it are repeatedly crashing and visitors are being served a 502 Bad Gateway error (which means one server has received an invalid response from another one upstream, i.e. it’s offline). In effect, an accidental distributed denial-of-service (DDoS) attack is taking place.
Our sister site WhatIs defines a DDoS attack as one in which multiple compromised computer systems attack a target, such as a website, and cause a denial of service for its users by overwhelming it with incoming messages, connection requests or malformed packets, causing it to slow down or crash.
DDoS attacks are more usually associated with bad actors such as state-sponsored hackers and cybercriminals – the massive Mirai IoT botnet attack in late 2016 was a DDoS attack – but it isn’t unheard of for sudden spikes in legitimate traffic to have a similar effect on websites.
Ruby on Rails expert Andrew White, CTO of Unboxed (the firm behind the petitions website), explained on Twitter that the site had gone down because calculating the trending count placed too great a load on the database – an AWS-hosted Relational Database Service (RDS) instance.
There is one way you can help: if you are signing the petition today, please consider others, resist the temptation to sit there refreshing the page, and trust that GDS will have things back up and running in good time.
All the world loves robots, and when they are robots holding something a human might hold, doing an activity a human might do, and preferably doing it while looking a bit like a human (two legs, arms, a smiley cartoon face, etc) the world loves them even more.
And at Mobile World Congress in Barcelona last week, it felt like you couldn’t move without bumping into a robot doing something a human can do. There were dancing robots, drumming and piano-playing robots, packing robots and even a barista robot.
What’s 5G got to do with it?
The idea behind all these robotic demos is that the advent of 5G mobile networks in the not-too-distant future will really kick-start the robotics revolution.
Operators promise 5G will bring vastly increased speeds, vastly reduced latency, and vastly improved possibilities around edge computing. This will create a perfect storm for robots, because it enables them to accomplish more tasks, faster, and with greater precision than was previously possible.
That’s the claim, anyway. The reality seems to be a little more clunky – let’s just say the dancing robot posed absolutely no threat to Anton du Beke, the packing robot made more mistakes than a tired Amazon warehouse worker who hasn’t been to the loo for eight hours, and the drumming robot … well … it might have been the best drummer in the Beatles.
Coffee to go
But one robot in particular caught the eyes of the thousands and thousands of Congress-goers feeling the strain the morning after an intense ‘networking’ session.
Using a smartphone application, users place their order in advance with Beat, KT’s robot barista, then watch as it swings into action, whirling around, grinding, pouring, frothing and icing. When your coffee is ready, you simply approach the booth, type in a unique PIN, and Beat delivers your drink through a self-service hatch.
Backed by KT’s 5G base stations, Beat’s human controllers monitor the robot in as near to real-time as makes no difference, checking in to see if something needs repairing, or if the bean supply is running low.
Unlike human baristas, who generally do not connect themselves to networks, edge-based compute power means Beat can prepare three coffees at once, up to 90 per hour, and if left idle with nobody looking at it, it even becomes its own marketing department, waving its arm to attract attention and flashing happy emojis at passers-by. Beat that, underpaid service industry workers!
Humans under threat?
So do robots have a future in the high-stakes world of coffee preparation? I’m not so sure – for while my iced Americano was a much-needed treat on a busy afternoon, Beat completely forgot to write my name on the side of the cup in black marker pen, and that’s not easily forgivable. Baristas of the world, I think your jobs are safe for now.
This is Alice Scotts Town, reporting from Barcelona, for Computer Weekly…
This year, my flight from London’s Stansted Airport to Barcelona for Mobile World Congress, plus two days of hour-and-a-half commutes through horrendous traffic between my hotel and the conference venue on a half-empty fume-belching diesel bus, generated at least 0.24 tonnes of carbon dioxide.
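For the curious, a figure like that can be reproduced with some back-of-envelope arithmetic. The emission factors and distances below are rough illustrative assumptions (real factors vary by aircraft, load factor and methodology), not audited numbers:

```python
# Back-of-envelope check on the "at least 0.24 tonnes" figure above.
# All factors and distances are assumed for illustration only.

FLIGHT_KG_PER_PKM = 0.10   # short-haul economy, kg CO2 per passenger-km (assumed)
BUS_KG_PER_KM = 0.03       # diesel coach share, kg CO2 per passenger-km (assumed)

flight_km = 2 * 1_150      # Stansted <-> Barcelona, round trip (approx.)
bus_km = 2 * 2 * 30        # two days, two ~30 km hotel-venue legs per day

total_kg = flight_km * FLIGHT_KG_PER_PKM + bus_km * BUS_KG_PER_KM
print(round(total_kg / 1_000, 2), "tonnes CO2")  # ~0.23 with these factors
```

Note how thoroughly the flight dominates: the bus legs barely register next to the round trip in the air.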
But it was probably more than that. In order for me to spend time in Barcelona this week, a vast, continent-spanning logistics operation was pressed into service, seamlessly and efficiently, but at such environmental cost! Technology companies spent hundreds of thousands of dollars on building materials for stands – some as big as a decent-sized family home – that otherwise would not have existed and will be torn down at the end of the week.
Then there’s my paper metro ticket, my receipts for lunch, the countless cardboard coffee cups, the plastics in my badge, the unwanted extra fabric lanyard (handed to me by Ericsson when I walked onto their stand covered in Huawei branding), the free pens and little presents tech vendors like to foist on the press corps, all consumed resources that did not need to be consumed.
Oh, I ate a lot of tapas, and I do love pa amb tomàquet – the Catalan speciality of bread smeared with tomatoes and often served with ham – so several pigs died for me, too. They were delicious.
I am a technology reporter at Mobile World Congress, and I am also an environmental villain.
Of course, you can always make the argument that that plastic would have been used and consumed anyway, those pigs would still have died, Ryanair flight FR9810 would still have taken off without me, and my hosts would still have rented a half-empty coach for 12 journalists, complete with a driver with an apparently magnetic attraction to traffic jams.
So why am I bothered?
A carbon neutral MWC?
The GSMA – the mobile trade body that runs and organises the show – makes a virtue out of Mobile World Congress being a sustainable, carbon neutral event. It enables attendees to offset their carbon emissions by donating to green projects. It also provides a number of recommendations to enable attendees to minimise their environmental footprint, and it can boast a number of laudable achievements. The GSMA is taking sensible, rational steps, and I don’t want to suggest for a moment that they are not trying very hard.
But the GSMA’s actions don’t go far enough. There is no compulsion for attendees to offset their carbon emissions (and carbon offsetting is, in my opinion, pointless – the carbon was still emitted), it is not a requirement to take public transport to the venue, you can still print out your schedules rather than download the app, you don’t have to choose locally grown or sourced food options, and unless you’re coming from elsewhere in Spain, or possibly parts of France, you’re still going to fly.
The GSMA claims that Mobile World Congress was carbon neutral in 2015 and 2016. But was it? Really? When every variable is accounted for, I’d say that was highly, highly improbable.
What’s the point?
So I’d like us to at least begin to discuss the possibility that Mobile World Congress simply isn’t necessary – and that, as the increasing speed, capacity and capabilities of our networks prove, travel for any reason other than personal growth and enrichment (that is to say, business travel) is becoming less necessary too.
Okay, you’re right, Mobile World Congress is a fun couple of days. We meet, greet, catch up, learn, drink into the small hours…. These are good things (mostly) and their loss will be keenly felt. But nobody said the transition to a post-carbon economy would be without pain and loss. Nobody said that saving our home from catastrophic climate meltdown and ecological breakdown would not require radical actions.
And with temperatures back home in the UK hitting 20°C in February, ice melt speeding up by every measure, and the biosphere quite literally dying around us, I’d argue the time to take radical actions is now.
Which side are you on?
When you’re weighing up seven billion human lives and the preservation of a habitable planet for future generations, radical choices don’t seem so radical. The solutions to climate meltdown are right in front of us – we just need the popular will and the political backing to implement them. Would you rather your descendants lived in the utopian Star Trek future, or the barbaric Mad Max future?
So I’d like to ask you to ask yourself a few things. Why do we need Mobile World Congress? What social good does it serve? What ecological benefit does it bring? The GSMA talks so much about the power of social good and sustainability, and how mobile will solve these problems.
And yet, and yet, every February 100,000 people descend on Barcelona, snarling up the streets, choking the azure Mediterranean sky, and consuming, consuming, consuming.
As the UK’s full-fibre broadband roll-out gathers pace through 2018 and into 2019, many people continue to believe gigabit services are unnecessary overkill. CityFibre’s Caroline Hughes reckons they’re mistaken.
As very young children, my brother and I worshipped our ‘now vintage’ Dragon 32 computer. It was no Sinclair ZX Spectrum or BBC Micro, but I’ve never forgotten the fun and enlightenment brought to my childhood by that space-age typewriter!
It came with a few cassette-style games that loaded slowly, loudly and often fifth time lucky, and the only connectivity it had was to the national grid via a power socket. Yet, my brother and I waited patiently, played happily and even spent hours typing in page upon page of computer-magazine-published code to make it draw the simplest of pictures.
In time, our beloved Dragon 32 was superseded in our home by an even more revered Amstrad CPC464. And since those days, countless computers and devices have come and gone in our lives – each more powerful, transformational and immersive than the last.
Little did I know back then that I would spend my career in the heart of the telecommunications industry, or that almost 40 years later I would be sat here reflecting on how far our use of computing technology has come and how dependent we all are on the speed and quality of the connectivity it now demands.
There was also no way I could ever have imagined that by now I would’ve worked for a company like CityFibre for almost three years, helping to realise its founders’ mission to roll out whole-city full-fibre broadband networks across the UK and celebrating the gigabit-speed services that are now setting new connectivity standards for homes and businesses across Britain.
Anyone who takes time to reflect on how far our industry has come in the last 10 years – let alone since the early home computing days of the 1980s – would be foolish to underestimate where the next 10 to 40 years might catapult us. Yet as CityFibre continues to march ahead, executing its plans to deliver full fibre connections to over 5 million homes by 2025, I see some astonishing comments, from some unexpected naysayers!
In most cases, the announcement that our fibre-to-the-premises (FTTP) services are coming to an area elicits a widespread, heartfelt welcome, huge sighs of relief, or both! But mixed among the overwhelmingly positive reactions are occasional comments that “homes simply don’t ‘need’ gigabit speeds” and that what we are doing is “overkill” or “just for publicity”. And while the average person can be forgiven for this, shockingly, these comments usually come from individuals working within the telecoms industry!
Surveys, like the one we carried out among UK gamers last year, reinforce the short-sightedness of such comments by putting a magnifying glass on the impact that poor connectivity has on this sub-segment of the consumer market. Among other findings, the survey data highlighted that, at today’s average UK connection speeds, downloading recent game releases such as Call of Duty: Black Ops 4 actually takes longer than it would to fly from the UK to the game developer’s headquarters in California!
But my main sympathy doesn’t actually sit with the poor soul waiting over 12 hours for their download. It sits firmly with the tortured mother, brother or flatmate who’s trying to do what is now relatively ‘normal stuff’ online at the same time, i.e. working from home, video chatting with family overseas or streaming a movie.
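To put rough numbers on that flight comparison, here is a quick sketch. The game size and connection speeds are illustrative assumptions (a ~100GB download and 2018-era UK average speeds), not the survey’s exact inputs:

```python
# Rough download-time arithmetic behind the "over 12 hours" claim.
# Inputs are illustrative assumptions, not figures from the survey itself.

def download_hours(size_gb: float, speed_mbps: float) -> float:
    """Hours to download size_gb gigabytes at speed_mbps megabits per second."""
    size_megabits = size_gb * 8_000  # 1 GB = 8,000 megabits (decimal units)
    return size_megabits / speed_mbps / 3_600

print(round(download_hours(100, 18.5), 1))  # ~12.0 h at an assumed UK average
print(round(download_hours(100, 900), 2))   # ~0.25 h on a real-world gigabit line
```

With those assumptions the gap is roughly fifty-fold: half a day on an average line versus a coffee break on full fibre.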
Need or want? Should it make a difference?
What fascinates me most about the survey though, is how it highlights that not all people who live in UK homes define what they ‘need’ from connectivity in the same way. In many spheres of our lives, what we ‘need’, what we ‘want’ and what ‘puts a great big grin on our face’ are often very different things. I never ‘needed’ that Dragon 32, but it inspired me, it gave me skills and insight, and it definitely made me smile! How we relax in the evening or spend our precious downtime on the weekend is personal to each of us, and, as time progresses, connectivity is unarguably evolving and enriching many of these experiences.
Connected devices are already rife within our homes and even if we can’t predict exactly what our personal digital environments will look like in 20 years’ time, past experience tells us that technology won’t stand still. And, without gigabit-capable fibre connections ready to underpin those inevitable advances, woe betide any ‘regular’ user! Let alone one who finds themselves living with someone whose spare time is spent immersing themselves in the latest gaming experience.
Well-connected homes enable so much more…
Of course, it’s not just about being able to support the entertainment, communication and technology desires of home owners. Gigabit speed networks also have a vital role to play in supporting more basic human needs. Two such examples are digital health and social care and remote learning. Both are areas that local governments are now targeting as part of their smart city and digital agendas. As the technologies in these areas advance, the services deployed will become wholly dependent on ubiquitous high-speed fixed and wireless connectivity. And any remaining digital divide at a community level will hinder the efficiencies and cost savings that are driving their inception.
Investment in ubiquitous gigabit-capable networks also radically simplifies and reduces the cost of connecting the small cells that will need to be delivered at density across whole cities in support of enhanced 4G mobile, 5G and smart city IoT. And with that, it comes full circle back to the individual again! Consider the enhanced mobile data use demands of you, me and countless others sat in our self-driving cars as we use the valuable downtime to catch up on work, communicate with family and even stream entertainment and news.
The motivation for broadband is simple
Our industry has a duty to serve all – from the smallest child with their first tablet device to the early adopters of tomorrow. If, as part of this, we fail to look to the future, then we don’t deserve that responsibility.
Committed gamers, the ever-increasing number of connectivity-dependent homeworkers, ‘garage innovators’ and home-based business owners are just a few examples at one end of the consumer spectrum. Yes, they are sub-segments of a large and diverse market sector. But, for as long as I’ve been in this industry, these segments have shone a light on the future, highlighting connectivity requirements that, over time, have come to be considered a human right by almost everyone else.
If we cut back to where we are today, it’s clear that investing in and building a digital infrastructure capable of serving everyone and everything in an area with gigabit speeds, is neither wasteful nor something to be questioned. We are way beyond that! It’s common sense, enabling for the future and truly exciting, especially to anyone who knows what a full-fibre connection is theoretically capable of delivering – right here, right now!
Our duty is to deliver choice
Comparatively speaking, low- and mid-speed services will always be available! For those who only need to do tomorrow’s equivalent of making phone calls, downloading e-books, paying bills or sending emails, service providers will still be able to provide entry level services over a modern high-speed network. But for many, the desire for those higher speeds is already here and for a large percentage of the rest, technology will drive them there in the blink of an eye.
Even if technology evolves slower than we anticipate, it’s no excuse not to ensure we have the infrastructure and services ready for the future. As an industry our job is to predict, prepare for and embrace the future, feed the hunger and leave no one behind.
Caroline Hughes is head of marketing – portfolio and engagement at CityFibre.
The government recently announced plans that – despite facing delays and setbacks – a new Emergency Services Network (ESN) will be phased in from the beginning of 2019. It will replace the legacy Airwave infrastructure, a terrestrial trunked radio (Tetra) network.
Now, Tetra and similar professional mobile radio (PMR) networks are a common approach to public safety comms the world over, delivering reliable voice services and basic messaging, and providing effective coverage over great distances. Tetra can carry data traffic, but is not appropriate for large data packet transmission, as it supports only narrowband connectivity and kilobit throughput rates.
ESN, on the other hand, can provide multi-megabit-per-second data rates and multimedia capabilities as well as traditional voice and messaging services. Its completion (date TBC) will make the UK the first country in the world to deliver critical voice and data for emergency services over a 4G network. Police, fire and rescue, and ambulance services will be able to take advantage of these services from early next year, with voice capability available at a later stage.
This will also be of a far higher quality than the legacy system, as LTE can change its modulation and adapt it to the signal link quality, ensuring that even in poor link conditions, the network will continue to provide a voice or low-rate data connection. Emergency service personnel will be equipped with a multi-feature handset – the result of a £210 million government contract with Samsung – which will also include a ‘push-to-talk’ function, effectively turning the handsets into enhanced radios.
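As a rough illustration of that adaptive behaviour, the sketch below maps signal quality to a modulation scheme. The thresholds and scheme names are simplified assumptions, not the actual 3GPP tables, which map a reported channel quality indicator (CQI) to a full modulation and coding scheme:

```python
# Simplified sketch of adaptive modulation: as the link degrades, the
# network steps down to more robust (lower-throughput) schemes rather
# than dropping the connection. Thresholds here are illustrative only.

def pick_modulation(snr_db: float) -> str:
    """Pick a (simplified) modulation scheme for a given signal-to-noise ratio."""
    if snr_db >= 20:
        return "64QAM"   # high throughput, needs a clean link
    if snr_db >= 10:
        return "16QAM"   # middle ground
    if snr_db >= 0:
        return "QPSK"    # robust: keeps voice and low-rate data alive
    return "no link"

print(pick_modulation(25))  # 64QAM
print(pick_modulation(2))   # QPSK
```

The practical upshot is the graceful degradation described above: a handset at a cell edge loses throughput, not its voice call.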
The introduction of data services to the consumer and enterprise markets has been transformative – just think about how often you use WhatsApp, share images and videos, send voice notes and conference call over IP, for example.
In that light, it is hoped that digitalised communication and data services will also be a game-changer for the emergency services. Paramedics will be able to send videos and images to A&E staff to allow them to more effectively prepare for patients’ arrival. Police officers – many of whom are now equipped with body-worn cameras – will be able to live stream video. Real-time content transmission will be used to aid satellite surveillance.
A challenging transition
This sounds great, but the roll-out has not been without its challenges. Work has begun to build the first mobile mast for the network, near Lockerbie in Scotland, but the entire network is a while off completion and an exact end date hasn’t been specified.
Behind schedule and over budget, completing the build of the standalone public safety network will likely take between five and 10 years. Network operator EE is developing a dedicated core system to support ESN, including hundreds of 4G sites to expand coverage in rural areas, while 800MHz spectrum will be deployed across thousands of other locations. Additional functionality, such as push-to-talk and group chat, will require new core, signal processing and radio interface protocols to be developed, tested and deployed. It’s therefore of little surprise that the government has been hesitant to confirm a date for the voice services element of ESN to be fully up and running.
A second challenge concerns coverage. As a public safety comms network, coverage must be ubiquitous, allowing emergency service personnel to communicate wherever they are. This is less of an issue for the legacy Tetra system, which uses low-frequency, narrowband 400MHz spectrum that delivers effective coverage and in-building penetration in almost any environment.
ESN’s higher-frequency spectrum, on the other hand, has less range and penetration, which could compromise communication in areas like basements, tunnels, caves and remote rural locations – just the kind of areas where the public are potentially at greatest risk of an incident!
Coverage is also a concern for those who’ll actually be using ESN – emergency services personnel. In a survey conducted in 2017, individuals from police forces, fire authorities and ambulance trusts ranked network coverage as number one in their list of concerns regarding the transition to the new network.
Rapid response required
Networks must be densified in order to provide coverage in hard to reach areas, and building owners (modern building infrastructure can also impair coverage) must ensure that they invest in in-building coverage solutions which deliver scalable, affordable LTE public safety communications in these challenging environments. The same goes for road and rail tunnels. Public safety comms solutions must be flexible and straightforward to adapt and upgrade, dependent on regional requirements.
Pushing back the original date of deployment was a wise – if unavoidable – move by the government. In the meantime, we’ll see a phased deployment, which will involve a hybrid Tetra and LTE network – an approach which Britain is not alone in taking. South Korea’s LTE public safety network – which is mid-deployment – is interoperable with legacy Tetra equipment, and in France, an LTE network is being built which will share Tetra’s current infrastructure.
The project is an ambitious one and challenges are to be expected. However, these can be overcome with time and through considered investment in innovative new network coverage solutions. The government has estimated that ESN will result in annual savings of £200 million, yet these will only be realised once Airwave has been fully replaced.
The clock is ticking, and although many members of the public won't have heard as much about LTE ESN as they have about 5G, this is still public money and government time that many will be keeping a close eye on. The government and private sector partners must work hard in 2019 to ensure that the ESN can deliver on the price point and performance promised.
This is a guest blog post by Ingo Flomer, vice president of business development and technology at Cobham Wireless
This is a guest blog by Manish Jethwa, CTO at IoT specialist Yotta
Our national infrastructure assets are under more strain than ever. According to the Office for National Statistics (ONS), the UK population was just over 66 million in June 2017. By 2041, the ONS projects it will reach almost 73 million. In line with this, road usage is increasing. According to the Department for Transport (DfT), 327.1 billion miles were driven on Great Britain’s roads in 2017, a 1.3% increase from the previous year, and up nearly 17% on the corresponding figure for 1997.
At the same time, environmental conditions are becoming more challenging as climate change accelerates. In 2018 alone, we have seen the ravages of the ‘Beast from the East’, followed by heatwave conditions as the country experienced one of the hottest summers on record.
In response to these pressures, councils clearly need to put in place a maintenance strategy that protects their infrastructure assets. Given ongoing budget cuts, effective asset management will be required to better identify and target the most critical and vulnerable assets with available funds.
The latest Internet of Things (IoT) technology and sensors can be beneficial here, but if councils want to deliver an operationally efficient and environmentally friendly approach, they need to use them selectively. Councils should be aware that the manufacture and energy usage of these sensors carry an environmental cost, and should balance that impact against the benefits gained from using them.
That said, there are a range of applications where benefits can be achieved. Selective use of sensors can, for example, be crucial in helping councils develop a maintenance strategy that protects against the dangers of flooding. The increased surface water from storms can put a strain on infrastructure that only regular maintenance can help alleviate. Water and silt level sensors mounted in drains can be the key to identifying those assets that are more likely to get blocked, or that are likely to have the biggest impact if they are blocked, allowing these drains to be maintained more frequently. If these drains and sluices are kept clear, the impact of storms is lessened and the potential for flooding reduced.
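The sensor-driven prioritisation described above can be sketched roughly as follows. This is a minimal illustration, not any council's actual system: the field names, readings and thresholds are all hypothetical.

```python
# Hypothetical sketch: rank drains for priority maintenance using
# water/silt sensor readings. Thresholds and field names are illustrative.

SILT_THRESHOLD = 0.6   # fraction of drain depth occupied by silt
WATER_THRESHOLD = 0.8  # fraction of capacity reached during the last storm

def drains_needing_maintenance(readings):
    """Return drain IDs sorted by how urgently they need clearing."""
    flagged = []
    for r in readings:
        # A drain is at risk if silt build-up is high or it nearly overflowed.
        risk = max(r["silt_level"] / SILT_THRESHOLD,
                   r["peak_water_level"] / WATER_THRESHOLD)
        if risk >= 1.0:
            flagged.append((risk, r["drain_id"]))
    return [drain_id for _, drain_id in sorted(flagged, reverse=True)]

readings = [
    {"drain_id": "D-101", "silt_level": 0.7, "peak_water_level": 0.5},
    {"drain_id": "D-102", "silt_level": 0.2, "peak_water_level": 0.3},
    {"drain_id": "D-103", "silt_level": 0.4, "peak_water_level": 0.9},
]
print(drains_needing_maintenance(readings))  # D-101 and D-103 exceed a threshold
```

A real deployment would of course fold in the asset's downstream impact and maintenance history, but even a simple threshold rule lets crews be sent where blockage risk is highest rather than on a fixed rota.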
Another area where judicious use of IoT solutions can improve public safety is in the development of smart tree technology. Sensors are now emerging capable of measuring the movement of trees to assess their condition and stability.
Once again though, the sensors need to be distributed sparingly to ensure the environmental benefit of trees is not offset by the cost and ongoing energy use of monitoring them. Councils again need to find the most vulnerable trees, typically using data about age and condition together with the expertise of internal arboreal teams, and only mount sensors where required.
Using sensors to target areas of vulnerability within these two different asset classes could provide the authorities with the potential to start building an intelligent combined asset management approach. By pinpointing where trees and the most vulnerable drains are located, they can start to establish connections between the two. That might help them see, for example, that they have a large number of deciduous trees located next to a high number of drains in a low spot. Leaf fall is therefore likely to be high, and the fallen leaves tend to wash together and block the drains as a result. That insight can be a significant benefit to councils, which can then better understand how different asset classes link together and how maintaining one can positively impact the status of another.
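Linking the two asset classes can be as simple as a proximity query over the council's location data. A minimal sketch, using made-up coordinates and an arbitrary 25-metre radius:

```python
# Illustrative sketch: flag drains that sit close to several deciduous
# trees, where autumn leaf fall raises blockage risk. All data and the
# distance threshold are hypothetical.
import math

def nearby_deciduous_count(drain, trees, radius_m=25.0):
    """Count deciduous trees within radius_m metres of a drain."""
    return sum(
        1 for t in trees
        if t["type"] == "deciduous"
        and math.dist((drain["x"], drain["y"]), (t["x"], t["y"])) <= radius_m
    )

drains = [{"id": "D-1", "x": 0, "y": 0}, {"id": "D-2", "x": 200, "y": 0}]
trees = [
    {"type": "deciduous", "x": 10, "y": 5},
    {"type": "deciduous", "x": 15, "y": -10},
    {"type": "evergreen", "x": 5, "y": 5},
]

# Drains with several deciduous trees nearby become candidates for more
# frequent clearing in autumn.
at_risk = [d["id"] for d in drains if nearby_deciduous_count(d, trees) >= 2]
print(at_risk)
```

The same join, run over real GIS data, is what lets a council reason across asset classes: clear drain D-1 more often in October because of the trees around it, not because of anything the drain itself reported.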
As we look to the future, councils should be carefully monitoring the latest advances in IoT and sensor technology. The use of sensors can bring significant benefits. However, a note of caution needs to be sounded. There are costs to using sensors, both financial and environmental, and therefore they should never be used in a scattergun manner. Instead, their deployment should be precisely targeted as part of a carefully planned, coordinated and, ideally, connected asset management strategy. If used in that way, they can be a real boon for councils looking to mitigate the worst impacts of ongoing wear and tear, and severe climatic events like storms and flooding, on their infrastructure assets.
This is a guest blog post by Tony Judd, MD UKI and Benelux at Verizon
With new technologies such as artificial intelligence (AI), the Internet of Things (IoT) and software-defined networking (SDN) impacting almost every aspect of modern business, organisations are having to transform their networks in order to take advantage of these developments. Some have seized on this change to falsely prophesy the end of multiprotocol label switching (MPLS), but this couldn't be further from the truth.
Although SDN and other networking techniques are transforming how networks are architected and operate, they do not actually replace the functionality that MPLS provides. It is true that SDN has helped drive opportunities to augment network architectures with lower-cost broadband and public internet connections to enable hybrid networking. However, SDN does not actually replace the need for higher-quality MPLS connections for critical applications, as some over-the-top (OTT) network providers might have you believe. Both technologies will coexist and, in fact, SDN will depend on MPLS for traffic management and security—the attributes that made MPLS networks reliable and desirable in the first place.
Recent technology advances such as media streaming, social media and mobility have generated massive amounts of data that flow into networks from a myriad of devices. Now, as IoT, AI and edge computing environments start to go live, data volumes will become astronomical. Currently, 2.5 exabytes of data are generated daily, and Cisco estimates that data volume is growing at an annual rate of 24% through 2021.
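Taking the figures above at face value, the compounding is easy to work out. This is back-of-envelope arithmetic on the quoted numbers, not a Cisco projection:

```python
# Back-of-envelope projection of daily data volume, using the figures
# quoted above: 2.5 exabytes/day today, growing ~24% per year.
daily_eb = 2.5
growth = 0.24

for year in range(1, 4):  # three years of compounding
    daily_eb *= 1 + growth
    print(f"Year {year}: {daily_eb:.2f} EB/day")
```

At that rate, daily volume nearly doubles within three years, which is the scalability pressure the rest of this piece is concerned with.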
Combined, all of the recent and ongoing technology developments – cloud streaming, IoT, mobility – changed how enterprises consume applications and, as a result, also changed bandwidth demands and (WAN) traffic patterns. As such, enterprises face serious challenges related to scalability, security and network performance. Network traffic is unpredictable and much of it flows from multiple sources dispersed throughout private and public cloud infrastructures as well as data centres.
Scalability limitations and security concerns are more pronounced for enterprises that use multiple vendors to run their networks. Like the reliability of the network itself, security policies and solutions vary from vendor to vendor. For instance, OTT service providers deliver security at the application layer because they don't own the underlying network, so the data they handle can become more vulnerable when crossing network boundaries. That's because elements of the underlying networks are managed by multiple service providers that don't always communicate or collaborate with each other. In contrast, a provider that owns the underlying network infrastructure can design a secure network to meet enterprise needs.
Easing traffic congestion
To get the most out of their SDN investments, enterprises should use MPLS for critical applications and locations and simply supplement with broadband for less critical traffic. MPLS is designed with the built-in security and scalability that modern businesses demand. Network providers that own the underlying network can deliver strong protection against increasingly common types of cyber-attacks – DDoS (distributed denial of service), ransomware and zero-day threats.
Today’s enterprises also need smart networks that prioritise traffic based on the applications they use, both at the point of entry and exit from the network. Intelligent networks prioritise each application and allocate the proper amount of bandwidth. For instance, the network distinguishes audio and video applications, which require higher priority, from casual internet browsing.
This refined approach to traffic balancing isn’t available through public internet connections, but there are providers that offer private MPLS connections and monitor those connections around the clock to maintain performance, scalability and security.
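As a rough illustration of the idea, application-aware allocation can be modelled as granting each traffic class a share of link capacity in strict priority order. The class names, shares and capacity below are hypothetical, not any provider's actual QoS policy:

```python
# Hypothetical sketch of application-aware bandwidth allocation: traffic
# classes are granted a share of link capacity in priority order, so
# high-priority classes are never starved by best-effort traffic.

LINK_CAPACITY_MBPS = 100

# Lower number = higher priority; share is the fraction of capacity requested.
traffic_classes = [
    {"name": "voice",    "priority": 1, "share": 0.20},
    {"name": "video",    "priority": 2, "share": 0.50},
    {"name": "browsing", "priority": 3, "share": 0.50},  # best effort
]

def allocate(classes, capacity):
    """Grant bandwidth in priority order until the link is exhausted."""
    remaining = capacity
    grants = {}
    for c in sorted(classes, key=lambda c: c["priority"]):
        grant = min(c["share"] * capacity, remaining)
        grants[c["name"]] = grant
        remaining -= grant
    return grants

print(allocate(traffic_classes, LINK_CAPACITY_MBPS))
```

Here voice and video receive their full requested shares, and browsing absorbs whatever is left; under congestion it is the best-effort class, not the real-time applications, that gets squeezed.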
Advanced security through MPLS
A further benefit of using MPLS is that it can help to deliver strong security through the design of the network. Through private connections, MPLS can be used to separate IP addresses from routers and hide the internal structure of the core network from the outside.
In addition, MPLS can be used to put in place additional controls customised to an organisation’s specific needs. These controls can typically support an organisation’s compliance with industry-specific regulations or standards such as HIPAA (Health Insurance Portability and Accountability Act) for healthcare and PCI DSS (Payment Card Industry Data Security Standard) for retailers and other businesses that process credit card information.
MPLS and SDN: working together
Without a doubt, SDN is changing how networks are managed, bringing increased flexibility and scalability to enterprises and allowing them to dial services up and down as required. However, SDN will not mean the end of MPLS; instead, SDN will rely on MPLS to increase security and manage traffic effectively. With this in mind, businesses that want to put in place the latest and greatest digital capabilities should adopt a network strategy that delivers the best of both worlds: SDN controls combined with MPLS capabilities.