Computer Weekly Editor's Blog

June 15, 2018  10:12 AM

The next moves by DCMS on data and digital identity must lay foundations for UK’s digital economy

Bryan Glick

Matt Hancock, the secretary of state for digital, culture, media and sport (DCMS), is not everyone’s cup of tea – especially when that cup is Matt Hancock branded. But regardless of whether you’re a fan or not, he’s clearly on a mission to accelerate the UK’s digital economy, and is winning some important battles to do so.

Across Whitehall and in the private sector, the past couple of years have seen growing frustration with the Government Digital Service (GDS) at the slow pace of its plans for data and digital identity – both rightly identified as being fundamental to the digital economy.

That frustration has been particularly felt in Hancock’s DCMS department, where national technology advisor Liam Maxwell and Matthew Gould, director general for digital and media policy, are responsible for “making sure the UK has the world’s best digital economy”.

Rumours of heated debates between Cabinet Office and DCMS officials came to a head in March when GDS lost control of data policy to Hancock’s team, barely a year after GDS put data at the heart of its government transformation strategy.

This week, Hancock announced plans for a new National Data Strategy – not yet in place, but already pushing forward on the data agenda that seemed in limbo over the previous 12 months.

In a press briefing the week before, attended by Computer Weekly, Hancock also dropped the bombshell that DCMS has quietly taken over digital identity policy too. If his team move as quickly again, we can expect to see a new strategy in place in the coming months, one that will be widely welcomed and will hopefully clarify what role GDS’s Verify ID assurance system will play in the UK’s wider digital identity ecosystem.

Hancock’s announcement seemed to take GDS by surprise – its officials initially claimed nothing had changed and bizarrely suggested GDS was never in charge of digital identity policy. Even allowing for the intricacies of government machinery, that would come as news to most people operating in the identity community.

DCMS’s challenge now should not be underestimated – the steps it takes next on data and digital identity will resonate for years to come, setting the tone for the next stages in developing a world-leading digital economy in the UK post-Brexit.

We now have the most pro-digital government the UK has ever had – in terms of policy, direction and ambition, at least. It’s vital that DCMS turns that intent into practice and lays the foundations that allow the UK to take advantage of the enormous opportunities of the digital revolution.

June 11, 2018  3:03 PM

Brave new world opens up for Microsoft

Cliff Saran

Under the leadership of Satya Nadella, Microsoft looks like a very different organisation from the one his predecessor, Steve Ballmer, attempted to build. Ballmer infamously jumped up and down on stage at a Microsoft developer conference, proclaiming: “I love developers.” Outwardly, it looked like his strategy built on the strong Microsoft ecosystem pioneered by founders Bill Gates and Paul Allen, to spread Windows everywhere.

It is highly unlikely that Nadella will ever jump up and down on stage for anything. Instead, under his leadership, Microsoft is currently the largest contributor on GitHub, and last week announced its $7.5bn acquisition of the open source repository.

Arguably, open source is not an alternative to commercial software. Rather, it is a way to create software collaboratively, and at a global scale. Active contributors in the open source community resolve problems; the source code is visible and can be tweaked and updated by anyone.

Analyst Gartner believes Microsoft’s acquisition of GitHub gives it a way to target 17 to 30-year-old developers, who code primarily using open source tools and build cloud-native software. That is why Microsoft acquired Xamarin in 2016, bringing an open source version of .Net, and why its Visual Studio Code editor is available on Linux.

If it is to succeed in offering a viable platform in the Azure cloud, Microsoft needs to make Azure the best platform for open source. The value-added services it then offers on top of Azure, such as machine learning, artificial intelligence, the internet of things and graph databases, are the bait to entice developers to write code on its public cloud.

Yet for every open source project Microsoft puts on GitHub, there are likely to be alternatives that developers can use – ranging from code submitted by the likes of Amazon Web Services, Facebook and Google, to the work of startups and individual developers. This gives open source developers a choice, a choice that was not so easy in the Windows-only world of the old Microsoft.

Nadella now has a chance to do something truly remarkable. After the initial licence purchase, Windows updates are already distributed free of charge. The next step is to make the operating system itself free and rely on enterprise support contracts for revenue. And finally, Nadella should use the acquisition of GitHub to pave the way to making its core operating system fully open source.

May 31, 2018  3:12 PM

UK will lose without a new approach in the global race for digital talent

Bryan Glick

More than three-quarters of UK organisations are experiencing challenges in recruiting people with digital skills, according to research from Deloitte. It’s a shocking figure, but is anyone really surprised?

“The skills shortage” has been a trope of tech news headlines for 20 years – the skills for which we are short might have changed, but the UK IT sector has carried on regardless and isn’t doing too badly. So is it really such a problem?

Plenty of cynics would say it’s not, but the reality on the ground for IT leaders suggests otherwise. When Computer Weekly asks CIOs what their biggest challenge is, the answer is almost always a lack of available talent with the skills they need for digital transformation.

Maybe 20 years ago the lack of skills related to SAP or Oracle products, as companies rolled out finance and business management software – there’s no shortage in those areas any more. But generally, it was only larger organisations affected.

In our fast-developing digital economy, the new skills needed are in demand from organisations large and small, in the private or public sectors. Every company and government body needs to change to take advantage of internet technologies.

As we have catalogued in Computer Weekly on many occasions, the UK education and training system simply does not produce enough people to meet IT’s demand. Meanwhile, the government seems intent on making it harder to attract top overseas talent to help fill the gaps.

Thousands of people eligible for skilled visas in the UK are being refused entry due to government immigration caps. Between December 2017 and March 2018, around 3,500 of the 6,080 Tier 2 visas refused were for people skilled in science, technology, engineering and maths, with 1,226 of the refusals affecting IT roles. It’s also harder to get student visas, sending enthusiastic youngsters elsewhere. Let’s not forget the Brexit effect too.

There’s a global war for top technology talent, and without a change in approach, the UK is going to be on the losing side. Look at Canada, for example, where the government is working to promote its tech sector and recognises the need to import the best people. Anyone with the requisite tech skills and a job earning at least C$80,000 can get a work visa in just two weeks, and the government is offering generous tax credits to fund up to 50% of salary for jobs in research and development.

Canada’s liberal immigration regime is a counterpoint to its nearest neighbour, where US president Donald Trump’s belligerent approach is preventing some foreign IT workers entering the country. Canada hopes to benefit – Seattle-based Microsoft and Amazon, for example, are setting up development centres in Vancouver, a short hop across the border, to house imported skills there instead.

“Canada’s comparatively open attitude towards immigration and attracting talent is a real strength that the UK should look to as a model,” says Thomas Goldsmith, policy manager for Brexit and trade at UK IT trade body TechUK.

“The UK should be keenly aware that there is a global marketplace for in-demand digital skills, and if these workers find it difficult to secure a UK visa, or if they are made to feel unwelcome, then there will be no shortage of countries holding their arms open to them instead.”

Deloitte’s research also shows that UK organisations want to increase their investment in emerging technologies such as artificial intelligence, blockchain, internet of things and virtual reality. Firms can see the opportunity – they just can’t find the talent to make it happen. The UK’s digital economy is under threat until the skills gap is addressed.

May 16, 2018  9:14 AM

GDS aims for path of least embarrassment as it shifts strategy on Verify

Bryan Glick

Last week, the Government Digital Service (GDS) dedicated a whole day to talking about what it’s up to, at its first Sprint event in two years. About time too, many observers said, since GDS seemed to have gone to ground over the past six months, notable mostly for its silence even at a time when Theresa May diminished its role in Whitehall by stripping away responsibility for data policy.

However, it tells you something about how GDS is perceived at the moment that the biggest news headlines from the day came instead from a 10-minute press briefing given to journalists on the sidelines of the event.

During a day of presentations and workshops showcasing the work of GDS, its Verify digital identity system was barely mentioned. But when Nic Harrison, director of service design and assurance, digital identity at GDS – the man in charge of Verify – talked to Computer Weekly and others, he opened up something of a Pandora’s box.

To Harrison’s credit, his openness was welcome – GDS’s responses to Computer Weekly questions about Verify have for some time bordered on monosyllabic, and have often been information-free to the point of being meaningless.

All the while, the players in the UK’s burgeoning digital identity ecosystem have grown ever-more frustrated with the lack of progress on Verify.

Poor adoption

Harrison conceded that Verify adoption is poor – “still not stellar,” as he put it. He denied there was ever any fight between GDS and other Whitehall departments over Verify, citing “very sound operational reasons” to justify the lack of take-up.

That’s not what sources in some of those other departments have consistently said, but maybe they see things differently from outside GDS. Or perhaps “operational reasons” includes Verify’s inability to be used by companies or intermediaries – an essential requirement for HM Revenue & Customs (HMRC) – or to work for Universal Credit claimants with little digital footprint.

Harrison also cited departmental cost controls introduced in the 2015 spending review, and the demands of Brexit as further reasons for the problems afflicting Verify.

“They’re worrying about EU exit, so we are frankly just not going to get hundreds of new services being digitised in the next year to bring on Verify,” he said.

As some observers have noted, GDS was given £450m in the spending review, a significant portion of which was earmarked for Verify – in return for an anticipated £1.1bn savings. But because of the poor performance of Verify, GDS hasn’t been able to spend all the money it was allocated for the project, and has been under pressure to return that unspent budget to the Treasury.

Brexit? In February 2017, nearly eight months after the EU referendum, the government transformation strategy committed GDS to delivering 25 million users of Verify by 2020.

Losing patience

Perhaps it’s unsurprising, therefore, that patience among key players in the digital identity community is wearing thin.

Even OIX, the standards body that is largely funded by the Cabinet Office, and which GDS chose to help develop the market for Verify, is starting to publicly criticise progress. In a report published in April, pointedly titled Digital identity in the UK: The cost of doing nothing, OIX said: “The UK is among an ever-smaller group of developed nations without a national digital identity infrastructure.”

The report added: “While over 60 countries around the world have developed or are close to launching a digital identity scheme, digital identity developments in the UK have been more limited… We have seen the government’s own identity scheme Verify emerge, but it has yet to reach the widespread adoption it was targeting.”

Behind gentle words often lies harsh criticism.

Experts highlight a sense of déjà vu in GDS’s latest comments on Verify. Here is a comment from one longstanding digital identity expert, to Computer Weekly:

“All the good (and bad) words on standards, industry engagement, as well as dogma about hubs seemed unchanged from [a] presentation to the Department for Trade and Industry in 2011. European interoperability demands were news at the Manchester declaration in 2005 – not delivered by 2010, so who thought that what was due in 2013 would be here by 2018? Local government still seemed excluded from contribution to provision.

“No battles? Perhaps those battle-scarred by the bitter turf war with DWP have all gone – and those in HMRC changed sides? Scotland didn’t engage. Defra and NHS weren’t exactly enthusiastic. And the missing commercial model for Verify? Not only is there no discernible interaction with active groups like Kantara, even the usually obsequious OIX’s latest report has indicated that all is not well in the UK.”

Other identity experts lament how the delays in Verify have hindered the development of a UK identity ecosystem, and warn that the UK is slipping down the list of digital economy nations as a result. Some have called for greater involvement and leadership from banks, especially in light of recent financial industry regulations such as PSD2 and open banking.

Change of strategy

GDS director general Kevin Cunnington was due to speak at an event on digital identity later this week – but according to the latest update on the event website, he has mysteriously disappeared from the agenda, with Nic Harrison stepping up instead.

Rumours grow that GDS is considering a change of strategy for Verify. At Sprint, Harrison confirmed Computer Weekly’s story that GDS is moving towards promoting “Verify compliant” identity providers, rather than pushing Verify itself as a product.

He also partly acknowledged concerns about the role of the existing Verify identity providers, who have an effective monopoly on public sector ID where Verify is in use.

Harrison said he wants to see an ecosystem based on interoperability and standards – with government as the “arbiter”, not the provider. As the old saying goes, the great thing about standards is there are so many of them to choose from.

So it would appear that GDS is slowly and quietly starting to acknowledge the flaws in the current Verify plan – and hopes to extricate itself through a path of least embarrassment. A new strategy is emerging that will place greater responsibility on the private sector to drive the UK’s digital identity infrastructure.

The next question will be whether that’s enough to relieve the frustration caused by Verify so far, and stimulate the functioning identity ecosystem that the UK’s digital economy so desperately needs.

May 11, 2018  11:12 AM

Will Brexit stymie government’s digital transformation ambitions?

Bryan Glick

There’s rather a lot going on in government IT at the moment.

Last year’s ambitious transformation strategy set challenging aims of overhauling back-office systems across Whitehall, building an advanced data capability, and having 25 million users of the Verify digital identity system by 2020.

There’s the Universal Credit digital system being rolled out nationwide this year, despite concerns it’s not ready to cope with the scale of benefit applications. There’s the new digital tax system from HM Revenue & Customs (HMRC). And there’s a whole bunch of ageing, costly outsourcing deals to be wound down and replaced.

Oh, and there’s Brexit.

It’s increasingly clear that government does not have remotely enough capacity to do all these projects, and something is going to break. The cracks are already showing.

The demands of Brexit alone are already pushing the most-affected departments hard. MPs have expressed concerns about the lack of progress on key digital systems that will need to be overhauled to cope with leaving the European Union.

The Public Accounts Committee has highlighted an urgent need for digital skills across Whitehall to cope with Brexit. That’s on top of the 4,000 extra IT staff previously identified as a priority for departments’ digital transformation plans.

Already, important projects are being scaled back. HMRC is stopping or delaying a number of digital initiatives to allow it to focus on the customs system overhaul needed for Brexit. The Government Digital Service (GDS) this week said that Brexit priorities are a major factor in the painfully slow roll-out of Verify – and that wider transformation plans are being affected too.

“[Departments are] worrying about EU exit, so we are frankly just not going to get hundreds of new services being digitised in the next year,” said GDS director Nic Harrison.

And despite years of promises that government was not going to renew its many bloated outsourcing deals, exactly that is happening because there’s too much else going on to explore alternatives.

Oliver Dowden, the latest minister to be put in charge of digital government, said this week he is confident the government will achieve end-to-end digital services for citizens by 2020, in line with the transformation strategy.

He’s dreaming if he believes that. And with all the uncertainty still about Brexit, government can’t even be sure it’s identified all the likely tech implications of whatever future relationship we will have with the EU.

There isn’t a magic digital skills tree from which Whitehall can pluck tech expertise. There are a lot of big IT companies sharpening their contract-signature pencils, and we all know what happens when they get too big a chunk of government IT work.

Nobody wants to see the pace of digital transformation held back in government, but it’s clear something has to change. The current IT workload will soon become unsustainable.

April 25, 2018  4:07 PM

The biggest surprise about TSB’s IT disaster is that people are still surprised when banks’ IT fails

Bryan Glick

The only real surprise about TSB’s IT disaster this week is that people are still surprised when a retail bank has IT problems.

There are few organisations with a more complex and difficult technology legacy estate than the big banks. For all the online banking sites and mobile banking apps we’ve become accustomed to, these are only modern sticking plasters patched over some of the oldest corporate IT systems in business.

At the heart of most banks you’ll still find a hulking IBM mainframe running software written 20, 30, maybe even 40 years ago, often in programming languages so obsolete that the only people who understand them are retired or quite possibly dead. Cobol is widely used – and is perhaps one of the more modern languages you’d find.

Most of those applications still run as overnight batch processing systems, crunching numbers during the small hours to reconcile all the transactions from the multitude of other systems that feed into them. Your mobile app might show the purchases you made a couple of hours ago – but the master file sitting in the back-end mainframe won’t know a thing about it until that overnight batch run completes.
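To make that split between channel and core concrete, here is a toy sketch – not any real bank’s code, and with hypothetical names throughout – of how a master ledger can lag the channels that feed it until the overnight batch run reconciles the day’s transactions:

```python
# Toy illustration of overnight batch reconciliation (hypothetical design,
# not any actual bank's system). Front-end channels log transactions during
# the day; the master file only reflects them after the batch run.

class CoreLedger:
    def __init__(self, opening_balance):
        self.master_balance = opening_balance  # what the back-end "knows"
        self.intraday_feed = []                # transactions awaiting batch

    def post_from_channel(self, amount):
        """A card purchase from an app or ATM: logged, but not yet reconciled."""
        self.intraday_feed.append(amount)

    def app_view(self):
        """The mobile app shows the master balance plus pending intraday items."""
        return self.master_balance + sum(self.intraday_feed)

    def overnight_batch(self):
        """The small-hours run: fold the day's feed into the master file."""
        for amount in self.intraday_feed:
            self.master_balance += amount
        self.intraday_feed.clear()

ledger = CoreLedger(opening_balance=100.0)
ledger.post_from_channel(-25.0)   # afternoon card purchase
print(ledger.app_view())          # 75.0 – the app already reflects it
print(ledger.master_balance)      # 100.0 – the mainframe doesn't, yet
ledger.overnight_batch()
print(ledger.master_balance)      # 75.0 – reconciled after the run
```

The gap between `app_view()` and `master_balance` before the batch run is exactly the lag described above – and migrating such a system means moving both the master file and every feed into it, in step.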

TSB was forced off the legacy IT of former parent Lloyds Bank only because it was acquired by a Spanish bank, Sabadell. The banking system TSB was migrating onto – a UK version of Sabadell’s in-house Proteo software – was developed in 2000, and that makes it young by banking standards.

Even then, the complexity and scale of the migration floored TSB. There are important questions to be answered around contingency planning and the ability to roll back to a stable platform when problems occurred. No doubt we will hear more before long at the inevitable House of Commons select committee inquiry.

But other banks will be looking at TSB and thinking, one day that could be us. The cost of moving from the current patchwork of legacy banking spaghetti to a modern, integrated system would run into the billions of pounds, and be the single biggest risk factor for the entire organisation.

For that reason, banking CEOs and CIOs – who rarely stay in post more than three to five years – have no desire to be the one who oversees such a high-profile, risky move. And so the problem perpetuates, handed from one CEO to the next.

But the banks know that the day will come when the cost of doing nothing outweighs the risk of change. TSB needs to be open and transparent about what went wrong, for all of its peers in the industry to learn from its mistakes.

April 20, 2018  10:22 AM

Why Windrush scandal demonstrates the importance of technology to political decision-making

Bryan Glick

The Windrush scandal has revealed many disturbing things about government policy that have nothing to do with IT. But the issues raised amply demonstrate the importance of technology considerations in driving 21st century political decision-making.

Tony Smith, the interim director general of the UK Border Force between 2012 and 2013, told the BBC’s Today programme on Radio 4 this week that the major flaw in Theresa May’s “hostile environment” immigration policy as home secretary came down to one thing.

“You need an identity management strategy if you are going to have a hostile environment. You can’t have one without the other,” he said.

In the Brexit debate over immigration, one fact that has been often overlooked is that the European Union’s freedom of movement rules have an important exception. The European Parliament and Council Directive 2004/38/EC allows member states to deport EU nationals to their home country after three months if they have not found a job or cannot support themselves.

In order to do this, you simply need to know which EU citizens are in the UK, and whether they have found a job. To do that, you need exit checks at the borders to find out who has left the country and when – the UK knows who comes in, but has never known who subsequently went back out.

Why didn’t we have exit checks? Well, they were scrapped by Tony Blair in 1998, and – in theory – reintroduced in 2015, although a report last month by the chief inspector of borders and immigration said the exit check programme was not delivering what was promised.

Without exit checks, an identity management system for immigration is worthless, so the government never bothered with one – although Labour wasted millions on its identity card programme which was scrapped in 2010 as soon as David Cameron became prime minister.

“It is about getting the balance right between national security and civil liberties,” home secretary Theresa May said at the time.

Today, we have the troubled Verify identity management programme, which continues to under-perform and under-deliver, despite lofty ambitions to have 25 million users by 2020. But even if Verify works, it’s not linked to exit checks, and its federated design means it can’t be used to prove UK residence in support of immigration issues.

If the UK had a working identity management system in place – with suitable privacy and data protection controls, of course – that proved who you are and your right to be in the country, how different might the policy landscape be today?

April 6, 2018  11:29 AM

GDS needs to prove it can still transform government after loss of data policy to DCMS

Bryan Glick

I met Mike Bracken, the first leader of the Government Digital Service (GDS), in 2011, not long after he had started the job, for an off-the-record briefing about his ambitious plans to overhaul IT in Whitehall.

He asked me which online service I would highlight as an example of best practice and I chose the DVLA’s car tax system. Bracken acknowledged that most people would say the same, and added that his goal was for people to have many more options to choose from when answering that question.

What’s your choice for the best digital public service today? I suspect that many of you, like me, will still say car tax. Sure, there are better-designed web pages for a lot of other services and paper forms have been web-enabled, but the dramatic changes we all hoped for are harder to find.

Bracken’s high-profile resignation from GDS in August 2015 came when he lost the belief that Whitehall had the culture, organisation or intent to deliver on the vision he set out in 2011.

Dismantling GDS

Many observers – this one included – felt that Bracken’s departure would be the start of the slow process of dismantling GDS, a feeling accelerated when his successor, Stephen Foreshew-Cain, was unceremoniously ousted a year later, replaced by the DWP’s Kevin Cunnington. Given the friction between GDS and DWP, that move was seen as a victory for the mandarins over the disruptive, pioneering spirit of early GDS.

It’s taken a little longer than anticipated, but Theresa May’s announcement last week that GDS was losing responsibility for data policy confirms that process is underway. The diminishing will be gradual, but it seems increasingly inexorable.

How can any organisation claim to be the centre of digital government when it is no longer responsible for the core activity – the central plank – of digital services: data?

Perhaps we will find out on 10 May, when GDS hosts its first Sprint conference in two years. The event is billed as a chance “to celebrate all the great work that has been done so far to transform government” and to look at what comes next.

Cunnington wrote a blog in February to mark the first anniversary of the Government Transformation Strategy – the plan that outlined what GDS would deliver by 2020. If he had published a similar blog six months ago – even a year ago – the wording wouldn’t have been that different.

Political fudge

When the transformation strategy was launched, then Cabinet Office minister Ben Gummer said: “Data is going to be the way we achieve the largest transformation in government.” It still will be, but that policy is no longer led by GDS.

Twelve months after that promise at the launch, the Cabinet Office had still not delivered the first element of that aim: recruiting a chief data officer for government. Can you blame people for doubting its ability to deliver on data?

If GDS is to lead the transformation of government, then without data policy it has to do so wearing someone else’s badly fitting running shoes.

The Department for Digital, Culture, Media and Sport (DCMS), which won the Whitehall battle over data governance, has grown increasingly frustrated with GDS. DCMS fought just as hard to take on policy for digital identity too – GDS seems to have won that argument for now, but widespread disappointment over the performance of its Verify service means it’s under ever-greater scrutiny.

It’s illogical to separate responsibility for data and identity. It’s a political fudge intended to minimise further embarrassment for GDS. “OK, you’re losing data, but you can hang on to identity” – you can imagine the conversation that led to that compromise.

GDS staff frustration

Let’s be clear about one important fact though – GDS still employs some excellent digital specialists. Many of its staff do a great job – they’re passionate and committed to improving public services. But they are increasingly frustrated too, about the leadership and direction of GDS, as recent staff surveys have revealed.

GDS was once lauded for its principle of “make things open: it makes things better”. But today it’s hunkered down and secretive, rarely willing to comment beyond brief statements or bland, self-celebratory blogs. GDS still has an important story to tell – it makes no sense not to be telling it. Whoever decided to neuter GDS’s public voice must take a large share of the blame for the growing chorus of concern about its role.

People care about GDS. They want it to work. Most of the criticism directed at GDS comes because people want it to be better; they see its potential and the opportunity that exists. Nobody bothers to criticise an organisation they don’t care about. The moment the wider digital community stops caring about GDS, it’s dead.

The Sprint event will need to show – through measurable targets and proven deliverables – that GDS is still actively leading the digital transformation of government. It will not be enough to simply say it is, to claim it is, to speak in press-office approved words that obfuscate and avoid hard commitments.

The fact that the announcement about losing data policy was snuck out in a written statement without any fanfare at the end of the day before the Easter break, just as MPs left for the Parliamentary recess, was a tacit and shameful admission of failure in how digital government is perceived and communicated in Whitehall.

GDS has not even commented on the loss of data policy – not a word, despite press requests for its views. Such silence cannot continue – GDS needs to loudly and publicly address these concerns. People want GDS to work.

Digital reform

Also last week, two of the architects of the digital government strategy that led to the formation of GDS, Jerry Fishenden and Mark Thompson, published a new “manifesto” for the digital reform of public services.

Fishenden and Thompson were also co-authors of a 2010 paper titled Better for less, led by Liam Maxwell, then an advisor to the Conservative Party’s technology policy lead, Francis Maude. As Cabinet Office minister in the coalition government, Maude led the implementation of many of the recommendations in that paper and it served as part of the justification for creating GDS.

Maxwell, now national technology advisor at DCMS, is likely to be the prime mover in whatever DCMS now does with data policy.

If you feel inclined, read that 2010 paper, then read Fishenden and Thompson’s manifesto. The words might be expressed differently, but the principles are very much the same. How much has really changed in the interim?

We all see the potential, the opportunity, for public services designed for the digital age. So many people want GDS to be successful in delivering that goal. GDS, and the government, needs to demonstrate – publicly, openly and honestly – that it still can.

March 20, 2018  12:40 PM

When tech firms get too big – will Facebook & Google follow the cycle of IBM & Microsoft?

Bryan Glick

The technology industry is still relatively young, and certainly has a lot more growth to come. But it’s old enough now for us to see the repeating trends that make regulatory and legal restrictions of the big internet companies inevitable.

Google is already under scrutiny by the European Commission, and now Facebook is the focus of political and societal anger for its role in spreading fake news, influencing elections, and its relationship with the controversial data science firm Cambridge Analytica.

We’ve had three major eras in modern technology since the 1960s, and for the first two, the dominant company of the time has eventually been reined back by political pressure.

First, there was IBM in the mainframe age. As long ago as 1969, the US Department of Justice launched an antitrust case against IBM, alleging abuse of a monopoly in the supply of general purpose computers. The legal action lasted for 13 years before being dropped in 1982 – but the move changed IBM as a company and changed the IT industry too.

As part of its attempts to rebut the accusations, IBM for the first time unbundled software and services sales from its mainframe hardware, enabling greater competition and creating a whole new sector in the tech industry.

The forced move eventually proved almost calamitous for IBM. Despite being a pioneer of the PC in the 1980s, IBM’s internal paranoia about further antitrust action contributed to Big Blue missing the full impact of the PC revolution – the second major era of modern IT – leading to what was then the biggest loss in US corporate history, of $8bn in 1992.

The company that made hay while IBM’s profits declined was, of course, Microsoft, which became the dominant player in the PC age. Inevitably, this led to a similar turn of fortune, as Microsoft faced a series of antitrust cases in the US and Europe during the late 1990s and into the 2000s.

Distracted by its legal fights and by the need to change its own culture and sales behaviour, Microsoft was late to the game on the web and online search, and completely missed the rise of mobile, apps and social media. In 2007, then-CEO Steve Ballmer famously dismissed Apple’s new smartphone, saying “There’s no chance that the iPhone is going to get any significant market share. No chance.” The iPhone subsequently generated more annual revenue than all of Microsoft.

Now we’re in the internet era, and firms like Google and Facebook dominate. They are introducing new business models, and pioneering not just new technologies but new working practices. Society is only now starting to understand the implications of the vast databases of personal information these companies are gathering.

As someone once wrote, privacy is one of the defining challenges of the internet age. Facebook and Google are finding they have reached the limits of what people consider to be acceptable and ethical, after years of pushing those boundaries ever further.

Many experts predict that the next era of IT will be artificial intelligence (AI). As the influence of today’s early AI use rises, will changing political and legal attitudes to Google and Facebook ultimately stifle their plans to dominate in AI too? The historic example of Microsoft and IBM suggests it might – although if it does, that inevitably means there will be new AI giants emerging to restart the cycle.

March 1, 2018  3:17 PM

Can technology save the day for Brexit? (*Spoilers*: No)

Bryan Glick

If you happen to be an ardent Brexiteer, a true believer in the UK government’s desire for “innovative solutions” to avoid a hard border with the Republic of Ireland, we have good news for you. The technology exists to do the job and save the day for you.

For a start, there’s what your friends in IT call the “internet of things”, which means sensors on containers transporting goods across the Irish land border; sensors on the delivery vehicles; sensors tracking temperature, humidity, and other essential quality indicators; sensors on the goods themselves; sensors on animals – maybe even sensors on people (which of course they already have in their smartphones).

A well-designed customs and border software system would sit behind all this data, offering simple transactions for the companies and individuals that need to cross the border, and providing the basis of an infrastructure to deal with the majority of the challenges a hard border presents. You might even be able to develop an app. Maybe you could use blockchain, and some "artificial intelligence". You would probably still need some manual elements in the process, but largely, the problem is solvable. (Let's not worry for now about the dozens of existing government IT systems you would need to interface to or integrate with – you know, details.)
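To make the idea concrete, the automated clearance step might be caricatured in a few lines of code. This is a purely illustrative sketch – the category names, thresholds and the `clear_for_crossing` rule are invented assumptions, not any real HMRC or customs system – but it shows the core pattern: sensor readings checked against per-category rules, with anything non-compliant referred for manual inspection.

```python
from dataclasses import dataclass

@dataclass
class Consignment:
    """A load approaching the border, with its latest sensor readings."""
    consignment_id: str
    declared_category: str   # e.g. "chilled food" (illustrative categories)
    temperature_c: float     # reading from a hypothetical on-container sensor
    seal_intact: bool        # reading from a hypothetical tamper sensor

# Illustrative per-category rules such a system might enforce.
RULES = {
    "chilled food": {"max_temperature_c": 8.0},
    "general goods": {},
}

def clear_for_crossing(c: Consignment) -> tuple[bool, str]:
    """Return (cleared, reason) for a consignment approaching the border."""
    if not c.seal_intact:
        return False, "tamper seal broken - refer for manual inspection"
    rule = RULES.get(c.declared_category)
    if rule is None:
        return False, "unknown goods category - refer for manual inspection"
    max_temp = rule.get("max_temperature_c")
    if max_temp is not None and c.temperature_c > max_temp:
        return False, "temperature out of range - refer for manual inspection"
    return True, "cleared automatically"

if __name__ == "__main__":
    cleared, reason = clear_for_crossing(
        Consignment("GB-001", "chilled food", temperature_c=4.5, seal_intact=True)
    )
    print(cleared, reason)
```

The real difficulty, of course, is not this toy logic but everything around it: tens of thousands of sensors, dozens of legacy systems, and rules agreed between two jurisdictions.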
Admittedly, this is a very high-level description, but the oft-touted (mostly) frictionless border is a technological possibility – of that there is no doubt.

If you are a Remainer, or at least a not-so-hard Brexiteer searching for common sense among the casual frippery promoted by the likes of foreign secretary Boris Johnson, here’s your counter argument to the proposal above.

It won’t work. But you know that already.

Why not? Well, it will probably be able to work one day, but even with the best will in the world – and the best technologists, and a very large amount of money – it’s not going to happen anytime soon.

Technology can do some amazing things. But if government has learned anything from its various IT woes over the years, it is that doing something new, untried and untested, at large scale – and especially in a two-to-three-year timeframe (assuming the UK and EU agree a transition deal to the end of 2020) – isn't going to happen.

Frankly, if they agree a transition deal to 2025, it won’t happen either. 2030? Maybe – probably not though.

Remember the NHS National Programme for IT, which started out as a three-year project? How about Universal Credit, due to be fully implemented by 2016? What about e-Borders, kicked off in 2003 and still not implemented? Let’s not worry for now about the billions of pounds wasted on just those three.

The government will, at some point, have to face the fact that technology is not going to save them. If anything, IT is more likely to sink them.

We already know that HM Revenue & Customs alone needs to update 24 IT systems to be ready for Brexit – and doesn’t yet know what it needs to be ready for. We also know that the average duration of a modern, small-scale, well-defined digital project in Whitehall is two years – excluding large-scale transformation programmes, of which Brexit will require many. Oh, and government is already short of about 4,000 digital workers before even considering Brexit.

Should we also point out that the sort of tech implementation described above has never been done, at such scale, anywhere in the world? Presumably a post-Brexit Britain will develop that ingenuity and capability as soon as we leave the EU. Praise be.

Brexiteers – thanks for your faith in what technology can do for you. Just don’t think for a minute it will be able to do what you want it to do anytime soon.
