In one throwaway line, largely ignored among the wider Brexit comments in his Mansion House speech today (20 June 2017), chancellor of the exchequer Philip Hammond exposed the true scale of the challenge for government IT over the next few years.
This is what he said:
“How do we achieve this ‘Brexit for Britain’? Firstly, by securing a comprehensive agreement for trade in goods and services. Secondly, by negotiating mutually beneficial transitional arrangements to avoid unnecessary disruption and dangerous cliff edges. Thirdly, by agreeing frictionless customs arrangements to facilitate trade across our borders – and crucially – to keep the land border on the island of Ireland open and free-flowing.
“To do this in the context of our wider objectives will be challenging. It will almost certainly involve the deployment of new technology.”
That last sentence is the killer. Note that “almost certainly” is an understatement – it’s absolutely certain that Brexit will require new government IT systems in many areas, and significant changes to existing systems in others.
Hammond talked about customs – let’s look at that as an obvious example.
All movement of goods in and out of the UK is handled by an HM Revenue & Customs system called Chief – an acronym for “customs handling of import & export freight”. Chief collects some £34bn of tax revenue every year – it is a critical national system.
But it’s also 25 years old and can only handle around 60 million customs declarations per year. It was originally developed to run under the VME operating system on ICL mainframes – younger readers may need to Google “VME” and “ICL”.
As it stands, Chief is not fit for purpose in a Brexit world, and HMRC is already working to replace it. However, in March this year, the Treasury select committee said that confidence in the replacement system had “collapsed”.
According to then committee chairman Andrew Tyrie, the new system needs “to handle a possible five-fold increase in declarations that could occur when the UK leaves the EU. The consequences of this project failing, or even being delayed, could be serious. Much trade could be lost.”
Let’s look at immigration, another topic discussed by Hammond in his speech.
The current UK border systems centre on two main applications – Warnings Index, which is over 20 years old; and Semaphore, developed as a pilot project in 2004 but still used today.
A 2013 report by the independent chief inspector of borders and immigration found that Semaphore and Warnings Index were known to contain “critical system vulnerabilities”. A National Audit Office report in December 2015 found that Warnings Index “suffers from an average of two high-priority incidents a week”, including a component of the system not being available, or “30% or more of border control points being unavailable”.
A plan to replace both systems, the e-Borders programme, started in 2003 with the aim of improving the use of data to track people moving across the UK’s borders. The programme was eventually scrapped in 2014 at a cost of £830m, four years after the then home secretary, Theresa May, cancelled a £750m contract for the IT project. The contractor, Raytheon, subsequently sued the UK government and won £224m in damages.
The programme has since morphed through at least one, possibly two, further iterations. A smaller system for exit checks at borders went live in April 2015, but is not used everywhere. Exit checks will be critical to post-Brexit borders, because without them the government will not know who has left the country or when.
Indeed, the previous lack of exit checks has been cited as a factor that prevented the UK government from introducing EU laws that could have been used to limit freedom of movement without needing to leave the EU.
The Home Office is working on the eventual replacements for Warnings Index and Semaphore, which are clearly not fit for a new immigration regime outside the EU.
So there are two critical areas of Brexit – customs and immigration – already reliant on ageing IT, with ongoing and long-lasting problems in replacing them. And we don’t yet even know what our new customs and immigration rules will be.
Think of agriculture – the Department for Environment, Food and Rural Affairs (Defra), through its Rural Payments Agency (RPA), has seen two of the biggest IT disasters in its attempts to keep up with changes to EU agricultural subsidy schemes. Outside the EU, the UK government will need a whole new system to replace those subsidy payments to farmers.
That’s one major system at each of three departments – HMRC, Home Office and Defra. However, according to the Treasury committee, HMRC is also reviewing 24 other systems that may require changes to be ready for day one of Brexit.
Has anyone counted the number of existing government IT systems likely to be affected by Brexit? Has anyone assessed the amount of work and resource required to adapt or redevelop systems for Brexit? I don’t know the answer to those questions, but I’m willing to guess it’s no in both cases.
Meanwhile, the government has committed to an ambitious transformation strategy, aiming to deliver vital reforms such as better use of data, identity assurance, an overhaul of back-end systems, development of more digital skills and creation of other cross-government digital platforms.
Whitehall departments are already straining to recruit enough IT and digital expertise to meet these existing plans – let alone what else might be needed to deliver the unquantified scope of Brexit-related technology.
Something has to give
If you take a positive view, Brexit is an opportunity to redevelop Whitehall IT systems and create a digital government infrastructure fit for the 21st century. Realistically, political pragmatism means that’s unlikely to happen – resources will be focused piecemeal where they are needed most, even if there is a clear opportunity to impose digital standards across any such new projects.
Something will have to give. We’ve already learned that plans to scrap costly and inefficient outsourcing contracts are being put back or shelved because of resource limitations. The civil service surely will not have the capacity to deliver new Brexit technology at the same time as existing plans for the digital transformation of government.
The time and resources needed to develop Brexit-compliant systems have to be a factor in any transitional deal that may be agreed to avoid the damaging “cliff edge” of leaving the EU, a risk highlighted by Hammond.
The next five years were already a critical time for the progress of digital government. As Hammond has unwittingly revealed, the pressure is going to be even greater thanks to Brexit.
While British Airways counts the cost of its May bank holiday system outage – £80m and growing so far – much of the datacentre industry has listened to the supposed cause of “human error” and thought, “Are you sure?”
Willie Walsh, CEO of BA parent IAG, said the problem was caused by an engineer disconnecting the power supply to one of its datacentres, before reinstating the power incorrectly, leading to a power surge that damaged equipment.
Most experts that Computer Weekly talked to felt this scenario was – or should be – near impossible in a modern, well-designed datacentre. Either there is more to the issue than has yet been revealed, or it reflects badly on the datacentre design or set-up at BA.
It smacks of an over-reliance on legacy systems – BA is believed to still have software written decades ago for ageing mainframes. Modernity costs money – sometimes it’s seen to be better to maintain the creaking status quo, at least until it becomes too expensive not to.
The UK’s big retail banks are notoriously reliant on extraordinarily complex legacy systems that would cost billions to replace or modernise. So far, only Royal Bank of Scotland has suffered the sort of catastrophic outage on a similar scale to BA – but you can bet that CIOs at the other high-street banks knew it could have been them.
Not all legacy systems are bad, of course. But maintaining legacy that is clearly a long-term hindrance to any organisation is a management decision, not a technical one. It cannot be long before large, well-established companies start to lose market share and fail to compete sufficiently because of their legacy systems – which will then become the root cause of the collapse of a business, not just the cause of an IT outage.
CEOs and CIOs at the sort of large organisations likely to be at risk rarely stay in the job more than three to five years these days – not long enough to have the courage or mandate to take on such an enormous task as overhauling all that legacy complexity. In some cases, it’s simply too big a problem for anyone to take on.
But many of the industries least affected by digital disruption so far are also those with the biggest legacy investments – banking, manufacturing, insurance, for example. When digital revolutionises those sectors – and it will – the legacy laggards will be in serious trouble.
Companies die – that’s inevitable. The FTSE100 in 1990 looks very different to today. But increasingly the factor that determines success or failure will be the managed withdrawal of the sort of legacy systems that remain all too common.
Amid all the chaos, recriminations and excitement on the morning after the General Election, the future of digital government is far from the minds of anyone other than those of us with a personal interest. But it’s worth remarking on one of the high-profile Conservative losses – Cabinet Office minister Ben Gummer lost his Ipswich seat.
No matter how the next government shakes out, this means we will have our third minister in charge of digital government in barely a year, after Gummer succeeded Matt Hancock in Theresa May’s prime ministerial coronation reshuffle in 2016.
It’s no coincidence that the progress of digital transformation across Whitehall has stuttered and stalled in the last couple of years, following five years of consistent leadership from the previous Cabinet Office minister, Francis Maude.
Staff at the Government Digital Service (GDS) will no doubt be nervous, after a year when they have already seen their entire leadership team replaced. Gummer was a big supporter of GDS. Will whoever replaces him feel the same way?
After all, any new minister will look at GDS’s books and wonder what is happening to the £3.5bn savings promised when the team was given its £450m budget in the November 2015 spending review. A big chunk of the business case – £1.1bn – was predicated on the Common Technology Services programme, which has been largely mothballed. Another £1.3bn was to come from government-as-a-platform services, which have received a largely lukewarm reception from Whitehall departments, and notably little take-up from the big departments that are needed to justify the business case.
In particular, Gummer was a big supporter of Gov.uk Verify, the GDS-developed identity assurance system – earmarked to provide a further £1.1bn of savings. He even made delivering Verify one of the Tories’ manifesto commitments, repeating the hugely ambitious target of 25 million users by 2020 that was first introduced in the government transformation strategy in February.
It’s difficult to find anyone in GDS willing – or allowed – to talk publicly about Verify, but the team is still recruiting, which suggests confidence in its future. It’s very easy, however, to find people outside GDS willing to label Verify a disaster. Even before the election there was speculation that Verify could be merged with the Government Gateway programme at HM Revenue & Customs – a move that Gummer arguably might have resisted. It’s too soon, of course, to say whether his departure makes that more likely.
Judging by early indications, one thing the election result tells us is that young people have engaged with politics in huge numbers. When he launched the transformation strategy, Gummer said he believed that digital could “restore faith in democracy”. Certainly that younger generation will be expecting their government to engage with them using the digital means they consider routine in the rest of their lives.
It is, of course, far more important to get a functioning government in place, one that can address the many economic and global challenges the country faces. But however that comes to pass, any new administration needs to understand that significant and urgent decisions are needed on the future progress of digital government.
You would expect, of course, that a publication such as Computer Weekly would call on whoever wins the 2017 general election to put the digital economy on its list of immediate priorities.
While the Conservatives, Labour and Liberal Democrats all made important digital promises in their manifestos (the SNP made barely any mention other than a paragraph about broadband roll-out), there will never be an election more timely and vital for the UK’s place in the digital revolution.
As the most international of industries, technology must be at the heart of the UK’s post-Brexit economy if we are to retain any leading role on a world stage. We have an opportunity to plan for the digital economy of the 2020s, not simply to continue among the mass of followers reacting to digital change and not leading it.
That means an education, skills and training programme to prepare the workforce at all ages and career stages for the increasingly central role technology will play in the way we live and work in 10 years’ time. Alongside that, there need to be employment laws that reflect the changing nature of work while protecting the rights of workers who seek a living through firms that operate in ways that were impossible to conceive in the industrial age.
It means putting in place a security and regulatory framework that allows innovation to flourish and attracts inward investment – a safe but open tech environment in which startups and small businesses can compete and thrive alongside their more established rivals, while respecting individuals’ control and privacy over their own data.
It requires as a minimum a broadband and mobile network infrastructure that is regarded among the world’s best – not simply one that compares favourably to European laggards.
And it needs a government that engages with citizens and delivers public services using modern, digital methods that enhance its ability to adapt policy and services in a world changing faster than current public sector IT could ever cope with.
Not much, then.
The next 10 years are when the leaders in the digital revolution will be established – it is still early days, and there will be much greater technology-led social change in the next 20 years than in the last 20.
No government has yet put digital at the heart of its administration. Policy is split between different departments and ministerial responsibility is shared, with co-ordination patchy and inconsistent.
Whoever wins the election, it’s now time to appoint a digital minister, with a seat at Cabinet, and the accountability and authority to put the UK at the forefront of the global digital revolution.
It’s easy to read too much into any party election manifesto, but the Conservatives’ plans – should they win the election, as the polls suggest they will – offer plenty of scope for speculation around the next steps for digital government.
Cabinet Office minister Ben Gummer was closely involved in writing the Tory manifesto, and his hand is certainly apparent in the recognition of digital being highlighted as one of the “five great challenges” faced by the UK over the next five years. Gummer is, after all, a man who sees digital transformation as a means to “restore faith in democracy”.
It will be interesting to see how the timing of the election affects the Government Digital Service (GDS), which sits at a crossroads in its evolution, after being heavily criticised in March by the National Audit Office and told by the watchdog that it needs to redefine its role.
Nine months into his reign, GDS chief Kevin Cunnington has completely overhauled his leadership team. Insiders suggest the changes at the top have not fully filtered down to the troops, but the start of a new parliamentary cycle offers an opportunity to drive forward afresh.
Cunnington is redefining the scope of GDS’s responsibilities – in particular, much of the legacy of former CTO Liam Maxwell is slowly being dismantled.
The role of CTO itself appears to have been abandoned, with no apparent prospect of a replacement for Maxwell’s successor, Andy Beale, who left GDS earlier this year.
The Common Technology Services (CTS) team – set up to roll out better technology for civil servants and to advise departments on ending their large outsourcing deals – has been mothballed, according to several sources, now that its director, Iain Patterson, has left. CTS is continuing certain existing projects, but not taking on any new work, say sources, who claim that Cunnington never saw CTS as part of GDS’s future.
It’s worth noting that as part of the £450m budget GDS was given in the 2015 spending review, the CTS programme was projected to deliver £1.1bn of the forecast £3.5bn savings to be achieved in return.
Computer Weekly asked the Cabinet Office to confirm the status of CTS, but they were unable to comment due to the “purdah” rules that prevent Whitehall discussing future government plans during an election period.
Meanwhile, Maxwell’s other big initiative, spend controls – responsible for a significant portion of the savings GDS claims to have made from government IT costs – has been watered down, handing greater control back to the departments the policy was intended to rein in.
The Tory manifesto gives only a few clues as to where GDS – and wider digital government plans – go next. Much is simply repeating the past.
“We will create a new presumption of digital government services by default”, it says – replacing the old and presumably identical presumption that’s been in place for the last five or six years.
“We will publish operational performance data of all public-facing services,” says the document – presumably that’s the Gov.uk Performance Dashboard that’s been around for some time.
There are references to open data, publishing more information online, and rationalising the use of personal data – which presumably tee up the imminent appointment of a chief data officer (CDO), for which recruitment has been underway for a few months.
Notably, sources suggest the new CDO will also be positioned outside of GDS.
A few eyebrows were raised to see a specific commitment in the manifesto to Gov.uk Verify, the sometimes controversial online identity assurance scheme. Plenty of outside observers have questioned progress on Verify, some calling for a formal review, but minister Gummer clearly remains a strong supporter.
Perhaps the most interesting manifesto line, in the context of digital government, is the most vague: “We will incubate more digital services within government and introduce digital transformation fellowships, so that hundreds of leaders from the world of tech can come into government to help deliver better public services.”
The use of “incubate” is interesting – past manifestos might have used the word “develop”. Does this suggest a desire to push more development out to suppliers?
What exactly is a “digital transformation fellowship”? Given that so many IT contractors have stepped back from government IT work after the April reforms of IR35 tax laws, is this simply a way to bring them back? Is it some new status of employment that allows Whitehall to pay private sector market rates for IT professionals to get over the limits imposed by civil service pay structures?
Cunnington has been leading a review of digital jobs, skills and pay structures, so perhaps this is one of the fruits of that work.
We can conclude with some confidence that work on Verify and other common platforms will press ahead after the election, but we will have to wait to see whether there are further changes in the structure and delivery of digital government.
In all the debate about the NHS ransomware attack, much has been made of a government decision in 2015 to end a contract with Microsoft to provide support for the ageing Windows XP operating system that was widely in use across the NHS at the time.
Continued use of XP has been highlighted as one of the factors that enabled the ransomware attack – although the bigger issue is the lack of discipline in patching newer versions of Windows, which allowed the attack to target PCs that had not applied a fix for a known flaw, even though that fix had been available for two months.
The XP support deal has even become a political issue, with Labour criticising the Conservatives for “cancelling” support for XP. The truth is very different, and sheds light on the deep organisational and structural issues within NHS IT that made a cyber attack on this scale inevitable. It also raises questions about how the prevailing political ideology directing the NHS contributed to the situation.
Computer Weekly has talked to several people directly involved with the decision not to renew the original 2014 support deal with Microsoft – they have asked to remain anonymous – but they provide insights into why the NHS was uniquely vulnerable to this attack.
A purely commercial agreement
The £5.5m XP support contract with Microsoft, signed in 2014, was trumpeted by the Crown Commercial Service (CCS) and the Government Digital Service (GDS) as a helping hand for public sector organisations that had yet to migrate off XP – the end of support had been flagged for years, and Microsoft had long encouraged users to upgrade to newer versions of Windows.
However, the contract was purely commercial – a volume pricing agreement. It added no XP support capabilities beyond those that individual government bodies already had. CCS simply negotiated a volume discount, taking advantage of the large number of XP support contracts already in existence to reduce the overall cost to the government IT estate.
GDS used this opportunity to put pressure on laggards to upgrade XP, saying effectively they had one year left to do so. GDS, however, had no mandate or ability to force any organisations to upgrade.
A year later, CCS proposed a renewal of the deal, but this was turned down by a group called the Technology Leaders Network (TLN), which was set up by GDS for tech chiefs across Whitehall to collaborate and, where appropriate, make collective decisions on IT policy.
What’s important is this: the TLN did not cancel support for Windows XP. It decided to end the volume pricing deal, leaving any organisation still using XP free to continue paying for support if it chose to do so. This was clearly communicated to affected departments.
The tech leaders felt the volume pricing deal was acting as a “comfort blanket” for laggards who would prefer – for their own local reasons – not to have to worry about upgrading from XP. There was never a central decision to end support for XP – any such decisions were left entirely to local decision-makers.
Relations between GDS and Microsoft at the time were also not good. Microsoft was reeling from GDS decisions around open standards that threatened the supplier’s dominance of government IT. GDS, in turn, felt Microsoft was behaving badly, unnecessarily playing hardball in its commercial relationship.
The extended support deal already had fees set to double every six months after April 2014 until April 2016, when those charges would have been renegotiated.
The contract agreed by CCS in 2014 was purely about saving money – not about extending support for XP beyond what was already in place. Its cancellation was not about ending support for XP, purely about putting responsibility for the decision to pay for XP support back on those people who still used the system.
Every one of the tech chiefs agreed to the decision to end the contract. Each took responsibility for ensuring any XP users in their departments were fully aware of the implications.
Furthermore, GCHQ had advised the TLN that the XP support deal was practically worthless in terms of protecting XP users from IT security vulnerabilities. While the contract covered the availability of critical patches for XP, GCHQ said there were so many vulnerabilities in the ageing software that even those critical patches would never be enough to protect users.
GCHQ was well aware that XP was, and would remain, an insecure and vulnerable system whether there was a support deal in place or not.
IT governance in the NHS
Crucially, however, while the Department of Health (DoH) was represented in the TLN, the NHS was not. GDS had no governance role over IT in the NHS. The DoH tech chief told the meeting he could not take a decision on behalf of the NHS – although clearly he could communicate the decision.
The NHS, meanwhile, was still grappling with the reforms introduced by the 2012 Health and Social Care Act, which controversially separated decision-making powers in the NHS, and removed legal responsibility for healthcare from the secretary of state for the first time. NHS organisations were effectively federated, with greater local control over budgets and decision-making, delivering services “commissioned” by GP-led Clinical Commissioning Groups.
As a result, there was no longer any central organisation with responsibility for IT in NHS trusts. The Health and Social Care Information Centre (HSCIC) – now NHS Digital – is responsible for certain central issues, such as data standards, managing the run-down of contracts from the failed National Programme for IT, and driving digital transformation. But HSCIC has no responsibility to set technical standards for IT across the NHS in the way that GDS does across Whitehall.
GDS was worried enough about this situation that it met with then DoH minister George Freeman, to emphasise the need for a central body to set technical standards across the NHS, with the authority to ensure trusts and other organisations followed best practice, and with the transparency to highlight those who chose not to.
One source claimed that secretary of state for health Jeremy Hunt was also briefed on the security risks that a lack of IT standards would create in such a heavily federated NHS organisation, but it was never a priority at that level. “Hunt never grasped the problem,” said the source.
As a result, accountability for IT standards – including security – varies widely in the NHS. Not all trusts have a single person with responsibility for IT on their board. There is no way to know whether trusts include information security on their risk registers unless they choose to publish them.
As Computer Weekly has reported elsewhere, there were further warnings about the security risks to the NHS, including from national data guardian Fiona Caldicott, and from CareCERT, the NHS Digital organisation that now co-ordinates IT security activity across the health service.
But ultimately, decisions and priorities are set locally by managers in each NHS organisation. As we now know, there were plenty who failed to recognise the cyber security risks they faced, and only now has the inevitable end result been made painfully apparent.
Computer Weekly has been banging the drum for full-fibre broadband for several years. We’ve been critical of BT for its reluctance to embrace fibre to the premises (FTTP) as the future of the UK’s digital infrastructure.
BT would often tell us that FTTP is not needed. That the demand simply isn’t there. Other technologies based on existing copper cabling are more than adequate for years to come. Rival providers like Sky and TalkTalk are simply playing PR games when they call for BT to be forced to build a fully fibred broadband Britain.
We also supported calls for Openreach – BT’s local infrastructure arm – to be separated from the mothership to encourage competition and create a regulatory environment focused on building FTTP. But BT told us that Openreach simply wouldn’t be able to make the necessary investment without BT Retail as its core in-house partner to drive demand. Those pesky rivals couldn’t be relied upon.
So we’ll try not to be churlish this week, and instead congratulate Openreach, newly separated from its parent company, on announcing a consultation with the industry on rolling out full-fibre broadband.
This is the new regulatory environment and the newly structured Openreach working the way everyone said they should. It’s BT bowing, finally, to the inevitable and starting on the road towards FTTP for all. It’s several years too late, but hey, better now than later.
Of course, it’s taken a change of rhetoric from the government, which now promotes the idea of full-fibre after years of supporting the BT line that fibre to the cabinet (FTTC) is all that matters. And it’s taken the market to send out strong signals that everyone else sees FTTP as the way forward – witness BT rival Gigaclear securing £111m in equity funding earlier this month to support its fibre roll-out that includes many of the rural areas BT deemed uneconomic even for FTTC.
We wouldn’t want to suggest, either, that BT’s plunging share price, the £528m accounting scandal at its Italian subsidiary, a £300m restructuring, 4,000 job losses, the £42m fine from Ofcom over Ethernet installations, or the associated £300m in compensation payments, had anything to do with the need for some positive news.
No – the important thing here is simply that we’re getting there, at last. If the UK digital economy wants to be competitive as we leave the EU, we need world-class, world-leading infrastructure as a key incentive to attract companies to invest in this country. For all our sakes, as well as theirs, let’s hope BT gets us there in time.
For a long time, people working in IT laboured under the perception that nobody outside IT was interested in what they do because it’s all too technical – a bit boring and geeky. It’s been impossible to have a conversation about tackling the long-term skills shortages in IT without someone mentioning that the profession has an image problem.
The digital revolution seems to be changing attitudes, however. According to this year’s technology skills survey by recruitment firm Mortimer Spinks and Computer Weekly – one of the largest of its kind in the UK – 76% of non-IT/digital workers would consider a career in IT.
We should rejoice on any number of levels about this – but it raises a very important question: what’s stopping them?
If we’ve succeeded in convincing the technically unconverted that IT is an interesting, rewarding, motivating – and fun – place to work, we need to do more to get them over the line and into a job.
It’s expected that by 2020 the UK will have 800,000 fewer IT professionals than it needs – and that’s before you even consider the potential impact of Brexit given that 18% of the 1.6 million digital and tech workers in this country come from overseas.
For many years there has been a roll-call of initiatives to try to encourage more people into IT – from school computer clubs to women in tech programmes, and from changes to academic curricula to closer ties between universities and industry. None of them has bridged the skills gap – and it’s a terrible indictment of IT employers that the proportion of women in IT has actually fallen over the last 10 years, to a low of 16%.
The lack of tech skills in the UK is the biggest challenge faced by our digital economy as we seek to become a global technology leader post-Brexit. If we don’t have the people, we won’t have the growth.
It’s often been said that people outside IT think you need coding skills or technical expertise – our survey suggests even those attitudes have changed. Only about a third of non-technical workers think a key requirement for a tech role is coding ability, just 26% think candidates must have a tech-related degree, while only 33% think candidates have to be good at maths. They’re right – creativity, communication skills, and other soft skills are just as important.
The biggest problem of the last 10 years has been the steep decline in training budgets. Employers looking to fill IT skills gaps must invest in cross-training for those 76% of eager non-tech workers – we need a concerted effort to convince them to play their part in our digital future.
In the run-up to the last UK general election in 2015, the Labour Party’s then shadow employment minister Stephen Timms pointed out that the target completion date for the Universal Credit welfare reform programme had “slipped four years in four years”.
In July last year, the secretary of state for work and pensions, Damian Green, moved the completion date back to 2022 – five years later than the original 2017 target set at project launch in 2011. That makes about seven timescale slippages in all.
So perhaps it’s not surprising that the Department for Work and Pensions (DWP) is still cautious when talking about future deadlines for the controversial benefits scheme – as shown by a recent freedom of information (FOI) request.
Independent IT programme manager and FOI campaigner John Slater has been a dogged thorn in the side of DWP for over five years, pushing the department through the courts to reveal unpublished documents in an effort to bring greater transparency to one of the highest-profile IT failures of the Coalition government.
Currently, Universal Credit seems to be going well – at least, compared to its troubled early stages. The “full service” version – formerly referred to as the “digital service” – is at last being rolled out country-wide. The previous version – the remnants of the system that was “reset” in 2013 at a cost of £130m – handled only the simplest of claims, whereas the full service covers the entire complexity of the scheme to replace six existing working-age benefits with a single payment.
Full-service roll-out is due to be completed by September 2018 – meaning that all new benefit claims will be handled through Universal Credit. A bigger challenge lies ahead – migrating about seven million claimants on the existing benefit schemes onto Universal Credit. The UK government has never attempted such a large-scale data migration – perhaps no government anywhere has.
Slater’s latest FOI request asked DWP for further information on the planned completion date – at the time of his request, this was still set to be 2021.
DWP, however, claims that it no longer works with deadlines or targets, citing its use of agile development as the justification.
“The Universal Credit Programme deploys ‘agile’ techniques to ensure the system develops incrementally and this is how it is managed through its governance route. We work in short phases and, as explained before, ‘target dates’ are not features of agile programme management and are not how we run Universal Credit. We articulate the scale and structure of our delivery plans for Universal Credit in terms of phases of roll-out, to specific jobcentres and local authority areas,” said the DWP response to Slater’s FOI request.
Slater points out that this is perhaps stretching the definition of “agile” somewhat.
“The DWP is hiding behind this argument that agile means you don’t have a plan and this isn’t true,” he told Computer Weekly.
“At the programme level there should be some kind of high-level plan that sets expectations of when things need to be completed. Where agile has been applied to programmes rather than projects there is still a map/programme portfolio/goals/plan or whatever people want to call it that covers each of the projects or work-streams (depending on how the programme is structured) and when it needs to be completed.”
Given that the secretary of state has already told Parliament that Universal Credit has a 2022 target completion date, you can have some sympathy with Slater when he adds: “The response seems to confirm to me that the DWP is making it up as it goes along and doesn’t have any kind of credible plan showing how long it will take.”
DWP acknowledged to Slater that the 2021 target has been mentioned in documents supplied to the Universal Credit Programme Board, but stated the date has “yet to be confirmed”. It said:
“In line with agile methodology, the sooner the activity, the more detail there is. These activity streams are called:
“Governance and project management, which gives details of reviews and assessments that take place to review progress. This activity stream refers to a 2021 closure date, which is yet to be confirmed.
“Transformation and planning, which looks at the interfaces and frameworks that need to be in place for Universal Credit to roll out. This looks at migration and refers to ESA/tax Credit claimant migration completed by 2021.
“UC product development, which describes the digital features Universal Credit will make use of. There is a reference to decommissioning legacy IT in 2021, which is yet to be confirmed.
“We have not yet started to plan any activity around project closure or legacy decommissioning; nor have we started any significant planning for the ESA/tax credit stage of migration, which, as you may know, is now planned to complete in 2022.”
MPs have repeatedly criticised DWP for a “veil of secrecy” and lack of transparency over Universal Credit, and Slater’s experience suggests the department continues to take a highly cautious approach to what it reveals about project development and timescales.
Amazingly, given the programme has been going since 2011, the full business case for Universal Credit has still not been submitted or signed off by the Treasury – that’s due to take place in September this year.
At that time, perhaps DWP will finally reveal more detail about how it will avoid further delays during a three-year migration period that will present significant risks to Universal Credit roll-out.
Like it or not, necessary or not, we have another general election – and while few people will decide their vote based on a party’s digital policies, the imminent poll raises important questions that affect the tech community.
It’s unlikely we will see many digital surprises in the Conservative manifesto. Only this year, we’ve seen publication of the industrial strategy – which outlines policy on the IT industry, digital skills and telecoms infrastructure – and the transformation strategy, which covers digital issues within government. They will most likely continue as the core of Tory plans.
In 2015, Labour was the most vocal party on technology and had the most thought-through digital strategy. Under then policy chief Jon Cruddas and shadow digital minister Chi Onwurah, they reached out to the tech community and ran an extensive consultation process to produce some comprehensive proposals.
Sadly, the upheavals in the Labour Party since have put digital on the back burner. Onwurah has been somewhat sidelined, although she remains the spokesperson for industrial strategy. Louise Haigh has proved a capable and knowledgeable shadow to digital economy minister Matt Hancock. But Labour’s digital interventions have seemed mostly reactive, with little overarching strategy beyond an acknowledgement that digital is a critical economic issue.
The Liberal Democrats were equally keen to talk digital in 2015, but since their spokesman at the time, Julian Huppert, lost his seat, the party has barely had the resources to extend its reach into tech.
In 2015, Computer Weekly published a detailed guide to the digital manifestos of the main political parties.
The new context of Brexit will be a factor, of course – steering a path that allows the success story of the UK digital economy to continue, when much of the growth has been down to our membership of the European Union.
But Theresa May has effectively signalled that the details of Brexit are not on the table for this election – regardless of how many people in the UK tech community might like to see the vote as an opportunity to rethink our departure from the EU.
The timing of the election could be good for the Government Digital Service (GDS). As it stands, GDS has three years to prove itself and deliver the 2020 targets in the transformation strategy. A Tory victory in June will probably secure GDS’s future for the duration of the next parliamentary cycle – but it won’t ease the scrutiny, as digital government will be expected to deliver significant and measurable improvements for citizens.
We await the parties’ digital plans with interest.