Whichever way Scotland votes on its independence, one outcome is assured: the need for one of the fastest and most ambitious IT projects the public sector has ever seen.
If Scotland leaves the UK, it will need to set up IT systems to replace every one of the functions it currently shares with Whitehall – in particular tax collection and welfare payments. Computer Weekly research suggested the cost could be in excess of £1bn.
Even if Scotland votes to stay, the offer of much greater devolution is likely to mean that many of those systems will still need to be developed from scratch. The time it’s taking to develop a new welfare system for the UK – Universal Credit – shows that it’s no easy option to try to adapt current systems for a subset of the population who may be taxed or paid benefits using different rules and rates to those outside Scotland.
While Scotland has its own IT for existing devolved functions – such as the NHS, education, police and justice – it’s a huge undertaking to put in place the additional government services from scratch. And that’s without even considering timescales – Scottish independence would start in March 2016, just 18 months after the vote.
Such a project would be unprecedented, but there are examples to turn to. Estonia is widely lauded as the most digital government in the world – thanks to the fact it had to start from nothing after gaining independence from the former Soviet Union. But even Estonia – a country with a quarter of Scotland’s population – took far longer than 18 months to build its systems.
Scotland may yet be forced to use existing Whitehall systems on an outsourced basis – but that would seriously hinder its ability to make the sort of political and policy changes that the Yes campaigners have promised.
Technology, of course, is not one of the issues that will decide which way Scotland votes. But it will be a critical issue in delivering the future government that Scotland will gain either way.
We all had a bit of a giggle, didn’t we, when the Daily Mail tried to explain “the cloud” to its readers.
In a story about the apparent hacking of Apple’s iCloud service that led to the publication of various celebrities’ naked selfies, the Mail wrote:
“The moment you snap a photo with an iPhone, for instance, a copy is uploaded – not to an actual cloud – but to a bank of gigantic humming and whirring computers in vast warehouses thousands of miles away in California or North Carolina”.
The BBC’s News at Ten also struggled to describe the cloud in terms that would make it clear to the everyday viewer – although it didn’t stoop quite so low as to suggest viewers might start looking up to the skies for pictures of Jennifer Lawrence.
It’s easy to be smug – but the fault here lies not with the Daily Mail or the BBC, but with the technology industry.
This is yet another perfect example of the continuing desire of the IT sector to wrap itself in jargon and buzzwords. Once upon a time, IT relied on its own obscure language to justify its existence. Never heard of virtualisation? Don’t worry, Mr CEO, the IT guys understand it, so you had better give them a pay rise and a decent budget.
That has never been an acceptable situation. The single biggest complaint from business executives about their IT counterparts has been the inability to speak the same language. Jargon has been a major contributor to the gap between “the business” and the IT department that has existed since the first business computer.
That gap has protected IT for too long, but now it’s a threat not a salvation. IT no longer controls the drawbridge between jargon and understanding.
Technology belongs to everyone thanks to consumerisation. If the IT industry cannot talk in the language of its customers, those customers will simply look elsewhere. Ironically Apple has done the most to bridge the gap, making its products so easy to use that it threw away the user manual – that perpetual source of confusion and misunderstanding.
But it’s no good IT simply expecting that its users need to do the thinking – why should they know what we mean by a cloud? More than ever, IT needs to talk the language of its users. Please, let’s finally scrap the jargon and use words that people understand.
I’ve just completed jury duty – my first real-life experience of the English judicial process. It’s a fascinating thing to do – I would recommend it to anyone. But it was also an eye opener for the role of technology as evidence in a case.
This is based on just one trial in which I was involved – albeit a complex and disturbing one – and it’s always risky to generalise, but if the trial was symptomatic of elsewhere in the criminal justice system, it’s clear there is a lot to learn, and to gain, about how technology can help.
As a juror I’m restricted in what I can discuss about the case – we 12 were unable to reach a verdict and the Crown Prosecution Service now has to decide whether or not to re-try the defendant.
But suffice to say it was a difficult case, involving acts of incest and paedophilia. As is so often the case with sexual assault charges, it can boil down to one person’s word against another’s.
Big advances have been made in the use of technology for evidence in court cases. Specialist police officers are trained in recovering information from mobile devices and other computers. The use of mobile location tracking and CCTV has been key to many cases.
But if my experience is anything to go by, the justice system is still missing an opportunity. With ever more widespread use of mobile devices, apps, social media and so on, the evidential trail is widening.
As a juror I can’t go into details of this particular case, but I would say that better use of technology-based evidence might have made a significant difference to the trial.
We all leave an extensive digital trail these days – recognised by the government in its controversial attempts to force through legislation to retain as much of our technological footprint as possible.
But greater knowledge and awareness about technology needs to be spread through the police and judiciary, and a greater focus on how digital fingerprints can be just as important as physical ones.
If, as I suspect, the trial I watched was just one example of a much wider issue, there is a huge opportunity to improve the English justice system – and cut its costs – if technology is used more wisely and effectively in preparing evidence to help a jury make its difficult decision of guilty or not guilty.
It may be an unfashionable thing to do this week, but I’d like to stand up for government IT.
The catalyst for this was understandable – the Home Office being forced to stump up some £224m to Raytheon after an arbitration tribunal found the government was wrong in 2010 to cancel the supplier’s £750m contract to provide the e-Borders system.
Cue the usual – and deserved – hand-wringing about government IT cock-ups.
But where the resultant criticism has got it wrong is in complaining about the same old faults that have been the cause of pretty much every past IT disaster you can think of – the “weakness for big IT projects” and “rigid specifications” cited by the FT; the “big bang” projects and lack of civil service skills referenced by The Guardian.
But it is only right and fair to point out that not only does the current government know and acknowledge these as problems, it has spent the last few years trying to do something about them.
Big IT projects? The Cabinet Office “red lines” process is in place to block any project with an external spend of more than £100m unless it has a special justification.
Rigid specifications? You can’t get within earshot of a senior government IT leader without hearing the word “agile” in every other sentence.
Big bang projects? There is now a new process in place for “agile” business cases, which means IT projects no longer need to assume everything is needed up front, nor needs to go live at the same time.
Civil service skills? This remains a challenge – purely because the public sector is competing for scarce digital skills in a market full of shortages – but perhaps the most important achievement of the Government Digital Service (GDS) has been to establish that the civil service needs its own in-house digital and technology skills. Those skills must never be outsourced again.
Perhaps the most relevant proof of these changes relates to e-Borders itself. As Computer Weekly revealed earlier this year, the project that was meant to replace e-Borders was rejected by the Cabinet Office because it breached those new rules – the Home Office wanted another mega-outsourcing deal with a big systems integrator, and it did not conform to GDS principles for agile projects.
As a result, the Home Office was sent back to the drawing board to find a new, digital approach, which is now underway.
Of course, all of those fixes described above are a work in progress and have a way to go to become embedded in government IT from top to bottom. And all it takes is a big failure in one or two smaller, agile but high-profile projects to raise fresh questions.
But until or unless that happens, we should give praise where it’s due, take encouragement from the changes underway, and as taxpayers hope they are getting it right at last.
Coverage of Ofcom’s latest annual report on the state of communications in the UK shows that much of the so-called establishment in the country still looks at technology with wide-eyed wonder: “Ooh, pretty; clever; shiny!”
National newspaper headlines proclaiming that children understand more about technology than their parents were probably written by the same people who, as children themselves, were showing their parents how to program the VCR 20 years ago. It was ever thus.
At a government level, politicians are only starting to realise they can look good by being associated with tech – hence George Osborne turning up to launch a new fintech programme last week, and David Cameron ever desperate to associate himself with tech startups (even if much of his promised cash doesn’t actually get spent).
Meanwhile, as the establishment gazes in rapture, the rest of us are just getting on with being digital. We’re on the web more than ever, using mobiles, connecting to Wi-Fi, upgrading to superfast broadband, watching Netflix, and quite naturally integrating all this great technology into our everyday life and work.
There can no longer be any question of whether people are embracing technology – it is self-evident they are, and that given new tools and faster speeds, we will all continue to do so. There is no need to debate if or when we will need 5G mobile and 100Mbps (or more) broadband – it is obvious that we will, and sooner rather than later.
The government also quietly announced this week a consultation on a national communications infrastructure strategy, which is pondering just these sorts of questions and what public support will be needed to make it happen. It’s an absolutely critical process, but it needs to be much higher profile and more ambitious.
The key infrastructure questions for the UK in 2030 or 2040 are not to do with airports, roads or railways – they are to do with communications and technology. It’s no good looking at what we need in 2020 because we need to be looking much further ahead than that – not guessing what technologies will have emerged in 20 years’ time, but enabling an environment that allows them to emerge and be rapidly adopted.
One of the biggest risks to the UK’s digital economy is that the digital masses are way ahead of the establishment policymakers. Today’s 4G and broadband “notspots” are tomorrow’s economic problems.
Without prompt, ambitious, long-term digital thinking at the highest levels of commerce and politics, there is a real danger that by 2020 our national infrastructure will become even more a source of technical frustration than it already is.
While children are off enjoying their summer holidays, the last thing they want to hear is about next term. But in a few weeks, the school year they return to in September will be one of the most important for the future of IT in the UK.
The much-heralded new computing curriculum will be taught for the first time, with hopes that its greater relevance to the modern digital world will lead to more school leavers choosing a career in IT over the next 10 years.
Meanwhile, the shortage of skills that the new school course is intended to tackle is only becoming more pressing.
The European Commission warned last month that 900,000 IT jobs need to be filled by 2020 across the European Union. The Labour-commissioned Digital Skills Taskforce, led by former Tomorrow’s World presenter Maggie Philbin, published its report last month too, citing research that found 745,000 more workers with digital skills will be needed by 2017 in the UK alone.
But Philbin’s report also highlighted the lack of funds available to make the new course work. Currently, the government has offered just £3.5m to support the curriculum – equating to just £175 per school. Philbin called for a further £20m to be invested by 2020. There is a clear risk that not enough teachers are equipped to deliver the computing curriculum successfully.
Our best hope is that the increasingly widespread acknowledgement of the UK needing more and better digital skills at all levels has enough momentum to be sustained, delivering real results.
Over the past 10 years, we have seen too many failed initiatives. There has rarely been any shortage of intent, but rarely enough funding, resources or government commitment.
The next five years are a one-off opportunity to develop the UK’s digital skills base and implement measures that will stick in the long term. If, by 2020, we have failed to embed technology skills into education, training, government policy and corporate life, it will be too late. The focal points of the global digital economy will have gone elsewhere.
Everyone in IT carries a responsibility to make this work, from encouraging your own children to study computing, to being a role model for the next generation. Shout louder about the great opportunities from working in technology, and ensure we can build the digital economy the UK sorely needs.
Back in February, when the UK government’s consultation on the use of open document formats closed, I described the decision to be made by the Cabinet Office as the acid test of its commitment to open standards.
Well, congratulations to all involved, for it is a test they have passed with flying colours.
The policy has been formally ratified, and ODF is now the standard format for sharing government documents – rejecting Microsoft’s impassioned proposals to add its preferred OOXML standard to the approved list. The move has been widely welcomed across the industry, and even as far as Brussels, with EU Commissioner for the Digital Agenda, Neelie Kroes, tweeting her approval.
The policy has been less well received in the Thames Valley headquarters of Microsoft UK. The software giant, not surprisingly, is unhappy, stating that it is “unproven and unclear how UK citizens will benefit” from the decision.
So one question now remains: how serious is Microsoft’s objection to the decision to choose ODF?
The Government Digital Service (GDS) is concerned that the issue may not be left to rest. In its latest GDS Business Plan for 2014/15, hidden away on page 43 of a 72-page document, it states one of the risks to its policy: “There is the potential for litigation on open standards”.
Is that risk highlighted out of some potentially misguided attitude towards Microsoft, on the basis that’s the sort of thing a big company might do to protect its market dominance?
Or are those fears based on comments – threats, even – made by Microsoft executives during discussions with GDS over the open standards policy?
We know that Microsoft has gone to great lengths over the past three years – since the Cabinet Office first announced its open standards policy – to lobby against decisions that might reduce Microsoft’s revenue from its biggest UK customer. The supplier has also petitioned Labour, encouraging the party’s Digital Government Review to avoid commitment to a single document standard (or at least, to avoid a single standard that is not its own).
Plenty of lobbying cash has been spent, unsuccessfully, to get to this point. Is there more left in the pot to fund litigation? We will have to wait and see, but Microsoft should resist any such urge. Accept the decision, move on, prove it can compete on a more level playing field.
GDS, meanwhile, faces its own challenges in making the policy stick across Whitehall. GDS director Mike Bracken acknowledged as much, writing in a blog that, “This is a big step for government, and things won’t change overnight.”
But it’s a positive step to government taking back control of its IT decision-making, and creating the open architecture and level playing field it has stated for so long that it wants to achieve. Round one to GDS.
Sometimes news stories coincide fortuitously in a way that highlights the significance of major trends transforming the IT landscape – and we have a big one this week.
Yesterday, Apple and IBM announced a partnership that will see the biggest name in corporate IT developing enterprise-focused tools to support the biggest name in consumer technology for its customers.
IBM will resell Apple products and open up its global partner network to the combined product set – giving Apple a corporate IT veneer it has previously lacked and rarely shown any interest in developing.
The very next day, Microsoft laid off 18,000 people as a result of its acquisition of Nokia.
The historical resonances stretch back over 30 years, to when IBM and Microsoft jointly created the IBM PC and desktop computing transformed the IT world as a result. The big loser from that deal? Apple – then an emerging player in personal computing that subsequently fell on hard times as Microsoft’s DOS/Windows software licensing model, hardware neutrality and partner ecosystem left the Mac as a niche product.
Thirty years later, Microsoft’s increasingly desperate attempts to protect its Windows Phone business led to buying Nokia, its only dedicated smartphone partner, at a time when Nokia’s smartphone sales were falling through the floor.
Apple is on the offensive – making its first tie-up with one of the traditional giants of corporate IT, helping IBM to offer mobile and cloud-based IT products and services based on the most popular smartphone and tablet technologies used in businesses today.
Microsoft meanwhile is struggling to protect its position in mobile, and under a new CEO starting to preach a “cloud-first, mobile-first” strategy that it hopes will reverse its declining influence on end-user computing.
Microsoft has even realised that it has no future as a Windows-centric supplier. Its cloud service is now just Azure, no longer Windows Azure. Office is available for the iPad – and being relentlessly and successfully promoted by Microsoft executives.
You could equally argue that IBM needs Apple a lot more than Microsoft needed Nokia. IBM has been late to the party on cloud, while Azure has established itself as the second biggest cloud service – although well behind market leader Amazon Web Services.
Both suppliers realise that users – both consumer and professional – will be using mobile devices connected to cloud services as the primary way to access personal and corporate applications.
IT leaders will have to assess which offering – Apple devices plus IBM cloud, or Microsoft cloud plus Nokia devices – is best suited to their corporate environment. Although of course, without the Windows lock-in that has kept so many IT managers tied to Microsoft for years, there’s no reason they can’t choose almost any other combination of Apple/Android devices with anybody else’s cloud. Who knows, they might even choose Windows Phone.
Old partnerships are being re-written. There will be some very big losers. The biggest winners will be the IT leaders who make the most of these dramatically shifting sands.
The government has finally been forced to admit that the business case for the troubled Universal Credit programme has still not been signed off – despite repeated assurances that approval was imminent.
The business case is required to confirm the spending needed to see the welfare reform programme through to completion. The Treasury will not sign off the project and release funding until all aspects have been checked and approved by the various Whitehall authorities – not least the Major Projects Authority (MPA) and the Government Digital Service (GDS) whose intervention early last year led to the project being reviewed and eventually “reset” in September 2013.
The admission came in a Public Accounts Committee (PAC) meeting yesterday when, pressed by PAC chair Margaret Hodge, the head of the civil service, Sir Bob Kerslake, said the business case had yet to be signed off.
Hodge had been pressing Kerslake and his fellow senior civil servants, asking them four times whether the approval had been given, and receiving classic “Sir Humphrey” answers in return, before Kerslake finally bit the bullet and said: “I think we should not beat about the bush. It has not been signed off.”
That statement came after much beating about the bush, for example:
Hodge: “Is it [Universal Credit] on track now?”
Sir Jeremy Heywood: “In its current form, I believe it is.”
Hodge: “What do you mean, ‘In its current form’?”
Heywood: “That is something we look at very carefully.”
Hodge: “I just want to get one answer to the question. Have you signed off the business case?”
Sir Nicholas Macpherson: “On universal credit? I think the Treasury have discussed this quite frequently. I believe that at each key milestone of the reset programme there is a Treasury decision to take.”
Hodge: “Have you signed it off?”
Macpherson: “It is signed up, up to a point; up to the point…Up to the milestones.”
Hodge: “I do want an answer. I want just a yes or no. Has it been signed off or not?”
Heywood: “I cannot speak [for] the Treasury.”
At this point, Kerslake folded and gave the answer Hodge wanted and most people have suspected for some time. He went on to add: “We have had a set of conditional assurances about progress and the Treasury has released money accordingly. That is one of the key controls they have.”
So in other words, money to fund Universal Credit is only being released in small chunks for clearly defined sets of work, until the Treasury and its advisers have the confidence that the Department for Work and Pensions (DWP) isn’t going to make the same cock-up it did before. And they don’t yet have that confidence.
The obfuscation around Universal Credit continues to amaze. DWP statements continue to confirm the project is on track – and who knows, it might well be. But nobody fully trusts those statements, given the department’s track record of denying problems in the project.
“It is clear that the department still has much to do to address all the concerns raised [about Universal Credit] and to ensure it delivers value for money,” the National Audit Office (NAO) said in its latest DWP financial report.
The second-hand whispers coming out of the DWP – and I stress these are from people who tell me they know people, rather than direct inside sources – say that the digital team under Kevin Cunnington are doing a good job in developing the “end-state” digital system that will eventually form the basis of the system that goes live by 2017 to support the full roll-out.
But that digital system has yet to be tested in anger – it is not due to be tried out on 100 real-life claimants until the autumn.
We know that the NAO is going to review Universal Credit again before next year’s general election, and until then we will no doubt continue on a combination of leaks, speculation, and yet more blind DWP optimism. That NAO review is undoubtedly going to determine what happens to the project after the election.
Labour has already said it will pause the programme if it wins the election. You can be sure the Tories would have a serious rethink too, especially if the NAO finds further problems. And an election victory for David Cameron would also give him a timely excuse to take the much criticised secretary of state, Iain Duncan Smith, away from the project (and the DWP) completely.
In the meantime, we are left picking through the civil-service speak and the obfuscations and the misdirections to find the truth. And let’s remember – this was the flagship reform programme for a government that declared it would be the most open and transparent in history.
Congratulations to the 25 most influential women in UK IT – our annual list was announced yesterday, and it contains some amazing, high-achieving individuals (who happen to be women).
The event at which we announced the list – attended this year by 120 of the most senior and successful women in UK IT, every one a role model in their own right – is a fantastic afternoon. It has an atmosphere and an energy entirely different to the male-dominated, men-in-suits events that are so commonplace in technology.
But with only 16% of the UK IT profession being female – a statistic that continues to shame the industry – it is all too difficult to get a more balanced and diverse set of delegates to most IT conferences.
After last year’s event, I wrote a post on this blog with 10 things that men need to do to help get more women into IT. There’s a growing realisation that no amount of women-in-IT networking events, female role models, or encouragement for girls at school to consider a career in IT will change the status quo.
The only way to make a change is if the men in IT make it happen – and that was the theme of this year’s event.
We heard from three male IT leaders – James Evans, director of IT strategy and enterprise architecture at BP; James Robbins, CIO at Northumbrian Water; and Kevin Gallagher, CIO of Channel 4 – each of whom has positively targeted a more diverse workforce in their IT teams. Every one of them talked about the benefits that more female representation has brought to their organisations – the diverse set of skills that add to and complement those of their male colleagues.
But those three are sadly rare exceptions. It is all too common to hear male IT leaders say, “Of course I understand why there should be more women in IT, but at the end of the day, does it really matter? I have a great team, does it matter that they are nearly all men?”
It’s an easy statement to make, and one that reflects the unconscious bias present to varying degrees in us all.
But it does matter.
We have a large and growing skills shortage that is the biggest single threat to the future success of the UK’s digital economy. We will not fill that gap with men alone – and nor should we want to.
Some of our speakers talked about the fact that women now influence the majority of consumer technology purchases – laptops, smartphones, tablets – yet remain vastly under-represented in the companies that design, manufacture and sell those products. Our event sponsor, Microsoft, talked about the efforts it is making to redress that balance internally.
The digital revolution is not gender specific – digital technology is changing the way all of us live and work. But digital cannot deliver its full potential if it is specified, designed, built, managed and supported by a professional group that reflects the characteristics of only half the population.
James Evans from BP said that one of the keys to the energy giant’s successful diversity policy was the commitment from the very top of the company.
That same principle applies to IT. To every CIO or IT leader who has responsibility for recruitment, talent management, skills, or staff promotions – the reason it matters is because you have to set the example.
It will be hard work – it will make recruitment harder, for example – but how will recruitment agencies change unless you tell them you want a quota of CVs from women?
It has to start from somewhere. Male IT leaders – it is time you took the lead. No more excuses.