Once again, the Chancellor of the Exchequer’s red briefcase remained firmly in the analogue world.
George Osborne took his first tentative steps into the modern world by joining Twitter on the day of the 2013 Budget (well, some unfortunate Treasury press officer did at least, and then had to read all the abuse their boss received).
But you can’t imagine George is planning to retire the briefcase and walk out of Number 11 Downing Street carrying a red iPad anytime soon.
There can’t be many sectors that contribute 12% of GDP yet still get so little consideration in the UK’s economic planning – not least as a growing industry with enormous opportunity to establish Britain as world leader.
There was some encouragement for start-ups – start-ups generally, that is, not specifically tech – which will help the burgeoning UK scene. But other than that, the briefcase contained policies for banking, manufacturing and construction, those important but resolutely 20th century industries.
Surely it can’t be difficult for the Treasury team to join a few simple dots.
We have record youth unemployment in the UK, and a huge shortfall of young entrants in the IT profession – not to mention the need to fill tens of thousands of new roles in the coming years. Would it have been so hard to offer incentives for training and recruitment into IT jobs?
According to research, nearly 15% of commercial properties in the UK are empty. Meanwhile, one of the fastest growth areas in the technology world is cloud computing, a US-dominated area where the main suppliers have to be convinced by data protection laws to set up cloud datacentres in Europe.
Might it be a good idea all round to find ways to use suitable empty commercial properties to establish datacentres or businesses to support the cloud ecosystem? Or of course, you could just let UK companies give all their cloud IT budgets to US suppliers.
It can’t be difficult for some economist to prove the net-positive effect of such policies – more people in work, paying more taxes, buying more goods; more landlords receiving rentals, corporate IT budgets being reinvested in UK companies, a growing source of tax revenue from the UK being a hub for cloud providers.
Labour leader Ed Miliband labelled Osborne as a “#downgradedchancellor” after his Twitter debut. What more do we have to do to convince George to upgrade his UK IT budget?
In the long history of bad IT projects, one of the common characteristics has been the willingness – perhaps even determination – of IT teams to see things through to fruition even when it’s obvious that things have gone wrong.
This has been highlighted numerous times in various official reports about some of the high-profile government IT projects that have gone off the rails – even now, it looks like the increasingly troubled Universal Credit programme is going the same way.
Elaborate quality assurance processes have been put in place to flag up failing projects at an early stage – but in many cases civil servants have ploughed on regardless.
It’s not exclusive to Whitehall though – there are plenty of private sector projects that have made the same blinkered mistakes.
So it’s refreshing – even if on a small scale – to see retailer John Lewis talking publicly about its intent to “fail fast” at technology innovation.
The department store canned a “virtual mirror” initiative that had been launched with great fanfare last year, because it was making no money. “The key is, when you find it’s a dud, stop,” said one of the John Lewis IT executives.
It’s a principle that all IT leaders would do well to adopt.
“Fail fast” is a phrase that has come out of the online and start-up worlds, where the emphasis is on prototyping and piloting new initiatives, which inevitably makes it easier to stop. When thousands of pounds have been sunk into a major development programme, there is clearly more at stake for project leaders.
But if “fail fast” has an opposite, it’s “pride comes before a fall”. It would take a culture change in many organisations for a fail-fast strategy to be accepted, and not simply seen as failure. Risk-averse British business culture can be a hindrance.
But failing fast doesn’t just encourage honesty and prevent problem projects. It enables a culture of innovation – think how much more confident employees would feel about suggesting new ideas if they knew there would be no stigma attached to their subsequent failure.
“Right first time, every time” was the mantra of the quality brigade, and it’s a nice objective to have – but it doesn’t always reflect reality. Fail fast is good for IT departments, and good for innovation.
The vultures are circling over the IT behind the government’s Universal Credit (UC) programme.
Computer Weekly has catalogued the gradual drip-feed of concerns and rumours around the highest profile IT project in Whitehall at the moment.
Staff have been removed from the project at the highest level; costs are running over; deliverables appear to be thin on the ground. And now a row is brewing inside the House of Commons as Labour and Coalition politicians make claim and counter-claim – the latest being that the key IT contractors have been told to stop work.
We hear lots of other speculation that, if true, would show a programme in chaos – suppliers that have simply not delivered; ultimatums and threats of legal action; despair from those in Whitehall IT who said all along that the approach taken was doomed.
Much of that speculation may be simple gossip – but the fact that people are engaged in such negative whispers shows the dark cloud under which UC is being developed.
You can understand why the government is reluctant to admit problems. Universal Credit is a flagship Tory policy that could make or break their General Election hopes in 2015, by which time it is due to be live. Don’t be surprised if political expediency sees those deadlines put back to 2016 as polling day gets closer.
But the IT politics is even more interesting. UC is the project that, so we hear, minister Iain Duncan Smith personally insisted should remain outside the IT reforms being introduced from the Cabinet Office.
While nobody in Whitehall wants another IT disaster, you can bet the IT reformers would not be too disheartened if UC followed that all too familiar route.
And then there is the matter of supplier relations – some of the biggest names, from the “oligopoly” of system integrators who have dominated government IT, are involved. These are the companies the reformers want to see booted out; the ones who still believe Whitehall can’t do IT without them.
If those suppliers, individually or collectively, fail on UC, it would be a crushing blow to their future sales in government. They will not go down without a fight. But there are powerful forces who would love to see them go down nonetheless.
Don’t underestimate just how much is at stake with Universal Credit – the outcome could shape the future of Whitehall IT.
The European Union wants to impose a cap on bankers’ bonuses. Regardless of what you think about the reasonableness of legislators trying to regulate private sector pay – I’d say it’s a daft idea, no matter how much banks need to change the bonus culture – it is another sign of the shifting sands for the financial services industry.
The sector is slowly, inexorably, shifting its centre of gravity eastwards. London is never going to disappear as a global finance hub, but it’s inevitable that the City’s influence will decline over the next 10, 20 maybe even 30 years.
That’s going to have a huge impact on the UK economy, and no matter how much successive governments will try to protect the finance sector, it is difficult to see anything but a gradual reduction in its contribution to GDP. Thanks to technology, banks can put their physical headquarters wherever the regulatory and tax environment is most favourable.
So this government, and the next, and the one after that, will all face a difficult decision. The sector that was the powerhouse of the UK economy for the 25 years since Maggie Thatcher’s “Big Bang” deregulation will need to be supplanted by something new.
Now I know I’m biased, but surely any sensible person taking a long-term view of which industry is likely to generate the biggest growth over the next 20 years can’t look much further than technology.
What’s more, the UK is well placed – a great (but still under-developed) skills base, a culture of innovation and creativity, a history of invention, and the most digitally active population in Europe.
The government plans to publish an “information economy” strategy in May – a rather belated stab at what it describes as “Digital technologies and information combining to drive productivity and create new growth opportunities across the whole economy.”
Glad they noticed.
But more than ever, it is clear the UK needs a committed, long-term strategy to place technology at the heart of our economy; one that transcends party politics and promises to put the country at the centre of what will be the most important sector in world business.
The industrial revolution was the turning point for the UK’s global economic influence. Let’s not waste the opportunity for the digital revolution to have the same impact.
Cyber security has made its ultimate mainstream breakthrough. This week, a relatively minor hack targeted at Apple not only made the BBC 10 O’clock News, but warranted a lengthy studio discussion between presenter Sophie Raworth and a BBC security correspondent.
Attacks of varying sophistication and impact are becoming a near daily occurrence – and they are only the ones we hear about. Meanwhile, David Cameron hopes to export the UK’s supposed cyber security expertise, offering the government’s support to Indian officials during a trade visit, while China stands accused once more of extensive military-backed cyber espionage.
So the European Commission’s proposed directives on cyber security and data protection are nothing if not timely. But they are proving controversial.
As businesses come to understand the full implications, there is likely to be quite some uproar over plans to effectively police and regulate the data security of almost any firm that offers some form of electronic or online service, from social media to retailers to banks.
US internet firms are lobbying to water down the proposals, and European companies are pushing back on plans for mandatory data breach reporting. There will be other concerns yet, and it seems certain that before long we will be bracketing IT security providers alongside lawyers as the guaranteed beneficiaries of any new regulation.
But the problem ultimately remains one of the IT industry’s own making.
Security is still an afterthought too often. Few, if any, of the software and hardware systems that are used on a regular basis were designed with security in mind from the start. Security is still a feature to be added in, rather than a fundamental element of product strategy, architecture, design and development.
The need for regulation is in itself an indictment of the failure of technology providers. But much of the proposed rules will serve only to allow those suppliers to sell more products and make more money, instead of changing their behaviour.
The users of technology will bear the brunt and cost of compliance with the EU’s directives, not the providers of the technology upon whom they rely. This, surely, risks missing an opportunity to force change on a negligent IT industry.
We’ve all become used to the idea of constant change working in a sector like IT. We’re so familiar with coping with the pace of innovation, that sometimes it’s easy to miss the scale of what is happening around us.
Make no mistake, we’re seeing change at a pace and scale that even for technology is almost unprecedented.
This week, one of the world’s biggest and most successful IT companies basically admitted that its strategy is wrong and desperate measures are needed to secure its long-term survival. Michael Dell placed a $24bn bet that he can turn the company he founded into a modern software, services and cloud provider.
Taking the firm private is a huge gamble. It’s also a historic milestone in the decline of the PC sector that has sustained so much of the IT industry for 30 years.
As I’ve written in this blog before, we’re entering a period where change is no longer a gradual, inexorable downward curve – but instead it’s a cliff. Dell saw the cliff approaching too late and is trying to perform a high-speed handbrake turn.
BlackBerry, the original icon of the smartphone and mobile working world, has also placed what could be its final bet. After far too many years of development, its new BlackBerry 10 operating system and associated handsets are the last throw of the dice. If it fails, the company fails – simple as that, and as unprecedented as that.
Closer to home, we’ve seen one of the UK’s biggest and best known resellers, 2e2, collapse into sudden administration.
This is an industry that has always been obsessed with the “next big thing” – so take a look at the current big things: cloud, mobile, social, big data. As we know those topics today, they were barely on the edges of many IT leaders’ radar even three years ago. Only a select few forward-thinkers saw the dramatic rise and the scale of change those trends would initiate.
We are in what one expert has called a state of war – established companies fighting for their position as they see their cosy status quo disappear, and new entrants charge the battlefield.
In the midst of such change, it’s hard sometimes to find the time to step back and get some perspective. But for IT leaders it’s going to be increasingly important to understand the broader context of the current landscape, and to plot a successful route.
A lot of people in IT will be placing a lot of bets in the next year or two, and a lot of companies will see their future determined as a result.
Great news: it’s a good time to be a software engineer again – you no longer have to worry about your job going to India, and IT leaders are queuing up to employ you.
For much of the “noughties”, it was a widely acknowledged truth that all software development was going to India and the Far East. Swathes of corporate application development teams were laid off and big outsourcing contracts put in place, as the global system integrators convinced customers of their “transformational” delivery skills.
Numerous expensive non-transformations later, things have changed.
According to Sam Gordon, a director at recruitment firm La Fosse Associates speaking at the first CW500 Club event of the year this week, there is a huge and growing demand from CIOs to bring software development back in-house. Gordon quoted one un-named CIO as saying he would rather have “10 rock-star software engineers than 100 programmers in India.”
It’s important to note this isn’t a backlash against India – it’s a realisation that in a digital world, when dealing with technology-enabled customers and IT-led business change, software is your salesperson.
Increasingly, your customers are dealing with you not through well-trained human beings, but through software – whether websites, mobile apps, in-store kiosks, ATMs, whatever. Software is your shop front. It’s also the key to internal efficiency and productivity.
The value that organisations deliver is increasingly embodied in the software they use. And that means you need the value-creators in your organisation, not in a supplier bound by little more than a project plan and a service-level agreement.
“Agile” is an over-used word, but it’s the essence of the interface between software and competitive edge or customer service.
Walk into any online company – Asos.com, or LateRooms.com for example – and you’ll see software engineers sitting alongside business decision-makers. They’re not talking code, they’re talking business, and user needs.
Software engineers themselves have changed. Walk around the Government Digital Service – the closest thing the public sector has to a software start-up – and you won’t see the old programmer stereotype of bearded blokes with open-toed sandals and few social skills. These are bright, creative, thought-provoking, idea-generating business people who happen to be able to push out some kick-ass Ruby code.
Attitudes are changing elsewhere too.
This week, there were two fantastic news stories for the future of UK IT. First, the Department for Education announced that computer science will be one of the subjects for the new EBacc baccalaureate qualification in schools. Second, the number of students applying for computer science courses at university grew for the first time in years – what’s more, it was one of the highest increases in popularity of any degree course.
That’s the future. At last, kids are finally seeing software as something they want to learn.
Another of the IT leader guests at CW500 came up with a great phrase – “agile annihilation” – to describe the prospects for organisations that don’t have the flexibility to rapidly respond through software to what’s happening in their market.
If you’re a software engineer with the skills – it’s going to be a good few years for you. If you’re an IT leader without the in-house software skills you need – well, it’s time to review your recruitment plans. Agile annihilation awaits the rest.
Oh, Microsoft. I’m starting to feel sorry for you.
As we all get old, there usually comes a moment when you know you’re out of touch. I remember my dad asking me once if I had been out with any “hot chicks” lately. The look I gave him told him that was his moment.
As a journalist, it will come on the day I say confidently that some new gadget will never be a success, only to see everyone and their dog carrying one six months later. (I have a horrible feeling that Google Glass might be just that for me – people choosing to walk around in tech specs? Really? Now watch it fly…)
Microsoft is 38 this year. And clearly, that’s not yet old enough to have its moment.
I wrote in an earlier blog post that the question of whether Microsoft will “get it” will be pivotal in the technology world this year. Sadly, the evidence is only piling up that Redmond still lives in another world, one that probably last existed around 2006 when Bill Gates first stepped back from his old day job.
I say “sadly” because I have always been a supporter of Microsoft. I grew up with its PCs. I never agreed with the criticisms of Windows as a platform – I knew it, it did everything I wanted both at home and at work, and using Office was second nature. Even when it became clear Vista was a dog, I stuck with it because it was familiar and comfortable. My home laptop is still a Vista machine – even though I use it a lot less these days because my Google Nexus 10 tablet is so convenient and enjoyable.
But now, I constantly come across things that make me shake my head in sorrow for Microsoft’s future. Here’s some examples.
Every year, Microsoft invites a variety of different interest groups to Seattle for an update on its plans – a bunch of security experts one month, top academics the next, then CIOs, and so forth. Talking to one or two people that have attended recent events, the phrase “Microsoft just doesn’t get it” is their repeated anthem.
One guest at Redmond highlighted the moment when this large crowd of highly influential people arrived in a conference room, and promptly pulled out their iPads, Android tablets or MacBook Airs to take notes. There was barely a Windows device among them – this group of people chosen by Microsoft as their key influencers.
Last year, Microsoft raised its prices twice – by 15% on products such as Exchange, Lync and Windows Server; and volume licensing increased by up to 25% to “create price consistency across Europe”.
And that’s not to mention the shameful attempt to remove discounts for some charities.
Now, the introductory pricing for Windows 8 upgrades is being withdrawn, increasing the price for anyone wanting to move to the flagship new operating system, just as the PC market starts to see one of its biggest ever declines.
What sensible organisation increases its prices when it sees its market in decline and flocking to competitors?
There’s also a great story on The Register citing sources that say Microsoft blames its PC maker partners for the lacklustre sales of Windows 8 over Christmas. The story says that manufacturers told Microsoft they were unwilling to develop highly specified Windows 8 tablets that would cost too much and fail to sell.
Microsoft’s partnerships with PC makers have been the bedrock of its success. Those distribution networks are the very reason Windows and Office became so ubiquitous.
But now, for perhaps the first time, even those ultra-loyal PC firms, whose own success once depended on Microsoft’s relentless product update cycle, are no longer playing ball.
We have a choice, and we’re not choosing Microsoft. The world has changed. Even some CIOs are now publicly saying they see a future where they won’t have any Windows products in their organisation. Until very recently, that wasn’t even an option.
At this point in the narrative of this post, I would usually put in the counter argument, and point out the reasons why Microsoft can and will turn things around. But it’s getting harder and harder to see what the company is actually doing to that end.
The Surface tablet? I tried a Surface RT and hated it – confusing, non-intuitive and endlessly frustrating with its stupid dual interface. The Surface Pro tablet, due out next month, the product that Microsoft hopes will bridge the tablet and corporate IT worlds? Not at a reputed $1000 each it won’t, no matter how good it is.
Windows Phone? At best vying with BlackBerry for distant third in the smartphone market – just look at Nokia’s sales, which remain light years behind Apple and Android.
Microsoft is suffering from a classic mistake of talking only to its friends. It surrounds itself with people who think the company and its products are great. Dissenters are ignored, because the friends (and CEO Steve Ballmer) are convinced the critics don’t know what they are talking about. Nobody there wants to listen to bad news, yet the room full of friends seems to have more space to move about than it used to.
But we still need Microsoft. Pricing and licensing aside, it still understands the corporate IT market, even if its hold on end-user devices is weakening. I really hope Microsoft “gets it” soon. There is no pleasure to be taken from watching the world’s most successful software company getting old and out of touch, revelling in its own “dad dancing” moves even as the dance floor empties in embarrassment.
I should know – you should see my dancing…
The issue of electronic patient records in the NHS is back in the headlines after health secretary Jeremy Hunt called for the health service to be “paperless” by 2018.
After £6bn or so of taxpayers’ money failed to introduce a nationwide electronic records scheme in the ill-fated National Programme for IT, it’s inevitably a subject of much debate and cynicism – justifiably so, in the circumstances.
But putting aside the history, the concerns about NHS IT capability, and about privacy and data protection, the over-riding consideration must surely be this: the NHS patently needs nationwide electronic records, accessible online by patients, if it is to function as any sort of modern health provider.
The issue cannot be whether or not we have online patient records, but how we avoid the IT disasters of the past and overcome the inevitable privacy fears in doing so.
It cannot be a sensible debate if it starts with, “Don’t do it, the NHS will never get the IT right”, or, “The privacy issues of nationwide digital records are too great” – both of which I have seen suggested.
Can anyone seriously claim that while we live more of our lives online, sharing our personal information, even managing our finances on the web, somehow our medical records should not be a part of that world?
It would be ludicrous to think we could have an NHS without online patient records.
For a start, the vast majority of us already have digital medical records – they sit in our GPs’ systems, lonely and isolated, unreadable to anyone outside our own GP practice. Remember too that we have a legal right to look at our medical records at any time, whether paper-based or electronic.
Some might say that is enough, but my personal experience is that it cannot be.
Someone very close to me has a long-term, complex medical condition. Her GP maintains the only copy of her entire medical history – but she has no idea how accurate it is. Updates from the hospital at which she is treated are sent to her GP by letter, to be typed in by a secretary at the GP practice. The hospital, meanwhile, keeps its own entirely separate digital record of the medical history she has accumulated under its care.
Parts of the NHS refuse to treat her because they don’t have full access to her medical history to understand the medication she takes and the treatments she has had. For example, she has struggled to register with a dentist, because her condition affects her bones and she needs a monthly infusion to strengthen them – to many dentists, unfamiliar with her history, that’s too much of a risk.
I hope it never happens, but if she ever has to go to A&E at any hospital other than the one at which she is regularly treated, doctors there would have no idea of her complex history – and that lack of awareness could be life threatening.
If any suitably qualified NHS practitioner had access to her full medical records, with all the necessary security and privacy controls, it would allow the health service to deliver a significantly improved service and better care.
Of course, neither she nor I want her medical records to be hacked, or accessed by anyone we wouldn’t want to see them – any more than I’d be happy for someone to hack my online bank account.
And I don’t want untold billions of pounds to be wasted again on big, bespoke IT systems that don’t work. But there is no reason why the technology is not capable of delivering a secure, connected solution.
Tell me one reason why we shouldn’t keep on trying until we get it right. I just hope this is the time the NHS finally does.
In the hopefully unlikely event that your company executives still need convincing that the internet is going to transform your business, the past few days have provided further evidence of the accelerating changes brought about by consumers moving to the web.
Only last week, I highlighted, “Which businesses are ready to thrive in the internet era?” as one of the big IT questions for 2013. We’re already finding out some of those that are not.
Camera retailer Jessops is the latest high-street retailer to suffer, following Comet at the end of last year – both household names that failed to adapt to the digital world.
In the latest round of supermarket financial results this week, Morrisons was the big loser, with its lack of an online presence or a multi-channel strategy cited as a primary cause of its sales decline. Meanwhile, Tesco and Sainsbury’s both highlighted impressive online growth as key factors in their performance.
In a generally depressed retail environment before Christmas, John Lewis stood out, with 44% year-on-year growth in web sales making a major contribution to 13% growth in December revenue.
John Lewis is so bullish about its digital plans that it has even coined the phrase “omni-channel” strategy – which does rather make it sound like Malcolm Tucker from TV political satire The Thick of It has taken over – but it shows the extent of the department store’s ambition.
Plenty of experts now predict that this online shift is not going to be the evolutionary process that many firms hope, but instead more of a cliff. Consumer behaviour is changing so rapidly, that one day you may just find your business goes straight down – in which case, there can be no more excuses for delaying the IT investments needed.
Consumer behaviour is the key consideration. It’s not that the internet is changing businesses, it’s changing the way your customers think and shop and buy. And when consumers are prevented by a particular company from thinking and shopping and buying the digital way they want to, they just go elsewhere. Bang – goodbye.
The digital King Canutes who cling desperately to their old pre-digital business models cannot turn back the web and mobile tide, however much they believe they can.
Take the obvious example of the music industry. Here, they resorted to copyright and intellectual property law, and to the rhetoric of turning customers into criminals, to prevent their former, lucrative models from disappearing.
Music firms that claim they are simply stopping illegal piracy miss the point. The reason consumers opted for illegal downloads is because they were stopped from getting reasonably priced, easily available legal downloads. It wasn’t about rampant illegality, it was about consumer behaviour changing and the music industry wanting to prevent that change.
If you’re doing anything similar, you’re on a downhill slope to the cliff. Stop now.