It’s clear today that the financial services sector has been protected from technological disruption as a result of the 2008 economic crash.
While industries such as retail and media have been thoroughly disrupted by internet, cloud and mobile technologies, banking has seen relatively little of the changes in consumer behaviour that brought the likes of HMV or Blockbuster to their knees.
A widespread conservatism, as bankers kept their heads down to avoid their fair share of blame for the recession, has combined with an increase in regulation to effectively entrench legacy IT systems in many major financial institutions.
In other sectors, the commoditisation of IT has broken down barriers to entry, increased customer choice, and encouraged disruptive new businesses to take on the inertia-bound incumbents. In finance, it’s been almost impossible to do the same.
But things are changing, and the same process of disruption will inevitably shake the financial sector.
The collapse of the Co-op’s plan to buy 632 Lloyds bank branches shows how difficult it remains to increase competition and open up the sector to new entrants. But the government is trying to foster such changes, by forcing easier access to global payments networks, for example.
There was much debate this week at Computer Weekly’s CW500 in the City event for financial services IT leaders, about the relationship between regulation and innovation. There was some agreement that fear of regulation acts as an inhibitor to innovation in the sector, although other delegates felt that it was too easy to use compliance as an excuse not to innovate.
Undoubtedly, for most banks their hugely complex and costly legacy IT estate acts as more of a barrier to innovation than any regulatory authority does. After all, it was “innovation” in collateralising risk that led to the crash in the first place. Few finance firms have taken the bull by the horns and overhauled their back-office systems – Nationwide has a £1bn IT transformation programme in place, but there aren’t many others taking such a big step.
But now we are starting to see glimpses of how the future might be different.
The banks have desperately tried to prevent the emergence of peer-to-peer mobile payment networks, but increasingly the likes of Google, PayPal, and network providers such as Telefonica are offering new ways to move money that bypass the banks.
Germany has seen the world’s first social media bank – Fidor is a start-up operation that uses Facebook as its primary means of customer acquisition and support. It even offers a savings account where the interest rate is influenced by the number of Facebook “likes” it attracts.
For a young, social media generation looking for their first bank, that’s going to be an appealing alternative – and becomes especially disruptive if they stay loyal as they get older.
Customer behaviour is changing faster than most established banks can cope – offering a mobile banking app is one thing, using that as your primary means of customer communications is quite another. Eventually, that customer behaviour will be as much of a driver for change as government or regulators – and technology will be the means by which those customers force that change.
Like it or not, financial services will be disrupted every bit as much as other commercial sectors, with new entrants shaking up the established players as technology fragments the market. The big banks are blind if they cannot see it coming; they will of course resist for as long as they can, but like every other industry they will over the next 10 years be forced to embrace IT innovation or see their empires decline.
One of the criticisms made of the late prime minister Margaret Thatcher during the recent coverage of her life and death was the way she abandoned communities that had once relied on industries such as mining and manufacturing.
There’s a case to be made that she was right to withdraw state support for loss-making companies, but the failure of that policy was the lack of investment in re-training affected workers who were left on the dole with skills that were no longer needed.
Sadly, this has become something of a recurring theme in businesses themselves in the last 20 years, and particularly for IT professionals – and it’s likely to come to a head in the next few years.
During any economic downturn, one of the first items to be struck from the corporate cost base is training. In too many IT departments that has gone a stage further – employers unwilling to constantly update IT skills to keep up with the pace of technology change have instead sourced newer skills externally, often overseas.
This has all contributed to the continuing skills shortage today. It’s not a shortage of jobs – there are plenty of IT jobs out there – but CIOs say their biggest recruitment challenge is finding people with the relevant skills they need.
Thousands of IT professionals lost their jobs at the height of the crash, only to find that the skills they had honed on legacy IT in their former employers were no longer wanted elsewhere.
There’s a huge rush to recruit in-house software developers at the moment, from large firms to tech start-ups. One entrepreneur has called for the relaxation of UK immigration laws to attract more highly skilled foreign developers. Many out-of-work UK IT workers would take offence at such a move.
As Computer Weekly documents every day, we are in the midst of great change brought about by technology – some would call it the digital revolution. That means a growing need for modern IT skills, in areas such as cloud, web development, mobile, social media and more. The skills that have traditionally been the mainstay of the IT department – systems admin, operations, product-focused technical support – will decline as the legacy technologies they support wither.
And if all this follows the same predictable path, with the same attitudes to training and skills that have hampered UK IT in the past 20 years, we have a very big problem.
It is essential that IT leaders do not allow training and skills development to become the forgotten investment in the digital revolution.
Anyone who has read this blog over the past few months will have seen there are several recurring themes I’ve been banging on about lately. It’s been interesting to see a few things happening recently that, to me, reinforce some of the points I’ve been discussing, so here’s a bit of a random round-up:
UK gives Belgium €300m to develop the cloud
Google has announced a €300m investment in its cloud datacentre in Belgium – that’s in addition to the initial €250m spent building the site, which opened in 2010.
When was the last time you read about a similar investment in cloud in the UK by one of the internet giants? Erm, never?
According to Datacenter Dynamics, one of the reasons Google chose the site was its welcoming government: “The local authorities have a strong vision for how the internet can bring economic benefits and jobs to the area,” Google said.
That’s money the UK is giving away. It would not be difficult to put in place a forward-looking strategy to support the internet and technology industry in the UK. All those big US internet firms like us – we speak a version of the same language, after all. The UK is such an obvious base for developing cloud datacentres, bridging the US and Europe in much the same way as we do in our role as one of America’s aircraft carriers.
And when the inevitable rise of Asian cloud firms happens in the future, they would look to the European location with the most proven infrastructure, and that would be here.
Except it won’t be, because there is no incentive or support for bringing such business to the UK. The government here gets excited instead when Google spends £1m on a new office.
When Google tells us we have a problem with our education system, it seems to make the government jump. But despite the close links between 10 Downing Street and Google, we have failed to gain any meaningful inward investment in the UK as a digital country. We are missing such a huge opportunity and the government should be embarrassed to see that money going elsewhere.
Falling off a cliff
Both Gartner and IDC have reported the biggest decline in PC sales they have ever seen. Gartner said global PC shipments went below 80 million units in the first quarter of 2013 for the first time since mid-2009.
Many firms in different sectors continue to view the changing environment brought about by the digital revolution as a gradual and manageable decline in their established markets. But the disruption and fragmentation caused by the internet and new technologies will not work like that – many markets will suddenly fall off a cliff, and companies constrained by inertia and management sclerosis will go over with them.
Look at retail – HMV, Blockbuster, Jessops. The IT industry itself is seeing the same thing happening – Dell, HP, BlackBerry, Nokia. You can argue about the definition of a cliff, but year-on-year declines of 11% (Gartner) or 14% (IDC) in PC sales are at the very least a sign that the lemmings have started to jump.
Business leaders that fail to change, and ignore the inevitability of the digital revolution in their market, will have no excuses when their revenues reach the cliff edge too.
Microsoft is not going away
Gartner produced some separate research recently that looked at the overall market for end-user devices as a whole – PCs, smartphones and tablets. The headline news was the rise of Android devices, which Gartner predicts will be more than double the number of Windows (desktop plus smartphone/tablet) devices by 2017.
Much of the reporting around this research highlighted the demise of Microsoft – The Guardian labelled it “a slide into irrelevance”.
But even the Gartner graphic used to illustrate that argument in The Guardian told a different story.
To me, this shows that for sure Microsoft is no longer going to enjoy the dominance it has been accustomed to for the last 25 years. But irrelevance? I don’t think so.
Look at that green line, which shows Windows device shipments growing from a little over 300 million in 2012 to just below 600 million in 2017. That’s still not a bad growth rate, even if it masks the fact that the product mix will change from profitable desktop Windows to the likely lower income of smartphone or tablet Windows.
Android’s predicted growth is remarkable but predictable – it’s open source software; Google created it as such for the very reason that it would disrupt the market and fragment the ecosystems built around Microsoft and Apple.
But while device manufacturers and app developers may gain sales as part of the Android ecosystem, Microsoft and Apple are the firms for which that upwards chart represents the biggest actual revenue growth.
It may be a different Microsoft in 2017, one that no longer dominates the consumer nor corporate IT sectors in the way it used to, but it’s still going to be a force to be reckoned with.
While we marvel and despair at the pace of changes being wrought by the technology industry, there’s a body of opinion that everything we are seeing now was entirely predictable.
Economist Carlota Perez says we are living through the fifth technological revolution to occur since the Industrial Revolution, and that each follows a predictable pattern. For example, “Each time there are two different periods with a bubble and recession between them,” she says.
Sound familiar? First the dot com boom and bust, and now the seemingly endless age of austerity?
Researcher Simon Wardley writes about the move from a time of “peace” to one of “war” – when technology shakes up stable markets, disrupts the business of inertia-bound incumbents, and opens up opportunities for innovative new entrants who become the future giants of the next era.
You can see it in the IT industry – the slow-to-react giants like HP, Dell, Nokia and BlackBerry, while the likes of Google and Amazon rise rapidly. Look in particular at the battle for control of Dell – founder Michael Dell trying to take the company private, while rival investors seek a higher price and a different future, all the while distracting the supplier from the real task of adapting to the changes that brought about the battle in the first place.
Go back even further and you find evidence that the transformational shift represented by cloud computing was predicted as long ago as 1966, by a Canadian technologist, Douglas Parkhill, in his book The Challenge of the Computer Utility.
You see it too in the increasingly rabid reporting around IT security – the idea of cyber warfare with China, and doom-laden predictions of the internet being brought down by hackers abound in national news today, only proving the old adage that truth is the first casualty of “war”.
Wardley in particular is scathing about the likes of Dell, whose travails were, he says, inevitable because they missed the entirely predictable changes in their market.
For IT leaders, this predictability should be a strategic tool.
You can see the same patterns of change happening in every major business sector. In retail, HMV, Jessops, Blockbuster and others stood still while new entrants like Netflix and iTunes revolutionised their market. In media, national newspaper groups face a financial challenge as blogs and online content destroy their old business models.
If you’re in an industry undergoing such radical change, which side of the inertia barrier are you on? Are you fighting to protect the peace, or embracing the battle of reshaping your business with IT?
Perhaps you’re in an industry so far protected from such change – financial services springs to mind. Perez claims the 2008 crash actually delayed revolutionary change in the sector, but that the change is nonetheless inevitable. Are you ready for the technology-led change you are about to experience?
All IT leaders need to understand which side of the inertia divide their organisation occupies, and get on the right side before it’s too late. Your future is both predictable and inevitable.
Once again, the Chancellor of the Exchequer’s red briefcase remained firmly in the analogue world.
George Osborne made his first tentative steps into the modern world by joining Twitter on the day of the 2013 Budget (well, some unfortunate Treasury press officer did at least, and then had to read all the abuse their boss received).
But you can’t imagine George is planning to retire the briefcase and walk out of Number 11 Downing Street carrying a red iPad anytime soon.
There can’t be many sectors that contribute 12% of GDP yet still get so little consideration in the UK’s economic planning – not least as a growing industry with enormous opportunity to establish Britain as world leader.
There was some encouragement for start-ups – start-ups generally, that is, not specifically tech – which will help the burgeoning UK scene. But other than that, the briefcase contained policies for banking, manufacturing and construction, those important but resolutely 20th century industries.
Surely it can’t be difficult to join a few simple dots for those in the Treasury team.
We have record youth unemployment in the UK, and a huge shortfall of young entrants in the IT profession – not to mention the need to fill tens of thousands of new roles in the coming years. Would it have been so hard to offer incentives for training and recruitment into IT jobs?
According to research, nearly 15% of commercial properties in the UK are empty. Meanwhile, one of the fastest growth areas in the technology world is cloud computing, a US-dominated area where it takes data protection laws to persuade the main suppliers to set up cloud datacentres in Europe.
Might it be a good idea all round to find ways to use suitable empty commercial properties to establish datacentres or businesses to support the cloud ecosystem? Or of course, you could just let UK companies give all their cloud IT budgets to US suppliers.
It can’t be difficult for some economist to prove the net-positive effect of such policies – more people in work, paying more taxes, buying more goods; more landlords receiving rentals, corporate IT budgets being reinvested in UK companies, a growing source of tax revenue from the UK being a hub for cloud providers.
Labour leader Ed Miliband labelled Osborne as a “#downgradedchancellor” after his Twitter debut. What more do we have to do to convince George to upgrade his UK IT budget?
In the long history of bad IT projects, one of the common characteristics has been the willingness – perhaps even determination – of IT teams to see things through to fruition even when it’s obvious that things have gone wrong.
This has been highlighted numerous times in various official reports about some of the high-profile government IT projects that have gone off the rails – even now, it looks like the increasingly troubled Universal Credit programme is going the same way.
Elaborate quality assurance processes have been put in place to flag up failing projects at an early stage – but in many cases civil servants have ploughed on regardless.
It’s not exclusive to Whitehall though – there are plenty of private sector projects that have made the same blinkered mistakes.
So it’s refreshing – even if on a small scale – to see retailer John Lewis talking publicly about its intent to “fail fast” at technology innovation.
The department store canned a “virtual mirror” initiative that had been launched with great fanfare last year, because it was making no money. “The key is, when you find it’s a dud, stop,” said one of the John Lewis IT executives.
It’s a principle that all IT leaders would do well to adopt.
“Fail fast” is a phrase that has come out of the online and start-up worlds, where the emphasis is on prototyping and piloting new initiatives, which inevitably makes it easier to stop. When thousands of pounds have been sunk into a major development programme, there is clearly more at stake for project leaders.
But if “fail fast” has an opposite, it’s “pride comes before a fall”. It would take a culture change in many organisations for a fail-fast strategy to be accepted, and not simply seen as failure. Risk-averse British business culture can be a hindrance.
But failing fast doesn’t just encourage honesty and prevent problem projects. It enables a culture of innovation – think how much more confident employees would feel about suggesting new ideas if they knew there would be no stigma attached to their subsequent failure.
“Right first time, every time” was the mantra of the quality brigade, and it’s a nice objective to have – but it doesn’t always reflect reality. Fail fast is good for IT departments, and good for innovation.
The vultures are circling over the IT behind the government’s Universal Credit (UC) programme.
Computer Weekly has catalogued the gradual drip-feed of concerns and rumours around the highest profile IT project in Whitehall at the moment.
Staff have been removed from the project at the highest level; costs are running over; deliverables appear to be thin on the ground. And now a row is brewing inside the House of Commons as Labour and Coalition politicians make claim and counter-claim – the latest being that the key IT contractors have been told to stop work.
We hear lots of other speculation that, if true, would show a programme in chaos – suppliers that have simply not delivered; ultimatums and threats of legal action; despair from those in Whitehall IT who said all along that the approach taken was doomed.
Much of that speculation may be simple gossip – but the fact that people are engaged in such negative whispers shows the dark cloud under which UC is being developed.
You can understand why the government is reluctant to admit problems. Universal Credit is a flagship Tory policy that could make or break their General Election hopes in 2015, by which time it is due to be live. Don’t be surprised if political expediency sees those deadlines put back to 2016 as polling day gets closer.
But the IT politics is even more interesting. UC is the project that, so we hear, minister Iain Duncan Smith personally insisted should remain outside the IT reforms being introduced from the Cabinet Office.
While nobody in Whitehall wants another IT disaster, you can bet the IT reformers would not be too disheartened if UC followed that all too familiar route.
And then there is the matter of supplier relations – some of the biggest names from the “oligopoly” of systems integrators that have dominated government IT are involved. These are the companies the reformers want to see booted out; the ones who still believe Whitehall can’t do IT without them.
If those suppliers, individually or collectively, fail on UC, it would be a crushing blow to their future sales in government. They will not go down without a fight. But there are powerful forces who would love to see them go down nonetheless.
Don’t underestimate just how much is at stake with Universal Credit – the outcome could shape the future of Whitehall IT.
The European Union wants to impose a cap on bankers’ bonuses. Regardless of what you think about the reasonableness of legislators trying to regulate private sector salaries – I’d say it’s a daft idea, no matter how much banks need to change the bonus culture – it is another sign of the shifting sands for the financial services industry.
The sector is slowly, inexorably, shifting its centre of gravity eastwards. London is never going to disappear as a global finance hub, but it’s inevitable that the City’s influence will decline over the next 10, 20 maybe even 30 years.
That’s going to have a huge impact on the UK economy, and no matter how much successive governments will try to protect the finance sector, it is difficult to see anything but a gradual reduction in its contribution to GDP. Thanks to technology, banks can put their physical headquarters wherever the regulatory and tax environment is most favourable.
So this government, and the next, and the one after that, will all face a difficult decision. The sector that has been the powerhouse of the UK economy for the 25 years since Maggie Thatcher’s “Big Bang” deregulation will need to be supplanted by something new.
Now I know I’m biased, but surely any sensible person taking a long-term view of which industry is likely to generate the biggest growth over the next 20 years can’t look much further than technology.
What’s more, the UK is well placed – a great (but still under-developed) skills base, a culture of innovation and creativity, a history of invention, and the most digitally active population in Europe.
The government plans to publish an “information economy” strategy in May – a rather belated stab at what it describes as “Digital technologies and information combining to drive productivity and create new growth opportunities across the whole economy.”
Glad they noticed.
But more than ever, it is clear the UK needs a committed, long-term strategy to place technology at the heart of our economy; one that transcends party politics and promises to put the country at the centre of what will be the most important sector in world business.
The industrial revolution was the turning point for the UK’s global economic influence. Let’s not waste the opportunity for the digital revolution to have the same impact.
Cyber security has made its ultimate mainstream breakthrough. This week, a relatively minor hack targeted at Apple not only made the BBC 10 O’clock News, but warranted a lengthy studio discussion between presenter Sophie Raworth and a BBC security correspondent.
Attacks of varying sophistication and impact are becoming a near daily occurrence – and they are only the ones we hear about. Meanwhile, David Cameron hopes to export the UK’s supposed cyber security expertise, offering the government’s support to Indian officials during a trade visit, while China stands accused once more of extensive military-backed cyber espionage.
So the European Commission’s proposed directives on cyber security and data protection are nothing if not timely. But they are proving controversial.
As businesses come to understand the full implications, there is likely to be quite an uproar over plans to effectively police and regulate the data security of almost any firm that offers some form of electronic or online service, from social media to retailers to banks.
US internet firms are lobbying to water down the proposals, and European companies are pushing back on plans for mandatory data breach reporting. There will be other concerns yet, and it seems certain that before long we will be bracketing IT security providers alongside lawyers as the guaranteed beneficiaries.
But the problem ultimately remains one of the IT industry’s own making.
Security is still an afterthought too often. Few, if any, of the software and hardware systems that are used on a regular basis were designed with security in mind from the start. Security is still a feature to be added in, rather than a fundamental element of product strategy, architecture, design and development.
The need for regulation is in itself an indictment of the failure of technology providers. But much of the proposed rules will serve only to allow those suppliers to sell more products and make more money, instead of changing their behaviour.
The users of technology will bear the brunt and cost of compliance with the EU’s directives, not the providers of the technology upon whom they rely. This, surely, risks missing an opportunity to force change on a negligent IT industry.
Working in a sector like IT, we’ve all become used to the idea of constant change. We’re so familiar with coping with the pace of innovation that sometimes it’s easy to miss the scale of what is happening around us.
Make no mistake, we’re seeing change at a pace and scale that even for technology is almost unprecedented.
This week, one of the world’s biggest and most successful IT companies basically admitted that its strategy is wrong and desperate measures are needed to secure its long-term survival. Michael Dell placed a $24bn bet that he can turn the company he founded into a modern software, services and cloud provider.
Taking the firm private is a huge gamble. It’s also a historic milestone in the decline of the PC sector that has sustained so much of the IT industry for 30 years.
As I’ve written in this blog before, we’re entering a period where change is no longer a gradual, inexorable downward curve – but instead it’s a cliff. Dell saw the cliff approaching too late and is trying to perform a high-speed handbrake turn.
BlackBerry, the original icon of the smartphone and mobile working world, has also placed what could be its final bet. After far too many years of development, its new BlackBerry 10 operating system and associated handsets are the last throw of the dice. If it fails, the company fails – simple as that, and as unprecedented as that.
Closer to home, we’ve seen one of the UK’s biggest and best known resellers, 2e2, collapse into sudden administration.
Ours is an industry that has always been obsessed with the “next big thing” – so take a look at the current big things: cloud, mobile, social and big data. As we know those topics today, they were barely on the edges of many IT leaders’ radar even three years ago. Only a select few forward-thinkers saw the dramatic rise and the scale of change those trends would initiate.
We are in what one expert has called a state of war – established companies fighting for their position as they see their cosy status quo disappear, and new entrants charge the battlefield.
In the midst of such change, it’s hard sometimes to find the time to step back and get some perspective. But for IT leaders it’s going to be increasingly important to understand the broader context of the current landscape, and to plot a successful route.
A lot of people in IT will be placing a lot of bets in the next year or two, and a lot of companies will see their future determined as a result.