Computer Weekly Editor's Blog


December 19, 2014  12:33 PM

Is BT buying EE a step towards selling off Openreach?

Bryan Glick
Broadband, BT, fibre, Networking, Ofcom, Openreach

BT is changing. Under CEO Gavin Patterson, the former monopoly telecoms giant has expanded into content through BT Sport, paying heavily for broadcast rights to English Premier League matches, and is now on the verge of splashing out £12.5bn on EE to get back into the mobile market it exited when it spun off O2 (then BT Cellnet) in 2001.

There has also been speculation that BT wants to merge its wholesale division with Openreach, the regulated subsidiary that manages its national telephone and broadband network. BT Wholesale sells access to some Openreach network services to other telcos and ISPs.

That would be a more complicated move, given the strict Ofcom rules under which Openreach operates. But it would also, in effect, give Openreach greater in-house sales capability, separate from the consumer- and business-facing mothership of BT Group.

BT continues to attract controversy and opprobrium in equal measure from rural broadband campaigners over its dominance of the government’s BDUK programme, which subsidises the roll-out of superfast broadband in areas BT does not consider commercially viable for fibre-to-the-cabinet (FTTC) services. Critics are equally keen to point out that BT’s copper network is an effective monopoly, is outdated, and will eventually have to be replaced by an all-fibre network as the hungry apps and browsers of internet and mobile users demand bandwidth-heavy services such as Netflix and the BBC iPlayer.

BT, of course, argues its case equally fervently. I discussed the arguments over BT and broadband last year – you can read the article here to save repetition – but concluded then that the problem is the lack of competition in the wholesale telecoms market, which can only be solved, in my opinion, by divesting Openreach.

I just have a sneaky suspicion that Gavin Patterson is moving in that direction.

His new BT is becoming more like a modern, integrated, internet-savvy communications provider – offering high-speed broadband, landline telephony (itself a diminishing market), 4G mobile, and online and broadcast content. That’s a model closer to Virgin Media or Sky than to a traditional telecoms infrastructure player. Do BT’s long-term shareholders really want to invest in a creaking, heavily regulated copper network that will inevitably require billions to be spent upgrading its core infrastructure? I suspect not.

I think Patterson can see the writing on the wall for Openreach – hence merging it with Wholesale gives it an opportunity to become a standalone company, similar to National Grid in the energy sector, which owns much of the UK’s electricity and gas network infrastructure. National Grid has been able to expand internationally as an energy infrastructure player, using its freedom as a publicly quoted company to buy similar businesses overseas, particularly in the US.

It’s a model that could offer a future for Openreach outside of BT.

FTTC-based “superfast” broadband is going to last the UK for a few years yet – as it turns out, roll-out is well ahead of consumer adoption, which lags behind several European countries – but by 2020 the cracks will start to show. 5G mobile does not yet exist, but it is equally inevitable that mobile networks will in future offer connection speeds far beyond what FTTC broadband currently provides. BT will not want to keep spending on a declining, heavily regulated asset in those circumstances – hence the purchase of EE.

A BT free of Openreach and its regulatory handcuffs becomes a very different proposition – but more importantly, so does an Openreach freed from BT; free also to expand internationally and invest sensibly and prudently in all-fibre networks with its own access to capital and debt. And without the parental relationship between BT and Openreach, perhaps other telcos large and small will be more enthusiastic about setting up wholesale network competitors in the UK.

BT will deny this of course – I can already imagine the emails from its press office, similar to the reaction to my article last year. But it feels to me like the new BT knows that, in the long run, it won’t need and doesn’t want the legacy of owning Openreach. And if Gavin Patterson does go down that route, then good for him – and good for the UK’s communications infrastructure.

December 16, 2014  6:08 PM

Introducing the Devereux-Hodge shambolicness scale for rating progress of Universal Credit

Bryan Glick
Agile, Digital skills, dwp, GDS, NAO, PAC, Universal Credit

If Universal Credit were made into a Hollywood rom-com, you just know that Margaret Hodge and Robert Devereux would be the central characters.

Thrust into repeated opposition on different sides of a heated Public Accounts Committee (PAC) table, the two star-crossed combatants would bicker and argue in ever-increasing circles of conflict. Then, in the final act, in a moment of outrageous serendipity – such as Universal Credit actually going fully live – they would realise their true feelings for each other.

You can decide which outcome – Universal Credit going live, or a future romance between PAC chair Margaret Hodge, MP and Department for Work and Pensions (DWP) permanent secretary Robert Devereux – is the more likely. (Remember this – nobody would have believed John Major and Edwina Currie…)

Last week the soap opera continued as Devereux was once more hauled in front of MPs by Hodge to discuss the latest National Audit Office (NAO) report into the troubled welfare reform programme. The latest act ended with Hodge insisting Universal Credit is a shambles, and then shutting down the meeting before Devereux had one last opportunity to deny the accusation. For the two pugilists, “shambles” is a binary measure – it either is (Hodge) or it absolutely isn’t (Devereux).

On Universal Credit at least, it’s increasingly clear that “shambolic” is in fact a sliding scale, with the programme veering in one direction or the other to different degrees of shambles. Let’s call it the Devereux-Hodge Scale of Shambolicness, named for the end-points at either extreme: 100% Devereux represents everything tickety-boo and working as expected; 100% Hodge represents a total shambles.

So, based on recent revelations from the NAO and the PAC hearing, where on the scale is the project right now?

Wasted spending or value for money?

The discussions last week in the PAC meeting between Devereux, the Treasury’s Sharon White, the NAO and Hodge veered into the arcane terminology of accounting and economic modelling. At one stage, Hodge highlighted that £697m has been spent on Universal Credit so far – that’s all costs, not just IT costs – and yet DWP said only £34m of that will make it onto the balance sheet as an asset as a result of the “twin-track” approach now being taken, and once the digital system currently being developed is live.

At face value, that certainly seems close to 100% Hodge on the shambolicness scale. Some reports claimed this means DWP could “write off” £663m, but that’s not the case. As White – soon to leave the Treasury to become the new CEO of Ofcom – explained, much of that £697m is normal operating expenditure which would never be recorded as an asset, and part of it is “Plan B” contingency spending so the existing Pathfinder system can stay in place if the digital system is delayed or doesn’t work.

We know for sure that £131m of assets will be written off by the time Universal Credit is fully live, and it is quite likely that the true figure will be much higher.

Much of that £697m has gone on things like staff costs – money which would have been spent anyway – and up-front design and planning work. The only way that expenditure will have been wasted is if Universal Credit is scrapped entirely. That’s unlikely, as Labour fully supports the policy, if not the implementation plan.

But Hodge is absolutely right to say that spending £697m and still not yet having a fully agreed business case is a shambolic way to spend money. In the unique language of the Civil Service, DWP has so far had its Strategic Outline Business Case approved; next year the Outline Business Case is due for approval; and then in 2016, the Business Case should finally be approved.

Devereux, on the other hand, maintains that spending £697m is chicken feed compared to the £7.7bn that the twin-track approach will save relative to the alternatives considered when the programme was “reset” last year. That £7.7bn figure is derived from those arcane economic models, based on more benefit claimants being on Universal Credit sooner and the assumption that more of them will go back into employment more quickly.

If you want to consider the programme on the basis of a good old-fashioned return on investment calculation, then Devereux’s figures make such spending seem minor compared to the promised returns. This has, consistently, been the DWP line – that for all the spending, all the write-offs, and all the delays, the benefits to the UK of Universal Credit vastly outweigh the problems in getting us there.

So how do you decide where this stands on the Devereux-Hodge scale? You could consider the views of the independent Office for Budget Responsibility (OBR), which stated last week that in its opinion “there remains considerable uncertainty” around the plans for Universal Credit, and that “weighed against the recent history of optimism bias in Universal Credit” it expects at least a further six-month delay to the latest revised timescales – which are themselves considerably delayed compared to the original plans when the project was launched in 2011.

So in terms of the value for money so far, we’re very much at the Hodge end of the shambles scale – but if you’re looking long term (and believe the DWP economists) then it’s closer to the Devereux end.

The business case; or, does anyone actually know what they are doing?

As mentioned above, the phrase “business case” seems to have a flexible definition in the Civil Service. But there is no disputing that DWP has yet to produce what the NAO calls a “target operating model” for Universal Credit.

In plain English, this effectively means the DWP has yet to define what Universal Credit will actually do, how it will work, and what its processes and workflows will be. And you’d have to say that if this were a business case in a company, a project that had not defined its end result would never get past first base. Even an agile project has a good idea of the desired outcome.

One of the independent advisors to Labour’s review of Universal Credit told me that for any project of this scale, failing to determine right up front what the target operating model will be and how the future processes will work is a fundamental error. Labour has promised a three-month pause in the project should it win the general election in 2015, and one of the primary reasons for that is to step back and define that target model.

DWP would counter that it is pursuing a “test and learn” approach – a vaguely agile concept whereby it introduces new features and functions, learns from their trial implementation, and feeds the results back into the roll-out process. Bear in mind, of course, that “test and learn” only came about when the project was so out of control that it was halted, reviewed and “reset” – because it was, without any doubt, a 100% Hodge-level shambles.

In many ways, this comes down to the IT argument of waterfall versus agile, and if ever a project was trying to shoehorn both approaches into one, it’s Universal Credit. Agile is not a panacea; as the Labour advisor put it, it’s “horses for courses”. And in its desire to be seen to be agile, the Universal Credit programme forgot some of the basics – namely, agreeing up front what everyone was meant to be aiming for.

The NAO’s previous September 2013 report made the same criticism, stating: “Throughout the programme the department has lacked a detailed view of how Universal Credit is meant to work… The department was warned repeatedly about the lack of a detailed ‘blueprint’, ‘architecture’ or ‘target operating model’ for Universal Credit.”

Work has started to address that gap – the Treasury refused to sign off even the Strategic Outline Business Case without it – but the truth is that no matter how much testing and learning takes place, it’s difficult to say with certainty how much of the work completed so far will be relevant for a target operating model that has yet to be fully defined.

So on the business case, you’re looking at 80% Hodge so far.

The digital Holy Grail

The digital system being developed as part of the twin-track approach will eventually replace all but £34m of the IT assets created to support the Pathfinder trials – that’s just 17% of the £196m of IT assets created so far being retained, according to the NAO.

Digital has become the Holy Grail for Universal Credit – the knight in shining armour riding over the horizon to rescue the programme. It’s easy for this to be the bright future when it’s only just passed its “alpha” stage of development, and the initial trial of the system in Sutton is processing just 17 claimants so far, with a fair amount of manual intervention still required. It is simply too soon to tell how well the digital development is going – especially since the pilot started six months later than planned, based on a timescale established only 12 months ago.

Insiders are saying good things about the management of the digital project under DWP’s digital transformation director Kevin Cunnington. The delay stemmed from problems recruiting suitably skilled digital expertise into DWP – Computer Weekly reported back in January that the initial recruitment plans were already proving over-optimistic.

It’s worth remembering at this point that the Cabinet Office and the Government Digital Service (GDS) had recommended that DWP put all its Universal Credit eggs into the digital basket and scrap the Pathfinder system entirely. The latest NAO report shows that the DWP has managed to play with its accounting and economic models well enough to demonstrate that the twin-track approach will save the UK £7.7bn more than if it had stopped the current system roll-out last year and waited for the digital system to be ready, as GDS advised.

At the PAC meeting, Hodge set out her wariness over the optimistic promises around the digital system, compared with the reality of how the programme has gone in the past. For Devereux, digital can be the bright future for only a short time before it has to prove itself.

The NAO, meanwhile, warned that failure to complete the digital system, and relying instead on the existing system for full roll-out, comes with a £2.8bn bill to taxpayers.

So, 50-50 on the Devereux-Hodge scale so far.

The moving target of roll-out timescales

Gauging the shambolicness of the Universal Credit roll-out to date depends very much on the tint of the glasses through which you view progress.

Based on the original plans, over four million claimants were meant to be on Universal Credit by April 2014. So far, fewer than 18,000 people are claiming the benefit. By any standard, that is 100% Hodge of a shambles.

But through DWP-tinted glasses, it is far more important to get Universal Credit right in the end than to adhere to an unachievable timescale – even if it was DWP’s own timescale in the first place.

As with the digital system, it’s easy for DWP to be confident about its latest roll-out plans because most of the work (and the risk) is so far away that within the confines of a House of Commons committee room everything can easily be 100% Devereux. Even Hodge finds it hard to disagree when asked, “What would you prefer, that we get it right in the end, or we get it wrong on any timescale?”

The roll-out plans are still very risky because they are so back-loaded. Millions of claimants will have to be migrated onto new systems over an 18-24 month period – and even then, migration of tax credits has been further delayed until 2019. That scale of migration is simply unprecedented under any government.

The NAO report said that a further six-month delay (which is what the OBR expects to happen) will mean the loss of £2.3bn in potential economic benefits to the UK.

If nothing else, you have to give DWP credit for not blindly sticking to its wildly unrealistic former timescales. But based on progress to date, it’s been a comfortable 75% Hodge shambles – and the risk of further delays is at the same end of the scale.

The Devereux-Hodge Shambles Rating

Overall, based on progress to date, Universal Credit sits very much nearer to the Hodge extreme of shambles than to Devereux. The DWP’s argument is based very heavily on future promises, of jam tomorrow – and of course, if they are right, that’s going to be one heck of a tasty jam sandwich.
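Taking this piece’s own numbers – roughly 100% Hodge on value for money to date, 80% on the business case, 50-50 on the digital system and 75% on roll-out – a back-of-the-envelope average puts the programme about three-quarters of the way towards the Hodge end. A minimal sketch in Python, assuming (purely for illustration) that the four categories count equally:

    # Back-of-the-envelope Devereux-Hodge rating, using the ratings given above.
    # Equal weighting of the four categories is an assumption, not part of the original scale.
    ratings = {
        "value for money to date": 100,  # % Hodge
        "business case": 80,
        "digital system": 50,
        "roll-out timescales": 75,
    }
    overall = sum(ratings.values()) / len(ratings)
    print(f"Overall rating: {overall:.0f}% Hodge")  # prints: Overall rating: 76% Hodge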

It’s not too late to swing Universal Credit towards the Devereux end of the scale. But with national roll-out of the most basic benefit claims due to take place in 2015, and the early promise of the digital system set for its biggest test, we will learn in the next six to 12 months if the needle is going to move along the scale towards Devereux any time soon.


December 5, 2014  4:00 PM

The key themes from this year’s UKtech50: digital, recruitment, and out-of-touch suppliers

Bryan Glick
CIO, CTO, digital, Government IT, it management, recruitment, Skills, uktech50

Soon after Computer Weekly launched our UKtech50 programme to identify the most influential people in UK IT, over four years ago, we were approached by a couple of the female IT leaders who made the list at that time. They were in a clear minority – only eight or nine women on the list – even if that was reflective of the meagre 17% of IT professionals who are female.

That conversation was about how we find more female role models in IT and give them the recognition and profile that will help to encourage women into IT. That led to our first programme to highlight the 25 most influential women in UK IT. That first poll was won by Jane Moran, then the global CIO of Thomson Reuters.

We were, therefore, especially pleased to see that three years later, Moran – now global CIO at Unilever – this year became the first woman to top the overall UKtech50 list as the most influential person in UK IT. What’s more, 16 of this year’s top 50 are female – almost one-third.

But of course, Moran did not top the list because she is a woman – she won because she is a high-profile IT leader, driving digital and technology innovation at one of the UK’s most important companies, one that touches most of our lives through its consumer products every day. Her gender, in this context, is not the issue. But nonetheless, it is great to see more recognition for the women driving the role of technology in the UK economy, and as examples of the digital glass ceiling being shattered.

Our UKtech50 event to announce the final list also heard talks from 12 of the top CIOs and CTOs in the country, and a few clear themes emerged across those presentations.

The first is the rise of “digital” – the IT leaders acknowledged that digital is already becoming one of those buzzwords that means different things to different people, but all agreed that it is a trend that is transforming how IT is managed and delivered, and its role in the organisation.

That leads directly to the second major theme – the changing skills and organisational profile of the corporate IT team. Increasingly, IT chiefs see two distinct functions. There are the back-end operations teams, running the traditional IT infrastructure under well-governed processes that focus on stability and reliability. And increasingly there is the digital team, often using agile methods to rapidly respond to business needs, developing software close to the customer, iterating, testing, experimenting, learning as they go – but highly reliant on those back-end experts for infrastructure.

Some speakers talked of “dual-speed” IT, or what Gartner calls “bimodal” – although not everyone agrees with such terminology – but the clear message was that a new technology team is emerging.

From that comes the challenge of recruitment – finding people with the new digital skills needed. And here, the IT leaders shared successful experiences of growing their teams – Royal Mail, for example, needed to find 300 new IT staff during a skills shortage, but through smart recruitment using non-conventional methods it attracted nearly 30,000 applicants.

The message for skills recruitment was to target diversity and to recruit different profiles from those traditionally brought into IT – not just engineering or computer science students, but linguists, historians, economists, psychologists and so on – reflecting the importance of technology across all aspects of culture and society.

The final theme concerned IT suppliers – and really should concern IT suppliers too. Many of the IT leaders felt their traditional providers have not changed with the times and are stuck in a model of software licensing, hardware products and expensive and often unproductive consultancy services.

One speaker, Bank of England CIO John Finch, cited a large supplier brought in to help with a software audit, which ended up sending a £2.5m bill because the Bank was using virtualisation and cloud services in contravention of the licensing terms.

IT leaders agree that their world is changing, but they do not feel their key suppliers are changing with them. This is why speakers like John Lewis IT director Paul Coby highlighted the work he is doing with tech startups, and why government CTO Liam Maxwell flagged that over 50% of purchases put through the G-Cloud framework have gone to SMEs.

The best IT leaders – those featured in our UKtech50 list – are leading digital change, driving innovation, and establishing the best practices for others to follow. They are not short of challenges – and they are certainly not short of work to do. But everyone on our UKtech50 list shows that IT leadership in the UK is thriving and leading the world.


November 28, 2014  12:03 PM

DWP still has lessons to learn to make Universal Credit IT a success

Bryan Glick
Digital skills, dwp, GDS, NAO, PAC, Universal Credit

“Unacceptably poor management” and “wasted time and taxpayer’s money” – that’s how Margaret Hodge, MP, the chair of the Public Accounts Committee, described Universal Credit this week.

The latest National Audit Office (NAO) report into the troubled welfare reform programme simply added to the catalogue of concerns. It is easy for secretary of state Iain Duncan Smith to keep saying the project is on time when the timescales are put back every six months. At any point in time, delivery is on target – until it’s delayed and is back on target again.

The headline findings from the report have been widely covered – still unable to determine value for money; no contingency plan should the new digital service fail to work; lack of an overall blueprint for delivery of the policy.

But reading through the 60-page NAO report also reveals some startling nuggets of information that the Department for Work and Pensions (DWP) would no doubt prefer to have kept under wraps.

For example, in April 2014 a software update caused an increase in incorrect payments to benefit claimants, which meant that every payment had to be checked manually for three months. The cause of the problem was an unnamed supplier releasing an update containing “significant changes” that the DWP had not been told about and which were therefore not properly tested.

The supplier is likely to be one of IBM, Accenture, HP or BT, the four key vendors supporting the flaky system that will be mostly replaced by the future digital service. Those suppliers have received far too little criticism or scrutiny of their role – even being allowed to audit their own work, what one MP called “marking their own homework”.

Only 17% of the work that these suppliers have done will be used once Universal Credit is fully live, according to the NAO.

In January this year, Computer Weekly revealed that DWP was already struggling to recruit the skills it needs to develop the digital service – a fact the DWP denied at the time. The NAO revealed that this has been the key reason for delays in the progress of the digital system, and that recruitment is still required to reach the necessary capacity. The report said that DWP was offering maximum salaries between 8% and 22% lower than the market average.

The DWP Digital Academy programme launched by the department’s digital transformation chief, Kevin Cunnington, is proving to be successful in training internal staff – but it’s a longer-term solution.

The real problem is that DWP chose to ignore the warnings and recommendations of the Government Digital Service for too long – blundering along with poor project management and misfiring suppliers. Now that digital has been placed at the centre of the programme, it’s been a case of catch-up in a recruitment market where digital experts are in short supply and high demand.

The risks around Universal Credit remain, and if future progress follows past history, the cost to the taxpayer of those risks being realised will be significant. There are chinks of light emerging and insiders say they have been impressed by Cunnington’s approach. With full roll-out of the new benefits now put back until 2019, there is time to get things right, but only if the DWP has finally learned its lessons over implementing modern, digital government IT.


November 21, 2014  11:12 AM

Cyber security: Can businesses put a value on trust?

Bryan Glick
Cyber security, GCHQ, Government IT, HP, RBS, Security, snooping, trust

Who can you trust in a digital world? The most dispiriting part of the technology revolution is the growing lack of trust felt by individuals and businesses as a result of cyber security threats.

For example, I learned this week that HP is being asked by some large customers to provide legal assurances that its products do not have backdoors – the IT giant even received such a request from a Nato country.

When one global financial services firm held an executive meeting in a Middle East country, its security chiefs were so concerned about industrial espionage, they transported an entire office communications setup from the US – but even then found they were not allowed to bring their Cisco routers into the country and had to source them locally.

While anybody with knowledge of the information security world will see such concerns as understandable, surely we should ask ourselves why we have allowed trust to deteriorate to such an extent.

State-backed hacking and the Edward Snowden revelations about internet surveillance are important stories, but awareness of them has helped to create a culture of distrust that will be increasingly difficult to reverse.

We seem to have accepted that, to do business in the digital world, you have to assume you can trust nobody. But at what cost?

Companies are recognising the need to spend money on cyber security, but it is seen as a cost. How, instead, might they put a value on trust? How much more business might you be able to do by using technology as a way to gain the trust of your customers? What is the economic value to governments of making their country an internationally respected place for trusted digital business?

UK Cabinet Office minister Francis Maude said this week that the government’s cyber security strategy aims to “make the UK one of the safest places in the world to do business and ensure that our economy and society continues to benefit from the ongoing digital transformation.”

That is absolutely the right justification for the government plan – but how much is it undermined by Snowden’s revelations about GCHQ snooping?

Royal Bank of Scotland learned the cost of losing trust with a £56m fine for technology failures that prevented customers accessing their money. How much value would the state-owned bank gain from becoming a digitally trusted institution?

Building trusted technology will still cost, and will still need to use the same security systems and risk management methods – but it could represent an important change in mindset and corporate culture. Could the business benefit of trust be the key to putting extra cash into information security budgets to better address those risks?

The IT industry has become used to responding to fear of cyber threats. How much better would it be for everyone if it focused instead on helping its users build trust?


November 14, 2014  12:01 PM

However many ‘modes’ you choose, the IT department needs to change

Bryan Glick
CIO, digital, Gartner, IT department, it management, recruitment, Skills

The age of the “command and control” IT department is over – on that, most experts are agreed. The balance of power in corporate IT has shifted away from the IT team – and away from their technology suppliers – to their users in the business. But plenty of IT leaders are still trying to come to terms with what that means for how best to manage and organise their teams.

This week saw the Gartner Symposium in Barcelona – the analyst’s annual IT leadership shindig. At the event, Gartner experts put forward their latest research on IT management, hailing the emergence of “bimodal IT”.

The analyst firm said that IT departments need to operate in two modes – one for fast-moving, agile, digital initiatives; the other for more conventional IT, with stricter governance, systems management, change control and so forth, essentially for back-office systems. Gartner claimed that 45% of CIOs have a “fast mode of operation”.

Not every expert is so convinced. Simon Wardley, who works with multinational companies and with the Government Digital Service (GDS) on techniques such as mapping and strategic gameplay, ridiculed Gartner’s research. Wardley is a highly respected adviser to CIOs – and to declare an interest, someone whose opinions and expertise I really respect.

Wardley proposes a three-way model for IT – what he calls pioneers, settlers, and town planners. He suggests there needs to be a middle stage that takes all the agile, digital stuff, and evolves it through to become business as usual, managing the culture and process changes that often implies.

His theory makes a lot of sense – how many times have we seen great new corporate IT innovations that simply fail to take hold because the business is not ready to take advantage of them? In government IT particularly, this was a key cause of failure in many big technology programmes in the past. You need that expertise in taking the new, and turning it into the everyday.

In Gartner’s defence, Computer Weekly has talked to numerous CIOs who talk about “two-speed IT”. But in many cases those CIOs do have an interim stage – the settlers – between the two, even if they don’t identify it as such.

Whichever view you subscribe to, the common theme is that the IT department is rapidly changing and having to take on new ways of working, and better ways of relating to its business users.

The job descriptions in the IT team are changing too. For example, the Ministry of Justice (MoJ) digital team is recruiting at the moment. Look at the job titles sought by MoJ and by GDS: content designers, product managers, user researchers, web operations, data scientists. It’s not database administrator, systems analyst, IT operations, systems engineers or support executives anymore – although those functions may still need to be fulfilled somewhere (unless all your systems run in the cloud, which is the likely future and the likely demise of many of those roles).

The structure and skills of IT departments have always evolved as technology changes, of course. But the pace of that organisational change is going to feel bewildering for many IT leaders in the next few years.

Whatever style or structure you prefer, IT leaders need to re-think the very fundamentals of their team. Those who don’t will soon find someone else doing it for them.


November 7, 2014  10:45 AM

This is the age of ‘just do it’ for IT leaders

Bryan Glick
Agile, digital, Ecommerce, GDS, HP, Internet of Things, Microsoft, Smartwatch

Without wishing to sound like a Nike advert, IT is entering the age of “just do it”.

One of the most important aspects of the digital transformation underway in Whitehall has been the Government Digital Service mantra of “show don’t tell”. If they see a need, they just get on and develop something to show people, instead of spending endless weeks gathering requirements and documenting specifications. “You want us to do that? Is this the sort of thing you mean?”

You can wrap it up in whatever buzzword you like – agile, disruption, digital, etc – but the essential change taking place in corporate technology delivery is the need to respond quickly and with flexibility to changing needs.

It is no longer acceptable to send your business analysts in to spend a couple of months drawing flowcharts and writing documents, turning that into a project spec and assembling a team to spend the next 12 months trying to make it happen. Meanwhile, your “just do it” rivals will have iterated their way to a competitive edge months before you deliver.

Compare, for example, the online fashion players like Net-a-porter and Asos with their agile development approaches and constant updates, with Marks & Spencer. M&S used to outsource its e-commerce website to Amazon. When it decided to bring it back in house it led to a two-year development project and a “big bang” launch. Three months later, M&S online sales were down 8% year on year; in its latest financial results this week they were still down 6%.

Look instead at Nationwide – the building society has just become the first retail bank to offer a smartwatch app, for real-time balance checking. It’s the sort of product that is easy to be cynical about – jumping on a bandwagon, just a gimmick, and so on. But so what? Nationwide just did it – what is there to lose?

Some of the hoary old traditional IT suppliers are realising this too. HP and Microsoft, for example, have been justifiably criticised for a lack of innovation in recent years. But in the past week, HP launched its latest concept PC – strangely named Sprout – and Microsoft announced an internet of things headset designed to help visually impaired people find their way around unfamiliar places.

The success or otherwise of both products might be questionable – but it sends out the message that the suppliers are willing to try, to innovate, to just do it.

There will always be the occasions when a conventional “waterfall” style project is still more appropriate, but this is destined to be the exception. In a digital world, IT leaders need to be less “just do IT” and more “just do it”.


October 31, 2014  11:51 AM

IT buyers must demand open standards as balance of power shifts in their favour

Bryan Glick
AWS, Azure, cloud, Cloud Computing, Google, Integration, Interoperability, Microsoft, Windows

The old cliché about open standards has never been truer – that the great thing about them is there are so many to choose from.

Look at the Wikipedia page on the topic, for example. It lists 20 different definitions of what an open standard is, and at least 30 different specifications all of which claim to be a definitive open standard in their field.

The history of IT is one of numerous attempts to get interested parties around a table and agree how to make stuff work together. Most often, such initiatives have foundered on the simple truth that most hardware and software manufacturers have little to gain from being genuinely open. Why make it easy for users to switch away from your products, or to plug in other companies’ products at the expense of your own?

Most standards that have achieved the necessary ubiquity have done so despite official attempts to formalise rather than because of them. The internet is the best example – while vendors spent years trying to agree international standards for networking, the rest of the world just got on and used internet protocol and it became the de facto standard.

Lack of standards leads only to single supplier dominance – the effective standard for PCs for 25 years has been Windows, and for all the benefits that has delivered, it still leaves many organisations locked into Microsoft.

The same is happening in the cloud today – Amazon Web Services has become the dominant player by building an ecosystem based on its technology and APIs. Azure and Google are playing catch-up – but with no interoperability between clouds. More than ever, users are demanding cloud standards to prevent a repeat of the past.

The IT4IT Forum set up by the likes of Shell and BP, along with Microsoft, HP and IBM, is another welcome user-led initiative – but this is hardly the first time such an effort has been tried. Perhaps – hopefully – one thing is different now. The consumerisation of IT has shown how much innovation can be sparked by genuine interoperability – even if Apple and Android still exist in rival ecosystems, an app on one can still talk to the same app on the other.

The digital revolution will be built on openness, not proprietary products. Any vendor that wants to succeed must genuinely play the open game.

Corporate users need to demand open interoperability – it is no longer enough for a vendor to say that it’s too complicated. The balance of power has shifted to IT buyers, and it’s time they told their suppliers to change.


October 27, 2014  9:41 AM

Socitm 2014: The digital challenges for local government IT leaders

Bryan Glick
Digital strategy, G-Cloud, GDS, local government, Socitm

A big thank you to Socitm, the society for local government IT leaders, for inviting me to chair their annual conference, which took place last week in Manchester.

As a Liverpool FC supporter, the awkward choice of Old Trafford as the venue did at least give me the opportunity to poke some fun at the local team in Salford – one presumes that in this age of austerity, Socitm got the stadium cheaply given it’s not being used in midweek this season…

But more importantly, it was a chance to chat with some of the most forward-thinking IT leaders in the sector – those grappling with the dual challenges of major budget cuts and the move to digital government. Thanks to the likes of Socitm president Nick Roberts, formerly at Surrey County Council; John Jackson, CIO at Camden council; Steve Halliday at Solihull; Warwickshire CIO Tonino Ciuffini; Staffordshire’s Sander Kristel; Martin Sadler at Walsall – and anybody else I’ve forgotten to mention.

There were a number of themes and issues, and I thought it best to round up as many of them as I can in one place.

Local digital – just do it

In the run up to Socitm 2014, there has been much debate about how best to approach digital across the sector, given that almost every council will have very similar needs, in similar timescales, as austerity continues to bite over the course of the next five-year parliamentary cycle.

There is a growing recognition that the old model for local government (LG) IT is broken – a similar realisation in Whitehall led to the creation of the Government Digital Service (GDS) to stimulate reform in central government.

Some 70% of the LG software market is owned by three suppliers – Capita, Northgate and Civica. Just as Whitehall suffered for its dominance by the oligopoly of big system integrators, so LG IT is held back by a lack of competition with few incentives for suppliers to innovate and break their cosy status quo.  

There has been a lot of talk about creating a Local GDS to develop common platforms across the sector, but the politics around such an organisation – Who would fund it? Who would lead it? – mean it’s unlikely to work.

GDS deputy director Tom Loosemore addressed the Local GDS question in his keynote speech, urging delegates not to focus on that as the issue, but to think instead about what need they are seeking to meet and what problems have to be solved.

In its simplest terms, that need is change, according to the LG CIOs I talked to – change that in the past would have meant turning to an IT supplier to ask what they can do to help. But calls for a Local GDS are really highlighting the need for LG IT to do more digital among themselves, without relying on off-the-shelf software and inflexible outsourcing deals.

The old way would be to define a series of digital platforms for LG and invite suppliers to develop them. The new way is to just get on and do it yourself – but to share that work around the sector. Open up data by wrapping APIs around legacy systems; encourage app developers to build services around that data; and as one council develops a platform, share it as open source code with anyone else who wants it. Since roads data was opened up, for example, there are apparently now six or seven different apps for reporting potholes. And by the way, once you abstract those old legacy systems behind an API layer, they become much easier to get rid of later.
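To make the API-wrapping idea concrete, here is a minimal sketch of the kind of thin, read-only layer a council team might put in front of a legacy system – written in Python with Flask purely for illustration; the database, table and field names are invented rather than taken from any real council system:

    # Hypothetical sketch: exposing a legacy roads-defects database as a simple
    # read-only JSON API, so app developers can build pothole-reporting services
    # against it without touching the legacy system directly.
    import sqlite3
    from flask import Flask, jsonify

    app = Flask(__name__)
    LEGACY_DB = "legacy_highways.db"  # invented name, standing in for the legacy data store

    @app.route("/api/potholes")
    def list_potholes():
        # Read-only query against the legacy schema; column names are illustrative only
        conn = sqlite3.connect(LEGACY_DB)
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT id, road_name, latitude, longitude, status, reported_on "
            "FROM road_defects WHERE defect_type = 'pothole'"
        ).fetchall()
        conn.close()
        return jsonify({"potholes": [dict(row) for row in rows]})

    if __name__ == "__main__":
        app.run(port=8080)

Publish something like that as open source and the next council with the same legacy supplier can reuse it; meanwhile the apps built on top of the API never need to know what sits behind it – which is exactly what makes the legacy system easier to replace later.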

It’s not that different from the principles GDS promotes – but it’s a more ground-up way to develop local government as a platform, and more in keeping with the localism of the sector.

There is a lot of inertia still in LG, but a few forward thinkers are breaking that logjam. The overriding message I heard from those leaders is that everyone in the sector needs to just get out there and do it – develop services, try them out, see how they work, and share the lessons (and the code).

Next year’s conference would be perfect if it consisted entirely of council IT chiefs demonstrating all the new digital services they have created, for anybody else attending to also use.

Turkeys voting for Christmas

One of the most popular talks at Socitm 2014 came from Mark Thompson, an academic, consultant, digital advocate and author on digital government. He kick-started a debate on a critical issue for LG – and equally for Westminster – that ultimately the people who need to make the cash available to fund digital government within local councils are the very ones whose jobs are most threatened as a result of digital government.

Thompson has written on this question for Computer Weekly in the past – as public services are disintermediated in the same way as, for example, music and movies have been in the entertainment industry, much of the local bureaucracy that currently supports those services becomes redundant.

Why do you need people to co-ordinate pothole repairs, for example, if they are reported through an app that automatically passes details on to a council-approved repair contractor?

One IT chief told me his board instructed him to make 60% cost savings – and that’s after all the austerity cuts of the past few years. He benchmarked his costs against other councils in the same region and found he was already one of the most efficient. So he went to the board and said: if you want me to do this, you need to let all staff use their own devices; you need to get rid of our datacentre and use the cloud; and you need to let me redevelop any of our applications to make them digital. At that, the board baulked – perhaps realising the effect it would have on them.

As Sander Kristel said in a tweet, turkeys are not going to vote for Christmas – it’s the “elephant in the room” of digital government.

It’s a real question that LG digital leaders will have to address – but nobody has the answer yet.

Are IT suppliers gaming G-Cloud?

Local government has been criticised for not making enough use of G-Cloud, the central framework for buying commodity cloud software and services set up by GDS and the Cabinet Office. G-Cloud chief Tony Singleton came to Socitm 2014 to encourage delegates to buy more of their requirements that way. Lack of awareness and understanding of G-Cloud has been cited as the main reason for the apparent reticence to use it.

But several people I talked to suggested a different reason – that they can buy identical services from the same suppliers more cheaply than they are offered on G-Cloud.

The aim of the G-Cloud online catalogue is to use price transparency to increase competition and drive down prices. But some LG CIOs believe that suppliers are artificially inflating prices on G-Cloud so as not to reveal their true charges to rivals. Furthermore, those same CIOs insist that LG has always negotiated lower prices than central government for IT – and therefore, G-Cloud seems cheap in Whitehall, but less so in councils.

Given the secrecy and commercial confidentiality that suppliers maintain around the fees they charge individual public sector bodies, it’s hard to prove that claim one way or the other. But many LG IT buyers are convinced that G-Cloud is simply not the best deal on the market.

Double silos – the extra layer of complexity for local digital

GDS is leading the way in Whitehall for moving away from the department-oriented IT silos in central government, to platform-focused shared delivery of services. It’s a huge task.

But in local government, that task is effectively doubled. Within each council there are similar departmental silos – transport, housing, social care and so on. And then there are the silos between councils – neighbouring authorities who, in the not-so-distant past, worked on the basis that if next door bought IBM, they would buy HP to be different. Nobody wants to admit that the bloke next door does things better.

So an LG IT leader is trying to convince their own organisation to think differently, while also encouraging other councils to share resources and efforts to build digital services together, across all their internal silos too.

Work that one out.


October 24, 2014  12:01 PM

A bad week for retailers, or a good week for digital retailing?

Bryan Glick
Argos, digital, homebase, multichannel, Retail IT, Tesco

It’s not been a great week for retailers. Look at Tesco – terrible; online sales grew by a whole 11%. And ailing DIY chain Homebase too – digital revenue up by 12%.

Of course, you might only have seen the headlines about Tesco’s profit decline and accounting errors, or Homebase shutting a quarter of its stores. As a result, you might not have heard that Homebase’s sister-company Argos saw online sales reach 43% of revenue; mobile sales increase by 45% in six months; or that 22% of sales now come through mobile or tablet devices.

Since the dot com boom of the late 1990s, we have heard much discussion about “bricks and clicks” – retailers looking to merge the best of the online and physical worlds. But even after all that talk, there are plenty still not getting it right.

If you look at two of the best performing high-street retail giants – Argos and John Lewis – what distinguishes them is a genuinely integrated multi-channel approach to their customers. Want to buy online and collect in store? You can do that. Want to buy something in the shop that is out of stock? Sure – here’s a tablet device to use, we’ll deliver to home for you.

Argos is even trialling digital concept stores – effectively, a physical store that sells digitally to customers who visit in person.

Tesco continues to be a leader and innovator in online retail – but it perhaps underestimated the impact its digital success would have on austerity-era shoppers unwilling or unable to afford premium products in store. Factor in aggressive low-cost competitors like Aldi and Lidl and you get the current problems.

Many retailers still see digital as something separate from their stores, when their customers simply see both as different windows into the same stock room. As happens in every sector, online was first seen as a threat – will the web replace the high street? Then it was seen as a complement – to save the high street, go online.

The truth is that technology does not simply replace or supersede anything – instead it fragments markets into many pieces, and eliminates barriers to entry. It demands an often radically different approach to customer engagement, not a bolt-on to old ways. Things change – and that’s hard for a lot of incumbents to deal with.

The main thing that is killing the high street is retailers’ reluctance to embrace digital and innovate their business models. The message from hard-up shoppers is clear – retailers need to go where their customers are.

