Gartner’s latest Magic Quadrant report on cloud infrastructure puts IBM in the bottom-left quadrant, as a niche player, alongside Oracle and Alibaba. IBM is hoping the $34bn it has spent on buying Red Hat will propel its cloud infrastructure strategy.
IBM’s core enterprise customers do not necessarily expect Big Blue to sell them the latest, greatest tech. They often see IBM as a trusted partner, providing systems that support core business functions. As the saying goes: “No one ever got fired for buying IBM.” One could argue this has left those companies that entrust IBM with their technology decisions on a slower tech adoption curve. Yes, IBM has blockchain, Watson for deep learning and a quantum computing roadmap, but it takes a long time for the culture of a large enterprise to change. And this slowness to adapt arguably affects both its customers and IBM itself.
IBM used to be good at reinventing itself: from a hardware business to a software business and then a services business. Each change has seen IBM offer its traditional enterprise customers a way to evolve their IT function. The cloud is a phenomenon traditional enterprises have been slow to adopt, and IBM has made several attempts at providing an IBM cloud for these customers. “IBM appeals to its existing customers who have a strong preference to purchase most of their technology from IBM,” Gartner noted in its Magic Quadrant for cloud infrastructure. From a cloud infrastructure perspective, Gartner sees the IBM cloud as an option for “Mode 1” IT – which comprises mainly systems-of-record workloads.
The Red Hat acquisition represents a $34bn gamble: will those businesses that have stuck with IBM remain wedded to a traditional approach to IT, or will they eventually shift the majority of their enterprise workloads onto the public cloud? If they choose the latter, AWS, Azure and Google – rated as leaders by Gartner – may arguably offer more fully formed public cloud strategies. But, for IBM, there is a $350bn opportunity – so long as it can convince these organisations that it is the safest pair of hands to make them cloud-ready.
What does a modern desktop look like? In the decades since the desktop PC established its place in offices, the role of desktop IT has been to provide end users with access to corporate-approved software, enterprise data, file servers and printers.
But the ease with which end users can create and distribute information has introduced its own set of problems. For instance, the success of email has made it far too easy for several revisions of the same document to exist on the corporate network. Logically, it makes sense to deploy cloud-based storage. This enables colleagues to collaborate and share information via the cloud, such that a single version of the document can be shared, changes can be tracked and people can comment on revisions. But if one user in the team continues to use email, this often determines how other colleagues communicate.
The road ahead for the CIO and IT department will involve showing the tech laggards in the business how new cloud-based tools simplify traditional email-based workflows.
Change how people use PCs for work
It’s about providing people with the right technology to deliver a good employee experience, such that they realise there are benefits in adapting old working practices.
In 2016, MIT’s Building business value with employee experience report found that companies with scores in the top quartile of employee experience were twice as innovative as those in the bottom quartile, based on the percentage of revenue generated from new products and services in the last two years.
Focus on a good employee experience
Insight’s recent Employee Experience report found that a lack of engagement and the unavailability of useful, user-friendly technology in organisations is having a negative impact on staff productivity. Similarly, a YouGov survey for Avanade found that UK companies are failing to realise the value from their workplace technology investments.
MIT does not believe CIOs should adopt a command and control strategy to deploy new technology that improves employee experience. But CIOs face a credibility issue. End users often regard IT as a blocker, preventing them from achieving what they need to do. Introducing new tools just gets in the way of what they already have to put up with: multiple logins, the need to rekey information between different applications and sluggish, out-dated PC hardware.
The end of support of Windows 7 on January 14, 2020 provides CIOs with a compelling reason to change end user desktop computing practices.
“Once you put someone in orbit, it is a lot harder to fix it if something goes wrong,” says Paul Kostek, a senior Institute of Electrical and Electronics Engineers (IEEE) member and senior systems expert at Base2.
But things do go wrong. About 1:08:53 into the BBC’s excellent documentary, Eight Days: To the Moon and Back, Neil Armstrong, Buzz Aldrin and mission control in Houston realise that the toggle for the engine arm circuit breaker on the Lunar Module had broken off. Without it, they would be stranded on the Moon. Nasa engineers worked through the night, investigating how to bypass the switch, but Aldrin figured out he only needed to make sure the electrical contacts in the switch touched, and used a felt-tip pen to do so, enabling the two astronauts to take off from the Moon’s surface and ultimately return to Earth safely.
Computers today are far more clever than the ancient hardware that put man on the Moon, but there is a very real sense that the greater the level of automation, the less human operators are kept in the loop and understand how to bypass the systems. The tragedies of Lion Air Flight 610 and Ethiopian Airlines Flight 302 are a testament to the fact that automation can make mistakes, and that the overrides enabling humans to take back control must be simple and well understood.
Designing for humans
Kostek says that there are lots of discussions in commercial aviation about pilots needing to know what to do if there is a problem with the flight systems. But without any relevant, contextual feedback, the pilots may not have all the information they need to overcome the issue the flight system encountered.
Automation is moving beyond space exploration and aircraft flight control towards augmenting vehicles, as a stepping stone to fully autonomous cars. Thanks to GPS, many people have lost common sense, and will blindly follow instructions from a satnav that takes them down an impassable route. Automation is a double-edged sword. How different the Apollo mission could have turned out if Buzz Aldrin hadn’t stepped up to the mark and taken the initiative.
The recent spate of band reunions and Glastonbury illustrates the longevity of rock music.
But the software industry seems to be stuck on finding the next big thing. Who would ever describe Windows 3.1 as a “classic”? Yet there are some products that somehow get the basics just right, and later releases do not really advance the tech innovation.
The question is how one goes about predicting which technologies will have longevity. There is a lot of industry hype, and it is easy today to jump on the artificial intelligence bandwagon or focus on Bitcoin or the internet of things. But a quarter of a century ago, there was no such thing as blockchain or Bitcoin, and the closest people got to artificial intelligence was at the movies.
Macros and DOS configurations
The big thing then was WordPerfect and config.sys files.
Back in those days, Bruce Momjian, co-founder of the Postgres Core Development Group and chief database architect at EnterpriseDB, used to work as a Unix admin at a law firm, and recalls a story about one of the IT people who also worked there. “There was one guy who created config.sys files and he was an expert in WordPerfect. I’m sure he does nothing with config.sys and WordPerfect macros today. It’s just about trying to figure out the hacks and it is not foundational. While what I learnt as a Unix admin is still relevant.”
He says there is a risk that chasing after what’s hot today may mean you end up with a skill that is not so hot in the future.
“If I wanted to be successful, why would I spend a bunch of years playing with a free piece of software? It doesn’t make any sense. I should have been working on WordPerfect macros.”
Doing practical things works in the short term, but he urges young IT folk to “do the crazy thing” and really look at how they should invest in their long-term careers. “It takes a certain insanity and a certain disregard of practicality. It’s not about looking at the technology in order to try to get something done today, but instead understanding the full stack. I say, ignore today and look to find answers just because you want to find an answer.”
Why system tools are boring
And while everyone knows Linus Torvalds, in general, says Momjian: “If you are a creator of an infrastructure tool, you sit in an office and maybe you’re at a conference once every other month.” He argues that no IT decision-maker really plans their IT strategy around a scripting language, a compiler or a text editor, or bases it around some of the virtualisation tools out there. “They are interesting, but not a core part of a business process in organisations,” he says.
But compared with the early 1990s, when Momjian was a Unix admin, proprietary Unix systems are now on life support. Compare the proprietary Unix vendors with the likes of Microsoft and Oracle, which are still selling relational databases. Since the early 2000s, Momjian has been a database man. “There are a lot of people who find databases really interesting,” he adds.
For Momjian, the database industry is a good industry to be in. And there are some people in the open source community who are jetted around the world to speak to thousands of delegates about their contribution to database technologies.
For Momjian, these are the true rock stars of the software industry.
There are certain functions in database administration that are good candidates for automation. These are the nuts and bolts of the job – like how to partition the storage, what happens if a disk fails or determining which physical disk data needs to be stored on. Some of these more mundane tasks no longer require as much manual intervention, because software has evolved to where the machine can take care of them. But this automation is not the same as autonomous database management.
Computer Weekly recently met up with Ravi Mayuram, senior vice-president of products and engineering at Couchbase. Mayuram likens the current state of database administration to the automatic gearbox in a car. Clearly, the automatic gearbox is a long way from the fully autonomous vehicle, but it offers a degree of convenience, which is where some of the automation now possible in database management has evolved to. In terms of databases, for Mayuram, a fully autonomous database management system will be able to fix itself. A fully autonomous database will more or less drive itself. But he reckons there will always be situations where humans will be needed.
Automation is nothing new for IT folk. Managing operating systems and software infrastructure used to be achieved using scripting. But Mayuram believes the reason why full automation is taking a while to become part of the DBA’s toolset is that databases are unique. This is due to the way the relational database was originally engineered in the 1970s. “The whole database architecture is a bit like a car with a manual gearbox. It just cannot be automated,” says Mayuram. As such, it has provided gainful employment for DBAs in many organisations.
Self-driving database management
But, in time, automation of database management will become possible, and DBAs will no longer spend their time tinkering with the relational database system.
What will the expert DBA spend their time doing, when database admin tasks are automated?
Mayuram says: “With any change, it’s myopic to look at how many jobs will be eliminated. All the DBAs will not be eliminated because data is more essential now than it has ever been before.” He believes their job description will change such that DBAs will have a more pivotal role to play in helping the business run faster. The DBA will be the person capable of reducing the complexity of providing access to the enterprise’s data in the right format at the time the business requires information to make decisions.
Last week, shares in the tech giants plummeted following reports that US regulators from the Department of Justice and Federal Trade Commission were looking at investigating their businesses.
On the one hand, this may be good for consumers, since in the technological era the web giants have created, people’s data and online browsing history are treated as commodities that can be traded.
But the global internet platforms also bring people together in ways that no single government can ever hope to achieve. It is this power to enable people to rally around a particular ideology or campaign that makes the internet both a source of power and a danger for governments.
Make the world a better place through tech
Many believe technology can become a driving force for good when governments lack the appetite to tackle complex societal, sustainability and climate change issues. Social entrepreneur Paul Polizzotto is CEO of Givewith, an organisation that provides a platform linking business activities to social impacts. Within the business-to-business market, Polizzotto believes social impact can be more valuable to a buyer than a negotiated best price. It can differentiate one seller from another and also deliver shareholder value to both the seller and the buyer. “Businesses have an opportunity to make ethics a core value,” he said, during a sustainability summit organised by SAP Ariba.
As Computer Weekly has previously reported, human rights lawyer Amal Clooney believes businesses have an opportunity to step in and fill a void when governments do not live up to expectations. Speaking at the SAP Ariba Live conference in Barcelona at the end of May, she said: “We want to harness people like you in this room to solve public sector problems. There is a great opportunity.”
Access to accurate data can help organisations achieve sustainability targets and support climate change initiatives. Speaking at the sustainability summit, Sebastian Ociepka, head of business intelligence at airline group IAG, discussed how the airline industry can use big data to gain proper insight across the supply chain, which can help reduce CO2 emissions. “If all passengers put 1kg less in their suitcase, it would save 30,000 tonnes of CO2 a year,” he said.
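That figure is easy to sanity-check with a back-of-the-envelope calculation. The passenger count and fuel-burn factor below are rough assumptions of mine, not figures from IAG; only the 3.16kg of CO2 per kilogram of jet fuel burned is a widely used standard emissions factor:

```python
# Rough sanity check of the "1kg less per suitcase" claim.
# Passenger count and fuel-burn factor are assumptions, not IAG figures.

passengers_per_year = 113e6   # IAG group passengers per year, roughly (assumed)
fuel_per_kg_carried = 0.08    # kg of fuel burned carrying 1kg on an average flight (assumed)
co2_per_kg_fuel = 3.16        # kg of CO2 per kg of jet fuel burned (standard factor)

saving_tonnes = passengers_per_year * fuel_per_kg_carried * co2_per_kg_fuel / 1000
print(f"Estimated saving: ~{saving_tonnes:,.0f} tonnes of CO2 a year")
# → Estimated saving: ~28,566 tonnes of CO2 a year
```

Under these assumptions, the estimate lands in the same ballpark as the 30,000 tonnes quoted.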
Ethics and sustainability in global supply chains
Across global supply chains, businesses have an opportunity to tackle modern-day slavery, reduce CO2 and operate in a more sustainable fashion. Angel Pes, president of the UN Global Compact in Spain, also spoke at the SAP Ariba sustainability summit. Pes warned that the biggest risk in supply chains relates to human rights. This, he said, is the most pressing issue for multinationals. For Pes, the other factor global supply chains need to consider is environmental risk. “This requires systems of control, auditing and taking tough decisions to change suppliers if they are not committed to global principles.”
BSR is a not-for-profit organisation working on building sustainable businesses. During the sustainability summit, its managing director, Tara Norton, discussed the challenge of operating a supply chain that promotes ethical values. She said: “We need to think about changing the dynamics in the supply chain. What is the incentive for suppliers? Yes, there is the global agenda, and yes, there is value for big companies, but what about SMEs?” For Norton, adoption of ethical and sustainability practices is not going to happen unless everyone in the supply chain can benefit.
An ethical and sustainable supply chain will be most effective if it is driven from the bottom up by businesses working towards common goals, rather than top down through multinational alliances. As the US regulators start to investigate the business practices of the internet giants, a key question for internet users is whether the benefit they, themselves receive from these services, is worth the price of their data and internet privacy. But, there is an equally valid question that must be answered. Do we believe these platforms help engender a better society than one driven by a political agenda?
The recent story reported in Computer Weekly about Revlon’s SAP woes illustrates the importance of tech due diligence. Its CTO, Chau Banks, who previously worked as CIO at New York & Company, is responsible for the company’s global technology strategy. But she joined the company in January 2018, just a month before the new SAP project started going wrong.
During a quarterly earnings call in March 2019, Revlon’s CFO, Victoria Dolan, told financial analysts that a supply chain issue the company experienced in February 2018, after the implementation of a new SAP system, was now fully resolved. But that did not stop its share price from sliding.
Shareholders blame Revlon
A number of US law firms have now announced class action lawsuits on behalf of shareholders, alleging that Revlon failed to prepare properly for the disruption caused when it deployed the SAP ERP system at its Oxford facility in North Carolina in February 2018. For its Q1 2018 earnings, posted in May 2018, the company reported revenue of $560.7m, missing its target by $31.9m.
According to the transcript of the earnings call posted on Seeking Alpha, the company’s chief operating officer, Christopher Peterson, admitted sales and gross margins were directly impacted by the SAP disruption.
When has an ERP project ever been flawless?
When asked about the timing of the disruption, Peterson said: “Honestly, it’s just we did not expect to have the issues on the facility that we did. So in hindsight, the phasing looks a little off, but the reality is we were expecting to execute flawlessly on the SAP situation at Oxford.”
According to the transcript on Seeking Alpha, he had previously told analysts during the Q4 2017 earnings call that implementation of the new SAP ERP system, which was designed to support new customer technologies and processes and improve performance, was “on schedule”.
When Computer Weekly looked at the company’s Securities and Exchange Commission (SEC) quarterly filings, it was interesting to see that SAP was only mentioned after the disruption. While implementing ERP systems is not a core business activity for the cosmetics giant, the unfolding story illustrates just how much of a business impact a problematic ERP implementation can have. Which significant IT project has ever been executed flawlessly?
The issue of racist bias encoded in software made mainstream news last week, with a report on Channel Four News highlighting how software for profiling criminal suspects tends to have racial biases.
Such software relies on a dataset that is weighted against non-white individuals. Arrest data collected by US law enforcement tends to show that, statistically, there is a strong correlation between skin colour and criminal activity. Law enforcement uses this data to stop, search and arrest members of the public, which leads to more arrests of non-white suspects; the dataset of non-white arrests grows and so the data bias becomes self-fulfilling.
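This feedback loop can be made concrete with a deliberately simplified simulation. The area names, starting counts and offence rate below are invented for illustration, not drawn from any real dataset: two areas have exactly the same underlying offence rate, but patrols are allocated in proportion to past arrests, so the initial imbalance in the data never washes out.

```python
# Toy model of a predictive-policing feedback loop.
# All numbers are invented for illustration; this is not real data.

TRUE_OFFENCE_RATE = 0.05          # identical in both areas
PATROLS_PER_DAY = 100

# Historical arrest data starts out skewed towards area A
arrests = {"A": 60.0, "B": 40.0}

for day in range(1000):
    total = arrests["A"] + arrests["B"]
    for area in arrests:
        # Patrols are allocated in proportion to past arrests...
        patrols = PATROLS_PER_DAY * arrests[area] / total
        # ...and each patrol yields arrests at the SAME true offence rate,
        # so the area that is policed more generates more arrest data.
        arrests[area] += patrols * TRUE_OFFENCE_RATE

share_a = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"Area A's share of arrests after 1,000 days: {share_a:.0%}")
# → Area A's share of arrests after 1,000 days: 60%
```

Even after 1,000 simulated days, area A still accounts for 60% of arrests. The system never discovers that the two areas are identical, because the data it learns from is generated by its own patrol decisions.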
During a broadcast on Thursday 16 May, Channel Four covered the issue of racist software in its flagship news programme. During the broadcast, Peter Eckersley, director of research at the Partnership on AI, was interviewed about the challenge of biased data. He said: “Many countries and states in the US have started using machine learning systems to make predictions that determine if people will reoffend, largely based on arrest data.” Eckersley pointed out that arrest data or conviction data is racially biased: the chance of being stopped, charged or convicted varies depending on your race and where you are located.
The datasets used are discriminatory. Joy Buolamwini, a computer scientist at the MIT Media Lab, who is exhibiting at the What makes us Human exhibition at London’s Barbican Centre, told Channel Four News presenter Jon Snow that some of the larger datasets are mainly based on samples of white men. “They are failing the rest of the world – the undersampled majority are not included.”
Computer Weekly recently spoke to Ruha Benjamin, an associate professor of African American studies at Princeton University, about discrimination in algorithms. Her new book, Race After Technology, which is out in June, explores the biases in algorithms. Algorithms with discriminatory designs are being used to automate decisions in the IT systems used in healthcare, law enforcement, financial services and education. They are used by officials to make decisions that affect people’s lives, health and freedom; their ability to get a loan, insurance or even a job. Such algorithmic bias can therefore have a detrimental effect on racial minorities. She said: “I want people to think about how automation allows the propagation of traditional biases – even if the machine seems neutral.”
Diversity in the workforce
The answer is not about hiring more people from diverse racial backgrounds. Benjamin’s research found that people’s backgrounds tend to take a back seat in the race for tech innovation. The values in the tech sector appear incompatible with diversity. Software tends to be released as fast as possible, with little thought given to its broader social impact.
While people generally recognise their own human bias, for Benjamin, outsourcing decisions to supposedly objective systems that have biased algorithms simply shifts the bias to the machine.
Roger Taylor, from the Centre for Data Ethics and Innovation told Channel Four News: “The problem is that AI is like holding a mirror up to the biases in human beings. It is hard to teach [AI algorithms] that the flaws they see are not the future we want to create.”
A demo at this week’s .Next conference in Anaheim gave a snapshot of how far modern VDI has come. Virtual desktops should no longer be considered just an option for users with modest computing needs. During the demo, Nikola Bozinovic, vice-president and general manager at Nutanix, showed how it was possible to mask out the infamous Starbucks cup scene in Game of Thrones, editing a 4K, 60 frames per second video clip in Adobe Premiere Pro streamed to his Chrome browser via Nutanix’s new Frame product. This enables the user interface of a Windows desktop to appear in an HTML5-capable browser.
The year of VDI
Bozinovic believes 2019 is the year VDI finally breaks through into the mainstream. Since Citrix pioneered the virtual desktop, it has been possible to stream the Windows user interface to thin client devices. Enterprise IT could create a gold image of the user’s desktop environment and provide secure access to it. But while it was certainly secure, and ensured a stable and robust desktop image could be streamed to users, it tended to be deployed to provide access to relatively simple applications, such as Excel spreadsheets. Since VDI tended to be used for lightweight access to PC resources, IT generally did not consider virtual desktops viable for mainstream desktop computing.
What has now changed? For Bozinovic, the light bulb moment happened on March 19, when Google unveiled Stadia, a 4K game-streaming platform. Bozinovic says Stadia shows how VDI can now make use of GPU (graphics processing unit) acceleration to stream computationally and graphically intensive games to players, direct to their web browser or via YouTube. Stadia is set to become the platform for next-generation VDI, using open source Vulkan GPU drivers to stream a new generation of graphics-intensive games running on AMD graphics chips. Dov Zimring, Google Stadia developer platform product lead, said: “Google and AMD share a commitment to open source with expertise in Vulkan, open-source Vulkan GPU drivers, and open-source graphics optimisation tools. We’re humbled by the spirit of innovation and collaboration that exists throughout the gaming industry and look forward to pioneering the future of graphics technology with game developers, in open source.”
What Stadia means for enterprise IT
For enterprise IT, this announcement represents the consumerisation of virtual desktops. No longer will VDI be regarded as suitable only for low-graphics applications; people will be streaming high-performance games at home and wondering why high-performance graphics applications can’t be streamed at work.
In a work environment, end users won’t require ultra-high-end engineering workstations to run computer-aided design, video editing or other GPU-intensive applications. They will, however, still require ultra-high-definition, colour-accurate displays with fast, flicker-free refresh rates to use graphics applications comfortably. But the raw processing power is available on the back end, and the user interface can be streamed.
If the gaming industry gets behind the idea of streaming GPU-intensive PC games, consumers will no longer need high-end gaming machines. Anyone with a fast connection will be able to access gaming streams. For gamers, the experience will be measured by the quality of their display, network latency, audio and input devices.
In the enterprise, the IT industry looks like it is warming to the idea of using a PC as a thin client for desktop as a service (DaaS). The basic hardware just needs to run a browser like Chrome really well. Any resources – whether CPU, GPU, memory, storage or network bandwidth – required to run resource-heavy PC applications can simply be dialled up and down as needed in the cloud, or via the admin console on the back-end VDI. Applications are no longer limited by the restrictions imposed by the physical PC hardware.
None of this is new. Thin clients have existed almost since the dawn of computing. One could argue a VT100 terminal provided a means to stream the text-based user interface of mainframe applications. Today, the reality is that Windows desktop applications are not going away. If Windows application developers start reworking their software to optimise it for VDI, desktop IT could be reimagined as a streamed service.
It has been a week of activism and a call for action, with Extinction Rebellion causing major disruption in London.
But alongside grassroots demonstrations, there appears to be a greater awareness in the corporate world that every business has a role to play in tackling the catastrophic environmental risk the world now faces.
On Monday, Legal and General Investment Management (LGIM), the UK’s largest institutional investment firm, announced that as part of its Climate Impact Pledge, it would not hold eight large global companies in its Future World funds. LGIM stated: “Where such companies are seen to take insufficient actions on climate risks, LGIM will also vote against the chairs of their boards, across the entire equity holdings.”
On Thursday, Bank of England governor Mark Carney published an open letter describing the findings of a new report in which global banks, central banks, supervisors and the financial community have signed up to a set of deliverable goals to help ensure a smooth transition to a low-carbon economy.
From a banking perspective, the focus is to build knowledge and share data such that the monitoring of climate-related financial risks is integrated into day-to-day supervisory work.
The promise of a “tech solution”
As Computer Weekly recently reported, PwC and Microsoft believe technology has an important role to play in helping businesses and governments address climate change and environmental issues. PwC’s new report, How AI can enable a sustainable future, looks at the use of AI-enabled smart technology across agriculture, water, energy and transport.
Examples include AI-infused clean distributed energy grids, precision agriculture, sustainable supply chains, environmental monitoring and enforcement, and enhanced weather and disaster prediction and response.
In the sectors it focused on, PwC estimated that AI applications could reduce global greenhouse gases by 4% – more than the current annual emissions of Australia, Canada and Japan combined.
The experts are in agreement that climate change will not only ruin the planet, kill off the polar bears along with millions of other species and result in unimaginable human suffering across the globe, but will also have a major impact on global business. It may not have the immediate impact of Extinction Rebellion, but the words of Mark Carney, the LGIM statement and PwC’s findings may actually resonate more with business leaders. And if PwC’s forecasts are accurate, AI has the potential to help every country meet the target of becoming carbon neutral by 2050, as set out in the Paris Agreement.