Once a new technology becomes mainstream, the industry needs to find a new thing to sell its customers. It’s been this way since the time IBM provided big metal boxes with integrated software and called the system a mainframe. The recently introduced z15 is the latest version of this approach to IBM computing, with a single, highly integrated enterprise server to run application workloads.
But vertical integration relies on highly resilient, high-performance hardware and software systems, and these tend to come at a high cost. The alternative is distributed computing, where workloads can be organised to run across farms of low-cost servers. These servers used to be physical, then they were virtualised, and now everyone in the industry talks about containers and using Kubernetes for container orchestration. Then there is the public cloud, which provides a template showing IT departments how to build internal IT as a service, mimicking the ease of provisioning of IaaS, PaaS and SaaS.
All of these things are welcome additions to the IT toolbox. But the issue the CIO faces is that enterprise IT is renowned for hoarding old tech: nothing ever really gets thrown away. Just consider the legacy acronyms from the last few decades, each of which promised to revolutionise the way software systems were developed. There was the Common Object Request Broker Architecture (Corba) from the Object Management Group, then the Service Oriented Architecture (SOA) and the Enterprise Service Bus (ESB). Each offered an industry vision for another three-letter acronym: EAI, or Enterprise Application Integration.
These days, EAI is often considered too complex and costly a programme to run. Closing the gaps between business processes running on different IT systems is now called digital transformation. And rather than people re-keying, or cutting and pasting, data from one IT system to another, Robotic Process Automation (RPA) does this re-keying under the control of a bot running a script.
What is fascinating is that the approaches being taken to achieve RPA are not dissimilar to how legacy green screen systems used to be given a shiny new Windows front-end that re-keyed user input into a terminal session, or how the early price comparison sites in the late 1990s screen-scraped pricing information.
The z15 may be the latest incarnation of what IT people used to call the legacy “mainframe” system. But legacy is embedded everywhere in modern IT. The DNA inside SOA, ESB and even RPA has much in common with modern cloud architectures.
A few months ago Automation Anywhere began collaborating with freelance software developer recruitment platform, Toptal, on robotic process automation in the human workforce. The concept is called the “digital worker”.
There appear to be a number of forces coming together. The so-called gig economy is showing that there are people happy to make themselves available for work on a zero-hours style of contract. And this trend is not limited to the likes of Uber and Deliveroo drivers. The nature of IT contracting is also changing. “People don’t want to work on a year-long contract. They may only want to use their technical skills for work one day a month,” says Nick Woodward, CEO of ETZ Payments, a company that aims to simplify payments for freelance workers.
Bola Rotibi, research director for software development at CCS Insight, believes demand for IT services offered through the gig economy may increase as a result of tech skills shortages. “They work well when the work needed is very standardised and well-scoped, so that there is no ambiguity and you don’t have to open up the company so that information flows out,” she said.
Robotise work, backfill with humans
Automation Anywhere provides a way to automate manual processes. But if this work cannot be accomplished through automation, it needs to be handed over to a real person. Today, RPA and hiring new talent are not directly connected. But imagine if the platform were smart enough to understand what IT work could be automated and where an IT job needed handing over to real people. The algorithm could then find a suitable IT contractor who met a predetermined set of requirements, similar to how Uber’s platform identifies the closest driver who is free to pick up a customer.
Such a scenario does not put much value on the human workforce. Neither do the terms “human capital management”, “headcount” or “man-hours”. However, if an RPA system cannot perform a task, but can identify a real person free to complete the job, then surely that has to be a good thing?
The algorithm only really has to do the job of an applicant tracking system in reverse: identify contractors with availability who have the right skillset. Perhaps add in weightings based on the sentiment of Checkatrade-style reviews from previous employers, and some smarts around recommendations. How hard can it be? The sad truth is that automated systems for identifying talent are often woefully inadequate.
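As a thought experiment, the core of such a matcher can be sketched in a few lines. Everything here is hypothetical – the fields, weights and scoring rule are illustrative assumptions, not the actual logic of Automation Anywhere, Toptal or any real platform:

```python
from dataclasses import dataclass

@dataclass
class Contractor:
    name: str
    skills: set              # e.g. {"python", "sql"} -- hypothetical skill tags
    available: bool
    review_sentiment: float  # 0.0-1.0, assumed to be distilled from past reviews

def match_score(c: Contractor, required: set, sentiment_weight: float = 0.3) -> float:
    """Score a contractor by skill coverage, boosted by review sentiment."""
    if not c.available:
        return 0.0
    coverage = len(c.skills & required) / len(required)
    # Multiplicative boost, so someone with none of the required skills scores zero
    return coverage * (1 + sentiment_weight * c.review_sentiment)

def best_match(contractors, required):
    """Return the highest-scoring available contractor, or None if nobody fits."""
    ranked = sorted(contractors, key=lambda c: match_score(c, required), reverse=True)
    return ranked[0] if ranked and match_score(ranked[0], required) > 0 else None
```

In practice, of course, the hard part is not the ranking function but sourcing trustworthy skills and sentiment data in the first place.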
Gartner’s latest magic quadrant report on cloud infrastructure puts IBM in the bottom-left quadrant, as a niche player, alongside Oracle and Alibaba. IBM is hoping the $34bn it has spent on buying Red Hat will propel its cloud infrastructure strategy.
IBM’s core enterprise customers do not necessarily expect Big Blue to sell them the latest, greatest tech. They often see IBM as a trusted partner, providing systems that support core business functions. As the saying goes: “No one ever got fired for buying IBM.” One could argue this has left those companies that entrust IBM with their technology decisions on a slower tech adoption curve. Yes, IBM has blockchain, Watson for deep learning and a quantum computing roadmap, but it takes a long time for the culture of a large enterprise to change. And this slowness to adapt arguably affects both its customers and IBM itself.
IBM used to be good at reinventing itself: from a hardware business to a software business and then a services business. Each change has seen IBM offer its traditional enterprise customers a way to evolve their IT function. The cloud is a phenomenon traditional enterprises have been slow to adopt, and IBM has made several attempts at providing an IBM cloud for these customers. “IBM appeals to its existing customers who have a strong preference to purchase most of their technology from IBM,” Gartner noted in its magic quadrant for cloud infrastructure. From a cloud infrastructure perspective, Gartner sees the IBM cloud as an option for “Mode 1” IT, which consists mainly of systems-of-record type workloads.
The Red Hat acquisition represents a $34bn gamble: will those businesses that have stuck with IBM remain wedded to a traditional approach to IT, or will they eventually shift the majority of their enterprise workloads onto the public cloud? If they choose the latter, AWS, Azure and Google – rated as leaders by Gartner – may, arguably, offer more fully formed public cloud strategies. But, for IBM, there is a $350bn opportunity – so long as it can convince these organisations that it is the safest pair of hands to make them cloud-ready.
What does a modern desktop look like? For the last few decades since the desktop PC established its place in offices, the role of desktop IT has been to provide end-users with access to corporate approved software and enterprise data, file servers and printers.
But the ease with which end users can create and distribute information has introduced its own set of problems. For instance, the success of email has made it far too easy for several revisions of the same document to exist on the corporate network. Logically, it makes sense to deploy cloud-based storage. This enables colleagues to collaborate and share information via the cloud, such that a single version of a document can be shared, changes can be tracked and people can comment on revisions. But if one user in the team continues to use email, this often determines how other colleagues communicate.
The road ahead for the CIO and IT department will involve showing the tech laggards in the business how new cloud-based tools simplify traditional email-based workflows.
Change how people use PCs for work
It’s about providing people with the right technology to deliver a good level of employee experience, such that they realise there are benefits in adapting old working practices.
In 2016, MIT’s Building business value with employee experience report found that companies with scores in the top quartile of employee experience were twice as innovative as those in the bottom quartile, based on the percentage of revenue generated from new products and services in the previous two years.
Focus on a good employee experience
Insight’s recent Employee Experience report found that a lack of engagement and the unavailability of useful, user-friendly technology in organisations are having a negative impact on staff productivity. Similarly, a YouGov survey for Avanade found that UK companies are failing to realise the value from their workplace technology investments.
MIT does not believe CIOs should adopt a command-and-control strategy to deploy new technology that improves employee experience. But CIOs face a credibility issue. End users often regard IT as a blocker, preventing them from achieving what they need to do. Introducing new tools just adds to what they already have to put up with: multiple logins, the need to re-key information between different applications, and sluggish, outdated PC hardware.
The end of support for Windows 7 on January 14, 2020 provides CIOs with a compelling reason to change end-user desktop computing practices.
“Once you put someone in orbit, it is a lot harder to fix it if something goes wrong,” says Paul Kostek, a senior Institute of Electrical and Electronics Engineers (IEEE) member and senior systems expert at Base2.
But things do go wrong. About 1:08:53 into the BBC’s excellent documentary, Eight days: To the moon and back, Neil Armstrong, Buzz Aldrin and mission control in Houston realise that the toggle for the engine arm circuit breaker on the Lunar Module had broken off. Without it, they would be stranded on the Moon. Nasa engineers worked through the night investigating how to bypass the switch, but Aldrin figured out he only needed to make sure the electrical contacts in the breaker touched, and used a felt-tip pen to achieve this, enabling the two astronauts to take off from the Moon’s surface and ultimately return to Earth safely.
Computers today are far more clever than the ancient hardware that put man on the Moon, but there is a very real sense that the greater the level of automation, the less human operators are kept in the loop and able to understand how to bypass the systems. The tragedies of Lion Air Flight 610 and Ethiopian Airlines Flight 302 are a testament to the fact that automation can make mistakes, and that the overrides enabling humans to take back control must be simple and well understood.
Designing for humans
Kostek says that there are lots of discussions in commercial aviation about pilots needing to know what to do if there is a problem with the flight systems. But without any relevant, contextual feedback, the pilots may not have all the information they need to overcome the issue the flight system encountered.
Automation is moving beyond space exploration and aircraft flight control towards augmenting vehicles, as a stepping stone to fully autonomous cars. Thanks to GPS, many people have lost common sense, and will blindly follow instructions from a satnav that takes them down an impassable route. Automation is a double-edged sword. How different the Apollo mission could have turned out had Buzz Aldrin not stepped up to the mark and taken the initiative.
The recent spate of band reunions and Glastonbury illustrates the longevity of rock music.
But the software industry seems to be stuck on finding the next big thing. Who would ever describe Windows 3.1 as a “classic”? Yet there are some products that somehow get the basics just right, and whose later releases never really advance the state of the art.
The question is how one goes about predicting which technologies will have longevity. There is a lot of industry hype, and it is easy today to jump on the artificial intelligence bandwagon or focus on Bitcoin or the internet of things. But a quarter of a century ago, there was no such thing as blockchain or Bitcoin, and the closest people got to artificial intelligence was at the movies.
Macros and DOS configurations
The big thing then was WordPerfect and config.sys files.
Back in those days, Bruce Momjian, co-founder of the PostgreSQL Global Development Group and chief database architect at EnterpriseDB, worked as a Unix admin at a law firm, and he recalls a story about one of the IT people who also worked there. “There was one guy who created config.sys files and he was an expert in WordPerfect. I’m sure he does nothing with config.sys and WordPerfect macros today. It’s just about trying to figure out the hacks and it is not foundational. While what I learnt as a Unix admin is still relevant.”
He says there is a risk that chasing after what’s hot today may mean you end up with a skill that is not so hot in the future.
“If I wanted to be successful, why would I spend a bunch of years playing with a free piece of software? It doesn’t make any sense. I should have been working on WordPerfect macros.”
Doing practical things works in the short term, but he urges young IT folk to “do the crazy thing” and really look at how they should invest in their long-term careers. “It takes a certain insanity and a certain disregard of practicality. It’s not about looking at the technology in order to try to get something done today, but instead understanding the full stack. I say, ignore today and look to find answers just because you want to find an answer.”
Why system tools are boring
And while everyone knows Linus Torvalds, in general, says Momjian: “If you are a creator of an infrastructure tool, you sit in an office and maybe you’re at a conference once every other month.” He argues that no IT decision maker really plans their IT strategy around a scripting language, a compiler or a text editor, or bases it around some of the virtualisation tools out there. “They are interesting, but not a core part of a business process in organisations,” he says.
But compared to the early 1990s, when Momjian was a Unix admin, proprietary Unix systems are now on life support. Compare the proprietary Unix vendors to the likes of Microsoft and Oracle, which are still selling relational databases. Since the early 2000s, Momjian has been a database man. “There are a lot of people who find databases really interesting,” he adds.
For Momjian, the database industry is a good one to be in. And there are some people in the open source community who are jetted around the world to speak to thousands of delegates about their contributions to database technologies.
For Momjian, these are the true rock stars of the software industry.
There are certain functions in database administration that are good candidates for automation. These are the nuts and bolts of the job – how to partition the storage, what happens if a disk fails, or determining which physical disk data needs to be stored on. Some of these more mundane tasks no longer require as much manual intervention, because software has evolved to the point where the machine can take care of them. But this automation is not the same as autonomous database management.
Computer Weekly recently met up with Ravi Mayuram, senior vice president of products and engineering at Couchbase. Mayuram likens the current state of database administration to the automatic gearbox in a car. Clearly, the automatic gearbox is a long way from the fully autonomous vehicle, but it offers a degree of convenience, which is where the automation now possible in database management has evolved to. For Mayuram, a fully autonomous database management system will be able to fix itself; a fully autonomous database will more or less drive itself. But he reckons there will always be situations where humans will be needed.
Automation is nothing new for IT folk. Managing operating systems and software infrastructure used to be achieved using scripting. But Mayuram believes the reason full automation is taking a while to become part of the DBA’s toolset is that databases are unique, due to the way the relational database was originally engineered in the 1970s. “The whole database architecture is a bit like a car with a manual gearbox. It just cannot be automated,” says Mayuram. As such, it has provided gainful employment for the DBAs in many organisations.
Self-driving database management
But, in time, automation of database management will become possible, and DBAs will no longer have the role of tinkering with the relational database system.
What will the expert DBA spend their time doing when database admin tasks are automated?
Mayuram says: “With any change, it’s myopic to look at how many jobs will be eliminated. All the DBAs will not be eliminated, because data is more essential now than it has ever been before.” He believes their job description will change, such that DBAs will have a more pivotal role to play in helping the business run faster. The DBA will be the person capable of reducing the complexity of providing access to the enterprise’s data in the right format, at the time the business requires information to make decisions.
Last week, shares in the tech giants plummeted following reports that US regulators from the Department of Justice and the Federal Trade Commission were looking at investigating their businesses.
On the one hand, this may be good for consumers, since in the technological era the web giants have created, people’s data and online browsing histories are treated as commodities that can be traded.
But the global internet platforms also bring people together in ways that no single government can ever hope to achieve. It is this power to enable people to rally around a particular ideology or campaign that makes the internet both a source of power and of danger for governments.
Make the world a better place through tech
Many believe technology can become a driving force for good when governments lack the appetite to tackle complex societal, sustainability and climate change issues. Social entrepreneur Paul Polizzolto is CEO of Givewith, an organisation that provides a platform linking business activities to social impacts. Within the business-to-business market, Polizzolto believes social impact can be more valuable to a buyer than a negotiated best price. It can differentiate one seller from another, and it delivers shareholder value to both the seller and the buyer. “Businesses have an opportunity to make ethics a core value,” he said during a sustainability summit organised by SAP Ariba.
As Computer Weekly has previously reported, human rights lawyer Amal Clooney believes businesses have an opportunity to step in and fill a void when governments do not live up to expectations. Speaking at the SAP Ariba Live conference in Barcelona at the end of May, she said: “We want to harness people like you in this room to solve public sector problems. There is a great opportunity.”
Access to accurate data can help organisations achieve sustainability targets and support climate change initiatives. Speaking at the sustainability summit, Sebastian Ociepka, head of business intelligence at airline group IAG, discussed how the airline industry can use big data to gain proper insight across the supply chain, which can help reduce CO2 emissions. “If all passengers put 1kg less in their suitcase, it would save 30,000 tonnes of CO2 a year,” he said.
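That claim is easy to sanity-check with a back-of-envelope calculation. The figures below are illustrative assumptions – the annual passenger numbers and the fuel burned per kilogram carried are guesses, not IAG data – with only the 3.16kg of CO2 per kilogram of jet fuel being a widely used conversion factor:

```python
# Back-of-envelope check of the "1kg lighter suitcase" claim.
passengers_per_year = 115e6   # assumed annual passengers for a large airline group
fuel_per_kg_payload = 0.08    # assumed kg of fuel burned per kg carried, per journey
co2_per_kg_fuel = 3.16        # kg of CO2 per kg of jet fuel burned (standard factor)

saved_tonnes = passengers_per_year * 1.0 * fuel_per_kg_payload * co2_per_kg_fuel / 1000
print(f"~{saved_tonnes:,.0f} tonnes of CO2 per year")
```

Under those assumptions, the result lands in the same ballpark as the 30,000-tonne figure quoted, which suggests the claim is at least plausible.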
Ethics and sustainability in global supply chains
Across global supply chains, businesses have an opportunity to tackle modern-day slavery, reduce CO2 in their supply chains and operate in a more sustainable fashion. Angel Pes, president of the UN Global Compact in Spain, also spoke at the SAP Ariba sustainability summit. Pes warned that the biggest risk in supply chains relates to human rights. This, he said, is the most pressing issue for multinationals. For Pes, the other factor global supply chains need to consider is environmental risk. “This requires systems of control, auditing and taking tough decisions to change suppliers if they are not committed to global principles.”
BSR is a not-for-profit organisation working on building sustainable businesses. During the sustainability summit, its managing director, Tara Norton, discussed the challenge of operating a supply chain that promotes ethical values. She said: “We need to think about changing the dynamics in the supply chain. What is the incentive for suppliers? Yes, there is the global agenda, and yes, there is value for big companies, but what about SMEs?” For Norton, adoption of ethical and sustainable practices is not going to happen unless everyone in the supply chain can benefit.
An ethical and sustainable supply chain will be most effective if it is driven from the bottom up by businesses working towards common goals, rather than top down through multinational alliances. As the US regulators start to investigate the business practices of the internet giants, a key question for internet users is whether the benefit they themselves receive from these services is worth the price of their data and internet privacy. But there is an equally valid question that must be answered: do we believe these platforms help engender a better society than one driven by a political agenda?
The recent story reported in Computer Weekly about Revlon’s SAP woes illustrates the importance of tech due diligence. Its CTO, Chau Banks, who previously worked as CIO at New York & Company, is responsible for the company’s global technology strategy. But she joined the company in January 2018, just a month before the new SAP project started going wrong.
During a quarterly earnings call in March 2019, Revlon’s CFO, Victoria Dolan, told financial analysts that a supply chain issue the company experienced in February 2018, after the implementation of a new SAP system, was now fully resolved. But that did not stop its share price from sliding.
Shareholders blame Revlon
A number of US law firms have now announced class action lawsuits on behalf of shareholders, alleging that Revlon failed to prepare properly for the disruption caused when it deployed the SAP ERP system at its Oxford facility in North Carolina in February 2018. For its Q1 2018 earnings, posted in May 2018, the company reported revenue of $560.7m, missing its target by $31.9m.
According to the transcript of the earnings call posted on Seeking Alpha, the company’s chief operating officer, Christopher Peterson, admitted sales and gross margins were directly impacted by the SAP disruption.
When has an ERP project ever been flawless?
When asked about the timing of the disruption, Peterson said: “Honestly, it’s just we did not expect to have the issues on the facility that we did. So in hindsight, the phasing looks a little off, but the reality is we were expecting to execute flawlessly on the SAP situation at Oxford.”
According to the transcript on Seeking Alpha, he had previously told analysts during the Q4 2017 earnings call that implementation of the new SAP ERP system, which was designed to support new customer technologies and processes and improve performance, was “on schedule”.
When Computer Weekly looked at the company’s Securities and Exchange Commission (SEC) quarterly filings, it was interesting to see that SAP was only mentioned after the disruption. While implementing ERP systems is not a core business activity for the cosmetics giant, the unfolding story illustrates just how much of a business impact a problematic ERP implementation can have. Which significant IT project has ever been executed flawlessly?
The issue of racist bias encoded in software made mainstream news last week, with a report on Channel Four News highlighting how software for profiling criminal suspects tends to have racial biases.
Such software relies on a dataset that is weighted against non-white individuals. Arrest data collected by US law enforcement tends to show that, statistically, there is a strong correlation between skin colour and criminal activity. Law enforcement uses this data to stop, search and arrest members of the public, which leads to more arrests of non-white suspects; the dataset of non-white arrests grows, and so the data bias becomes self-fulfilling.
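This feedback loop is simple enough to simulate. The sketch below uses made-up numbers: two groups with identical true offending rates, where patrols each round are allocated in proportion to each group’s share of past arrests, so a small initial imbalance in the records persists and compounds in the data:

```python
import random

random.seed(42)

def simulate(rounds=50, patrols_per_round=1000, true_offence_rate=0.1):
    """Both groups offend at the same true rate; only scrutiny differs.
    Patrols each round are split in proportion to past arrest counts."""
    arrests = {"A": 55, "B": 45}  # small initial imbalance in the records
    for _ in range(rounds):
        total = sum(arrests.values())
        new = {}
        for group, count in arrests.items():
            # more recorded arrests -> more patrols -> more recorded arrests
            patrols = round(patrols_per_round * count / total)
            new[group] = sum(random.random() < true_offence_rate
                             for _ in range(patrols))
        for group in arrests:
            arrests[group] += new[group]
    return arrests

final = simulate()
print(final)
```

Despite identical behaviour, group A ends up with far more recorded arrests than group B, and the absolute gap between them widens every round – the dataset “confirms” the very bias that produced it.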
During a broadcast on Thursday 16 May, Channel Four covered the issue of racist software in its flagship news programme. During the broadcast, Peter Eckersley, director of research at the Partnership on AI, was interviewed about the challenge of biased data. He said: “Many countries and states in the US have started using machine learning systems to make predictions that determine if people will reoffend, largely based on arrest data.” Eckersley pointed out that arrest and conviction data is racially biased: the chance of being stopped, charged or convicted varies depending on your race and where you are located.
The datasets used are discriminatory. Joy Buolamwini, a computer scientist at MIT Media Lab who is exhibiting at the What makes us Human exhibition at London’s Barbican Centre, told Channel Four News presenter Jon Snow that some of the larger datasets are mainly based on samples of white men. “They are failing the rest of the world – the under-sampled majority are not included.”
Computer Weekly recently spoke to Ruha Benjamin, an associate professor of African American studies at Princeton University, about discrimination in algorithms. Her new book, Race After Technology, which is out in June, explores the biases in algorithms. Algorithms with discriminatory designs are being used to automate decisions in the IT systems used in healthcare, law enforcement, financial services and education. They are used by officials to make decisions that affect people’s lives, health and freedom; their ability to get a loan, insurance or even a job. Such algorithmic bias can therefore have a detrimental effect on racial minorities. She said: “I want people to think about how automation allows the propagation of traditional biases – even if the machine seems neutral.”
Diversity in the workforce
The answer is not simply about hiring more people from diverse racial backgrounds. Benjamin’s research found that people’s backgrounds tend to take a back seat in the race for tech innovation. The values of the tech sector appear incompatible with diversity: software tends to be released as fast as possible, with little thought given to its broader social impact.
While people generally recognise their own human bias, for Benjamin, outsourcing decisions to supposedly objective systems that have biased algorithms simply shifts the bias to the machine.
Roger Taylor, from the Centre for Data Ethics and Innovation, told Channel Four News: “The problem is that AI is like holding a mirror up to the biases in human beings. It is hard to teach [AI algorithms] that the flaws they see are not the future we want to create.”