Cliff Saran’s Enterprise blog


July 9, 2019  8:47 AM

How to be an IT rock star

Cliff Saran

The recent spate of band reunions and Glastonbury illustrates the longevity of rock music.

But the software industry seems to be stuck on finding the next big thing. Who would ever describe Windows 3.1 as a “classic”? Yet there are some products that somehow get the basics just right, and later releases do little to advance the underlying technology.

The question is how to predict which technologies will have longevity. There is a lot of industry hype, and it is easy today to jump on the artificial intelligence bandwagon or focus on Bitcoin or the internet of things. But a quarter of a century ago, there was no such thing as blockchain or Bitcoin, and the closest people got to artificial intelligence was at the movies.

Macros and DOS configurations

The big things then were WordPerfect and config.sys files.

Back in those days, Bruce Momjian, co-founder of the Postgres Core Development Group and chief database architect at EnterpriseDB, worked as a Unix admin at a law firm, and he recalls a story about one of the IT people who also worked there. “There was one guy who created config.sys files and he was an expert in WordPerfect. I’m sure he does nothing with config.sys and WordPerfect macros today. It’s just about trying to figure out the hacks and it is not foundational. While what I learnt as a Unix admin is still relevant.”

He says there is a risk that chasing after what’s hot today may mean you end up with a skill that is not so hot in the future.

“If I wanted to be successful, why would I spend a bunch of years playing with a free piece of software? It doesn’t make any sense. I should have been working on WordPerfect macros.”

Doing practical things works in the short term, but he urges young IT folk to “do the crazy thing” and really look at how they should invest in their long-term careers. “It takes a certain insanity and a certain disregard of practicality. It’s not about looking at the technology in order to try to get something done today, but instead understanding the full stack. I say, ignore today and look to find answers just because you want to find an answer.”

Why system tools are boring

And while everyone knows Linus Torvalds, in general, says Momjian: “If you are a creator of an infrastructure tool, you sit in an office and maybe you’re at a conference once every other month.” He argues that no IT decision maker really plans their IT strategy around a scripting language, a compiler or a text editor, nor bases it on the virtualisation tools out there. “They are interesting, but not a core part of a business process in organisations,” he says.

But compared to the early 1990s when Momjian was a Unix admin, proprietary Unix systems are on life support. Compare the proprietary Unix vendors to the likes of Microsoft and Oracle, who are still selling relational databases. Since the early 2000s, Momjian has been a database man. “There is a lot of people who find databases really interesting,” he adds.

For Momjian, the database industry is a good industry to be in. And there are some people in the open source community who jet around the world to speak to thousands of delegates about their contribution to database technologies.

For Momjian, these are the true rock stars of the software industry.

June 25, 2019  1:45 PM

Why database admins should embrace automation

Cliff Saran

There are certain functions in database administration that are good candidates for automation. These are the nuts and bolts of the job – partitioning the storage, handling a disk failure or determining which physical disk data needs to be stored on. Some of these more mundane tasks no longer require as much manual intervention, because software has evolved to the point where the machine can take care of them. But this automation is not the same as autonomous database management.
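To make that concrete, the kind of nuts-and-bolts housekeeping described above is already easy to script. The sketch below is purely illustrative – the mount points and threshold are assumptions, not settings from any particular database product – and simply flags data volumes that are filling up, the sort of chore automation increasingly handles without manual intervention.

```python
# A minimal, illustrative sketch of "nuts and bolts" housekeeping that can be
# scripted: flag data volumes that are filling up. The mount points and threshold
# are assumptions for the example, not settings from any particular database product.
import shutil

DATA_VOLUMES = ["/var/lib/postgresql", "/backups"]  # hypothetical mount points
USAGE_THRESHOLD = 0.85                              # alert when a volume is >85% full

def check_volumes(volumes, threshold):
    alerts = []
    for mount in volumes:
        usage = shutil.disk_usage(mount)
        used_fraction = usage.used / usage.total
        if used_fraction > threshold:
            alerts.append(f"{mount} is {used_fraction:.0%} full")
    return alerts

if __name__ == "__main__":
    for alert in check_volumes(DATA_VOLUMES, USAGE_THRESHOLD):
        # In practice this would page an operator or kick off an automated clean-up job.
        print("ALERT:", alert)
```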

Computer Weekly recently met up with Ravi Mayuram, senior vice president of products and engineering at Couchbase. Mayuram likens the current state of database administration to the automatic gearbox in a car. Clearly the automatic gearbox is a long way from the fully autonomous vehicle, but it offers a degree of convenience, which is roughly where the automation now possible in database management has got to. For Mayuram, a fully autonomous database management system will be able to fix itself; it will more or less drive itself. But he reckons there will always be situations where humans will be needed.

Automation is nothing new for IT folk. Managing operating systems and software infrastructure used to be achieved using scripting. But Mayuram believes the reason full automation is taking a while to become part of the DBA’s toolset is that databases are unique, due to the way the relational database was originally engineered in the 1970s. “The whole database architecture is a bit like a car with a manual gearbox. It just cannot be automated,” says Mayuram. As such, it has provided gainful employment for DBAs in many organisations.

Self-driving database management

But, in time, automation of database management will be possible, and DBAs will no longer spend their days tinkering with the relational database system.

What will the expert DBA spend their time doing, when database admin tasks are automated?

Mayuram says: “With any change, it’s myopic to look at how many jobs will be eliminated. All the DBAs will not be eliminated because data is more essential now than it has ever been before.” He believes their job description will change such that DBAs will have a more pivotal role to play in helping the business run faster. The DBA will be the person capable of reducing the complexity of providing access to the enterprise’s data, in the right format, at the time the business needs information to make decisions.


June 10, 2019  10:48 AM

Why tech should lead when governments lag

Cliff Saran

Last week shares in the tech giants plummeted following reports that US regulators from the Department of Justice and Federal Trade Commission were looking at investigating their businesses.

On the one hand, this may be good for consumers, since in the technological era the web giants have created, people’s data and online browsing histories are treated as commodities that can be traded.

But the global internet platforms also bring people together in ways that no single government can ever hope to achieve. It is this power to enable people to rally around a particular ideology or campaign that makes the internet both a source of power and a danger for governments.

Make the world a better place through tech

Many believe technology can become a driving force for good when governments lack the appetite to tackle complex societal, sustainability and climate change issues. Social entrepreneur Paul Polizzolto is CEO of Givewith, an organisation that provides a platform linking business activities to social impacts. Within the business-to-business market, Polizzolto believes social impact can be more valuable to a buyer than a negotiated best price. It can differentiate one seller from another and also deliver shareholder value to both the seller and the buyer. “Businesses have an opportunity to make ethics a core value,” he said, during a sustainability summit organised by SAP Ariba.

As Computer Weekly has previously reported, human rights lawyer, Amal Clooney believes businesses have an opportunity to step in and fill a void when governments do not live up to expectations. Speaking at the SAP Ariba Live conference in Barcelona at the end of May, she said: “We want to harness people like you in this room to solve public sector problems. There is a great opportunity.”

Access to accurate data can help organisations achieve sustainability targets and support climate change initiatives. Speaking at the sustainability summit, Sebastian Ociepka, head of business intelligence at airline group IAG, discussed how the airline industry can use big data to gain proper insight across the supply chain, which can help reduce CO2 emissions. “If all passengers put 1 kg less in their suitcase, it would save 30,000 tonnes of CO2 a year,” he said.
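That 30,000-tonne figure is worth a quick sanity check. The back-of-envelope calculation below uses rough, assumed round numbers rather than anything IAG has published, and it lands in the same ballpark, which suggests the claim is plausible as an order of magnitude.

```python
# Back-of-envelope check of the "1 kg less per suitcase" claim.
# Every number below is a rough assumption for illustration, not an IAG figure.
passengers_per_year = 113e6     # IAG carried roughly this many passengers in 2018
avg_flight_hours = 3.0          # assumed average sector length
fuel_kg_per_kg_per_hour = 0.03  # assumed ~3% of extra mass burned as fuel per hour aloft
co2_kg_per_kg_fuel = 3.16       # standard emissions factor for jet fuel

fuel_saved_per_passenger = 1.0 * avg_flight_hours * fuel_kg_per_kg_per_hour  # kg of fuel
co2_saved_kg = passengers_per_year * fuel_saved_per_passenger * co2_kg_per_kg_fuel

print(f"Roughly {co2_saved_kg / 1000:,.0f} tonnes of CO2 a year")
# -> on the order of 30,000 tonnes, in line with the figure quoted above
```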

Ethics and sustainability in global supply chains

Across global supply chains, businesses have an opportunity to tackle modern day slavery, reduce CO2 in their supply chain and operate in a more sustainable fashion. Angel Pes, president of the UN Global Compact in Spain, also spoke at the SAP Ariba sustainability summit. Pes warned that the biggest risk in supply chains relates to human rights. This, he said, is the most pressing issue for multinationals. For Pes, the other factor global supply chains need to consider is environmental risk. “This requires systems of control, auditing and taking tough decisions to change suppliers if they are not committed to global principles.”

BSR is a not-for-profit organisation working on building sustainable businesses. During the sustainability summit, its managing director, Tara Norton, discussed the challenge of operating a supply chain that promotes ethical values. She said: “We need to think about changing the dynamics in the supply chain. What is the incentive for suppliers? Yes there is the global agenda, and yes there is value for big companies, but what about SMEs?” For Norton, adoption of ethical and sustainability practices is not going to happen unless everyone in the supply chain can benefit.

An ethical and sustainable supply chain will be most effective if it is driven from the bottom up by businesses working towards common goals, rather than top down through multinational alliances. As the US regulators start to investigate the business practices of the internet giants, a key question for internet users is whether the benefit they themselves receive from these services is worth the price of their data and internet privacy. But there is an equally valid question that must be answered. Do we believe these platforms help engender a better society than one driven by a political agenda?


June 3, 2019  7:22 PM

Take flawless IT project execution with a pinch of salt

Cliff Saran

The recent story reported in Computer Weekly about Revlon’s SAP woes illustrates the importance of tech due diligence. Its CTO, Chau Banks, who previously worked as CIO at New York & Company, is responsible for the company’s global technology strategy. But she joined the company in January 2018, just a month before the new SAP project started going wrong.

During a quarterly earnings call in March 2019, Revlon’s CFO, Victoria Dolan, told financial analysts that a supply chain issue the company experienced in February 2018, after the implementation of a new SAP system, was now fully resolved. But that did not stop its share price from sliding.

Shareholders blame Revlon

A number of US law firms have now announced class action lawsuits on behalf of shareholders, alleging that Revlon failed to prepare properly for the disruption caused when it deployed the SAP ERP system at its Oxford facility in North Carolina in February 2018. For its Q1 2018 earnings, posted in May 2018, the company reported revenue of $560.7m, missing its target by $31.9m.

According to the transcript of the earnings call posted on Seeking Alpha, the company’s chief operating officer, Christopher Peterson, admitted sales and gross margins were directly impacted by the SAP disruption.

When has an ERP project ever been flawless?

When asked about the timing of the disruption, Peterson said: “Honestly, it’s just we did not expect to have the issues on the facility that we did. So in hindsight, the phasing looks a little off, but the reality is we were expecting to execute flawlessly on the SAP situation at Oxford.”

According to the transcript on Seeking Alpha, he had previously told analysts during the Q4 2017 earnings call that implementation of the new SAP ERP system, which was designed to support new customer technologies and processes and improve performance, was “on schedule”.

When Computer Weekly looked at the company’s Securities and Exchange Commission (SEC) quarterly filings, it was interesting to see that SAP was only mentioned after the disruption. While implementing ERP systems is not a core business activity for the cosmetics giant, the unfolding story illustrates just how much of a business impact a problematic ERP implementation can have. Which significant IT project has ever been executed flawlessly?


May 20, 2019  2:08 PM

How racist bias is embedded in software systems

Cliff Saran

The issue of racist bias encoded in software made mainstream news last week, with a report on Channel Four News highlighting how software for profiling criminal suspects tends to have racial biases.

Such software relies on datasets that are weighted against non-white individuals. Arrest data collected by US law enforcement tends to show that, statistically, there is a strong correlation between skin colour and criminal activity. Law enforcement agencies use this data to stop, search and arrest members of the public, which leads to more arrests of non-white suspects; the dataset of non-white arrests grows and so the data bias becomes self-fulfilling.
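The self-fulfilling nature of that loop is easy to demonstrate with a toy simulation, sketched below with entirely made-up numbers: two groups offend at exactly the same rate, but one is stopped twice as often to begin with, and each year’s stops are re-targeted using the arrest data already collected, so the disparity in the data persists even though behaviour is identical.

```python
# Toy simulation of the feedback loop described above. All numbers are made up.
import random

random.seed(42)

POPULATION = 100_000
TRUE_OFFENCE_RATE = 0.05                 # identical for both groups
stop_rate = {"A": 0.10, "B": 0.20}       # assumed initial policing intensity
arrests = {"A": 0, "B": 0}

for year in range(10):
    for group in ("A", "B"):
        stops = int(POPULATION * stop_rate[group])
        # Arrests can only come from stops, so the more heavily policed group
        # generates more arrest records even though behaviour is identical.
        arrests[group] += sum(random.random() < TRUE_OFFENCE_RATE for _ in range(stops))
    # Feedback: next year's stops are allocated in proportion to the arrest data,
    # which is exactly how the initial disparity becomes self-perpetuating.
    total_arrests = arrests["A"] + arrests["B"]
    stop_rate = {g: 0.30 * arrests[g] / total_arrests for g in ("A", "B")}

print(arrests)  # group B accumulates roughly twice the arrests, despite identical offending
```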

During a broadcast on Thursday 16 May, Channel Four covered the issue of racist software on its flagship news programme. During the broadcast, Peter Eckersley, director of research at the Partnership on AI, was interviewed about the challenge of biased data. He said: “Many countries and states in the US have started using machine learning systems to make predictions that determine if people will reoffend, largely based on arrest data.” Eckersley pointed out that arrest data or conviction data is racially biased: the chance of being stopped, charged or convicted varies depending on your race and where you are located.

The datasets used are discriminatory. Joy Buolamwini, a computer scientist at MIT Media Lab, who is exhibiting at the What makes us Human exhibition at London’s Barbican Centre, told Channel Four News presenter Jon Snow that some of the larger datasets are mainly based on samples of white men. “They are failing the rest of the world – the under sampled majority are not included.”

Discriminatory design

Computer Weekly recently spoke to Ruha Benjamin, an associate professor of African American studies at Princeton University, about discrimination in algorithms. Her new book, Race After Technology, which is out in June, explores the biases in algorithms. Algorithms with discriminatory design baked in are being used to automate decisions in the IT systems used in healthcare, law enforcement, financial services and education. They are used by officials to make decisions that affect people’s lives, health and freedom, and their ability to get a loan, insurance or even a job. Such algorithmic bias can therefore have a detrimental effect on racial minorities. She said: “I want people to think about how automation allows the propagation of traditional biases – even if the machine seems neutral.”

Diversity in the workforce

The answer is not about hiring more people from diverse racial backgrounds. Benjamin’s research found that people’s backgrounds tend to take a back seat in the race for tech innovation. The values in the tech sector appear incompatible with diversity. Software tends to be released as fast as possible, with little thought given to its broader social impact.

While people generally recognise their own human bias, for Benjamin, outsourcing decisions to supposedly objective systems built on biased algorithms simply shifts the bias to the machine.

Roger Taylor, from the Centre for Data Ethics and Innovation, told Channel Four News: “The problem is that AI is like holding a mirror up to the biases in human beings. It is hard to teach [AI algorithms] that the flaws they see are not the future we want to create.”


May 9, 2019  5:32 PM

VDI reimagined

Cliff Saran

A demo at this week’s .Next conference in Anaheim gave a snapshot of how far modern VDI has come. Virtual desktops should not be considered just an option for users with low computing needs. During the demo, Nikola Bozinovic, vice president and general manager at Nutanix, showed how it was possible to mask out the infamous Starbucks cup scene in Game of Thrones, editing a 4K, 60 frames per second video clip in Adobe Premiere Pro streamed to his Chrome browser via Nutanix’s new Frame product. This enables the user interface of a Windows desktop to appear in an HTML5-capable browser.

The year of VDI

Bozinovic believes 2019 is the year VDI finally breaks through into the mainstream. Since Citrix pioneered the virtual desktop, it has been possible to stream the Windows user interface to thin client devices. Enterprise IT could create a gold image of the user’s desktop environment and provide secure access to it. But while this approach was certainly secure, and ensured a stable and robust desktop image could be streamed to users, it tended to be deployed for relatively simple applications, such as access to Excel spreadsheets. Since VDI tended to be used for lightweight access to PC resources, IT generally did not consider virtual desktops viable for mainstream desktop computing.

What has now changed? For Bozinovic, the light bulb moment happened on March 19, when Google unveiled Stadia, a 4K game-streaming platform. Bozinovic says Stadia shows how VDI can now make use of GPU (graphics processing unit) acceleration to stream computationally and graphically intensive games directly to players’ web browsers or via YouTube. Stadia is set to become the platform for next-generation VDI, using its open source Vulkan GPU drivers to stream a new generation of graphics-intensive games running on AMD graphics chips. Dov Zimring, Google Stadia developer platform product lead, said: “Google and AMD share a commitment to open-source with expertise in Vulkan, open-source Vulkan GPU drivers, and open-source graphics optimisation tools. We’re humbled by the spirit of innovation and collaboration that exists throughout the gaming industry and look forward to pioneering the future of graphics technology with game developers, in open-source.”

What Stadia means for enterprise IT

For enterprise IT, this announcement represents the consumerisation of virtual desktops. No longer will VDI be regarded as suitable only for low-graphics applications: people will be streaming high-performance games at home and wondering why high-performance graphics applications can’t be streamed at work.

In a work environment, end users won’t require ultra-high-end engineering workstations to run computer-aided design, video editing or other GPU-intensive applications. They will, however, still require ultra-high-definition, colour-accurate displays with fast, flicker-free refresh rates to use the graphics applications comfortably. But the raw processing power is available on the back end, and the user interface can be streamed.

If the gaming industry gets behind the idea of streaming GPU-intensive PC games, consumers will no longer need high-end gaming machines. Anyone with a fast connection will be able to access gaming streams. For gamers, the experience will be measured by the quality of their display, network latency, audio and input devices.

In the enterprise, the IT industry looks like it is warming to the idea of using a PC as a thin client for desktop as a service (DaaS). The basic hardware just needs to run a browser like Chrome really well. Any resource – whether CPU, GPU, memory, storage or network bandwidth – required to run resource-heavy PC applications can simply be dialled up and down as needed in the cloud, or via the admin console on the back-end VDI. The applications are no longer limited by the restrictions imposed by the physical PC hardware.

None of this is new. Thin clients have existed almost since the dawn of computing. One could argue a VT100 terminal provided a means to stream the text-based user interface of mainframe applications. Today, the reality is that Windows desktop applications are not going away. If Windows application developers start reworking their software to optimise it for VDI, desktop IT could be reimagined as a streamed service.


April 18, 2019  10:07 AM

Banking and finance bigwigs yank climate change chord

Cliff Saran

It has been a week of activism and a call for action, with Extinction Rebellion causing major disruption in London.

But alongside grass roots demonstrations, there appears to be a greater awareness in the corporate world that every business has a role to play in tackling the catastrophic environmental risk the world now faces.

On Monday, Legal and General Investment Management (LGIM), the UK’s largest institutional investment firm, announced that as part of its Climate Impact Pledge, it would not hold eight large global companies in its Future World funds. LGIM stated: “Where such companies are seen to take insufficient actions on climate risks, LGIM will also vote against the chairs of their boards, across the entire equity holdings.”

Data sharing

On Thursday, Bank of England governor Mark Carney published an open letter describing the findings of a new report in which global banks, central banks, supervisors and the wider financial community have signed up to a set of deliverable goals to help ensure a smooth transition to a low-carbon economy.

From a banking perspective, the focus is on building knowledge and sharing data, so that the monitoring of climate-related financial risks is integrated into day-to-day supervisory work.

The promise of a “tech solution”

As Computer Weekly recently reported, PwC and Microsoft believe technology has an important role to play in helping businesses and governments tackle climate change and environmental issues. PwC’s new How AI can enable a Sustainable Future report looks at the use of AI-enabled smart technology across agriculture, water, energy and transport.

Examples include AI-infused clean distributed energy grids, precision agriculture, sustainable supply chains, environmental monitoring and enforcement, and enhanced weather and disaster prediction and response.

In the sectors it focused on, PwC estimated that AI applications could reduce global greenhouse gas emissions by 4% – more than the current annual emissions of Australia, Canada and Japan combined.
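A crude sanity check, using round assumed figures rather than numbers from the PwC report, shows why that comparison holds together: 4% of global emissions and the combined output of those three countries come out at roughly the same order of magnitude.

```python
# Rough sanity check of the 4% figure, using round, assumed numbers
# (these are not figures taken from the PwC report).
global_ghg_gt_per_year = 60.0   # assumed global greenhouse gas emissions, Gt CO2e, circa 2030
ai_reduction_share = 0.04       # PwC's estimated reduction from AI applications

assumed_country_emissions_gt = {  # rough annual emissions, Gt CO2e
    "Australia": 0.5,
    "Canada": 0.7,
    "Japan": 1.2,
}

ai_saving_gt = global_ghg_gt_per_year * ai_reduction_share
combined_gt = sum(assumed_country_emissions_gt.values())

print(f"AI-enabled saving: ~{ai_saving_gt:.1f} Gt CO2e")
print(f"Australia + Canada + Japan: ~{combined_gt:.1f} Gt CO2e")
# Both come out around 2.4 Gt CO2e, so the comparison is at least internally consistent.
```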

The experts are in agreement that climate change will not only ruin the planet, kill off the polar bears along with millions of other species and result in unimaginable human suffering across the globe, but will also have a major impact on global business. It may not have the immediate impact of the Extinction Rebellion, but the words from Mark Carney, the LGIM statement and PwC’s findings may actually resonate more with business leaders. And if PwC’s forecasts are accurate, AI has the potential to enable every country to meet the target of becoming carbon neutral by 2050, as set out by the Paris Agreement.


April 15, 2019  11:28 AM

Single source ERP fails to match business needs

Cliff Saran

There was a time when the major ERP providers were considered allies of the CIO. They were trusted advisors. From an IT decision-maker perspective, ERP software aimed to encapsulate best-in-class business processes in a single package.

In theory, different enterprise packages from the same ERP company would be able to share data; the packages would be pre-integrated and, as for master data management, the customer would have a single version of the truth.

The technique of bundling “free” or discounted enterprise software as part of a sales pitch meant that organisations were often enticed to buy more products from the same ERP company. On paper, at least, it made sense to invest in a single supplier for core enterprise systems.

Drawbacks of having a single source for ERP

But there are many drawbacks. Organisations standardising on a single ERP provider’s software stack are often relentlessly pursued by account teams at these companies to buy more and more stuff.

These days, that stuff tends to be cloud offerings.

As Computer Weekly has previously reported, Oracle executives referred to “cloud services” six times in the transcript of the 45-minute, third-quarter 2019 earnings call in March, posted on the Seeking Alpha financial blogging site. Similarly, SAP executives made three long statements regarding “as-a-service” in their fourth-quarter 2018 earnings call in January 2019, according to the transcript on Seeking Alpha.

SaaS increases ERP choice

KBV Research’s Global Software as a Service (SaaS) Market report forecasts that the SaaS market will reach $185.8 billion by 2024, growing at over 21% a year. The ERP providers have spent the last few years fleshing out their SaaS strategies through strategic acquisitions and building out cloud-based ERP software. However, they are no longer the only options available to the CIO. While ERP tends to be able to run a large chunk of a company’s business processes, there may be gaps and missing functionality. In the past, so-called “point solutions” filled these gaps. In the era of SaaS, the cloud has enabled companies like Salesforce and Workday to establish themselves as dominant players. Often, these SaaS products are best in class.

Multi-sourcing is the future of ERP

It used to be the case that the bulk of enterprise software spending usually went to a single ERP provider. Today, savvy IT decision makers are building out enterprise SaaS portfolios, with products and services from multiple SaaS providers.

Such a strategy breaks the grip the traditional ERP providers have had on their customers. The traditional ERP companies are set up to offer the CIO a one-stop shop for enterprise software. But businesses no longer standardise solely on a SAP or Oracle suite of products to meet their enterprise software requirements. As a consequence, the traditional ERP providers have gone shopping for smaller SaaS firms, in an attempt to sell what they would consider “a complete solution” that they can then claim meets all of a customer’s enterprise software requirements.

Such is the nature of innovation in the software industry that someone is bound to invent something new and original, which subsequently gains traction. Clearly, it is not going to be particularly realistic for the major ERP providers to buy every SaaS business that has an offering which fills a gap in their product portfolios.

Make integration a key requirement

When shopping for enterprise software, Forrester principal analyst Duncan Jones believes IT buyers need to put integration with open APIs high on their list of priorities. Businesses are told to reduce their reliance on custom code in ERP implementations. This should also apply to the customisations required to integrate the ERP with a third-party product.

If open APIs are made available, third-party SaaS companies can create pre-integrated products that fill the gaps in functionality that exist in the product portfolios of the traditional ERP providers. Assuming the enterprise SaaS landscape becomes more and more fragmented, IT buyers should expect enterprise software companies to provide greater and greater support for integration with third-party SaaS products. For the traditional ERP providers, this is likely to be both more cost-effective and more strategically sustainable in the long term than attempting to acquire every SaaS startup that has an interesting product.
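To make the integration point concrete, the fragment below sketches the kind of call a third-party SaaS product might make against an ERP system’s open API. The endpoint, payload and token are entirely hypothetical; the point is that a documented REST interface with standard authentication removes the need for custom code inside the ERP itself.

```python
# Hypothetical sketch of pre-integration through an open REST API.
# The endpoint, payload fields and token are placeholders, not a real vendor API.
import json
import urllib.request

ERP_API = "https://erp.example.com/api/v1/purchase-orders"  # hypothetical endpoint
TOKEN = "replace-with-oauth-token"

order = {
    "supplier_id": "SUP-1001",
    "currency": "GBP",
    "lines": [{"sku": "WIDGET-7", "quantity": 250, "unit_price": 3.15}],
}

request = urllib.request.Request(
    ERP_API,
    data=json.dumps(order).encode("utf-8"),
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
    method="POST",
)

# A SaaS procurement tool could push orders into the ERP this way, with no custom
# code living inside the ERP system itself.
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```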


April 5, 2019  1:12 PM

Why are we serving up chips?

Cliff Saran

Custom hardware is usually the only option available to organisations that need to achieve the ultimate level of performance for AI applications. But Nvidia has taken massive strides in flipping the unique selling point of its graphics processing units from the ultimate 2D and 3D rendering demanded by hardcore gamers to the world of accelerated machine learning.

While it has been late to the game, Intel has quickly built out a set of technologies, from field programmable gate arrays (FPGAs) to processor cores optimised for machine learning.

For the ultimate level of performance, creating a custom application-specific integrated circuit (Asic) means the microelectronics can be engineered to perform a given task with the least amount of latency.

Custom approach: Tensor processing unit

Google has been pioneering this approach for a number of years, using a custom chip called a TPU (tensor processing unit) as the basis for accelerating its TensorFlow open source machine learning platform.

Its TPU hardware tops the MLPerf v0.5 machine learning benchmarks of December 2018.

Beyond Asics, IBM is now investigating how, in certain very specific application areas, quantum computing could be applied to accelerate supervised machine learning. It is actively looking to crowdsource research that can identify which datasets are well suited to quantum computing-accelerated machine learning.

Another option is the FPGA. Since it can be reprogrammed, an FPGA offers a cheaper alternative to an Asic. This is the reason why Microsoft is looking at using FPGAs in its Brainwave initiative for accelerating machine learning on the cloud.

GPUs rule mainstream ML

Nvidia has carved a niche for more mainstream AI acceleration using its GPU chips. According to a transcript of its Q4 2019 earnings call posted on the Seeking Alpha financial blogging site, the company believes deep learning offers a massive growth opportunity.

Nvidia CFO Colette Kress said that while deep learning and inference currently drive less than 10% of the company’s datacentre business, they represent a significant expansion of its addressable market opportunity going forward.

In a recent whitepaper describing the benefits of GPUs, Nvidia stated that neural networks rely heavily on matrix math operations, and complex multi-layered networks require tremendous amounts of floating-point performance and bandwidth for both efficiency and speed. “GPUs have thousands of processing cores optimized for matrix math operations, providing tens to hundreds of TFLOPS of performance. GPUs are the obvious computing platform for deep neural network-based artificial intelligence and machine learning applications,” it claimed.
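Nvidia’s argument rests on the structure of the computation itself. The naive matrix multiply below, written in plain Python purely for illustration, shows why the workload maps so naturally onto thousands of cores: every output cell is an independent dot product.

```python
# Naive matrix multiply, written out in plain Python to show the structure Nvidia is
# pointing at: every output element C[i][j] depends only on row i of A and column j
# of B, so all (i, j) pairs can be computed in parallel on separate GPU cores.
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):        # on a GPU, each (i, j) pair becomes its own thread
        for j in range(cols):
            C[i][j] = sum(A[i][k] * B[k][j] for k in range(inner))
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
print(matmul(A, B))  # [[19.0, 22.0], [43.0, 50.0]]
```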

Optimising x86 CPUs

Intel’s chips are CPUs, optimised for general-purpose computing. However, the company has begun to expand its Xeon processors with DL Boost (deep learning) capabilities, which Intel claims have been designed to optimise frameworks such as TensorFlow, PyTorch, Caffe, MXNet and PaddlePaddle.

It hopes organisations will choose its CPUs over GPUs, because they generally fit in with what businesses already have. For instance, Siemens Healthineers, which is a pioneer in the use of AI for medical applications, decided to build its AI system around Intel technology, rather than GPUs. The healthcare technology provider stated: “Accelerators such as GPUs are often considered for AI workloads, but may add system and operational costs and complexity and prevent backward compatibility. Most systems deployed by Siemens Healthineers are already powered by Intel CPUs.” The company aims to use its existing Intel CPU-based infrastructure to run AI inference workloads.

So it seems developments in hardware are becoming increasingly important. Web giants and the leading tech firms are investing heavily in AI acceleration hardware. At the recent T3CH conference in Madrid, Gustavo Alonso, of the systems group at the Department of Computer Science, ETH Zürich, noted that AI and machine learning are expensive. “Training large models can cost hundreds of thousands of dollars per model. Access to specialised hardware and the ability to use it will be a competitive advantage,” he said in his presentation.


March 29, 2019  11:33 AM

Complexities of safety-critical augmented systems

Cliff Saran

There are no easy lessons to be gleaned from the tragic loss of life following the Ethiopian Airlines Flight 302 crash on March 10, 2019. As has been reported across the web, the crash bears remarkable similarities to Indonesia’s Lion Air crash of October 29, 2018. Both involved Boeing 737 MAX aircraft. To quote from a statement made by Ethiopian Airlines’ group CEO, Tewolde GebreMariam: “Until we have answers, putting one more life at risk is too much.”

What is known today is that the crash appears to be a side-effect of a software system known as the Maneuvering Characteristics Augmentation System (MCAS). Boeing says MCAS has been designed and certified for the 737 MAX to enhance the pitch stability of the airplane. Across the web there have been reports of how the system got confused during take-off, forcing the nose down to prevent the aircraft from stalling. The plane continued to dive, despite efforts by the pilots to try to regain control of the aircraft. Reporting the preliminary findings of the investigation into the Ethiopian Airlines Flight 302 crash, the Wall Street Journal noted that a suspect flight-control feature automatically activated before the plane nose-dived into the ground.

Software update

Technically speaking, MCAS is a stall prevention system. According to CNBC, since the crashes of the two 737 MAX planes, Boeing has faced fierce criticism for not doing more to tell flight crews about the stall prevention system or alert them when the technology kicks in. It reported that only one Angle of Attack (AOA) sensor for MCAS was fitted as standard; airlines were asked for an additional payment to have a second AOA sensor installed.

Earlier this week Boeing issued a software update. According to Boeing, this update has been put through hundreds of hours of analysis, laboratory testing, verification in a simulator and two test flights, including an in-flight certification test with Federal Aviation Administration (FAA) representatives on board as observers.

It said the flight control system will now compare inputs from both AOA sensors. “If the sensors disagree by 5.5 degrees or more with the flaps retracted, MCAS will not activate. An indicator on the flight deck display will alert the pilots.”
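Going by Boeing’s public description alone, the cross-check amounts to a simple guard condition. The sketch below is a deliberately simplified illustration of that logic, not Boeing’s flight-control code.

```python
# Deliberately simplified illustration of the cross-check Boeing describes above.
# This is NOT flight-control code: the threshold and behaviour are taken only from
# the public description quoted in this post.
AOA_DISAGREE_LIMIT_DEG = 5.5

def mcas_may_activate(aoa_left_deg, aoa_right_deg, flaps_retracted):
    """Return True only if the two angle-of-attack readings agree closely enough."""
    disagreement = abs(aoa_left_deg - aoa_right_deg)
    if flaps_retracted and disagreement >= AOA_DISAGREE_LIMIT_DEG:
        # In the updated system, an indicator on the flight deck would also alert the pilots.
        return False
    return True

print(mcas_may_activate(12.0, 4.0, flaps_retracted=True))  # False: sensors disagree by 8 degrees
print(mcas_may_activate(6.0, 5.0, flaps_retracted=True))   # True: within tolerance
```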

Balancing safety-critical automation with human operators

What is clear from these reports is the complex technical and ethical issues that must be addressed in developing safety-critical augmented systems that need to coexist with highly trained individuals. Neither entrusting everything to the computer system nor deferring every decision to a human is the right approach. While the FAA investigation is likely to conclude that the Ethiopian Airlines Flight 302 crash was down to software, could a tragedy like the Germanwings Flight 9525 crash on 24 March 2015 have been avoided if the flight control software had actively prevented the co-pilot from flying the aircraft into the Alps?

