CCS Insight has predicted that by 2023, psychometric testing of software developers will become commonplace.
The analyst firm believes that ethical psychometric testing will appear more regularly as part of the interview process, particularly in areas where software developers are being hired for jobs involving the analysis of personal data and, of course, AI, where biases need to be identified.
Why wait until 2023? Such tests may have more near-term benefits. There is a question on the Myers-Briggs psychometric test concerning someone's affinity to the words "spire" and "foundation". Would one use the attribute "spire" or "foundation" to describe a software architect? What about an application developer or a software engineer in customer experience?
According to Ross Mason, founder of API management company MuleSoft, software developers tend to deliver code for an IT project that has no uses outside the domain it was designed for. He says: "We have to start helping developers think about the value they provide in the software they build."
Consider the Lego analogy for how software is built: programmers use the bricks as building blocks to make things that other people can then see and experience. No one really wants to be the person who actually makes the Lego bricks.
Look at Casa Batlló in Barcelona and the genius of Antoni Gaudí. It is a finished product that visitors from around the world can experience.
In software development, should the goal be the finished product? Mason does not think so. Developing software that meets a specific business need may be seen as aligning IT with the business, but some argue that this goal-based approach lacks depth.
The idea of a “cloud first” strategy is built on a foundation of code reuse through software libraries and microservices accessed in a controlled manner via published APIs. While not every business will become a software company, there are plenty of cases where a factory for producing reusable software components makes sense.
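The reuse idea can be sketched in a few lines of Python. Everything here is illustrative rather than any vendor's actual API: one capability is written once, published behind a stable endpoint name, and consumed by more than one "application".

```python
# A minimal sketch of code reuse behind a published interface.
# The component, the endpoint name and the postcode rule are all
# illustrative assumptions, not a real product's API.

def validate_postcode(postcode):
    """Reusable business capability: a very rough UK-style postcode check."""
    parts = postcode.strip().upper().split()
    return len(parts) == 2 and all(p.isalnum() for p in parts)

# The published API: a stable contract callers depend on, regardless of
# how validate_postcode is implemented behind it.
API = {"postcode/validate": validate_postcode}

def call_api(endpoint, payload):
    return API[endpoint](payload)

# Two different applications reuse the same Lego brick via the API.
checkout_ok = call_api("postcode/validate", "sw1a 1aa")      # True
delivery_ok = call_api("postcode/validate", "not-a-postcode")  # False
```

The point of the indirection is that the implementation can be swapped without touching the callers, which is exactly what a microservice behind a published API promises at a larger scale.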
Back to the foundation/spire question: La Sagrada Família is still under construction. Yet Gaudí's vision for the basilica is being built piece by piece, more than 90 years after his death. In software, the ability to reuse Lego building blocks of code quickly and easily will be a digital differentiator. And the developers of these Lego pieces are both the foundations and the spires.
During the Future Decoded conference in London, Microsoft discussed a paradigm shift in application development. But it was not setting out its plans for a new software development kit, programming language or the next big thing after cloud-native applications and microservices.
Instead, the company wanted delegates to appreciate the importance of data. Why the paradigm shift? Industry pundits describe data as the new oil. The biggest companies in the world are data-driven.
Microsoft believes that artificial intelligence will change the way data-centric applications are developed and deployed. Rather than hard-coding software to make use of the vast amounts of data being generated, the application will evolve out of data models that feed AI-based inference engines. Programming becomes supervised machine learning; quality assurance is the feedback loop that applies the data model to real-world data and continually optimises it to improve its accuracy; the application is the predictions the data model makes.
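Microsoft's framing can be illustrated with a deliberately tiny sketch in pure Python, with no real machine learning library: the "program" is a decision threshold learned from labelled data, and quality assurance is a feedback loop that re-fits the model whenever a prediction misses. All names and numbers here are illustrative, not from any Microsoft product.

```python
# The "application" is the predictions a model makes; the "programming"
# step is fitting that model from labelled data.

def fit_threshold(samples):
    """Learn a decision threshold from (value, label) pairs: the midpoint
    between the largest value labelled False and the smallest labelled True."""
    positives = [v for v, label in samples if label]
    negatives = [v for v, label in samples if not label]
    return (min(positives) + max(negatives)) / 2

def predict(threshold, value):
    return value >= threshold

# Historical training data stands in for the data model's source.
history = [(2, False), (3, False), (7, True), (9, True)]
model = fit_threshold(history)          # "programming" step -> 5.0

# Quality assurance as a feedback loop: score the model on fresh,
# real-world observations and re-fit when it gets one wrong.
for value, actual in [(5, False), (6, True)]:
    if predict(model, value) != actual:
        history.append((value, actual))
        model = fit_threshold(history)  # the model evolves out of the data
```

After the loop the threshold has moved from 5.0 to 6.0: no line of decision logic was hand-edited, only the data changed, which is the paradigm shift being described.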
If Microsoft’s assertions are true, and data becomes the new programming paradigm, business and IT need a different approach to how they identify valuable data. It will often prove almost impossible to spot something useful in the reams of data an organisation has collected.
Is there a needle in this haystack?
According to Scott Crawford, head of data science enablement at 84.51°, the technology business at retail giant Kroger, the future for many companies will be determined by their ability to collaborate, using diverse teams to help them to look for a needle in a haystack of data. Crawford believes that as data science expands in an organisation, that diversity of capabilities becomes critical.
IT leaders will also need to assess when a request from the business is a defined project with design, build, test and deploy phases – or when it is more of a programme of continuous development and improvement.
As Crawford points out, there are plenty of use cases that solve a particular problem, which means the data team can crunch the data, create the data model and move on. But there are also cases that build evolution into a data science programme to provide better estimates, using the most recent set of data. One example in retailing may be when the data team is asked to build a better model for forecasting in-store product demand. While it may indeed be possible for the team to develop a superior data model, Crawford said: "Very little thought goes into what happens if the new model wins." Potentially, the business may change the way it does something. That change could invalidate the data model, or at the very least mean it requires further refinement, leading to further adaptation of the business process ad infinitum.
In the age of AI-driven data applications, such questions will need to be raised every time the business wants something new.
Last year, Singapore Airlines (SIA) announced a strategic relationship with the Chinese cloud giant Alibaba. The term "strategic" is bandied around the IT industry and usually means the organisation in question has an IT strategy in which a major component is based on products and services from a so-called strategic vendor.
But in the case of SIA, the strategy is very much business focused: from using AliPay to enable customers to pay for the airline’s products and services to the use of the Alibaba ecommerce platform. SIA has also been working closely with Cainiao Network, the logistics arm of Alibaba Group, to enhance international air cargo services, joining Cainiao Network’s broader efforts in building a global smart logistics network that delivers across China within 24 hours and globally within 72 hours.
At the Gartner Catalyst conference in London this week, Gartner analyst Dave Aron said that the airline has now extended the relationship even further, enabling passengers to buy gifts via Alibaba’s ecommerce platform, and have them delivered to their destination airport. Gartner believes that web giants like Amazon, Google, Baidu, Tencent and Alibaba can extend into every industry sector because they provide so many products and services that they are effectively infrastructure for everything. And it is no surprise that car makers have partnered with the likes of Apple and Google, to offer value-added services on their cars, which then become four-wheeled mobile technology platforms.
Are mega vendors strategic?
What is interesting about Aron's observations is that the traditional IT mega vendors – the likes of IBM, Microsoft, Oracle and SAP – are not present. In the past, for an airline like SIA, integrating its products and services with another ecommerce platform would have required a huge amount of costly work, probably using products and services from one or more of these mega vendors. While it may well be complex, connecting to and taking advantage of the network of products and services available through an organisation like Alibaba, via an application programming interface, has made it far easier to achieve business objectives that would have seemed near impossible a few years ago.
For the CIO, the question is: how strategic is IBM, Microsoft, Oracle or SAP to your business? This is not a question about whether they are relevant to the IT strategy, but how well these IT giants fit with the organisation's mid- and long-term business strategy. If these mega vendors are not closely enough aligned, then CIOs should actively look at reducing their footprint, in order to invest more of their valuable IT budget with the new breed of strategic IT providers that are better aligned.
Once a new technology becomes mainstream the industry needs to find a new thing to sell its customers. It’s been this way since the time IBM provided big metal boxes with integrated software and called the system a mainframe. The recently introduced z15 is the latest version of this approach to IBM computing, with a single, highly integrated enterprise server to run application workloads.
But vertical integration relies on highly resilient, high-performance hardware and software systems, and these tend to come at a high cost. So the alternative is distributed computing, where workloads can be organised to run across farms of low-cost servers. These servers used to be physical, then they were virtualised, and now everyone in the industry talks about containers and using Kubernetes for container orchestration. Then there is the public cloud, which provides a template to show IT departments how they can build internal IT as a service that mimics the ease of provisioning of IaaS, PaaS and SaaS.
All of these things are welcome additions to the IT toolbox. But the issue the CIO faces is that enterprise IT is renowned for hoarding old tech: nothing ever really gets thrown away. Just consider the legacy acronyms from the past few decades. Each promised to revolutionise the way software systems were developed. There was the Common Object Request Broker Architecture (Corba) from the Object Management Group, the Service Oriented Architecture (SOA) and the Enterprise Service Bus (ESB). Each offered an industry vision for another three-letter acronym: EAI, or Enterprise Application Integration.
These days, EAI is often considered too complex and costly a programme to run. Bridging the gaps between business processes running on different IT systems is now called digital transformation. And rather than a person re-keying, or cutting and pasting, from one IT system to another, Robotic Process Automation (RPA) does this re-keying under the control of a bot running a script.
What is fascinating is that the approaches being taken to achieve RPA are not dissimilar to how legacy green screen systems used to be given a shiny new Windows front-end that re-keyed user input into a terminal session, or how the early price comparison sites in the late 1990s screen-scraped pricing information.
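The rekeying pattern that links the old screen-scraping front-ends to modern RPA can be shown in a toy sketch. Both "systems" here are plain dictionaries, purely for illustration; a real RPA tool would drive a terminal session or a GUI instead.

```python
# A toy illustration of the rekeying pattern: read fields from one
# system's output, map them to the target's field names, and play them
# back into the other system, one field at a time, as a human operator
# would once have done by hand. All names are illustrative.

FIELD_MAP = {            # legacy field name -> new system field name
    "CUST_NM": "customer_name",
    "ORD_REF": "order_reference",
}

def scrape(legacy_screen):
    """Pull only the mapped fields out of the legacy system's 'screen'."""
    return {new: legacy_screen[old] for old, new in FIELD_MAP.items()}

def rekey(target_system, record):
    """Re-key the scraped record into the target system field by field."""
    for field, value in record.items():
        target_system[field] = value
    return target_system

legacy = {"CUST_NM": "Acme Ltd", "ORD_REF": "A-1042", "JUNK": "ignored"}
crm = rekey({}, scrape(legacy))
```

The fragility is the same as it was in the 1990s: if the legacy "screen" layout changes, the field map breaks, which is why RPA scripts need the same ongoing maintenance that screen-scrapers did.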
The z15 may be the latest incarnation of what IT people used to call the legacy "mainframe" system. But legacy is embedded everywhere in modern IT. The DNA inside SOA, ESB and even RPA has much in common with modern cloud architectures.
A few months ago, Automation Anywhere began collaborating with freelance software developer recruitment platform Toptal on bringing robotic process automation to the human workforce. The concept is called the "digital worker".
There appear to be a number of forces coming together. The so-called gig economy is showing that there are people happy to make themselves available for work on a zero-hours style of contract. And this trend is not limited to the likes of Uber and Deliveroo drivers. The nature of IT contracting is also changing. "People don't want to work on a year-long contract. They may only want to use their technical skills for work one day a month," says Nick Woodward, CEO of ETZ Payments, a company that aims to simplify payments for freelance workers.
Bola Rotibi, research director, software development, at CCS Insight believes demand for IT services offered through the gig economy may increase as a result of tech skills shortages. "They work well when the work needed is very standardised and well-scoped, so that there is no ambiguity and you don't have to open up the company so that information flows out," she said.
Robotise work, backfill with humans
Automation Anywhere provides a way to automate manual processes. But if this work cannot be accomplished through automation, it needs to be handed over to a real person. Today, RPA and hiring new talent are not directly connected. But imagine if the platform was smart enough to understand what IT work could be automated and where an IT job needed handing over to real people. The algorithm could then find a suitable IT contractor that met a predetermined set of requirements, similar to how Uber’s platform identifies the closest driver who is free to pick up a customer.
Such a scenario does not put much value on the human workforce. Neither do the terms "human capital management", headcount or man-hours. However, if an RPA bot cannot perform a task, but can identify a real person free to complete the job, then surely that has to be a good thing?
The algorithm only really has to do the job of an applicant tracking system in reverse – to identify contractors with availability who have the right skillset. Perhaps add in weightings based on the sentiment of Checkatrade style reviews from previous employers and some smarts around recommendations. How hard can it be? The sad truth is that automated systems for identifying talent are often woefully inadequate.
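A hedged sketch of that "applicant tracking in reverse" idea might look like the following. The data shapes, the weightings and the scoring formula are all illustrative assumptions, not any vendor's actual algorithm.

```python
# Score available contractors against a required skillset, weighted by
# review sentiment (assumed to be normalised 0..1) and a recommendation
# count (assumed capped at 10). All weightings are arbitrary assumptions.

def match_score(contractor, required_skills,
                sentiment_weight=0.3, recommendation_weight=0.2):
    if not contractor["available"]:
        return 0.0  # like Uber, only people free right now are candidates
    skills = set(contractor["skills"])
    skill_fit = len(skills & set(required_skills)) / len(required_skills)
    return (skill_fit
            + sentiment_weight * contractor["review_sentiment"]
            + recommendation_weight * contractor["recommendations"] / 10)

def best_match(contractors, required_skills):
    return max(contractors, key=lambda c: match_score(c, required_skills))

contractors = [
    {"name": "Ana", "available": True, "skills": ["python", "rpa"],
     "review_sentiment": 0.9, "recommendations": 5},
    {"name": "Ben", "available": True, "skills": ["java"],
     "review_sentiment": 1.0, "recommendations": 10},
    {"name": "Cy", "available": False, "skills": ["python", "rpa"],
     "review_sentiment": 1.0, "recommendations": 10},
]
top = best_match(contractors, ["python", "rpa"])  # Ana wins on skill fit
```

Even this toy version exposes the hard part: the scoring is only as good as the skills taxonomy and the review data behind it, which is precisely where real talent-matching systems fall down.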
Gartner’s latest magic quadrant report on cloud infrastructure puts IBM in the bottom left quadrant, as a niche player, alongside Oracle and Alibaba. IBM is hoping the $34bn it has spent on buying Red Hat will propel its cloud infrastructure strategy.
IBM’s core enterprise customers do not necessarily expect Big Blue to sell them the latest, greatest tech. They often see IBM as a trusted partner, providing systems that support core business functions. As the saying goes: “No one ever got fired for buying IBM.” One could argue this has left those companies that entrust IBM with their technology decisions, on a slower tech adoption curve. Yes, IBM has blockchain, Watson for deep learning and a quantum computing roadmap, but it takes a long time for the culture of a large enterprise to change. And this slowness to adapt arguably affects both its customers and IBM itself.
IBM used to be good at reinventing itself: from a hardware business to a software business and then a services business. Each change has seen IBM offer its traditional enterprise customers a way to evolve their IT function. The cloud is a phenomenon traditional enterprises have been slow to adopt, and IBM has made several attempts at providing an IBM cloud for these customers. "IBM appeals to its existing customers who have a strong preference to purchase most of their technology from IBM," Gartner noted in its magic quadrant for cloud infrastructure. From a cloud infrastructure perspective, Gartner sees the IBM cloud as an option for "Mode 1" IT – which consists mainly of systems-of-record workloads.
The Red Hat acquisition represents a $34bn gamble: will those businesses that have stuck with IBM remain wedded to a traditional approach to IT, or will they eventually shift the majority of their enterprise workloads onto the public cloud? If they choose the latter, AWS, Azure and Google – rated as leaders by Gartner – may, arguably, offer more fully formed public cloud strategies. But, for IBM, there is a $350bn opportunity – so long as it can convince these organisations that it is the safest pair of hands to make them cloud-ready.
What does a modern desktop look like? For the last few decades since the desktop PC established its place in offices, the role of desktop IT has been to provide end-users with access to corporate approved software and enterprise data, file servers and printers.
But the ease with which end users can create and distribute information has introduced its own set of problems. For instance, the success of email has made it far too easy for several revisions of the same document to exist on the corporate network. Logically, it makes sense to deploy cloud-based storage. This enables colleagues to collaborate and share information via the cloud, such that a single version of the document can be shared, changes can be tracked and people can comment on revisions. But if one user in the team continues to use email, this often determines how other colleagues communicate.
The road ahead for the CIO and IT department will involve showing the tech laggards in the business how new cloud-based tools simplify traditional email-based workflows.
Change how people use PCs for work
It’s about providing people with the right technology to deliver a good level of employee experience, such that they realise there are benefits in adapting old working practices.
In 2016, MIT’s Building business value with employee experience report found that companies with scores in the top quartile of employee experience were twice as innovative as those in the bottom quartile, based on the percentage of revenue generated from new products and services in the previous two years.
Focus on a good employee experience
Insight’s recent Employee Experience report found that a lack of engagement and the unavailability of useful, user-friendly technology in organisations are having a negative impact on staff productivity. Similarly, a YouGov survey for Avanade found that UK companies are failing to realise the value of their workplace technology investments.
MIT does not believe CIOs should adopt a command-and-control strategy to deploy new technology that improves employee experience. But CIOs face a credibility issue. End users often regard IT as a blocker, preventing them from achieving what they need to do. Introducing new tools just adds to what they already have to put up with: multiple logins, the need to rekey information between different applications and sluggish, outdated PC hardware.
The end of support of Windows 7 on January 14, 2020 provides CIOs with a compelling reason to change end user desktop computing practices.
“Once you put someone in orbit, it is a lot harder to fix it if something goes wrong,” says Paul Kostek, a senior Institute of Electrical and Electronics Engineers (IEEE) member and senior systems expert at Base2.
But things do go wrong. About 1:08:53 into the BBC’s excellent documentary 8 Days: To the Moon and Back, Neil Armstrong, Buzz Aldrin and mission control in Houston realise that the toggle for the engine arm circuit breaker in the Lunar Module had broken off. Without it, they would be stranded on the Moon. Nasa engineers worked through the night investigating how to bypass the switch, but Aldrin figured out he only needed to make sure the electrical contacts in the switch touched, and used a felt tip pen to achieve this, enabling the two astronauts to take off from the Moon’s surface and ultimately return to Earth safely.
Computers today are far more clever than the ancient hardware that put man on the Moon, but there is a very real sense that the greater the level of automation, the less human operators are kept in the loop and able to understand how to bypass the systems. The tragedies of Lion Air Flight 610 and Ethiopian Airlines Flight 302 are testament to the fact that automation can make mistakes, and the overrides that enable humans to take back control must be simple and well understood.
Designing for humans
Kostek says that there are lots of discussions in commercial aviation about pilots needing to know what to do if there is a problem with the flight systems. But without any relevant, contextual feedback, the pilots may not have all the information they need to overcome the issue the flight system encountered.
Automation is moving beyond space exploration and aircraft flight control towards augmenting vehicles, as a stepping stone to fully autonomous cars. Thanks to GPS, many people have lost common sense and will blindly follow instructions from a satnav that takes them down an impassable route. Automation is a double-edged sword. How differently the Apollo mission could have turned out if Buzz Aldrin hadn’t stepped up to the mark and taken the initiative.
The recent spate of band reunions and Glastonbury illustrates the longevity of rock music.
But the software industry seems to be stuck on finding the next big thing. Who would ever describe Windows 3.1 as a “classic”? Yet there are some products that somehow get the basics just right, and later releases do not really advance the tech innovation.
The question is how one goes about predicting which technologies will have longevity. There is a lot of industry hype, and it is easy today to jump on the artificial intelligence bandwagon or focus on Bitcoin or the internet of things. But a quarter of a century ago, there was no such thing as blockchain or Bitcoin, and the closest people got to artificial intelligence was at the movies.
Macros and DOS configurations
The big thing then was WordPerfect and config.sys files.
Back in those days, Bruce Momjian, co-founder of the Postgres Core Development Group and chief database architect at EnterpriseDB, worked as a Unix admin at a law firm, and recalls a story about one of the IT people who also worked there. “There was one guy who created config.sys files and he was an expert in WordPerfect. I’m sure he does nothing with config.sys and WordPerfect macros today. It’s just about trying to figure out the hacks and it is not foundational, while what I learnt as a Unix admin is still relevant.”
He says there is a risk that chasing after what’s hot today may mean you end up with a skill that is not so hot in the future.
“If I wanted to be successful, why would I spend a bunch of years playing with a free piece of software? It doesn’t make any sense. I should have been working on WordPerfect macros.”
Doing practical things works in the short term, but he urges young IT folk to “do the crazy thing” and really look at how they should invest in their long-term careers. “It takes a certain insanity and a certain disregard of practicality. It’s not about looking at the technology in order to try to get something done today, but instead understanding the full stack. I say ignore today and look to find answers just because you want to find an answer.”
Why system tools are boring
And while everyone knows Linus Torvalds, in general, says Momjian: “If you are a creator of an infrastructure tool, you sit in an office and maybe you’re at a conference once every other month.” He argues that no IT decision-maker really plans their IT strategy around a scripting language, a compiler or a text editor, or bases it around some of the virtualisation tools out there. “They are interesting, but not a core part of a business process in organisations,” he says.
But compared with the early 1990s, when Momjian was a Unix admin, proprietary Unix systems are on life support. Compare the proprietary Unix vendors with the likes of Microsoft and Oracle, which are still selling relational databases. Since the early 2000s, Momjian has been a database man. “There are a lot of people who find databases really interesting,” he adds.
For Momjian, the database industry is a good industry to be in. And there are some people in the open source community who are jetted around the world to speak to thousands of delegates about their contribution to database technologies.
For Momjian, these are the true rock stars of the software industry.
There are certain functions in database administration that are good candidates for automation. These are the nuts and bolts of the job – like how to partition the storage, what happens if a disk fails, or determining which physical disk data needs to be stored on. Some of these more mundane tasks no longer require as much manual intervention, because software has evolved to where the machine can take care of them. But this automation is not the same as autonomous database management.
Computer Weekly recently met up with Ravi Mayuram, senior vice-president of products and engineering at Couchbase. Mayuram likens the current state of database administration to the automatic gearbox in a car. Clearly, the automatic gearbox is a long way from fully autonomous vehicles. But it offers a degree of convenience, which is where some of the automation now possible in database management has evolved to. For Mayuram, a fully autonomous database management system will be able to fix itself: a fully autonomous database will more or less drive itself. But he reckons there will always be situations where humans will be needed.
Automation is nothing new for IT folk. Managing operating systems and software infrastructure used to be achieved using scripting. But Mayuram believes the reason full automation is taking a while to become part of the DBA’s toolset is that databases are unique, owing to the way the relational database was originally engineered in the 1970s. “The whole database architecture is a bit like a car with a manual gearbox. It just cannot be automated,” says Mayuram. As such, it has provided gainful employment for the DBAs in many organisations.
Self-driving database management
But, in time, automation of database management will be possible, which will mean DBAs no longer have the role of tinkering with the relational database system.
What will the expert DBA spend their time doing, when database admin tasks are automated?
Mayuram says: “With any change, it’s myopic to look at how many jobs will be eliminated. All the DBAs will not be eliminated, because data is more essential now than it has ever been before.” He believes their job description will change, such that DBAs will have a more pivotal role to play in helping the business run faster. The DBA will be the person capable of reducing the complexity of providing access to the enterprise’s data, in the right format, at the time the business requires information to make decisions.