Last week, Microsoft chief executive Satya Nadella made headlines with an audacious plan to eradicate the company’s historic carbon footprint by reversing all of its emissions since 1975.
As political and business leaders head to the World Economic Forum in Davos this week, the risks associated with climate change and a fall in biodiversity are top of the agenda. The signals are there: the weather extremes in 2019, the fires across Australia, flooding in Lincolnshire, the decline of pollinators and the millions of hectares of Amazon deforestation.
Extreme weather, failure of climate change mitigation and human-made environmental damage top the latest Global Risk Report 2020 from the World Economic Forum. Børge Brende, president of the World Economic Forum, has called on industry and political leaders to work across all sectors of society to repair and reinvigorate systems of cooperation, to tackle deep-rooted risks. But there is a resurgence of isolationism, and some political leaders lack the will to push through environmental policies that may be unpopular with voters or make their economy less competitive.
Then there is the hypocrisy of the major economies trying to encourage the developing world to make sustainable decisions. After all, the most successful countries have exploited the environment and natural resources without considering the future impact. What right do they have to stop developing countries from progressing?
Brende believes business leaders should step up to the mark and lead where politicians have thus far failed. So for the CEO of one of the world’s most successful businesses to commit to reversing his company’s carbon footprint is significant. Detractors will argue that tech is a clean industry, and that Microsoft’s actions will not have a significant effect on tackling climate change.
But climate change has a direct impact on economic growth. For instance, how will agriculture be impacted by a decline in pollinators, droughts or flooding? Even financial services and insurance sectors will be impacted. Speaking prior to the start of the WEF, Peter Giger, group chief risk officer, Zurich Insurance Group, said: “If weather patterns change, how do I look at the probability of events occurring?”
There is also a growing realisation that people empathise with brands that are aligned with their own values. In 2018, Apple CEO Tim Cook asked the tech sector to consider what sort of world it wanted people to live in.
Apple has made the privacy of its users a top priority. Nadella wants Microsoft to be seen as an environmental champion. No one company or country can turn around the environmental crisis that is unfolding. But Microsoft’s carbon reduction strategy will directly impact its global supply chain and may indirectly influence other manufacturers whose businesses are heavily dependent on its technology. It’s a small step, a drop in the ocean, but global warming is something the global community must address.
In this guest blog post, Satyam Vaghani, VP/GM of IoT & AI at Nutanix, discusses how IT will need to change to support edge computing.
The rise of the edge will require a redistribution of the centre of gravity, as well as a major shift in the compute paradigm: from mainly “human-oriented” processing at the core (applications like email, web browsing, social media and so on) to predominantly “machine-oriented” processing, as when collecting sensor data and using AI and analytics techniques to convert raw data into business insights. For example, enterprises will need to manage security and the infrastructure lifecycle at thousands of mission-critical datacentres, as well as hundreds or thousands of applications, most of which will require weekly, or even daily, refresh. It also calls for a much more distributed and interconnected approach between the core and hundreds if not thousands of edges, to make sure they work as a holistic whole.
Take a holistic approach
In just a few years we’ve seen devices deployed at the network edge increase almost exponentially to process more data than all the public and private clouds of the world combined. Managing the volume, velocity, and variety of data at industrial scale, and refining it at the edge to get actionable insights, often under tight time constraints, are key problems which, as yet, nobody has fully solved although many are trying and making good progress.
IoT is never an edge-only or a cloud-only problem. Most IoT apps span the edge and the cloud with major implications for the security of both, not least because of the sheer size of the potential attack surface. Nothing can be assumed; new and innovative threats are being released all the time, making robust edge security with minimal oversight at the core an absolute must.
It may have gone unnoticed amid the January 14 end-of-support deadline for Windows 7, but Microsoft’s 10-year-old OS had one last Patch Tuesday update. And, surprise, surprise, this included a critical security update for CVE-2020-0611, a remote desktop vulnerability reported by the NSA, which affects Windows 7 and newer operating systems.
In the past, Microsoft has remained committed to releasing the most critical security patches for unsupported operating systems, such as the Windows XP fix for the WannaCry attack, which afflicted systems around the world, including legacy hardware at the NHS. In February 2018, the Lessons learned review of the WannaCry Ransomware Cyber Attack report for NHS England reported that 80 out of 236 hospital trusts across England were affected; 595 out of 7,454 GP practices (8%) and eight other NHS and related organisations were infected.
Organisations have had several years to migrate to Windows 10, which was released in 2015, starting the five-year countdown to Windows 7 end of support. But businesses do not generally shift from something that works well – like Windows 7 – to a new operating system just because Microsoft has released a new version. Migrating large PC estates can take years, as older PCs are replaced with new ones running the latest Windows OS. Certain applications and embedded systems cannot easily be migrated to the new OS, and remain on an unsupported operating system, leaving them vulnerable to cyber attacks.
Could something like WannaCry happen again, with a vulnerability impacting legacy Windows 7 machines? Certainly every Patch Tuesday from now on will list critical vulnerabilities in Windows 10. How many of these also impact Windows 7?
“WannaCry was a clear example of the dangers that businesses can face when they are using software that has reached end of life,” says Ian Wood, Senior Director, EMEA Cloud & Governance Business Practice at Veritas.
Critical to health
Looking at the health service: because of their criticality to clinical workflows and patient care delivery, many unsupported devices cannot simply be disconnected from clinical networks without severely disrupting operations. For example, MRI machines can be operational for over 20 years, far outliving their operating systems. The more devices there are running on unsupported operating systems, the larger the attack surface and the longer the exposure to cyber risk.
Data pooled across several hospitals from healthcare cybersecurity specialist, Cynerio, has found that radiology departments are most affected. Its research found that 40% of all connected medical devices run on Windows and almost 45% of devices like MRIs, CTs, and X-Rays run on Windows 7. These machines have particularly long life cycles. From this data, Cynerio estimated that over 20% of all medical devices run on the unsupported Windows 7 OS. Unsupported devices cannot be fully secured unless taken offline. “No device is risk free, especially network-connected devices. Medical devices are the weakest link: they are not designed with security in mind, have extensive lifecycles, and often cannot afford any downtime,” says Leon Lerman, Cynerio’s CEO and co-founder.
Nelson Petracek, chief technology officer at Tibco, says that one of the issues in deploying and managing edge computing devices is how device metadata will be managed and governed. Such metadata may contain the device’s location, manufacturer, date of installation and last maintenance date.
In this guest blog post, Petracek discusses how device topology and relationships can be managed and governed, and how this representation can be kept in sync with the physical layout:
With respect to device metadata, there will usually be a catalogue or metadata repository included as part of the overall architecture. It’s most likely that it will be in the datacentre or in the cloud and will act as a centralised function. Not only will this catalogue provide a picture of what is deployed and where, but there may also be data that is needed from this catalogue by different areas of the IoT processing pipeline during device data processing.
When running logic, say at the gateway level, it may be necessary to draw upon reference data to make an educated decision. For example, various parts of metadata around the devices, including the manufacturer, when it was put into service, or its maintenance history might be required in order to complete the decision making process.
It is unlikely that organisations will want 100,000 locations hitting this central store for metadata every time a device activates. Instead, it is more likely that metadata will be applied closer to the datacentre or the cloud, especially when it is used as part of a model or rules.
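Where a gateway genuinely does need per-device metadata at runtime, one common pattern is a local cache with a time-to-live, so that edge sites only fall back to the central catalogue when their copy is stale. The sketch below is purely illustrative; the class, the catalogue lookup and all names are hypothetical:

```python
import time

class EdgeMetadataCache:
    """Caches device metadata at the edge so a gateway does not hit the
    central catalogue on every device activation."""

    def __init__(self, fetch_from_catalogue, ttl_seconds=3600):
        self._fetch = fetch_from_catalogue  # callable: device_id -> dict
        self._ttl = ttl_seconds
        self._cache = {}  # device_id -> (metadata, fetched_at)

    def get(self, device_id):
        entry = self._cache.get(device_id)
        if entry is not None:
            metadata, fetched_at = entry
            if time.time() - fetched_at < self._ttl:
                return metadata  # fresh local copy, no central round trip
        metadata = self._fetch(device_id)  # fall back to the central store
        self._cache[device_id] = (metadata, time.time())
        return metadata

# Usage: the catalogue lookup below is a stand-in for a real repository call.
calls = []
def central_lookup(device_id):
    calls.append(device_id)
    return {"manufacturer": "Acme", "installed": "2018-06-01"}

cache = EdgeMetadataCache(central_lookup, ttl_seconds=3600)
cache.get("meter-42")
cache.get("meter-42")  # served locally; the central store is hit only once
print(len(calls))  # -> 1
```

The trade-off is staleness for load: a longer TTL means fewer central lookups but a longer window in which an edge site can act on out-of-date metadata.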
Maintaining and managing the overall relationships between devices – the topology – is also critical. It’s key to understand where the device is, to what it is attached, and how everything is linked together. This information is key to understanding the behaviour of an IoT network, and can help in ensuring that decisions are determined and optimised in the correct context and state.
One well-known example is a power grid. If you are a utility company responsible for distributing power to consumers, you are concerned with how power gets from the source to a meter attached to a house. Many pieces of equipment, and thus devices, are involved in this process, including meters, transformers and substations. When looking at a distribution network for electricity, you have a vast network of linked devices, and for a variety of reasons (safety, accurate maintenance, capacity, thresholds) it is important to have an accurate picture of not only the devices themselves, but also their relationships. If a new meter or line is added, or a transformer is changed, the blueprints and recorded topology must also reflect this change. Changes must make their way back to the metadata management environment so that proper decisions can be made, both in batch (future infrastructure planning) and in real time (power delivery and restoration in the event of a failure).
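As an illustration only (the network and device names below are invented), such a recorded topology can be represented as a simple graph and walked to answer questions like “which meters does this transformer feed?”:

```python
from collections import deque

# Hypothetical slice of a distribution network: each device lists the
# devices directly downstream of it.
topology = {
    "substation-A": ["transformer-1", "transformer-2"],
    "transformer-1": ["meter-100", "meter-101"],
    "transformer-2": ["meter-200"],
}

def downstream_meters(root):
    """Walk the topology breadth-first to find every meter fed by a given
    device, the kind of query needed for outage restoration planning."""
    found, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        if node.startswith("meter-"):
            found.append(node)
        queue.extend(topology.get(node, []))
    return sorted(found)

print(downstream_meters("substation-A"))   # -> ['meter-100', 'meter-101', 'meter-200']
print(downstream_meters("transformer-2"))  # -> ['meter-200']
```

When a transformer is swapped or a line added, only the `topology` mapping needs to change, which is exactly why keeping it in sync with the physical layout matters.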
There are complete systems for managing this, but they can be quite complicated. That level of complexity is not always required, but some mechanism is necessary for capturing changes – whether through automated processes, periodic introspection, or a combination of the two.
What’s actually out and deployed must be reflected in the device and metadata catalogue. We are not yet at the point where we can dynamically discover all the devices running everywhere, along with their relationships and metadata. Some of this can be done automatically, but some aspect of it is, unfortunately, still going to be manual.
The shortage of Intel processors is having a knock-on effect on IT’s ability to upgrade whatever legacy Windows 7 PCs still remain in the organisation onto new Windows 10 hardware.
At the time of writing, Microsoft said there were no plans to delay the end of support for Windows 7. “We have been partnering closely with our OEMs to prepare,” it told Computer Weekly.
However, the supply shortage of Intel-powered PCs through the channel to business customers has led to concern about planned shipments.
From Computer Weekly’s understanding of the situation, it seems the PC channel has prioritised orders for Windows 10 upgrades. This has been happening since the start of 2019, and so far the prioritisation has meant that the big planned rollouts appear to be continuing. But smaller rollouts and new orders are being impacted.
Upgrading older PCs
While some Windows 7 PCs can be updated to Windows 10, these PCs tend to be a few years old and close to the time they would naturally be retired. Updating them now not only has the potential to make them unstable and slower, but the time and expense of the update would be wasted if the machine is replaced in six months or so, when Intel supply is expected to return to normal.
A recent freedom of information request submitted by Citrix to NHS trusts reported that half of trusts are running Windows 7 machines. Similarly, there are reports on the internet suggesting that 1,800 HMRC PCs are still on Windows 7.
As Computer Weekly has previously reported, some PC channel companies have been told by manufacturers to prioritise shipments relating to Windows 7 end of support.
But inevitably, some organisations will find they need new PC desktop and notebook equipment. One PC channel organisation Computer Weekly spoke to said: “We can’t give an assurance on new orders being here in the next six weeks – so it’s more impacting people that haven’t planned ahead or have relatively small requirements, as they’re hitting the back of the queue.”
Analyst Gartner has recommended that IT managers desperate to source new business PCs as part of their Windows 10 refresh, purchase AMD-powered hardware. Lenovo, for instance, says it has a wide selection of AMD-based ThinkPads available via the channel to suit the varying needs of business from SMBs through to enterprise and the public sector.
In the last few days, HP Inc and Dell have posted results in which their respective chief financial officers spoke about the issues their businesses face with the supply of Intel processors. This affects the ability of these PC makers to fulfil orders for new PCs from enterprise customers and consumers of high-end PC devices, which tend to use Intel chipsets.
Struggle with Windows 10 refresh
In a transcript of the earnings call of October 26, posted on the Seeking Alpha financial blogging site, HP Inc’s CFO, Steve Fieler, discussed how the supply issues could impact the company’s enterprise customers and their ability to refresh Windows 7 PCs with new Windows 10 machines. Microsoft is due to end support of Windows 7 on January 14, 2020, but the supply issues with Intel processors have meant that PC manufacturers are struggling to ship new Windows 10 PCs to their enterprise customers. “It could be that these current supply constraints actually indeed help prolong the Win 10 refresh. And so there’s a lot of dynamics going on. And that’s why I think the seasonal patterns are likely to be affected, both from a supply, but also on the potential extension of the Win 10 refresh,” Fieler said.
Worsening supply of Intel chips
Similarly, Dell admitted that the Intel CPU shortages have worsened quarter over quarter. In the transcript of the earnings call, posted on Seeking Alpha, Jeffrey Clarke, vice chairman, products and operations at Dell, said the supply issue with Intel processors “is now impacting our commercial PC and premium consumer PC Q4 forecasted shipments.”
Looking at the response from Dell’s CFO, Thomas Sweet, to the question of how Intel’s processor supply issues affect Dell’s ability to fulfil orders from its enterprise customers, it appears that about two-thirds of Dell’s enterprise customers have migrated from Windows 7 to new Windows 10 PCs. That still leaves a third of them on Windows 7.
In the Seeking Alpha transcript of its Q3 2019 earnings call, Intel CEO Bob Swan acknowledged the supply issues with processors, saying: “We’re letting our customers down, and they’re expecting more from us. PC demand has exceeded our expectations and surpassed third-party forecasts.”
But failing its customers – the PC manufacturers – has a direct impact on every enterprise IT department’s ability to complete its Windows 10 refresh before the Windows 7 end-of-support deadline. The ball is now in Microsoft’s court.
During Gartner’s recent Symposium event in Barcelona, CIOs heard the latest thinking from the analyst firm about where it sees the role of IT leader heading.
Each year, CIOs head to the Symposium events to network and hear from the analyst firm about its latest thinking on IT leadership.
Over the last two decades, CIOs have been told to align IT more closely with the business. The dotcom boom and bust demonstrated that not every technology-driven idea is necessarily a good one. Generally speaking, technology-driven initiatives are only effective if a solid technology architecture is combined with a thorough change management programme.
Sometimes, in the past, CIOs were left out of the loop when enthusiastic non-IT people were tasked with running strategic ecommerce and social media platform initiatives. Such platforms later joined the mountain of technical debt the IT department is then tasked with managing.
There is always room to improve
More recently, companies have embarked on comprehensive digitisation programmes, to join up disconnected business processes. Every company CEO has probably looked enviously at Uber and AirBnB, and worried that someone, somewhere, will develop smart software, which puts a firecracker at the heart of their business value, and acts as an industry catalyst for change. And if they haven’t, business stakeholders should be worried. No company can remain unchanged; even a well-oiled business process has room for improvement.
Gartner has coined the term “TechQuilibrium” to describe an organisation’s ability to take on new technological advancements. “Not every industry needs to be digital in the same way or to the same extent,” says Gartner senior research vice president, Valentin Sribar. “CIOs should partner with their executive teams to design a value proposition that drives the right mix of traditional and digital business.”
CIOs are unlikely to be taken seriously if they start talking about “TechQuilibrium”, but there is a conversation the CIO needs to have about how much appetite the business has for adopting new technologies. And if the business is a slow adopter, where does that leave a CIO who wants to drive technology change? To borrow a term from Gartner’s Hype Cycle, TechQuilibrium appears to open up a “trough of disillusionment” in the career path of aspiring tech leaders. If the organisation is not a good fit, should you stay or should you go?
Can a corporate piece of software change the culture of an organisation? It happened in 1983, when Mitch Kapor developed Lotus 1-2-3 for the IBM PC – the PC’s first killer application. Aldus PageMaker revolutionised desktop publishing. Email on the global internet revolutionised business communications.
These days there are a handful of de facto technology standards that define not only the corporate desktop IT environment, but also the way people are expected to use these tools as part of their day-to-day work.
Office productivity is synonymous with Microsoft. While there are several alternatives, it remains the first choice among governments and major corporations. Through its iterations, MS Office has defined a protocol through which businesses exchange information, crunch numbers and share documents. No business strategy is complete without a full PowerPoint slide deck.
Peter Jensen, chief of digital innovation at Moleskine, the maker of paper notebooks, believes that far too often it is misused. “There is a very real sense that the presentation is a corporate document, and must be perfect in every way. Everything needs to be super perfect.” Moleskine was set up in 1997 as a business that wanted to rekindle the creative spirit by emulating the rough notes and sketches that great artists and writers like Vincent van Gogh, Pablo Picasso, Ernest Hemingway and Bruce Chatwin made in their notepads.
Diane Chaleff, the G-Suite lead in the office of the CTO at Google Cloud, believes traditional office productivity tools limit creativity and ideation. “Traditional technologies enforce a pattern of behaviour, which requires you to complete something.” This formality often means people share ideas much later, and the feedback received is usually irrelevant, according to Chaleff. “You’ve wasted time creating the perfect presentation.” She says work colleagues then see something that looks polished and complete. Their initial reaction is ‘this must be approved’.
The corporate standard templates, fonts, logos, layout and colours hide the core idea, which may never actually get discussed. Rather than being used as a tool in public speaking to present information, the PowerPoint slide deck is almost always used as a corporate reference document.
Google’s rival offering does pretty much the same thing. Both Microsoft Office 365 and G-Suite offer collaboration tools designed to encourage new working practices. But G-Suite was invented by Google, not Microsoft. Chaleff claims this is enough for people in business to break from traditional office productivity and try a new, more collaborative approach to work.
CCS Insight has predicted that by 2023, psychometric testing of software developers will become commonplace.
The analyst firm believes that ethical psychometric testing will appear more regularly as part of the interview process, particularly in areas where software developers are being hired for jobs involving the analysis of personal data and, of course, AI, where biases need to be identified.
Why wait until 2023? Such tests may have more near-term benefits. There is a question on the Myers-Briggs psychometric test concerning someone’s affinity to the words “spire” and “foundation”. Would one use the attribute “spire” or “foundation” to describe a software architect? What about an application developer, or a software engineer in customer experience?
According to Ross Mason, founder of API management company Mulesoft, software developers tend to deliver code for an IT project that has no use outside the domain it was designed for. He says: “We have to start helping developers think about the value they provide in the software they build.”
Consider the Lego analogy for how software is built: programmers use the bricks as building blocks to make things that other people can then see and experience. No one really wants to be the person who actually makes the Lego bricks.
Look at Casa Batlló in Barcelona and the genius of Antoni Gaudi. It is a finished product that visitors from around the world can experience.
In software development, should the goal be the finished product? Mason does not think so. Developing software that meets a specific business need may be seen as aligning IT with the business, but some argue that this goal-based approach lacks depth.
The idea of a “cloud first” strategy is built on a foundation of code reuse through software libraries and microservices accessed in a controlled manner via published APIs. While not every business will become a software company, there are plenty of cases where a factory for producing reusable software components makes sense.
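The Lego analogy can be made concrete with a small sketch: one shared component, written once, consumed by two otherwise unrelated “applications”. The postcode check and both callers are invented purely for illustration:

```python
# A reusable "Lego brick": one well-defined component, published once,
# consumed by any number of applications.
def validate_postcode(code):
    """Shared component: checks a (simplified) two-part postcode shape."""
    parts = code.strip().upper().split()
    return len(parts) == 2 and all(p.isalnum() for p in parts)

# Two independent "applications" reuse the same brick instead of
# re-implementing the check inside their own domain.
def checkout_form_ok(order):
    return validate_postcode(order["postcode"])

def delivery_route_ok(stop):
    return validate_postcode(stop["address"]["postcode"])

print(checkout_form_ok({"postcode": "SW1A 1AA"}))           # -> True
print(delivery_route_ok({"address": {"postcode": "bad"}}))  # -> False
```

A fix or improvement to the shared brick immediately benefits every consumer, which is the economic case for the “component factory” described above.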
Back to the foundation/spire question: the La Sagrada Familia is still under construction. Yet Gaudi’s vision for the cathedral is being built piece by piece, over 90 years since his death. In software, the ability to reuse Lego building blocks of code quickly and easily will be a digital differentiator. And the developers of these Lego pieces are both the foundations and the spires.
During the Future Decoded conference in London, Microsoft discussed a paradigm shift in application development. But it was not setting out its plans for a new software developer’s kit, programming language or the next big thing after cloud native applications and microservices.
Instead, the company wanted delegates to appreciate the importance of data. Why the paradigm shift? Industry pundits describe data as the new oil. The biggest companies in the world are data-driven.
Microsoft believes that artificial intelligence will change the way data-centric applications are developed and deployed. Rather than hard-code software to make use of the vast amounts of data being generated, the application being developed will evolve out of data models that feed AI-based inference engines. Programming becomes supervised machine learning; quality assurance is the feedback loop that applies the data model to real world data and continually optimises it to improve its accuracy; the application is the predictions the data model makes.
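A toy sketch of that inversion, using a deliberately minimal one-parameter model rather than any real Microsoft tooling: the “program” is learned from example data, and the “application” is simply the prediction it makes. The data and learning rate are invented for illustration:

```python
# Learn y ≈ w*x from examples instead of hard-coding the rule.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]  # sensor reading -> target

w = 0.0
for _ in range(200):  # "programming" becomes supervised training
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.01 * grad  # gradient descent on squared error

def application(x):
    """The 'application' is just the prediction the trained model makes."""
    return w * x

print(round(w, 1))  # -> 2.0 (the learned rule, close to the data's slope)
```

Quality assurance, in this framing, is the loop that keeps feeding fresh `(x, y)` pairs back into training so the learned `w` tracks the real world.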
If Microsoft’s assertions are true, and data becomes the new programming paradigm, business and IT need a different approach to how they identify valuable data. It will often prove almost impossible to spot something useful in the reams of data an organisation has collected.
Is there a needle in this haystack?
According to Scott Crawford, head of data science enablement at 84.51°, the technology business at retail giant Kroger, the future for many companies will be determined by their ability to collaborate, using diverse teams to help them to look for a needle in a haystack of data. Crawford believes that as data science expands in an organisation, that diversity of capabilities becomes critical.
IT leaders will also need to assess when a request from the business is a defined project with design, build, test and deploy phases – or when it is more of a programme of continuous development and improvement.
As Crawford points out, there are plenty of use cases that solve a particular problem, which means the data team can crunch the data, create the data model and move on. But there are also cases which build evolution into a data science programme to provide better estimates, using the most recent set of data. One example in retailing may be when the data team is asked to build a better model for forecasting in-store product demand. While it may indeed be possible for the team to develop a superior data model, Crawford said: “Very little thought goes into what happens if the new model wins.” Potentially, the business may change the way it does something. This change could invalidate the data model or, at the very least, mean that it requires further refinement, leading to further adaptation of the business process, ad infinitum.
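One hedged sketch of what catching that invalidation might look like in practice is a routine check that compares the model’s error on its original training data with its error on the most recent data, flagging when the gap is large enough to demand retraining. All numbers and names here are invented:

```python
# Compare the model's error on the data it was built from with its error
# on the most recent data; a large gap suggests a business change has
# invalidated the model and retraining is due.
def mean_abs_error(model, data):
    return sum(abs(model(x) - y) for x, y in data) / len(data)

def needs_retraining(model, training_data, recent_data, tolerance=0.5):
    baseline = mean_abs_error(model, training_data)
    current = mean_abs_error(model, recent_data)
    return current - baseline > tolerance

def forecast(x):
    return 2 * x  # existing in-store demand model (illustrative)

old = [(1, 2), (2, 4), (3, 6)]    # data the model was fitted on
new = [(1, 4), (2, 7), (3, 11)]   # observations after the process change

print(needs_retraining(forecast, old, new))  # -> True
```

The `tolerance` threshold is itself a judgment call: too tight and the team retrains constantly, too loose and a drifting model quietly degrades decisions.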
In the age of AI-driven data applications, such questions will need to be raised every time the business wants something new.