Cliff Saran’s Enterprise blog

January 16, 2020  9:57 AM

Windows 7: Don’t cry for me

Cliff Saran

It may have gone unnoticed with the January 14 end of support deadline for Windows 7, but Microsoft’s 10-year-old OS had one last Patch Tuesday update. And, surprise, surprise, this included a critical security update for CVE-2020-0611, a remote desktop vulnerability reported by the NSA, which affects Windows 7 and newer operating systems.

In the past, Microsoft has remained committed to releasing the most critical security patches for unsupported operating systems, such as the Windows XP fix for the WannaCry attack, which afflicted systems around the world, including legacy hardware at the NHS. In February 2018, the Lessons learned review of the WannaCry Ransomware Cyber Attack report for NHS England reported that 80 out of 236 hospital trusts across England were affected, along with 595 out of 7,454 GP practices (8%) and eight other NHS and related organisations.

Organisations have had several years to migrate to Windows 10, which was released in 2015, starting the five-year countdown to Windows 7 end of support. But businesses do not generally shift from something that works well – like Windows 7 – to a new operating system just because Microsoft has released a new version. Migrating large PC estates can take years, as older PCs are replaced with new ones running the latest Windows OS. Certain applications and embedded systems cannot easily be migrated to the new OS, and remain on an unsupported operating system, leaving them vulnerable to cyber attacks.

Could something like WannaCry happen again, with a vulnerability impacting legacy Windows 7 machines? Certainly every Patch Tuesday from now on will list critical vulnerabilities in Windows 10. How many of these also impact Windows 7?

“WannaCry was a clear example of the dangers that businesses can face when they are using software that has reached end of life,” says Ian Wood, Senior Director, EMEA Cloud & Governance Business Practice at Veritas.

Critical to health

Looking at the health service, due to device impact and criticality to clinical workflows and patient care delivery, many unsupported devices cannot simply be disconnected from clinical networks without severely disrupting operations. For example, MRI machines can be operational for over 20 years, far outliving their operating systems. More devices running on unsupported operating systems translate into a larger attack surface and indefinite exposure to cyber risk.

Data pooled across several hospitals from healthcare cybersecurity specialist, Cynerio, has found that radiology departments are most affected. Its research found that 40% of all connected medical devices run on Windows and almost 45% of devices like MRIs, CTs, and X-Rays run on Windows 7. These machines have particularly long life cycles. From this data, Cynerio estimated that over 20% of all medical devices run on the unsupported Windows 7 OS. Unsupported devices cannot be fully secured unless taken offline. “No device is risk free, especially network-connected devices. Medical devices are the weakest link: they are not designed with security in mind, have extensive lifecycles, and often cannot afford any downtime,” says Leon Lerman, Cynerio’s CEO and co-founder.

January 14, 2020  8:37 AM

The challenge of metadata at the edge

Cliff Saran

Nelson Petracek, chief technology officer at Tibco, says that one of the issues in deploying and managing edge computing devices is how device metadata will be managed and governed. Such metadata may contain the device’s location, manufacturer, date of installation and last maintenance date.

In this guest blog post, Petracek discusses how device topology and relationships can be managed and governed, and how this representation can be kept in sync with the physical layout:

With respect to device metadata, there will usually be a catalogue or metadata repository included as part of the overall architecture. It’s most likely that it will be in the datacentre or in the cloud and will act as a centralised function. Not only will this catalogue provide a picture of what is deployed and where, but there may also be data that is needed from this catalogue by different areas of the IoT processing pipeline during device data processing.

When running logic, say at the gateway level, it may be necessary to draw upon reference data to make an educated decision. For example, various parts of metadata around the devices, including the manufacturer, when it was put into service, or its maintenance history might be required in order to complete the decision making process.
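As a toy sketch of that idea (the field names, device IDs and thresholds here are invented for illustration, not drawn from any Tibco product), gateway logic might combine a locally cached metadata record with a live reading like this:

```python
from datetime import date

# Hypothetical catalogue extract cached at the gateway; all field
# names and values are illustrative only.
DEVICE_METADATA = {
    "meter-0042": {
        "manufacturer": "Acme",
        "installed": date(2012, 6, 1),
        "last_maintenance": date(2019, 3, 15),
    }
}

def needs_inspection(device_id: str, reading: float, threshold: float = 95.0) -> bool:
    """Combine a live reading with cached metadata to decide, at the
    gateway, whether to raise a maintenance flag without a round trip
    to the central store."""
    meta = DEVICE_METADATA.get(device_id)
    if meta is None:
        return True  # unknown device: escalate by default
    years_since_service = (date.today() - meta["last_maintenance"]).days / 365
    # Flag only when the reading is high AND the device is overdue for service
    return reading > threshold and years_since_service > 1.0
```

The point is less the rule itself than where it runs: the metadata travels to the gateway once, rather than every decision travelling back to the catalogue.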

It is unlikely that organisations will want 100,000 locations hitting this central store for metadata every time a device activates. Instead, it’s more likely that metadata will be applied closer to the datacentre or to the cloud, especially when using it as part of the model or the rules.

Maintaining and managing the overall relationships between devices – the topology – is also critical. It’s key to understand where the device is, to what it is attached, and how everything is linked together. This information is key to understanding the behaviour of an IoT network, and can help in ensuring that decisions are determined and optimised in the correct context and state.


One well-known example comes from what you would see in a power grid. For example, if you’re a utility company responsible for distributing power to consumers, you are concerned with how power gets from the source to a meter attached to a house. Many pieces of equipment – and thus devices – are involved in this process, including the meter, transformers and substations. When looking at a distribution network for electricity, you have a vast network of linked devices, and for a variety of reasons (safety, accurate maintenance, capacity, thresholds) it is important to have an accurate picture of not only the devices themselves, but also their relationships. If a new meter or line is added, or a transformer is changed, the blueprints and recorded topology must also reflect this change. Changes must make their way back to the metadata management environment so that proper decisions can be made, both in batch (future infrastructure planning) and in real time (power delivery and restoration in the event of a failure).
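A minimal sketch of such a recorded topology (the device names are invented) shows why keeping it in sync with the physical network matters: answering a question like “which meters does this transformer serve?” is just a walk over the recorded links, so a stale map gives wrong answers.

```python
# Toy distribution-network topology: each device maps to the upstream
# device it is attached to (names are illustrative).
TOPOLOGY = {
    "meter-17": "transformer-3",
    "meter-18": "transformer-3",
    "transformer-3": "substation-A",
    "substation-A": "source",
}

def path_to_source(device: str) -> list:
    """Walk the recorded topology from a device back to the source."""
    path = [device]
    while path[-1] in TOPOLOGY:
        path.append(TOPOLOGY[path[-1]])
    return path

def affected_meters(failed_device: str) -> list:
    """Meters whose route to the source passes through the failed
    device - e.g. the customers hit by a transformer fault."""
    return sorted(m for m in TOPOLOGY
                  if m.startswith("meter-") and failed_device in path_to_source(m))
```

If a transformer is swapped in the field without the recorded topology being updated, every query of this kind silently returns the wrong answer.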

There are complete systems for managing this, but they can be quite complicated. That level of complexity isn’t always required to achieve the capability, but some mechanism is necessary for capturing changes – whether through automated processes, periodic introspection, or a combination of the two.

What’s actually out and deployed must be reflected in the device and metadata catalogue. We are not yet at the point where we can dynamically go out and automatically discover all the devices that are running everywhere along with their relationships and metadata. One can do some of this automatically, but there is likely some aspect that unfortunately is still going to be manual.

December 11, 2019  10:31 AM

Intel delay: Update an old Windows 7 PC, or buy AMD

Cliff Saran

The shortage of Intel processors is having a knock-on effect on IT’s ability to upgrade whatever legacy Windows 7 PCs still remain in the organisation onto new Windows 10 hardware.

At the time of writing, Microsoft said there were no plans to delay the end of support for Windows 7. “We have been partnering closely with our OEMs to prepare,” it told Computer Weekly.

However, the supply shortage of Intel powered PCs through the channel to business customers has led to concern about planned shipments.

From Computer Weekly’s understanding of the situation, it seems the PC channel has prioritised orders that are for Windows 10 upgrades. This has been happening since the start of 2019, and so far the demand and prioritisation has meant that the big planned rollouts appear to be continuing. But smaller rollouts, or new orders, are being impacted.

Upgrading older PCs

While some Windows 7 PCs can be updated to Windows 10, these PCs tend to be a few years old and close to the time they would naturally be retired. Updating them now not only has the potential to make them unstable and run slower, but the time and expense of the update would be lost if the machine is replaced in six months or so, when the Intel supply is set to return to normal.

A recent freedom of information request submitted by Citrix to NHS trusts reported that half of trusts are running Windows 7 machines. Similarly, there are reports on the internet suggesting that 1,800 HMRC PCs are still on Windows 7.

As Computer Weekly has previously reported, some PC channel companies have been told by manufacturers to prioritise shipments relating to Windows 7 end of support.

But inevitably, some organisations will find they need new PC desktop and notebook equipment. One PC channel organisation Computer Weekly spoke to said: “We can’t give an assurance on new orders being here in the next six weeks – so it’s more impacting people that haven’t planned ahead or have relatively small requirements, as they’re hitting the back of the queue.”

Analyst Gartner has recommended that IT managers desperate to source new business PCs as part of their Windows 10 refresh, purchase AMD-powered hardware. Lenovo, for instance, says it has a wide selection of AMD-based ThinkPads available via the channel to suit the varying needs of business from SMBs through to enterprise and the public sector.

November 28, 2019  8:28 AM

Intel delays means Windows 7 support deadline needs extending

Cliff Saran

In the last few days, HP Inc and Dell have posted results in which their respective chief financial officers have spoken about the issues their businesses face with the supply of Intel processors. This affects the ability of these PC firms to fulfil orders of new PCs for enterprise customers and consumers of high-end PC devices, which tend to use the Intel chipset.

Struggle with Windows 10 refresh

In a transcript of the earnings call of October 26, posted on the Seeking Alpha financial blogging site, HP Inc’s CFO, Steve Fieler, discussed how the supply issues could impact the company’s enterprise customers and their ability to refresh Windows 7 PCs with new Windows 10 machines. Microsoft is due to end support of Windows 7 on January 14, 2020, but the supply issues with Intel processors have meant that PC manufacturers are struggling to ship new Windows 10 PCs to their enterprise customers. “It could be that these current supply constraints actually indeed help prolong the Win 10 refresh. And so there’s a lot of dynamics going on. And that’s why I think the seasonal patterns are likely to be affected, both from a supply, but also on the potential extension of the Win 10 refresh,” Fieler said.

Worsening supply of Intel chips

Similarly, Dell admitted that the Intel CPU shortages have worsened quarter over quarter. In the transcript of the earnings call, posted on Seeking Alpha, Jeffrey Clarke, vice chairman of products and operations at Dell, said the supply issue with Intel processors “is now impacting our commercial PC and premium consumer PC Q4 forecasted shipments.”

Looking at the response from Dell’s CFO, Thomas Sweet, to the question of how Intel’s processor supply issues affect the ability of Dell to fulfil orders from its enterprise customers, it appears that about two-thirds of Dell’s enterprise customers have migrated from Windows 7 to new Windows 10 PCs. However, that still leaves a third of Dell’s enterprise customers on Windows 7.

In the Seeking Alpha transcript of its Q3 2019 earnings call, Intel CEO Bob Swan acknowledged the supply issues with processors, saying: “We’re letting our customers down, and they’re expecting more from us. PC demand has exceeded our expectations and surpassed third-party forecasts.”

But failing its customers – the PC manufacturers – has a direct impact on every enterprise IT department’s ability to complete its Windows 10 refresh before the Windows 7 end-of-support deadline. The ball is now in Microsoft’s court.

November 12, 2019  3:14 PM

Calling tech leaders: Should you stay or should you go?

Cliff Saran

During Gartner’s recent Symposium event in Barcelona, CIOs heard the latest thinking from the analyst firm about where it sees the role of IT leader heading.

Each year, CIOs head to the Symposium events to network and hear from the analyst firm about its latest thinking on IT leadership.

Over the last two decades, CIOs have been told to align IT more with the business. The dotcom boom and bust demonstrated that not every technology-driven idea is necessarily a good idea. Generally speaking, technology-driven initiatives are only effective if a solid technology architecture is combined with a thorough change management programme.

Sometimes, in the past, CIOs were left out of the loop when enthusiastic non-IT people were tasked with running strategic ecommerce and social media platform initiatives. Such platforms later joined the mountain of technical debt the IT department was then tasked with managing.

There is always room to improve

More recently, companies have embarked on comprehensive digitisation programmes, to join up disconnected business processes. Every company CEO has probably looked enviously at Uber and AirBnB, and worried that someone, somewhere, will develop smart software, which puts a firecracker at the heart of their business value, and acts as an industry catalyst for change. And if they haven’t, business stakeholders should be worried. No company can remain unchanged; even a well-oiled business process has room for improvement.

Gartner has coined the term “TechQuilibrium” to describe the ability for an organisation to take on new technological advancements. “Not every industry needs to be digital in the same way or to the same extent,” says Gartner senior research vice president, Valentin Sribar. “CIOs should partner with their executive teams to design a value proposition that drives the right mix of traditional and digital business.”

CIOs are unlikely to be taken seriously if they start talking about “TechQuilibrium”, but there is a conversation the CIO needs to have about how much of an appetite the business has to adopt new technologies. And if the business is a slower adopter, where does that leave the CIO who wants to drive technology change? To borrow a term from Gartner’s Hype Cycle, the TechQuilibrium appears to open up a “trough of disillusionment” in the career path of aspiring tech leaders. If the organisation is not a good fit, should you stay, or should you go?

October 28, 2019  4:21 PM

Time to unplug PowerPoint

Cliff Saran

Can a corporate piece of software change the culture of an organisation? It happened in 1983 when Mitch Kapor developed Lotus 1-2-3 for the IBM PC – the PC’s first killer application. Aldus PageMaker revolutionised desktop publishing. Email over the internet revolutionised global business communications.

These days there are a handful of de facto technology standards that define not only the corporate desktop IT environment, but also the way people are expected to use these tools as part of their day-to-day work.

Office productivity is synonymous with Microsoft. While there are several alternatives, it remains the first choice among government and major corporations. Through its iterations, MS Office has defined a protocol through which businesses exchange information, crunch numbers and share documents. No business strategy is complete without a full PowerPoint slide deck.

Peter Jensen, chief of digital innovation at Moleskine, the maker of paper notebooks, believes that far too often it is misused. “There is a very real sense that the presentation is a corporate document, and must be perfect in every way. Everything needs to be super perfect.” Moleskine was set up in 1997 as a business that wanted to rekindle the creative spirit, by emulating the rough notes and sketches that great artists and writers like Vincent van Gogh, Pablo Picasso, Ernest Hemingway and Bruce Chatwin made in their notepads.

Diane Chaleff, the G Suite lead in the office of the CTO at Google Cloud, believes traditional office productivity tools limit creativity and ideation. “Traditional technologies enforce a pattern of behaviour, which requires you to complete something.” This formality often means people share ideas much later and the feedback received is usually irrelevant, according to Chaleff. “You’ve wasted time creating the perfect presentation.” She says work colleagues then see something that looks polished and complete. Their initial reaction is ‘this must be approved’.

The corporate standard templates, fonts, logos, layout and colours hide the core idea, which may never actually get discussed. Rather than being used as a tool in public speaking to present information, the PowerPoint slide deck is almost always used as a corporate reference document.

Google’s rival offering does pretty much the same thing. Both Microsoft Office 365 and G Suite offer collaboration and tools to encourage new working practices. But G Suite was invented by Google, not Microsoft. Chaleff claims this is enough for people in business to break from the past and traditional office productivity, and to try a new, more collaborative approach to work.

October 14, 2019  2:56 PM

Software development spires and foundations

Cliff Saran

CCS Insight has predicted that by 2023, psychometric testing of software developers will become commonplace.

The analyst firm believes that ethical psychometric testing will appear more regularly as part of the interview process, particularly in areas where software developers are being hired for jobs involving the analysis of personal data and, of course, AI, where biases need to be identified.

Why wait until 2023? Such tests may have more near-term benefits. There is a question on the Myers-Briggs psychometric test concerning someone’s affinity to the words “spire” and “foundation”. Would one use the attribute “spire” or “foundation” to describe a software architect? What about an application developer, or a software engineer in customer experience?

According to Ross Mason, founder of API management company MuleSoft, software developers tend to deliver code for an IT project that has no use outside the domain it was designed for. He says: “We have to start helping developers think about the value they provide in the software they build.”

Lego architects

Consider the Lego analogy for how software is built: programmers use the bricks as building blocks to make things that other people can then see and experience. No one really wants to be the person who actually makes the Lego bricks.

Look at Casa Batlló in Barcelona and the genius of Antoni Gaudi. It is a finished product that visitors from around the world can experience.

In software development, should the goal be the finished product? Mason does not think so. Developing software that meets a specific business need may be seen as aligning IT with the business, but some argue that this goal-based approach lacks depth.

The idea of a “cloud first” strategy is built on a foundation of code reuse through software libraries and microservices accessed in a controlled manner via published APIs. While not every business will become a software company, there are plenty of cases where a factory for producing reusable software components makes sense.

Back to the foundation/spire question: La Sagrada Familia is still under construction, yet Gaudi’s vision for the cathedral is being built piece by piece, more than 90 years after his death. In software, the ability to reuse Lego building blocks of code quickly and easily will be a digital differentiator. And the developers of these Lego pieces are both the foundations and the spires.

October 2, 2019  12:37 PM

Programmes and projects in an age of data applications

Cliff Saran

During the Future Decoded conference in London, Microsoft discussed a paradigm shift in application development. But it was not setting out its plans for a new software developer’s kit, programming language or the next big thing after cloud native applications and microservices.

Instead, the company wanted delegates to appreciate the importance of data. Why the paradigm shift? Industry pundits describe data as the new oil. The biggest companies in the world are data-driven.

Microsoft believes that artificial intelligence will change the way data-centric applications are developed and deployed. Rather than hard-coding software to make use of the vast amounts of data being generated, the application being developed will evolve out of data models that feed AI-based inference engines. Programming becomes supervised machine learning; quality assurance is the feedback loop that applies the data model to real-world data and continually optimises it to improve its accuracy; the application is the predictions the data model makes.
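As a deliberately simplified sketch of that loop (a one-variable least-squares fit stands in for the data model, and the numbers are invented), the “application” is the prediction function, and the feedback loop refits the model as each real-world observation arrives:

```python
def fit(xs, ys):
    """Least-squares fit of y = a*x + b: a stand-in for model training."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return a, mean_y - a * mean_x

def predict(model, x):
    """The 'application': the prediction the data model makes."""
    a, b = model
    return a * x + b

# Initial training data, then the QA feedback loop: fold in each new
# real-world observation and continually re-optimise the model.
xs, ys = [1.0, 2.0, 3.0], [2.1, 3.9, 6.0]
model = fit(xs, ys)
for new_x, new_y in [(4.0, 8.1), (5.0, 9.9)]:
    xs.append(new_x)
    ys.append(new_y)
    model = fit(xs, ys)
```

A production system would use a real learning framework, but the shape is the same: no hand-coded rule maps input to output; the mapping is continually re-derived from data.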

If Microsoft’s assertions are true, and data becomes the new programming paradigm, business and IT need a different approach to how they identify valuable data. It will often prove almost impossible to spot something useful in the reams of data an organisation has collected.

Is there a needle in this haystack?

According to Scott Crawford, head of data science enablement at 84.51°, the technology business at retail giant Kroger, the future for many companies will be determined by their ability to collaborate, using diverse teams to help them look for a needle in a haystack of data. Crawford believes that as data science expands in an organisation, diversity of capabilities becomes critical.

IT leaders will also need to assess when a request from the business is a defined project with design, build, test and deploy phases – or when it is more of a programme of continuous development and improvement.

As Crawford points out, there are plenty of use cases that solve a particular problem, which means the data team can crunch the data, create the data model and move on. But there are also cases which build evolution into a data science programme to provide better estimates, using the most recent set of data. One example in retailing may be when the data team is asked to build a better model for forecasting in-store product demand. While it may indeed be possible for the team to develop a superior data model, Crawford said: “Very little thought goes into what happens if the new model wins.” Potentially, the business may change the way it does something. This change could invalidate the data model, or at the very least mean that it requires further refinement, leading to further adaptation of the business process ad infinitum.

In the age of AI-driven data applications, such questions will need to be raised every time the business wants something new done.

September 20, 2019  8:11 AM

To be strategic, IT companies must align with business strategy

Cliff Saran

Last year, Singapore Airlines (SIA) announced a strategic relationship with the Chinese cloud giant, Alibaba. The term “strategic” is often bounced around the IT industry and usually means the organisation in question has an IT strategy in which a major component is based on products and services from a so-called strategic vendor.

But in the case of SIA, the strategy is very much business focused: from using AliPay to enable customers to pay for the airline’s products and services to the use of the Alibaba ecommerce platform. SIA has also been working closely with Cainiao Network, the logistics arm of Alibaba Group, to enhance international air cargo services, joining Cainiao Network’s broader efforts in building a global smart logistics network that delivers across China within 24 hours and globally within 72 hours.

At the Gartner Catalyst conference in London this week, Gartner analyst Dave Aron said that the airline has now extended the relationship even further, enabling passengers to buy gifts via Alibaba’s ecommerce platform, and have them delivered to their destination airport. Gartner believes that web giants like Amazon, Google, Baidu, Tencent and Alibaba can extend into every industry sector because they provide so many products and services that they are effectively infrastructure for everything. And it is no surprise that car makers have partnered with the likes of Apple and Google, to offer value-added services on their cars, which then become four-wheeled mobile technology platforms.

Are mega vendors strategic?

What is interesting about Aron’s observations is that the traditional IT mega vendors – the likes of IBM, Microsoft, Oracle and SAP – are not present. In the past, for an airline like SIA to integrate its products and services with another ecommerce platform would have required a huge amount of costly work, probably using products and services from one or more of these mega vendors. While it may well be complex, connecting to and taking advantage of the network of products and services available through an organisation like Alibaba, via an application programming interface, has made it far easier to achieve business objectives that would have seemed near impossible a few years ago.

For the CIO, the question is: how strategic is IBM, Microsoft, Oracle or SAP to your business? This is not a question about whether they are relevant to the IT strategy, but how well these IT giants fit with the organisation’s mid- and long-term business strategy. If these mega vendors are not aligned closely enough, then CIOs should actively look at reducing their footprint, in order to invest more of their valuable IT budget with the new breed of strategic IT providers that are better aligned.

September 13, 2019  12:46 PM

There is nothing new in IT

Cliff Saran

Once a new technology becomes mainstream, the industry needs to find a new thing to sell its customers. It’s been this way since the time IBM provided big metal boxes with integrated software and called the system a mainframe. The recently introduced z15 is the latest version of this approach to IBM computing, with a single, highly integrated enterprise server to run application workloads.

But vertical integration relies on highly resilient, high-performance hardware and software systems, and these tend to come at a high cost. So the alternative is distributed computing, where workloads can be organised to run across farms of low-cost servers. These servers used to be physical, then they were virtualised, and now everyone in the industry talks about containers and using Kubernetes for container orchestration. Then there is the public cloud, which provides a template showing IT departments how they can build internal IT as a service that mimics the ease of provisioning of IaaS, PaaS and SaaS.

All of these things are welcome additions to the IT toolbox. But the issue the CIO faces is that enterprise IT is renowned for hoarding old tech: nothing ever really gets thrown away. Just consider the legacy acronyms from the last few decades. Each promised to revolutionise the way software systems were developed. There was the Common Object Request Broker Architecture (Corba) from the Object Management Group, the Service Oriented Architecture (SOA) and the Enterprise Service Bus (ESB). Each offered an industry vision for another three-letter acronym: EAI, or Enterprise Application Integration.

Faking EAI

These days, EAI is often considered too complex and costly a programme to run. Bridging the gaps between business processes running on different IT systems is now called digital transformation. And rather than re-keying, or cutting and pasting, from one IT system to another, Robotic Process Automation (RPA) does this re-keying under the control of a bot running a script.

What is fascinating is that the approaches being taken to achieve RPA are not dissimilar to how legacy green screen systems used to be given a shiny new Windows front-end that re-keyed user input into a terminal session, or how the early price comparison sites in the late 1990s screen-scraped pricing information.
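A toy regular-expression scraper (the page snippet and routes below are invented) shows how little machinery that late-1990s approach needed – and how close it is in spirit to an RPA bot re-keying data it reads off a screen:

```python
import re

# Invented fragment of a rendered page, much as a 1990s price
# comparison site might have fetched it.
PAGE = """
Flight LON-NYC ... £299
Flight LON-PAR ... £89
"""

def scrape_prices(text: str) -> dict:
    """Pull route/price pairs out of display text rather than a
    structured API: screen scraping in miniature."""
    return {route: int(price)
            for route, price in re.findall(r"Flight (\S+) \.\.\. £(\d+)", text)}
```

The fragility is also the same: a cosmetic change to the page layout breaks the scraper, just as a redesigned screen breaks an RPA script.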

The z15 may be the latest incarnation of what IT people used to call the legacy “mainframe” system. But legacy is embedded everywhere in modern IT. The DNA inside SOA, ESB and even RPA has much in common with modern cloud architectures.
