Cliff Saran’s Enterprise blog


February 19, 2019  9:39 AM

WWW at 30: the worldwide swamp

Cliff Saran Profile: Cliff Saran
Uncategorized

Next month marks the 30th anniversary of the world wide web. In 1989, who would have thought the web would touch every aspect of people’s lives – not only in a good way, but also in ways that seem to undermine the fabric of society? What began as an elegant way for researchers across the globe to collaborate, and grew into a platform for free speech, has become a swamp of disinformation, hate, paedophilia and online bullying.

For instance, the BBC’s Countryfile, broadcast on Sunday 17 February, reported on how illegal gambling rings live-stream blood sports such as hare coursing and cock fighting over Facebook and YouTube. And on today’s web, it seems open debate and fair comment can lead to a tirade of abuse targeted at anyone who appears to hold a different opinion.

People must understand they are being nudged

The DCMS’ Fake News and Disinformation report discusses at length how easy it is for organisations to target social media users en masse in the same way online marketing campaigns are used to sell and recommend products.

The techniques have become increasingly sophisticated. Behavioural economics uses so-called “nudge” techniques to try to influence people. The ability to target individuals online through carefully crafted advertising campaigns with subliminal messaging is moving beyond the big marketeers and state agencies with a subversive agenda. Now, a service called TheSpinner claims: “TheSpinner enables you to subconsciously influence a specific person, by controlling the content on the websites he or she usually visits.” Sold as a service starting at just $29.00, it lets anyone sign up and target another individual – for example, in the run-up to a marriage proposal – by ensuring their special person sees a series of 10 related articles when they are online. “People need to be aware this technology can be used,” warns Bridget Kenyon, global chief information security officer at Thales, mirroring one of the findings in the DCMS report.

Digital literacy is key

Facebook, YouTube, et al, take the original premise of the web and democratise information sharing to the point that anybody can post an update, image or video anywhere and at any time. Anyone can receive this post, no matter how irrelevant or inappropriate it is. However, as the DCMS report recommends: “Digital literacy should be a fourth pillar of education. People need to be resilient about their relationship with such sites, particularly around what they read and what they write.”

February 13, 2019  9:41 AM

GE Digital addendum

Cliff Saran Profile: Cliff Saran
Uncategorized

The term “digital” is referenced only twice in the 87 minutes of GE’s Q4 2018 earnings call, which took place at the end of January. In the transcript of the call, posted on the Seeking Alpha financial blogging site, none of the financial analysts who participated asked about the company’s digital strategy, in spite of GE announcing a new $1.3bn digital business on December 13 2018. Their main concern was the company’s Power business, which is referenced 62 times in the transcript.

GE was among the industrial giants showing huge potential to evolve from a company that makes big machines into one that sells software-powered services. So where is it now? Last December, Bill Ruh, the GE executive who led this strategy, left the company.

Computer Weekly first started reporting on the GE story in 2013, when Ruh was vice-president for software at GE Research. At the time, Ruh talked about how industrial IoT would enable GE to predict machine failures and power a service-led business. He eventually headed up the company’s digital division, GE Digital.

Reimagining industry

GE joined the ranks of traditional organisations pioneering platform businesses. Its former CEO, Jeff Immelt, was regarded as a digital visionary. Under his watch, GE became a 150-year-old startup. In a 2015 McKinsey article he wrote: “We want to treat analytics like it’s as core to the company over the next 20 years as material science has been over the past 50 years.”

In the company’s Q1 2017 earnings statement, released just a few months before he stepped down, Immelt stated: “GE is continuing its portfolio transformation and investing in innovations in GE Digital and GE Additive.” The term “digital” appears five times in that statement.

Immelt’s tenure as the CEO of GE was plagued with problems in the company’s power division, something that remains a big issue.

In June 2017, Immelt was replaced by John Flannery, who has since been replaced by Larry Culp. Wind the clock forward to Q4 2018, and in the latest earnings call transcript on Seeking Alpha, “digital” is referenced only twice – to announce that GE Digital will be spun off as a separate company.

GE also announced an agreement to sell a majority stake in field service management firm ServiceMax, a company it acquired in 2017. As Computer Weekly reported at the time of the acquisition, ServiceMax was among the key components of GE’s digital strategy. Combined with the concept of running digital twins of customers’ machines, Ruh believed it would allow GE to move beyond simply predicting machine failures. Instead, he said, GE would be able to deliver business outcomes to its customers, such as higher production yield.

Digital version 2.0

Analyst firm Forrester believes one of the positives to come out of the demise of GE Digital version 1 is that the new version 2 business will be able to operate as a separate company. “GE Digital can focus on developing as a software business and not an internal IT shop for GE industrial units,” Forrester noted.

However, the challenge GE has faced all along is that it is not recognised for its software business. Immelt acknowledged this in a 2017 Harvard Business Review article: “It will take years for GE to fully reap the benefits of the transformations,” he wrote.

When Computer Weekly met Ruh in July 2018, he described how Saudi company Obeikan was building food and beverage applications on top of GE Digital’s Predix platform, helping GE expand the reach of its software. Now that Ruh has left GE, what will GE Digital version 2 evolve into?

Growing a software arm is hard. Looking at GE’s progress to date, there are wider lessons to be gleaned. Every business that wants to compete effectively with agile startups and the web platform giants will need to go through challenging times before its transformation is complete. Those who start the journey may not be the ones who complete it.

Incidentally, “Additive”, which Immelt highlighted in his 2017 trading statement, refers to using 3D printing in manufacturing. And yes, GE does have an additive manufacturing division. But it is not mentioned a single time in the transcript of the Q4 2018 earnings call, posted on Seeking Alpha.


February 8, 2019  11:41 AM

IT misaligned with business’ appetite for data

Cliff Saran Profile: Cliff Saran
Uncategorized

A recent pop-up event at Lloyds Bank in London, hosted by Domino Data Lab, highlighted one of the age-old ailments of IT: being misaligned with the business.

The latest example of this was succinctly described by MoneySuperMarket.com’s analytics head, Harvinder Atwal, during a presentation at the event.

One of the surprises to come out of his presentation is that data scientists tend to do their work on their own laptops.

Laptop data models

Data scientists tend to use their own laptops, loaded with real customer data, to create data models. They see a need to do this because corporate IT puts in layer upon layer of process and procedure.

Data scientists have to request data access from IT. They need to negotiate with IT for the required compute resources, then wait for these resources to be provisioned. They may need to go back to IT to install query tools. “As a data scientist, you just want to use data as quick as possible,” Atwal said in his presentation. But in Atwal’s experience, IT is stuck in a 20th century operating model. “People don’t have access to data warehouses.”

Continuous improvement

Clearly, having real customer data on a laptop may well infringe data regulations and is definitely a no-go area for IT security and the corporate governance, regulation and compliance teams. Nevertheless, without this data and the right data manipulation tools and environment, data scientists are unable to do their work effectively. Given that businesses hire data scientists in the hope they will discover some hidden meaning in the masses of data they collect, it seems illogical that getting access to the data and the tooling these experts need is so complex.

This becomes more of a challenge when data scientists wish to improve the accuracy of their models through the data analytics equivalent of continuous testing and delivery. In an ideal world, a data model will continuously improve once deployed, because insights from new customer data provide a feedback loop. The highly structured approach to delivering corporate IT is not aligned with the business’s need to gain insights from data rapidly.
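
As a rough illustration of that feedback loop, the sketch below uses scikit-learn’s incremental learning API – an assumption on my part, since the blog does not say which tools MoneySuperMarket’s team actually uses – to fold new customer outcomes back into a deployed model rather than rebuilding it from scratch.

```python
# Minimal sketch of a model feedback loop, assuming scikit-learn is available.
# The data here is synthetic; in practice it would be each day's customer outcomes.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()

# Initial training on historical customer data.
X_hist = rng.random((1000, 5))
y_hist = (X_hist[:, 0] > 0.5).astype(int)
model.partial_fit(X_hist, y_hist, classes=[0, 1])

def fold_in_new_data(X_new, y_new):
    """Incrementally update the live model with the latest observed outcomes."""
    model.partial_fit(X_new, y_new)

# Each deployment cycle, fresh data refines the model instead of triggering
# another round of provisioning requests to corporate IT.
X_today = rng.random((200, 5))
y_today = (X_today[:, 0] > 0.5).astype(int)
fold_in_new_data(X_today, y_today)
```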

Domino Data Lab is one of a number of companies hoping to tackle this problem by providing a containerised environment for data scientists to work in.


January 29, 2019  9:43 AM

AI keeps seeing average people

Cliff Saran Profile: Cliff Saran
Uncategorized

There is a mantra in technology that if a program is fed bad data, it will produce bad results. The same is true of AI. The quality of a machine learning application is directly related to the quality of data it is fed.

Sébastien Boria, mechatronics and robotics technology leader at Airbus, says an AI system that takes a sample of all the people who work on a particular manual process, to understand how they work, may well come up with a machine learning model based on the average of the people sampled. He says: “Just because 50% of the population does something, does not make it the right solution.”

Statistically speaking, population samples generally follow the so-called Normal distribution, where the majority of people sit under the top of the bell curve and around 5% at the tail ends. This may be fine for discarding anomalous results and one-offs, but when applied to machine learning, Boria believes the top performer’s results “may be seen by the machine as an anomaly.”
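
To see how that can happen, consider the simple z-score filter below, of the kind routinely used to clean training data. The numbers are invented purely for illustration – they are not Airbus data.

```python
# Illustrative only: a z-score outlier filter flags the fastest worker as an "anomaly".
import numpy as np

# Seconds taken by ten workers to complete the same manual step.
# The 31-second result is the top performer, not a measurement error.
times = np.array([52, 49, 55, 50, 53, 48, 51, 54, 47, 31])

z_scores = (times - times.mean()) / times.std()
flagged = times[np.abs(z_scores) > 2]

print(flagged)  # [31] -- the best result is exactly what the filter would discard
```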

Miracle on the Hudson

A decade ago, Captain Chesley Burnett Sullenberger landed US Airways flight 1549 on the Hudson river in New York after both engines on his Airbus A320 failed. In the FAA transcript of the cockpit recording, air traffic control offered to guide Sullenberger to an emergency landing at Teterboro airport. He replied: “We can’t do it… we’re gonna be in the Hudson.” There is no flight operations manual on how to land an A320 on a river, but Sullenberger did it, saving all 155 passengers and crew.

Industry 4.0 needs quality data

Sullenberger’s heroic actions are just one example of how humans can think outside the box. It is often necessary to work around normal procedures in order to achieve extraordinary results.

AI and machine learning were among the hot topics discussed at the recent World Economic Forum in Davos. Many believe AI technology will be the fuel that powers Industry 4.0. This is not necessarily about fully automated manufacturing. Instead, relatively small changes can be scaled up, significantly improving operational efficiency. But, as Airbus’ Boria has noted, basing machine learning on the average of how people work results in average AI. Industry 4.0 needs extraordinary AI, and this means capturing exceptional data.


January 24, 2019  12:57 PM

Windows 7: Don’t expect Microsoft to handle all Windows 10 issues

Cliff Saran Profile: Cliff Saran
Uncategorized

It does not seem that long ago that IT departments were battling to upgrade PCs from Windows XP to Windows 7, as Microsoft finally pulled the plug on support for the legacy operating system.

Now it is the turn of Windows 7, which has become almost a de facto standard for end user computing on the PC. This is mainly down to the absolute flop that was Windows 8. The idea of trying to push a touch-screen-optimised user interface without a start menu button into enterprise IT probably seems ludicrous now, but that is exactly what Microsoft tried to do with Windows 8. Businesses stuck with what they knew: Windows 7. And now, as those still on Windows 7 start counting down the days to January 14 2020, when support ends, many see Windows 10 as the only route forward.

Yes, Windows 10 does have a start menu, but it is certainly not simply a more modern Windows 7. The radical change Microsoft introduced with Windows 10 is that Microsoft manages the operating system updates – not corporate IT.

Auto-updates keep Windows 10 fresh

Microsoft promises there will no longer be big-bang, high-risk roll-outs of a new Windows operating system. Instead, users will receive fresh features on an ongoing basis, twice a year.

Why would this not be a good idea?

There are plenty of reasons. Which IT admin would trust Microsoft with the reliability of updates that affect an entire PC estate, potentially spanning thousands of machines? Admittedly, it is an approach that works on Google’s Android and Apple’s iOS operating systems.

Application compatibility

But one only has to look at the app stores to read horror stories of apps that used to work until Google or Apple updated their respective OSes. Windows is potentially far worse, because Windows claims backwards compatibility. Enterprise software providers do not have to rush out a new version of their applications to support the latest Windows OS if their software will run on an older version in so-called “compatibility mode”. But as anyone who has undertaken application compatibility testing will appreciate, just because something should be compatible in theory does not mean it works when deployed.

Even something as ubiquitous as Google Chrome has been known to fail. What hope is there for bespoke software or third-party applications?

Microsoft may well handle the updates to Windows 10, but IT teams and third-party developers will still need to ensure their software is compatible with each update.


January 15, 2019  11:37 AM

AWS mixes toxic cocktail for open source

Cliff Saran Profile: Cliff Saran
Uncategorized

There is currently a crisis unfolding in the open source world, with a number of companies changing their licensing to protect revenue. This has arisen due to a potentially toxic situation where public cloud providers have introduced managed services based on free open source products.

Is AWS toxic to open source?

Open source software providers generally commercialise their offerings by developing value-added services, additional enterprise-grade products and support. This is the kind of model adopted by NoSQL database provider MongoDB. It can reap the rewards of developing the open source NoSQL database as and when commercial customers pay for these enterprise features. This funding is essential because it helps smaller open source companies continue to invest development time in open source products.

However, if a public cloud provider offers its own managed service, that funding quickly diminishes. In October 2018, MongoDB introduced the so-called Server Side Public License (SSPL), which effectively curbs the ability of public cloud providers to build their own commercial services based on the free, open source technology.

So has AWS decided to pay for this licence? It seems it may not have.

AWS DocumentDB: It’s MongoDB compatible

AWS recently introduced DocumentDB, a managed NoSQL database service compatible with MongoDB 3.6. While this is not the latest MongoDB release, AWS now has a rival cloud-hosted NoSQL product. From a customer perspective, AWS can offer unmatched scale and resiliency across multiple availability zones.
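
The compatibility claim is what makes DocumentDB a drop-in rival: an application written against a standard MongoDB driver can, in principle, simply point at a DocumentDB cluster instead. The sketch below uses the pymongo driver to show the idea – the endpoint, credentials and certificate file are placeholders, not a real configuration.

```python
# Sketch of MongoDB compatibility: a standard MongoDB driver (pymongo) talking
# to a hypothetical Amazon DocumentDB cluster endpoint. All values are placeholders.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://dbuser:dbpass@my-docdb-cluster.cluster-example.us-east-1.docdb.amazonaws.com:27017",
    tls=True,
    tlsCAFile="rds-combined-ca-bundle.pem",  # CA bundle supplied by AWS
    retryWrites=False,                       # retryable writes are not supported
)

orders = client["shop"]["orders"]
orders.insert_one({"customer": "A123", "total": 42.50})
print(orders.find_one({"customer": "A123"}))
```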

Clearly there are many organisations that have built MongoDB systems on-premise, and many use cases where on-premise makes sense. But managed cloud services represent the path enterprise IT is taking. MongoDB even offers its own managed service, MongoDB Atlas, available on the AWS marketplace.

It would appear that AWS allowed an open source project to thrive and built business offerings around that product – but when the licensing for that product changed, AWS introduced a rival, compatible offering. It may well make commercial sense. But morally, is this the right approach for a company with such a dominant position in the public cloud market?

The internet would not exist without open source software. By adding popular and innovative open source projects to their platforms, public cloud providers lower the technical barriers to enterprises adopting new technologies. But unless there is a viable way to fund innovative open source companies, public cloud companies like AWS will eventually kill off open source innovation.


August 18, 2014  9:58 AM

CIOs need to nurture raw IT talent to power the future economy

Cliff Saran Profile: Cliff Saran
CIO, IT Works, Skills

Each year, with the publication of A-level results, hundreds of thousands of school leavers embark on a journey that ultimately fuels the talent pool feeding the UK economy.

Some argue that a generation brought up on the internet have different expectations of work than previous generations. But, while the worldwide web was never part of their early education, the people who started their careers in the 1990s have only ever worked in the internet era. It is these people who are now the middle and senior managers in business and government.

There is an opportunity for these managers to think differently about how work should be organised and consider the real value of staff and how to motivate them, in a bid to help their organisation become the next Facebook or Google.

Empires are not created overnight; revolution is never instant. But there is a groundswell, a shift in public opinion. The trusted brands of the past look prehistoric compared with the likes of Amazon or Apple.

Two decades after the 30-year-old Jeff Bezos established Amazon, a new generation of business leaders are realising the tools of the internet era – social, cloud, analytics and mobile – offer the potential to rewrite business rules. In this respect, IT can make a difference. Amazon digitised book selling in 1995; today digitisation is set to propel old-school business ideas into the internet era.

How often has IT been greeted by the remark: “I didn’t know it could do that?” More often than not, users complain the application does not do something they want. Today’s school leavers instinctively know what “right” feels like from an application perspective.

Rather than hire to fulfil a current business need, there is a case for CIOs to nurture the skills their organisations will need in the future through job placements, apprenticeships and graduate recruitment programmes. In doing so, the CIO can build an IT talent pool to inform and support senior management on the journey to digitisation.


June 25, 2014  11:37 AM

Blame the Poisson

Cliff Saran Profile: Cliff Saran
Poisson

I recently met Mark Rodbert, CEO of Idax Software, who has an interesting theory on statistics. We often see the ‘Normal’ bell-shaped distribution – where the top of the bell represents the most likely outcomes, and the left and right tips (outliers) are rare events. Rodbert believes real-world events are more likely to follow a Poisson distribution – and this has implications for IT. In this guest blog, Rodbert explains the theory:

At idax we spend a lot of time demonstrating that maths really can help describe the real world. As idax uses mathematics to identify individuals with unusual access, it’s pretty important that our clients share our understanding.

Of course, people are used to getting on planes, making a phone call or using Amazon, all of which require pretty sophisticated analytics, but in the realms of big data some things are still counter-intuitive. If we got two sales leads last week and one the week before, we’re on an upward trend; if my train was late twice last week, it will be late this week; and, most importantly for us, if I find several people with a high-risk profile in their access, then it must be someone’s fault.

London 2012 – Mo Farah (Photo credit: garda)

But how likely really are these events? Well, it turns out that what we need is not someone to blame, but the Poisson distribution. The Poisson is a very versatile statistical tool, rather like a lopsided normal distribution, that is good for estimating event frequency, especially if the events are rare. And my all-time favourite Poisson example concerns the distribution of gold medals for Team GB at the London Olympics. It seems strange to remember that at the start of the games we went a whole three days without a British gold medal. As the press shrieked that we were heading for disaster, unable to meet our targets despite massive investment, the nation held its breath. So what really were Mo Farah’s chances?

Well, as we all now know, actually pretty good. Of course, only an idiot would assume that winning 29 gold medals over 16 days should equate to two every day with Sundays off, but how likely was a medal-less day? Well, if you assume a Poisson distribution and take an average of 1.8 a day, the chance of a day with no medals is 16%. The chances of a super Saturday with six medals were actually 7%.
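
For anyone who wants to check the zero-gold figure, the Poisson probability mass function is easy to compute directly – a quick sketch in Python:

```python
# P(X = k) for a Poisson distribution with mean lam.
from math import exp, factorial

def poisson_pmf(k, lam):
    return (lam ** k) * exp(-lam) / factorial(k)

# With an average of 1.8 golds a day, the chance of a medal-less day:
print(round(poisson_pmf(0, 1.8), 3))  # ~0.165, i.e. roughly 16%
```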

[Chart: Poisson-predicted versus actual distribution of Team GB gold-medal days]

The bad news is that, as you can see from the chart above, the Poisson doesn’t quite fit what actually happened. The good news is that a day without any golds was actually more likely, at 38% of all days. The least likely (below 5) was a single-gold day, which only happened once – the last day of the boxing, since you ask. So why does any of this matter? Because it shows that human beings are very bad at estimating how frequently things are likely to happen. We assume that events are evenly distributed and get confused when they’re not. Not much of a problem with gold medals; quite a big problem when you’re trying to detect fraud, rogue trading and high levels of access risk. We assume that because unusual failures are, well, unusual, they are also uniformly infrequent.

So when it comes to access and identity management, it’s clear that an approach that defines cumulative controls by exception management – otherwise known as “my boss checks my access” – will perform well against the frequent-but-not-so-bad, but does nothing to stop the infrequent-but-high-risk. So the good news is that if you ask your staff why they have access to something, you’ll probably remove a few copies of Visio; but you’re unlikely to spot the guy with access to the general ledger and the payments system who’s ripping the company off. Which just goes to show that what companies need is real analytical capability – and, of course, a bit of mathematics.

Mark Rodbert is CEO of Idax Software, the identity analytics software provider


May 20, 2014  7:17 AM

Web-scale IT shows what technology can achieve

Cliff Saran Profile: Cliff Saran
Uncategorized


Looking at web companies such as Amazon, eBay, Facebook or Netflix, it is hard not to be impressed by the scale of the operations they have achieved.

Few organisations can boast the size of their customer base. The speed with which they have gone from zero to world domination has to be admired. 

At the heart of these organisations is IT. This is not the off-the-shelf product that IT decision-makers buy from a preferred supplier. Web-scale IT – the level of performance and scalability web businesses’ IT architectures achieve every day – goes beyond the capabilities of most large IT providers.

While the industry has done very well in finding existing business problems and building so-called solutions, it has been wholly inadequate at addressing the future problems people haven’t thought about yet. 

And by doing business in ways no-one had previously considered, these web giants soar above those whose business models are limited by traditional thinking. 

Unlike Nicholas Carr’s controversial Harvard Business Review essay – which questioned the relevance of IT in business – we can learn from Amazon et al and accept that technology can make a difference. In fact, the way some organisations do IT is revolutionising the industry, to the extent that some of their innovations – such as MapReduce, the distributed data-processing model originally devised by Google – are now accepted ways of solving certain IT problems.
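
For readers unfamiliar with the pattern, the toy sketch below shows the essence of MapReduce – a map step that emits key/value pairs and a reduce step that aggregates them. It runs in a single process purely for illustration; real systems distribute both phases across thousands of machines.

```python
# Toy word count in the MapReduce style: map emits (word, 1) pairs,
# a shuffle groups them by key, and reduce sums the counts per key.
from collections import defaultdict

documents = ["web scale it", "web giants build web scale systems"]

# Map phase: emit (word, 1) for every word in every document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: aggregate the values for each key.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts)  # {'web': 3, 'scale': 2, 'it': 1, 'giants': 1, ...}
```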

E-commerce, social media and smart data place demands on businesses that relational databases cannot support at speed or reasonable cost. In this week’s issue, Computer Weekly looks at how NoSQL is revolutionising data access. 

The Gartner report, “Capacity and performance management form the basis of web-scale IT”, which I covered in the Computer Weekly article on web-scale IT infrastructure, describes how eBay, Facebook and Netflix achieve web-scale computing using home-grown and open source tools. While many of the IT operations of these web giants are unfamiliar to most in mainstream IT, they give us an insight into how such organisations drive the business through their pioneering efforts.



May 16, 2014  3:27 PM

Big data technology has its work cut out to harness web analytics

Cliff Saran Profile: Cliff Saran
CIO


What can we learn from companies such as eBay and Amazon? These internet businesses are at the cutting edge of technology.

The recent Gartner CRM summit gave delegates an understanding of what CRM means to a web-only retailer. The processing eBay conducts to understand customers better, for example, is eye-watering. The web gives retailers incredible insights into customer service. It is not only possible to track a customer’s identity but, thanks to smart web analytics, eBay can follow the buyer’s journey.

David Stephenson, head of global business analytics at eBay, says it’s a bit like strapping a video camera to a customer’s head. Recording every interaction a customer makes means the auction site collects millions of hours of web analytics. Making sense of it all is a big data problem. In fact, eBay produces 50TB of machine-generated data daily. It also needs to process 100PB of data every day to understand what its customers are doing. Sampling this data may have worked in the past, but this only gives a statistical snapshot.

In the era of customer focus, eBay strives to capture and analyse all the data it collects. With this information, Stephenson believes eBay can offer its customers intuitive, almost intelligent, recommendations. The technology supporting the web analytics eBay undertakes does not come cheap. Nor is it available off the shelf. There is no such thing as a “big data solution” for the level of data processing eBay shoulders.

The company needs to work with suppliers to build bespoke hardware and software for its requirements, because using a traditional data warehouse would be too slow and prohibitively expensive to scale. But even a custom data processing engine cannot comprise the whole answer.

The firm uses three systems: a traditional data warehouse appliance, a NoSQL database and the custom appliance to analyse its customers’ journeys. So while it makes perfect sense for businesses of all sizes to use web analytics to understand customer interaction, an immense amount of technical investment and expertise is required to do so effectively.


