Tony Collins, investigative journalist and former executive editor of Computer Weekly: “Robert was always vibrant, bursting with humour and a wonderful contact because he made it his business to be well informed. In an industry that has an abundance of vanilla characters he always stood out for the best of reasons. He never kowtowed to the conventional view. He will be much missed.”
Jamie Liddell, editor of Outsource Magazine: “I only had the pleasure of meeting Robert on a very few occasions, but was struck – as were a huge number who encountered him – by his remarkable perspicacity, his overwhelming enthusiasm for the industry and for best practice, and above all by his infectious bonhomie: the twinkle in his eye lit up the room, and the sourcing space, alike, and he will be sorely missed. My heartfelt condolences go out to his family, friends and colleagues.”
I recently met Todd Greene, the CEO and founder of PubNub, a company that provides a real-time data stream network that allows a two-way flow of data.
Its customers include Coca-Cola, McDonald’s, Toyota, GetTaxi, and BT Sport. The Internet of Things will only reach its potential if networks exist that can send data two ways in real time.
He had lots of interesting things to say about the Internet of Things. I asked if I could put some of this in a guest blog and he agreed.
By Todd Greene
“Everyone is talking up IoT as the next mega trend. Analysts are predicting that IoT will be a multi-trillion dollar category, and thousands of companies, from GE to Evernote, are redefining themselves as IoT companies. Gartner’s 2014 “Hype Cycle” has “IoT” placed neatly at the zenith of the “Peak of Inflated Expectations.” Companies across the technology spectrum are rushing to build compelling products and claim their IoT stake – cashing in on the gold rush of IoT product development.
The big problem is the lack of a well-understood tech stack: the layers of components or services used to provide software for the Internet of Things. This means that IoT developers are building top-to-bottom proprietary systems, with custom software, hardware, and communication layers. Until an IoT tech stack is codified and adopted, IoT will be hobbled by security issues, time-to-market challenges, and stability and reliability problems.
IoT Generation I – The Custom Stack
The current state of IoT development is heavily risk-prone. Designs often work well “in the lab”, but fail at a high rate when deployed in the wild. Intermittent Internet connectivity, firewalls, proxies, spotty cellular connects, and other “real-world” bumps hamper success. Some of the biggest challenges include:
· Security Holes: The IoT raises a myriad of security concerns. Expecting each IoT development team to engineer best-practice security into each custom stack is leading to well-publicized IoT security breaches (security cameras, wireless routers, and more).
· Failure Detection & Remote Updates: Most custom stacks don’t easily detect remote failures, nor do they provide a mechanism for updating devices remotely. Expecting manual processes for updating IoT device firmware at scale virtually guarantees disaster.
· Cost and Time-to-Market: Custom stack development costs more, makes delivery dates unpredictable, and increases overall project risk.
· Product Silos: Bespoke communication means no interoperability between disparate devices. This concern will expand as more IoT products are released; enterprises and consumers both will expect their devices to work together across vendors.
· Brittle and Bug-Prone: Bespoke IoT stacks are hard to upgrade, and failure-prone. The detailed knowledge of the custom stack is lost as the SI project ends, or as the IoT team disbands to move to other projects.
IoT Generation II – An IoT Stack Emerges
The good news is that IoT products are maturing, and with them we’re seeing a stack starting to emerge. Three trends are driving this change. First, fast-growing IoT categories like Smart Home (Nest, Insteon, Dropcam, etc.) and Connected Car (Uber, Lyft, GetTaxi, Delphi, Moj.io, etc.) are seeing stiff competition. Budgets and time-to-market are becoming key drivers, and vendors can’t afford to design and build everything from scratch.
Second, the growing availability of affordable hardware components and easy funding (Kickstarter, etc.) are driving grass-roots product development from teams that are unlikely to use large SI firms to build their products. To drive products to market, these bootstrapped companies are pioneering repeatable patterns of development and helping blaze the trail to a codified IoT stack.
Third, consumer IoT rollouts require massive scalable and geographically distributed backend systems that are complex to build and maintain. Customer support for consumer IoT also becomes a key driver: the products must be easy to setup, reliable, and remotely upgradable. “Bricking” consumer devices via a global remote update is the deepest fear of every consumer IoT vendor. The PR fallout from a security breach can be unrecoverable. Consumer IoT vendors want a vetted IoT stack that can mitigate these risks.
Evolving Components of the IoT Stack
Most of the IoT Stack innovation is occurring within the communication layers. While hardware design and server-side “big-data” technologies are relatively mature, the new risks in IoT are almost always connectivity based. These can be described in three categories:
Local Area Communication – There’s no shortage of protocols for local device-to-device communication. Some of these include Zigbee, Insteon, Z-Wave and 6LoWPAN, all vying to deliver reliable local connectivity between devices. However, protocols are just the map. The actual journey requires frameworks and libraries that implement these protocols. These are emerging in both open source and commercial varieties, and in various stages of development.
Internet Communication – Internet connectivity holds the promise of real-time awareness and control of devices from anywhere in the world. But reliable and secure Internet connectivity is fraught with difficulty, since the challenges exist on both the device and the server side. Devices that “listen” for commands on unprotected Internet IP addresses are guaranteed to be hacked. Server infrastructures must gracefully handle secure signaling to and from devices at massive scale over unreliable connections. Frameworks and libraries built around newer protocols like MQTT, CoAP, and WebSockets are emerging, but don’t address the costs and complexity of vendors operating these infrastructures at scale. Addressing this challenge is the adoption of Data Stream Networks (such as PubNub), which are similar to CDNs (Content Delivery Networks) in their global reach but designed specifically for secure IoT communication.
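The publish/subscribe pattern that underpins MQTT-style messaging and data stream networks can be illustrated with a toy in-process sketch. This is not any real broker’s API; the channel name and message are invented for illustration:

```python
from collections import defaultdict

class PubSub:
    """Toy in-process illustration of the publish/subscribe pattern
    used by MQTT-style brokers and data stream networks."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # channel -> list of callbacks

    def subscribe(self, channel, callback):
        self.subscribers[channel].append(callback)

    def publish(self, channel, message):
        # Deliver the message to every subscriber on this channel only
        for callback in self.subscribers[channel]:
            callback(message)

# A device publishes a reading; a dashboard subscribed to the channel receives it
broker = PubSub()
received = []
broker.subscribe("home/thermostat", received.append)
broker.publish("home/thermostat", {"temp_c": 21.5})
print(received)  # [{'temp_c': 21.5}]
```

A real deployment adds the hard parts the paragraph above describes: authentication, encrypted transport, and delivery over unreliable connections at scale.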
Vertical Industry Standards – Interoperability requires standards. Already in Smart Home, we’re seeing announcements of standards from Google, Apple, and others. In consumer electronics, a multi-vendor initiative called the AllSeen Alliance promises eventual cross-vendor compatibility. These standards will battle it out for years and take time to mature (remember how long after Bluetooth was announced it took before we could pair our phones with our cars?). Upcoming IoT product releases won’t wait for these standards, but over time and with patience, these standards will eventually succeed.”
In this guest blog Joel Reid, new business sales leader at Intel Services, explains what brands and retailers need to know when specifying technology in an ever-evolving world.
Weighing up the options: how brands can assess their technology needs
By Joel Reid
“In the latest Gartner CMO Spend Survey Report 2015, customer experience is predicted to be the top technology investment for the year ahead among the biggest hitting companies.
According to research by Martec, a third of CIOs at the UK’s top 150 retailers say that other departments, such as ecommerce and marketing, invest in technology outside of the IT department’s control, in a bid to keep pace with digitally-savvy shoppers. As marketing has become more digital and data-led, the focus on customer engagement has blurred the organisational lines between the CIO and CMO.
Increasingly technology-driven consumer behaviour is creating new demands on CMOs to connect with their customers in every channel. In turn, these demands have seen CMO buying power burgeon in the last few years, with Forbes suggesting CMOs could control as much as 40% of IT spend.
But, while ecommerce and marketing managers have that all-important touch point with the consumer, if decision-making on IT spend becomes siloed, brands and retailers run the risk of losing the business value and efficiencies associated with integrated solutions. It could also lead to overlapping Software as a Service (SaaS) products bought in by different departments or used by affiliates, which could otherwise have been consolidated to create cost savings.
This is placing greater pressure than ever on CIOs to regain and retain control of enterprise technology, while adding the value to other departments that the business demands. Their challenge is to connect their information and intelligence in order to create a tighter and more profitable engagement.
To create the seamlessly integrated customer journey that today’s ‘always on’ omnichannel shopper expects, retailers should adopt a similarly joined-up approach, effectively managing and unifying their activities in all channels. This can be done both directly and through a network of partners, affiliates, agencies, developers and media.
CIOs will need to align all these functions under a holistic strategy, identifying where solutions can most influence customer engagement while delivering against business objectives. This will build trust among the marketing department that IT can deliver value and, in turn, make CMOs less likely to bypass IT when outsourcing to vendors, a decision which, if made in isolation, may be made in haste or without the architectural insight CIOs can provide. This means IT architecture needs to be multifunctional, with the flexibility to be implemented by multiple users, as well as being able to respond quickly to changes in customer behaviours and demands.
Application Programming Interfaces (APIs) are one example of technology that can help CIOs achieve this level of integration, acting as the bond between disparate applications and devices, and enabling all the players to meet rapidly changing consumer preferences.
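As a rough illustration of APIs acting as that bond (the system names, customer ID and fields below are all invented), a single API endpoint can merge records held by separate departmental systems into one consistent customer view:

```python
import json

# Hypothetical records held by two separate departmental systems
ECOMMERCE_DB = {"c42": {"name": "A. Shopper", "last_order": "2015-03-01"}}
MARKETING_DB = {"c42": {"email_opt_in": True, "segment": "loyal"}}

def get_customer(customer_id):
    """A unifying API endpoint: merges data from both systems so any
    channel (web, mobile, in-store) sees the same customer profile."""
    profile = {}
    profile.update(ECOMMERCE_DB.get(customer_id, {}))
    profile.update(MARKETING_DB.get(customer_id, {}))
    return json.dumps(profile)  # JSON, so any client stack can consume it

print(get_customer("c42"))
```

The point of the sketch is the shape, not the detail: whichever department built the underlying system, consumers of the API see one contract and one view of the customer.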
By working together CMOs and CIOs can harness the right technologies and solutions to generate greater benefits for long term business gains.”
Last week this blog featured a guest post about Service Integration and Management (SIAM). A colleague also wrote a story about the challenges facing HMRC when it splits up its Aspire contract with Fujitsu.
Aspire is one of the biggest IT outsourcing deals ever signed by the UK government, costing on average £813m per year over the past 10 years, according to the National Audit Office (NAO). By the time the deal ends in June 2017, prime contractor Capgemini will have received £10.4bn of taxpayers’ money.
Here is a Q&A with a SIAM consultant.
“What can HMRC do to mitigate the risks of breaking up the Aspire contract?
There is significant analogy between this change and many other similar programmes across government, and more broadly in the private sector. Breaking up Aspire, or indeed any of the big early-generation outsourcing deals, is a programmatic challenge, with a lot of detail needed in the definition of new services and in how these contracts and suppliers interact with each other (hard and soft aspects). There is a lot of existing collateral and experience around that they can reuse. Using a proven framework will de-risk delivery, but it needs to be done in the context of the specific requirements within HMRC.
HMRC should also recognise that this change is a cultural and people challenge, both for itself and for its suppliers, and accordingly pay as much attention to these aspects as to the technical and contractual ones. To minimise risk, HMRC’s leadership should start by considering what the revised in-house team needs to look like in terms of skills, mindset and accountability, and how they are going to sponsor, build and motivate this team. This new team should be established early and then allowed to drive the change.
They should use external support to accelerate delivery by helping mature process, governance and tooling designs. The team will need to think hard about information flow and how tooling (across the service management disciplines, including the service desk) will work in the new operating model.
Is getting the right talent enough?
The new disaggregated delivery models rely on four pillars of capability within the IT delivery model, and the need for strong service integration across them is key:
· Cultural understanding of the business
· Well-defined processes with very clear bounds of responsibility
· Strong governance against strategic ambitions, the organisation’s standards and policies, and operational standards
· Tooling that enforces the processes and governance structures
With strong leadership and the right talent to build these pillars, significant benefit can be realised.
There is a large focus on digital hires at the moment. Does government need a lot of technical skills for something of this scale?
Government needs very strong service skills to make these models work. Although, quite rightly, there is a move for the retained IT organisation to “re-own” the Service Strategy and Service Design roles (and all the related knowledge and information) within government, much of the technical design should be left to those that will eventually deliver the services. In some organisations this fine balance between “service” and “technical” design has shifted too far, which is likely to result in suppliers being constrained in their delivery, hurting both value for money and service quality and performance. Unlike in true digital/application delivery activities (where system design and integration can be retained), government should continue to focus on service outcomes rather than detailed design decisions.
What does government need to do?
Government needs to continue to be clear on the direction and strategy for these new models. It does, however, need to develop better and clearer service structures in its new procurement frameworks, and then allow departments greater autonomy in the decisions they make to deliver against that direction and strategy. There has been a tendency in recent years for “the centre” to reject departments’ cases because they don’t fit with the centre’s very latest thinking. This can cause significant delay, rework and additional cost.
What do departments need to do?
Many departments have the opportunity to deliver significant savings, improve their performance and become more agile by accepting the new disaggregated operating models, with more visibility and control held by the retained team. However, to realise these benefits, they must:
· Define, to an appropriate level of detail, their future vision and delivery strategy, and get this agreed by GDS
· Communicate these models broadly, with a clear plan of how they will be achieved
· Set up a programme of change that accepts it will impact the retained organisation, delivery processes, governance structures, tooling and the supplier landscape
· Define a set of KPIs that will be used to continually measure the delivery of those benefits
· Appoint accountable owners from the new retained organisation to drive the change
· Utilise proven frameworks where possible to accelerate delivery and reduce risk”
Businesses already use a large number of different IT suppliers, and trends like cloud computing and the consumerisation of IT will increase this further. Although multi-sourcing is nothing new, it used to mean managing a few suppliers; now the number can run into high double figures, and even more at large companies.
Companies like BP have very complex ecosystems of IT suppliers and manage them with an iron rod. Read about BP’s multi-sourcing strategy here.
But not everybody can, so a new industry has grown up around this, known as Service Integration and Management, or SIAM for short. Traditional multi-sourcing contracts would often have a prime supplier that subcontracts to or manages other suppliers. SIAM tries to change this through an independent function that does the management and integration, a function that is often outsourced itself.
Today there are businesses set up to provide just this. End-user businesses and service providers want to understand how to evaluate it. Simon Durbin, who leads Information Services Group’s (ISG) UK SIAM practice, wrote this guest blog to explain.
Evaluating the unknown
By Simon Durbin, ISG
“As happens with any new concept, there is inevitably a lot of debate, discussion and practical learning in the early stages, followed by a period of evaluation. SIAM is no exception: it continues to mature and develop, and there is no universally accepted approach for evaluating it.
SIAM exists to manage the complex dynamics of service demand and supply. It is about delivering value to businesses, which want to maximise the value from their service providers (whether internal or external) and ensure that services are aligned to business needs.
So what is the value of SIAM?
A traditional business case for SIAM cannot be created in isolation nor can it be separated from the services being managed. The cost of performing SIAM has to be evaluated as part of the total business case for a strategic sourcing or service initiative.
For example, when moving away from a Prime model, the ‘hidden’ integration cost of the Prime provider cannot be ignored. All too often, clients will move to a multi-source model and fail to consider the integration effort required across the new supplier landscape. When the issue is discovered later, the ‘missing’ SIAM capability is perceived as an extra layer and is very hard to justify.
The value of SIAM can be demonstrated by metrics that focus on customer value and end-to-end service performance. The ‘watermelon service level’ phenomenon is well known (green on the outside, red on the inside). Avoid this by selecting service measures that look at the ‘whole system’, not just the individual components, and then design the service as a whole to meet these requirements (including the agreement of aligned supplier service levels).
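The arithmetic behind the watermelon effect is simple, and worth making explicit: if each supplier in a chain independently meets a 99% availability target, the end-to-end service the customer actually experiences can still miss that target. The figures below are purely illustrative:

```python
# Five suppliers, each "green" against its own 99% availability service level
component_availability = [0.99] * 5

# But the end-to-end service depends on every component working at once,
# so the availabilities multiply together
end_to_end = 1.0
for a in component_availability:
    end_to_end *= a

print(f"End-to-end availability: {end_to_end:.1%}")  # about 95.1% - "red" overall
```

Every supplier dashboard is green, yet the whole-system measure is red, which is exactly why end-to-end metrics have to be designed in from the start.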
The final area of value to consider is the business interface. SIAM is not exclusively about supply, although it is often positioned as an approach to managing a complex multi-source environment. An approach that does not invest considerable effort in the customer (demand) side of the equation is missing the whole point. Suppliers and services exist for one purpose – to satisfy customer needs and deliver value to the business. Businesses will never value something that they cannot see and don’t understand, so use the metrics to give visibility to demand, consumption and performance.
SIAM cannot be looked at in isolation. By definition it is all about integration, and it has itself to be integrated across the demand-supply chain. It all starts with the customer. End-to-end service metrics are critical to define and then manage the services. SIAM is a mandatory consideration in any service or supplier strategy – ignore it at your peril.
Simplicity is key with both implementation and evaluation. Aim to be clear about what you’re trying to achieve in the first place, and then evaluate accordingly.
Simon Durbin is a Director with ISG and leads the SIAM practice in the UK, working as a key member of the ISG global SIAM practice. He has over 25 years’ experience in IT service and supplier management working as both a practitioner and consultant for FTSE 100 and Forbes 500 companies. For the past five years he has specialised in advising on Service Integration and Management.”
India’s IT services industry has grown fast, but who would have thought that TCS would become IBM Global Services’ biggest threat?
That’s what one management consultant told me last week. I was writing an article about talk of IBM reducing its workforce in India. IBM is one of the biggest IT services firms in India in terms of workforce, and even if it is true that it will cut about 50,000 staff there, it will still be very big, with about 100,000 people.
Read my article here. I have had quite a few emails from readers about this.
But the industry is changing, and providing low-cost full-time equivalents is no longer the way for IT services firms to grow and profit. They want non-linear business models, and they are achieving them with less labour-intensive services that harness cloud and automation technologies, for example. At the same time, customers want services that use the latest technologies.
For example Scandinavian IT services firm Cygate has expanded its business without needing to recruit more staff by using automation software from IPSoft. In 2010, the company, which serves more than 1,000 customers including some of the biggest corporates in the Nordic region, was experiencing 20% growth in sales. This meant the company needed to add resources or risk service levels deteriorating. But just adding manpower would have reduced its margins.
So you would think the Indian suppliers, which grew their businesses by offering highly skilled IT workers to Western corporates at lower cost, would struggle the most. But it seems this is not the case. In fact, it could be another phase of growth for these firms.
Mark Lewis, outsourcing lawyer at Berwin Leighton Paisner, says: “TCS, India’s biggest IT services supplier, is achieving both linear and non-linear growth. It is still recruiting heavily in India and is building its global workforce at the same time.”
While IBM Global Services is always a default consideration when businesses outsource IT, it is not winning as many deals as it used to. Peter Schumacher, of the Value Leadership Group, said conversations with large corporates in Europe reveal that Indian suppliers are now IBM Global Services’ biggest competitors, and TCS is the biggest of these.
The Indian advantage of lower-cost skills may have diminished over time, because Western IT services firms have built huge offshore workforces of their own while wages in India have increased. But during the heyday of low-cost IT services, companies like TCS, Infosys, Wipro and HCL built strong businesses and developed domain expertise, moving beyond pure IT services to business services using IT.
The other interesting point is that Western IT services firms have shot themselves in the foot by reducing the fear associated with offshoring. There was a time when offshoring IT was a brave and perhaps risky strategy for a big business. But companies like IBM have used offshore staff and, as a result, made it the norm for outsourced service delivery.
Today there is little difference between service levels from offshore and Western suppliers, and CIOs will make decisions based on pure business value rather than perceived risks.
“In Europe, TCS will add almost $1bn in new business in 2014, which underlines the enormous market momentum and customer confidence they now enjoy,” says Schumacher.
HP has decided to split the company in two, with its consumer computing and printing departments separated from its software, business servers and IT services operations.
The IT giant is not shy of getting itself in the headlines. Back in 2008, when it paid $13.9bn for IT outsourcing pioneer EDS, heads turned and the column inches followed. The same happened a few years later when the value of this acquisition was written down by almost $9bn.
The news that HP is splitting the enterprise business, including services, from the consumer business triggers a new era in the former EDS business.
I wrote an article about the future of the services business following the split. Here it is.
I have since had conversations about this and quite a few people are describing the split as the first stage in massive restructuring and the sale of the services business.
One source, a former EDSer who now works for a large competitor, says that HP services has been picking up quite a few big deals recently. He said it has also recruited people from Computacenter and re-hired former EDS people who had been let go.
But who would buy it and how much would it cost?
The IT sector has changed, with digital technologies transforming how businesses receive and use IT, and there are lots of suppliers in the sector that are ahead of HP Services in this respect. For instance, the Indian suppliers have gone from strength to strength and continually manage to report double-digit sales growth. They have kept up with digital developments.
This week Computer Weekly wrote an article about Capgemini’s 2014 World Payments report.
The report, which was carried out by Capgemini and RBS, does just what it says on the tin and looks at the world payments industry.
This includes all the latest and greatest mechanisms for sending financial transaction messages across the world.
We think Capgemini’s press team should read it to learn about the fantastic technologies available, and of course l’Internet, as the journalist who covered the story was shocked when she requested a copy of the report on a Wednesday. The response, which arrived on Friday morning VIA EMAIL, was that they were sending one by post.
Computer Weekly and Capgemini are installing aviaries as we speak to house pigeons to help us communicate in the future.
Could the creation of a tablet that businesses love, but dare not speak its name as a corporate device, be Apple’s greatest gift to Microsoft?
I was chatting with a contact today about the company he runs that provides corporate software that enables executives to share and access information via tablets.
Diligent’s Boardbooks software has enabled businesses to replace the hundreds of pages of documents needed to prepare for company meetings.
When I wrote about the supplier in December 2012, it only supplied its software for Apple’s iPad, at the time the only tablet of a high enough standard to support it.
But almost two years on, things have changed. Well, Windows 8.1 has changed things.
Charlie Horrell, managing director Europe at Diligent, told me that the release of Windows 8.1 devices has stirred more interest.
Businesses are now looking at the software as an enterprise app rather than one just for board members. The compatibility with Microsoft estates makes it an attractive option, and now that Windows tablets have improved it could result in strong demand.
For example, Matthew Oakeley, global head of IT at asset management firm Schroders, told me last year that he does not think iPads will ever be a true corporate device, because they do not integrate seamlessly with Microsoft.
“I bet a lot of people bought iPads for work but don’t use them for work,” he told Computer Weekly in a recent interview. “The real problem is that, if you run a Microsoft Windows estate, you want something that can talk to it.”
Oakeley said the lack of interoperability between Apple and Microsoft was unlikely to change.
So Apple’s creation of a tablet attractive to the enterprise could be providing the perfect platform for Microsoft to become great again.
It is hardly surprising that businesses are obsessed with data. The online activity of consumers is constantly tracked whether at home or on the move.
Businesses want to know what people want and when they want it so they can make them an offer they can’t refuse. Businesses are also collecting masses of data to help them devise their business strategies.
But success takes much more than just being able to collect the data. Mindtree is a tier-two Indian IT services firm that specialises in helping its business customers make better use of data. The company works on the Indian biometric ID project, which involves collecting the details of 1.2 billion people. So it knows all about data collection, but here in this guest blog post Mark Wilsdon and Soumendra Mohanty of MindTree explain a misconception about data.
Smart Data, Not More Data
By Mark Wilsdon and Soumendra Mohanty
“One of the biggest misconceptions about the data companies collect is the belief that more is better: that somehow each incremental piece of data collected adds detail to an insight. In reality, though, it is not the amount of data that matters. It’s how you use it. The question every business should ask itself is this: what questions can the data it’s collecting answer?
According to a recent article in MIT Sloan Management Review, a typical company doubles the amount of data it stores every two years. The good news is that this explosion of data opens up many possibilities. However, possibilities and insights are rarely found at the surface of this ocean.
Being smart with data is not about collecting a million tweets. It is about the in-depth analysis of tweets, including analysing hashtags containing metadata around the device type, geographic location, time and the context of the conversation. In these insights are the true value of data, allowing companies to push information to customers that is context-sensitive and meaningful.
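A minimal sketch of that kind of metadata-driven analysis, with a handful of invented tweet records standing in for a real feed:

```python
from collections import Counter

# Invented sample of tweet metadata: hashtag, device type and location
tweets = [
    {"hashtag": "#sale", "device": "mobile", "city": "London"},
    {"hashtag": "#sale", "device": "mobile", "city": "London"},
    {"hashtag": "#support", "device": "desktop", "city": "Leeds"},
]

# Context comes from slicing the same data several ways, not from volume
by_hashtag = Counter(t["hashtag"] for t in tweets)
mobile_share = sum(t["device"] == "mobile" for t in tweets) / len(tweets)

print(by_hashtag.most_common(1))  # [('#sale', 2)]
print(f"{mobile_share:.0%} of mentions came from mobile devices")  # 67%
```

Even this toy version shows the point: three raw tweets say little on their own, but combining hashtag, device and location turns them into context a company can act on.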
Companies that use data to its full advantage look at it in three ways to pull out as many insights as possible:
An individualistic approach is mostly gut-driven. It relies on analysts who have spent a long time in a particular domain, finding answers that require access to all types of data across various systems.
Process-centric is a disciplined approach to data. It involves consistently employing common processes, and the reuse of components.
A data-driven approach is evidence-based. It requires deeper data analysis than the two approaches above, as it demands an understanding of context and the application of sophisticated algorithms to identify patterns.
With such a large volume of data, it is impossible to use all of it to derive value and insights. Because of this, it is extremely important to focus on smarter data rather than more data. By searching, visualising, analysing and distilling insights from the data you have, you can unlock its essential value.
Now that all of society has essentially moved online, we are witnessing a shift from traditional business models to digital business ones. As this occurs, the way customers engage is more important each day. By focusing on context-sensitive and meaningful insights, companies will effectively utilise the relevant data that is available to them. With more and more data being made available every minute, smart use of data is the sole path to success for companies of the future.”