Write side up - by Freeform Dynamics


November 13, 2018  6:50 PM

Open source is growing up – and here’s how

Bryan Betts Profile: Bryan Betts
API, Docker, Kubernetes, Open source, OpenStack

If you’re among those who still think that open source is just for hobbyists and academics, think again. Open source is mature now, both as a concept and as tools for building enterprise IT, and we have two major shifts in understanding to thank for that.

The first key change is that there’s a much more mature understanding now of how the layers of IT architecture relate to each other – of what fits where, in other words. Instead of trying to do too much, adding in every feature or capability that might be related, open source projects have become more focused.

For example, instead of misunderstanding them as rivals, we can now see OpenStack and Kubernetes for what they are. The former is an infrastructure layer, upon which you can run platform layers such as the latter.

In parallel with that, open source developers now better understand the need to align their projects with each other. That's partly driven by users pushing for interoperability – make that easier and you get more users. But it also reflects a growing recognition that, as open source usage increases, each project is part of something much bigger – potentially a whole suite of interoperating enterprise-grade software. Open infrastructure, as it's already being called.
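To make that layering concrete, here's a minimal sketch using the openstacksdk Python library to provision infrastructure-layer servers on OpenStack; the cloud, image, flavour and network names are all placeholders, and a platform layer such as Kubernetes would then be installed on top of the resulting machines.

```python
import openstack

# Assumes a cloud called "mycloud" is defined in clouds.yaml; all names below
# are placeholders for whatever exists in your own OpenStack environment.
conn = openstack.connect(cloud="mycloud")

image = conn.compute.find_image("ubuntu-18.04")
flavor = conn.compute.find_flavor("m1.large")
network = conn.network.find_network("private")

# Provision the infrastructure layer: plain servers onto which a platform
# layer such as Kubernetes would later be installed.
for name in ["k8s-control-plane", "k8s-worker-1", "k8s-worker-2"]:
    conn.compute.create_server(
        name=name,
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
```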

Would you like Kubernetes with that, too?

A good example is the way that hardly a week goes by now without a reference to some new cooperation or interworking with Kubernetes. It was there at the recent Cloud Foundry Summit, it was there when I spoke with Red Hat and SUSE, and it’s here in spades at this week’s OpenStack Summit in Berlin – or the Open Infrastructure Summit, as it’ll be from next year.

What we see is that Kubernetes has won the platform debate. Most major players in the cloud-native ecosystem seem agreed that Kubernetes will be the common platform orchestration layer over or under which they’ll all build – and they have aligned themselves and their own software accordingly.

For example, development frameworks such as OpenShift and Cloud Foundry were once positioned as the key platform layers, but it is more likely now that you will find them used as application layers within Kubernetes, rather than vice versa. And while Docker is still popular as a container format, those containers are likely to run on a Kubernetes platform. Focus on your strengths, is the message.
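To illustrate that division of labour, here's a minimal sketch using the official Kubernetes Python client to run a Docker-format container image as a Kubernetes workload; the image and names are arbitrary placeholders, and it assumes a working kubeconfig is already in place.

```python
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context

# A Docker-format container image, described declaratively and handed to
# Kubernetes to schedule and keep running.
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="hello-web"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "hello-web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "hello-web"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="web",
                    image="nginx:1.15",
                    ports=[client.V1ContainerPort(container_port=80)],
                ),
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```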

We should perhaps retain a little caution, of course. The open source community has been notorious in the past for rebel developers deciding that such-and-such a project has “sold out” or taken a wrong turn, and setting up a fork or rival project. After all, even Kubernetes was once the young upstart, with the early attention focused instead on Docker.

The case for open infrastructure

But we should also take comfort in that community's growing maturity, and in the greater speed of open source. Not just the velocity of development, which is impressive, but also the speed with which the people driving projects are willing to pivot and embrace new and better ideas.

And I get a distinct sense that, as APIs and open standards come to dominate our thinking, modularity is now accepted as the way to get things done. Need a container orchestration layer? Don't reinvent the wheel. Need converged file, block and object storage? Ceph has come on in leaps and bounds – but even if it doesn't suit, there are alternatives.

So yes, open infrastructure is a reality now. Sure, there will be use cases where it’s not appropriate or where proprietary software will be a better fit. But it’s become an attractive alternative for a growing number of organisations.

November 12, 2018  8:26 PM

The mainframe returns – as a platform for large-scale Linux

Tony Lock Profile: Tony Lock
Linux

There are several ways to build large-scale Linux server environments, with x86 and public cloud being obvious ones. But there's another option too, as I reminded myself when I caught up with Adam Jollans, program director for LinuxOne product marketing at IBM. LinuxOne is a solution built by IBM using the mainframe platform as its base, but it's solely focused on running Linux workloads.

We discussed the way some organisations are using LinuxOne to keep mission-critical open source solutions running without service interruption and, just as importantly, to keep them secure. Typical workloads his customers run include core banking services – where resilience is essential, not just desirable – and similar solutions for telcos and service providers. These are services that must scale to hundreds or even thousands of virtual machines, doing so both cost-effectively and without risk.

The characteristics of such mission-critical workloads clearly resonate with the traits of the venerable mainframe. After all, the mainframe is regarded by many, even those who have never seen one, as the gold standard for IT resilience and availability. Unfortunately for IBM, and arguably for the wider world, the mainframe is also widely thought of as being outdated, expensive, and difficult to manage – even though this hasn't been true for a long time, and is certainly not the case with LinuxOne.

Linux admins, managing mainframes

LinuxOne is built on modern technology, and the management tools available from IBM and other vendors, such as CA Technologies, Compuware and BMC, have done much to simplify everyday tasks. Just as importantly, a trained Linux administrator can look after the platform using the same skills they use on any other Linux system.

While the scalability, security and resilience characteristics of the mainframe are now widely recognised, IBM is still faced with the perception of the mainframe as expensive. True, they're not cheap, but neither is any system designed to run very large workloads. Indeed, some cost comparison studies indicate that LinuxOne is at least as cost-effective as x86 systems to run applications at high scale.

It’s clear that IBM has a significant challenge to educate the market on the qualities of LinuxOne. Perceptions can be difficult to change, especially those that have been actively promoted by other vendors over many years. That said, LinuxOne is picking up new customers in both developed markets and rapidly growing economies around the world. The planned takeover by IBM of Linux and cloud specialist Red Hat will undoubtedly shift the market dynamics in this area, too.

For anyone who operates Linux systems at very large scale to support services which must run without fail, especially where those services have a mainframe history, it may be worthwhile taking a broader look at the options and not just defaulting to x86 or the public cloud.


November 5, 2018  6:27 PM

Last call for a European public cloud?

Richard Edwards Profile: Richard Edwards
cloud, Europe, Hybrid cloud, Hyperscale computing, PaaS, USA Patriot Act

Europe’s self-described No.1 ‘hyper-scale cloud provider’, OVH, held its 6th annual customer conference in Paris recently. Attendees got to meet the new CEO, Michel Paulin, and hear about the company’s four ‘product universes’, but there was more to it than that.

During his keynote, OVH founder and chairman Octave Klaba gave an impassioned speech about a ‘revolution’ designed to ‘liberate and innovate’. What Klaba called for is an alternative European cloud, one that can take on the might of Amazon, Microsoft and Google. But the question is, has Europe got what it takes to compete? And does it really need a home-grown offering when the giants are opening local data centres across the region?

Multiple parallel cloud universes

In the public cloud arena OVH is indeed surrounded by giants, but that isn’t going to stop it from trying to put together a compelling range of products and services. I found the ‘universes’ branding a little strange (constellations might have worked better), but customers seem to get it. Here’s what’s on offer:

  • OVHmarket is the ‘digital toolbox’ for small businesses and entrepreneurs, with services such as domain names, web hosting and network access services. Microsoft products are also offered here, such as hosted Exchange and SharePoint, and subscriptions to Office 365. This is all very much commodity stuff, but it’s the kind of one-stop shopping that small businesses seem to like, and the pricing looks competitive too.
  • OVHspirit is the universe of compute, storage and networking infrastructure, and it’s where OVH has its roots. The company offers customers a wide range of dedicated servers, virtual private servers and private cloud at an attractive price/performance ratio. And if you want your dedicated servers to be in, say, the UK, then ‘multi-local’ data centres make this happen.
  • OVHstack is the Platform-as-a-Service (PaaS) universe, built on open-source OpenStack. Designed to remove the hassle associated with infrastructure management, OVHstack embraces the notion of the software-defined data centre. And because OpenStack is supported by a range of cloud service providers and vendors, customers should get better system portability, or ‘reversibility’ as OVH likes to call it.
  • OVHenterprise is the hybrid cloud universe. This cloud deployment model offers interoperability and a degree of consistency between two or more distinct public or private cloud infrastructures. This is appealing if you’ve already invested in on-premises IT and private cloud infrastructure, but also want to use public cloud to meet specific business needs.

This line-up of products and services is enabled by some 50 OVH partners, most of which will be familiar to CIOs and IT professionals. Is it enough to tempt enterprises away from the competition, though? Is there something else of value that OVH can offer?

Would you like Patriot Act/CLOUD Act with that?

OVH gained a foothold (and a couple of data centres) in the US when it acquired VMware’s vCloud Air business in May 2017, making it one of the few hyperscalers able to offer cloud services with or without the Patriot Act and CLOUD Act. But this distinction is unlikely to drive the kind of mega-growth required to catch up with the market leaders, and I’m sure Klaba and his team realise this. So what’s to be done?

They believe the answer to this question lies within the European market itself. Does Europe need, or indeed want, a strong, local native public cloud provider? Klaba seems to think so. This is understandable, as the growth of OVH and other EU cloud providers will ultimately be determined by the region’s response. But what do European businesses, governments, institutions and individuals think? Share your thoughts and let us know.


November 1, 2018  5:59 PM

Industrial IoT makes your people more important, not less

Bryan Betts Profile: Bryan Betts
Analytics, human factors, Industrial IoT, iot, IoT hardware, Manufacturing, Skills

The recent Industry of Things World conference in Berlin offered insights into the state of the much-hyped Industrial Internet of Things (IIoT). At one end, big early adopters such as Airbus, Western Digital and HPE told how their investments in IIoT are genuinely paying off, while at the other, forward-looking tech developers – both start-ups and established suppliers alike – offered fascinating visions of the future of industry.

HPE was also the conference’s lead sponsor, which was interesting in itself. Although better known these days in the data centre, HPE – or rather its parent HP – has a long heritage in industry. Much of the HP family silver was sold off by past CEOs though, in moves that look even more questionable to me now than they did at the time.

What HPE and the other speakers at IoT World have recognised, unlike those misguided former HP execs, is that we are seeing the digital transformation of industry. Volkhard Bregulla, HPE’s VP of manufacturing, spoke for example about manufacturing’s move from automation to autonomy, and to closed-loop manufacturing where IIoT lets you “make everything transparent.”

Others, such as Dave Rauch, Western Digital’s senior VP for world-wide manufacturing operations, spoke persuasively about the need to integrate IIoT in an ongoing and holistic way. It’s not like an ERP system, he said, where “you assemble a team, create a project and so on, and at some point you’re done.” Like any digital transformation process, IIoT is a journey, not a simple destination.

And of course there will be pitfalls and road-blocks along the way. David Purón, the CEO of IoT management start-up Barbara, talked of the need to overcome expertise shortages and device integration woes, for instance. Other start-ups focused on the need to simplify, from edge analytics developer Crosser, filtering data streams from millions of sources for analysis, to Fero Labs, using machine learning to turn IoT data into actionable – and explainable – advice.

The essential human factors in IIoT

When you step back and look at the broader picture though, the common thread is the human element. Sometimes it’s the skills shortage, other times it’s the need to make the incomprehensibly complex simple enough to understand – and more importantly, to act upon. And in some cases, as with ViveLabErgo, it’s about using virtual reality to simulate the people working alongside your advanced machinery to ensure they’re both safe and efficient.

Overall, it is a reminder that even in the sensored-up and AI-enabled industries of the future, there will still be people. They might be there to do skilled manual work that’s too intricate and low-volume to be automated, to supply social and emotional intelligence, or to assess the automatically generated evidence and make the final decision.

Or like most of the people reading this, they will be filling those roles that require us to comfortably assimilate and intuitively combine a breadth and depth of expertise and knowledge that machines simply can’t handle.

Either way, as you make machines autonomous and instrument your processes, the importance of the remaining human element will actually increase, not fall. Not only is it vital to recognise that, but it’s essential to get the people on-board with the changes. In an IIoT project you underestimate the human element and the degree of cultural change involved at your peril.


October 24, 2018  10:49 AM

Google has got serious about Enterprise IT

Dale Vile Profile: Dale Vile
datacentre, Enterprise IT, Google, Hybrid cloud, Kubernetes

Google has made a complete about-face on the enterprise. It took so long that if you were watching all the time you probably didn’t notice it, but it has turned through 180 degrees.

Thinking back four years to my last major Google event, it was a strange experience. For a whole day I sat there listening to tales of how everything everyone was doing in mainstream business was wrong. We were using all the wrong tools and generally had no idea how to function effectively in the modern world. And corporate IT teams weren’t helping either. They were wasting their time pointlessly running datacentres, and dragging their feet on the move to the cloud – which was obviously the answer to literally everything. Google was going to show us the way and save us all.

Maybe it wasn’t said exactly like that, but it sums up the tone and spirit of what I heard. I came away with two big impressions. The first was that Google execs thought the rest of us were all a bit dim. The second was that Google had very little understanding of real-world complexities, especially in relation to enterprise IT.

Now wind the clock forward to October 2018 and the Google Cloud Next event in the UK, and the change is astonishing. This time I sat there being briefed on how on-premises computing is still the centre of gravity for enterprise IT, and legitimately so. There was clear acknowledgement of the reasons why CIOs continue to invest in the datacentre – proximity, control, compliance and even cost – yes, on-prem systems can be cheaper if you have the scale and the skills.

A full range of cloud choices is essential

Against this background, the big message now is that Google is committed to bringing the benefits of its cloud environment to customers regardless of location – your own datacentre, private hosting, and/or public cloud. A key component here is GKE – the Google Kubernetes Engine – which was recently made available as a fully supported on-prem solution to enable private and/or hybrid cloud platforms. The idea is that you can move applications and workloads seamlessly between physical environments – and even run them in tandem in a fully coordinated manner. This is a key requirement that we have been hearing consistently from IT leaders pretty much since the term ‘cloud’ was originally coined in relation to technology. Google goes even further by highlighting the ease with which a workload can be moved from GKE to a generic Kubernetes environment, thus further reducing the risk of lock-in.
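As a sketch of what that portability looks like in practice, the snippet below (Python, using the official Kubernetes client) applies the same workload definition to two clusters simply by switching kubeconfig contexts; the context names and manifest file are hypothetical.

```python
from kubernetes import config, utils

# Hypothetical kubeconfig context names: one for a GKE On-Prem cluster,
# one for a generic upstream Kubernetes cluster.
CONTEXTS = ["gke-on-prem-cluster", "generic-k8s-cluster"]

for ctx in CONTEXTS:
    # Build an API client bound to the chosen cluster context.
    api_client = config.new_client_from_config(context=ctx)
    # Apply the same workload definition, unchanged, to each cluster.
    utils.create_from_yaml(api_client, "workload.yaml")
```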

Together with some strong and credible viewpoints on business transformation, security and compliance, plus a clear perspective on the role of partners, this real-world take on enterprise requirements came across as very convincing from the main stage and during breakout sessions. The question is though, can Google walk the walk as well as talk the talk? After all, IT vendor executives have been known to exaggerate, spin, and generally tell people what they want to hear, even when there’s little substance to back it up.

My judgement here is that Google’s ‘born again enterprise vendor’ positioning is not just an act, but a genuine transformation. I formed this opinion after speaking less formally with a number of key Google people, some of whom I knew from their previous roles with other technology vendors. The more conversations I had, the more it became obvious that the change in understanding and attitude is at least partly a result of hiring talent and experience into the company from more traditional enterprise IT players. But I have to say that even the long-serving Googlers I chatted with generally talked about mainstream business needs empathetically, in a way that’s hard to fake. The notion that Google now ‘gets it’ was also corroborated by the customers I encountered at the event.

So, what a difference four years has made. In my mind Google has successfully metamorphosed from an idealistic enterprise wannabe to a serious mainstream contender. With the cloud platform and service market actually still very young, and other cloud players also waking up to the hybrid imperative, the next couple of years could get really interesting.


October 23, 2018  12:29 PM

Cloud Foundry’s growth highlights open source success and the cracks in commercial licensing

Bryan Betts Profile: Bryan Betts
Cloud Applications, Cloud platform, Open source, Open source applications, Platform as a Service

Open source has crested the hill of enterprise acceptance. There’s no longer the fear, uncertainty and doubt (FUD) that there was around open source a few years ago, plus it’s often a lot easier and cheaper to scale open source than commercially-licensed software.

These conclusions were reinforced at the recent Cloud Foundry Summit Europe in Basel by a debate over whether the existing distribution model for Cloud Foundry – a platform-as-a-service (PaaS) for developing and deploying cloud-scale applications – was breaking.

The licensing uncertainty was highlighted by a case study presentation from financial services firm Fidelity International, which saved several million dollars in licensing fees by switching from a commercial CF distribution (or distro, as it’s often known in the open source world) to the free open source version.

The cracks are showing in pay-per-use cloud licensing

As Computer Weekly’s news pages have highlighted, this flags up a big problem with the way some licensing fees are calculated. Fidelity’s use of CF has at least quadrupled as its developers shifted to cloud and microservices, and if it had stayed commercial its software licensing fees would most likely have risen similarly.

Of course, this is not just an issue for the commercial distros of “supported open source” – it’s a general problem with all pay-per-use cloud services. Organisations are equally likely to get a nasty shock if their use of cloud storage rockets, for instance. In the world of traditional IT there’s the discipline of capacity management, which aims to anticipate how demand will change over time. This seems to have been lost in the rush to cloud, though – perhaps because it is often driven by people without a traditional IT background.
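To illustrate the kind of forward projection that capacity management brings, here's a back-of-the-envelope sketch in Python; every figure in it is invented purely for illustration and is not taken from Fidelity or any real price list.

```python
# Illustrative projection of pay-per-use fees as usage grows.
# Every number here is made up for the sake of the example.
instances_now = 100              # application instances running today
fee_per_instance_month = 400.0   # hypothetical per-instance monthly fee
annual_growth = 1.6              # assumed 60% usage growth per year

for year in range(4):
    instances = instances_now * annual_growth ** year
    annual_cost = instances * fee_per_instance_month * 12
    print(f"Year {year}: ~{instances:,.0f} instances -> ~${annual_cost:,.0f} per year")
```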

So yes, the cracks are showing in that pay-per-use licensing model. As Fidelity realised, if the system is strategic enough to build a solid business case around, then once you have built up enough experience, the main thing stopping you moving to open source is FUD.

And it is clearly not alone in realising this. One of the people I talked with at CF Summit was a software engineer for a major CF company whose team is responsible for making sure that their popular add-on tools work on the open source distro, not just on their company’s own pay-per-use commercial distro.

Adopt, offload, embed

So what’s the future for commercial distros? One role may be to help organisations adopt the software and skill up, with the option to go open source over time. There are opportunities here for those suppliers who have major consulting practices. Time and again, CF users cite their software supplier’s methods and domain expertise as the reason for working with their chosen distro.

Another is quite simply that not everyone wants to do their own hosting or can build a business case for it. A third is where the open source software is there, but embedded within something of much broader value. As just one example, the SAP Cloud Platform includes CF as its framework for building Web apps, but the value for SAP users is in the rest of the platform and the support it brings for extending their core SAP systems.

And then of course there are other licensing models available. So yes, there’s always going to be a role for commercial distros. That said, for those who do have the skilled people (and the caveat is that there’s a world-wide shortage in CF and Kubernetes skills; indeed, pretty much everyone at CF Summit seemed to be hiring), why wouldn’t you go open source?


September 12, 2018  12:14 PM

When modern meetings don’t work

Bryan Betts Profile: Bryan Betts
appointments and meetings, Conference Room, Team Collaboration, Virtual meeting

Once upon a time, pretty much every meeting had a minute-taker – someone who kept notes, summarising who said what. These meeting minutes were circulated afterwards for comment and correction, and to confirm actions that people had said they would carry out.

Then teleconferencing happened, and all those meetings became a whole lot less formal. Now, participants are often expected to keep their own notes, but if everyone takes their own, then there’s no shared or agreed version for those who can’t attend. And while there might be an audio recording of what was actually said, trying to extract the key points or ‘matters arising’ from an hour of audio at a later date is a thankless task.

It’s all about time

Indeed, a general point might be “Don’t rely on recordings!” If someone genuinely hasn’t got time to spend an hour online with you, when at least there’s some real-time interaction possible, it’s unlikely they have time to passively listen to an hour’s recording. There’s a good reason why meeting minutes are usually in summary form, not verbatim. Time is money, after all.

Of course, formal face-to-face meetings often still have minutes. However, we all rely so much now on disparate technological aids for collaboration – smartphones, CRM systems, project management apps – that even if we do have a set of meeting minutes, they need further processing to make them useful.

So what can we do? For a start, we need to bring back meeting minutes, even for e-meetings – but they need to be minutes fit for the 21st century. At a minimum, that means making them searchable, shareable and connected. And yes, it might mean applying AI to generate a précis.

Once you realise the problem, you probably won’t be surprised to read that there’s quite a few applications offering to do much of this for you. Most are all-in-one solutions, though, combining note-taking with their own task tracking and so on. If you’re happy with working in a single vendor ecosystem then that’s fine – hello Microsoft Teams! – but it’s not the way everyone works.

So there’s also an emerging set of software tools that enable meeting notes to be linked into other tools – turning action items into tasks, activities or tickets in whatever best-of-breed CRM, project management or helpdesk app you happen to use. You can do much of this through a collaboration platform – Slack has a lot of relevant integrations here, for example, as in different ways does Dropbox Paper – but I’m thinking more of the likes of Hugo’s eponymous team-working app. I recently talked with Hugo’s founders about making e-meetings more effective and less wasteful, and they were vocal on the foundational role here of ‘actionable’ minutes.
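As a sketch of the kind of integration involved, the snippet below posts an action item captured in a meeting to a task-tracking endpoint; the URL and payload fields are entirely hypothetical, since each tool defines its own API.

```python
import requests

# Entirely hypothetical endpoint and payload shape; real helpdesk, CRM or
# project-management tools each define their own REST API or webhook format.
TASK_ENDPOINT = "https://example.com/api/tasks"

action_item = {
    "title": "Circulate revised proposal to the client",
    "owner": "alice@example.com",
    "due": "2018-09-21",
    "source": "Weekly project call, 12 September 2018",
}

response = requests.post(TASK_ENDPOINT, json=action_item, timeout=10)
response.raise_for_status()  # fail loudly if the task was not created
```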

Is your meeting software making you behave badly?

Whichever route you take, it’s all symptomatic of a wider effect we’ve been observing in our research. This is that, yes, we shape our tools, but then our tools tend to shape us – or rather, they shape our behaviours. The resulting behaviours may not be obviously toxic, but they are clearly not optimal either.

How do we pull those behaviours back on course? A first step, obviously, is to recognise that there is a problem. Then it’s understanding where it comes from and the influence our e-meeting toolkit has on it. And lastly, it’s hacking that toolkit to help us rediscover those essential yet near-forgotten meeting skills and disciplines.

Have you hacked your e-meeting toolkit for better results? Let us know in the comments below.


September 7, 2018  10:22 AM

What can Chromebooks offer the enterprise desktop market?

Richard Edwards Profile: Richard Edwards
Chromebook, Desktop, Google, Microsoft, PC, Windows

If you’re an IT professional managing a large desktop estate or an end user with a Windows 10 PC, you’ll be aware that Windows is undergoing a tortuous transformation, with only enthusiasts and devotees immune to the pain imparted by a thousand tiny cuts. So, is Microsoft caught between a rock and a hard place, with no viable alternative PC operating system to offer its customers, or is there another path for a company that advocates a growth mindset?

Mutton dressed as lamb

The modern PC, no matter how well it’s packaged, is still a workstation in disguise. It runs a complex operating system designed for a different era; one that many have now forgotten and many more have never experienced. And what’s more, this PC model is rooted and jailbroken.

If I were to characterise Windows 10 today, I might describe it as ‘mutton dressed as lamb’. Now, if you’ve developed a taste for Windows over the years, you’ll adapt to the fresh new look of the OS and be all the better for it. But we should recognise that many new entrants to the workforce are raised on the web and suckled on Android and iOS devices. How are they going to adapt to the world of Windows?

Mobile operating systems have partially filled the modern computing gap, but most people, even young Millennials, still recognise the occasional need for a traditional large screen device with keyboard and mouse support. Android and iOS are scaling-up to address this use case, but as they do so, we’re seeing complexity creep in with every update. This gives me that déjà vu feeling and I fear that history is going to repeat itself unless we consider a different operating system approach.

Unleashing the Chromebook hidden inside your PC

Google recently hammered home the pain points associated with today’s PCs (and Macs) in an ad that openly mocks today’s incumbent desktop operating systems. The ad is aimed at the back-to-school market, which is already receptive to Chromebooks and Chrome OS, but I’m sure it prompted a few IT pros to consider the question ‘Is there any business value in Chromebooks and this style of operating system for my organisation?’

Before we go any further, it’s important to recognise that some jobs and activities are well matched to the capabilities of a modern Windows PC or Mac. Indeed, some roles are almost defined by the applications that run on these platforms. For these use cases, and there are plenty of them, the traditional desktop or laptop computer offers an excellent fit, but we don’t have to look too hard to find occupations that sit comfortably within the sweet-spot of the modern browser-based computer.

So, if I offered you a magic wand to convert some of your legacy PC/Mac purchases into something more akin to a Chromebook, would you want to give it a wave? Well, you can do just that if you’ve got a USB memory stick and a few minutes to spare. CloudReady is an operating system from Google-invested Neverware. It’s based on Google’s open source Chromium OS and shares the same architecture as Google Chrome OS. Booting from the USB drive, I was able to transform a 2006 Dell Inspiron laptop into a very usable computer and, without touching the factory-installed Windows 10 OS, my Lenovo X1 Carbon enjoyed being a sporty Chromebook for a long weekend.

The browser is the most important program on your PC

No one doubts the convenience of being able to access the web with a device that you can carry around in your pocket, but many roles and activities need a large screen experience, usually supported by a keyboard and mouse, to make best use of enterprise web applications and tools.

Alas, for some employees, using a fully managed corporate PC isn’t always the most rewarding of activities, even when it’s just the web browser being used. And our experience tells us that this only gets worse over time due to changes in stuff that has nothing to do with accessing those web-based apps and resources. Chromebooks, on the other hand, are optimised for efficient use of the web and cloud services.

There’s plenty of choice on the market if you’re looking for a new Chromebook, Chromebox or Chromebase, but if you want to explore the business value of these devices, CloudReady Enterprise Edition is well worth evaluating, especially if you’re looking at a large PC refresh programme (Windows 7 support will end on January 14, 2020) or have specialised needs, such as ruggedised PCs or equipment certified for use in hazardous areas.

Who’s got the mindset to back a browser-based OS?

Official figures for Chromebook shipments are hard to find, but StatCounter has Chrome OS at 0.5% of desktop operating system market share worldwide. This means there’s plenty of opportunity for vendors other than Google to make their mark on the browser-based OS market.

All the tech industry’s Titans have the resources, capability and opportunity to create their own operating system based on the Chromium OS project, and it’s not that hard to imagine the likes of Amazon, VMware, IBM, Huawei, etc. taking the plunge, each underpinning its efforts with cloud offerings and complementary services. And it would be an interesting ‘thought experiment’ for Apple. There’s plenty of untapped business value and market potential in the desktop model, and Google’s already out there trying to grab it from under the noses of Microsoft and Apple, especially in the education sector.

Windows 10 in S mode is the only alternative to Windows 10 that Microsoft has to offer right now, and that’s not enough. Microsoft Office 365 already sits comfortably within the browser environment, especially when using the Microsoft Office Online extension, so what’s it got to lose? If we look at Chrome OS, then swapping out Google Drive in favour of OneDrive would seem to be a trivial task. Linux, the foundation on which Chrome OS is built, is no longer anathema to Microsoft, so no problem there either. And of course, the company already makes Edge for iOS and Android devices. It’s beginning to feel like a no-brainer.

The future of managed desktop services

There are many ways to deliver the modern digital workspace, and Chromebooks present enterprises with yet another option. But Microsoft isn’t going to give the desktop away without a fight, and if the rumours are to be believed, we might even see the company moving into the managed desktop services market. Working with its partners, Microsoft might be onto a winner here if it can offer enterprises the right combination of devices, services and support for legacy Windows applications. Windows 10 and Windows 10 in S mode will be part of the mix for sure, but a complementary, browser-based OS would seem to make a sensible addition to the operating system line-up too. What do you think?


August 24, 2018  2:49 PM

DevOps is wasted unless the business goes Agile too

Bryan Betts Profile: Bryan Betts
Agile, Business transformation, Change management, DevOps

When analysts and journalists write about Agile and DevOps, the focus is typically on the technology and infrastructure – what it means for software and systems development in an age when the nature of business demands the ability to quickly adapt and change. Often, they also discuss the need for a ‘business champion’ who will promote the project to the operational side of the organisation, and in particular to the board.

What doesn’t get so much coverage, as I was reminded recently while talking with Matt Lovell, who runs Centiq, a consultancy specialising in SAP HANA, is that the business also needs to change. In fact, given that IT is normally there to support or deliver the organisation’s commercial aims, not vice versa, there’s not much point making the IT adaptive and flexible if the business isn’t also adaptive and flexible. That’s especially true for projects intended to create real-time business intelligence or manage supply chains and customer relationships, of course.

Yet when we talk to people who work in the field of Agile and DevOps, they report that it can be hard getting this across to clients. Too often, clients assume that implementing DevOps will simply let them carry on as before, but with more regularly updated software. What a waste that would be!

Agile change needs a new business mindset

To make it more challenging, it’s not simply a case of changing the business once then letting the ‘new version’ bed in. DevOps is continual, iterative change, and it requires a serious change of mindset. That kind of real-time shift can be just as profitable and productive for the business and its managers, but they are going to require even more help and support than the DevOps folk, which is where Matt and those like him come in.

It’s one thing to adopt Agile methods for development, he says, but embedding the same Agile mindset into customer relationships is another challenge altogether, as is helping companies “fundamentally change how they think, and how they work with customers and the supply chain. [They need to understand that] if we can deliver iteratively on a platform that’s more real-time, we can have more free and open discussion about the business’s future and capabilities.”

In particular, the new perspectives opened up will almost certainly involve technology, but they won’t necessarily be led or constrained by it.

It would be easy to assume here that the main problem is organisational inertia – the “We do it this way because we’ve always done it this way” attitude – and of course that kind of thinking does still exist. But as Matt points out, any organisation that’s investing in Agile and DevOps clearly has an appetite for change, so there must be more to it than that.

You don’t know what you don’t know

“Business people don’t know what they don’t know,” he says. “The business tends to focus on infrastructure and traditional viewpoints – they assume the old constraints are still there. You need to know that the problem is solvable before you can pay attention to it.”

That requires education, expectation setting, business change management and more. Perhaps it could start with going to the business and saying, “What if we take technology off the critical path, how does that change your thinking?”

Have you engineered successful business change along these lines – or did you try it but fail? Either way, we’d love to read your thoughts and advice below.


August 16, 2018  9:33 AM

Is IBM relevant to today’s mainstream?

Dale Vile Profile: Dale Vile
Development, innovation, research, science

When IT vendors talk about the market, they tend to focus on how they would like the world to be rather than how it actually is. This often means they fail to connect as well as they could with the business mainstream. The assumption is that every organisation wants to exploit leading/bleeding edge technology to become a leader at all costs, and regardless of the risk. Rhetoric like “You are either a digital predator or digital prey” epitomises this disconnect – it just sounds like sensationalist noise to most normal people.

The reality is that the majority of business leaders think more in terms of holding their own in a competitive market. Sure, they don’t want to fall behind, but they also know that it’s usually much more profitable to be a fast follower than a pioneering leader. And if there was any doubt about this principle, just look at Apple in the consumer electronics space. Against this background, most technology companies would find life a lot easier if they simply acknowledged current reality as the starting point for conversations, including the fact that not everyone immediately commits to the latest ideas and technology options promoted by vendors and analysts.

Shaping the future of industry

One IT vendor that can arguably be forgiven for mixing up the present and future tense, however, is IBM. For better or worse (from a shareholder perspective), it has carved out a role for itself in trying to genuinely shape the future of how technology is used to transform not just individual enterprises, but whole industries, cities and communities. In effect, it is constantly trying to bring the future into the present, albeit typically for a minority of organisations, namely those that are very large, highly-progressive and well-funded. This in turn means a big focus on complex, services-led engagements that exploit emerging technologies in areas such as AI, IoT, Blockchain, and so on – clearly not for everyone, but fine if you are also an organisation that wants to change the world.

IBM is obviously not the only player attempting to work at this level, but one thing that sets it apart from its competition is its investment in genuinely pioneering research and development (R&D). It consistently registers more patents on an annual basis than any other company – and by quite a long way. In 2017, for example, the IBM patent tally was reported to be 9,043, while the second and third ranking companies, Samsung and Canon, registered 5,837 and 3,285 patents respectively.

As an aside, Apple, as the most successful ‘fast follower’ on the planet, was in the eleventh slot with 2,229 patents; while it’s clearly an innovator, my impression is that, unlike IBM, it focuses less on the ‘R’ in ‘R&D’ and more on the ‘D’.

Now I know that the patent-tally is only one measure, and some would say a dubious one at that given all the abuse that goes on in that area. In IBM’s case, though, I was left in no doubt about its commitment to technology innovation when I recently visited the company’s research labs in Zurich. This is one of 12 facilities employing over 3,000 researchers around the world, and by researchers we mean everyone from Nobel Laureates to specialists working in areas ranging from Mathematics, Physics, and Chemistry, through Materials Science, Electrical Engineering and Computer Science, to Computational Biology and Behavioural Science.

When you get to see some of the projects these guys are working on first hand, and hear them explain what they do, it’s pretty impressive. They might not know how everything they are doing in areas such as Quantum Computing and Advanced Cryptography will ultimately translate to practical applications, but you can almost feel the future being shaped.

For someone like me who has had concerns about IBM losing its way from a product strategy and go-to-market perspective, all of this is very reassuring. It tells me that despite some of the company’s widely reported execution challenges, under the surface there is still a very solid culture of innovation, with a steady flow of goodness through into its technology portfolio.

Coming back to where I started, however, while IBM really does have a lot to offer the mainstream, including even smaller companies in the midmarket, I fear the ongoing messaging disconnect – which makes so many ‘normal’ companies perceive that IBM is not for them – will continue to prevent that potential being fully realised. I’m a big fan of aspirational thinking and aiming high when setting objectives, but when the line is crossed into the realm of fantasy for a given audience, it really doesn’t work.

