Data Matters


November 9, 2018  8:51 AM

Once more on Dominic Raab at Tech Nation

Brian McKenna

Dominic Raab, the Secretary of State for Exiting the EU, has been in hot water for his late-dawning admission of the importance of the Dover-Calais route to the UK’s trade in goods.

Raab was speaking at a Tech Nation event in London. In an aside, he candidly admitted he had not previously fully appreciated the extent of how reliant the UK’s trade in goods is on the Calais to Dover route.

Here is the full quote:

“We want a bespoke arrangement on goods which recognises the peculiar, frankly, geographic and economic entity that is the United Kingdom. We are, and I [hadn’t] quite understood the full extent of this, but if you look at the UK, and how we trade in goods, we are super [?] reliant on the Dover-Calais crossing, and that is one of the reasons why – and there has been a lot of controversy on this – we wanted to make sure we have a specific and proximate relationship to the EU to ensure frictionless trade at the border, particularly for just-in-time manufactured goods, whether pharmaceutical goods or perishable goods like food”.

Raab’s political enemies seized on these words as indicating unfitness for his role, ignorance of British geography, failure to do his homework before committing to Brexit ideologically (he is, unlike Theresa May and Philip Hammond, a dyed-in-the-wool Brexiteer), and so on.

Then again, you could say, from a Remain point of view: “better late than never”. Or, from a Brexit point of view, that this half-swallowed, fleeting “gaffe” is part of a cunning plan to persuade Tory dissenters to sign up to the Chequers plan – which is what Raab was advocating.

Or you could just say he was speaking in a relaxed and colloquial manner before a modest gathering of mostly tech entrepreneurs, whose focus is software and services rather than physical things, and was just being honest.

Or, yet again, this is an example of the ostensible self-deprecation that used (at least) to be characteristic of university dons professing ignorance of something they actually know a great deal about. (“I really hadn’t appreciated the full significance of Derridean deconstruction until I applied it to that which is not said in the texts of Jane Austen. Pass the port”).

Personally, I would be less sanguine than he professed to be about the future of our AI sector, post-Brexit.

Perhaps Raab will be less candid in future. Politics is a murky business.

November 2, 2018  4:13 PM

The Bond of Intelligence at Oracle Open World

Brian McKenna

Oracle Open World 2018 seems a world away. Especially when you have had a few days’ furlough in between.

Mark Hurd’s second keynote at OOW convoked a common room of former spies from the US and UK intelligence world. It was an interesting session, and I thought I’d offer a few words by way of reflection. A very good account of the session is given by Rebecca Hill over at The Register.

Joining Hurd on stage were Oracle’s own chief corporate architect, Edward Screven; Michael Hayden, former director of the CIA; Jeh Johnson, former head of the Department of Homeland Security; and Sir John Scarlett, former head of our own Secret Intelligence Service, also known as MI6.

I’ll put the comments of the Americans to one side. John Scarlett said an arresting thing about how British people typically see the state as benign, whereas Americans do not – partly, he joked, as a result of the actions that provoked their rebellion in 1776. He didn’t mention the British burning down of the White House in 1814, which Malcolm Tucker said we would do again, in the film In the Loop. That might have been a joke too far.

Sir John said: “in the UK, people don’t look at the state as a fundamental threat to our liberties. In the US you have a different mentality – partly because of us”.

But are we really so insouciant about the state and its surveillance? There are many contumacious traditions on these islands. The London Corresponding Society, of which Edward Thompson writes in his magisterial The Making of the English Working Class, is one such on the radical left, in the late eighteenth century. And, on the right, there are libertarian Tory anarchists aplenty, from Jonathan Swift to David Davis, the “Brexit Bulldog”. And these anti-statists are just as British as the undoubtedly fine men and women of SIS.

Scarlett did, though, have interesting things to say. For instance, about the Cold War being a riskier time than now. “There is a tendency now to say we live in a world that is more unpredictable than previously. But the Cold War was not a time of predictability and stability. There was the nuclear war scare in 1983. It was the real thing, and not many people know about it”.

However, he said he does now see a perilous return of Great Power tensions and rivalries, and that technology is a great leveller in that respect, with the relationship between the US and China at the centre of that new “great game”. He also said that there is not as yet a “sense of the rules of the game to agree on whether a cyber-attack is an act of war”. And he added: “I am wary of talk of a ‘cyber 9-11’. I think that is to think in old ways”.

Later in the discussion he said he comes from “a world of security and secrecy where you protect what you really need to protect. That is critical. Attack and defence will continue to evolve, but the philosophical point is you need to be completely clear about what you really, really need to protect. You can’t protect everything”.

One small point. I was amused and interested to hear Mark Hurd pronounce SIS as “Sis”. Was John Scarlett too British and polite to correct him? Or is our security service affectionately thus known among the security cognoscenti on the other side of the Pond?

All in all, it was an interesting session. And it caused me to re-read the chapter on the special relationship between the UK and US intelligence communities in Christopher Hitchens’ Blood, Class and Empire. There, in ‘The Bond of Intelligence’, he writes (of an episode in Nelson Aldrich’s book Old Money):

“it is difficult to think of any more harmonious collusion between unequals, or any more friendly rivalry, than that existing between the American and British “cousins” at this key moment in a just war [the Second World War]. In later and more caricatured forms it has furnished moments of semi-affectionate confusion in several score novels and films: the American doing his damnedest to choke down the school-dinner food in his plummy colleague’s Pall Mall Club; the Englishman trying to get a scotch without ice in Georgetown. It is the foundation of James Bond’s husky comradeship with Felix Leiter, and of numerous slightly more awkward episodes in the works of John le Carré”.

 


November 1, 2018  3:01 PM

How data is driving the future of invention

Brian McKenna

This is a guest blogpost by Julian Nolan, CEO, Iprova

A technology revolution is taking place in the research and development (R&D) departments of businesses around the world. Driven by data, machine learning and algorithms, artificial intelligence (AI) is helping scientists to invent new and better products faster and more effectively. But how is this possible and why is it necessary?

Invention has long been thought of as the product of great minds: the result of well-rounded scholars and thinkers like Leonardo Da Vinci and Thomas Edison making synaptic links between ideas that most people would never consider. And for hundreds of years, this has indeed been the case.

However, times are changing: information is growing exponentially, yet innovation and invention have slowed. Great minds are still behind new products and services, but the vast quantity of information now available to mankind exceeds the grasp of a single researcher or R&D team — particularly as many researchers specialise in narrow fields of expertise rather than in multiple disciplines. Developments outside those fields are often unknown, even though they may be relevant.

As such, we find that many new patented inventions are not the result of making truly novel links between concepts, but rather a linear step forward in the evolution of a product line.

This is now changing by putting artificial intelligence at the core of the invention process itself. At Iprova we have developed a technology that uses advanced algorithms and machine learning to find inventive triggers in a vast array of sources of information, from new scientific papers to market trend research across a broad spectrum of industries.

This technology allows us to review the data in real-time and make inventive connections. That’s why we are able to spot advancements in medical diagnostics and sensor technology and relate them to autonomous vehicles for example.
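Iprova’s platform is proprietary, so the following is only a rough sketch of the general idea rather than the company’s actual method: it scores cross-domain pairs of (invented) research snippets by text similarity so the most promising candidate connections can be surfaced for human review. The snippets, field names and scoring approach are all illustrative assumptions.

```python
# A toy illustration (not Iprova's technology): score cross-domain pairs of
# abstracts by text similarity to surface candidate "inventive triggers".
from itertools import product

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical snippets from two unrelated fields.
medical = [
    "Low-power optical sensor detects blood oxygen saturation through the skin.",
    "Wearable patch monitors hydration levels using skin impedance.",
]
automotive = [
    "Driver monitoring camera estimates fatigue from eyelid closure.",
    "Cabin sensing system adapts climate control to occupant state.",
]

docs = medical + automotive
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
sims = cosine_similarity(tfidf[: len(medical)], tfidf[len(medical):])

# Rank cross-domain pairs; a human "invention developer" would review the top ones.
pairs = sorted(
    ((sims[i, j], medical[i], automotive[j])
     for i, j in product(range(len(medical)), range(len(automotive)))),
    reverse=True,
)
for score, med, auto in pairs[:3]:
    print(f"{score:.2f}  {med}  <->  {auto}")
```

In practice the ranking would use far richer signals than term overlap, but the shape of the workflow — machine-scored connections, human-developed inventions — is the point being illustrated.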

According to the European Patent Office (EPO), the typical patenting process is 3–4 years. When you consider that the typical research process from conception to invention takes place over a similar amount of time, most companies are looking at a minimum of six years to bring products to market.

This is where machine learning makes a big difference. Our own technology reviews huge amounts of data and identifies new inventive signals at high speed, which means that our invention developers can take an idea and turn it into a patentable invention in only a couple of weeks — significantly reducing the overall lead time and research costs of inventions.

Thinking back to Da Vinci or Edison, the only reason we still remember their names today is that their inventions were groundbreaking at the time. Others may have been working on similar creations, but their names didn’t make history for one simple reason: they weren’t first. Fast forward to today, and being first is what businesses care about most when taking new products to market. Yet, in the age of the data explosion, this can only be achieved in one way – using artificial intelligence at the invention stage itself.


October 3, 2018  1:14 PM

How planning, data, and ‘digital string’ can combat the global food waste crisis

Brian McKenna

This is a guest blogpost by Colin Elkins, global industry director, process manufacturing, IFS.

Global food waste, or food loss as it is classified by growers and manufacturers, is around 1.3 billion tons per year, which is one third of all food produced and twice the weight of the world’s current population!

In developing countries, waste is due to inadequate or inappropriate agriculture and storage, whereas in developed countries it is caused by consumption. In the US, consumption accounts for a staggering 40% of all food wasted, whereas in sub-Saharan Africa only five per cent of food is wasted by the consumer.

Our agricultural and production systems today are a wonder to behold: computer-based crop and animal management, GPS-based harvesting and field fertilisation techniques, robotic graders and sorters, and vision recognition and rejection systems are all part of the journey from our fields to our factories. The use of these and other technologies has made the manufacturing element of the supply chain the lowest waster of food in developed countries. There is, however, one exception: fresh produce. Fruit and vegetables are among the highest contributors to waste. This is not due to poor technology or practice, but to the perceived demands of retailers and consumers.

Retailers’ desire to make their products look attractive to the consumer has driven up waste levels for growers and producers. Size, shape, colour and consistency all form part of the ‘retailer specification’, outside of which the produce is simply discarded and returned to the soil. Yields of less than 50% are not uncommon for minor deviations from specification.

Retailers are often seen as the main villain in terms of creating this ‘artificial’ waste, but are they simply pandering to consumer needs? Consumer demands for perfect produce are also a key culprit.

Technology will not solve the food waste problem completely, as a cultural change is required on the part of the consumer; however, it can go part of the way. Growers and producers already innovate by developing smarter farms, factories and processes to reduce waste, using the latest technologies such as AI and the Internet of Things (IoT) in conjunction with their ERP software.

Where food manufacturers can play a part in waste reduction is in the planning phase. What many consumers don’t realise is the level of planning that goes into production and packaging to supply the correct amount for consumer needs. The combination of high demand from retailers and consumers means that if supply does outstrip demand, both the fresh produce and its packaging are often discarded.

Prediction and planning are areas that technology has already redefined in recent years, and the manufacturing industry is no different. Until fairly recently, food manufacturers were relying on gut feeling, sporadic weather reports and disparate systems in order to plan for yield and supply.

In the same way that predicting the weather is now far more data-driven than ever before, the same is true for predicting demand and matching supply for fresh food products. The IoT and its network of physical devices, vehicles, sensors, appliances and other items embedded with electronics are producing vast volumes of potentially critical and valuable data for the food manufacturing industry, and the ability of these devices to connect and exchange data with the planning function is key. The future of the industry is not simply in each smart device, but in the connectivity between them – something I refer to as ‘digital string’.

The strings of data must converge to a central, visible and accessible point within the manufacturing organisation, so that the planning decision-makers have this critical data delivered to them in one dashboard. Utilising data from internal sources like sales, promotions and forecasts is normal today, but with the use of artificial intelligence it will be possible to integrate that data with external sources like weather patterns, public events and holidays, and then analyse and learn from the results to optimise the plan and reduce waste. The data is out there, ripe for the picking, and if food manufacturers can harvest it and bring it within the factory to inform planning processes, the industry can do much more to reduce waste, whilst making significant savings.
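As a loose illustration of that kind of blending – not IFS’s planning engine, and with made-up numbers throughout – the sketch below fits a simple model on internal sales history plus external signals (forecast temperature and a holiday flag) to suggest production volumes for a fresh product.

```python
# Illustrative only (not IFS's planning software): blend internal sales history
# with external signals (forecast temperature, holiday flag) to predict demand
# for a fresh product, so production can be planned closer to real need.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical daily history: forecast max temp (C), holiday flag, units sold.
history = np.array([
    [18, 0, 420],
    [22, 0, 510],
    [25, 0, 590],
    [27, 1, 680],
    [16, 0, 390],
    [21, 1, 560],
])
X, y = history[:, :2], history[:, 2]

model = LinearRegression().fit(X, y)

# Plan next week's run from the weather forecast and the calendar.
next_days = np.array([[24, 0], [26, 1], [19, 0]])
plan = model.predict(next_days)
print("Planned units per day:", np.round(plan).astype(int))
```

A production planner would obviously use far more data and a richer model, but the principle – external ‘digital string’ feeding the internal plan – is the same.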

From an ideological perspective, a fundamental change is required on the part of the consumer: to reduce packaging consumption and accept a wider spectrum of perceived ‘quality’ when it comes to fresh produce. Reducing food waste is everyone’s responsibility, and like many other sustainability efforts, technology is available to manufacturers and producers to help them do their bit, whilst driving efficiencies and cost savings within their business. It’s a win-win.


October 3, 2018  1:01 PM

How a microservices-based architecture can reimagine enterprise software

Brian McKenna

This is a guest blog post by Savinay Berry, VP of Cloud Services at OpenText

Microservices are an increasingly important way of delivering an alternative to traditionally large, complex and monolithic enterprise applications. With a microservices architecture, applications can be broken down and developed through a collection of loosely coupled services that allow continuous development without having to redeploy the entire application. It is essentially a variant of service-oriented architecture (SOA).
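As a minimal sketch of what one such loosely coupled service might look like – using Flask purely as an example framework, and an invented “quotes” capability – the snippet below exposes a single, narrowly scoped piece of functionality that can be developed, deployed and scaled independently of the rest of an application.

```python
# A minimal sketch of a single, independently deployable microservice (Flask is
# one possible framework). Each service owns one narrow capability and talks to
# other services over HTTP/JSON rather than living inside a monolith.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store stands in for this service's own datastore.
QUOTES = {}

@app.route("/quotes", methods=["POST"])
def create_quote():
    payload = request.get_json()
    quote_id = len(QUOTES) + 1
    QUOTES[quote_id] = {"customer": payload["customer"], "amount": payload["amount"]}
    return jsonify({"id": quote_id, **QUOTES[quote_id]}), 201

@app.route("/quotes/<int:quote_id>", methods=["GET"])
def get_quote(quote_id):
    quote = QUOTES.get(quote_id)
    return (jsonify(quote), 200) if quote else (jsonify({"error": "not found"}), 404)

if __name__ == "__main__":
    # Deployed, scaled and updated independently of any other service.
    app.run(port=5001)
```

The point is the boundary, not the framework: the service can be rewritten, redeployed or scaled without touching anything else in the estate.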

Think like IKEA

For a simpler way to think about how a microservices architecture works, take the example of how the making and buying of furniture for the home has evolved. Traditionally, if someone needed a sofa or a dining room table, they would go out to a furniture store and buy the item for a few thousand pounds. They would get good use out of it and might buy a maintenance plan to cover any repairs.

Then along came Ingvar Kamprad, a 17-year-old Swedish boy who started a business selling replicas of his uncle Ernst’s kitchen table. That company became IKEA and broke the traditional model of buying furniture as a complete product that was already assembled. Instead, each item of furniture was broken up into easily assembled pieces that people could put together themselves. Manufacturing became cheaper as a result, and there was no maintenance or ongoing cost. The customer gets the modern dining room table or bookshelf they wanted, but at a fraction of the cost compared to a traditional furniture store. Efficiencies also resulted for IKEA; the modular design meant that operating logistics were simplified, making things cheaper at the back end.

The same is now happening with software. Traditional on-premises software is expensive, plus there is the cost of ongoing maintenance and support. As in the IKEA analogy, software is instead broken up into easily consumable services, with a lower cost model.

Adoption of microservices, and cloud microservices in particular, is growing fast. Research and Markets predicts the global cloud microservices market will grow from $683.2m in 2018 to $1.88bn by 2023, driven largely by digital transformation and the retail and ecommerce sector, where customer experience is a key competitive differentiator.

A way to unleash the power of data

How does a microservices architecture work in the real world? Take the example of a large insurance company that provides home and buildings insurance.

Previously, documents, insurance contracts and financial paperwork would be taken from the content management system and emailed to the builder. The builder would complete them and email them back. The insurance administrator would then print the email, scan the documents and put them into the content management system. This was happening on a daily basis, thousands of times a year.

Using a microservices approach, the insurance company was able to digitise the entire process. The administrator can now share documents with a couple of clicks, the builder can edit them, and the edited versions are automatically sent back into the content store. There is no longer a need to print or scan documents. This is what digitising businesses means in the real world. It’s not just a high-level, abstract IT industry concept. It’s a real application that saves time and money for enterprises.
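A hypothetical sketch of that digitised flow might look like the following. The base URL and endpoint names are invented for illustration and are not OpenText’s actual API; the point is that the “share and return” steps become two small service calls against the content store instead of an email-print-scan loop.

```python
# Hypothetical sketch of the digitised flow (endpoints and URL are invented,
# not OpenText's actual API): instead of emailing, printing and re-scanning,
# the administrator shares a document via a content-service call and the
# builder's edits land straight back in the content store.
import requests

CONTENT_API = "https://content.example.com/api/v1"  # placeholder URL

def share_with_builder(document_id: str, builder_email: str) -> str:
    """Create an edit link for the builder; no attachment ever leaves the store."""
    resp = requests.post(
        f"{CONTENT_API}/documents/{document_id}/share",
        json={"recipient": builder_email, "permission": "edit"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["share_url"]

def check_status(document_id: str) -> str:
    """Edited versions are saved back automatically; just poll the status."""
    resp = requests.get(f"{CONTENT_API}/documents/{document_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()["status"]  # e.g. "awaiting_builder" or "returned"
```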

Another example is a company in the utility sector, which will be able to use artificial intelligence (AI) to help maintenance crews search for parts. If the crews are searching for a part, they are prompted with similar or related searches, much like Amazon’s recommendation engine approach. It is about leveraging the power of data stored in your own systems. And while the algorithm is important, it’s not the key factor. What really matters is the data – that’s the true differentiator.
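As a toy illustration of that recommendation idea – not the utility company’s actual system, and with invented part names and sessions – related parts can be suggested simply from co-occurrence in past search sessions, which is exactly the “your own data is the differentiator” point.

```python
# Toy sketch (invented data): suggest related parts from co-occurrence in past
# maintenance-crew search sessions, in the spirit of a recommendation engine.
from collections import Counter, defaultdict

past_sessions = [
    ["valve-32", "gasket-07", "seal-kit-2"],
    ["valve-32", "seal-kit-2"],
    ["pump-11", "gasket-07"],
]

co_occurs = defaultdict(Counter)
for session in past_sessions:
    for part in session:
        co_occurs[part].update(p for p in session if p != part)

def related(part: str, n: int = 3):
    """Parts most often searched alongside `part`."""
    return [p for p, _ in co_occurs[part].most_common(n)]

print(related("valve-32"))  # ['seal-kit-2', 'gasket-07']
```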

Enabling innovation and transformation

But microservices aren’t just about making things easier to build and maintain, and cheaper to deliver and run. They are also about what a microservices approach enables enterprises to do: it provides the platform for businesses to completely reimagine applications and, in doing so, transform the customer experience.

A microservices architecture is more resilient and reliable. If one service goes down, the rest of the application will still be available to the customer. Microservices are also more scalable and can easily be configured to automatically scale up and down to meet peaks and troughs in demand, ensuring the customer experience is maintained without holding on to excess capacity.

Microservices also enable faster innovation, development and deployment. Developers have more freedom, because code for different services can be written in different languages. It allows for easy integration and automatic deployment of services and accelerated time-to-market for new services and products, or the ability to easily make enhancements to existing ones.

Microservices are a key enabler in the world of cloud and digital. But it’s not just about developers, code and technology. As with the cloud, it’s not simply a deployment model change – it’s a mindset and cultural change. And that’s what true digitisation provides – while inspiring new ways to work.


September 21, 2018  3:28 PM

It’s time to tackle your dirty data

Brian McKenna

This is a guest blogpost by Francois Ajenstat, Chief Product Officer, Tableau

Data analysis is only as good as the quality of your data. Anyone who has ever analysed data knows that the information is rarely clean. You may find that your data is poorly structured, full of inaccuracies, or just plain incomplete. You find yourself stuck fixing the data in Excel or writing complex calculations before you can answer even the simplest of questions.

Data preparation is the process of getting data ready for analysis – including cleaning, transforming and filtering data sets – and it is a crucial part of the analytics workflow. People spend 80% of their time prepping data and only 20% of their time analysing it, according to a recent article from Harvard Business Review.
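As a minimal example of those prep steps – written as hand-coded pandas purely for illustration, since the article is really about self-service tooling, and using invented sample rows – the sketch below standardises labels, coerces messy numerics and filters out incomplete records.

```python
# A minimal pandas sketch of typical prep steps: standardise labels, coerce
# messy numeric strings, and filter incomplete rows (sample data is invented).
import pandas as pd

raw = pd.DataFrame({
    "region": ["North", "north ", None, "South"],
    "sales":  ["1,200", "950", "780", "n/a"],
})

clean = (
    raw
    .assign(
        # Standardise inconsistent labels.
        region=raw["region"].str.strip().str.title(),
        # Coerce messy numerics; unparseable values become NaN.
        sales=pd.to_numeric(raw["sales"].str.replace(",", ""), errors="coerce"),
    )
    .dropna(subset=["region", "sales"])   # filter incomplete rows
)

print(clean)
```

A visual prep tool does the same work interactively; the value of self-service is that the analyst, not IT, decides what counts as “clean”.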

Most analysts in an organisation feel the pain of dirty data. The amount of time and energy it takes to go from disjointed data to actionable insights leads to inefficient/incomplete analyses and declining trust in data and decisions. These slower processes can ultimately lead to missed opportunities and lost revenue.

Organisations spend a lot of time building curated data sets and data warehouses to support the needs of the business. But even with these practices, it is likely that some level of dirty data will seep through the cracks of day-to-day operations. So, how can we solve the common data prep issues?

Issue #1: More data creates more problems

We live in a world where a variety of data is constantly being generated. With this windfall of data now flowing faster than many business processes, organisations are struggling to keep up. We also can’t be certain how this information will be used in the future.

Solution: Enable self-service data preparation

Visual, self-service data prep tools allow analysts to dig deeper into the data to understand its structure and see relationships between datasets. This enables the user to easily spot unexpected values that need cleaning. Although this technology brings clarity to the data, people will still need support to understand details like field definitions.

Issue #2: Dirty data requires a waterfall approach

Analysts report that the majority of their job is not analysis, but cleaning and reshaping data. Every time new data is received, analysts need to repeat manual data preparation tasks to adjust the structure and clean the data for analysis. This ultimately leads to wasted resources and an increased risk of human error.

Solution: Develop agile processes with the right tools to support them

Every organisation has specific needs and there is no ‘one-size-fits-all’ approach to data preparation, but when selecting a self-service data preparation tool, organisations should consider how the tool will evolve processes towards an iterative, agile approach instead of creating new barriers to entry.

Issue #3: Data preparation requires deep knowledge of organisational data

Before preparing data, it is crucial to understand its location, structure, and composition, along with granular details like field definitions. Some people refer to this process as “a combo of data” and it is a fundamental element of data preparation. You would not start a long journey without a basic understanding of where you’re going, and the same logic applies to data prep.

But often, because of information silos, users have less insight into the entire data landscape of their organisation—what data exists, where it lives, and how it is defined. Confusion around data definitions can hinder analysis or worse, lead to inaccurate analyses across the company.

Self-service data preparation tools put the power in the hands of the people who know the data best—democratising the data prep process and reducing the burden on IT.

Solution: Develop a data dictionary

One way to standardise data definitions across a company is to create a data dictionary. A data dictionary helps analysts understand how terms are used within each business application, showing which fields are relevant for analysis versus those that are strictly system-based.
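One lightweight way to represent such a dictionary – an illustration rather than a prescribed format, with invented field names – is as structured metadata recording a definition, an owner and whether a field is meant for analysis or is strictly system-level.

```python
# An illustrative (not prescriptive) data dictionary: each field gets a
# definition, an owner, and a flag saying whether analysts should use it.
data_dictionary = {
    "customer_id": {
        "definition": "Unique identifier assigned at account creation",
        "owner": "CRM team",
        "for_analysis": True,
    },
    "row_version": {
        "definition": "Internal optimistic-locking counter",
        "owner": "Database administrators",
        "for_analysis": False,   # system field, hide from analysts
    },
}

analysis_fields = [f for f, meta in data_dictionary.items() if meta["for_analysis"]]
print(analysis_fields)  # ['customer_id']
```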

Developing a data dictionary is no small task. Data stewards and subject matter experts need to commit to ongoing iteration, checking in as requirements change. If a dictionary is out of date, it can actually do harm to your data strategy.

Issue #4: “Clean data” is a matter of perspective

Different teams have different requirements and preferences regarding what makes “well-structured” data. For example, database administrators and data engineers prioritise how data is stored and accessed—and columns may be added that are strictly for databases to leverage, not humans.

If the information that data analysts need is not already in the data set, they may need to adjust aggregations or bring in outside sources. This can lead to silos or inaccuracies in the data.

Solution: Put the power in the hands of the data experts

Self-service data prep gives analysts the power to polish data sets in a way that matches their analysis, leading to faster ad-hoc analyses and allowing them to answer questions as they appear. It also reduces the burden on IT to restructure the data whenever an unanticipated question arises. It can reduce duplicated effort, too, because other analysts can reuse these models. If the datasets prove valuable on a wide scale, you can combine them into a canonical set in the future.

Issue #5: The hidden reality of data silos

Self-service business intelligence tools have opened up data analysis capabilities to every level of user, but to get insights into their data, users still need to wait and rely on IT for well-structured data. If people are not willing to wait, they will simply extract data into Excel. This often leads to data littered with unvetted calculation errors and, eventually, to inconsistent analysis. Repeating this process leads to an abundance of data silos, which are not efficient, scalable or governed.

Solution: Create consistency and collaboration within the data prep process

Combatting silos starts with collaboration. Scheduling regular check-ins or a standardised workflow for questions allows engineers to share the most up-to-date way to query and work with valid data, while empowering analysts to prepare data faster and with greater confidence.

Companies should enable employees to do the prep themselves in an easy-to-use tool that they can share with others. This will mean organisations can see what data employees use and where it is duplicated, so they can create processes that drive consistency.


August 15, 2018  11:15 AM

Why Graph Needs an SQL

Brian McKenna

This is a guest blogpost by Neo4j co-founder and CEO Emil Eifrem, who explains why his sector needs its own ‘General Query Language’.

In the same way that SQL helped facilitate the growth and subsequent domination of RDBMS over the database market for the last two decades, to secure its next stage of enterprise acceptance, graph software needs to create a standard property graph query language.

Why? Because, put at its simplest, a common query language could open the doors to graph technology being used more widely, in exactly the same way the lingua franca of SQL propelled relational databases to the summit 30 years ago.

The main reasons why are detailed in an open letter we, as Neo4j, are publishing for the entire database world, the GQL Manifesto, and I’d like to summarise why here. Basically, there are two groups that need to come together to achieve a common graph industry language. There is industry, like Neo4j with Cypher and Oracle with the PGQL language, and academia. The property graph researchers in academia have moved on from query languages like XPath for querying documents or SPARQL and RDF, and are now interested in the wider property graph context.

A next-generation graph query language

Over the last 18 months there’s been a lot of behind-the-scenes activity on unification that led to the Manifesto and there’s a lot more commonality we can build on. Oracle has achieved some great innovations with PGQL, and PGQL is very close to Cypher, with the two languages coming from the same fundamental design approach. That means that you’ve got the primary designers of Cypher and PGQL already collaborating with each other (for example, on limited graph querying extensions to SQL) and with researchers on what a good, next generation graph querying language needs to look like.

So what we need to do is to take all that intellectual ferment and energy and turn it into actuality. There are two key ways to do this. One is making it even easier to express complex patterns in a very concise way in a graph database – regular path queries, and regular expressions, which can be used to define the pattern of things you want to look at. Second, there’s composable graph querying, where you glue together queries and one query can feed off the output of another, using that output as its input.
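As a small, concrete example of that kind of concise pattern matching, the sketch below runs a variable-length path query written in Cypher (one of the inputs to a future GQL, not GQL itself) through the official neo4j Python driver. The connection details, labels and data are placeholders.

```python
# A sketch of concise graph pattern matching in Cypher, run via the official
# neo4j Python driver. URI, credentials and data model are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# A variable-length path pattern: people reachable from Alice through one to
# three KNOWS hops - the sort of query that is verbose to express in SQL.
QUERY = """
MATCH (a:Person {name: $name})-[:KNOWS*1..3]->(friend:Person)
RETURN DISTINCT friend.name AS name
"""

with driver.session() as session:
    for record in session.run(QUERY, name="Alice"):
        print(record["name"])

driver.close()
```

Composable querying, the second ingredient mentioned above, would let the output of a query like this feed directly into another graph query rather than being flattened into tables first.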

Put that together and we have a GQL at least as good as SQL’s first implementation – and we need to find a way to make that happen. Many graph database practitioners have, I believe, come to the conclusion that these different languages should come together and be the inputs to a new unified language for the whole market.

Hence the manifesto campaign – a joint, open, collaborative initiative to which we want everyone to contribute, including developers, managers and line-of-business users looking for the kind of innovation and creative data modelling you can only get from this form of database technology.

If you agree with me and graph database advocates like us that the time is right to create one standard property graph query language, then we would love to receive your vote. Find out more at The GQL Manifesto main website [https://gql.today].

We hope you agree – and we can add your voice to the discussion.

The author is co-founder and CEO of Neo4j (http://neo4j.com/).


August 14, 2018  10:36 AM

How automation can help create value, the 5th ‘v’ of data

Brian McKenna

This is a guest blogpost by Neil Barton, CTO, WhereScape

For CDOs, defining the data and analytics strategy for their organization is increasingly the most important aspect of their job, with 86% ranking it their top responsibility, according to the 2017 Gartner annual Chief Data Officer Survey. The four “Vs” of data are well known – volume, velocity, variety and veracity – but data warehousing infrastructure in many organizations is no longer equipped to handle them. The fifth “V” – value – is even more elusive. Meeting these challenges at the scale of data that modern organizations hold requires a new approach – and automation is the bedrock.

It’s all about finding methods of using data for value creation and revenue generation, which occupies 45% of a CDO’s time. This means harnessing the growing beast that is data in a way that is practical, manageable and useful.  That’s where the data warehouse comes in, providing a centralized space for enterprise data that business users, including the CDO, can use to derive insights.

Creating a successful data warehouse, then, is critical for CDOs to succeed in monetizing data within their organization. However, the traditional waterfall approach to data warehousing, first introduced in the 1970s, delivers only a fraction of the value that it could potentially offer. Instead, the approach needs to evolve to become more responsive as organizational needs change, addressing new data sources and adapting to business demand.  Using automation software to design, develop, deploy and operate data warehouses is providing far-reaching value to business leaders. This change is positioning IT to incorporate timely technologies and new data sources more easily, as well as flex when business needs demand it.

The CDO’s role in re-invigorating the data warehouse

As the central storage point for enterprise data, the data warehouse is invaluable for providing business users with information. However, as users become more aware of the potential benefits of data-driven decision making, the gap between user expectations and the data warehouse’s ability to provide current, consumable data in a timely manner has grown. Businesses want insights from data far faster than previously. This is exacerbated with the growth of new forms of data, particularly semi-structured or un-structured information, including social media data, sensor data, real-time messages and client communications, and video/audio files.

Traditionally, data warehouse development and evolution required long-cycle IT projects, incongruent with the needs of a more agile project design and build environment. As the change agents for digital transformation, CDOs must lead in the re-architecting of data warehouses, from artistry to acceleration and automation in order to improve business time-to-value.

Practical steps for the successful implementation of an automated data warehouse

As IT departments are expected to do much more with much less, processes need to change. Data warehouses can no longer be created “artisanally” – rather than crafting a complex infrastructure with unique configurations and a long lifespan, IT teams need to focus on producing an adaptable decision support infrastructure. This means creating a data warehouse that can transform, with ease, as business needs change. Here are four steps for CDOs to help their company achieve this:

  1. Understand the desired outcomes

Before making any decisions as to the future of your data warehouse infrastructure, CDOs need to ensure they understand the specific challenges the business teams are facing where data could help. In essence, the data warehouse automation and modernization program needs to be built around enabling decision-making that will lead to differentiation in the market place.

According to a TDWI survey, re-alignment with business objectives is the top reason for data warehouse modernization, selected by 39% of respondents. By enabling collaboration between business teams and IT teams, the CDO helps chart the course for how business goals and technology meet. In turn, this will lead to overall business transformation, accelerated through the new data warehouse’s approach to data-driven decisions.

  2. Understand what you have already

Most organizations already have sophisticated data management tools deployed as part of their infrastructure – however these may not be working to the fullest of their abilities. Organizations already using SQL Server, Oracle, or Teradata, for example, have a range of data management and data movement tools, already within their IT real estate, which can be automated and leveraged more effectively as part of a data warehouse automation push.

However, in that inventorying process, CDOs should ensure they have considered the capacity requirements of their data warehouse. Data is expected to continue growing exponentially, so while the data warehouse may be fit for purpose today, it is important that the automation processes, storage requirements and general infrastructure are of a speed and standard capable of handling this in the future too.

As part of this, data warehouse automation needs to integrate with the business as it currently is and will realistically be in the future, rather than the business as the IT teams wish it might be in an ideal world. CDOs need to encourage their teams to understand the data that is available, and the automated analytics and evaluation processes which can be used to meet specific business priorities. To support this, the data warehouse automation strategy needs to be designed not just for an ideal set up of data, expertly managed and curated, but for the realistic “messiness” of the business data landscape.

  3. Automate efficiently

Data warehouse automation, as with any other large-scale transformation project, requires resources – and these are often scarce due to strict budgets and competing priorities. This means that CDOs need to think long and hard about what should actually be automated in order to free up man hours in the future. Good examples of cost-effective candidates for automation are hand-coded SQL, manually written scripts and manual metadata management. All of these are systematic processes where data warehouse automation can either eliminate the need for human involvement or dramatically accelerate it, as the sketch below illustrates.
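The sketch is a toy example of that kind of systematic, metadata-driven work – not how WhereScape’s product itself operates – generating staging-table DDL from a small metadata description instead of hand-coding each CREATE TABLE statement. The table and column names are invented.

```python
# Toy example of metadata-driven code generation (not WhereScape's product):
# produce staging-table DDL from a metadata description instead of hand-coding
# the SQL for every table.
TABLES = {
    "stg_customer": {"customer_id": "INTEGER", "name": "VARCHAR(100)", "created_at": "TIMESTAMP"},
    "stg_orders":   {"order_id": "INTEGER", "customer_id": "INTEGER", "amount": "DECIMAL(12,2)"},
}

def ddl_for(table: str, columns: dict) -> str:
    """Render a CREATE TABLE statement from column metadata."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"

for table, columns in TABLES.items():
    print(ddl_for(table, columns))
```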

  4. Embrace change

CDOs should look at data warehouse modernization and automation as an avenue of constant, on-going development. As business needs change and new data sources emerge, CDOs need to be able to re-strategize different parts of the infrastructure to match. Similarly, to minimize disruption and ease the transition for business users, CDOs should look to take a staged approach to the initial automation and modernization process, with a set schedule of when different requirements will be met. Post-production change is inevitable due to evolving business needs, new technologies used and continuous improvement desired. Change needs to be planned for.

At the same time CDOs need to prepare for the human change that automation will create. In business teams, users can be re-deployed to double down on analyzing business intelligence and translating insight into business value. In the IT teams, automation provides new capacity to plan for the future – looking at new analytics tools, or planning for smarter, better ways to deliver on business priorities further down the line.

A data warehouse automation mentality

Data warehouse automation is not solely software you buy. It’s a philosophy and culture you implement. Tools and technologies form the bedrock of the processes, but a data warehouse strategy requires strong leadership, a transparent process, and an unrelenting focus on the business’s end goals in order to succeed.

Without robust data warehouse automation, businesses will struggle to capitalize on the potential of data and its associated technologies. As the strategic lead for data-driven transformation, and the change agents across both business and IT teams, the responsibility falls to CDOs. Professionals in this role need to understand, strategize, and execute on the way that large-scale data usage will influence future business decisions. The adaptability of the supporting data infrastructure can either be a CDO’s greatest weakness or greatest asset. Use the four steps covered in this guide to ensure it is the latter, and to achieve the ultimate goal of any business investment – value.


July 17, 2018  3:31 PM

Era of the Smart Factory: how can manufacturers get to the future quicker?

Brian McKenna

This is a guest blogpost by Antony Bourne, Industries President at IFS

Businesses are constantly told they will go bust if they do not digitally transform, and quickly – but for manufacturers it is not quite that simple. If production goes down even for a day, the impact can be irreversible, with millions in revenue potentially lost.

From an ergonomic, economic, and environmental perspective, smart factories – where robots and machines relay real-time data from connected devices to learn and adapt to new demands and autonomously manage entire production processes – are heralded as the way manufacturers can strive towards true digital transformation.

However, for many the smart factory is a faraway utopia. According to a recent Capgemini study, while 76% of manufacturers have an ongoing smart factory initiative or are working towards one, only 14% are satisfied with their level of success. Even so, manufacturers should not feel overwhelmed, and those embarking on the journey should not change too much too soon, or do so without the proper counsel.

Here are some essentials for any manufacturer considering a smart factory rollout.

  1. Business first approach

Business owners, project managers and directors should want to roll out a smart factory initiative not to tell customers or the board that they are ‘digitally transforming’, but to achieve better business results: futureproofing the business and deriving greater value from production plants and the entire ecosystem. Importantly, a smart factory should form part of a wider, interconnected IT landscape and be a holistic endeavour.

One business goal for any manufacturer should be achieving full visibility across all assets. As we’ve witnessed recently in the construction industry with the collapse of major contractor Carillion, achieving visibility across a business is paramount. This means manufacturers need to become even more data-driven. Data is the lifeblood of a business, and the more real-time data a manufacturer can draw value from, the more likely they are to achieve full transparency across their estate.

Smart factories give manufacturers the ability to benefit from the accumulation of massive amounts of valuable data. This enables them to identify and address quality issues; rapidly learn and adapt to new demands; predict operational inefficiencies or oscillations in sourcing and demand, and respond to customers’ needs ahead of time.

  2. Connecting physical & digital technologies

Some manufacturers have had mechanical systems in place for over 50 years. With the emergence of technologies such as Robotic Process Automation (RPA) that present the opportunity to drive transformation in the sector, it is important that manufacturers take advantage and replace outdated machinery where possible, not just update it.

Manufacturers must also move their datacentre to the cloud. Escaping legacy systems enables manufacturers to scale horizontally and vertically quickly, as well as reduce costs, boost efficiency and lower time to recovery. Running off a cloud infrastructure is key to maximising the value and ROI of real-time data that a smart factory produces. For manufacturers yet to make the leap, moving disaster recovery and backups to the cloud is a good place to start.

One technology essential to any smart factory, sitting between physical and digital infrastructure, is the Internet of Things (IoT). The benefits of the Industrial Internet of Things (IIoT) have been widely espoused. IoT devices derive valuable data from sensors outside the manufacturing plant and from machines on the shop floor but, importantly, they also enable businesses to expand their operations and offer new services to customers. Servitization is upon us, and manufacturers not focused on adding services to their products to make them more outcome-based with the help of IoT solutions will miss out on valuable business opportunities, or worse.
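As a minimal, vendor-neutral sketch of shop-floor sensor data being put to work (the readings, window and threshold are invented), the snippet below flags a machine when its vibration drifts well above its recent average, so maintenance can intervene before an unplanned stop.

```python
# Minimal sketch (illustrative values, not a vendor implementation): flag a
# machine when a vibration reading drifts well above its recent average.
from collections import deque

WINDOW = 20          # number of recent readings to average over
THRESHOLD = 1.5      # flag if a reading exceeds 1.5x the rolling average

recent = deque(maxlen=WINDOW)

def check_reading(vibration_mm_s: float) -> bool:
    """Return True if this reading looks anomalous against recent history."""
    anomalous = bool(recent) and vibration_mm_s > THRESHOLD * (sum(recent) / len(recent))
    recent.append(vibration_mm_s)
    return anomalous

stream = [2.1, 2.0, 2.2, 2.1, 2.3, 4.8, 2.2]   # hypothetical readings (mm/s)
for value in stream:
    if check_reading(value):
        print(f"Alert: vibration {value} mm/s well above recent average")
```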

  3. Focus on people and partnerships

The face of the shop floor will change unimaginably over the coming years. Robots will work autonomously 24/7 in a windowless, lightless environment; but despite this, one aspect won’t change – people will still be a manufacturer’s most important asset. Reskilling and upskilling employees, opening their eyes to the art of the possible and getting them on board with change – a change that will likely reshape their purpose in the workplace – is a massive but necessary undertaking.

Manufacturers also need to seek expert advice, or partner with technology providers knowledgeable and proven in the manufacturing arena that can help them realise better business benefits.

  4. Start small and scale

Smart factory investments need to be broken down into bite-sized chunks, with specific opportunities realised first. Value and growth can be created through scaling a single asset and testing the processes and technologies around it. There are some incredible centres in the UK that can help manufacturers test these concepts in a safe, manageable environment.

Take Catapult. Catapult is a network of physical centres designed to help organisations transform ideas into new products and services. Manufacturers can take advantage of initiatives like this to safely test assets with zero downtime in production. Once perfected, the solution can then scale to other assets of the business, such as production lines and factories, with complete confidence in performance.

The true power of the smart factory is its ability to adapt and grow with the shifting needs of an organisation. Manufacturers need to take one step at a time in getting to the future, and recognise there is the right expertise out there to help them achieve their business goals, by implementing a harmonious, automated, and resilient smart factory.


June 21, 2018  10:35 AM

The external workforce: are you realising its full value?

Brian McKenna

This is a guest blogpost by Mikael Lindmark, a senior vice president at SAP Fieldglass

As CIOs and other executives across enterprises turn to contractors, freelancers and other external workers to fill  skills gaps and meet their strategic objectives, they are transforming the way work gets done. However, many organisations lack the management rigour and C-suite attention to reap the full value of their external workforce.

Multi-channel workforce moves to core of business operations

A recent survey we at Fieldglass completed in collaboration with Oxford Economics signaled that UK businesses rely heavily on the external workforce, which includes non-payroll workers and services providers. In fact, nearly nine out of 10 senior executives in the UK (87%) who participated in the survey say the external workforce is essential for them to operate at full capacity and meet market demands—much greater than the global average (65%). Furthermore, this rising multi-channel workforce of contingent workers, independent contractors, alumni (i.e. a business’ former employees) and services providers now accounts for nearly half of workforce spend in the UK (46%, slightly higher than the 44% global average).

When it comes to labour market trends, nearly three-quarters (74%) of UK executives say finding high-quality resources at the right time and in the right place is very or extremely challenging, and in just three years, 87% say the external workforce will be critical for sourcing skills that are in scarce supply. Shortfalls in newer technologies such as artificial intelligence, machine learning, blockchain, automation, cloud and the Internet of Things are particularly severe in the UK, where 29% of executives, compared to 13% globally, call this a “major shortfall”.

Cost is not the whole story

Increasingly external workers operate at the core of businesses. Not that long ago, companies discussed how and where to use external workers based on an assessment of what was core or non-core to the business as they sought to lower operational costs. Now, nearly half (45%) of executives believe they could not conduct business as usual without their external workforce. External workers are filling roles across enterprises, and contributing to improvements in businesses’ overall financial performance, according to 55% of respondents in the UK.

In addition to making it possible for businesses to operate at capacity and meet demand, UK executives have a greater tendency than the global average (81% as compared to 62%) to credit the external workforce with being important or extremely important to achieve sustainability goals/shrink their carbon footprint. UK executives also look to their external workforce to increase organisational agility (71%), reduce risk (71%), increase speed to market (68%) and develop or improve products and services (61%). Managing costs remains important (77% compared to 60% globally).

Looking beyond traditional metrics, 71% said their external workforce raises the bar for their employees by bringing in new sources of skills and talent.

Managing and extracting value from the external workforce

Visibility provides a basis for measuring and managing results, yet many businesses lack an adequate line of sight into their external workforce. Management of this workforce segment has not fully matured at most businesses. Consequently, companies are not realizing the most value from this important asset.

Around one in 10 companies that responded to our global survey demonstrate markedly superior performance in managing and extracting value from the external workforce—the “Pace setters.”  They are “highly informed” about five of eight areas that executives consider “very important” for managing non-payroll workers and services providers. For example: What are they doing? Are they being paid negotiated rates? What are the contract terms? What is the duration of their work?  Do they have the required licenses and certifications? What access do they have to systems, confidential information and facilities?

Pace setters also find it easier to manage many, though not all, aspects of their external workforce. For instance, they have greater confidence in their ability to find high-quality talent at the right time, place and rate. They find it easier to avoid payment of duplicate charges and unauthorised expenditures; track who performed well and who should not be re-engaged; and remain compliant with local tax and labor laws, regulatory requirements and candidate privacy requirements.

Pace setters also say their external workforce helps them improve their business’ overall financial/business performance and/or helps them compete in today’s digital world.

The Pace setters show how companies could benefit from better visibility and management of the external workforce. Set goals for your company to emulate or surpass the Pace setters as you establish the management rigor required to harness external resources to achieve better business results.

Mikael Lindmark, a senior vice president at SAP Fieldglass, is responsible for overall operations across Europe. Mikael, who joined SAP Fieldglass more than a decade ago, has more than 20 years of experience in human capital development and management, and is based in the UK.

