Data Matters


October 3, 2018  1:14 PM

How planning, data, and ‘digital string’ can combat the global food waste crisis

Brian McKenna

This is a guest blogpost by Colin Elkins, global industry director, process manufacturing, IFS.

Global food waste, or food loss as it is classified by growers and manufacturers, is around 1.3 billion tons per year, which is one third of all food produced and twice the weight of the world’s current population!

In developing countries, waste is due to inadequate or inappropriate agriculture and storage, whereas in developed countries it is driven by consumption. In the US, consumption accounts for a staggering 40% of all food wasted, whereas in sub-Saharan Africa only 5% of food is wasted by the consumer.

Our agricultural and production systems today are a wonder to behold: computer-based crop and animal management, GPS-based harvesting and field fertilisation techniques, robotic graders and sorters, and vision recognition and rejection systems are all part of the journey from our fields to our factories. The use of these and other technologies has made the manufacturing element of the supply chain the smallest contributor to food waste in developed countries. There is, however, one exception: fresh produce. Fruit and vegetables are among the highest contributors to waste, not because of poor technology or practice, but because of the perceived demands of retailers and consumers.

Retailers’ desire to make their products look attractive to the consumer has driven up waste levels for growers and producers. Size, shape, colour and consistency all form part of the ‘retailer specification’, outside of which the produce is simply discarded and returned to the soil. Yields of less than 50% are not uncommon for minor deviations from specification.

Retailers are often seen as the main villain in terms of creating this ‘artificial’ waste, but are they simply pandering to consumer needs? Consumer demands for perfect produce are also a key culprit.

Technology will not solve the food waste problem completely, as a cultural change is required on the part of the consumer, but it can go part of the way. Growers and producers already innovate by developing smarter farms, factories and processes to reduce waste, using the latest technologies such as AI and the Internet of Things (IoT) in conjunction with their ERP software.

Where food manufacturers can play a part in waste reduction is in the planning phase. What many consumers don’t realise is the level of planning that goes into production and packaging to supply the correct amount for consumer needs. Because demand from retailers and consumers is so exacting, when supply does outstrip demand, both the fresh produce and its packaging are often discarded.

Prediction and planning are areas that technology has already redefined in recent years, and the manufacturing industry is no different. Until fairly recently, food manufacturers relied on gut feeling, sporadic weather reports and disparate systems to plan for yield and supply.

In the same way that predicting the weather is now far more data-driven than ever before, the same is true for predicting demand and matching supply for fresh food products. IoT and its network of physical devices, vehicles, sensors, appliances and other items embedded with electronics are producing vast volumes of potentially critical and valuable data for the food manufacturing industry, and the ability of these devices to connect and exchange data with the planning function is key. The future of the industry is not simply in each smart device, but in the connectivity between them, something I refer to as ‘digital string’.

The strings of data must converge on a central, visible and accessible point within the manufacturing organisation, so that planning decision-makers have this critical data delivered to them in one dashboard. Utilising data from internal sources such as sales, promotions and forecasts is normal today, but with the use of artificial intelligence it will be possible to integrate that data with external sources such as weather patterns, public events and holidays, then analyse and learn from the results to optimise the plan and reduce waste. The data is out there, ripe for the picking, and if food manufacturers can harvest it and bring it within the factory to inform planning processes, the industry can do much more to reduce waste, whilst making significant savings.
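To make that concrete, here is a minimal sketch of the kind of convergence described above: a hypothetical internal demand forecast is merged with an external weather feed, and the production plan is adjusted accordingly. The products, columns and uplift factor are all invented for illustration, not a real planning model.

```python
# A minimal sketch of the "digital string" idea: merging internal demand
# history with an external weather feed to adjust a production plan.
# Column names and the weather uplift factor are illustrative assumptions.
import pandas as pd

# Internal source: this week's base demand forecast
sales = pd.DataFrame({
    "product": ["strawberries", "lettuce", "tomatoes"],
    "base_forecast_kg": [12000, 8000, 15000],
})

# External source: forecast temperature for the delivery week
weather = pd.DataFrame({
    "product": ["strawberries", "lettuce", "tomatoes"],
    "forecast_temp_c": [24, 24, 24],
})

plan = sales.merge(weather, on="product")

# Simple learned rule standing in for an AI model: warm weather lifts
# demand for salad produce, so scale the plan accordingly.
UPLIFT_PER_DEGREE = 0.02  # assumed 2% extra demand per degree above 18C
plan["adjusted_kg"] = (
    plan["base_forecast_kg"]
    * (1 + UPLIFT_PER_DEGREE * (plan["forecast_temp_c"] - 18).clip(lower=0))
).round()

print(plan[["product", "base_forecast_kg", "adjusted_kg"]])
```

In a real deployment this logic would sit behind the planning dashboard and be refined as actual sales data flows back in.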

From an ideological perspective, a fundamental change is required on the part of the consumer: to reduce packaging consumption and to accept a wider spectrum of perceived ‘quality’ when it comes to fresh produce. Reducing food waste is everyone’s responsibility, and as with many other sustainability efforts, technology is available to manufacturers and producers to help them do their bit, whilst driving efficiencies and cost savings within their business. It’s a win-win.

October 3, 2018  1:01 PM

How a microservices-based architecture can reimagine enterprise software

Brian McKenna

This is a guest blog post by Savinay Berry, VP of Cloud Services at OpenText

Microservices are an increasingly important alternative to traditionally large, complex and monolithic enterprise applications. With a microservices architecture, an application is broken down into a collection of loosely coupled services that can be developed continuously without having to redeploy the entire application. It is essentially a variant of service-oriented architecture (SOA).

Think like IKEA

For a simpler way to think about how a microservices architecture works, take the example of how the making and buying of furniture for the home has evolved. Traditionally, if someone needed a sofa or a dining room table, they would go out to a furniture store and buy the item for a few thousand pounds. They would get good use out of it and might buy a maintenance plan to cover any repairs.

Then along came Ingvar Kamprad, a 17-year-old Swedish boy who started a business selling replicas of his uncle Ernst’s kitchen table. That company became IKEA and broke the traditional model of buying furniture as a complete, pre-assembled product. Instead, each item of furniture was broken up into easily assembled pieces that people could put together themselves. Manufacturing became cheaper as a result, and there was no maintenance or ongoing cost. The customer gets the modern dining room table or bookshelf they want, but at a fraction of the cost of a traditional furniture store. Efficiencies also resulted for IKEA: the modular design simplified operating logistics, making things cheaper at the back end.

The same is now happening with software. Traditional on-premises software is expensive, plus there is the cost of ongoing maintenance and support. Like in the IKEA analogy, software is instead broken up into easily consumable services, at a lower cost model.
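As a toy illustration of that modularity, the sketch below shows one narrowly scoped service with its own HTTP contract, which can be deployed and updated independently of the rest of an application. The service name, endpoint and data are invented for the example; it is a sketch, not a reference implementation.

```python
# A toy, self-contained microservice: one narrowly scoped capability
# (price lookup) behind an HTTP interface, deployable and updatable on
# its own. Service name and payload shape are invented for illustration.
from flask import Flask, jsonify

app = Flask("pricing-service")

PRICES = {"sofa": 499.0, "dining-table": 299.0}  # stand-in data store

@app.route("/prices/<item>")
def get_price(item):
    if item not in PRICES:
        return jsonify({"error": "unknown item"}), 404
    return jsonify({"item": item, "price": PRICES[item]})

if __name__ == "__main__":
    # Other services call this over HTTP; none of them need to know
    # how prices are stored, only the contract of this endpoint.
    app.run(port=5001)
```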

Adoption of microservices, and cloud microservices in particular, is growing fast. Research and Markets predicts the global cloud microservices market will grow from $683.2m in 2018 to $1.88bn by 2023, driven largely by digital transformation and the retail and ecommerce sector, where customer experience is a key competitive differentiator.

A way to unleash the power of data

How does a microservices architecture work in the real world? Take the example of a large insurance company that provides homes and buildings insurance.

Previously, documents, insurance contracts and financial paperwork would be taken from the content management system and emailed to the builder. The builder would complete them and email them back. The insurance administrator would then print the email, scan the documents and put them into the content management system. This was happening on a daily basis, thousands of times a year.

Using a microservices approach, the insurance company was able to digitise the entire process. The administrator can now share documents by just clicking a couple of buttons, the builder can edit the documents, and have them automatically sent back into the content store. There is no longer a need to print or scan documents. This is what digitising businesses means in the real world. It’s not just a high-level, abstract IT industry concept. It’s a real application that saves time and money for enterprises.

Another example is a company in the utility sector, which will be able to use artificial intelligence (AI) to help maintenance crews search for parts. When crews search for a part, they are prompted with similar or related items, much like Amazon’s recommendation engine. It is about leveraging the power of data stored in your own systems. And while the algorithm is important, it’s not the key factor. What really matters is the data – that’s the true differentiator.

Enabling innovation and transformation

But microservices isn’t just about making things easier to build and maintain, and cheaper to deliver and run. It’s also about what microservices enables enterprises to do. This approach provides the platform for businesses to completely reimagine applications and, in doing so, transform the customer experience.

A microservices architecture is more resilient and reliable. If one service goes down, the rest of the application will still be available to the customer. Microservices are also more scalable and can easily be configured to automatically scale up and down to meet peaks and troughs in demand, ensuring the customer experience is maintained without holding on to excess capacity.
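A minimal sketch of that resilience property, assuming a hypothetical recommendation service: if the dependency is slow or down, the caller falls back to a sensible default rather than failing the whole page.

```python
# A sketch of graceful degradation: if one service is unavailable, the
# caller returns a fallback instead of failing the entire request.
# The recommendations URL and fallback list are illustrative assumptions.
import requests

def get_recommendations(user_id):
    try:
        resp = requests.get(
            f"http://recommendation-service/users/{user_id}/recommendations",
            timeout=0.5,
        )
        resp.raise_for_status()
        return resp.json()["items"]
    except requests.RequestException:
        # Recommendation service is down or slow: fall back to a static
        # list so the rest of the application keeps working.
        return ["best-sellers"]
```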

Microservices also enable faster innovation, development and deployment. Developers have more freedom, because code for different services can be written in different languages. It allows for easy integration and automatic deployment of services and accelerated time-to-market for new services and products, or the ability to easily make enhancements to existing ones.

Microservices are a key enabler in the world of cloud and digital. But it’s not just about developers, code and technology. As with the cloud, it’s not simply a deployment model change – it’s a mindset and cultural change. And that’s what true digitisation provides – while inspiring new ways to work.


September 21, 2018  3:28 PM

It’s time to tackle your dirty data

Brian McKenna

This is a guest blogpost by Francois Ajenstat, Chief Product Officer, Tableau

Data analysis is only as good as the quality of your data. Anyone who has ever analysed data knows that the information is rarely clean. You may find that your data is poorly structured, full of inaccuracies, or just plain incomplete. You find yourself stuck fixing the data in Excel or writing complex calculations before you can answer even the simplest of questions.

Data preparation is the process of getting data ready for analysis, including combining data sets, transforming and filtering, and it is a crucial part of the analytics workflow. People spend 80% of their time prepping data and only 20% of their time analysing it, according to a recent article from Harvard Business Review.

Most analysts in an organisation feel the pain of dirty data. The amount of time and energy it takes to go from disjointed data to actionable insights leads to inefficient/incomplete analyses and declining trust in data and decisions. These slower processes can ultimately lead to missed opportunities and lost revenue.

Organisations spend a lot of time building curated data sets and data warehouses to support the needs of the business. But even with these practices, it is likely that some level of dirty data will seep through the cracks of day-to-day operations. So, how can we solve the common data prep issues?

Issue #1: More data creates more problems

We live in a world where a variety of data is constantly being generated. With this windfall of data now flowing faster than many business processes, organisations are struggling to keep up. We also can’t be certain about how this information will be used in the future.

Solution: Enable self-service data preparation

Visual, self-service data prep tools allow analysts to dig deeper into the data to understand its structure and see relationships between datasets. This enables the user to easily spot unexpected values that need cleaning. Although this technology brings clarity to the data, people will still need support to understand details like field definitions.
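The snippet below is a small pandas illustration of the kind of inspection such tools automate: profiling a column to surface unexpected values, then normalising them. The column names and values are invented for the example.

```python
# A small pandas sketch of the kind of inspection a visual prep tool
# automates: profile a column, spot unexpected values, and clean them.
import pandas as pd

orders = pd.DataFrame({
    "region": ["North", "north ", "South", "N/A", "East"],
    "revenue": ["1200", "950", None, "800", "1,100"],
})

# Profile: how many distinct spellings does 'region' really have?
print(orders["region"].value_counts(dropna=False))

# Clean: normalise casing/whitespace and coerce revenue to numbers
orders["region"] = orders["region"].str.strip().str.title().replace({"N/A": None})
orders["revenue"] = pd.to_numeric(
    orders["revenue"].str.replace(",", ""), errors="coerce"
)
print(orders)
```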

Issue #2: Dirty data requires a waterfall approach

Analysts report that the majority of their job is not analysis, but cleaning and reshaping data. Every time new data is received, analysts need to repeat manual data preparation tasks to adjust the structure and clean the data for analysis. This ultimately leads to wasted resources and an increased risk of human error.

Solution: Develop agile processes with the right tools to support them

Every organisation has specific needs and there is no ‘one-size-fits-all’ approach to data preparation, but when selecting a self-service data preparation tool, organisations should consider how the tool will evolve processes towards an iterative, agile approach instead of creating new barriers to entry.
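One low-tech version of that iterative approach is simply to capture the cleaning rules as code that can be re-run on every new extract, rather than repeated by hand each time. The sketch below assumes hypothetical column names.

```python
# One way to make preparation repeatable rather than manual: capture the
# cleaning steps as a function and re-run it on every new extract.
import pandas as pd

def prepare(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply the same cleaning rules to every batch of incoming data."""
    df = raw.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df = df.dropna(subset=["order_date"])
    return df

# Each new delivery goes through the identical, versionable pipeline:
# clean = prepare(pd.read_csv("latest_extract.csv"))
```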

Issue #3: Data preparation requires deep knowledge of organisational data

Before preparing data, it is crucial to understand its location, structure, and composition, along with granular details like field definitions. This discovery work is a fundamental element of data preparation. You would not start a long journey without a basic understanding of where you’re going, and the same logic applies to data prep.

But often, because of information silos, users have less insight into the entire data landscape of their organisation—what data exists, where it lives, and how it is defined. Confusion around data definitions can hinder analysis or worse, lead to inaccurate analyses across the company.

Self-service data preparation tools put the power in the hands of the people who know the data best—democratising the data prep process and reducing the burden on IT.

Solution: Develop a data dictionary

One way to standardise data definitions across a company is to create a data dictionary. A data dictionary helps analysts understand how terms are used within each business application, showing which fields are relevant for analysis versus those that are strictly system-based.
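A data dictionary does not have to start as anything elaborate; the sketch below shows the minimum useful shape of an entry, with invented fields and definitions.

```python
# A minimal illustration of a data dictionary entry: each field records
# where it comes from, what it means, and whether analysts should use it.
# Field names and definitions are invented for the example.
data_dictionary = {
    "customer_id": {
        "source": "CRM",
        "definition": "Unique identifier assigned at account creation",
        "analysis_relevant": True,
    },
    "row_version": {
        "source": "CRM",
        "definition": "Internal optimistic-locking counter",
        "analysis_relevant": False,  # strictly system-based
    },
}

for field, meta in data_dictionary.items():
    flag = "analyse" if meta["analysis_relevant"] else "system-only"
    print(f"{field}: {meta['definition']} [{flag}]")
```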

Developing a data dictionary is no small task. Data stewards and subject matter experts need to commit to ongoing iteration, checking in as requirements change. If a dictionary is out of date, it can actually do harm to your data strategy.

Issue #4: “Clean data” is a matter of perspective

Different teams have different requirements and preferences regarding what makes “well-structured” data. For example, database administrators and data engineers prioritise how data is stored and accessed—and columns may be added that are strictly for databases to leverage, not humans.

If the information that data analysts need is not already in the data set, they may need to adjust aggregations or bring in outside sources. This can lead to silos or inaccuracies in the data.

Solution: Put the power in the hands of the data experts

Self-service data prep gives analysts the power to polish data sets in a way that matches their analysis, leading to faster, ad-hoc analyses and allowing them to answer questions as they appear. It also reduces the burden on IT to restructure the data whenever an unanticipated question arises. This can also reduce the amount of duplicated efforts because other analysts can reuse these models. If the datasets are valuable on a wide scale, you can combine them into a canonical set in the future.

Issue #5: The hidden reality of data silos

Self-service business intelligence tools have opened up data analysis capabilities to every level of user, but to get insights into their data, users still need to wait and rely on IT for well-structured data. If people are not willing to wait, they will simply extract data into Excel. This often leads to data littered with unvetted calculation errors and, eventually, inconsistent analysis. Repeating this process leads to an abundance of data silos, which are not efficient, scalable or governed.

Solution: Create consistency and collaboration within the data prep process

Combatting silos starts with collaboration. Scheduling regular check-ins or a standardised workflow for questions allows engineers to share the most up-to-date way to query and work with valid data, while empowering analysts to prepare data faster and with greater confidence.

Companies should enable employees to do the prep themselves in an easy-to-use tool and to share the results with others. Organisations can then see what data employees use and where it is duplicated, so they can create processes that drive consistency.


August 15, 2018  11:15 AM

Why Graph Needs an SQL

Brian McKenna

This is a guest blogpost by Neo4j co-founder and CEO Emil Eifrem, who explains why his sector needs its own ‘General Query Language’.

In the same way that SQL helped facilitate the growth and subsequent domination of RDBMS over the database market for the last two decades, to secure its next stage of enterprise acceptance, graph software needs to create a standard property graph query language.

Why? Because, put at its simplest, a common query language could open the door to graph technology being used more widely, in exactly the same way the lingua franca of SQL propelled relational databases to dominance 30 years ago.

The main reasons are detailed in an open letter we at Neo4j are publishing for the entire database world, the GQL Manifesto, and I’d like to summarise them here. Basically, there are two groups that need to come together to achieve a common graph industry language: industry, such as Neo4j with Cypher and Oracle with the PGQL language, and academia. The property graph researchers in academia have moved on from query languages like XPath for querying documents or SPARQL and RDF, and are now interested in the wider property graph context.

A next-generation graph query language

Over the last 18 months there’s been a lot of behind-the-scenes activity on unification that led to the Manifesto and there’s a lot more commonality we can build on. Oracle has achieved some great innovations with PGQL, and PGQL is very close to Cypher, with the two languages coming from the same fundamental design approach. That means that you’ve got the primary designers of Cypher and PGQL already collaborating with each other (for example, on limited graph querying extensions to SQL) and with researchers on what a good, next generation graph querying language needs to look like.

So what we need to do is to take all that intellectual ferment and energy and turn it into actuality. There are two key ways to do this. One is making it even easier to express complex patterns in a very concise way in a graph database – regular path queries, and regular expressions, which can be used to define the pattern of things you want to look at. Second, there’s composable graph querying, where you glue together queries and one query can feed off the output of another, using that output as its input.
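To make those two ideas concrete, here is a hedged sketch in Cypher, run through the official Neo4j Python driver against a hypothetical social graph: a variable-length path pattern expresses "up to three hops" concisely, and a WITH clause feeds the output of one match into a second query. The connection details, labels and properties are assumptions for illustration, not part of any proposed GQL standard.

```python
# A sketch of (1) concise path patterns and (2) query composition,
# using Cypher via the official Neo4j Python driver. The URI, credentials
# and graph model (:Person, :Community) are assumptions for the example.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

QUERY = """
MATCH (a:Person {name: $name})-[:KNOWS*1..3]->(b:Person)   // variable-length path
WITH DISTINCT b                                            // output feeds the next part
MATCH (b)-[:MEMBER_OF]->(c:Community)
RETURN c.name AS community, count(b) AS reachable_members
ORDER BY reachable_members DESC
"""

with driver.session() as session:
    for record in session.run(QUERY, name="Alice"):
        print(record["community"], record["reachable_members"])

driver.close()
```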

Put that together and we have a GQL at least as good as SQL’s first implementation – and we need to find a way to make that happen. Many graph database practitioners have, I believe, come to the conclusion that these different languages should come together and be the inputs to a new unified language for the whole market.

Hence the manifesto campaign – a joint, open, collaborative initiative where we want everyone to make a contribution, including developers, managers and line-of-business users looking for the kind of innovation and creative data modelling you can only get from this form of database technology.

If you agree with me and graph database advocates like us that the time is right to create one standard property graph query language, then we would love to receive your vote. Find out more at The GQL Manifesto main website [https://gql.today].

We hope you agree – and we can add your voice to the discussion.

The author is co-founder and CEO of Neo4j (http://neo4j.com/).


August 14, 2018  10:36 AM

How automation can help create value, the 5th ‘v’ of data

Brian McKenna

This is a guest blogpost by Neil Barton, CTO, WhereScape

For CDOs, defining the data and analytics strategy for their organization is increasingly the most important aspect of their job, with 86% ranking it their top responsibility, according to the 2017 Gartner annual Chief Data Officer Survey. The four “Vs” of data are well known – volume, velocity, variety and veracity – yet data warehousing infrastructure in many organizations is no longer equipped to handle them. The fifth “V” – value – is even more elusive. Meeting these challenges at the scale of data that modern organizations hold requires a new approach, and automation is the bedrock.

It’s all about finding methods of using data for value creation and revenue generation, which occupies 45% of a CDO’s time. This means harnessing the growing beast that is data in a way that is practical, manageable and useful.  That’s where the data warehouse comes in, providing a centralized space for enterprise data that business users, including the CDO, can use to derive insights.

Creating a successful data warehouse, then, is critical for CDOs to succeed in monetizing data within their organization. However, the traditional waterfall approach to data warehousing, first introduced in the 1970s, delivers only a fraction of the value that it could potentially offer. Instead, the approach needs to evolve to become more responsive as organizational needs change, addressing new data sources and adapting to business demand.  Using automation software to design, develop, deploy and operate data warehouses is providing far-reaching value to business leaders. This change is positioning IT to incorporate timely technologies and new data sources more easily, as well as flex when business needs demand it.

The CDO’s role in re-invigorating the data warehouse

As the central storage point for enterprise data, the data warehouse is invaluable for providing business users with information. However, as users become more aware of the potential benefits of data-driven decision making, the gap between user expectations and the data warehouse’s ability to provide current, consumable data in a timely manner has grown. Businesses want insights from data far faster than previously. This is exacerbated with the growth of new forms of data, particularly semi-structured or un-structured information, including social media data, sensor data, real-time messages and client communications, and video/audio files.

Traditionally, data warehouse development and evolution required long-cycle IT projects, incongruent with the needs of a more agile project design and build environment. As the change agents for digital transformation, CDOs must lead in the re-architecting of data warehouses, from artistry to acceleration and automation in order to improve business time-to-value.

Practical steps for the successful implementation of an automated data warehouse

As IT departments are expected to do much more with much less, processes need to change. Data warehouses can no longer be created “artisanally” – rather than crafting a complex infrastructure with unique configurations and a long lifespan, IT teams need to focus on producing an adaptable decision support infrastructure. This means creating a data warehouse that can transform, with ease, as business needs change. Here are four steps for CDOs to help their company achieve this:

  1. Understand the desired outcomes

Before making any decisions as to the future of your data warehouse infrastructure, CDOs need to ensure they understand the specific challenges the business teams are facing where data could help. In essence, the data warehouse automation and modernization program needs to be built around enabling decision-making that will lead to differentiation in the market place.

According to a TDWI survey, re-alignment with business objectives is the top reason for data warehouse modernization, selected by 39% of respondents. By enabling collaboration between business teams and IT teams, the CDO helps chart the course for how business goals and technology meet. In turn, this will lead to overall business transformation, accelerated through the new data warehouse’s approach to data-driven decisions.

  2. Understand what you have already

Most organizations already have sophisticated data management tools deployed as part of their infrastructure – however these may not be working to the fullest of their abilities. Organizations already using SQL Server, Oracle, or Teradata, for example, have a range of data management and data movement tools, already within their IT real estate, which can be automated and leveraged more effectively as part of a data warehouse automation push.

However, in that inventorying process, CDOs should ensure they have considered the capacity requirements of their data warehouse. Data is expected to continue growing exponentially, so while the data warehouse may be fit for purpose today, it’s important that the automation processes, storage requirements and general infrastructure are of a speed and standard capable of handling this in the future too.

As part of this, data warehouse automation needs to integrate with the business as it currently is and will realistically be in the future, rather than the business as the IT teams wish it might be in an ideal world. CDOs need to encourage their teams to understand the data that is available, and the automated analytics and evaluation processes which can be used to meet specific business priorities. To support this, the data warehouse automation strategy needs to be designed not just for an ideal set up of data, expertly managed and curated, but for the realistic “messiness” of the business data landscape.

  3. Automate efficiently

Data warehouse automation, as with any other large-scale transformation project, requires resources – and these are often scarce due to strict budgets and competing priorities. This means that CDOs need to think long and hard about what should actually be automated in order to free up man hours in the future. Good candidates are tasks that are currently done by hand: coding SQL, writing scripts or manually managing metadata. All of these are systematic processes, where data warehouse automation can either eliminate the need for human involvement or dramatically accelerate the work.
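As a simple illustration of replacing hand-coded SQL, the sketch below generates staging DDL from a metadata description of the source tables. The table and column names are invented; real data warehouse automation tools drive this from the source system catalogue rather than a hard-coded dictionary.

```python
# A small sketch of metadata-driven automation: generating repetitive
# staging SQL from a description of the source tables instead of
# hand-coding it. The metadata below is invented for illustration.
TABLES = {
    "customers": ["customer_id", "name", "country"],
    "orders": ["order_id", "customer_id", "order_date", "amount"],
}

def staging_ddl(table, columns):
    cols = ",\n  ".join(f"{c} VARCHAR(255)" for c in columns)
    return (
        f"CREATE TABLE stg_{table} (\n  {cols},\n"
        f"  load_ts TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n);"
    )

for table, columns in TABLES.items():
    print(staging_ddl(table, columns))
```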

  4. Embrace change

CDOs should look at data warehouse modernization and automation as an avenue of constant, on-going development. As business needs change and new data sources emerge, CDOs need to be able to re-strategize different parts of the infrastructure to match. Similarly, to minimize disruption and ease the transition for business users, CDOs should look to take a staged approach to the initial automation and modernization process, with a set schedule of when different requirements will be met. Post-production change is inevitable due to evolving business needs, new technologies used and continuous improvement desired. Change needs to be planned for.

At the same time CDOs need to prepare for the human change that automation will create. In business teams, users can be re-deployed to double down on analyzing business intelligence and translating insight into business value. In the IT teams, automation provides new capacity to plan for the future – looking at new analytics tools, or planning for smarter, better ways to deliver on business priorities further down the line.

A data warehouse automation mentality

Data warehouse automation is not solely software you buy. It’s a philosophy and culture you implement. Tools and technologies form the bedrock of the processes, but a data warehouse strategy requires strong leadership, a transparent process, and an unrelenting focus on the business’s end goals in order to succeed.

Without robust data warehouse automation, businesses will struggle to capitalize on the potential of data and its associated technologies. As the strategic lead for data-driven transformation, and the change agents across both business and IT teams, the responsibility falls to CDOs. Professionals in this role need to understand, strategize, and execute on the way that large-scale data usage will influence future business decisions. The adaptability of the supporting data infrastructure can either be a CDO’s greatest weakness or greatest asset. Use the four steps covered in this guide to ensure it is the latter, and to achieve the ultimate goal of any business investment – value.


July 17, 2018  3:31 PM

Era of the Smart Factory: how can manufacturers get to the future quicker?

Brian McKenna

This is a guest blogpost by Antony Bourne, Industries President at IFS

Businesses are constantly told they will go bust if they do not digitally transform, and quickly, but for manufacturers it’s not quite that simple. If production goes down even for a day, the impact can be irreversible, with millions in revenue potentially lost.

From an ergonomic, economic, and environmental perspective, smart factories – where robots and machines relay real-time data from connected devices to learn and adapt to new demands and autonomously manage entire production processes – are heralded as the way manufacturers can strive towards true digital transformation.

However, for many the smart factory is a faraway utopia. According to a recent Capgemini study, while 76% of manufacturers have an ongoing smart factory initiative or are working towards one, only 14% are satisfied with their level of success. However, manufacturers should not feel overwhelmed, and those embarking on their journey should not change too much too soon or do so without the proper counsel.

Here are some essentials for any manufacturer considering a smart factory rollout.

  1. Business first approach

Business owners, project managers and directors should want to roll out a smart factory initiative not to tell customers or the board that they are ‘digitally transforming’, but to achieve better business results: future-proofing the business and deriving greater value from production plants and the entire ecosystem. Importantly, a smart factory should form part of a wider interconnected IT landscape and be a holistic endeavour.

One business goal for any manufacturer should be achieving full visibility across all assets. As we’ve witnessed recently in the construction industry with the collapse of major contractor Carillion, achieving visibility across a business is paramount. This means manufacturers need to become even more data-driven. Data is the lifeblood of a business, and the more real-time data a manufacturer can draw value from, the more likely they are to achieve full transparency across their estate.

Smart factories give manufacturers the ability to benefit from the accumulation of massive amounts of valuable data. This enables them to identify and address quality issues; rapidly learn and adapt to new demands; predict operational inefficiencies or oscillations in sourcing and demand, and respond to customers’ needs ahead of time.

  2. Connecting physical & digital technologies

Some manufacturers have had mechanical systems in place for over 50 years. With the emergence of technologies such as Robotic Process Automation (RPA) that present the opportunity to drive transformation in the sector, it is important that manufacturers take advantage and replace outdated machinery when possible, not just update it.

Manufacturers must also move their datacentre to the cloud. Escaping legacy systems enables manufacturers to scale horizontally and vertically quickly, as well as reduce costs, boost efficiency and lower time to recovery. Running off a cloud infrastructure is key to maximising the value and ROI of real-time data that a smart factory produces. For manufacturers yet to make the leap, moving disaster recovery and backups to the cloud is a good place to start.

One technology essential to any smart factory, sitting between physical and digital infrastructure, is the Internet of Things (IoT). The benefits of the Industrial Internet of Things (IIoT) have been widely espoused. IoT devices derive valuable data from sensors outside the manufacturing plant and from machines on the shop floor, but, importantly, they also enable businesses to expand their operations and offer new services to customers. Servitization is upon us, and manufacturers not focused on adding services onto their products to make them more outcome-based with the help of IoT solutions will miss out on valuable business opportunities, or worse.
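A minimal sketch of what that shop-floor telemetry might look like, assuming a hypothetical gateway endpoint and payload: a machine posts periodic temperature readings that maintenance, analytics and servitized billing services can then consume.

```python
# An illustrative sketch of shop-floor telemetry feeding the wider
# business: a machine posts temperature readings to a collector that
# downstream services can consume. The collector URL, machine ID and
# payload shape are assumptions for the example.
import json
import time
import random
import requests

COLLECTOR_URL = "http://iot-gateway.local/telemetry"  # hypothetical endpoint

while True:
    reading = {
        "machine_id": "press-07",
        "temperature_c": round(random.gauss(72, 3), 1),
        "timestamp": time.time(),
    }
    requests.post(
        COLLECTOR_URL,
        data=json.dumps(reading),
        headers={"Content-Type": "application/json"},
        timeout=2,
    )
    time.sleep(5)  # publish a reading every few seconds
```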

  3. Focus on people and partnerships

The face of the shop floor will change unimaginably over the coming years. Robots will work autonomously 24/7 in a windowless, lightless environment, but despite this there is one aspect that won’t change: people will still be a manufacturer’s most important asset. Reskilling and upskilling employees, opening their eyes to the art of the possible and getting them on board with change – a change that will likely reshape their purpose in the workplace – is a massive but necessary undertaking.

Manufacturers also need to seek expert advice, or partner with technology providers knowledgeable and proven in the manufacturing arena that can help them realise better business benefits.

  4. Start small and scale

Smart factory investments need to be broken down into bite-sized chunks, with specific opportunities realised first. Value and growth can be created through scaling a single asset and testing the processes and technologies around it. There are some incredible centres in the UK that can help manufacturers test these concepts in a safe, manageable environment.

Take Catapult. Catapult is a network of physical centres designed to help organisations transform ideas into new products and services. Manufacturers can take advantage of initiatives like this to safely test assets with zero downtime in production. Once perfected, the solution can then scale to other assets of the business, such as production lines and factories, with complete confidence in performance.

The true power of the smart factory is its ability to adapt and grow with the shifting needs of an organisation. Manufacturers need to take one step at a time in getting to the future, and recognise there is the right expertise out there to help them achieve their business goals, by implementing a harmonious, automated, and resilient smart factory.


June 21, 2018  10:35 AM

The external workforce: are you realising its full value?

Brian McKenna

This is a guest blogpost by Mikael Lindmark, a senior vice president at SAP Fieldglass

As CIOs and other executives across enterprises turn to contractors, freelancers and other external workers to fill skills gaps and meet their strategic objectives, they are transforming the way work gets done. However, many organisations lack the management rigour and C-suite attention to reap the full value of their external workforce.

Multi-channel workforce moves to core of business operations

A recent survey we at Fieldglass completed in collaboration with Oxford Economics signaled that UK businesses rely heavily on the external workforce, which includes non-payroll workers and services providers. In fact, nearly nine out of 10 senior executives in the UK (87%) who participated in the survey say the external workforce is essential for them to operate at full capacity and meet market demands—much greater than the global average (65%). Furthermore, this rising multi-channel workforce of contingent workers, independent contractors, alumni (i.e. a business’ former employees) and services providers now accounts for nearly half of workforce spend in the UK (46%, slightly higher than the 44% global average).

When it comes to labour market trends, nearly three-quarters (74%) of UK executives say finding high-quality resources at the right time and in the right place is very or extremely challenging, and in just three years, 87% say the external workforce will be critical for sourcing skills that are in scarce supply. Shortfalls in newer technologies, such as artificial intelligence, machine learning, blockchain, automation, cloud and the Internet of Things, are particularly severe in the UK, where 29% of executives, compared to 13% globally, call this a “major shortfall.”

Cost is not the whole story

Increasingly external workers operate at the core of businesses. Not that long ago, companies discussed how and where to use external workers based on an assessment of what was core or non-core to the business as they sought to lower operational costs. Now, nearly half (45%) of executives believe they could not conduct business as usual without their external workforce. External workers are filling roles across enterprises, and contributing to improvements in businesses’ overall financial performance, according to 55% of respondents in the UK.

In addition to making it possible for businesses to operate at capacity and meet demand, UK executives have a greater tendency than the global average (81% as compared to 62%) to credit the external workforce with being important or extremely important to achieve sustainability goals/shrink their carbon footprint. UK executives also look to their external workforce to increase organisational agility (71%), reduce risk (71%), increase speed to market (68%) and develop or improve products and services (61%). Managing costs remains important (77% compared to 60% globally).

Looking beyond traditional metrics, 71% said their external workforce raises the bar for their employees by bringing in new sources of skills and talent.

Managing and extracting value from the external workforce

Visibility provides a basis for measuring and managing results, yet many businesses lack an adequate line of sight into their external workforce. Management of this workforce segment has not fully matured at most businesses. Consequently, companies are not realizing the most value from this important asset.

Around one in 10 companies that responded to our global survey demonstrate markedly superior performance in managing and extracting value from the external workforce—the “Pace setters.”  They are “highly informed” about five of eight areas that executives consider “very important” for managing non-payroll workers and services providers. For example: What are they doing? Are they being paid negotiated rates? What are the contract terms? What is the duration of their work?  Do they have the required licenses and certifications? What access do they have to systems, confidential information and facilities?

Pace setters also find it easier to manage many, though not all, aspects of their external workforce. For instance, they have greater confidence in their ability to find high-quality talent at the right time, place and rate. They find it easier to avoid paying duplicate charges and unauthorised expenditures; track who performed well and who should not be re-engaged; and stay compliant with local tax and labour laws, regulatory requirements and candidate privacy requirements.

Pace setters also say their external workforce helps them improve their business’ overall financial/business performance and/or helps them compete in today’s digital world.

The Pace setters show how companies could benefit from better visibility and management of the external workforce. Set goals for your company to emulate or surpass the Pace setters as you establish the management rigor required to harness external resources to achieve better business results.

Mikael Lindmark, a senior vice president at SAP Fieldglass, is responsible for overall operations across Europe. Mikael, who joined SAP Fieldglass more than a decade ago, has more than 20 years of experience in human capital development and management, and is based in the UK.


June 15, 2018  12:51 PM

Can good data approaches help uncover bad data social manipulation?

Brian McKenna

Software definitely helped cause the pernicious problem of fake news – but could also ameliorate it, says data expert Emil Eifrem, in a guest blogpost.

Time was it was the tabloid newspapers that cornered the market when it came to fake news (remember ‘Freddy Starr Ate My Hamster’?). These days it’s a whole industry in itself – with the technology in the hands of people who don’t even pretend to be objective journalists.

In fact, traditional journalism’s decline has meant the loss of checks and balances, allowing fake news a free and disruptive reign. Technology advances also support the ways misinformation is spawned and travels so rapidly, as social media is, after all, about active sharing. According to one of the biggest studies to date on the problem, conducted by researchers at MIT and published in March 2018 in the journal Science, the truth takes six times longer than misinformation to reach people on Twitter.

Researchers tell us lies are 70% more likely to be retweeted than the truth — even when they controlled for factors such as whether the account was verified or not, the number of followers, and how old the account was. This is not good news for society, democracy and human knowledge, one can argue.

Interestingly, while technology certainly is an enabler of fake news, it may also be the answer to helping combat it. Specifically, graph database technology, a powerful way of recognising and leveraging connections in large amounts of data, may offer some hope of salvation.

Indeed, graph software is already used by legitimate investigative journalists: it helped the International Consortium of Investigative Journalists track its way through the terabytes of data known as the Panama and Paradise Papers, for instance. But graph software also turns out to be a potential way to combat fake news.

Visualising patterns that indicate fake content

It’s reported that Russia used social media in a bid to influence the 2016 US presidential election – and with graph technology’s help, the US’s NBC News has uncovered the mechanism of how that was achieved.

That’s because NBC’s researchers found that the key to detecting fake news is connections – between accounts, posts, flags and websites. By visualising those connections as a graph, we can understand patterns that indicate fake content. The group behind the Russian trolling was small, but very effective, working to leveraging Twitter with popular hashtags and posting reply Tweets to popular accounts to gain traction and followers. In one account, for example, of 9,000 Tweets sent, only 21 were actually original, and overall 75% of all the material were re-tweets, specifically designed to broadcast the messages to as wide an audience as possible. While some accounts posed as real-world citizens, others took on the guise of local media outlets and political parties.

When graph software was used to analyse the retweet network, it revealed three distinct groups – one tweeting mainly about right-wing politics, a second group with left leanings, and a final group covering topics related to the Black Lives Matter movement – in an invidious but effective triangulation of extreme content sharing and emotional manipulation.
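As an illustration of the approach (not NBC’s actual analysis), the sketch below queries a hypothetical retweet graph through the Neo4j Python driver, flagging accounts whose output is overwhelmingly retweets: the amplification pattern described above. The graph model and connection details are assumptions for the example.

```python
# A hedged sketch against a hypothetical graph model of
# (:Account)-[:POSTED]->(:Tweet) and (:Tweet)-[:RETWEET_OF]->(:Tweet).
# It flags accounts whose output is overwhelmingly retweets, since the
# amplification pattern rather than any single post gives the network away.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

QUERY = """
MATCH (a:Account)-[:POSTED]->(t:Tweet)
OPTIONAL MATCH (t)-[r:RETWEET_OF]->(:Tweet)
WITH a, count(DISTINCT t) AS total, count(r) AS retweets
WHERE total > 100 AND toFloat(retweets) / total > 0.75
RETURN a.handle AS account, total, retweets
ORDER BY retweets DESC
"""

with driver.session() as session:
    for record in session.run(QUERY):
        print(record["account"], record["retweets"], "of", record["total"])

driver.close()
```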

At internet scale, fake news is just too hard to spot without the right tools. We should look to graph technology, specifically designed to expose connections in data, as a possible way of helping to address the issue.
The author is co-founder and CEO of Neo4j, the world’s leading graph database (http://neo4j.com/)


June 14, 2018  1:19 PM

Challenges and benefits of the Cloud

Brian McKenna

This is a guest blog by Claudia Imhoff, CEO of Intelligent Solutions and founder of the Boulder BI Brain Trust

Like any new initiative, there are both challenges and benefits to weigh when deciding whether cloud computing is suitable for your company’s analytic environment. Let’s start with understanding the challenges.

IT governance and control – IT departments are still leery of letting go of their data. There are many reasons, but job loss and concerns about data security and privacy rank high on the list. IT is generally responsible for ensuring corporate data assets are implemented and used according to agreed-upon corporate policies and procedures. This means that service level agreements between the company’s IT department and the cloud provider are critical to ensure acceptable standards, policies and procedures are upheld. IT personnel may also want insight into how the data is obtained, stored, and accessed by business personnel. Finally, it is recommended that IT determine whether these cloud-deployed assets are supporting your organisation’s strategy and business goals.

Changes to IT workflows – IT workflows dealing with compliance and security become more complicated in hybrid environments (those consisting of both on-premises and cloud deployments). The workflows must take into consideration the need of advanced analysts and data scientists to combine data that is on-premises with data in various cloud computing sites. Keeping track of where the data resides can be quite difficult if good documentation and lineage reports are not available.

Managing multiple cloud deployments – Often, companies have more than one cloud computing implementation; they may use a mix of both private and public deployments – maybe even multiple ones in each category. The company must determine if each cloud provider is in compliance with regulatory requirements. Also, when considering your cloud provider(s), determine how security breaches are prevented or detected.  If data security concerns are great, it may make sense for the corporation to maintain highly sensitive data (like customer social security numbers, medical health records, etc.) within their premises rather than deploying them to cloud computing.

Managing costs – The on-demand and scalable nature of cloud computing services can make it difficult to determine and predict all the associated costs. Different cloud computing companies have different cost plans. Some charge by volume of data stored, others by the number of active users, and others still by cluster size. Some have a mixture of all three. Be sure to watch out for hidden costs like requested customizations, database changes, etc.
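A quick back-of-the-envelope sketch of why those pricing models matter: the same hypothetical workload costed under volume-based, per-user and cluster-based plans. Every rate below is invented purely for illustration.

```python
# The same workload priced under three common (hypothetical) cloud
# charging models. All rates and the workload itself are invented,
# purely to show how the choice of plan changes the monthly bill.
WORKLOAD = {"stored_tb": 40, "active_users": 120, "cluster_nodes": 8}

def by_volume(w):
    return w["stored_tb"] * 23.0        # assumed $/TB-month

def by_users(w):
    return w["active_users"] * 35.0     # assumed $/user-month

def by_cluster(w):
    return w["cluster_nodes"] * 700.0   # assumed $/node-month

for name, plan in [("volume", by_volume), ("per-user", by_users),
                   ("cluster", by_cluster)]:
    print(f"{name:>8}: ${plan(WORKLOAD):,.0f}/month")
```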

Performance – It is clear that if your provider is down, so are you. All you can do is wait for the provider to come back up. A second concern is your internet bandwidth. A slow internet means slow connectivity.

Now let’s turn to the many benefits of migrating to a cloud computing environment:

Lowered operating costs – This is perhaps the first benefit that companies realize when considering a move to the cloud. There is a significant difference between capital expenses and operating expenses. Basically, you are “renting” the infrastructure rather than bearing the costs upfront of building your own environment. The cloud computing provider bears all the system and equipment costs, the costs of upgrades, new hardware and software, as well as the personnel and energy costs.

No maintenance or upgrade hassles – These are again the headaches for the cloud computing provider. This frees up all resources to have a laser focus on obtaining, accessing, and using the data, not on managing the infrastructure.

Ease of implementation – For most companies, purchasing a cloud computing environment is as easy as swiping your credit card. It takes only minutes to access the environment because the technological infrastructure is all ready to go. This must be differentiated from the data infrastructure that must also be established. Whether you implement a data lake, a data vault, or a data warehouse, design and development work must be performed in addition to the technological set up.

Innovation from new cloud companies – Cloud technologies have been “born” from very innovative new companies. They make full use of all the advantages that the cloud has to offer. These technology companies can also add new features, functions, and capabilities, making them available to all customers immediately.

Elastic scalability – Many customers say this is the most appealing attribute of cloud computing. You can quickly scale up and down based on real needs. There is no need to buy extra computing capacity “just in case” you may need it at a later date. Cloud data warehouses can increase or decrease storage, users, clusters with little or no disruption to the overall environment.

Ability to handle the vast diversity of data available for analytics – Cloud computing providers can handle both well-structured data (like from operational systems) as well as the “unusual” data so popular today (like social media, IoT, or sensor data). Cloud implementations can support both fixed schemas and dynamic ones, making it perfect for routine production analytics like Key Performance Indicators or financial analyses as well as unplanned, experimental, or exploratory analyses so popular with data scientists.

Taking the time to identify both the challenges and benefits associated with the cloud is the first step in evaluating whether a move to the cloud is right for your organisation.


May 16, 2018  5:20 PM

What would Karl Marx make of Silicon Valley?

Brian McKenna

It’s the bearded one’s 200th birthday this year. Much of the mainstream media has marked the occasion, including the Financial Times, that pre-eminent “capitalist” newspaper.

Why should Computer Weekly be any different? After all, one of our more distinguished alumni is Paul Mason, these days a public intellectual in the classical Marx tradition.

Mason’s book PostCapitalism (2015) contains an intriguing chapter in which he expands on a neglected text of Marx (the “Fragment on Machines”, published in English in 1973, in that massive tome, the Grundrisse) that seems to predict today’s information economy. Marx figures here as a “prophet of postcapitalism”, according to Mason:

“The productive power of machines like the ‘self-acting’ cotton-spinning machine, the telegraph and the steam locomotive was ‘out of all proportion to the direct labour time spent on their production, but depends rather on the general state of science and on the progress of technology, or the application of this science to production’.

“Organization and knowledge, in other words, made a bigger contribution to productive power than the labour of making and running the machines.”

Gartner’s Frank Buytendijk is another who finds value in Marx when it comes to reading today’s IT-saturated economy. In his book Socrates Reloaded: the case for ethics in business and technology (2012), he writes:

“Marx would have been the first to say [of Facebook, Google et al.] that all the ingredients of a revolt [by internet users] are there. What could happen to Internet giants if they shoot themselves in the feet by pushing their data collection and analyses too far?”

Buytendijk argues that Google and Facebook are as hungry for data as Victorian capitalists were for capital. They want to collect as much data as they can for the benefit of advertisers, not for the producers of that data. People alienate their data in the way Marx says we alienate our labour power. Moreover, our love of Google and Facebook’s “free” services is the counterpart of the religious “opium of the people”, in his view.

But – in a twist of the dialectic – in the networked world we can “leverage the same community-based business model of the Internet giants to overthrow them”. Go somewhere else and they crumble.

What would Marx, the nineteenth-century economist, make of the particular, venture-capital fuelled economy of Silicon Valley? Would he throw up his hands in horror? More likely, he’d analyse it; most probably at tedious length. And he’d probably applaud its dynamism, just as he hailed the dynamism of industrial capitalism in the Communist Manifesto of 1848.

But surely the hyper-individualist political and social culture of Silicon Valley would be anathema to this enemy of individualism and promoter of the collective?

Well, maybe. However, Marx’s Economic and Philosophical Manuscripts of 1844 would seem to demonstrate an advocacy of all humans realising their individual powers, their “species-being”, once economic scarcity has been consigned to the past. It is just that they can only do so through other beings. You can’t be an individual on your own, seems to be the paradoxical gist.

Was Marx right?

The cultural theorist Terry Eagleton makes this argument in his book Why Marx was right (2011):

“For Marx, we are equipped by our material natures [as labouring, linguistic, desiring creatures] with certain powers and capacities. And we are at our most human when we are free to realise these powers as an end in themselves, rather than for any purely utilitarian purpose”.

And Eagleton contends elsewhere in the same text:

“Marx was an implacable opponent of the state. In fact, he famously looked forward to a time when it would wither away. His critics might find this hope absurdly utopian, but they cannot convict him at the same time of a zeal for despotic government”.

Even so, the movers and shakers of Silicon Valley are famously more indebted to the libertarian thinker Ayn Rand (exiled by the Russian Revolution) than Marx.

And yet Marx figures more prominently than does Rand in Silicon Valley luminary Peter Thiel’s book Zero to One (2014). Here is Thiel:

“As Karl Marx and Friedrich Engels saw clearly, the 19th-century business class ‘created  more massive and more colossal productive forces than all preceding generations together. Subjection of Nature’s forces to man, machinery, application of chemistry to industry and agriculture, steam navigation, railways, electric telegraphs …. what earlier century had even a presentiment that such productive force slumbered in the lap of social labour?’”

Among his many ventures, Thiel is co-founder and chair of Palantir Technologies, a big data analysis company whose CEO, Alex Karp, wrote his PhD in dialogue with the Frankfurt School tradition of Theodor Adorno and Jürgen Habermas.

Not poles apart, then, Marx and today’s laboratory of the future on the West coast of the US (and its clones elsewhere)? (It’s moot).

But would he fit into the geek house in Silicon Valley, the HBO comedy? Given his roistering fondness for pub crawls in Soho, and his famous maxim, “Nihil humani a me alienum puto” [nothing human is alien to me], one would have thought so.

PS: In the interests of balance, fellow economist and philosopher Adam Smith’s birthday is on 16 June; we’ll have to wait till 2023 to register his 300th, along with the rest of the media. What would the author of The Wealth of Nations make of Silicon Valley?


