Data Matters


August 14, 2018  10:36 AM

How automation can help create value, the 5th ‘v’ of data

Brian McKenna

This is a guest blogpost by Neil Barton, CTO, WhereScape

For CDOs, defining the data and analytics strategy for their organization is increasingly the most important aspect of their job, with 86% ranking it their top responsibility, according to Gartner’s 2017 annual Chief Data Officer Survey. The four “Vs” of data are well known – volume, velocity, variety and veracity – yet the data warehousing infrastructure in many organizations is no longer equipped to handle them. The fifth “V” – value – is more elusive still. Meeting these challenges at the scale of data that modern organizations now hold requires a new approach – and automation is its bedrock.

It’s all about finding ways to use data for value creation and revenue generation – activities that occupy 45% of a CDO’s time. This means harnessing the growing beast that is data in a way that is practical, manageable and useful. That’s where the data warehouse comes in, providing a centralized space for enterprise data that business users, including the CDO, can use to derive insights.

Creating a successful data warehouse, then, is critical for CDOs to succeed in monetizing data within their organization. However, the traditional waterfall approach to data warehousing, first introduced in the 1970s, delivers only a fraction of the value that it could potentially offer. Instead, the approach needs to evolve to become more responsive as organizational needs change, addressing new data sources and adapting to business demand.  Using automation software to design, develop, deploy and operate data warehouses is providing far-reaching value to business leaders. This change is positioning IT to incorporate timely technologies and new data sources more easily, as well as flex when business needs demand it.

The CDO’s role in re-invigorating the data warehouse

As the central storage point for enterprise data, the data warehouse is invaluable for providing business users with information. However, as users become more aware of the potential benefits of data-driven decision making, the gap between their expectations and the data warehouse’s ability to provide current, consumable data in a timely manner has grown. Businesses want insights from data far faster than before. This is exacerbated by the growth of new forms of data, particularly semi-structured or unstructured information, including social media data, sensor data, real-time messages and client communications, and video/audio files.

Traditionally, data warehouse development and evolution required long-cycle IT projects, incongruent with the needs of a more agile project design and build environment. As the change agents for digital transformation, CDOs must lead in the re-architecting of data warehouses, from artistry to acceleration and automation in order to improve business time-to-value.

Practical steps for the successful implementation of an automated data warehouse

As IT departments are expected to do much more with much less, processes need to change. Data warehouses can no longer be created “artisanally” – rather than crafting a complex infrastructure with unique configurations and a long lifespan, IT teams need to focus on producing an adaptable decision support infrastructure. This means creating a data warehouse that can transform, with ease, as business needs change. Here are four steps for CDOs to help their company achieve this:

  1. Understand the desired outcomes

Before making any decisions about the future of the data warehouse infrastructure, CDOs need to ensure they understand the specific challenges business teams are facing where data could help. In essence, the data warehouse automation and modernization program needs to be built around enabling decision-making that will lead to differentiation in the marketplace.

According to a TDWI survey, re-alignment with business objectives is the top reason for data warehouse modernization, selected by 39% of respondents. By enabling collaboration between business teams and IT teams, the CDO helps chart the course for how business goals and technology meet. In turn, this will lead to overall business transformation, accelerated by the new data warehouse’s support for data-driven decisions.

  2. Understand what you have already

Most organizations already have sophisticated data management tools deployed as part of their infrastructure – however, these may not be working to their full potential. Organizations already using SQL Server, Oracle or Teradata, for example, have a range of data management and data movement tools within their IT estate that can be automated and leveraged more effectively as part of a data warehouse automation push.

However, in that inventorying process, CDOs should ensure they have considered the capacity requirements of their data warehouse. Data is expected to continue growing exponentially, so while the data warehouse may be fit for purpose today, it’s important that the automation processes, storage requirements and general infrastructure are of a speed and standard capable of handling this growth in the future too.

As part of this, data warehouse automation needs to integrate with the business as it currently is and as it will realistically be in the future, rather than the business as IT teams wish it might be in an ideal world. CDOs need to encourage their teams to understand the data that is available, and the automated analytics and evaluation processes that can be used to meet specific business priorities. To support this, the data warehouse automation strategy needs to be designed not just for an ideal setup of data, expertly managed and curated, but for the realistic “messiness” of the business data landscape.

  3. Automate efficiently

Data warehouse automation, as with any other large-scale transformation project, requires resources – and these are often scarce due to strict budgets and competing priorities. This means CDOs need to think long and hard about what should actually be automated in order to free up staff hours in the future. Good candidates are hand-coded SQL, manually written scripts and manual metadata management: all are systematic processes where data warehouse automation can either eliminate the need for human involvement or dramatically accelerate the work, as the short sketch below illustrates.
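
To make that concrete, here is a minimal sketch of metadata-driven automation in Python. The table definitions, naming conventions and SQL dialect are purely hypothetical – the point is simply that DDL and load scripts are generated from declared metadata rather than hand-coded.

    # Hypothetical metadata describing one staging table. In a real tool this
    # metadata would be harvested from source systems, not typed by hand.
    STAGING_TABLES = {
        "stg_customer": {
            "source": "crm.customer",
            "columns": {"customer_id": "INTEGER", "name": "VARCHAR(200)", "updated_at": "TIMESTAMP"},
        },
    }

    def generate_ddl(table_name, spec):
        """Build a CREATE TABLE statement from the metadata entry."""
        cols = ",\n  ".join(f"{col} {dtype}" for col, dtype in spec["columns"].items())
        return f"CREATE TABLE {table_name} (\n  {cols}\n);"

    def generate_load(table_name, spec):
        """Build a simple INSERT ... SELECT load statement from the metadata entry."""
        cols = ", ".join(spec["columns"])
        return f"INSERT INTO {table_name} ({cols})\nSELECT {cols} FROM {spec['source']};"

    if __name__ == "__main__":
        for name, spec in STAGING_TABLES.items():
            print(generate_ddl(name, spec))
            print(generate_load(name, spec))

Change the metadata and the generated SQL changes with it – the property that makes the post-production change discussed in the next step far less painful.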

  4. Embrace change

CDOs should look at data warehouse modernization and automation as an avenue of constant, on-going development. As business needs change and new data sources emerge, CDOs need to be able to re-strategize different parts of the infrastructure to match. Similarly, to minimize disruption and ease the transition for business users, CDOs should look to take a staged approach to the initial automation and modernization process, with a set schedule of when different requirements will be met. Post-production change is inevitable due to evolving business needs, new technologies used and continuous improvement desired. Change needs to be planned for.

At the same time CDOs need to prepare for the human change that automation will create. In business teams, users can be re-deployed to double down on analyzing business intelligence and translating insight into business value. In the IT teams, automation provides new capacity to plan for the future – looking at new analytics tools, or planning for smarter, better ways to deliver on business priorities further down the line.

A data warehouse automation mentality

Data warehouse automation is not solely software you buy. It’s a philosophy and culture you implement. Tools and technologies form the bedrock of the processes, but a data warehouse strategy requires strong leadership, a transparent process, and an unrelenting focus on the business’s end goals in order to succeed.

Without robust data warehouse automation, businesses will struggle to capitalize on the potential of data and its associated technologies. As the strategic lead for data-driven transformation, and the change agents across both business and IT teams, the responsibility falls to CDOs. Professionals in this role need to understand, strategize, and execute on the way that large-scale data usage will influence future business decisions. The adaptability of the supporting data infrastructure can either be a CDO’s greatest weakness or greatest asset. Use the four steps covered in this guide to ensure it is the latter, and to achieve the ultimate goal of any business investment – value.

July 17, 2018  3:31 PM

Era of the Smart Factory: how can manufacturers get to the future quicker?

Brian McKenna

This is a guest blogpost by Antony Bourne, Industries President at IFS

Businesses are constantly told they will go bust if they do not digitally transform, and fast – but for manufacturers it’s not quite that simple. If production goes down even for a day, the impact can be irreversible, with millions in revenue potentially lost.

From an ergonomic, economic, and environmental perspective, smart factories – where robots and machines relay real-time data from connected devices to learn and adapt to new demands and autonomously manage entire production processes – are heralded as the way manufacturers can strive towards true digital transformation.

However, for many the smart factory is a faraway utopia. According to a recent Capgemini study, while 76% of manufacturers have an ongoing smart factory initiative or are working towards one, only 14% are satisfied with their level of success. Even so, manufacturers should not feel overwhelmed, and those embarking on the journey should not change too much too soon or do so without the proper counsel.

Here are some essentials for any manufacturer considering a smart factory rollout.

  1. Business-first approach

Business owners, project managers and directors should want to roll out a smart factory initiative not to tell customers or the board that they are ‘digitally transforming’, but to achieve better business results: futureproofing the business and deriving greater value from production plants and the entire ecosystem. Importantly, a smart factory should form part of a wider interconnected IT landscape and be a holistic endeavour.

One business goal for any manufacturer should be achieving full visibility across all assets. As we’ve witnessed recently in the construction industry with the collapse of major contractor Carillion, achieving visibility across a business is paramount. This means manufacturers need to become even more data-driven. Data is the lifeblood of a business, and the more real-time data a manufacturer can draw value from, the more likely they are to achieve full transparency across their estate.

Smart factories give manufacturers the ability to benefit from the accumulation of massive amounts of valuable data. This enables them to identify and address quality issues; rapidly learn and adapt to new demands; predict operational inefficiencies or oscillations in sourcing and demand, and respond to customers’ needs ahead of time.

  2. Connecting physical & digital technologies

Some manufacturers have had mechanical systems in place for over 50 years. With the emergence of technologies such as Robotic Process Automation (RPA) that present the opportunity to drive transformation in the sector, it is important that manufacturers take advantage and replace outdated machinery when possible, not just update it.

Manufacturers must also move their datacentre to the cloud. Escaping legacy systems enables manufacturers to scale horizontally and vertically quickly, as well as reduce costs, boost efficiency and lower time to recovery. Running off a cloud infrastructure is key to maximising the value and ROI of real-time data that a smart factory produces. For manufacturers yet to make the leap, moving disaster recovery and backups to the cloud is a good place to start.

One technology essential to any smart factory, sitting between physical and digital infrastructure, is the Internet of Things (IoT). The benefits of the Industrial Internet of Things (IIoT) have been widely espoused. IoT devices derive valuable data from sensors outside the manufacturing plant and from machines on the shop floor but, importantly, they also enable businesses to expand their operations and offer new services to customers. Servitization is upon us, and manufacturers not using IoT solutions to add services onto their products and make them more outcome-based will miss out on valuable business opportunities, or worse.
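
As a rough illustration of the plumbing involved, the sketch below shows a shop-floor sensor reading being published as a message that downstream systems can consume in real time. It is written in Python against the paho-mqtt 1.x client API; the broker address, topic naming and payload fields are all hypothetical, and a real deployment would read from the machine’s own sensors rather than simulating values.

    import json
    import random
    import time

    import paho.mqtt.client as mqtt  # assumes the paho-mqtt package (1.x API)

    BROKER = "broker.factory.example"         # hypothetical broker address
    TOPIC = "plant1/line3/press07/telemetry"  # hypothetical topic scheme

    client = mqtt.Client()
    client.connect(BROKER, 1883)
    client.loop_start()

    while True:
        # Simulated readings stand in for the machine's real sensor values.
        reading = {
            "machine_id": "press-07",
            "temperature_c": round(random.uniform(60, 90), 1),
            "vibration_mm_s": round(random.uniform(0.5, 4.0), 2),
            "timestamp": time.time(),
        }
        client.publish(TOPIC, json.dumps(reading))
        time.sleep(5)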

  3. Focus on people and partnerships

The face of the shop floor will change unimaginably over the coming years. Robots will work autonomously 24/7 in a windowless, lightless environment; but despite this, one aspect won’t change – people will still be a manufacturer’s most important asset. Reskilling and upskilling employees, opening their eyes to the art of the possible and getting them on board with change – a change that will likely reshape their purpose in the workplace – is a massive but necessary undertaking.

Manufacturers also need to seek expert advice, or partner with technology providers knowledgeable and proven in the manufacturing arena that can help them realise better business benefits.

  4. Start small and scale

Smart factory investments need to be broken down into bite-sized chunks, with specific opportunities realised first. Value and growth can be created through scaling a single asset and testing the processes and technologies around it. There are some incredible centres in the UK that can help manufacturers test these concepts in a safe, manageable environment.

Take Catapult, a network of physical centres designed to help organisations transform ideas into new products and services. Manufacturers can take advantage of initiatives like this to safely test assets with zero downtime in production. Once perfected, the solution can then be scaled to other assets of the business, such as production lines and factories, with complete confidence in performance.

The true power of the smart factory is its ability to adapt and grow with the shifting needs of an organisation. Manufacturers need to take one step at a time in getting to the future, and recognise there is the right expertise out there to help them achieve their business goals, by implementing a harmonious, automated, and resilient smart factory.


June 21, 2018  10:35 AM

The external workforce: are you realising its full value?

Brian McKenna

This is a guest blogpost by Mikael Lindmark, a senior vice president at SAP Fieldglass

As CIOs and other executives across enterprises turn to contractors, freelancers and other external workers to fill  skills gaps and meet their strategic objectives, they are transforming the way work gets done. However, many organisations lack the management rigour and C-suite attention to reap the full value of their external workforce.

Multi-channel workforce moves to core of business operations

A recent survey we at Fieldglass completed in collaboration with Oxford Economics signaled that UK businesses rely heavily on the external workforce, which includes non-payroll workers and services providers. In fact, nearly nine out of 10 senior executives in the UK (87%) who participated in the survey say the external workforce is essential for them to operate at full capacity and meet market demands—much greater than the global average (65%). Furthermore, this rising multi-channel workforce of contingent workers, independent contractors, alumni (i.e. a business’ former employees) and services providers now accounts for nearly half of workforce spend in the UK (46%, slightly higher than the 44% global average).

When it comes to labour market trends, nearly three-quarters (74%) of UK executives say finding high-quality resources at the right time and in the right place is very or extremely challenging, and in just three years, 87% say the external workforce will be critical for sourcing skills that are in scarce supply. Shortfalls in newer technologies, such as artificial intelligence, machine learning, blockchain, automation, cloud and the Internet of Things are particularly severe in the UK, where 29% of executives compared to 13% globally call this a “major shortfall.”

Cost is not the whole story

Increasingly external workers operate at the core of businesses. Not that long ago, companies discussed how and where to use external workers based on an assessment of what was core or non-core to the business as they sought to lower operational costs. Now, nearly half (45%) of executives believe they could not conduct business as usual without their external workforce. External workers are filling roles across enterprises, and contributing to improvements in businesses’ overall financial performance, according to 55% of respondents in the UK.

In addition to making it possible for businesses to operate at capacity and meet demand, UK executives have a greater tendency than the global average (81% as compared to 62%) to credit the external workforce with being important or extremely important to achieve sustainability goals/shrink their carbon footprint. UK executives also look to their external workforce to increase organisational agility (71%), reduce risk (71%), increase speed to market (68%) and develop or improve products and services (61%). Managing costs remains important (77% compared to 60% globally).

Looking beyond traditional metrics, 71% said their external workforce raises the bar for their employees by bringing in new sources of skills and talent.

Managing and extracting value from the external workforce

Visibility provides a basis for measuring and managing results, yet many businesses lack an adequate line of sight into their external workforce. Management of this workforce segment has not fully matured at most businesses. Consequently, companies are not realizing the most value from this important asset.

Around one in 10 companies that responded to our global survey demonstrate markedly superior performance in managing and extracting value from the external workforce—the “Pace setters.”  They are “highly informed” about five of eight areas that executives consider “very important” for managing non-payroll workers and services providers. For example: What are they doing? Are they being paid negotiated rates? What are the contract terms? What is the duration of their work?  Do they have the required licenses and certifications? What access do they have to systems, confidential information and facilities?

Pace setters also find it easier to manage many, yet not all, aspects of their external workforce. For instance: They have greater confidence in their ability to find high-quality talent at the right time, place and rate. They find it easier to avoid payment of duplicate charges and unauthorised expenditures; track who performed well and who should not be re-engaged; and be compliant on local tax and labor laws, regulatory requirements and candidate privacy requirements.

Pace setters also say their external workforce helps them improve their business’ overall financial/business performance and/or helps them compete in today’s digital world.

The Pace setters show how companies could benefit from better visibility and management of the external workforce. Set goals for your company to emulate or surpass the Pace setters as you establish the management rigor required to harness external resources to achieve better business results.

Mikael Lindmark, a senior vice president at SAP Fieldglass, is responsible for overall operations across Europe. Mikael, who joined SAP Fieldglass more than a decade ago, has more than 20 years of experience in human capital development and management, and is based in the UK.


June 15, 2018  12:51 PM

Can good data approaches help uncover bad data social manipulation?

Brian McKenna

Software definitely helped cause the pernicious problem of fake news – but could also ameliorate it, says data expert Emil Eifrem, in a guest blogpost.

Time was, it was the tabloid newspapers that cornered the market when it came to fake news (remember ‘Freddie Starr Ate My Hamster’?). These days it’s a whole industry in itself – with the technology in the hands of people who don’t even pretend to be objective journalists.

In fact, traditional journalism’s decline has meant the loss of checks and balances, allowing fake news a free and disruptive reign. Technological advances also support the ways misinformation is spawned and travels so rapidly, as social media is, after all, about active sharing. According to one of the biggest studies to date on the problem, conducted by researchers at MIT and published in March 2018 in the journal Science, the truth takes six times longer than misinformation to reach people on Twitter.

Researchers tell us lies are 70% more likely to be retweeted than the truth — even when they controlled for factors such as whether the account was verified or not, the number of followers, and how old the account was. This is not good news for society, democracy and human knowledge, one can argue.

Interestingly, while technology certainly is an enabler of fake news, it may also be the answer to helping combat it. Specifically, graph database technology, a powerful way of recognising and leveraging connections in large amounts of data, may offer some hope of salvation.

Indeed, graph software is already used by legitimate investigative journalists: it helped the International Consortium of Investigative Journalists track its way through the terabytes of data known as the Panama and Paradise Papers, for instance. But graph software also turns out to be a way to potentially combat fake news.

Visualising patterns that indicate fake content

It’s reported that Russia used social media in a bid to influence the 2016 US presidential election – and with graph technology’s help, the US’s NBC News has uncovered the mechanism of how that was achieved.

That’s because NBC’s researchers found that the key to detecting fake news is connections – between accounts, posts, flags and websites. By visualising those connections as a graph, we can see patterns that indicate fake content. The group behind the Russian trolling was small but very effective, working to leverage Twitter with popular hashtags and posting reply tweets to popular accounts to gain traction and followers. In one account, for example, of 9,000 tweets sent only 21 were actually original, and overall 75% of the material consisted of retweets, specifically designed to broadcast the messages to as wide an audience as possible. While some accounts posed as real-world citizens, others took on the guise of local media outlets and political parties.

When graph software was used to analyse the retweet network, it revealed three distinct groups – one tweeting mainly about right-wing politics, a second with left leanings, and a third covering topics in the Black Lives Matter movement – in an invidious but effective triangulation of extreme content sharing and emotional manipulation.
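
As a sketch of how such an analysis might be expressed, the query below finds the accounts whose tweets were most heavily amplified by retweets. It uses Cypher via the official Neo4j Python driver; the node labels, relationship types and connection details are hypothetical, chosen only to illustrate the pattern-matching approach.

    from neo4j import GraphDatabase  # assumes the neo4j Python driver is installed

    # Hypothetical model: (:Account)-[:POSTED]->(:Tweet)<-[:RETWEETED]-(:Account)
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    QUERY = """
    MATCH (author:Account)-[:POSTED]->(t:Tweet)<-[:RETWEETED]-(amplifier:Account)
    RETURN author.handle AS handle,
           count(DISTINCT t) AS tweets_amplified,
           count(amplifier) AS total_retweets
    ORDER BY total_retweets DESC
    LIMIT 10
    """

    with driver.session() as session:
        for record in session.run(QUERY):
            print(record["handle"], record["tweets_amplified"], record["total_retweets"])

    driver.close()

The same graph, queried for shared hashtags or reply targets instead of retweets, is what surfaces the clusters described above.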

At internet scale, fake news is just too hard to spot without the right tools. We should look to graph technology, specifically designed to expose connections in data, as a possible way of helping to address the issue.
The author is co-founder and CEO of Neo4j, the world’s leading graph database (http://neo4j.com/)


June 14, 2018  1:19 PM

Challenges and benefits of the Cloud

Brian McKenna

This is a guest blog by Claudia Imhoff, CEO of Intelligent Solutions and founder of the Boulder BI Brain Trust

As with any new initiative, there are both challenges and benefits to weigh when deciding whether cloud computing is suitable for your company’s analytic environment. Let’s start with understanding the challenges.

IT governance and control – IT departments are still leery of letting go of their data. There are many reasons, but certainly job loss and concerns about data security and privacy rank high on the list. IT is generally responsible for ensuring corporate data assets are implemented and used according to agreed-upon corporate policies and procedures. This means that service level agreements between the company’s IT department and the cloud provider are critical to ensure acceptable standards, policies and procedures are upheld. IT personnel may also want insight into how the data is obtained, stored, and accessed by business personnel. Finally, it is recommended that IT determine whether these cloud-deployed assets support the organisation’s strategy and business goals.

Changes to IT workflows – IT workflows dealing with compliance and security become more complicated in hybrid environments (those consisting of both on-premises and cloud deployments). The workflows must take into consideration the need of advanced analysts and data scientists to combine data that is on-premises with data in various cloud computing sites. Keeping track of where the data resides can be quite difficult if good documentation and lineage reports are not available.

Managing multiple cloud deployments – Often, companies have more than one cloud computing implementation; they may use a mix of both private and public deployments – maybe even multiple ones in each category. The company must determine whether each cloud provider is in compliance with regulatory requirements. Also, when considering your cloud provider(s), determine how security breaches are prevented or detected. If data security concerns are great, it may make sense for the corporation to keep highly sensitive data (like customer social security numbers, medical health records, etc.) on premises rather than deploying it to the cloud.

Managing costs – The on-demand and scalable nature of cloud computing services can make it difficult to determine and predict all the associated costs. Different cloud computing companies have different cost plans. Some charge by volume of data stored, others by the number of active users, and others still by cluster size. Some have a mixture of all three. Be sure to watch out for hidden costs like requested customizations, database changes, etc.
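
To see why prediction is hard, consider a toy comparison of how one workload costs out under three common charging models. The sketch is in Python and the rates are invented purely for illustration; real pricing varies by vendor, region and contract.

    # Toy comparison of three common cloud charging models. All rates are
    # invented for illustration only.
    workload = {"stored_tb": 20, "active_users": 150, "cluster_nodes": 8, "hours": 730}

    def cost_by_volume(w, rate_per_tb=25.0):
        return w["stored_tb"] * rate_per_tb

    def cost_by_users(w, rate_per_user=40.0):
        return w["active_users"] * rate_per_user

    def cost_by_cluster(w, rate_per_node_hour=0.9):
        return w["cluster_nodes"] * w["hours"] * rate_per_node_hour

    for label, fn in [("by volume", cost_by_volume),
                      ("by users", cost_by_users),
                      ("by cluster size", cost_by_cluster)]:
        print(f"{label}: ${fn(workload):,.0f} per month")

The same workload produces very different monthly figures under each model, which is exactly why hidden extras and mixed plans deserve scrutiny before signing.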

Performance – It is clear that if your provider is down, so are you. All you can do is wait for the provider to come back up. A second concern is your internet bandwidth. A slow internet means slow connectivity.

Now let’s turn to the many benefits of migrating to a cloud computing environment:

Lowered operating costs – This is perhaps the first benefit that companies realize when considering a move to the cloud. There is a significant difference between capital expenses and operating expenses. Basically, you are “renting” the infrastructure rather than bearing the costs upfront of building your own environment. The cloud computing provider bears all the system and equipment costs, the costs of upgrades, new hardware and software, as well as the personnel and energy costs.

No maintenance or upgrade hassles – These are again the headaches for the cloud computing provider. This frees up all resources to have a laser focus on obtaining, accessing, and using the data, not on managing the infrastructure.

Ease of implementation – For most companies, purchasing a cloud computing environment is as easy as swiping your credit card. It takes only minutes to access the environment because the technological infrastructure is all ready to go. This must be differentiated from the data infrastructure that must also be established. Whether you implement a data lake, a data vault, or a data warehouse, design and development work must be performed in addition to the technological set up.

Innovation from new cloud companies – Cloud technologies have been “born” from very innovative new companies. They make full use of all the advantages that the cloud has to offer. These technology companies can also add new features, functions, and capabilities, making them available to all customers immediately.

Elastic scalability – Many customers say this is the most appealing attribute of cloud computing. You can quickly scale up and down based on real needs. There is no need to buy extra computing capacity “just in case” you may need it at a later date. Cloud data warehouses can increase or decrease storage, users or clusters with little or no disruption to the overall environment.

Ability to handle the vast diversity of data available for analytics – Cloud computing providers can handle both well-structured data (such as that from operational systems) and the “unusual” data so popular today (such as social media, IoT or sensor data). Cloud implementations can support both fixed schemas and dynamic ones, making them well suited to routine production analytics like key performance indicators or financial analyses, as well as the unplanned, experimental or exploratory analyses so popular with data scientists.

Taking the time to identify both the challenges and benefits associated with the cloud is the first step in evaluating whether a move to the cloud is right for your organisation.


May 16, 2018  5:20 PM

What would Karl Marx make of Silicon Valley?

Brian McKenna

It’s the bearded one’s 200th birthday this year. Much of the mainstream media has marked the occasion, including the Financial Times, that pre-eminent “capitalist” newspaper.

Why should Computer Weekly be any different? After all, one of our more distinguished alumni is Paul Mason, these days a public intellectual in the classical Marx tradition.

Mason’s book PostCapitalism (2015) contains an intriguing chapter in which he expands on a neglected text of Marx (the “Fragment on Machines”, published in English in 1973, in that massive tome, the Grundrisse) that seems to predict today’s information economy. Marx figures here as a “prophet of postcapitalism”, according to Mason:

“The productive power of machines like the ‘self-acting’ cotton-spinning machine, the telegraph and the steam locomotive was ‘out of all proportion to the direct labour time spent on their production, but depends rather on the general state of science and on the progress of technology, or the application of this science to production’.

“Organization and knowledge, in other words, made a bigger contribution to productive power than the labour of making and running the machines.”

Gartner’s Frank Buytendijk is another who finds value in Marx when it comes to reading today’s IT-saturated economy. In his book Socrates Reloaded: the case for ethics in business and technology (2012), he writes:

“Marx would have been the first to say [of Facebook, Google et al.] that all the ingredients of a revolt [by internet users] are there. What could happen to Internet giants if they shoot themselves in the feet by pushing their data collection and analyses too far?”

Buytendijk argues that Google and Facebook are as hungry for data as Victorian capitalists were for capital. They want to collect as much data as they can for the benefit of advertisers, not for the producers of that data. People alienate their data in the way Marx says we alienate our labour power. Moreover, our love of Google and Facebook’s “free” services is the counterpart of the religious “opium of the people”, in his view.

But – in a twist of the dialectic – in the networked world we can “leverage the same community-based business model of the Internet giants to overthrow them”. Go somewhere else and they crumble.

What would Marx, the nineteenth-century economist, make of the particular, venture-capital fuelled economy of Silicon Valley? Would he throw up his hands in horror? More likely, he’d analyse it; most probably at tedious length. And he’d probably applaud its dynamism, just as he hailed the dynamism of industrial capitalism in the Communist Manifesto of 1848.

But surely the hyper-individualist political and social culture of Silicon Valley would be anathema to this enemy of individualism and promoter of the collective?

Well, maybe. However, Marx’s Economic and Philosophical Manuscripts of 1844 would seem to demonstrate an advocacy of all humans realising their individual powers, their “species-being”, once economic scarcity has been consigned to the past. It is just that they can only do so through other beings. You can’t be an individual on your own, seems to be the paradoxical gist.

Was Marx right?

The cultural theorist Terry Eagleton makes this argument in his book Why Marx was right (2011):

“For Marx, we are equipped by our material natures [as labouring, linguistic, desiring creatures] with certain powers and capacities. And we are at our most human when we are free to realise these powers as an end in themselves, rather than for any purely utilitarian purpose”.

And Eagleton contends elsewhere in the same text:

“Marx was an implacable opponent of the state. In fact, he famously looked forward to a time when it would wither away. His critics might find this hope absurdly utopian, but they cannot convict him at the same time of a zeal for despotic government”.

Even so, the movers and shakers of Silicon Valley are famously more indebted to the libertarian thinker Ayn Rand (exiled by the Russian Revolution) than Marx.

And yet Marx figures more prominently than does Rand in Silicon Valley luminary Peter Thiel’s book Zero to One (2014). Here is Thiel:

“As Karl Marx and Friedrich Engels saw clearly, the 19th-century business class ‘created  more massive and more colossal productive forces than all preceding generations together. Subjection of Nature’s forces to man, machinery, application of chemistry to industry and agriculture, steam navigation, railways, electric telegraphs …. what earlier century had even a presentiment that such productive force slumbered in the lap of social labour?’”

Among his many ventures, Thiel is co-founder and chair of Palantir Technologies, a big data analysis company whose CEO, Alex Karp, wrote his PhD  in dialogue with the Frankfurt School tradition of Theodor Adorno and Jürgen Habermas.

Not poles apart, then, Marx and today’s laboratory of the future on the West coast of the US (and its clones elsewhere)? (It’s moot).

But would he fit into the geek house in Silicon Valley, the HBO comedy? Given his roistering fondness for pub crawls in Soho, and his famous maxim, “Nihil humani a me alienum puto” [nothing human is alien to me], one would have thought so.

PS: In the interests of balance, fellow economist and philosopher Adam Smith’s birthday is on 16 June; we’ll have to wait till 2023 to register his 300th, along with the rest of the media. What would the author of The Wealth of Nations make of Silicon Valley?


April 20, 2018  12:44 PM

Is AI ethics enough of a niche for the UK’s economic strategy?

Brian McKenna

The House of Lords report on AI and UK economy and society came out this week, with the guardedly bullish title: “AI in the UK: ready, willing and able?” The question mark is moot.

I think a strong case can be reasonably made that the government has been using AI as a fig leaf to cover the economic uncertainty generated by the Brexit decision of June 2016. It is hard to blame the prime minister or her chancellor for making this rhetorical move. Neither of them wanted the country to leave the European Union, and vaunting the UK’s putative special strengths in AI, as part of a “global Britain” narrative, provides a quantum of solace. So, why not?

And having an industrial strategy is common ground between Conservative and Labour parties today. The hands-off neo-liberalism of Thatcher and Blair seems to belong in the past.

Moreover, the report has emphasized the strategic need for the government to do more to bolster the UK’s network infrastructure to support artificial intelligence – not just to spawn new start-ups, but to improve economic productivity more generally.

Britain leads the world in AI. Really?

Deep in the House of Lords report (paragraphs 392 to 403) is a judicious dissection of the claim that “Britain leads the world in AI”. It is a cup of cold water realism rather than a bowl thereof. Nevertheless, it is realistic and balanced, and makes an argument well worth thinking about. Essentially, the report acknowledges that the US and China are the real leaders in AI, and contends that the UK should find itself a specialist niche, putting forward the ethics of AI as its preferred candidate.

Would attending to the ethics of AI give the UK enough heft in the field? We do, in the UK, have a tendency to reach for a claim to “lead the world” in doing good things. It was the Christian missionary flip-side of our gunboat diplomacy in the days of the Empire.

The CND movement, at its several peaks in the late 1950s and 1980s is a good example of this: we can lead the world by moral example, said Bertrand Russell and Bruce Kent. I marched for unilateralism myself in the 1980s, but I digress. Suffice to say it is a noble part of the British liberal tradition, and the House of Lords has often given it a home, sometimes outflanking the House of Commons on the left, ironically for the non-elected chamber. (Indeed, the Lords did this this week, with the vote to demand the government includes a customs union in its negotiation agenda with the EU).

It should not, in other words, be ruled out as an idea, this proposed UK specialization in the ethics of AI. Someone should do it.

The case for specializing in ethics

This is the train of argument in the Lords committee’s report:

“we have discussed the relative strengths and weaknesses of AI development in the UK, but questions still remain regarding Britain’s distinctive role in the wider world of AI. The Government has stated in its recent Industrial Strategy White Paper that it intends for the UK to be ‘at the forefront of the AI and data revolution’. What this means in practice is open to interpretation.

“Some of our respondents … made comparisons with the United States and China, especially in terms of funding. For example, Nvidia drew attention to the large investments in AI being made in these countries, including the $5 billion investment announced by the Tianjin state government in China, and the estimated $20–30 billion investments in AI research from Baidu and Google. Balderton Capital emphasised the ‘many billions of funding’ being invested in AI and robotics in China and the US, and argued that the UK Government needed to invest more in academic research to ensure that the UK ‘remains [?] a global leader in the field’.

“Microsoft also highlighted the disparities in computer science education, noting that ‘in a year when China and India each produced 300,000 computer science graduates, the UK produced just 7,000 ….

“However, it was more commonly suggested that it was not plausible to expect the UK to be able to compete, at least in terms of investment, with the US and China …. [W]e were greatly impressed by the focus and clarity of Canada and Germany’s national strategies when we spoke with Dr Alan Bernstein, President and CEO of CIFAR and Professor Wolfgang Wahlster, CEO and Scientific Director of the DFKI. Dr Bernstein focused on the Pan-Canadian AI Strategy’s bid to attract talented AI developers and researchers back to Canada from the United States, while Professor Wahlster emphasised that Germany was focusing on AI for manufacturing”.

There then follows the proposed UK focus on the ethics of AI:

“In January 2018, the Prime Minister said at the World Economic Forum in Davos that she wanted to establish ‘the rules and standards that can make the most of artificial intelligence in a responsible way’, and emphasised that the [UK’s] Centre for Data Ethics and Innovation would work with international partners on this project, and that the UK would be joining the World Economic Forum’s new council on artificial intelligence, which aims to help shape global governance in the area.

“On the basis of the evidence we have received, we are convinced that vague statements about the UK ‘leading’ in AI are unrealistic and unhelpful, especially given the vast scale of investment in AI by both the USA and China. By contrast, countries such as Germany and Canada are developing cohesive strategies which take account of their circumstances and seek to play to their strengths as a nation. The UK can either choose to actively define a realistic role for itself with respect to AI, or be relegated to the role of a passive observer ….

“We believe it is very much in the UK’s interest to take a lead in steering the development and application of AI in a more co-operative direction, and away from this riskier and ultimately less beneficial vision of a global ‘arms race’. The kind of AI-powered future we end up with will ultimately be determined by many countries, whether by collaboration or competition, and whatever the UK decides for itself will ultimately be for naught if the rest of the world moves in a different direction. It is therefore imperative that the Government, and its many internationally-respected institutions, facilitate this global discussion and put forward its own practical ideas for the ethical development and use of AI.”

Finally, the Lords committee has called on “the Government [to] convene a global summit in London by the end of 2019, in close conjunction with all interested nations and governments, industry (large and small), academia, and civil society, on as equal a footing as possible. The purpose of the global summit should be to develop a common framework for the ethical development and deployment of artificial intelligence systems. Such a framework should be aligned with existing international governance structures”.

It’s a thoughtful argument. And it’s surely better than wrapping AI in the Union Jack and trying to gain an edge over other nations in a necessarily global field?

Nevertheless, the UK’s most obvious comparative advantage in AI is located at GCHQ, with its special relationship with the US’s NSA. Might cyber-security prove a better niche than ethics?


April 9, 2018  2:59 PM

Why GDPR provides the opportunity for greater data collaboration

Brian McKenna

This is a guest blogpost by Jim Conning, Managing Director of Royal Mail Data Services (RMDS).

The forthcoming 25 May implementation date for the General Data Protection Regulation (GDPR) is focusing businesses on the whole topic of customer data. How can they ensure that they are compliant and avoid potential fines of up to 4% of global turnover? Research into customer data management from my organisation, Royal Mail Data Services, highlights the pressure that companies are under – and how collaboration between IT and marketing is necessary for effective customer data management strategies.

GDPR – varying confidence levels

In a recent survey carried out by Royal Mail Data Services among key decision makers, we found that compliance with the GDPR was the number-one concern for survey respondents, with 29% citing it as their biggest worry.

Focusing on specific areas, the study asked how confident respondents were that their internally held and third-party customer data was GDPR compliant. The positive news is that 78% were either “very” or “reasonably” confident that their internally held customer data complied – although 11% were not confident, including 2% who even more worryingly didn’t know if they were compliant or not.

However, when it comes to third-party data, confidence levels drop dramatically. Just 43% of respondents were “very” or “reasonably” confident when it came to compliance, which demonstrates the difficulty of gathering evidence that the right permissions are in place when data has come from other sources. Only 9% of brands said they were very confident in their data compliance, which shows that there is plenty of work to do ahead of 25 May 2018.

Collaboration is the key

When it comes to data strategy, companies are adopting a range of approaches. Just over half (51%) of marketing teams set data strategies, while other groups such as central data management (26%) and the board (25%) were also involved. Legal and compliance teams were naturally heavily involved in privacy and permissions decisions, taking lead responsibility within 38% of organisations. Forty-four per cent of marketing departments led in this area, compared to 20% of IT/IS teams.

Responsibility for actually managing customer data is also split between different departments. IT/IS was in charge in 30% of cases, behind marketing (37%) and central data management teams (also 37%).

This demonstrates the need for departments to work closely together – each has different skills and approaches that together provide the complete solution for a business and help it to achieve its overall objectives.

Data quality is still an issue

Poor-quality data hits the bottom line, and survey respondents recognise this – they estimated the average cost to the business of poor-quality customer data to be around 6% of annual revenue. For major brands this is measured in millions of pounds – and this excludes any potential GDPR fines.

So what leads to poor-quality data? Respondents saw basic errors as the main culprits, specifically out-of-date information and incomplete data. Increasing automation around validation helps overcome this – but a significant minority (19%) of survey respondents said they didn’t validate website data, and 16% didn’t check data coming into internal systems at all. A similar gap is visible when it comes to data cleansing. While 22% of companies undertake this daily or continuously, one-third (33%) still have no formal processes in place to clean customer contact data. Overall, many businesses are putting themselves at risk of data-quality issues – and potential GDPR investigations over non-compliance.
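
As a flavour of what automated validation at the point of capture can look like, here is a minimal sketch in Python. The field names and rules are illustrative only – they are not a compliance checklist – but even checks this simple catch the out-of-date and incomplete records respondents complain about.

    import re

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    UK_POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", re.IGNORECASE)

    def validate_record(record):
        """Return a list of problems found in a customer contact record."""
        problems = []
        for field in ("name", "email", "postcode"):
            if not record.get(field):
                problems.append(f"missing {field}")
        if record.get("email") and not EMAIL_RE.match(record["email"]):
            problems.append("malformed email")
        if record.get("postcode") and not UK_POSTCODE_RE.match(record["postcode"]):
            problems.append("malformed postcode")
        return problems

    print(validate_record({"name": "A N Other", "email": "a.other@example", "postcode": "SW1A 1AA"}))
    # -> ['malformed email']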

The Royal Mail Data Services research demonstrates that GDPR is acting as a wake-up call to organisations, providing an opportunity to focus on how they collect, manage and store customer data. Successfully achieving compliance and getting the best out of customer data therefore requires IT and other departments to work together, now and in the future.

You can download a full copy of the research report, “The use and management of customer data”, here.


April 5, 2018  3:03 PM

2018 is a graph-shaped year

Brian McKenna

This is a guest blogpost by Neo4j’s CEO Emil Eifrem, in which he says graph databases are about to grow up

Graph technology has come a long way: from financial fraud detection in the Panama and Paradise Papers, to contextual search and information retrieval in NASA’s knowledge graph, to support for true conversational ecommerce in eBay’s ShopBot.

What propels this success is graph’s unique focus on data relationships. We have watched the value of connected data explode as businesses, looking to drive innovation, connect supply chains, IoT devices, marketing technology, logistics and payment history – making the value of connectedness across all those data elements increase exponentially.

But only a decade ago, the graph industry was just Neo4j and a few niche players. In the subsequent years other startups made their entrance as part of the NoSQL revolution, while more recently tech giants such as Oracle, Microsoft, SAP and IBM have each produced graph offerings of their own. Today the graph paradigm offers choices – native platforms, NoSQL multi-model containers and embedded-in-SQL variants.

Amazon’s long-standing absence from this list of tech behemoths was always a notable irony, given that its business models, in both ecommerce and the data centre, are so graph-influenced. So the recent launch of Amazon Neptune is a welcome progression, marking the full acceptance of graph software into the mainstream. Amazon’s entrance should be welcomed by the graph database market, as it will drive growth generally and contribute to graph technology’s commercial success.

As with all markets, more competition and choice means a stronger market and better products. Ultimately, customers will benefit.

Still in the graph database kindergarten

Now that all of the major database players are in the graph game, the next phase of the market’s development will be all about solutions – though it’s evident that we are only at the beginning of this journey. Graph platforms will likely become foundational elements of corporate technology stacks, interweaving different types of data sources, applying comprehensive graph analytics, deploying easy-to-use graph visualisation tools and constructing purpose-built graph-based applications, which will speed widespread adoption.

Creating the graph ‘SQL’

Second, to achieve widescale adoption, the market needs a standard graph query language analogous to SQL that is simple as well as easy to learn and implement.

I believe Cypher will become this standard, because in addition to years of real-world validation it has by far the widest adoption among actual graph end users.

Cypher is overseen by the openCypher project, whose governance model is open to the community; it now has over 30 participants, including academics, vendors and enthusiasts. To date, Cypher is used by Neo4j, SAP HANA, Redis Graph and Splunk, and the project has released Cypher for Apache Spark and Cypher for Gremlin. Amazon is interesting, having hedged its bets on two older languages; its decision here may well have an influence.
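
For readers who have not met it, a short example gives a flavour of why Cypher is considered approachable. The snippet below runs Cypher through the Neo4j Python driver against a hypothetical social graph of (:Person)-[:FOLLOWS]->(:Person); the “people my connections follow that I don’t” question, several joins in SQL, is a single declarative pattern here.

    from neo4j import GraphDatabase  # assumes the neo4j Python driver is installed

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    FOAF = """
    MATCH (me:Person {name: $name})-[:FOLLOWS]->(:Person)-[:FOLLOWS]->(suggestion:Person)
    WHERE NOT (me)-[:FOLLOWS]->(suggestion) AND suggestion <> me
    RETURN suggestion.name AS name, count(*) AS mutual
    ORDER BY mutual DESC
    """

    with driver.session() as session:
        for record in session.run(FOAF, name="Ada"):
            print(record["name"], record["mutual"])

    driver.close()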

The graph community is growing

Finally, along with this commercial success comes a growing interest in graph skills and awareness. The community needs to ensure that every developer, data scientist, data architect and even business analyst is skilled in graph technology.

2017 was a massive year for graphs. More entrants into the graph community means 2018 will be even bigger.


March 14, 2018  2:46 PM

Fourth Industrial Revolution rhetoric: mere cant?

Brian McKenna

Philip Hammond’s Spring statement, as UK chancellor, reached, predictably, for the rhetoric of the so-called fourth industrial revolution.

Not for the first time. Whenever he gets the chance to say the UK is in the forefront of artificial intelligence, big data analytics, and so on, and so forth he takes it. He might be taking his “spreadsheet Phil” moniker a bit too seriously.

This nationalistic appropriation of AI/machine learning functions as a fig leaf for Brexodus, it almost goes without saying. “Don’t worry about Brexit, we’ve got the AIs and the hashtags to keep us warm”, is the gist of government patter here, whether from Hammond or Amber Rudd, home secretary. How much any of them know about technology is anyone’s guess.

Hammond seems to believe Matt Hancock, secretary of state for culture, media and (also) sport, is himself a product of the software industry — of which he is, admittedly, a scion. This is Hammond, speaking in the House of Commons this week:

“Our companies are in the vanguard of the technological revolution.

And our tech sector is attracting skills and capital from the four corners of the earth.

With a new tech business being founded somewhere in the UK every hour.

Producing world-class products including apps like TransferWise, CityMapper,

And Matt Hancock.”

Hilarious. And Theresa May, the prime minister, is always keen to get in on the 4IR act. Her speech in Davos, to a half-empty hall, was long on technology rhetoric, and short on detail about what the global elite are interested in – viz Brexit.

Now, there is no denying the UK does have some unusual strengths in AI, at least in terms of academic research, and the start-ups therefrom. One can only wonder at the world-class work undoubtedly going on at GCHQ under the AI banner. The UK must, surely, have an advantage to squander?

Hopefully, the forthcoming House of Lords Select Committee report on artificial intelligence will provide a balanced, cool, rational, non-flag waving description of the state of the art in the UK, and offer some policy that will make a positive difference to our economy. But it will only do so if it takes the measure of some of the AI scepticism expressed in the committee’s hearings towards the end of last year. And appreciates that there are different sides in the debate on AI among people who know what they are talking about. It’s not all Tiggerish enthusiasm, whether nescient or not.

