Data Matters


October 12, 2017  3:44 PM

Graphs will take applied AI to the next level

Brian McKenna Profile: Brian McKenna

This is a guest blogpost by Emil Eifrem, CEO, Neo4j

Google’s dominant search engine has always been driven by smart software. But around 2012, the search giant quietly transformed the way users could search for information – and that’s a change that many of us are going to want to see in other Web applications, too.

What did Google do? It started using a Knowledge Graph – enhancing search results with semantic search information gathered from a wide variety of sources.

That sounds like a small step, but it was a profound one. The traditional way of storing data is ‘store and retrieve’. But that method doesn’t give you much in terms of context and connections, and for searches and recommendations to be useful, context has to come in.

To help improve meaning and precision, you need richer search – which is what Google started to give us. Knowledge Graphs powered by graph databases are now one of the central pillars of the future of applied AI, and graphs are becoming more and more widespread in the form of recommender systems and shopping or customer service chatbots.
eBay’s AI-powered ShopBot, for example, is built on top of a graph database. This enables the system to answer sophisticated questions like, ‘I am looking for a black Herschel bag for under £50 – find me those only.’

The software can then ask qualifying questions and quickly serve up relevant product examples to choose from. You can send the bot a photo – ‘I like these sunglasses, can you find similar models for me?’ – and it will employ visual image recognition and machine learning to figure out similar products for you.

All this is done by using natural language processing techniques in the background to figure out your intent (text, pictures and speech, as well as spelling and grammar, are parsed for meaning and context).

The recommendation engine, built with Neo4j, helps to refine the search against inventory with context; a way of representing connections based on shopper intent is key to helping the bot make sense of the world in order to help you.

That context is stored so that the ShopBot can remember it for future interactions: when a shopper searches for ‘black bags’ for example, eBay ShopBot knows what details to ask next like type, style, brand, budget or size. As it accumulates this information by traversing the graph database, the application is able to quickly select specific product recommendations.
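
To make the mechanics concrete, here is a minimal sketch of the kind of graph traversal described above, using the Neo4j Python driver. The node labels, relationship types and properties (Shopper, SEARCHED_FOR, Category, Product) are hypothetical – eBay’s actual ShopBot schema is not public.

```python
# A minimal sketch of graph-based recommendation with the Neo4j Python driver.
# The schema (Shopper, SEARCHED_FOR, Category, IN_CATEGORY, Product) is
# invented for illustration; the real ShopBot model is not public.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

RECOMMEND = """
MATCH (s:Shopper {id: $shopper_id})-[:SEARCHED_FOR]->(c:Category)
MATCH (p:Product)-[:IN_CATEGORY]->(c)
WHERE p.colour = $colour AND p.price <= $max_price
RETURN p.name AS product, p.price AS price
ORDER BY p.price
LIMIT 5
"""

def recommend(shopper_id, colour, max_price):
    # Traverse from the shopper's remembered context (the category they
    # searched for) to products matching the stated constraints.
    with driver.session() as session:
        result = session.run(RECOMMEND, shopper_id=shopper_id,
                             colour=colour, max_price=max_price)
        return [(r["product"], r["price"]) for r in result]

print(recommend("shopper-42", "black", 50))
```

Because the shopper’s context lives in the graph, each follow-up answer (‘black’, ‘under £50’) simply tightens the traversal rather than starting a new search.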

Tapping into human intent is the ‘holy grail’ of what we want to do with applied AI. In this discussion on conversational commerce the example is well made: in response to the statement ‘My wife and I are going camping in Lake Tahoe next week, we need a tent’, most search engines would react to the word ‘tent’, and the additional context regarding location, temperature, tent size, scenery, etc. is typically lost. This matters, as it’s this specific information that actually informs many buying decisions – and which Knowledge Graphs could help empower computers to learn.

Context matters. It is what drives sophisticated shopping behaviour and decision making generally. And just as Google did quietly but effectively five years ago, so the rest of us need to fold AI-enriched features into our systems, be they retail or recommendations, to make them that much more powerful and responsive to user need and business demand.

The author is CEO of Neo Technology, the company behind graph database Neo4j (http://neo4j.com/)

October 12, 2017  1:55 PM

The blockchain supply chain. Will these chains ever meet?

Brian McKenna Profile: Brian McKenna

This is a guest blogpost by Greg Kefer, vice-president, marketing, Infor.

Blockchain is arguably one of the most hyped technologies around. Despite the fact that Gartner believes that 90% of pilots will fail over the next 18 to 24 months, there is huge, sustained interest, and the business value-add of blockchain is expected to grow to more than $176 billion by 2025.

Blockchain recently passed the “peak of inflated expectations” and is now making its way into the trough of disillusionment in the most recent Gartner Hype Cycle for Emerging Technologies. Despite all the marketing hype, blockchain does represent a future technology that will mature and become commercially viable; we just don’t yet know where, when, or how.

Supply chain automation is one of the opportunity areas that has been associated with blockchain because of the decentralised processes involved in sourcing, manufacturing, shipping and paying for goods. The distributed nature of the technology has led to a great deal of speculation about supply chain applications for blockchain. Asset tracking, payments and smart contracts have all been proposed as possible deployments, and there is no shortage of PR and event presentations that highlight the potential.

Despite the hype, the market is a long way from maturity in supply chain. Gartner predicts that mainstream adoption of blockchain in the supply chain at scale is likely 10 or more years away.

This reminds me of the early days of cloud. In the late 1990s the notion of renting advanced business software via the Internet saw tremendous VC investment and hype. Thousands of companies emerged, Super Bowl commercials were bought, new ideas surfaced, Y2K happened, and then reality began to set in. After most companies failed, business models were refined, and early adopters began to engage in cloud IT projects which matured the technology.

In the global supply chain arena, cloud computing represented an immense opportunity. Why? Because supply chains had become global, companies were producing and shipping goods all over the world, and they needed technology that could unite entire partner ecosystems – something enterprise software systems were never designed to do.

Despite superior IT economics and making the impossible possible, cloud still took a full decade to reach a level of mainstream adoption in the supply chain. Complexity was part of the reason. Security was also a concern. And most importantly, the notion of shifting mission-critical operations – a company’s ability to deliver its products to the market – from the relative “safety” of software systems inside the four walls to the cloud was still a scary proposition for most CIOs.

Powerful use cases began to emerge which showed the industry that connecting partners to a common network and shedding light on what is actually happening in the supply chain could unlock tremendous value.

Excellence in supply chain automation requires more than advanced software. The network is critical, and that takes time to build. The reason Facebook is so valuable is not the software code, but rather the 2 billion active users. Same with LinkedIn. While there is no Facebook for supply chain yet, there are a number of network-based solutions that have spent the past two decades steadily building up a network of buyers, sellers and partners that are actively engaged in the biggest supply chains on earth. Our GT Nexus Commerce Network is a prime example.

Blockchain will face a similar challenge gaining traction in the supply chain. It will take time.

Blockchain should also not be dismissed as pure hype. Expect to see companies engaging in projects with the multitude of new blockchain focused vendors that are emerging. As use cases make their way into the mainstream, more companies will jump into blockchain which will further prove out what works and what does not.

The degree of complexity in the supply chain will likely keep mass blockchain adoption a few steps behind some other business process areas, but something will happen eventually. It’s very likely that successful projects will be done in conjunction with some of the newer network-based solutions that have built-in flexibility to operate alongside new technologies. We’re already seeing internet of things and advanced machine learning solutions being meshed with cloud supply chain platforms, and there is no reason blockchain wouldn’t be part of that evolution.

But companies should also not stand on the sidelines and wait. A lot of what blockchain promises already exists in some form. The ability to connect a supply chain to a single source of truth exists. Capabilities to monitor access and history exist. And solutions that are available anywhere, anytime via the web are common today.

If companies are still e-mailing purchase orders and using spreadsheets to run their operations, they will not be in a position to take advantage of blockchain. The supply chain has become a strategic advantage, and those that don’t stay in front of innovation will find themselves getting disrupted into bankruptcy before the Supply-Block-Chain becomes reality. It will be interesting to watch.


October 5, 2017  3:42 AM

Einstein, Leonardo, Coleman, ClAIre … 18c: what’s in a name?

Brian McKenna Profile: Brian McKenna

Oracle launched what it has described as the “world’s first autonomous database” at OpenWorld this week in San Francisco.

Thomas Kurian, Oracle’s president of product development, said in the press conference following his keynote at the event that the company has been working on the new iteration of the Oracle database for a good many years, which has to mean before machine learning became as fashionable as it now is.

The 18c database has a new layer of automation infused into it, meaning, as an example, security patches to fix code flaws that are at risk of hacker exploitation are applied automatically. In both his conference keynotes, Oracle founder, chairman, and CTO Larry Ellison referred to the recent Equifax data loss as an example of why humans have to be taken out of patching in order to reduce the opportunity for error.

The automation added to the database is said to be an example of machine learning in action. Ellison said ML is as radical as the internet itself, and that his company’s new “autonomous” database is “revolutionary” – a term that he said he did not apply lightly.

This stress on machine learning has, of course, been commonplace in the world of enterprise software in recent years. A galaxy of human geniuses – Einstein for Salesforce, Leonardo for SAP to name but two – has been invoked to name a new generation of enterprise software inflected by machine learning.

But why has Oracle refrained from naming its artificial intelligence/machine learning efforts? UK, Ireland and Israel managing director Dermot O’Kelly said, in an interview with Computer Weekly, that he thought the epithet “revolutionary” was indeed appropriate for the new iteration of the database, and saw no issue with Oracle’s not giving a “fancy” name to its machine learning initiative.

“Unless you know another fully autonomous database then, yes, it is revolutionary. Our customers know immediately what an autonomous database would be – it patches itself, it upgrades, it scales ….

“We keep naming very simple, that is the way Larry likes it. Embedding AI into the applications or the database is much more important than giving it a fancy name. Not one customer I’ve spoken to at Open World has asked me what it is called. I talk to a lot of CIOs who have all done bits with AI, but having it delivered to them embedded into a product set helps them with their journey to using it”.

It is surely reasonable to perceive the naming of AI/ML enterprise software programmes after rare human geniuses as pretentious. However, Oracle’s “no name” policy betrays a pragmatism – of which the company is making a virtue – that could be said to be lacking in imagination, and so to subserve an incrementalist agenda.

Moreover, is there not a difference between automation and autonomization – the latter being the realm of AI in its fullest sense, where computers think like humans and learn by themselves, without human input? Automation, on the other hand, is about incrementally improving operational processes and reducing human input – and so the opportunity for error – which has been the point of business computing all along, so no big change there.

Also, in terms of the database, I wonder if the approach of making it more and more automatic – which Oracle has to do, and which has business value for users – in the end risks increased commoditization of the database part of the IT industry? It might keep you off the front page for losing customer data, but it won’t differentiate your business strategically.

These are open questions and, as Dermot O’Kelly says, autonomy in the database realm is easier said than done.

I discuss these, and other issues provoked by OpenWorld 2017, with my TechTarget colleagues David Essex and Jack Vaughan in a couple of linked podcasts on SearchERP.com and SearchOracle.com:

Oracle AI apps now present throughout enterprise cloud suite

Machine learning and analytics among key Oracle security moves


September 22, 2017  10:01 AM

Artificial Intelligence could be about to start a processing chip arms race

Brian McKenna Profile: Brian McKenna

This is a guest blogpost by Matt Jones, lead analytics strategist, Tessella

Recent converts to AI and machine learning’s amazing potential may be disappointed to discover that many of its more interesting aspects have been around for a couple of decades.

AI works by learning to distinguish between different types of information. An example is the use of a neural network for medical diagnosis. Inspired by the way the human brain learns, neural nets can be trained to analyse images tagged by experts (this image shows a tumour, this image does not). Over time, and with enough data, they become as good as the experts at making the judgement, and they can start making diagnoses from new scans.
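
As an illustration of the training loop described above, here is a toy example in Python: a one-hidden-layer network learns to reproduce expert labels. Real diagnostic models are deep convolutional networks trained on images; the synthetic two-feature data here is invented purely to show the mechanism.

```python
# A toy illustration of neural network training: the network adjusts its
# weights until its predictions match the expert-supplied labels.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "scans": 2 features per example; label 1 = tumour, 0 = no tumour.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer of 8 units, sigmoid output.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)           # hidden activations, shape (200, 8)
    p = sigmoid(h @ W2 + b2)           # predicted probability, shape (200, 1)
    # Backward pass (gradient of cross-entropy loss).
    d_out = (p - y) / len(X)
    d_h = (d_out @ W2.T) * (1 - h**2)  # tanh derivative
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy after 500 epochs: {accuracy:.2%}")
```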

Although not simple, it is not the complexity of the algorithms that has held these tools back until now. In fact, data scientists have been using smaller scale neural networks for many years.

Rather prosaically, the limiting factor for the past 20 years has been processing power and scalability.

Processors have improved exponentially for years (see Moore’s law). But a few years ago, NVIDIA launched GPUs that were not just powerful, but capable of running thousands of tasks in parallel, with an instruction set exposed for use by developers.

This was the step change for machine learning and other AI tools. It allowed huge amounts of data to be processed simultaneously. Neural nets, like the many synapses in your brain, process lots of information simultaneously to reach their conclusion, with calculations being performed at each node, of which there can be thousands. Before highly parallel processing capability, this was a slow process. Imagine looking at a picture and taking hours to work out what it was.

The availability of consumer-level GPUs with massive parallelisation via NVIDIA CUDA cores has meant deep neural networks can now be run in reasonable times and at reasonable cost. A grid of GPUs is considerably cheaper and more effective than the corresponding level of compute available via traditional CPUs.
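
The gain can be illustrated even on a CPU: processing a whole batch as one matrix operation – the shape of computation GPUs spread across thousands of CUDA cores – versus looping over examples one at a time. A rough sketch:

```python
# Batch-parallel versus one-at-a-time evaluation of the same layer. On a GPU
# the batched form is what gets spread across thousands of cores; even on a
# CPU, vectorising it is dramatically faster than looping.
import time
import numpy as np

rng = np.random.default_rng(1)
batch = rng.normal(size=(10_000, 512))    # 10,000 inputs of 512 features
weights = rng.normal(size=(512, 512))

t0 = time.perf_counter()
out_loop = np.stack([x @ weights for x in batch])   # one example at a time
t1 = time.perf_counter()
out_batch = batch @ weights                          # whole batch at once
t2 = time.perf_counter()

assert np.allclose(out_loop, out_batch)              # identical results
print(f"loop: {t1 - t0:.3f}s  batched: {t2 - t1:.3f}s")
```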

Neural nets have long been used in labs to analyse datasets, but due to compute limitations this would take weeks or even months to complete a run. They found applications where lengthy data analysis beyond human capacity (or patience) was needed, but where speed wasn’t critical, such as for predicting drug-like molecule interaction with target receptors in medicine research.

Today’s neural nets – and deep learning (large, combined neural networks) – can now do the same compute in hours or minutes. Computationally intensive AI processes which previously took hours can be applied to real time tasks, such as diagnostics and making safety critical industrial decisions.

On the shoulders of giants

This has been critical to the rapid rise of AI. Off the back of this, more commoditisation has appeared. Google, Microsoft and Facebook, amongst others, have all developed AI programming tools. These allow anyone to build their own AI on the tech giants’ platforms and run the associated processing in their data centres – examples include diagnosing disease, managing oil rig drilling, and predicting when to take aeroplanes out of service. AI became democratised.

Amongst the excitement around applying AI to a brand new set of possibilities, NVIDIA quietly cornered the market for AI processor chips.

Slightly late to the party, but never to be written off, the usual suspects are catching up. Microsoft (using Intel FPGA technology) recently launched Brainwave, a deep learning acceleration platform. Google also recently started building its own chip, the Tensor Processing Unit, for AI applications.

This means a chip arms race is around the corner. Expect AI announcements soon from other chip manufacturers, and an aggressive push from NVIDIA to defend its leadership.

None of this is a bad thing for us machine learning professionals. If the capacity to process data increases at an ever faster rate, it expands what we can do with AI and how fast we can do it. Better, faster, more parallelised processing can mean ever deeper deep learning algorithms and more complex neural networks. Data processing tasks which previously took minutes, hours or days are gradually being brought into the realm of real-time decision making.

With AI tools and processing power readily available, the desire to harness AI is growing rapidly. The tech giants, innovative startups, and companies undergoing digital transformation all want a piece of the action. Technology advances apace, but the limiting factor now is skills, which have not been able to keep up with AI’s meteoric rise.

Truly harnessing AI requires a wide range of highly specialised skills covering advanced maths, programming languages, and an understanding of the tools themselves. In most cases a degree of expertise in the subject the AI is being designed for (oncology or oil rig engineering, for example) is necessary. AI is now seen as a serious career choice, but still, these skills take years to learn: the PhD and many years’ industry experience needed for most AI roles do not come overnight.

Meanwhile, a generation of scientists – who have spent the last 20 years in a lab patiently waiting for their meticulously designed neural network to work its way through months of data – are suddenly finding themselves in high demand.


September 15, 2017  10:46 AM

Augmented Intelligence: the key to more effective customer journey mapping

Brian McKenna Profile: Brian McKenna

This is a guest blogpost by Vera Loftis, UK managing director, Bluewolf

The availability of real-time customer data and the increasing capability of companies to analyse it and create actionable insights are revolutionising the way companies interact with their customers. They are also transforming how and when colleagues with varied skill sets across different teams work together.

As our ability to make sense of customer data increases exponentially, marketing experts are collaborating far more closely with data analysts, scientists and IT teams than they have in the past.

In this area, augmented intelligence (AI) is set to assume ever greater importance. AI harnesses the power of information technology to enhance human intelligence. It helps companies to understand their customers’ habits, behaviours and actions to deliver a better overall customer experience. This in turn drives growth, efficiency and innovation.

Against this background, many of the world’s biggest brands need to examine multiple touchpoints across different parts of a customer’s journey. By looking at the journey holistically, and at what points customers engage or don’t engage with communications, companies can make an impact and optimise each unique ‘customer moment’. Moving forward, data scientists and IT professionals will perform a critical role in delivering high-quality, actionable customer insights.

The customer opportunity

There is a great opportunity for marketers to work closely with data analysts to understand their customers better. With such a raft of data available, the key is to make sense of it and learn more about what customers want. By understanding what products and services an individual wants, and when and through which channels, marketers can maximise engagement with customers and drive sales.

Customer journey mapping and its relevance

This is where customer journey mapping can play such a key role. Customer journey mapping is defined as the process of documenting a customer’s experiences with an organisation, across all major and minor touchpoints. Often done graphically, it can be a complex but very valuable activity that offers a holistic approach to understanding a customer’s experience. The more you use journey mapping, the better you can understand customers’ motivations and behaviours.
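
As a minimal illustration of the documentation step, journey mapping starts from something like this: raw touchpoint events grouped per customer and ordered in time. The channel names and fields below are invented for illustration.

```python
# A minimal sketch of assembling customer journeys from raw touchpoint events.
# Event fields and channel names are hypothetical.
from collections import defaultdict

events = [
    {"customer": "C1", "ts": "2017-09-01T10:00", "touchpoint": "email_open"},
    {"customer": "C1", "ts": "2017-09-03T14:20", "touchpoint": "web_visit"},
    {"customer": "C2", "ts": "2017-09-02T09:05", "touchpoint": "store_visit"},
    {"customer": "C1", "ts": "2017-09-05T16:45", "touchpoint": "purchase"},
]

# Group per customer, in chronological order (ISO timestamps sort as strings).
journeys = defaultdict(list)
for e in sorted(events, key=lambda e: e["ts"]):
    journeys[e["customer"]].append(e["touchpoint"])

for customer, path in journeys.items():
    print(customer, " -> ".join(path))
# C1 email_open -> web_visit -> purchase
# C2 store_visit
```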

So how does AI fit into this picture?

AI can undoubtedly help organisations map the customer journey and, in doing so, understand and engage with them better. Augmented intelligence allows us to make sense of massive amounts of structured and unstructured data to get a clear picture of customer attitudes, motivations, and behaviours. If we can understand and then predict how data can benefit our customers, we can better serve them at a future moment of need.

For example, IBM and Salesforce’s recent partnership includes a way for IBM Watson and Salesforce Einstein to connect and enable deeper customer engagement. Watson can connect AI to weather through IBM Weather Insights. If, say, an insurance company knew that hail was about to hit, then through Salesforce connectivity it could simply notify customers to get their cars into the garage. It’s that quick and actionable.

What’s next?

Through artificial intelligence and machine learning, marketers will soon be able to scale and automate campaigns at a level that we haven’t been able to before. There are still a lot of things that need to be automated in creative and content marketing, but that’s the future. Data is the most important resource organisations have in delivering exceptional customer experiences. Working together across departments, companies need to focus on understanding the changing dynamics of their customers by mapping the customer journey and engaging them better with AI. By focusing on how they can use data and intelligent systems, they can understand customers better. This will help them deliver more relevant, timely and better products that increase customer satisfaction while upholding brand standards and maximising sales.


September 5, 2017  11:50 AM

Against the grain: three business apps CEOs putting in a challenge

Brian McKenna Profile: Brian McKenna

One of the advantages of my role as Computer Weekly’s business applications editor is that I get to have fairly in-depth conversations with US CEOs visiting London. These are background interviews since CW steers clear of giving a platform for suppliers to sell to our readers.

In this blogpost are gathered excerpts from three of these briefings. All three CEOs are tackling their industry space from a relatively fresh angle, building on modern IT. There’s no endorsement here: similar technologies are always available. And the cliché of “disruption” is surely boring. Nevertheless, these are thoughtful types and, as CEOs, it is their job to do strategy: they own that in a way no other executive does or can.

Frank Bien is the CEO of Looker, a business intelligence company based in Santa Cruz, California.

Bien’s narrative of business intelligence goes like this. There have been three waves of BI, he says. The first was “the big monolithic stacks: Business Objects, Cognos and MicroStrategy. What you got there were complete systems. And you spent a lot of time ‘manicuring the lawn’. By that I mean databases were slow, and were built to do transactions, not to do analytics. So, when you wanted to ask a question, you had to reorganize the data physically”. And that, he maintains, became rigid and inflexible.

The second phase was one of “blowing up that stack about seven years ago. And there were small tools raining out of the sky to do things separately, like data preparation or visualization. And, as vendors we said to customers: ‘you put all that together’”. This, in his view, is the era of Qlik and Tableau.

“At the same time, there was a revolution in data infrastructure, with technologies coming out of Google and Facebook, and so on. Then the cloud happened, with Amazon Redshift, Azure, and so on, and it became trivial to store everything. So, that second wave of BI tools was evolving while there was a complete revolution underneath it, and the tools did not catch up.

“And so there is a third wave, where there is a re-constitution of a complete platform, but working in this new data world”. Which is where, needless to say, Looker comes in, in his view.

“I think the BI tool is dead. The big trend is that people are creating a separate class of application. If you think about who is being successful in data, it is not Tableau or Qlik, it’s companies like New Relic, AppDynamics, and Anaplan. These are all SaaS products that are providing a consolidated view, based on monitoring, to a knowledge worker, within workflow. Looker is a tool to build those applications: like AWS billing analysis or a Google marketing attribution application”.

Bien also takes a robust line on the fashion for artificial intelligence/machine learning. “It is parlour tricks. It is interesting to provide AI on really specific business problems, like ‘be predictive on my sales pipeline and tell me what leads would be most interesting to convert’. But the attention it is getting now is [excessive] and the main thing is to have access to reliable data, otherwise your predictive modelling will be false”.

Embedded BI

Stephen Schneider, CEO of Logi Analytics, based in McLean, Virginia, leads a supplier that also takes an embedded approach to BI. He takes the view that the business intelligence and analytics market has become commoditized. Logi embeds its analytics into software, including ad-hoc applications.

“We do believe every company is becoming a software company, so that represents a growing opportunity for us, as applications become more analytical”.

Connected planning in the cloud

Anaplan – mentioned by Bien – is a UK company, founded in York by Michael Gould, that has a base in San Francisco as well. Its relatively new CEO, Frank Calderoni, says its “connected planning”, cloud-based software is well positioned to challenge SAP, Oracle, and Microsoft.

Calderoni was a senior finance professional at Cisco and at Red Hat, where he was CFO. In the finance organization, as a user, he had, he says, the usual experience of using Oracle, SAP and Microsoft. “That’s all that’s been available for decades. They are costly, complex and resource heavy. Having now the ability to work with something as flexible and agile as Anaplan — it is a whole new world”.

On Calderoni’s account, Anaplan is being used beyond finance and sales, considered departmentally, to other functions, like supply chain and HR, as well as in a more enterprise-wide way. He gives as an example HP using the software to take its sales forecasting down from four months to one week, but then extending use from sales to financial forecasting. And he cites supply chain use cases, especially in retail. EAT is a UK customer that uses the application to track what foodstuffs are being bought more than others “so they can stock the right food in the right place at the right time”. Louis Vuitton is also using the software to do similar supply chain analysis, to decide what products are most heavily in demand.

Calderoni says the biggest issue companies now have is how they make decisions in a digital context. “There is a higher level of competition now that means they require information that is not readily available from planning tools they might have had in the past. That’s how we get in”.


August 21, 2017  12:25 PM

Anti-money laundering – four big factors that contribute to compliance failure

Brian McKenna Profile: Brian McKenna

This is a guest blogpost by Partha Sen, CEO and co-founder, Fuzzy Logix

Money laundering was back at the top of the agenda recently when, on 26 June, the European Union’s Fourth Anti-Money Laundering Directive came into force requiring European member states (including the UK) to update their respective money laundering laws and transpose the new requirements into local law.

Primarily affecting those in the gambling sector – whose levels of laundering for criminal activity were considered “of concern” to the EU – the overall impact of this 4th AML Directive will be a requirement to “know your customers” better and enhance your AML and financial crime framework with more effective prevention and detection controls.

With customer numbers, data and transaction volumes increasing exponentially year on year, “knowing your customers” better has become a hugely mathematical and complex task.

As long ago as November 2009, Forrester published a research report entitled ‘In-Database Analytics: The heart of the predictive enterprise’. The report argued that progressive organisations “are adopting an emerging practice known as ‘in-database analytics’ which supports more pervasive embedding of predictive models in business processes and mission-critical applications.” And the reason for doing so? “In-database analytics can help enterprises cut costs, speed development, and tighten governance on advanced analytics initiatives”. Fast forward to today and you’d imagine that in-database analytics had cleaned up in the enterprise? Well, while the market is definitely “hot”, it appears that many organisations have still to see the need to make a shift.

And that’s despite the volumes of data increasing exponentially since Forrester wrote its report meaning that the potential rewards for implementing in-database analytics are now even higher.

Given we can deliver our customers analysis speeds between 10 and 100 times faster than if they were to move the data to a separate application outside the database, we have a ‘hard metric’ that is very compelling in helping us convince prospects of the value of in-database analytics. It’s what gives us confidence that the shift to in-database analytics as the standard for data analysis is a question of time rather than choice. Quite simply, the volumes of data that are increasingly being created mean that the only way to process the data and find analytical value is by doing so within the database. But, as ever, real-world examples are the best way to illustrate a point, so let’s take an unusual one: money laundering.

Banks have a vested interest in ensuring they stay compliant with the anti-money laundering (AML) regulations in place for catching and reporting suspicious activity. The regulations have been in place for several years, and it is likely that most large banks have systems and processes in place to track and catch money-laundering activity. Despite this, we still hear about cases where the authorities have fined reputable banks for their failure to implement proper AML solutions. Very recently, in May 2017, Citigroup agreed to pay almost $100m and admitted to criminal violations as it settled an investigation into breaches of anti-money laundering rules involving money transfers between the US and Mexico, its two largest markets. Earlier this year, Deutsche Bank was fined $630m by British and US authorities for allowing wealthy clients to move $10 billion out of Russia. So why are current implementations and best practices not keeping up?

Four impediments to compliance

Let’s look at four big factors that contribute to compliance failure in the realm of anti-money laundering:

  • The number of non-cash transactions around the world is exploding. Think about applications like Uber, Venmo, PayPal, Xoom and various international money transfer services (almost every large retailer is now in the MoneyGram business). The number crossed 426 billion in 2015, and is growing at over 10% annually. Conclusion: there’s just a lot more hay to find the needle in.
  • Legacy solutions were largely rules-based. A programme would run through the list of rules, and if any transaction matched a rule, an STR (suspicious transaction report) would be raised, meaning the transaction needed deeper analysis – often with a human in the loop. These rules were good at some point in time, but there are several issues with any rules-based system: (a) rules are static, and require constant periodic refreshes. For instance, is a $4,000 deposit to a rural bank typical of a crop cycle, or an anomaly? (b) rules do not often transcend cultural/economic/geographical boundaries – e.g., a $400 hotel room in Washington D.C. may be normal, but in rural India it might warrant a second look. Finally, (c) rules present an inherent bias and don’t look at the data to determine what is ‘normal’ and therefore what pops out as ‘abnormal’. (A minimal sketch of such a rules-based check appears after this list.)
  • The analysis of AML is mathematically complex. Incidence of fraud is a rare event, compared to the data that needs sifting through, and catching AML transactions involves looking deep into the history of the account. Each time the system flags a transaction for a deeper look, a human has to investigate further. False positives mean a lot of resource and time investment for the bank. Couple this with the explosion in the number of transactions and the multiple channels through which non-cash transactions occur today, and you can see why systems that were put in place over a decade ago are in serious need of an upgrade.
  • The sheer scale of the human capital cost of AML is mind blowing. Many financial institutions I have spoken to have described the ‘human army’ required to both manage and resolve the escalations of ‘suspect transactions.’ And on top of the army of staff already in employment, each false positive means a further increase to constantly review these new transactions. A cursory search online reveals one institution currently has 95 openings in the AML space, with another looking to optimise its ‘army’ through automation. Large institutions are typically dealing with a true-positive rate of 2-3% (number of SARs filed out of the total number of cases looked at). Adding a solution that can increase this true positive rate by another few percentage points, then, is almost a slam dunk. Offering greater productivity without needing to increase the size of the army, or offering to maintain productivity levels with fewer staff, means they’re all ears!
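
Here is the promised sketch of a legacy rules-based screen. The thresholds and country codes are invented for illustration; real AML systems run hundreds of periodically refreshed rules. It shows why such systems are static: every rule is a fixed predicate, regardless of what the data says is normal.

```python
# A minimal sketch of rules-based transaction screening. Rules and thresholds
# are hypothetical -- the point is that each rule is a static predicate that
# must be manually refreshed, which is the weakness described above.
from dataclasses import dataclass

@dataclass
class Txn:
    account: str
    amount: float
    country: str

RULES = [
    ("large_cash_equivalent", lambda t: t.amount >= 10_000),
    ("high_risk_corridor",    lambda t: t.country in {"XX", "YY"}),  # placeholder codes
]

def screen(txn):
    """Return the names of rules the transaction trips; any hit raises an STR."""
    return [name for name, rule in RULES if rule(txn)]

for txn in [Txn("A1", 12_500, "GB"), Txn("A2", 400, "XX"), Txn("A3", 50, "GB")]:
    hits = screen(txn)
    if hits:
        print(f"STR raised for {txn.account}: {hits}")
```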

With the money at stake for money launderers (according to the UN, $2 trillion is moved illegally each year), the efforts taken by criminals to avoid detection have become incredibly sophisticated. Organised crime is continually seeking ways to ensure that the process of money laundering is lost within the huge amounts of financial data that are now being processed on a daily, hourly and even by-the-minute basis. Their hope is that, because so much data is being processed, it is impossible to spot where illegal money laundering activity is happening. And they’d be right, if you had to take the data out of the database for analysis.

Achieving a good degree of accuracy in a typical large bank means having to analyse billions of data points from multiple years of transactions in order to identify irregularities in trends and patterns. A traditional approach would require moving the data to a dedicated analytical engine, a process that could take hours or days or more depending on the volume of data. This makes it impossible to perform the analysis in a manner that can provide any real value to the organization. With in-database analytics, there is no need to move the data to a separate analytical engine, and the analysis can be performed on the entire dataset, ensuring the greatest possible coverage and accuracy.
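
A toy contrast of the two approaches, using Python’s built-in SQLite in place of a data warehouse: the anomaly statistic is computed inside the database, so only the flagged rows ever leave it. The table and the three-times-average rule are invented for illustration.

```python
# In-database analytics in miniature: the scan, the aggregation and the join
# all run inside the database engine, and only flagged rows are returned --
# no bulk export of transactions to a separate analytical engine.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txns (account TEXT, amount REAL)")
con.executemany("INSERT INTO txns VALUES (?, ?)",
                [("A1", 100), ("A1", 110), ("A1", 90), ("A1", 5_000),
                 ("A2", 20), ("A2", 25)])

# Flag transactions more than 3x the account's own average.
flagged = con.execute("""
    SELECT t.account, t.amount
    FROM txns t
    JOIN (SELECT account, AVG(amount) AS avg_amt
          FROM txns GROUP BY account) a
      ON a.account = t.account
    WHERE t.amount > 3 * a.avg_amt
""").fetchall()
print(flagged)   # [('A1', 5000.0)]
```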

One-nil to the good guys

One of our largest customers is a leading retail bank in India. It was experiencing a rapid growth in data volumes that challenged its then-current AML processes. By not needing to move the data for analysis, we were able to analyse billions of data points over a number of years (3+) of historical data to identify possible irregularities in trends/patterns, and do so in under 15 minutes – faster than any other method. By not working to a pre-defined set of analytical rules and by letting the data ‘speak for itself’, it is possible to uncover patterns which occur naturally in the data. As a result, the bank is seeing an improvement of over 40% in terms of incremental identifications of suspicious activity and a 75% reduction in the incidence of ‘false positives’. In short, good guys 1, bad guys 0 because in-database analytics is having a very real impact on the bank’s ability to spot where money laundering is happening.

I’m pretty sure that when Forrester published its report into in-database analytics towards the end of the last decade, it didn’t envisage the fight to combat money laundering being a perfect case study for why in-database analytics is a no brainer when handling large volumes of data. But in today’s world, with ever increasing data volumes and the requirement to understand trends and insight from this data ever more urgent, in-database analytics has now come of age. It’s time for every organization to jump on board and make the shift; after all, if it can help defeat organized crime, imagine what it could do for the enterprise?


August 4, 2017  10:08 AM

Blockchain beyond the realm of financial services

Brian McKenna Profile: Brian McKenna

This is a guest blog post, in which Jody Cleworth, CEO of MTI [Marine Transport International], explains why blockchain is, in his view, good for businesses in the real economy, including the shipping industry.

Blockchain is best known as the technology behind the bitcoin cryptocurrency. Although this is its most common association, it is important for businesses to recognise that blockchain is a standalone technology, with its value reaching beyond the realm of cryptocurrencies and financial services.

Blockchain is unique in its ability to provide a secure, yet transparent, platform that enables permissioned access to private transactions and digital records. Even more impressively, it can do this in real time and can be used in just about any industry. To name but a few, shipping, manufacturing, and real estate are beginning to recognise and exploit the promise of blockchain: that of absolute trust and accessibility.

Simply put, blockchain is a way of recording data: that is, anything that needs to be individually recorded and certified as having happened. It is a type of digital ledger that can be used to record a wide range of different processes. It is of most value when it comes to recording the steps in a process that involve a wide range of parties, where responsibility is handed off at each point. If a supply chain is blockchain enabled, absolute certainty can be created at each step of the process. Placing a digital ‘block’ on the ledger indicates that a process has taken place, and all parties have the ability to view this. It is similar to using a Google document, which is shareable and visible to all those inputting data.
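
A minimal sketch of that ‘digital block’ idea in Python: each record carries the hash of the one before it, so altering any earlier entry breaks every hash that follows. Real blockchains add distribution across nodes, consensus and signatures on top of this chain.

```python
# A toy hash-chained ledger. Each block's hash covers its record and the
# previous block's hash, so tampering with any entry is detectable.
import hashlib
import json

def add_block(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    for i, block in enumerate(chain):
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected or (i > 0 and block["prev_hash"] != chain[i - 1]["hash"]):
            return False
    return True

ledger = []
add_block(ledger, "container 417 loaded at Felixstowe")
add_block(ledger, "container 417 received at Rotterdam")
print(verify(ledger))              # True
ledger[0]["record"] = "tampered"
print(verify(ledger))              # False -- the chain exposes the change
```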

In the business world, blockchain technology is becoming increasingly attractive for tracking the movement of items through supply chains which link a variety of organisations. A container logistics company, for example, is obligated to interact with stakeholders such as shipping lines and port authorities, and so there are many points at which accountability could become an issue. In order to ensure that containers are shipped and received in a good state, a safeguard is needed. Blockchain ensures that each party within the supply chain takes responsibility for their own dataset, but also shares their data with all other parties.

Early proofs of concept in the shipping industry have attested that greater efficiency is achieved by supporting supply chains with blockchain technology. Shipping companies, suppliers and distributors have recognised that ‘smart contracts’ can decrease costs and increase profitability, as a result of capitalising on the links between supply chain actors. The use of blockchain in the shipping industry serves as a promising example for all sectors that engage in a series of interrelated processes, as blockchain increases collaboration between parties, heightens visibility, and reduces resistance.

In addition, the distributed nature of a blockchain database makes it harder for hackers to attack. Central servers that store data are easy targets for cyberattacks, but the blockchain model does not include this. Instead, the data is copied identically across each ‘node’ in the network, meaning that if one computer is successfully targeted, it does not result in business devastation. Ultimately, in order to hack a blockchain database, simultaneous access to every copy of the database would be needed.

Blockchain has the ability to empower a multitude of industries to better adapt to the digital economy. Embracing this technology allows executives to assert greater control over their transactions, whilst protecting the privacy of terms and conditions between parties. Essentially, blockchain creates a secure business environment, where reliable transactions become a reality without the need for a centralised authority.


August 3, 2017  1:49 PM

How to get the best business value out of data scientists

Brian McKenna Profile: Brian McKenna

This is a guest blog post by Jane Zavalishina, CEO, Yandex Data Factory

Well-established enterprises like retailers or manufacturing companies now have an abundance of data at their disposal. Unfortunately, merely possessing vast amounts of raw data does not lead directly to increased efficiency or the rapid development of new revenue streams. Instead, everyone must now figure out exactly how to make this data work for them.

Following in the footsteps of the internet giants – Google, Facebook and others – established enterprises are eager to invest in advanced analytics solutions to capitalise on the opportunities that possessing this data presents. To address this, an increasing number of businesses are deciding to bring machine learning in-house – introducing new departments and resources to accommodate. Others are choosing to collaborate with external teams to tackle the task. Regardless of the approach chosen, both bring a new distinct set of challenges to resolve.

The main challenge is revealed in the name of the discipline itself: “data science”. To succeed, enterprises need to merge two very different worlds – an economics-driven business and a scientific, data-driven department. While the cultural and organisational clashes are hard to avoid, they are rarely foreseen.

Here are a few things to keep in mind:

Business sets the goals, and that may not be easy

Decision-making in businesses is far from data-driven – authority, persuasion and vision play a significant role. Science, however, is based purely on evidence and experiments. Synthesizing these two approaches is the primary challenge when you start to work with data science.

Businesses will have to learn to formulate tasks based on what they want to predict or recommend, rather than “understanding” or “insights”. Despite being used to explicable measurements and disputable arguments, they will have to learn to work with uninterpretable results by managing them through correctly defined metrics. The task of translating the business problem into a mathematical statement, with precisely defined restrictions, and setting a goal in such a way that actually measures its influence on the business is an art in itself.

For example, if the goal is to improve the efficiency of marketing offers, it would be incorrect to task a data scientist with investigating the top ten reasons for refusals or delivering an innovative way to segment the audience. Instead, they should be tasked with building a “recommender” system that will optimise a meaningful sales metric: the margin of the individual order, the number of repeating purchases, or increase in sales of a specific product group. This must be identified beforehand based on the business’ strategic priorities.

The whole business will be affected, not just one department

When it comes to data science, it is wrong to think that a new means to solve a set of given tasks will be handled by one single department. The introduction of highly precise “black box” models will eventually affect company culture, the organisational structure and approaches to management – all of which must be taken into account to succeed.

One aspect of this is data access. Businesses should work on ways to establish easy sharing of data, which is too often siloed within each individual department.

Another is the ability to experiment. The only way to estimate the success of a machine learning model is by putting it into practice and measuring the effect as compared to the existing approach, isolating all other factors. However, running such A/B tests in an established business may not always be straightforward. For example, retailers aren’t going to just stop promotional activities in a few stores to have a baseline group for comparison. Preparing, organising and strictly following such procedures to measure the effect of data science applications on a business is part of the job, and so will need to be integrated into the company DNA.
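
A minimal sketch of such a measurement, assuming synthetic sales figures for a control group of stores and a group using the model’s recommendations, compared with Welch’s t-test:

```python
# A toy A/B comparison: control stores versus stores using the model. The
# sales figures are synthetic; a real test would fix group assignment, sample
# size and the significance threshold before it starts.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
baseline = rng.normal(loc=100.0, scale=10.0, size=40)   # control stores
treated  = rng.normal(loc=104.0, scale=10.0, size=40)   # stores using the model

t_stat, p_value = stats.ttest_ind(treated, baseline, equal_var=False)  # Welch's t-test
lift = treated.mean() / baseline.mean() - 1
print(f"lift: {lift:.1%}, p-value: {p_value:.3f}")
# Ship the model only if the lift is positive and the p-value clears the
# threshold agreed before the test started.
```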

Last, but not least: managing blame. For decision automation to succeed, top management must support the initiative. However, once the models are put to work, it is impossible to place responsibility on any individual person any more – say, the one who used to sign off the demand forecast. Instead, roles should be redefined, and new ways of assigning responsibility and controlling the results introduced.

Every data science project is a small research project

Leading an in-house data science team also requires a different approach to project management. Data science differs from software development, or other activities where the project can be broken down into pieces, and the progress easily monitored. Building machine learning models involves a trial-and-error approach, and it is impossible to track whether your model is, say, 60% or 70% done.

Businesses, on the other hand, are used to the following process: planning ahead, tracking the progress and looking at tangible intermediate results. With data science, it is no longer viable to plan a whole project and expect smooth movement towards the end goal. Project managers should instead be carefully planning quick iterations, keeping in mind that failure is always a possible outcome. Adjustment to a no-blame culture should be also part of the job.

As always with science, you cannot expect to make a discovery according to a plan. But managing a lab efficiently will deliver a product of predictable quality, and not just exploratory research. To succeed businesses will need to understand how to make this work for them, rather than transferring old project management guidelines to a new team.

You will have to get comfortable with a lack of understanding

Data science is a complex discipline, involving major chunks of statistics, probability theory, and, in essence, years of rigorous studies. Sadly, managers have very little chance of acquiring this knowledge quickly.  

When acting as team leads, or as internal clients for data science, managers will need to get comfortable with this lack of technical knowledge. Instead, they should articulate success through results, according to defined experimental procedures put in place to check the quality of the models.

You cannot do everything in-house

Build or buy is the question often faced by organisations. Many businesses with stronger IT teams are very tempted to run their own data science department, sometimes not fully comprehending the challenge and the implications.

If data science is very close to the core business, that is probably the right choice – it would be hard to imagine Amazon outsourcing its recommendation engine. But when you are an offline retailer, a factory or a bank, having data science in-house often equals building a separate R&D business. This presents new business challenges, like competing with the internet giants when hiring rare talent. What’s more, many soon discover that either the financial resources are unavailable, or it is strategically incorrect to add a new data science line of business.

At the other extreme, full outsourcing means lacking expertise on a matter of major importance to modern businesses. The answer is a combination. What businesses should strive to do is build teams that can tackle the challenges that are close to the core business, or require deep domain knowledge, while competently outsourcing the rest.

Developing this “data science purchasing” capacity is of the utmost importance in order to stay on top of the game. You should be able to formulate tasks and establish business cases, easily test and compare the quality of external algorithms, perhaps even running several at a time, and switch to the best one without any disruption.

Summing up

The role of a data scientist within a business is to access data and “extract value” from it, based on the objectives they have been given. The truth is, data scientists are not magicians. They must be given the right metrics, huge amounts of data and the time to experiment in order to deliver the required outcomes, with failure always remaining possible.

This scientific approach is unusual in business, and understanding how to work with data scientists as a business leader is key to success.


August 1, 2017  3:49 PM

Advancements in automation: why AI won’t replace the human touch

Brian McKenna Profile: Brian McKenna

This is a guest blog post by Larry Augustin, CEO, SugarCRM

The way we engage with technology is changing constantly, and there’s hardly a day that goes by when we don’t hear about Artificial Intelligence (AI) and the ‘rise of the machines’. While many AI-related technologies are still in their infancy, it’s a fact that workplace automation is already here across all sectors. From farming to fashion, businesses are using automation to reduce costs and pick up the pace of production. A report from PwC earlier this year claims that in the next 15 years, 30% of jobs in the UK could be affected by automation, with 46.4% of manufacturing jobs and 56.4% of storage jobs being automated by the early 2030s. Despite these high figures, PwC argues that for the most part, automation will enhance productivity, cut costs and add real value to sectors that depend on intelligent human interaction.

AI requires machine learning. And, to learn, machines need data – lots of it. But humans are also becoming better at their jobs because of the wealth of data out there that gives us a universe of information whenever, wherever. We need only turn to the likes of Google to see how data-driven applications have re-invented the way we exist, and all this free information has improved the way businesses operate. But getting the right data fast can be a cumbersome process – after all, not all data is good data. The good news is that help is at hand and technology is advancing to help automate the lengthiest and most mundane of data-entry tasks.

The evolution of CRM

Technology has always progressed at a dizzying rate. In the 1960s, Gordon Moore made the observation that became known as Moore’s law, which predicted that the overall processing power of a computer would double every two years. Whatever the precise rate, the basic premise remains the same – that technology by its very nature will continually advance.

Customer Relationship Management (CRM) systems have not followed Moore’s prediction. For a long time, legacy CRM was stuck as an online record-keeping system that was good for generating reports and telling you how effective you were last quarter, but didn’t do a lot to help you improve in the future. Finally, with more data, analytics and automation, the evolution of CRM has enabled businesses to streamline data to get optimum results. It has evolved from a somewhat cumbersome platform to a productivity enabler, providing relationship intelligence and bringing in data from outside sources – something we’ve been proud to bring to the market in our own recent launch of Sugar Hint. Ultimately, it’s about automating mundane tasks, allowing humans to focus on tasks that only humans, at present, can do.

The importance of human interactions

Despite the rapid growth of technology, a report by Gartner outlines that 89% of companies now believe that the customer experience they can offer is the most important benchmark by which they are judged. It’s a figure that encapsulates the vital role that humans still have in the workplace. Although data can be effectively sorted by a machine in under a millisecond, humans are the ones who implement technology and give it meaning.

In the context of AI and chatbots managing customer interactions, there are some benefits in using computer power to handle high-volume tasks – as in e-commerce. But I’d argue that in high-value situations – car sales or investment banking, for example – people still want to deal with people. Where the technology comes in is in supporting humans, giving them the information they need, intelligently, at the time they need it. It complements the job, rather than replacing it.

What will become of humans?

The rise of automation will reinvent traditional business practices. We don’t need to sit and cower in fear of a global AI takeover, we need to understand how automation will enable us as humans to do more of what we do best: being human. Automating menial tasks such as data sorting and emailing will enable businesses to re-work the goalposts of traditional jobs. Technology will be the enabler that allows employees to focus on their own skillsets. A report by Accenture maintains that 80% of businesses believe that technology will enable a more fluid workforce. It would seem that if we get automation to do the groundwork, employees, employers and customers will all see positive benefits.

History is littered with prophecies that automation will make humans redundant. We need only turn to the First Industrial Revolution, when the textile Luddites were concerned that machines would take their jobs and steal their livelihoods. But with hindsight we can see these advancements only enhanced workers’ lives and increased productivity. As we now enter the Fourth Industrial Revolution, this fear has once again come into play. Although there will be a marked increase in automation, technology will aid businesses and employees – making for a better employee and a more successful business.


