CW Developer Network


September 17, 2019  6:30 PM

RPA series – Blue Prism: RPA has an interconnectivity X-factor

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Jon Theuerkauf, chief customer officer at Blue Prism – and this story forms part of a series of posts on CWDN that dig into the fast-growing topic of Robotic Process Automation (RPA).

Blue Prism’s RPA technology, powered by Microsoft SQL Server and built on the Microsoft .NET framework, automates applications and supports any platform including mainframe, Windows, Windows Presentation Foundation (WPF) and Java.

Blue Prism does not use code and does not hack into the backend of applications; it uses credentials to access applications just as a human employee would.

TechTarget defines Robotic Process Automation as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

Theuerkauf writes as follows…

Today, RPA adoption is skyrocketing: it is being used by business operations teams within large-scale enterprise environments to replicate process work that was previously done by humans, via automated ‘digital workers’. I used this technology with great success at BNY Mellon, where it provided the means to connect our disparate IT systems, helping to enable strategic, business-led change that goes far beyond merely automating tasks.

What we’ve dubbed ‘connected-RPA’ is transformational because it’s quick to implement and non-invasive to existing systems. This means digital workers can access and read the user interface to interoperate with and orchestrate any third-party application – just as an employee would – so no costly alteration is required. It also means no lengthy design, coding, build, test and rollout projects. Essentially, the non-technical user is in charge.

The universal connectivity capabilities built into digital workers also enable them to consume intelligent automation skills through a broad ecosystem of complementary technologies – so they adapt and learn from humans and other systems. This, coupled with the increasingly intelligent way that digital workers operate, is now being harnessed by business users to integrate with and utilise any new or existing technology application.

Business users, who really understand their processes, simply create automated processes in a “Visio-like” designer; these are then used by an easy-to-control digital worker to manage a task, using the same applications. These users can also easily access leading-edge cloud, AI, cognitive and other capabilities through drag-and-drop intelligent automation skills. This enables incredibly fast iterations when creating innovations to keep organisations ahead.
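
To make that UI-level interoperation concrete, here is a minimal, hypothetical sketch of the pattern in Python, using the open source pyautogui library. It is illustrative only – Blue Prism digital workers are configured in a visual designer rather than coded, and the screen asset and field names here are invented:

    # A hypothetical sketch of UI-level automation: the 'digital worker' drives
    # an application through its screen, keyboard and mouse, just as a human
    # employee would -- no backend access, no API, no change to the target system.
    import pyautogui

    def copy_invoice_number_to_crm(invoice_number):
        # Find the CRM's invoice field by its on-screen image (an assumed asset).
        field = pyautogui.locateCenterOnScreen("invoice_field.png")
        if field is None:
            raise RuntimeError("Invoice field not visible; escalate to a human")
        pyautogui.click(field)               # focus the field, as a mouse would
        pyautogui.typewrite(invoice_number)  # type the value, as a keyboard would
        pyautogui.press("enter")             # submit the form

    copy_invoice_number_to_crm("INV-2019-0917")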

Blue Prism extends these automation capabilities with its Digital Exchange (DX) marketplace, comprising applications that are accessible to customers, resellers and technology partners. In this way, users can incorporate cognitive and AI technologies into their automation projects without incurring undue risk.

RPA in practice

Here are 10 examples of what can be – and is being – achieved by integrating connected-RPA with AI and other cognitive technologies:

  1. Collaborating with AI and machine learning tools for multilingual, automated e-mail processing for inbound customer inquiries and e-mail triage
  2. Anti-money laundering prevention in conjunction with blockchain technologies and business process management tools
  3. Automated case handling and resolution for insurance claims
  4. Automating the extraction of unstructured data
  5. Using AI tools to gauge sentiment, intensity and mood for customer support – then automatically escalating requests to a customer representative
  6. Working in conjunction with process mining tools to automatically extract historical records/data research and business intelligence analytics
  7. Dynamically and automatically verifying legal compliance on complex contracts
  8. Collaborating with optical character recognition and computer vision technologies to automatically verify identity for loan processing, or transform secured faxes into searchable, text-embedded formats
  9. Automatic, real-time translation in virtual meetings
  10. Automatically connecting chatbots and humans for financial transactions, human resources, or customer service requests

Long-term sustainability

The key goal for process automation is to deliver longer-term value at scale, so automations must be underpinned by a clear vision linked to strategic imperatives – with business users driving it, IT supporting it and other key stakeholders championing it. Automations must then be carefully planned, modelled, designed and centrally pooled for re-use. It’s also best either to make processes more efficient prior to automating, or to redesign them during the design phase. Organisations will then be able to re-imagine processes and organisational structures with a wider range of more innovative, impactful RPA use cases across the enterprise.

Ultimately, organisations will automate more – by expanding on their early efforts, more intelligently and strategically with insight, and they’ll automate better – by building and running higher quality process automations, faster and more easily for the long term.

Theuerkauf: RPA puts the non-technical user in charge.

September 17, 2019  6:29 PM

RPA series: Enate – why initial RPA projects fail… & how to succeed

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Kit Cox, CEO at Enate – and this story forms part of a series of posts on CWDN that dig into the fast-growing topic of Robotic Process Automation (RPA). 

Enate is a service orchestration SaaS platform that manages any vendor technology alongside humans to create a digital workforce. Enate handles work so bot exceptions and failures are flipped back to humans, which eliminates the need to hard-code exceptions. That means businesses can use the time saved to get on with more automation elsewhere in the business.

TechTarget defines Robotic Process Automation as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

Cox writes as follows…

Are you prepared for your first RPA project to fail?

A report published by Ernst & Young (EY) predicts that between 30% and 50% of all initial RPA projects fail. EY cites reasons such as forgetting about IT infrastructure, targeting RPA at the wrong processes and applying traditional delivery methodologies. Another key reason is treating robotics as a series of automations rather than an end-to-end change programme.

It’s well-documented there’s a goldmine of value to be found in using RPA to eliminate mundane tasks from human jobs, liberating humans to use our innate skills to do better (and more enjoyable) work.

But deploying RPA is more than setting up a Centre of Excellence (CoE). How RPA slots into your existing processes can be tricky. You must change the way you think about your processes beyond the IT department. The very notion of a ‘job’ will have to change as tasks are mapped to core skills across a resource pool of human competencies and technology capability. Let me explain.

Intelligent building blocks

Think of all the different skills your business requires to deliver a complete service. Let’s say a customer requested some information about an online order. Your employees would need a range of skills and competencies, from reading and understanding the email request, manually checking and updating a database, and having a conversation, to solving problems to provide a good customer experience. Imagine all those skills represented by different coloured toy building bricks, used to create a model of your process. RPA can replace ‘skill’ bricks relating to following basic rules (reading, copying and clicking). But if you remove and replace one brick from within an entire model, it’s likely you’ll weaken – or even break – the overall structure.

There’s also the challenge of ensuring all the different ‘tasks’ involved in delivering a process are completed seamlessly in sync with all your different technologies – alongside clunky legacy systems – and have oversight of all the workers involved.

Without overarching governance and workflow, RPA will merely deliver sub-processes within a service slightly more efficiently. Then, every time a process or technology changes, there’s greater complexity and risk. Businesses need a base plate to build the blocks of the future of work upon.
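
As a hedged illustration of that ‘base plate’ idea (hypothetical code, not Enate’s actual platform or API), the orchestration pattern can be sketched in a few lines of Python: work flows to a bot where a skill exists, and any bot failure is flipped back to a human queue rather than hard-coded as an exception:

    # A minimal, hypothetical sketch of service orchestration with a
    # human-in-the-loop fallback. Not Enate's actual API -- it just illustrates
    # the pattern: bot failures are routed to a human queue, not hard-coded.
    from dataclasses import dataclass, field

    @dataclass
    class WorkItem:
        task: str
        payload: dict
        history: list = field(default_factory=list)

    human_queue = []   # humans pick these up in a worklist UI
    bot_skills = {}    # bot capabilities, keyed by task name

    def register_bot(task_name, handler):
        bot_skills[task_name] = handler

    def dispatch(item):
        handler = bot_skills.get(item.task)
        if handler is None:
            human_queue.append(item)   # no bot skill exists: humans own the task
            return
        try:
            handler(item.payload)
            item.history.append("completed by bot")
        except Exception as exc:       # bot exception: flip back to a human
            item.history.append("bot failed: %s" % exc)
            human_queue.append(item)

    register_bot("update-address", lambda p: p["address"].strip())
    dispatch(WorkItem("update-address", {"address": " 10 Example Street "}))
    dispatch(WorkItem("approve-refund", {"order": 1234}))  # no bot skill -> human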

And then… there’s scaling

Of course, your first RPA project might not fail. But even if the initial projects succeed, getting to a point of scale is still a challenge.

Deloitte’s most recent report found only 3% of companies had achieved substantial scale (based on 478 respondents), citing process fragmentation as the biggest barrier.

Phil Fersht, CEO and chief analyst at HFS Research, calls this struggle to move beyond pilot exercises and project-based experimentation a “serious point of failure for the whole industry”.

The three top RPA vendors (as named by Gartner: Automation Anywhere, Blue Prism and UiPath) have all partnered with Enate to enable RPA at scale, recognising the need for orchestration – and for cross-platform integration and connectivity to ensure RPA’s success at scale.

RPA is forcing us to reassess how our businesses are structured.

Organisational change will have to happen first to realise RPA’s full potential. RPA is just one of the building blocks of a new way of working. Unless businesses crack vendor silos and achieve cross-platform governance across systems and people, RPA will not meet the industry’s – or your – expectations.

Cox: RPA is forcing us to reassess how our businesses are structured.

September 17, 2019  6:26 PM

Oracle Open World: Autonomy, Acceleration, Assistants & Applications

Adrian Bridgwater

Oracle used its Oracle Open World 2019 conference to produce its customary barrage of product announcements.

Amidst this volley of customer wins, platform enhancements, application enrichments and datacentre expansion celebrations, the Computer Weekly Developer Network team has naturally gravitated towards the more data-centric elements being tabled here.

As we have noted here already, the drive to provide autonomous intelligence has been key to the way Oracle has now presented its product set.

Although the company is (obviously) primarily known for its database product technology, the autonomous database is just one element of the total Oracle autonomous vision.

Autonomous Linux

Oracle’s Autonomous Linux is, as the name suggests, an operating system (OS) with a degree of autonomy. It handles patching, updates, integration requirements and security-layer ‘chores’ automatically, performing these tasks without any user involvement.

These are, of course, the more tedious types of task that are prone to human error. They are also tougher to manage in large-scale cloud environments.

This distribution of Linux is shipped alongside a new product called the Oracle OS Management Service.

“Oracle Autonomous Linux builds on Oracle’s proven history of delivering Linux with extreme performance, reliability, and security to run the most demanding enterprise applications,” said Wim Coekaerts, senior vice president of operating systems and virtualisation engineering, Oracle.

Coekaerts underlines the fact that all developer instances created on the autonomous operating system will be ‘current and continuously up-to-date’ because the operating system patches itself while it’s running.

Beyond the autonomous database

Speaking at the day two keynote, Andrew (Andy) Mendelsohn, executive vice president for database server technologies at Oracle, explained how Oracle is working to provide not just a database, but a ‘data platform’ that will allow firms to work with digital assets in new ways.

“You’re all out there gathering data from transactional systems, sensors, telemetry systems etc. That’s good. But the bad news is that that data is out of control, so your company’s data scientists cannot get access to the data they need to run the business. The Oracle approach to autonomous database provision provides the patching, scaling and tuning so that we do everything for customers once you just tell us the ‘shape’ of the data i.e. how much compute CPU power they need, how much storage they need and so on,” said Mendelsohn.

Suggesting that data scientists still need more than this, Mendelsohn says that the Oracle Autonomous Data Platform also provides another step towards getting things done more intuitively. Companies need to give data scientists a data catalogue that empowers them to choose which data they need and when… and this is where Oracle is working.

“But doing all the ETL to move all that data from one place to another still takes a lot of time, so… Oracle is also shouldering that chore with its platform i.e. ETL is converged to become part of the Oracle Autonomous Data Platform itself,” added Mendelsohn.

Developer tools

Oracle also provides a set of free developer tools so programmers can build and deploy new data-driven applications faster on Oracle Autonomous Database.

Developer tools include Oracle Application Express (APEX) for low-code web application development, SQL Developer Web for user interaction with the database, Machine Learning Notebooks, REST interfaces for access and publishing of database data… and drivers for all popular programming languages.
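
As a hedged sketch of the driver route (the cx_Oracle Python driver is real; the connection alias, credentials and table below are invented, and Autonomous Database connections normally also require a downloaded credentials wallet), querying the database from code might look like this:

    # A minimal sketch of querying Oracle Autonomous Database from Python via
    # the cx_Oracle driver. The user, password, "mydb_high" alias and telemetry
    # table are hypothetical placeholders.
    import cx_Oracle

    connection = cx_Oracle.connect(user="demo_user",
                                   password="demo_password",
                                   dsn="mydb_high")  # alias from the wallet's tnsnames.ora
    cursor = connection.cursor()
    cursor.execute("SELECT sensor_id, reading, recorded_at "
                   "FROM telemetry WHERE reading > :threshold",
                   threshold=42)  # named bind variable
    for sensor_id, reading, recorded_at in cursor:
        print(sensor_id, reading, recorded_at)
    cursor.close()
    connection.close()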

The Oracle Cloud Developer Image includes tools, choice of development languages, OCI Software Development Kits (SDKs) and database connectors.

As part of its developer-centric news, Oracle will be providing educators and students with the Oracle Cloud Free Tier through Oracle Academy, Oracle’s philanthropic educational programme. The company says these resources are free for as long as they are used.

“We are thrilled to offer Always Free Oracle Autonomous Database and Oracle Cloud Infrastructure,” said Oracle’s Mendelsohn. “This enables the next generation of developers, analysts and data scientists to learn the latest database and machine learning technologies for developing powerful data-driven applications and analytics on the cloud.”

We’ve talked autonomy and acceleration, so what about assistants and applications?

Assistants, the digital kind

Oracle Customer Experience (CX) Cloud has been augmented with new digital assistants for sales, customer service and marketing professionals, new data enriched B2B sales capabilities, and new industry solutions for telecom and media, financial services, and public sector.

Oracle CX is an integrated set of applications across marketing, sales, service and commerce. The Oracle Digital Assistant is available across the entire Oracle CX Cloud, enabling sales, customer service, and marketing professionals to use voice commands to quickly and efficiently drive outcomes and actions.

With a full baker’s dozen worth of announcements to work through every day, it’s always tough to know where to focus first at an Oracle event. What we do know is that Oracle is a database company, but it’s also a programmer/developer company and that’s over and above any actual mention of Java.

Code lives on data, databases need code to direct and manage them… databases are code, the role of the data-developer continues to expand.


September 17, 2019  5:01 PM

RPA series: Digital Workforce – does RPA integration beat APIs?

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Niko Lehtonen, head of training at Digital Workforce – and this story forms part of a series of posts on CWDN that dig into the fast-growing topic of Robotic Process Automation (RPA).

Digital Workforce is a service provider specialising in intelligent automation, RPA and AI services on an industrial scale.

TechTarget defines Robotic Process Automation as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

Lehtonen writes as follows…

Both Robotic Process Automation (RPA) and an Application Program Interface (API) effectively provide a channel of communication to move information between two or more separate pieces of software – thus making them both useful tools for enterprise application integration (EAI). But when should organisations use RPA and when should they use an API to integrate their systems?

In short, RPA offers a far cheaper and quicker solution to the same problem that an API looks to solve. RPA’s competitive advantage over an API is its light structure; it offers a virtually no-code alternative to an API. This can provide huge advantages for businesses looking to digitally transform, as it means they can implement new technologies without having to make changes to their existing IT systems. APIs, on the other hand, can be limiting for system-heavy organisations in this respect, as implementing and integrating new systems often requires them to reinvent their existing IT.

Rather than replace the old, RPA acts more like a human worker would, creating a communication bridge between the two separate IT systems. A software robot can be taught a simple process in just a few days, which means organisations can realise the benefits of the integration quickly. The same software robots can also be quickly reconfigured to integrate different systems. In addition, using RPA to integrate two systems requires much less skill, whereas implementing an API would normally require a specialised systems engineer.
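
A hedged, side-by-side sketch of the two integration styles (the URLs and page elements are invented): the API route calls an endpoint directly, while the RPA route drives the same web UI a human would use:

    # Two ways to fetch an order status from the same system (illustrative only).
    import requests                              # the API route
    from selenium import webdriver               # the RPA/UI route
    from selenium.webdriver.common.by import By

    def via_api(order_id):
        # Robust to UI changes, but the endpoint must exist and an engineer
        # must build and maintain the integration.
        r = requests.get("https://shop.example.com/api/orders/%s" % order_id,
                         timeout=10)
        r.raise_for_status()
        return r.json()["status"]

    def via_rpa(order_id):
        # No changes to the target system: the robot replays a human's steps
        # through the UI, so it is quick to teach but breaks if screens change.
        driver = webdriver.Chrome()
        try:
            driver.get("https://shop.example.com/orders")
            driver.find_element(By.NAME, "order_id").send_keys(order_id)
            driver.find_element(By.ID, "search").click()
            return driver.find_element(By.CLASS_NAME, "order-status").text
        finally:
            driver.quit()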

RPA vs API

This is not to say that RPA is always the right option – far from it. RPA is often limited when it comes to integrating more nuanced and complex systems and this is where an API may be of better use. In addition, the advent of cloud computing has meant that as organisations adopt more cloud-based applications, they require more stringent data management programs and these programs are being rehashed as APIs.

Ultimately, there are pros and cons to using both RPA and APIs. IT departments will require a plethora of different tools in order to optimise their organisation’s IT environment and this means evaluating which tool will best fit the integration.

There’s a time and place for everything and different solutions work in different situations. System integration is great when the solution needs to be robust. But in many cases RPA is enough, where full system integration would be overkill.


September 17, 2019  4:33 PM

What to expect from Percona Live 2019

Adrian Bridgwater

The Computer Weekly Developer Network team is off to Percona Live 2019… or, to give it its full title… Percona Live Open Source Database Conference Europe 2019.

Staged in the salubrious surrounds of the Hilton Amsterdam Airport Schiphol, the company is rolling out a weighty programme of sessions designed to attract not just database professionals (DBAs, sysadmins and so on), but also the newer emerging roles in this space such as data-developer, data analyst and data architect.

Over and above the techies, Percona says it wants to welcome C-suite execs who need to get a deeper understanding of the data layer in their IT stack.

Percona itself is a provider of enterprise-class support, consulting, managed services, training and software for MySQL, MariaDB, MongoDB, PostgreSQL and other open source databases in on-premises and cloud environments.

The company has stated that this is the one conference for the ‘entire’ open source database community, especially for MySQL, PostgreSQL, MongoDB and MariaDB.

With such a grand claim to respond to and deliver on… the theme of the 2019 conference is “Any Open Source Database, Any Platform, Any Time.”

The event will be organised around these six tracks:

  1. Performance & Scalability
  2. Public, Private, & Hybrid Clouds & Everything In Between
  3. Building Large, Scalable, & Secure DB Deployments
  4. Hot Topics, New Features, & Trends You Should Know About
  5. Monitor, Manage, & Maintain Databases At Scale
  6. How to Reduce Costs & Complexity with Open Source DBs

Keynote addresses are being delivered by presenters from Microsoft, Oracle, HashiCorp and, obviously, from Percona.

Database teams from AWS, Booking.com, Slack, Facebook, Salesforce, Rakuten, Rackspace, OpenCorporates, Altinity, Pythian Group and others are also sharing thoughts on database design, management and administration techniques via a mix of panel discussions and technical presentations.

Additional topics of particular interest include monitoring, automation, migration, security, compliance and… wait for it… it had to be there… Kubernetes.

The new version of PostgreSQL will also be discussed at the event.

New distro

The news is actually out before the event proper i.e. Percona has launched its own enhanced distribution of PostgreSQL for customers that want enterprise-class support for applications running on this popular open source database.

The Percona Distribution for PostgreSQL is based on v11.5 of PostgreSQL and provides a supported distribution of the database alongside open source tools for managing database instances and ensuring data is secure, available and backed up for recovery.

Tools include a third-party extension for rebuilding PostgreSQL database objects – plus extensions that provide in-depth session and/or object audit logging through the standard logging facility in PostgreSQL.

There’s also pgBackRest, a backup tool that replaces the built-in PostgreSQL backup offering. pgBackRest can seamlessly scale up to handle the largest database workloads. It uses streaming compression to help companies minimise storage requirements and delta restores to significantly lower the time needed to complete a restore.

“Companies are creating more data than ever and they have to store and manage this data effectively,” said Peter Zaitsev, co-founder and CEO of Percona. “Open source databases are becoming the platforms of choice for many organisations and Percona provides the consultancy and support services that these companies rely on to be successful. Adding a distribution of PostgreSQL alongside our current options for MySQL and MongoDB helps our customers leverage the best of open source for their applications as well as get reliable and efficient support.”

Multi-database environments

The new distribution adds to Percona’s list of supported open source database editions, which includes the company’s own versions of MySQL and MongoDB.

With the prominence of multi-database environments, the addition of PostgreSQL provides customers with enterprise support, services and consulting for all their open source database instances across multiple distributions and across both on-premises and cloud deployments.

“While at its core the PostgreSQL server has many features, alone it does not meet all the workloads of a modern enterprise,” explains Matt Yonkovit, chief experience officer at Percona. “As a result, the community has built many extensions and add-ons to fill in the gaps. However, community solutions often overlap or provide similar functionality to one another. We developed the Percona Distribution for PostgreSQL as enterprise companies need a way to guarantee that the tools, add-ons, and extensions they require are going to work together, are easy to deploy, are certified, and supported by a trusted vendor,” added Yonkovit.

Database management appears to be growing in prominence in the modern IT stack… so the CWDN team will be aiming to dig in and gorge on as much of this topic as possible. Oh… alongside a good helping of the always amazing Dutch ‘kroketten’ croquettes, obviously.

The event hashtags are #PerconaLive #Percona #PostgreSQL #opensource


September 16, 2019  7:23 PM

RPA series: Appian – creating an integrated robotic workforce with RPA

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Sathya Srinivasan, principal solutions architect at Appian — and this story forms part of a series of posts on CWDN that dig into the fast-growing topic of Robotic Process Automation (RPA).

Appian is known for its low-code development platform and the company used its Appian World 2019 conference to stand back and say that it is also capable of being an RPA-centric ‘orchestrator’ of systems, which can now be connected to the apps built on the Appian low-code platform.

TechTarget defines Robotic Process Automation as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

Srinivasan writes as follows…

RPA has become an invaluable tool for organisational digital transformation.

It has evolved rapidly over the past ten years and is now capable of automating human-machine interactions in the enterprise. For IT operations, RPA can effectively automate repetitive tasks such as running system check-ups or executing scripts on a regular basis. And now, that capability is starting to extend much further, both externally and to new services that IT needs to accommodate.

What is important is identifying and understanding where the high-volume, low-value manual work is and where human input can be removed; initially by automating the task using RPA, and then ideally modifying or removing the process altogether.

Across industries, including those which are highly regulated such as banking, insurance and pharmaceuticals, RPA is proving its effectiveness in bridging the gap between legacy and modern business apps.

RPA fills the API-EAI chasm

Where legacy platforms with limited to no API connectivity are prevalent, RPA is an increasingly important element of Enterprise Application Integration (EAI). If you were to take a medical record checking website as an example, where you can enter practitioner details to find information such as medical licenses, there is no easy system-to-system API that can execute successful integration. This would be an ideal use case for robots to mimic human behaviour and extract information without the need for the layer above (the user) to move out of the context.
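
A hedged sketch of that use case (the registry site, form parameter and page structure are invented): the robot submits the same search a human would and reads the licence number off the rendered page:

    # A hypothetical sketch of a robot extracting practitioner licence details
    # from a public lookup site that offers no system-to-system API.
    import requests
    from bs4 import BeautifulSoup

    def lookup_licence(practitioner_name):
        # Submit the same search form a human would fill in.
        page = requests.get("https://medregistry.example.org/search",
                            params={"name": practitioner_name}, timeout=10)
        page.raise_for_status()
        # Read the result off the rendered page, as a human's eyes would.
        soup = BeautifulSoup(page.text, "html.parser")
        cell = soup.find("td", class_="licence-number")
        if cell is None:
            raise LookupError("No licence record found")
        return cell.get_text(strip=True)

    print(lookup_licence("Dr Jane Example"))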

Companies may find that their RPA engine has all the features it needs to build their process.

However, they may well encounter scenarios requiring logic that their RPA vendor does not have, which raises some challenges. Generally, RPA engines should have all the facilities to define business processes and business logic with the necessary structures in place to handle a wide variety of use cases – such as activities, events, gateways and integration services.

If this covers the majority of use cases, then the rest can be achieved by providing functionality to extend the RPA engine’s capability in a scalable, maintainable fashion, with guarantees that upgrades do not break the functions being extended.

Doing so in a low-code fashion may increase the value of adoption.

RPA evolution path

RPA will mature in the coming years and most software will end up incorporating a degree of built-in automation or provide an RPA add-on to handle multiple scenarios. In addition, vendors will find ways to identify patterns and recommend methods to automate them. This is where machine learning and RPA will work closely together.

Work is already underway for RPA vendors to integrate with machine learning concepts that allow the bots to learn from their activities.

This currently falls under two main headings:

  • Identifying where a process can be modified
  • Recommending and/or performing the modification (based on the level of confidence)

The next stage in the evolution of RPA is to learn from already successful processes across the globe and accelerate innovation through collaboration – particularly in automating the approval step, where RPA should recommend the likely steps to consider, the exception scenarios, the handling of those exceptions and how to involve humans in the loop.

This will accelerate utilisation of bots beyond the skillset of the individual developer.

RPA has already established itself as a vital tool of digital transformation for CIOs and CTOs, with proven use cases. In the coming years RPA will see further technological improvements, the automation of more manual processes and, ultimately, a move towards a genuinely digital workforce harmoniously integrated with both humans and artificial intelligence.

Appian’s Sathya Srinivasan: RPA is evolving fast; we need platforms that can capture its efficiencies.


September 16, 2019  7:14 PM

Oracle: the autonomous database drives the ‘evolution’ of the DBA

Adrian Bridgwater

The Computer Weekly Developer Network team is in San Francisco for Oracle Open World 2019.

As day one kicks off, we start with a resounding message from Oracle central: this is the point at which we are witnessing the birth of the autonomous database… and that allows us to also welcome the ‘evolution’ of the DataBase Administrator (DBA).

Oracle’s autonomous database capabilities create automatic services to perform many of the ‘chore’ functions that a DBA would typically have had to perform themselves in the past.

These are actions such as database patching, database provisioning and the whole grunt work of the nightly build, plus all the other data service requests that developers would normally put upon the database function inside any business.

That’s just the start — DBAs will also typically have to look after schema creation and schema management, capacity planning (knowing what data resources should be available at any one time for the application needs inside the organisation), then there’s also performance management and the whole chore of moving data from one place to another when needed.

Architectural advance

Oracle’s autonomous database vision provides most (if not all) of these functions automatically.

This subject was initially tabled at Oracle Open World 2019 by Maria Colgan in her role as master product manager at Oracle. Colgan’s core responsibility is creating material and lectures on the Oracle database and the best practices for real-world deployment scenarios.

DevOps provided a means of bringing the Dev developer function closer to the working practices of the Ops operations function, but this could ultimately elevate the DBA world into far more programmer-centric circles.

Oracle’s Colgan explains that what this means for the poor old (all too often unloved) DBA is that his or her role has become more architectural.

With many of the Extract, Transform & Load (ETL) chores handled automatically, the DBA can focus on more creative aspects of work such as working with systems architects to get data to the right place for the right application at the right time… and so… the actual build of applications can be more tightly coupled to the database layer services that exist to serve it.

“Traditionally, creating a database management system required a team of experts to custom build and manually maintain a complex hardware and software stack. With each system being unique, this approach led to poor economies of scale and a lack of the agility typically needed to give the business a competitive edge,” said Oracle’s Colgan.

Colgan points to Oracle’s approach, which aims to enable businesses to safely run a complex mix of high-performance transactions, reporting and batch processing using autonomous controls.

Previously, DBAs would approach scalability issues by provisioning for ‘worst-case scenario + 10%’ in order to cover every possible eventuality once a production environment went live. Now, through Oracle’s approach to autonomous database management, that type of function is auto-scaled.

Oracle Autonomous Database itself comes in two versions optimised to meet specific requirements for online transaction processing and data warehousing.

Eradicating (human) DBA error

Both versions use Machine Learning (ML) to drive automation to tackle human error and manual management.

The company also notes that Oracle Autonomous Transaction Processing Service allows much simpler application development and deployment for enabling real-time analytics, personalisation and fraud detection.

Here too we can find integrated ML algorithms to drive automatic caching, adaptive indexing, advanced compression and optimised cloud data-loading to deliver unrivalled performance.

So it’s not the end of the DBA or complete redundancy for the wider world of sysadmin functions that work alongside the database function in any business.

It is, as Colgan enthuses, a chance for DBAs to take a far more data-programming approach to their roles. As an ex-developer who (by her own admission) shunned the world of the cubicle and found enterprise database structures more enticing, she would know.

DBAs are getting funkier, get used to it.

Oracle’s Colgan: DBA life is where it’s at (and it’s getting better all the time).

September 13, 2019  4:15 PM

RPA series: IFS – Up, up and RPAway!

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Bob De Caux, director of AI and RPA at IFS.

IFS is known for its industrial cloud software deployments with a specific focus in areas including Field Service Management (FSM), Enterprise Asset Management (EAM) and Enterprise Resource Planning (ERP) systems.

TechTarget defines Robotic Process Automation (RPA) as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

De Caux writes as follows…

The fastest-growing segment of the global enterprise software market, Robotic Process Automation (RPA) has proven itself to be a highly effective tool for transforming and streamlining the way organisations operate. But amid the hype surrounding the technology, we shouldn’t overestimate its capabilities.

In our own experience as a software company, we’ve found that RPA works well for our customers as a quick and easy way to integrate our software with other systems, especially in terms of replicating the GUI steps that users go through when moving data into or out of our software.

However, RPA providers have realised that going ‘end-to-end’ on a business process across multiple applications is very difficult. Despite clever advances that use AI-driven image recognition to improve flexibility, replicating the actions a user takes through a GUI is not robust to changes in the underlying software in the same way that interacting through an Application Programming Interface (API) would be.

Therefore, RPA will increasingly become about orchestration, maintaining simplicity by encapsulating API-driven interactions with software that perform clearly defined tasks, as pieces that can be easily dragged and dropped into RPA workflows.

The onus now will be on software companies (such as us) to create APIs that are easily consumable by RPA software, but it also creates an opportunity to provide more advanced process automation logic deeper within our products, which can take advantage of usage data and AI in a way that external RPA engines can never access. Through this approach, external RPA engines and internal process automation engines will start to work together more efficiently.
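
A hedged sketch of that orchestration model (all names invented): each API-driven interaction is encapsulated as a named, reusable task, and an RPA workflow simply composes those tasks in order:

    # A minimal, hypothetical sketch of RPA-as-orchestration: each clearly
    # defined, API-driven interaction is wrapped as a named task, and a workflow
    # is an ordered composition of those tasks. Endpoints and names are invented.
    import requests

    TASKS = {}

    def task(name):
        # Register a function as a drag-and-droppable workflow piece.
        def register(fn):
            TASKS[name] = fn
            return fn
        return register

    @task("fetch-work-order")
    def fetch_work_order(ctx):
        r = requests.get("https://erp.example.com/api/work-orders/%s" % ctx["order_id"],
                         timeout=10)
        r.raise_for_status()
        ctx["order"] = r.json()

    @task("notify-technician")
    def notify_technician(ctx):
        requests.post("https://fsm.example.com/api/notifications",
                      json={"order": ctx["order"]}, timeout=10).raise_for_status()

    def run_workflow(step_names, ctx):
        # The orchestrator only sequences encapsulated tasks; it never scrapes a GUI.
        for name in step_names:
            TASKS[name](ctx)

    run_workflow(["fetch-work-order", "notify-technician"], {"order_id": "WO-42"})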

RPA combos are key

The evolution of RPA, therefore, hinges on how it’s utilised alongside other technologies.

RPA forms a key part of ‘intelligent automation’ along with Business Process Automation (BPA) and AI. Speaking from personal experience, we drive intelligent automation within our own software using a combination of effective BPA and machine learning to optimise process effectiveness, which, while not traditionally thought of as RPA, has proven to be a highly effective method of automation.

As RPA adoption increases, we expect to see machine learning increasingly used on RPA-driven workflows to optimise processes across multiple systems.

When looking to RPA, businesses must look beyond just short-term efficiency gains and understand its potential as part of the broader technology ecosystem.

In doing this, they will be best placed to harness its full benefits, and go up, up and RPAway!

IFS’s De Caux: a ‘new RPA’ via a combination of BPA and machine learning to optimise process effectiveness.


September 12, 2019  6:50 PM

Why multi-tenancy architecture is the new normal

Adrian Bridgwater

Sumo Logic CEO Ramin Sayar delivered his keynote at the company’s Illuminate 2019 conference to reinforce his assertion that there is ‘a massive intelligence gap’ throughout modern technology stacks caused by the pressure that comes from ‘continuous release’.

The world of always-on web-connected cloud-services driving mobile-optimised applications is one where users want applications to be continuously available, continuously connected, continuously updated, continuously integrated and continuously secured… and so, continuously released.

This drive for continuous release depends upon the existence of cloud computing.

Consequently, it has implications for computing architectures, which must now be built for a world that is inherently multi-tenant on the public cloud.

CEO Sayar says that as we move to continuous data, we need to provide users with what we might also call continuous choice. Users want choice over what device, what platform and what application they use and so on… plus they also want choice over where their data resides.

Noisy complexity

“But choice creates ‘noise’ and complexity. Most enterprises today are using up to somewhere around 30 tools to perform monitoring… and the problem is only made worse when we find out that the monitoring tools themselves don’t interconnect,” said Sayar.

The antidote to this predicament is what Sumo Logic likes to call continuous intelligence… and it’s a platform that is true multi-tenant SaaS in nature.

What this leads us to is the suggestion that multi-tenant cloud architectures are essentially the new normal.

“Multi-cloud [built on inherently multi-tenant instances] and open source technologies, specifically Kubernetes, are hand-in-hand dramatically reshaping the future of the modern application stack,” said Kalyan Ramanathan, vice president of product marketing for Sumo Logic. “For companies, the increased adoption of services to enable and secure a multi-cloud strategy are adding more complexity and noise, which current legacy analytics solutions can’t handle. To address this complexity, companies will need a continuous intelligence strategy that consolidates all of their data into a single pane of glass to close the intelligence gap,” he added.

But for this level of intelligence to have substance, we need to think about how it is applied at the coalface of the modern IT stack.

Continuous Intelligence (Cont-Intel)

Ramanathan says that for Continuous Integration (CI) and Continuous Delivery (CD) to run effectively, they need Continuous Intelligence (let’s shorten it to Cont-Intel) throughout the entire Software Application Development Lifecycle (SDLC).

He also notes that his firm is ‘fundamentally different’ to other machine-generated log analytics players (Splunk being the obvious suspect) because Sumo Logic was born in and of the cloud (Splunk dates back to 2003 – Sumo Logic is a newer cloud era baby born in 2010)… so, he claims, the company is natively attuned to the service-centric consumption model of the cloud.

Why multi-tenancy matters

When we first started talking about multi-tenancy in cloud computing, people were concerned. Customers considered putting all their data on a shared multi-tenant cloud datacentre server, next to somebody else’s data, to be pretty irregular.

Today that view has changed.

Today the industry understands that multi-tenant architectures proliferate, typify and exemplify the nature of the modern web with its continuous ubiquity.

But there’s another lesson here… and it’s one for DevSecOps (i.e. DevOps delivered from a core security stance).

“Looking at the state of the market as we have detailed in our Continuous Intelligence report, I would emphasise to you that we say that Sumo Logic is natively multi-tenant… and here’s why multi-tenancy matters, especially to DevOps. When a Denial Of Service attack happens in a company, it doesn’t happen in a pocket, it happens in many places throughout the stack. So being able to correlate issues right throughout the stack becomes fundamentally important,” said Ramanathan.
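
As a hedged illustration of what correlating issues ‘right throughout the stack’ can mean in practice (invented log records, not Sumo Logic’s implementation), events from different layers can be grouped into one incident when they share a source and a time window:

    # A minimal, hypothetical sketch of cross-stack correlation: events from the
    # load balancer, application and auth layers are grouped into one incident
    # when they share a source IP within a five-minute window.
    from collections import defaultdict
    from datetime import datetime, timedelta

    events = [
        {"ts": datetime(2019, 9, 12, 10, 0, 1), "layer": "lb",   "src": "203.0.113.9",  "msg": "rate limit hit"},
        {"ts": datetime(2019, 9, 12, 10, 0, 3), "layer": "app",  "src": "203.0.113.9",  "msg": "5xx spike"},
        {"ts": datetime(2019, 9, 12, 10, 0, 4), "layer": "auth", "src": "203.0.113.9",  "msg": "login flood"},
        {"ts": datetime(2019, 9, 12, 11, 30, 0), "layer": "app", "src": "198.51.100.7", "msg": "timeout"},
    ]

    WINDOW = timedelta(minutes=5)

    def correlate(events):
        incidents = defaultdict(list)
        for e in events:
            # Bucket by source IP and coarse time window: one attack, many layers.
            bucket = (e["src"], int(e["ts"].timestamp() // WINDOW.total_seconds()))
            incidents[bucket].append(e)
        # An incident is only interesting when it spans more than one layer.
        return [group for group in incidents.values() if len(group) > 1]

    for incident in correlate(events):
        print("correlated incident across layers:",
              sorted({e["layer"] for e in incident}))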

Ramanathan urges us to think about DevSecOps in slightly different terms. While he’s okay with the term itself, he suggests that it might perhaps more accurately have been labelled DevOpsSec… or perhaps even DevOps_Sec.

The difference is, this would be DevOps that is essentially concerned with operations, but still has a sharp eye focused on security issues when they occur.

Times have changed, cloud perceptions have moved on and our notion of how we apply analytics to an increasingly complex cloud must (arguably) also move on too. This is where Sumo Logic is at and it’s a subject that is (again, arguably) spiralling upwards in the modern IT stack of today’s now ever-so-continuous company.

Sumo Logic CEO Ramin Sayar: with choice comes ‘noisy complexity’.


September 12, 2019  3:17 PM

RPA series: ABBYY – RPA shortcomings in automating content-centric processes

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Bill Galusha in his role as director of product marketing for RPA & Data Capture at ABBYY.

ABBYY specialises in AI-based technologies and solutions for content and process intelligence.

TechTarget defines Robotic Process Automation (RPA) as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.

Galusha writes as follows…

Finding a simple technology solution for a complex problem is not always easy. For example, every industry and business department still relies heavily on documents in digital or printed format, coming from different communication channels such as email, fax, mobile and desktop scanners.

Over the past few years, companies have been using robotic process automation (RPA) systems to streamline these document-related processes, i.e. the input of data into systems. In practice, RPA deployments have revealed significant shortcomings related to unstructured content and inefficient processes. However, when RPA is combined with complementary technologies that turn unstructured data into structured data and give insight into processes, it and its new class of digital workers show great promise.

Limitations with unstructured content

RPA has been successful at automating structured, repetitive tasks associated with processing data across many systems. However, robots can quickly hit a roadblock any time the process involves content that is not structured.

That is very concerning since the overwhelming majority of enterprise content is unstructured. This includes documents, contracts, handwritten information, email communications, and images. Think about some of the common business functions involving back-office financial processes or onboarding a new customer.

These processes always have unstructured content associated with them, which puts an enormous strain on operations and employees to perform tasks like routing documents to the right business groups, verifying the accuracy of data, trying to make a business decision based on information scattered across many different documents, etc.

In order for the new digital workforce to become more efficient, RPA tools are leveraging complementary technology that can learn to recognise, read, and understand all documents regardless of their format – digital and image, structured and unstructured.

Information validation

By leveraging the right visual perception, understanding, and insight skills, a robot can fully automate a process involving documents using technologies like intelligent optical character recognition (OCR), natural language processing (NLP), and machine learning (ML), to digitise content, classify documents, extract data and validate information with little human involvement.

Vision, understanding and insight skills are central to content intelligence (IQ), where AI is applied to:

  • Easily automate and analyse content-centric processes involving images, documents, texts, and communications.
  • Analyse and learn from content to make more informed decisions.
  • Incorporate machine learning to perpetually improve and streamline business processes.
  • Measure, sustain and adapt digitised content processes over time.
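
A hedged, minimal sketch of such a pipeline (pytesseract is a real open source OCR wrapper; the keyword classifier and regular-expression extractor below are toy stand-ins for trained ML and NLP models):

    # A hypothetical content-IQ pipeline: digitise, classify, extract, validate.
    # pytesseract wraps the Tesseract OCR engine; the classifier and extractor
    # here are simple placeholders for trained models.
    import re
    from PIL import Image
    import pytesseract

    def digitise(image_path):
        # OCR the scanned page into plain text.
        return pytesseract.image_to_string(Image.open(image_path))

    def classify(text):
        # Placeholder for a trained document classifier.
        return "invoice" if "invoice" in text.lower() else "other"

    def extract(text):
        # Placeholder for NLP-driven field extraction.
        match = re.search(r"Invoice\s+No[.:]?\s*(\w+)", text, re.IGNORECASE)
        return {"invoice_number": match.group(1) if match else None}

    def validate(fields):
        return fields.get("invoice_number") is not None

    text = digitise("scanned_invoice.png")   # hypothetical input file
    if classify(text) == "invoice":
        fields = extract(text)
        print("valid" if validate(fields) else "route to human review", fields)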

Smarter RPA bots

To grow and expand the use of RPA within an enterprise, robots must become smarter – able to interpret and understand unstructured content and turn it into actionable structured information. Think of RPA as the starting point for intelligent automation; with the addition of content IQ skills, digital robots can then have varying degrees of intelligence.

For reference, content IQ is defined as a class of enabling technologies that help digital workforces understand and create meaning from enterprise content. Content IQ provides the ability to automatically extract all relevant information from documents and break down the processing of content into easy-to-use, easy-to-consume technology. It can be leveraged directly within an automation solution, targeting the activities and skills required to solve specific business problems.

Connecting content IQ skills is easy: they can be integrated into RPA platforms like UiPath and Blue Prism using pre-built connectors. Content IQ skills are like services – they are very extensible, so enterprises, partners and software vendors can integrate them with any business process automation system.

When combined with RPA, content IQ bridges the cognitive gap digital software robots have when it comes to unstructured content.

Insight into RPA process workflows

Another common challenge associated with RPA is that it requires process transparency. If you don’t take a process-first approach, it can be like building a LEGO set without directions. Knowing where to start with RPA as well as the benefits and risks to expect before, during, and after implementation are all elements that need consideration.

Realistic expectations should be set.

Adding process intelligence (IQ) to automation efforts gives the foresight needed to improve the success of an RPA project. From selecting the right process to automate with cost analyses, to quantifying the effects of processes downstream and monitoring digital worker performance post-implementation, process IQ technology acts, in a way, as a crystal ball for how RPA will behave.

RPA lit the fuse that is accelerating digital transformation. Now, with the addition of content IQ and process IQ, organisations can achieve significantly better results with their initiatives.

Galusha: RPA requires process transparency.