CW Developer Network


October 22, 2019  6:06 PM

Splunk .conf keynote notes, quotes & anecdotes

Adrian Bridgwater

What’s not to love about log file management and trace analytics?

Nothing apparently.

Splunk managed to attract what was claimed to be somewhere over 11,000 attendees to Las Vegas for its .conf 2019 conference and exhibition… and the geek-cognoscenti were there in force to dig deep into all things logs and machine data.

CEO Doug Merritt explained that the event is now in its 10th year and that the gathering has grown from just 300 attendees in its first year.

Merritt thinks that this is ‘just the beginning of the first data age’ and that the future has everything in it, from printed foods to flying cars to mission(s) to Mars.

“In the [near] future, there will be those companies that seize the opportunity to do [productive] things with data… and those that simply don’t exist,” said Merritt.

But as positive as the drive to data is, CEO Merritt says that there is a real need to ‘liberate’ data, because so much of it is locked into systems, devices and machines.

The ‘shape’ of data

We know that some data is static and stable; we know that data sits in many different data sources and repositories; we know that data works on different time-scales, from milliseconds to months. We know that some data is structured, some is unstructured… and some is even semi-structured. Further, we know that some data is streaming data and that some data sits in a more orchestrated and federated state than other data.

Merritt says that Splunk has been engineered to be able to deal with all those data sources and work to provide the right level of analytics.

So for all of this data, Splunk is aiming to differentiate its offerings for organisations that need to use data in lots of different ways. The company is also looking to provide new levels of infrastructure-based analytics and to offer rapid adoption packages based upon recognised industry use cases.

“As you solve increasingly complex [data] problems, you [the attendees] will be showing the rest of the business what is really possible with new insights in the data that underpins company operations,” said Merritt.

Splunk VP of customer success and professional services Toni Pavlovich took the stage to showcase a use case at Porsche. This section of the keynote featured a demonstration of Splunk Augmented Reality (AR), a technical development that helps engineers fit parts and equipment by presenting video inside the headset viewing experience.

The road to unbounded learning

Splunk CTO Tim Tully brought us out of the customer session (lots of high fives and people shouting ‘awesome!’ – you get the picture) to explain where Splunk is building, buying and investing in new functionalities and capabilities.

“In terms of what Splunk is building, the company is pushing for ‘massive scalability and real time capability’ in its platform… and in a form that is usable in mobile form. The Splunk Data Stream Processor is focused on creating data pipelines and learning use cases into live routing of data to Splunk or other public cloud connectors. We’ve seen customers use it as a data routing message bus, which was actually a surprise,” said Tully.

Many teams are still working out how machine learning really works, following the traditional pipeline that runs from raw data to feature engineering to model training to model deployment. Splunk promotes a new approach called ‘unbounded learning’, in which the model continues to learn from the point of deployment onwards.
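To make the contrast concrete, here is a minimal sketch of the learning-after-deployment idea in Python, using scikit-learn’s incremental fitting. To be clear, this is a generic illustration of the concept, not Splunk’s implementation, and the labelled event stream is simulated purely for the example.

```python
# A generic sketch of 'unbounded'/online learning using scikit-learn's
# incremental fitting. NOT Splunk's implementation; the event stream
# below is simulated for illustration only.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()      # supports incremental training via partial_fit
classes = np.array([0, 1])   # all class labels must be declared up front

def event_stream(batches=5, batch_size=32, n_features=4):
    """Simulate batches of labelled events arriving after deployment."""
    rng = np.random.default_rng(0)
    for _ in range(batches):
        X = rng.normal(size=(batch_size, n_features))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy labelling rule
        yield X, y

for X_batch, y_batch in event_stream():
    # Each incoming batch nudges the deployed model rather than triggering
    # a full retrain: the model keeps learning from the point of deployment.
    model.partial_fit(X_batch, y_batch, classes=classes)
    print("batch accuracy:", model.score(X_batch, y_batch))
```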

Tully also talks about what he calls ‘indulgent design’, an aspect of user interface creation that the company has used to create its new Mission Control product. Mission Control has a new colour dashboard presented in ‘dark mode’, with an additional ‘notable events’ screen that allows users to really ‘stare at it’ (Tully’s own words) for as long as they need to get data insights.

Font of (data) knowledge

Splunk Data Sans is the company’s own new font which the company has used to brand itself in a new way. The text itself has an elongated bar and clear disambiguation throughout the character set so that anyone looking at Splunk text will immediately be able to recognise it as Splunk, simply by look and feel.

Tully also explained how Splunk wants to extend its mobile capabilities to provide interactive data dashboards that allow users to address incidents more quickly. The company calls this the ability to ‘Splunk data’… and so uses Splunk in this case as a verb i.e. the ability to drill through and analyse data in a live format on a mobile unit… and there’s an integration with Slack to make that easier.

Overall, Splunk moved from broad CEO messages to specific on-screen presentation layer updates, with accompanying functionality changes, within an hour of the keynote, which is pretty deep… well, spelunking is all about going deep, after all.

 

October 22, 2019  3:42 PM

Splunk Mission Control acts on data ‘at machine speed’

Adrian Bridgwater

Splunk has built new functions into its Security Operations Suite to modernize and unify its Security Operations Center (SOC) product.

Anchored by the newly launched Splunk Mission Control, the Splunk Security Operations Suite is designed to help security analysts to turn ‘data into doing’ (as the marketing spin puts it) in real world operational systems.

The cloud-based Splunk Mission Control connects Splunk SIEM (Splunk Enterprise Security), SOAR (Splunk Phantom) and UEBA (Splunk UBA) products into a single experience for data developers and data analysts.

Combined, these products form the Splunk Security Operations Suite.

“With Splunk Mission Control, customers gain a new, unified SOC experience that supports investigation and search across multiple on-premise and cloud-based Splunk Enterprise and Splunk Enterprise Security instances, ChatOps collaboration, case management and automated response, all from a common work surface,” said Haiyan Song, senior vice president and general manager of security markets, Splunk.

Machine speed response

The company points out one core truth and says that as the volume of security-relevant data continues to grow, so will the importance of technologies that can automate and respond to that data in real-time.

So… the mission is: detection, defence and action on threats at machine speed.

New product announcements include Splunk Enterprise Security (ES) 6.0 as the latest version of Splunk’s flagship security offering. Splunk ES is a security information and event management (SIEM) platform that now benefits from improved asset and identity framework enhancements.

Splunk User Behavior Analytics (UBA) 5.0 is described as a product that enables security teams to build advanced, customized Machine Learning (ML) models for baselining and tracking deviations, based on their security environment and use cases.

Splunk Phantom 4.6 is the company’s security orchestration, automation and response (SOAR) product and it now comes to the mobile phone.

“Phantom on Splunk Mobile allows customers to automate repetitive, manual tasks from the palm of their hand, enabling analysts to focus on mission-critical security threats that fuel security operations. Splunk Phantom 4.6 also introduces new open source integration apps, giving developers easy access to Phantom’s source code to extend SOAR to the unique needs of every individual SOC,” said the company, in a press statement.

Splunk has also announced several new security apps and updates to Splunk ES Content Update, which delivers pre-packaged Security Content to Splunk ES customers. Updates include Splunk Analytics Story Preview, a new Splunkbase app; Cloud Infrastructure Security, new security content which analyses cloud infrastructure environments; and new open source content, including over 30 new open sourced apps for Splunk Phantom.


October 22, 2019  1:31 PM

CI/CD series: What drives continuum in software?

Adrian Bridgwater

Software used to shut down.

Users would boot up applications and wrangle about with their various functions until they had completed the tasks, computations or analyses that they wanted — and then they would turn off their machines and the applications would cease to operate.

Somewhere along the birth-line that drove the evolution of the modern web, that start-stop reality ceased to be.

Applications (and in many cases the ancillary database functions and other related systems) that served them became continuous, or always-on.

Where applications weren’t inherently always-on in the live sense, the connected nature of the mothership responsible for their being would be continuously updating, so whenever the user connected a ‘live pipe’ to the back-end, the continuum would drive forward to deliver the updates, enhancements, security refreshes and other adornments that the app itself deserved.

This, we now know as Continuous Integration & Continuous Delivery (CI/CD).

CI/CD reality

The reality of CI/CD today is that it has become an initialism in and of itself, one that technologists don’t spell out in full when they speak out loud, like API, like GUI… or even like OMG or LOL, if you must.

But as simple as CI/CD sounds in its basic form, there are questions to be answered.

We know that CI/CD is responsible for pushing out a set of ‘isolated changes’ to an existing application, but how big can those changes be… and at what point do we know that the isolated code is going to integrate properly with the deployed live production application?

A core advantage gained through CI/CD environments is the ability to garner immediate feedback from the user base and then (in theory at least) be able to continuously develop an application’s functionality set ‘on-the-fly’, so CI/CD clearly has roots in Agile methodologies and the extreme programming paradigm.

But CI/CD isn’t just Agile iteration, so how are the differences highlighted?

Do firms embark upon CI/CD because they think it’s a good idea, but end up falling short because they fail to create a well managed continuous feedback system to understand what users think?

Does CI/CD automatically lead to fewer bugs? How frequent should frequency be in a productive CI/CD environment and how frequent is too frequent? Can you have CI/CD without DevOps? Is CI/CD more disruptive, or less disruptive, than traditional release practices?

TechTarget reminds us that the GitLab repository supports CI/CD and can help run unit and integration tests on multiple machines, splitting builds to work over multiple machines to decrease project execution times… is this balancing act a key factor of effective CI/CD delivery?
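For illustration, here is a minimal sketch of that test-splitting idea, assuming a deterministic hash-based shard assignment. This is a generic technique, not GitLab’s own scheduling algorithm.

```python
# A minimal sketch of splitting one test suite across N parallel CI runners.
# Hash-based sharding is a generic approach, not GitLab's own algorithm.
import hashlib

def shard_for(test_name: str, total_shards: int) -> int:
    """Map a test to a shard via a stable hash, so every runner agrees
    on the split without any central coordination."""
    digest = hashlib.sha256(test_name.encode()).hexdigest()
    return int(digest, 16) % total_shards

def tests_for_runner(all_tests, runner_index: int, total_shards: int):
    """Return only the tests this runner should execute."""
    return [t for t in all_tests if shard_for(t, total_shards) == runner_index]

suite = [f"test_module_{i}" for i in range(10)]
for runner in range(3):
    print(f"runner {runner}:", tests_for_runner(suite, runner, 3))
```

Because the hash is stable, every runner independently computes the same partition of the suite, which is what makes the parallel execution time roughly divide by the number of machines.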

CI/CD contrasts with continuous deployment (despite the proximity of the terms), a similar approach in which software is also produced in short cycles but is released through automated deployments rather than manual ones… do we need to make this distinction clearer?

How do functional tests differ from unit tests in CI/CD environments… and, while we’re asking, if development teams use CI/CD tools to automate parts of the application build and construct a document trail, then what factors impact the wealth and worth of that document trail?

CWDN CI/CD series

The Computer Weekly Developer Network team now sets out on a mission to uncover the depths, breadths, highs, lows and in-betweens where CI/CD practices and methodologies are today.

In a series of posts following this one we will feature commentary from industry software engineers who have a coalface understanding of what CI/CD is and what factors are going to most significantly impact its development going forward.

We look at how organisations are shifting to continuous integration and continuous deployment to deliver new software-powered functionality to the business. What are the common tools being used? How do organisations get started? What are the pitfalls? How much application is enough to go live with, before continuously building upon it?

These (and more) are the CI/CD questions we will be asking and we hope that you dear reader will come back again and again for updates… continuously.

Image: Wikipedia

 


October 22, 2019  1:28 PM

Can advanced fire modeling put out wildfires?

Adrian Bridgwater

Splunk has ‘closed funding’ in Zonehaven, a cloud-based analytics application designed to help communities improve evacuations and reduce wildfire risk with data.

The funding is the first investment from Splunk’s newly launched Splunk Ventures social impact fund, which champions data-driven programmes that have social impact.

In 2018, more than 57,000 wildfires burned 8.8 million acres of land in the United States – although Splunk acknowledges that the problem is a global risk.

Despite this reality, fire departments still rely heavily on word-of-mouth, emergency calls and static ‘paper playbooks’ to detect wildfires and evacuate people at risk.

Zonehaven provides situational awareness and decision support by using what have been called ‘intelligent evacuation zones’ as well as advanced fire modeling, real-time weather data and always-on fire sensing capabilities.

“The increased spread of wildfires is a global emergency that impacts public health and the planet. While technology alone won’t eliminate fires, Zonehaven’s software can help communities prepare for evacuation, provide advance warning to those in harm’s way, preserve natural and economic resources and ultimately save lives,” said Charlie Crocker, CEO, Zonehaven.

Common data platform

Zonehaven’s technology presents a ‘common data platform’ for coordination and response to wildfires. The technology helps identify ignition points, projects simulated fire spread and develops fire-specific intelligent evacuation zones.

Splunk also used its annual .conf user event to detail Splunk Partner+ Program updates.

The company says it can count over 2,200 individual partner attendees at the show itself in the form of distributors, system integrators, service providers, original equipment manufacturers, technology alliance partners and value-added resellers.

Many of the partners build connectors, apps and add-ons to Splunk itself.

Splunk’s Big Data Beard team recently equipped an RV (recreational vehicle) with IoT sensors, built an edge-to-cloud computing environment and drove over 3,700 miles with stops in 13 cities on their Road Trip to .conf19.

Big Data Beard used Splunk throughout the journey to analyse their location, road quality, comfort levels and health data. Big Data Beard’s dashboards use Splunk Augmented Reality.

Image: Zonehaven


October 18, 2019  3:34 PM

What to expect from Tableau Conference 2019

Adrian Bridgwater

The Computer Weekly Developer Network team are big fans of business intelligence and big data analytics, interactive data dashboard technologies, data visualisation specialists and companies that brew their own craft ale for their technical conventions.

That’s lucky then, because we’re off to Tableau Conference 2019.

It’s a seminal time for Tableau: the firm has just closed its acquisition by CRM cloud giant Salesforce… a deal that went through for $15.7 billion.

Since the June announcement, rumours have flown that Tableau’s hometown of Seattle will become Salesforce’s ‘new HQ2’, and we hope to see ebullient Salesforce CEO Marc Benioff make some sort of appearance at Tableau Conference 2019, which is staged between 12-15 November in Las Vegas.

“Tableau is an extraordinary company, with an amazing product and team and an incredibly passionate community. Together we can transform the way people understand not only their customers, but their whole world — delivering powerful AI-driven insights across all types of data and use cases for people of every skill level,” said Benioff.

Although he obviously didn’t really mean ‘together’ (he arguably more likely meant ‘as part of my firm’) one can only hope that some of the original Tableau hardcore geek-goodness survives.

Tableau Conference 2019

At the conference itself, we can expect around 18,000 attendees over the four-day show. Indeed, the last time Computer Weekly attended we found the ‘party’ zones extending out into the hotel car parks to gain a little extra room.

“At Tableau Conference we unite behind the mission to help people see and understand data. [Attendees can] immerse themselves in a data-driven culture that is unparalleled in the industry,” notes the show preview statement.

Opening remarks will be presented by Tableau president & CEO, Adam Selipsky and guests. Selipsky will welcome live data-developer demos and there’s even an Iron Viz Championship (like Iron Chef or Iron Man/Woman, but for data viz gurus) where three contestants will compete live for all the glory. Data will be provided by Pitney Bowes.

So, who’s coming?

The sessions, activities and networking will be tuned to specific attendee interests and industries.

Attendees will include data and business analysts, business users (and comparatively non-technical team leaders), developers and programmers (and the perhaps more tangential world of data-developers) and other IT professionals in team leadership and management.

Vertical industry focus will span all sectors… but if we’re going to get specific, then we can say that Tableau looks at financial services, healthcare and public sector as key areas of customer interest.

Part of the event is the Data Village… and the company says this ‘conference hub’ has plenty of ‘surprise and delight’ features and lots to explore. “From streaming content at the Data Cubes, Aha! Theater, Story Points Theaters, and Data Pool to all the latest swag at the Tableau Store, this is your go-to-spot throughout the week,” notes the company.

We also expect to get a closer look at the latest product features including the recently announced 2019.3, which introduced new AI and machine learning-driven discovery capabilities.

“As the amount of data increases and the pace of decision-making accelerates, the need for data management has never been more critical to foster a thriving data culture,” said Francois Ajenstat, chief product officer at Tableau. “With Tableau 2019.3, we’re integrating data management directly into the analytics experience, making it easier for customers to curate and prepare all the data needed for analysis and improving visibility and increasing trust in the data for everyone within an organisation.”

Recent product updates likely to be discussed at the conference saw the company launch Tableau Catalog, a set of capabilities that provide a view of all the data used in Tableau to enable visibility and data discovery i.e. ensuring the right data is always used for analysis.

Tableau also recently announced the general availability of the Tableau Server Management Add-On, a new offering designed to help customers manage their enterprise-wide deployments of Tableau Server.

Explain Data

Also worth looking out for in terms of product sessions are periods devoted to Explain Data, a new capability built directly in Tableau that enables users to perform advanced statistical analysis with one click.

With no complex data modeling or data science expertise required, any user is promised the ability to uncover AI-driven insights about data.

Explain Data uses statistical algorithms to analyse all available data on behalf of the user and automatically explains the most relevant factors driving any given data point. In doing so, Explain Data brings analytics to more people and helps them discover insights that would be difficult or time-consuming to find.
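Tableau doesn’t document Explain Data’s internals in this announcement, but the general shape of such a feature can be sketched with a simple statistical score: rank which column makes a selected row most unusual relative to the rest of the data. The z-score approach and the sample rows below are assumptions for illustration only, not Tableau’s algorithm.

```python
# A toy illustration of 'explain this data point': rank which column makes
# the selected row most unusual via a z-score against the other rows.
# Generic statistical sketch only; not Tableau's Explain Data internals,
# and the sample sales rows are invented.
import statistics

rows = [
    {"units": 120, "discount": 0.05, "freight": 14.0},
    {"units": 130, "discount": 0.04, "freight": 15.5},
    {"units": 125, "discount": 0.06, "freight": 13.8},
]
point = {"units": 480, "discount": 0.05, "freight": 14.9}  # point to explain

def explain(point, data):
    scores = {}
    for col in point:
        values = [r[col] for r in data]
        mean, spread = statistics.mean(values), statistics.pstdev(values)
        scores[col] = abs(point[col] - mean) / spread if spread else 0.0
    # The highest-scoring column is the factor most likely 'driving' the point.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(explain(point, rows))   # 'units' dominates for this example
```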

Explain Data is available at no extra charge as part of Tableau’s latest release, Tableau 2019.3.

We can hope for an explanation of where the company is going to fit into the Salesforce vision and few other similarly related forward-looking thoughts. Whether you spell it visualization or visualisation, Tableau promises that there’s going to be a lot to feast the eyes on.


October 16, 2019  2:15 PM

Appian & UiPath connect neurons on robotic workforce tools

Adrian Bridgwater

Low-code software company Appian has announced the availability of the Appian Robotic Workforce Manager for UiPath, the Robotic Process Automation (RPA) specialist.

The software optimises coordination between robots and humans.

Using Appian Robotic Workforce Manager, organisations can manage and scale their robotic workforce and maximise ROI, while providing visibility and access to the business.

Features of the software include ‘total visibility’ into the digital workforce on any device (web or mobile) via pre-built business reports and dashboards to track and manage robots across applications.

Automation Center of Excellence (CoE) leaders can provide this information to the business owners of respective RPA processes through web and mobile interfaces.

“Managing a digital workforce across systems, humans and robots effectively is important,“ said Dave Marcus, VP strategic product technology alliances at UiPath. “Through our partnership with Appian, our joint customers are able to [use] the Appian Robotic Workforce Manager to orchestrate a view of their workforce and maximize their automation efforts.”

The technology also works to provide automation of work ‘hand-offs’ between robots and people. It provides a case management and exception management framework that automates ‘human-in-the-loop’ tasks and eliminates manual hand-offs.

Appian director of product Prittam Bagani also says that the software comes with an automation manager that allows business process owners to request new automations and manage the life cycle of adding them.

Bagani states that this gives the Automation CoE full visibility, compliance and control over all enterprise-wide RPA deployments. For companies that are building many RPA processes, it is obviously quite important to track and prioritise new RPA processes and manage existing ones on an ongoing basis.

Appian also released a no-code integration plug-in for UiPath, while UiPath released an integration for designers of the UiPath environment called ‘Appian Activity’, which UiPath users can drag and drop in UiPath for connectivity to Appian.

 


October 16, 2019  1:30 PM

SELIS: A recipe for a shared logistics information space

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Dr. Takis Katsoulakos in his role as managing director for Inlecom Systems.

Inlecom Systems is an information learning and communication solutions company with offices and consultants in Brussels, the UK, Athens, Ireland, Spain and Italy.

Katsoulakos writes

The logistics sector is plagued by low efficiency.

For example, one fifth of road freight journeys in the EU are performed by empty vehicles. Software application developers, data architects and every other engineer involved in the construction of the IT stack must now realise that smarter, more effective (and subsequently more environmentally friendly) supply chains can only be achieved by more synergistic and collaborative use of resources to boost transport optimisation.

SELIS (Shared European Logistics Intelligent Information Space), a €17 million flagship European Commission-funded research project, was set up to address the challenges associated with fragmented logistics chains, more specifically:

(1) to improve information sharing between logistics stakeholders.

(2) to provide a platform that is easy to use by companies of all sizes, enabling a plug-and-play approach to sharing and analysing supply chain data.

(3) to facilitate real-time availability of information.

These challenges were approached by building a scalable and replicable platform for logistics applications, allowing a standardised exchange of data between any number of users in a supply chain community. The use of blockchain technology has created a trusted data layer, integrating multiple sources of data to provide greater transparency and visibility of supply chain transactions.

SELIS Supply Chain Community Nodes 

SELIS’s approach is based on Supply Chain Community Nodes (SCNs) that have created a shared intelligence ‘data space’, configured to address the needs of a logistics community.

SCNs aggregate information flows in various industry standard formats that are generated by operational systems of logistics participants, IoT devices and controllers.

Specifically, SCNs help transport & logistics communities to connect, allowing data to be collected from heterogeneous sources and thus creating a single data-sharing intelligence space in the cloud, which physically consists of distributed connected data sources from supply chain actors.

Connectivity tools include intelligent adaptors, including translators to a common SCN data model; a publish/subscribe system for many-to-many communication of events, with the ability to restrict who can subscribe to which data artefacts; and authentication and monitoring services. A federated single sign-on authorisation system is provided for services and data sources, which means that participants can deploy services via secure APIs.
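As a rough illustration of that publish/subscribe restriction, here is a minimal sketch of a community event bus that enforces per-topic subscriber authorisation. The class name, topic names and access-control model are invented for the example; this is not the SELIS API.

```python
# Minimal sketch of a publish/subscribe hub with per-topic subscriber
# authorisation, in the spirit of the SCN connectivity layer described
# above. Names and the ACL model are illustrative assumptions.
from collections import defaultdict

class CommunityNodeBus:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> callbacks
        self.acl = defaultdict(set)            # topic -> allowed participants

    def grant(self, topic, participant):
        self.acl[topic].add(participant)

    def subscribe(self, topic, participant, callback):
        # Restrict who can subscribe to which data artefacts.
        if participant not in self.acl[topic]:
            raise PermissionError(f"{participant} may not subscribe to {topic}")
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        for callback in self.subscribers[topic]:
            callback(event)

bus = CommunityNodeBus()
bus.grant("container.eta", "carrier-A")
bus.subscribe("container.eta", "carrier-A", lambda e: print("carrier-A got", e))
bus.publish("container.eta", {"container": "MSKU123", "eta": "2019-10-18T14:00"})
```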

SCNs also share aggregated data to provide a shared situational picture linked to a knowledge graph and event log. Where appropriate, shared data from the event log is transferred to a blockchain ledger, thus increasing trust levels related to the use of data with the help of full traceability, auditability and immutability.

Also, SCNs optimise operations by analysing the available aggregated data using the SELIS Big Data Analytics module, which offers generic analytics algorithms in the form of “Recipes” (essentially pre-packaged industry knowledge) that can be easily configured to execute typical optimisation operations.

These include matching transport demand with available resources, accurately estimating cargo’s arrival time, and route optimisation. Predictive and optimisation analytics can also be used to cover smart contracts associated with route and mode decisions in synchromodal transport.

Moreover, SCNs utilise the supply chain context to incorporate methods, services and tools to facilitate better understanding and analyses of data.

To this end they support:

  • Efficient aggregation, ingestion, cleansing and data operations (e.g. joins, filtering, schema transformations, inferenced links creation in graphs) of data and events.
  • Efficient aggregation of information in various formats, arriving from different sources, i.e. operational back end systems, databases, services and APIs, IoT controllers…

The SELIS Knowledge Graph

An important innovation is the use of Knowledge Graphs to capture the SCN semantics based on relationships of entities relevant to a Collaboration Logistics Model, such as organisations, logistics objects/assets, resources and locations.

A Knowledge Graph integrates spatial, business-social and temporal data. It also provides an innovative mechanism to deal with the complexities of interconnected concepts in the range of millions of parallel events, and to accommodate information sources from open data and social media, where NoSQL approaches have previously been established as the database solution of choice.
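A toy sketch of such a graph, using the networkx library, might look as follows. The entities and relations are invented examples of the organisation/asset/location modelling described above, not the SELIS Knowledge Graph schema.

```python
# Illustrative only: a tiny logistics knowledge graph holding the kinds of
# entities and relationships described above (organisations, assets,
# locations) with temporal attributes on edges. Entity names are invented.
import networkx as nx

kg = nx.MultiDiGraph()
kg.add_node("Carrier-X", kind="organisation")
kg.add_node("Truck-42", kind="asset")
kg.add_node("Rotterdam-Terminal", kind="location")

kg.add_edge("Carrier-X", "Truck-42", relation="operates")
kg.add_edge("Truck-42", "Rotterdam-Terminal", relation="arrived_at",
            timestamp="2019-10-16T08:30Z")

# Traverse: which locations has Carrier-X's fleet touched, and when?
for _, asset, d in kg.out_edges("Carrier-X", data=True):
    if d["relation"] == "operates":
        for _, loc, e in kg.out_edges(asset, data=True):
            if e["relation"] == "arrived_at":
                print(asset, "->", loc, "at", e.get("timestamp"))
```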

In October 2019, SELIS published its open source components, including complex Big Data analytics, which are now for the first time freely accessible to the transport & logistics community for further development. This is an important step towards accelerating and broadening innovation, particularly in the SME segment.

The SELIS open source components can be accessed here — the project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Grant Agreement No 690588.

 


October 16, 2019  12:37 AM

UiPath makes key acquisitions for RPA discovery & process mining

Adrian Bridgwater

UiPath has been shopping, perhaps unsurprisingly.

The software automation company specialises in the platform-level development of Robotic Process Automation (RPA) technologies. Having been on a rapid growth trajectory in recent months, UiPath probably couldn’t have developed all the new branches of technology that it needed to finesse its platform internally, certainly not immediately.

The first of two key acquisitions the firm has announced is StepShot, a provider of Process Documentation software.

Going forward, UiPath will use StepShot to record, document and share processes as well as automate key steps in robot creation in RPA deployments for a spectrum of both well-defined and undefined processes.

“We work closely with customers to build automation strategies – [but] there is still untapped value in making the documentation process into an easier, more effective solution,” said Daniel Dines, UiPath co-founder and CEO.

Dines has claimed that with StepShot functionality now baked into the UiPath platform, that documentation challenge is well tackled. The company will build on this acquisition, and on its parallel acquisition of ProcessGold (see below), to introduce the UiPath Explorer product family.

Front-line & back-line ops

Designed to simplify process understanding, the UiPath Explorer family works to identify, document, analyse and prioritise processes, with an ability to understand both front-line and back-line operations through scientific and visual analysis.

Looking more closely at ProcessGold, the company has made its name as a process mining vendor based in the Netherlands.

With this acquisition, UiPath claims to become the first vendor of scale to offer a solution that brings together both process mining and RPA capabilities. CEO Dines says that it’s all about understanding business processes, identifying opportunities for automation – and then measuring the impact of those automations.

“Together with the StepShot acquisition and our organic work in Process Understanding, we can now help our customers have a complete view of their processes, including everything from front-end granular user actions to back-end, higher level transactional steps,” said PD Singh, vice president of AI at UiPath.

ProcessGold offers a development and operations platform for professional process mining that is used to build process intelligence applications for business process improvement.

UiPath says that it will now work closely with its partners to build out process mining at scale with the goal of offering customers the ability to close the loop between Process Understanding and RPA.


October 15, 2019  11:42 PM

What to expect from Splunk .conf 2019

Adrian Bridgwater

The Computer Weekly Developer Network team is off to Splunk .conf for the fourth consecutive year.

Splunk describes itself as a Data-to-Everything platform company, helping organisations to get a complete view of their data in real time and turn it into business outcomes.

The company is named after spelunkers, an American term for caving or ‘pot-holing’ aficionados. Spelunkers wade through meandering underground caverns; Splunkers navigate their way through cavernous amounts of noisy machine data.

Since Splunk .conf 2018 we have seen new versions of the core Splunk Enterprise product (which now sits at version 7.3) and Splunk Cloud products – technologies designed to allow users (developers… and, increasingly, less technical users) to search and ‘ask questions’ of machine data, log files and other core system information.
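For developers, the classic way to ‘ask questions’ of that data programmatically is via the Splunk SDK for Python; a minimal sketch along the commonly documented oneshot-search pattern follows. The connection details are placeholders and the query is just an example, not anything specific to .conf announcements.

```python
# Minimal sketch: run a one-off search against a Splunk instance using the
# Splunk SDK for Python (splunk-sdk). Host and credentials are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="localhost", port=8089,
    username="admin", password="changeme")   # placeholder credentials

# Ask a question of the machine data: count recent internal events by source.
stream = service.jobs.oneshot(
    "search index=_internal earliest=-15m | stats count by source")

for result in results.ResultsReader(stream):
    print(result)
```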

As Splunk has already explained, machine data has the potential to deliver important insights if it can be associated and correlated to the devices that are generating it – and, interrelated to the staff members directly associated with it – and, to the business systems that the data itself runs inside.

Splunk .conf by numbers

In terms of numbers, Splunk is promising somewhere around 10,000 attendees, 72 partner pavilion members and 350 technical sessions delivering over 100,000 hours of data-related education.

In terms of tracks, .conf19 features the following:

  • Developer: build apps for Splunk Enterprise and Splunk Developer Cloud and see Splunk’s native functionality live.
  • Internet of Things: explore use cases to see what Splunk can do for transportation needs, predictive maintenance etc.
  • Business Analytics: insights on Splunk’s real-time analytics platform, which helps blend all types of data.
  • Splunk Foundations/Platform: understand real-world best practices along with steps for using, deploying and administering Splunk.

Data-to-Everything

Splunk seems to change its core tagline from time to time. It used to be the company that delivered ‘aha moments’… and this year it’s the company with the Data-to-Everything platform.

Recent news saw Splunk complete the acquisition of SignalFx, a SaaS player in real-time monitoring and metrics for cloud infrastructure, microservices and applications.

Splunk already has a significant hand in ITOM and AIOps, so with the addition of SignalFx, the company will aim to position itself as a new voice in ‘observability’ and Application Performance Monitoring (APM).

“We live in a cloud-first world, where developers must have the ability to monitor and observe cloud-native infrastructure and applications in real-time, whether via logs, metrics or tracing. The power of Splunk and SignalFx allows our customers to monitor any relevant data at massive scale,” said Doug Merritt, President and CEO, Splunk. “I’m thrilled to welcome SignalFx to the Splunk team, and can’t wait to show our customers how our data platform can help them turn data into doing.”

We will report live from the show itself and follow up with deeper data-developer related analysis.

The event agenda is found here and key hashtags include #splunkconf19 and #DataToEverything for this year.


October 15, 2019  4:54 PM

UiPath Forward III: keynote notes, quotes & anecdotes

Adrian Bridgwater

Software automation specialist organisation UiPath staged its annual FORWARD user conference this week in Las Vegas and the Computer Weekly Developer Network team was there to soak up the tech know-how.

The company’s core technology proposition revolves around its development of Robotic Process Automation (RPA) technologies — IT advancements designed to shoulder human workloads by automating workflow tasks inside live business operations.

UiPath doesn’t stop to explain how its name came about… but we can safely assume that it is so named because it wants to offer a ‘path’ for users to be able to shift off increasing amounts of work that would have featured inside their User Interfaces (UI) to the RPA bots that it develops and manages… right?

Official notes detail the use of Ui = User Interface and Path = the path of a workflow on the screen… it’s all about more succinctly capturing the opportunity from ‘computer vision’ technology.

Rules, to experience

UiPath’s chief marketing officer Bobby Patrick used his time on stage to acknowledge that we’re on a journey from rules-based technologies to experience-based technologies that adapt to the reality of company workflows in real time… and ‘grafting’ RPA onto a moving target like this is a pretty complex and big ask.

“The automation era is in full swing — it’s a mindset shift to be able to examine your workflows and now work to bring [software] robots into the workplace. We now sit at a point where not just developers, but also citizen developers and subject matter experts are going to be able to not only use, but also develop automation… and this is at the point where automation itself becomes an application in its own right,” said Patrick.

Chief product officer Param Kahlon followed Patrick onto the stage to dig into more of the UiPath product development story.

Kahlon bemoaned the fact that cloud has not (generally) made things in technology any easier… and he thinks that by automating business processes at the UI level, we can automate [via RPA] in many high-value ways.

Several customers took part in this keynote… and the general sentiment from all of those who spoke was their focus not just on data, but on the ‘shape of transactions’ inside their businesses.

Looking at what RPA needs to do next, Kahlon explained that UiPath customers are asking for a democratisation of RPA so that users at every tier in the company can start to build the bots they need… we also need to move towards directing RPA at more complex processes that drive increasingly interconnected elements of work.

The company wants to champion the development of what it calls a ‘robot for every person’, which is a way of saying that every employee in the workplace has a degree of their work managed by automation.

The message from UiPath is that we’re now working hard to put real context awareness into RPA so that it goes a good way beyond basic computer vision — the more straightforward technologies that deal with how computers gain high-level understanding from digital images or videos.

People are still needed to validate exceptions in the development of RPAs… which is of course why we talk about the need to incorporate Human In The Loop (HITL) into enterprise use of software robotics.

UiPath accommodates for this reality in its total suite of products. When a process execution requires a human input, the software directs the human user to look more closely at the task to see whether some sort of software anomaly or problem exists.

In terms of the next stage, UiPath is now previewing the ability to allow human-robot interaction in this way even on a mobile interface. As RPA is directed at more complex business problems, we can expect that the operationalisation of RPA into business workflows will need this careful level of touch and feel as we move to more fully automated business on a day-to-day basis.

Platform picks

The UiPath Platform features UiPath Studio (a software tool for designing RPA bots that will take on manual processes), UiPath Orchestrator (a management tool that takes on the role of a ‘control tower’ to look after RPA bots and also lay down security layers) and UiPath Robots (for both attended and unattended bots).

UiPath used the Las Vegas event itself to detail new product launches including UiPath Explorer, a new offering designed to accelerate ‘process’ understanding and automation pipelines. Other new additions to the platform include UiPath StudioX for citizen developers.

As part of this event’s keynote session the company detailed the mechanics of how UiPath StudioX works. The tool can be directed to focus on information stored in an Excel file and then monitor how a user might bring additional information into the spreadsheet to perform updates (such as creating new fields and so on).
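To ground what monitoring and updating an Excel file means in code terms, here is a sketch of the kind of spreadsheet step such a bot automates, written with the openpyxl library rather than UiPath tooling. The workbook name and column layout are assumptions for the example.

```python
# A sketch of the kind of spreadsheet step a StudioX-style bot automates:
# open a workbook, add a derived field, fill it for each row, save.
# Uses openpyxl directly for illustration; this is not UiPath code, and
# 'orders.xlsx' with 'units'/'unit_price' columns is an assumed layout.
from openpyxl import load_workbook

wb = load_workbook("orders.xlsx")
ws = wb.active

ws.cell(row=1, column=3, value="total")        # create the new field header
for row in range(2, ws.max_row + 1):
    units = ws.cell(row=row, column=1).value or 0
    price = ws.cell(row=row, column=2).value or 0
    ws.cell(row=row, column=3, value=units * price)

wb.save("orders.xlsx")
```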

As we now work more closely with bot technologies, we are starting to see an increasing number of companies formally define the role of RPA Engineer, Head of RPA Development or RPA Lead and so on.

Robots just got one bit closer to real reality.

UiPath CMO Bobby Patrick

UiPath live product demos formed part of the keynote.

 

