CW Developer Network


October 18, 2019  3:34 PM

What to expect from Tableau Conference 2019

Adrian Bridgwater

The Computer Weekly Developer Network team are big fans of business intelligence and big data analytics, interactive data dashboard technologies, data visualisation specialists and companies that brew their own craft ale for their technical conventions.

That’s lucky then, because we’re off to Tableau Conference 2019.

It’s a seminal time for Tableau: the firm has just been acquired by CRM cloud giant Salesforce… a deal that went through for $15.7 billion.

Since the June announcement, rumours have flown that Tableau’s hometown of Seattle will become Salesforce’s ‘new HQ2’, and we hope to see ebullient Salesforce CEO Marc Benioff make some sort of appearance at Tableau Conference 2019, which is staged between 12-15 November in Las Vegas.

“Tableau is an extraordinary company, with an amazing product and team and an incredibly passionate community. Together we can transform the way people understand not only their customers, but their whole world — delivering powerful AI-driven insights across all types of data and use cases for people of every skill level,” said Benioff.

Although he obviously didn’t really mean ‘together’ (he arguably more likely meant ‘as part of my firm’) one can only hope that some of the original Tableau hardcore geek-goodness survives.

Tableau Conference 2019

At the conference itself, we can expect around 18,000 attendees over the four-day show. Indeed, the last time Computer Weekly attended we found the ‘party’ zones extending out into the hotel car parks to gain a little extra room.

“At Tableau Conference we unite behind the mission to help people see and understand data. [Attendees can] immerse themselves in a data-driven culture that is unparalleled in the industry,” notes the show preview statement.

Opening remarks will be presented by Tableau president & CEO, Adam Selipsky and guests. Selipsky will welcome live data-developer demos and there’s even an Iron Viz Championship (like Iron Chef or Iron Man/Woman, but for data viz gurus) where three contestants will compete live for all the glory. Data will be provided by Pitney Bowes.

So who’s all coming?

The sessions, activities and networking will be tuned specifically to different attendee interests and industries.

Attendees will include data and business analysts, business users (and comparatively non-technical team leaders), developers and programmers (and the perhaps more tangential world of data-developers) and other IT professionals in team leadership and management.

Vertical industry focus will span all sectors… but if we’re going to get specific, then we can say that Tableau looks at financial services, healthcare and public sector as key areas of customer interest.

Part of the event is the Data Village… and the company says this ‘conference hub’ has plenty of ‘surprise and delight’ features and lots to explore. “From streaming content at the Data Cubes, Aha! Theater, Story Points Theaters, and Data Pool to all the latest swag at the Tableau Store, this is your go-to-spot throughout the week,” notes the company.

We also expect to get a closer look at the latest product features including the recently announced 2019.3, which introduced new AI and machine learning-driven discovery capabilities.

“As the amount of data increases and the pace of decision-making accelerates, the need for data management has never been more critical to foster a thriving data culture,” said Francois Ajenstat, chief product officer at Tableau. “With Tableau 2019.3, we’re integrating data management directly into the analytics experience, making it easier for customers to curate and prepare all the data needed for analysis and improving visibility and increasing trust in the data for everyone within an organisation.”

Recent product updates likely to be discussed at the conference saw the company launch Tableau Catalog, a set of capabilities that provide a view of all the data used in Tableau to enable visibility and data discovery i.e. ensuring the right data is always used for analysis.

Tableau also recently announced the general availability of the Tableau Server Management Add-On, a new offering designed to help customers manage their enterprise-wide deployments of Tableau Server.

Explain Data

Also worth looking out for in terms of product sessions are periods devoted to Explain Data, a new capability built directly in Tableau that enables users to perform advanced statistical analysis with one click.

With no complex data modeling or data science expertise required, any user is promised the ability to uncover AI-driven insights about data.

Explain Data uses statistical algorithms to analyse all available data on behalf of the user and automatically explains the most relevant factors driving any given data point. In doing so, Explain Data brings analytics to more people and helps them discover insights that would be difficult or time-consuming to find.
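For a rough feel of the kind of statistical legwork Explain Data automates, here is a minimal Python sketch that ‘explains’ an outlying value by checking which dimension best separates it from the rest of the data. The column names are invented and this is emphatically not Tableau’s actual algorithm, just the general idea:

    import pandas as pd

    # Hypothetical sales data; column names are illustrative only.
    df = pd.DataFrame({
        "region":  ["North", "North", "South", "South", "South"],
        "channel": ["web", "store", "web", "store", "web"],
        "sales":   [120.0, 115.0, 118.0, 122.0, 310.0],
    })

    outlier = df["sales"].idxmax()   # the data point we want 'explained'

    # For each categorical dimension, compare the outlier's group mean
    # against the overall mean -- the biggest gap is the likeliest 'explanation'.
    scores = {}
    for dim in ["region", "channel"]:
        group_mean = df[df[dim] == df.loc[outlier, dim]]["sales"].mean()
        scores[dim] = abs(group_mean - df["sales"].mean())

    print(max(scores, key=scores.get))   # dimension that best separates the outlier

Tableau’s implementation runs far richer statistical models behind that single click, but the broad principle of scoring candidate explanatory factors is the same.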

Explain Data is available at no extra charge as part of Tableau’s latest release, Tableau 2019.3.

We can hope for an explanation of where the company is going to fit into the Salesforce vision, plus a few other similarly forward-looking thoughts. Whether you spell it visualization or visualisation, Tableau promises that there’s going to be a lot to feast the eyes on.

October 16, 2019  2:15 PM

Appian & UiPath connect neurons on robotic workforce tools

Adrian Bridgwater

Low-code software company Appian has announced the availability of the Appian Robotic Workforce Manager for UiPath, the Robotic Process Automation (RPA) specialist.

The software optimises coordination between robots and humans.

Users can use Appian Robotic Workforce Manager to manage and scale their robotic workforce and maximise ROI, while providing visibility and access to the business.

Features of the software include ‘total visibility’ into the digital workforce on any device (web or mobile) via pre-built business reports and dashboards to track and manage robots across applications.

Automation Center of Excellence (COE) leaders can provide this information to the business owners of respective RPA processes through web and mobile interfaces.

“Managing a digital workforce across systems, humans and robots effectively is important,” said Dave Marcus, VP strategic product technology alliances at UiPath. “Through our partnership with Appian, our joint customers are able to [use] the Appian Robotic Workforce Manager to orchestrate a view of their workforce and maximize their automation efforts.”

The technology also works to provide automation of work ‘hand-offs’ between robots and people. It provides a case management and exception management framework that automates ‘human-in-the-loop’ tasks and eliminates manual hand-offs.
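Conceptually, the ‘human-in-the-loop’ hand-off pattern looks something like the sketch below, written as a generic Python illustration rather than anything Appian-specific: the robot processes what it can and parks exceptions in a queue for a person, with the reason attached.

    from queue import Queue

    human_review = Queue()   # tasks a robot could not complete unattended

    def robot_step(record):
        """Attempt automated processing; escalate exceptions to a person."""
        try:
            if record.get("amount") is None:
                raise ValueError("missing amount")
            print(f"robot processed invoice {record['id']}")
        except ValueError as exc:
            # Hand-off: park the case for a human, with the reason attached.
            human_review.put({**record, "reason": str(exc)})

    for rec in [{"id": 1, "amount": 99.0}, {"id": 2, "amount": None}]:
        robot_step(rec)

    while not human_review.empty():
        case = human_review.get()
        print(f"human review needed for invoice {case['id']}: {case['reason']}")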

Director of product at Appian Prittam Bagani also says that the software comes with an automation manager to allow business process owners to request new automation and manage the life cycle of adding new automations.

Bagani states that this gives the Automation COE full visibility, compliance and control over all enterprise-wide RPA deployments. For companies that are building many RPA processes, it is obviously quite important to track and prioritise new RPA processes and to manage existing ones on an ongoing basis.

Appian also released a no-code integration plug-in for UiPath, while UiPath released an integration for designers of the UiPath environment called ‘Appian Activity’, which UiPath users can drag and drop into their workflows for connectivity to Appian.

 


October 16, 2019  1:30 PM

SELIS: A recipe for a shared logistics information space

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Dr. Takis Katsoulakos in his role as managing director for Inlecom Systems.

Inlecom Systems is an information learning and communication solutions company with offices and consultants in Brussels, the UK, Athens, Ireland, Spain and Italy.

Katsoulakos writes…

The logistics sector is plagued by low efficiency.

For example, one fifth of road freight journeys in the EU are performed by empty vehicles. Software application developers, data architects and every other engineer involved in the construction of the IT stack must now realise that smarter, more effective (and subsequently more environmentally friendly) supply chains can only be achieved by more synergistic and collaborative use of resources to boost transport optimisation.

SELIS (Shared European Logistics Intelligent Information Space), a €17 million flagship European Commission-funded research project, was set up to address the challenges associated with fragmented logistics chains, more specifically:

(1) to improve information sharing between logistics stakeholders.

(2) to provide a platform that is easy to use by companies of all sizes, enabling a plug-and-play approach to sharing and analysing supply chain data.

(3) to facilitate real-time availability of information.

These challenges were approached by building a scalable and replicable platform for logistics applications, allowing a standardised exchange of data between any number of users in a supply chain community. The use of blockchain technology has created a trusted data layer, integrating multiple sources of data to provide greater transparency and visibility of supply chain transactions.

SELIS Supply Chain Community Nodes 

SELIS’s approach is based on Supply Chain Community Nodes (SCNs) that have created a shared intelligence ‘data space’, configured to address the needs of a logistics community.

SCNs aggregate information flows in various industry standard formats that are generated by operational systems of logistics participants, IoT devices and controllers.

Specifically, SCNs support transport & logistics communities in connecting, allowing data to be collected from heterogeneous sources and thus creating a single data-sharing intelligence space in the Cloud, which physically consists of distributed connected data sources from supply chain actors.

Connectivity tools include intelligent adaptors, including translators to a common SCN data model; a publish/subscribe system for many-to-many communication of events, with the ability to restrict who can subscribe to which data artefacts; and authentication and monitoring services. A single sign-on, federation-enabled authorisation system is provided for services and data sources, which means that participants can deploy services via secure APIs.
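As a generic illustration of that publish/subscribe idea (not the SELIS API itself, and with invented topic and participant names), a community node that restricts who may subscribe to which events might look like this in Python:

    from collections import defaultdict

    class CommunityNode:
        """Toy many-to-many event broker with per-topic access control."""
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> callbacks
            self.permissions = defaultdict(set)    # topic -> allowed participants

        def allow(self, topic, participant):
            self.permissions[topic].add(participant)

        def subscribe(self, topic, participant, callback):
            if participant not in self.permissions[topic]:
                raise PermissionError(f"{participant} may not subscribe to {topic}")
            self.subscribers[topic].append(callback)

        def publish(self, topic, event):
            for callback in self.subscribers[topic]:
                callback(event)

    node = CommunityNode()
    node.allow("container.eta", "carrier_A")
    node.subscribe("container.eta", "carrier_A", lambda e: print("carrier_A got", e))
    node.publish("container.eta", {"container": "MSKU123", "eta": "2019-11-02T14:00Z"})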

SCNs also share aggregated data to provide a shared situational picture linked to a knowledge graph and event log. Where appropriate, shared data from the event log is transferred to a blockchain ledger, thus increasing trust levels related to the use of data with the help of full traceability, auditability and immutability.
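The trust property comes from the fact that each ledger entry is cryptographically linked to the one before it. A toy Python hash chain shows the principle, though the SELIS platform uses a full blockchain ledger rather than this simplified sketch:

    import hashlib, json

    def chain(events):
        """Link each event to the hash of the previous one, making edits detectable."""
        ledger, prev_hash = [], "0" * 64
        for event in events:
            payload = json.dumps(event, sort_keys=True) + prev_hash
            prev_hash = hashlib.sha256(payload.encode()).hexdigest()
            ledger.append({"event": event, "hash": prev_hash})
        return ledger

    ledger = chain([
        {"shipment": "S-100", "status": "loaded"},
        {"shipment": "S-100", "status": "departed"},
    ])
    print(ledger[-1]["hash"])   # changing an earlier event changes every later hash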

Also, SCNs optimise operations by analysing the available aggregated data using the SELIS Big Data Analytics module, which offers generic analytics algorithms in the form of “Recipes” (essentially pre-packaged industry knowledge) that can be easily configured to execute typical optimisation operations.

These include matching transport demand with available resources, accurately estimating cargo’s arrival time, and route optimisation. Predictive and optimisation analytics can also be used to cover smart contracts associated with route and mode decisions in synchromodal transport.
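The simplest imaginable ‘recipe’ of this kind, estimating a cargo arrival time from historical leg durations, might be sketched as follows; the real SELIS recipes are considerably more sophisticated and the figures below are invented:

    from datetime import datetime, timedelta
    from statistics import mean

    # Hypothetical historical durations (hours) for a Rotterdam -> Munich leg.
    past_durations = [14.2, 15.1, 13.8, 16.0, 14.5]

    def estimate_eta(departure, history):
        """Naive ETA recipe: departure time plus the mean historical duration."""
        return departure + timedelta(hours=mean(history))

    print(estimate_eta(datetime(2019, 11, 4, 8, 0), past_durations))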

Moreover, SCNs utilise the supply chain context to incorporate methods, services and tools to facilitate better understanding and analyses of data.

To this end they support:

  • Efficient aggregation, ingestion, cleansing and data operations (e.g. joins, filtering, schema transformations, inferenced links creation in graphs) of data and events.
  • Efficient aggregation of information in various formats, arriving from different sources, i.e. operational back end systems, databases, services and APIs, IoT controllers…

The SELIS Knowledge Graph

An important innovation is the use of Knowledge Graphs to capture the SCN semantics based on relationships of entities relevant to a Collaboration Logistics Model, such as organisations, logistics objects/assets, resources and locations.

A Knowledge Graph integrates spatial, business-social and temporal data. It also provides an innovative mechanism to deal with the complexities of interconnected concepts in the range of millions of parallel events, and to accommodate information sources from open data and social media, where NoSQL approaches have previously been established as the database solution of choice.
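As a purely illustrative sketch (using the networkx library and invented entity names, not the SELIS schema), a knowledge graph of this kind is just a set of typed relationships between logistics entities that can then be traversed and queried:

    import networkx as nx

    g = nx.MultiDiGraph()

    # Entities and relationships loosely following the kind of collaboration
    # logistics model described above (names are illustrative only).
    g.add_edge("ACME Logistics", "Truck_42", key="operates")
    g.add_edge("Truck_42", "Shipment_S-100", key="carries")
    g.add_edge("Shipment_S-100", "Port of Rotterdam", key="located_at")

    # Traverse the graph: which entities is ACME Logistics connected to?
    for node in nx.descendants(g, "ACME Logistics"):
        print(node)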

In October 2019, SELIS published its open source components, including complex big data analytics, which are now for the first time freely accessible to the transport & logistics community for further development. This is an important step towards accelerating and broadening innovation, particularly in the SME segment.

The SELIS open source components can be accessed here — the project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Grant Agreement No 690588.

 


October 16, 2019  12:37 AM

UiPath makes key acquisitions for RPA discovery & process mining

Adrian Bridgwater

UiPath has been shopping, perhaps unsurprisingly.

The software automation company specialises in the platform-level development of Robotic Process Automation (RPA) technologies. Having been on a rapid growth trajectory in recent months, UiPath probably couldn’t have developed all the new branches of technology that it needed to finesse its platform internally, certainly not immediately.

The first of two key acquisitions the firm has announced is StepShot, a provider of process documentation software.

Going forward, UiPath will use StepShot to record, document and share processes as well as automate key steps in robot creation in RPA deployments for a spectrum of both well-defined and undefined processes.

“We work closely with customers to build automation strategies – [but] there is still untapped value in making the documentation process into an easier, more effective solution,” said Daniel Dines, UiPath co-founder and CEO.

Dines has claimed that with StepShot functionality now baked into the UiPath platform, that documentation challenge is well tackled. The company will build with this acquisition and its parallel acquisition of ProcessGold (see below) to now introduce the UiPath Explorer product family.

Front-line & back-line ops

Designed to simplify process understanding, the UiPath Explorer family works to identify, document, analyse and prioritise processes, with an ability to understand both front-line and back-line operations through scientific and visual analysis.

Looking more closely at ProcessGold, the company has made its name as a process mining vendor based in the Netherlands.

With this acquisition, UiPath claims to become the first vendor of scale to offer a solution that brings together both process mining and RPA capabilities. CEO Dines says that it’s all about understanding business processes, identifying opportunities for automation – and then measuring the impact of those automations.

“Together with the StepShot acquisition and our organic work in Process Understanding, we can now help our customers have a complete view of their processes, including everything from front-end granular user actions to back-end, higher level transactional steps,” said PD Singh, vice president of AI at UiPath.

ProcessGold offers a development and operations platform for professional process mining that is used to build process intelligence applications for business process improvement.

UiPath says that it will now work closely with its partners to build out process mining at scale with the goal of offering customers the ability to close the loop between Process Understanding and RPA.


October 15, 2019  11:42 PM

What to expect from Splunk .conf 2019

Adrian Bridgwater

The Computer Weekly Developer Network team is off to Splunk .conf for the fourth consecutive year.

Splunk is a Data-to-Everything Platform, helping organisations to get a complete view of their data in real time and turn it into business outcomes.

The company is named after spelunkers, the American term for caving or ‘pot holing’ aficionados. Spelunkers wade through meandering underground caverns; Splunkers navigate their way through cavernous amounts of noisy machine data.

Since Splunk .conf 2018 we have seen new versions of the core Splunk Enterprise product (which now sits at version 7.3) and Splunk Cloud products – technologies designed to allow users (developers… and, increasingly, less technical users) to search and ‘ask questions’ of machine data, log files and other core system information.

As Splunk has already explained, machine data has the potential to deliver important insights if it can be associated and correlated to the devices that are generating it – and, interrelated to the staff members directly associated with it – and, to the business systems that the data itself runs inside.
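To make that concrete, here is a toy Python illustration of the sort of question Splunk lets users ask of raw machine data, in this case ‘which hosts are generating server errors?’, without writing parsers by hand (the log lines are invented):

    import re
    from collections import Counter

    log_lines = [
        '10.0.0.1 - - [15/Oct/2019:10:01:02] "GET /api/orders" 500',
        '10.0.0.2 - - [15/Oct/2019:10:01:03] "GET /api/orders" 200',
        '10.0.0.1 - - [15/Oct/2019:10:01:05] "POST /api/pay" 500',
    ]

    # Count 5xx responses by originating host -- the kind of question Splunk
    # answers over indexed machine data without hand-written parsing code.
    errors = Counter(
        line.split()[0]
        for line in log_lines
        if re.search(r'\s5\d\d$', line)
    )
    print(errors.most_common())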

Splunk .conf by numbers

In terms of numbers, Splunk is promising somewhere around 10,000 attendees, 72 partner pavilion members and 350 technical sessions delivering over 100,000 hours of data-related education.

In terms of tracks, .conf19 features the following:

  • Developer: to build apps for Splunk Enterprise and Splunk Developer Cloud and see Splunk’s native functionality live.
  • Internet of Things: to explore use cases to see what Splunk can do for transportation needs, predictive maintenance etc.
  • Business Analytics: for insights on Splunk’s real-time analytics platform, which helps blend all types of data.
  • Splunk Foundations/Platform: to understand real-world best practices along with steps for using, deploying and administering Splunk.

Data-to-Everything

Splunk seems to change its core tagline from time to time. It used to be the company that delivered ‘aha moments’… and this year it’s the company with the Data-to-Everything platform.

Recent news saw Splunk complete the acquisition of SignalFx, a SaaS player in real-time monitoring and metrics for cloud infrastructure, microservices and applications.

Splunk already has a significant hand in ITOM and AIOps, so with the addition of SignalFx, the company will aim to position itself as a new voice in ‘observability’ and Application Performance Monitoring (APM).

“We live in a cloud-first world, where developers must have the ability to monitor and observe cloud-native infrastructure and applications in real-time, whether via logs, metrics or tracing. The power of Splunk and SignalFx allows our customers to monitor any relevant data at massive scale,” said Doug Merritt, President and CEO, Splunk. “I’m thrilled to welcome SignalFx to the Splunk team, and can’t wait to show our customers how our data platform can help them turn data into doing.”

We will report live from the show itself and follow up with deeper data-developer related analysis.

The event agenda is found here and key hashtags include #splunkconf19 and #DataToEverything for this year.


October 15, 2019  4:54 PM

UiPath Forward III: keynote notes, quotes & anecdotes

Adrian Bridgwater

Software automation specialist organisation UiPath staged its annual FORWARD user conference this week in Las Vegas and the Computer Weekly Developer Network team was there to soak up the tech know-how.

The company’s core technology proposition revolves around its development of Robotic Process Automation (RPA) technologies — IT advancements designed to shoulder human workloads by automating workflow tasks inside live business operations.

UiPath doesn’t stop to explain how its name came about… but we can safely assume that it is so named because it wants to offer a ‘path’ for users to be able to shift-off increasing amounts of work that would have featured inside their User Interfaces (UI) to the RPA bots that it develops and manages… right?

Official notes detail the use of Ui = User Interface and Path = the path of a workflow on the screen… it’s all about more succinctly capturing the opportunity from ‘computer vision’ technology.

Rules, to experience

UiPath’s chief marketing officer Bobby Patrick used his time on stage to acknowledge that we’re on a journey from rules-based technologies to experience-based technologies that adapt to the reality of company workflows in real time… and ‘grafting’ RPA onto a moving target like this is a pretty complex and big ask.

“The automation era is in full swing — it’s a mindset shift to be able to examine your workflows and now work to bring [software] robots into the workplace. We now sit at a point where not just developers, but also citizen developers and subject matter experts are going to be able to not only use, but also develop automation… and this is at the point where automation itself becomes an application in its own right,” said Patrick.

Chief product officer Param Kahlon followed Patrick onto the stage to dig into more of the UiPath product development story.

Kahlon bemoaned the fact that cloud has not (generally) made things in technology any easier… and he thinks that by automating business processes at the UI level, we can automate [via RPA] in many high-value ways.

Several customers took part in this keynote… and the general sentiment from all of those who spoke was their focus not just on data, but on the ‘shape of transactions’ inside their businesses.

Looking at what RPA needs to do next, Kahlon explained that UiPath customers are asking for a democratisation of RPA so that users at every tier in the company can start to build the bots they need… we also need to move towards directing RPA at more complex processes that drive increasingly interconnected elements of work.

The company wants to champion the development of what it calls a ‘robot for every person’, which is a way of saying that every employee in the workplace has a degree of their work managed by automation.

The message from UiPath is that we’re now working hard to put real context awareness into RPA so that it goes a good way beyond basic computer vision — the more straightforward technologies that deal with how computers gain high-level understanding from digital images or videos.

People are still needed to validate exceptions in the development of RPAs… which is of course why we talk about the need to incorporate Human In The Loop (HITL) into enterprise use of software robotics.

UiPath accommodates this reality in its total suite of products. When a process execution requires human input, the software directs the human user to look more closely at the task to see whether some sort of anomaly or problem exists.

In terms of the next stage, UiPath is now previewing the ability to allow human-robot interaction in this way even on a mobile interface. As RPA is directed at more complex business problems, we can expect that the operationalisation of RPA into business workflows will need this careful level of touch and feel as we move to more fully automated business on a day-to-day basis.

Platform picks

The UiPath Platform features UiPath Studio (a software tool for designing RPA bots that will take on manual processes), UiPath Orchestrator (a management tool that takes on the role of a ‘control tower’ to look after RPA bots and also lay down security layers) and UiPath Robots (for both attended and unattended bots).

UiPath used the Las Vegas event itself to detail new product launches including UiPath Explorer, a new offering designed to accelerate ‘process’ understanding and automation pipelines. Other new additions to the platform include UiPath StudioX for citizen developers.

As part of this event’s keynote session, the company detailed the mechanics of how UiPath StudioX works. The tool can be directed to focus on information stored in an Excel file and then monitor how a user might bring additional information into the spreadsheet to perform updates (such as creating new fields and so on).
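As a loose illustration of that idea (this is a hand-rolled Python sketch with an invented file name, not StudioX itself), detecting that a spreadsheet has gained new fields might look like this:

    import pandas as pd

    # Hypothetical file and baseline; StudioX does this visually, this is just the idea.
    baseline_columns = {"order_id", "customer", "amount"}

    df = pd.read_excel("orders.xlsx")          # requires openpyxl for .xlsx files
    new_columns = set(df.columns) - baseline_columns

    if new_columns:
        print(f"spreadsheet gained new fields: {sorted(new_columns)}")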

As we now work more closely with bot technologies, we are starting to see an increasing number of companies formally define the role of RPA Engineer, Head of RPA Development or RPA Lead and so on.

Robots just got one bit closer to real reality.

UiPath CMO Bobby Patrick

UiPath live product demos formed part of the keynote.

 


October 14, 2019  10:40 PM

What to expect from Tanium CONVERGE 2019

Adrian Bridgwater

The Computer Weekly Developer Network team are big fans of robust data management, younger-breed tech mavericks who want to carve a new niche in the crowded information intelligence market, Southern barbeque with lots of hot sauce… and, of course, every good Country Music star from Johnny Cash to Taylor Swift.

By a stroke of luck, we’ve zoned in on Tanium CONVERGE 2019.

Now in its fourth year, CONVERGE has gained some critical mass and this year is due to host close to 1000 attendees for the first time.

Staged from November 18 – 21 in Nashville, Tennessee…  Tanium will host keynotes, breakout sessions and hands-on technical labs as integral elements of the total conference programme.

So what does Tanium do?

Tanium provides customers with insight and control into endpoints including laptops, servers, virtual machines, containers, or cloud infrastructure. The technology can help IT security and operations teams ask questions about the state of every endpoint across the enterprise, retrieve data on their current and historical state, and execute change as necessary.  Tanium also enables organisations to assess security risks and contain threats.

A different style

Tanium says that its Nashville event will drill deep into security and IT operations in a different kind of way.

“There are no sprawling trade show floors, aggressive sales messages, or sessions with overly broad content. Our conference is designed to be different. We want to empower technology practitioners to achieve true visibility and control of their IT environments, using modern architecture and tools built for this century,” notes Tanium, in an event preview statement.

Tanium says that Converge is a chance for technology practitioners to achieve true visibility and control of their IT environments, using modern architecture and tools built for this century.

News bites & product highlights

Recent news saw Tanium ranked 7th on the Forbes Cloud 100 list — the company says that it is distinguishing itself from peers based on valuation, revenue and growth.

Among its products we find Tanium Reveal, software designed to reduce risks of data exposure, mitigate the impact of breaches and prepare for regulatory compliance obligations.

The technology itself defines sensitive data patterns and works to continuously monitor IT environments for matching files… so that systems managers can then categorise, notify, alert, and take direct action.
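In spirit, that pattern-matching step is similar to the toy Python sketch below, though the patterns here are illustrative and are not Tanium’s actual rule definitions:

    import re

    # Illustrative sensitive-data patterns (not Tanium's rule syntax).
    patterns = {
        "email":       re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
        "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b"),
    }

    def scan(text):
        """Return which sensitive-data categories a file's contents match."""
        return [name for name, rx in patterns.items() if rx.search(text)]

    print(scan("Contact jane.doe@example.com, office SW1A 1AA"))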

The message from Tanium is that the pace of growth in both connected endpoints and the data that they collect and retain, has far exceeded what’s supportable by legacy technologies.

“Tanium Reveal was born from a simple idea: make searching for sensitive data across an enterprise as fast as any other Tanium question – delivering results at speed. We’re excited to help users to define sensitive data patterns, continuously monitor endpoints for matching artifacts, and then categorize, notify, alert, and take action across even the largest and most complex environments,” noted Chris Hodson, chief information security officer at Tanium.

Tanium Performance

Other key products to be showcased include Tanium Performance, a software solution for IT Operations teams to monitor, investigate and remediate performance issues at scale on desktops, laptops and servers.

Hodson says that Tanium Performance provides IT teams with access to rich historical data for a single endpoint, so that they can effectively troubleshoot problems and lower mean time to repair. Customers can then use this data to make better decisions about IT initiatives related to software and hardware changes.

The Computer Weekly team is out to find out just how far most organisations truly lack visibility into endpoint performance issues… and also examine just how far the manual troubleshooting process headache really extends as we dig through Tanium’s messages…

… oh, yes, and we’ll eat some Southern Barbeque too.

The company tweets at @Tanium and the event hashtag is #CONVERGE19.

 


October 14, 2019  4:41 PM

Dynatrace analytics drives business towards the ‘expected normal’

Adrian Bridgwater

Dynatrace has added a new raft of analytics to its software intelligence offering.

Known as Digital Business Analytics, the software itself is made of code (it’s digital), it’s intended for enterprise usage (that’s business) and it performs analytical functions on data already flowing through Dynatrace’s application and digital experience monitoring modules (yep, that’s the analytics part).

The company says it is using real-time, AI-powered analysis to bind together user experience, customer behaviour and application performance data with business metrics.

The coming together of key indicators is intended to shed more light on sales conversions, orders, customer churn, release validation, customer segmentation and other areas.

Digital Business Analytics joins Application Performance Management (APM), Cloud Infrastructure Monitoring (CIM), Digital Experience Management (DEM) and AIOps as part of the Dynatrace all-in-one Software Intelligence Platform.

‘Expected normal’

The company’s AI-engine is called Davis.

In operation, Davis continually learns what ‘expected normal’ business performance looks like and provides proactive answers to issues for optimisation of compute resources.
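Stripped right down, ‘learning expected normal’ amounts to building a baseline and flagging deviations from it. The Python sketch below shows the bare-bones version of that idea with an invented orders-per-minute metric; Davis itself does this automatically, continuously and across far more signals:

    from statistics import mean, stdev

    # Hypothetical orders-per-minute metric observed over recent intervals.
    history = [102, 98, 105, 99, 101, 97, 103, 100]
    current = 62

    baseline, spread = mean(history), stdev(history)
    if abs(current - baseline) > 3 * spread:
        print(f"anomaly: {current} orders/min vs expected ~{baseline:.0f}")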

“Digital transformation projects are spurring companies to create multidisciplinary line of business teams that run the business with a product mindset and are demanding answers to questions that were previously difficult, slow or impossible to obtain,” said Steve Tack, SVP of Product Management at Dynatrace. “Digital Business Analytics complements existing web analytics tools to deliver real-time and complete results, by combining existing customer facing channels with application and user experience data.”

Dynatrace says that as data volume and velocity accelerates, organisations are struggling to make sense of disparate dashboards from traditional IT monitoring tools, web analytics and ad hoc reporting.

The company insists that its Digital Business Analytics product automatically captures business data and analyses it in context with user experience and application performance data.

Key pillars of Digital Business Analytics include:

  • Transactions: Automatic tracing, segmentation and data extraction from business transactions.
  • Analytics: AI-powered analysis, exploration/querying and extraction of business-relevant insights from Dynatrace application and user experience data.
  • Conversions: Visualisation of and collaboration on business-relevant metrics such as conversions and revenue performance by product, customer segment, geo etc.
  • Automation: AI-powered anomaly detection, alerting and root cause determination for business processes, with programmable APIs to trigger business workflows and change events.

 

 


October 10, 2019  3:15 PM

Cloudinary image leader: developing the ‘next’ JPEG

Adrian Bridgwater

This is a guest post for the Computer Weekly Developer Network written by Dr Jon Sneyers, senior image researcher at Cloudinary.

Cloudinary is a SaaS technology company headquartered in Santa Clara, California, with an office in Israel. The company provides a cloud-based image and video management solution that enables users to upload, store, manage, manipulate and deliver images and video for websites and apps.

Sneyers writes…

Last summer I had the opportunity to participate in my third JPEG meeting of the year, the 84th JPEG meeting in Brussels, Belgium, conveniently close to where I live.

These are very exciting times for those of us involved in the world of standards and image formats — and especially for those of us directly involved in the development of what will soon be a major update – the first in about 20 years – to the widely adopted JPEG.

A little history for those less familiar with image formats: JPEG stands for the Joint Photographic Experts Group, which created the standard in 1992. The main basis for JPEG is the Discrete Cosine Transform (DCT), a “lossy” image compression technique that was first proposed by Nasir Ahmed in 1972.
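For the curious, the lossy DCT step can be sketched in a few lines of Python (using SciPy, on a stand-in 8x8 block rather than a real image): transform the block, throw away the small coefficients, and transform back to get a close but not identical approximation.

    import numpy as np
    from scipy.fft import dctn, idctn

    block = np.arange(64, dtype=float).reshape(8, 8)   # stand-in 8x8 pixel block

    coeffs = dctn(block, norm="ortho")                  # forward DCT
    coeffs[np.abs(coeffs) < 1.0] = 0                    # crude 'quantisation': drop small terms
    approx = idctn(coeffs, norm="ortho")                # reconstruct the block

    print(np.max(np.abs(block - approx)))               # small error = 'lossy' but close

Real JPEG adds chroma subsampling, perceptual quantisation tables and entropy coding on top of this, but the discard-small-coefficients step is where the ‘lossy’ savings come from.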

How did we get here?

You might be wondering how we got here and why a new image format is necessary.

Last spring, the JPEG Committee launched its Next-Generation Image Compression Call for Proposals, also referred to as JPEG XL.

The goal was to: “Develop a standard for image compression that offers substantially better compression efficiency than existing image formats (e.g. >60% over JPEG-1), along with features desirable for web distribution and efficient compression of high-quality images.”

My submission, which I dubbed FUIF or the Free Universal Image Format, was one of seven proposals selected to be presented at the 81st JPEG meeting, held in Vancouver, Canada.

That was the first JPEG meeting I participated in, presenting and defending my proposal.

Once all seven proposals were presented we spent the week discussing the pros and cons of each proposal, the subjective evaluation results, and how to get started on JPEG XL.

At the 82nd JPEG meeting in Lisbon, Portugal, it was decided to combine two proposals: my FUIF and Google’s PIK, and make that the starting point for JPEG XL.

Use cases & requirements

At that point, I was promoted to the role of co-chair of the JPEG XL Ad Hoc Group, together with Google’s Jan Wassenberg. The main discussions were on refining the use cases and requirements… and, on how to combine and unify the two proposals into a single image codec.

At the next meeting, held in Geneva, Switzerland, we started writing the formal standard – it was still a very rough ‘working draft’, but it was a good start.

Gradually we transitioned from a situation where the code was the ‘source of truth’ (and the spec draft was describing whatever the code did) to a situation where the spec became the source of truth and the code had to implement what the spec was saying.

The road to ISO

At the 84th JPEG meeting, we took JPEG XL to the next stage, which is ‘Committee Draft’. This is one of the first steps in the ISO standardization process. The next step will be DIS (“Draft International Standard”), and at the end – if all goes well – it will be an International Standard, with the nice number ISO/IEC 18181.

While most of the development work is done now, there is still a lot of tweaking, experimentation and evaluation to be done to make JPEG XL the best image (and animation) codec it can possibly be.

We hope it will become as successful as JPEG, and gain widespread adoption – eventually replacing JPEG, PNG and GIF. To help that transition, we made sure that existing images in one of those three formats can be converted to JPEG XL in a pixel-exact way with guaranteed compression gains.

New JPEG XL encoded images will need only one third of the bytes JPEG needs to reach the same perceptual quality.

We’ve come a long way since the first format and I think most of us can appreciate what these savings and quality improvements mean for the ‘visual web’. These are very exciting times indeed!

About the author

Dr Jon Sneyers, senior image researcher at Cloudinary, invented the Free Lossless Image Format (FLIF). His image processing “Muscles from Brussels” help deliver super-optimised images, super-fast.

Cloudinary’s Sneyers: pumped up developing the ‘visual web’.


October 9, 2019  2:08 PM

IFS success services tackle challenge of software ‘outcomes’

Adrian Bridgwater

It’s a tough one and a big ask — you’re a software company and you need to stay current and to show some level of evidence to suggest that your portfolio, suite and wider stack of software and services is constantly evolving.

Stopping short of a full re-architecting of any codebase or structure, enterprise applications company IFS has used a slightly softer term to describe its current engineering overhaul – so not quite a re-architecting of software, but a reimagining of software.

The company has taken the wraps off its new ‘success service’ offerings – a set of service and support offerings that are engineered to deliver predictable costs, manageable timelines and technology-business outcomes.

Software outcomes

That term ‘outcomes’ sounds like marketingspeak, but it’s a real ‘thing’ in enterprise software delivery, because the industry is talking openly about the suggestion that customers might increasingly look at only paying for outcomes-based delivery agreements.

This is where software vendors charge for software based upon the tangible business success that customers can evidence after software has been deployed. IFS CEO Darren Roos spoke to press in breakout sessions at the company’s annual IFS World user conference and said that he has had conversations about full outcomes-based deals that involve a degree of profit sharing… and that ultimately, customers don’t want to do them.

Regardless then, IFS has focused on outcomes with its latest services reimagining.

The company insists that, for too long, software vendors have neglected the importance of ensuring customer success beyond go-live. As such, IFS has introduced two new service offerings aimed specifically at helping customers maximise what their software can do and deliver.

The new product plays are differentiated to cater to the needs of businesses from global enterprise-scale organizations looking to work extensively with a dedicated customer success team, to mid-sized companies that want right-sized application management and support (AMS) on an ongoing basis.

IFS Select & Success

IFS Select is a services framework for customers – and the company says it supports everything from data-driven strategic decision-making based on real-time data, to ongoing business support, on-site enterprise architects, IT change management and more.

IFS Success provides a services framework that allows customers to choose the outcome-based service components that they need relevant to their business priorities.

The four pillars of IFS Success are: 

  • Value Assurance: Understanding the expected business value and running the initiatives needed to unlock it. 
  • AMS (Application Management Services): Operational and expert application management with ongoing access to IFS as well as quick response and resolution times.
  • Safeguarding: Offering customers choice through a network of partners from system integrators, change management specialists and boutique industry technology houses.
  • Customer Success Management: For customers using two or more of the components above, IFS will work to ensure the business is served with continuous improvement and enhanced support models.

IFS senior vice president of consulting Stefano Mattiello has explained that his company is continuously improving the ways in which the software is deployed and utilised. He notes that the IFS methodology has been extended to include post-go-live value realisation and value maximisation phases to reflect customers’ evolving business needs.

In addition, heavy investment has been made in the IFS Solution Composer to visualize IFS solutions as well as in the IFS Industry Accelerators to help customers go live better, faster, and adopt the software in the most cost-effective ways.

Beyond implementation…

This is a positive discussion point, surely?

We (arguably) need to change the label “After Sales” to “Beyond Implementation Services” and this is perhaps something of what IFS is pushing towards.

 

 

