The Computer Weekly Developer Network team are big fans of robust data management, younger-breed tech mavericks who want to carve a new niche in the crowded information intelligence market, Southern barbeque with lots of hot sauce… and, of course, every good Country Music star from Johnny Cash to Taylor Swift.
By a stroke of luck, we’ve zeroed in on Tanium CONVERGE 2019.
Now in its fourth year, CONVERGE has gained some critical mass and this year is due to host close to 1000 attendees for the first time.
Staged from November 18 – 21 in Nashville, Tennessee… Tanium will host keynotes, breakout sessions and hands-on technical labs as integral elements of the total conference programme.
So what does Tanium do?
Tanium provides customers with insight into and control over endpoints, including laptops, servers, virtual machines, containers and cloud infrastructure. The technology can help IT security and operations teams ask questions about the state of every endpoint across the enterprise, retrieve data on their current and historical state, and execute change as necessary. Tanium also enables organisations to assess security risks and contain threats.
A different style
Tanium says that its Nashville event will drill deep into security and IT operations in a different kind of way.
“There are no sprawling trade show floors, aggressive sales messages, or sessions with overly broad content. Our conference is designed to be different. We want to empower technology practitioners to achieve true visibility and control of their IT environments, using modern architecture and tools built for this century,” notes Tanium, in an event preview statement.
News bites & product highlights
Recent news saw Tanium ranked 7th on the Forbes Cloud 100 list, which rates companies on valuation, revenue and growth.
Among its products we find Tanium Reveal, software designed to reduce risks of data exposure, mitigate the impact of breaches and prepare for regulatory compliance obligations.
The technology itself defines sensitive data patterns and works to continuously monitor IT environments for matching files… so that systems managers can then categorise, notify, alert, and take direct action.
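Conceptually, this kind of sensitive-data discovery is a pattern scan: define what sensitive data looks like, sweep content for matches, then categorise the hits. The minimal Python sketch below is illustrative only; the patterns and function here are invented for the example and are not Tanium Reveal's actual engine.

```python
import re

# Illustrative patterns only; real products ship far more robust detectors
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_text(text):
    """Return a dict mapping each pattern name to the matches found."""
    hits = {}
    for name, pattern in SENSITIVE_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[name] = found
    return hits

sample = "Contact jane@example.com, card 4111 1111 1111 1111."
print(scan_text(sample))
```

In a product, the same loop would run continuously across every monitored endpoint and feed alerting and categorisation rather than a simple print.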
The message from Tanium is that the pace of growth in both connected endpoints and the data that they collect and retain, has far exceeded what’s supportable by legacy technologies.
“Tanium Reveal was born from a simple idea: make searching for sensitive data across an enterprise as fast as any other Tanium question – delivering results at speed. We’re excited to help users to define sensitive data patterns, continuously monitor endpoints for matching artifacts, and then categorize, notify, alert, and take action across even the largest and most complex environments,” noted Chris Hodson, chief information security officer at Tanium.
Other key products to be showcased include Tanium Performance, a software solution for IT Operations teams to monitor, investigate and remediate performance issues at scale on desktops, laptops and servers.
Hodson says that Tanium Performance provides IT teams with access to rich historical data for a single endpoint, so that they can effectively troubleshoot problems and lower mean time to repair. Customers can then use this data to make better decisions about IT initiatives related to software and hardware changes.
The Computer Weekly team is out to find out just how widely organisations truly lack visibility into endpoint performance issues… and also to examine just how far the manual troubleshooting headache really extends as we dig through Tanium’s messages…
… oh, yes, and we’ll eat some Southern Barbeque too.
Dynatrace has added a new raft of analytics to its software intelligence offering.
Known as Digital Business Analytics, the software itself is made of code (it’s digital), it’s intended for enterprise usage (that’s business) and it performs analytical functions on data already flowing through Dynatrace’s application and digital experience monitoring modules (yep, that’s the analytics part).
The company says it uses real-time AI-powered analysis to bind user experience, customer behaviour and application performance data together with business metrics.
The coming together of key indicators is intended to shed more light on sales conversions, orders, customer churn, release validation, customer segmentation and other areas.
Digital Business Analytics joins Application Performance Management (APM), Cloud Infrastructure Monitoring (CIM), Digital Experience Management (DEM) and AIOps as part of the Dynatrace all-in-one Software Intelligence Platform.
The company’s AI-engine is called Davis.
In operation, Davis continually learns what ‘expected normal’ business performance looks like and provides proactive answers to issues for optimisation of compute resources.
“Digital transformation projects are spurring companies to create multidisciplinary line of business teams that run the business with a product mindset and are demanding answers to questions that were previously difficult, slow or impossible to obtain,” said Steve Tack, SVP of Product Management at Dynatrace. “Digital Business Analytics complements existing web analytics tools to deliver real-time and complete results, by combining existing customer facing channels with application and user experience data.”
Dynatrace says that as data volume and velocity accelerates, organisations are struggling to make sense of disparate dashboards from traditional IT monitoring tools, web analytics and ad hoc reporting.
The company insists that its Digital Business Analytics product automatically captures business data and analyses it in context with user experience and application performance data.
Key pillars of Digital Business Analytics include:
- Transactions: Automatic tracing, segmentation and data extraction from business transactions.
- Analytics: AI-powered analysis, exploration/querying and extraction of business-relevant insights from Dynatrace application and user experience data.
- Conversions: Visualisation of and collaboration on business-relevant metrics such as conversions and revenue performance by product, customer segment, geography and so on.
- Automation: AI-powered anomaly detection, alerting and root cause determination for business processes, with programmable APIs to trigger business workflows and change events.
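Anomaly detection of the kind the Automation pillar describes can, at its simplest, be sketched as learning what ‘expected normal’ looks like from a trailing window and flagging large deviations. The toy Python example below stands in for the concept only; it is an assumption-laden sketch, not Dynatrace’s Davis engine.

```python
import statistics

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag points that deviate from a rolling baseline by more than
    `threshold` standard deviations -- a toy stand-in for the kind of
    'expected normal' learning an AIOps engine performs."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid division by zero
        if abs(series[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

orders_per_minute = [100, 102, 99, 101, 100, 98, 101, 12, 100, 99]
print(detect_anomalies(orders_per_minute))  # the sudden drop at index 7
```

A production system would learn seasonality and multi-metric correlations rather than a flat rolling mean, but the business framing is the same: a sudden drop in orders is an anomaly worth an alert.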
This is a guest post for the Computer Weekly Developer Network written by Dr Jon Sneyers, senior image researcher at Cloudinary.
Cloudinary is a SaaS technology company headquartered in Santa Clara, California, with an office in Israel. The company provides a cloud-based image and video management solution that enables users to upload, store, manage, manipulate and deliver images and video for websites and apps.
Last summer I had the opportunity to participate in my third JPEG meeting of the year, the 84th JPEG meeting in Brussels, Belgium, conveniently close to where I live.
These are very exciting times for those of us involved in the world of standards and image formats — and especially for those of us directly involved in the development of what will soon be a major update – the first in about 20 years – to the widely adopted JPEG.
A little history for those less familiar with image formats: JPEG stands for the Joint Photographic Experts Group, which created the standard in 1992. The main basis for JPEG is the Discrete Cosine Transform (DCT), a “lossy” image compression technique that was first proposed by Nasir Ahmed in 1972.
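For the curious, the DCT-II that underpins JPEG can be written in a few lines. This naive Python version is purely illustrative; real codecs use fast factorised implementations and work on 8×8 two-dimensional blocks.

```python
import math

def dct_1d(samples):
    """Naive 1-D DCT-II, the transform at the heart of JPEG.
    Returns one coefficient per input sample: a DC (average) term
    followed by increasingly high-frequency AC terms."""
    n = len(samples)
    coeffs = []
    for k in range(n):
        total = sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                    for i, x in enumerate(samples))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        coeffs.append(scale * total)
    return coeffs

# A flat run of pixels concentrates all energy in the DC coefficient,
# which is exactly why smooth image regions compress so well:
flat = [128.0] * 8
print([round(c, 3) for c in dct_1d(flat)])
```

The “lossy” part of JPEG then comes from quantising those coefficients: small high-frequency terms are rounded away, trading invisible detail for fewer bits.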
How did we get here?
You might be wondering how we got here and why a new image format is necessary.
Last spring, the JPEG Committee launched its Next-Generation Image Compression Call for Proposals, also referred to as JPEG XL.
The goal was to: “Develop a standard for image compression that offers substantially better compression efficiency than existing image formats (e.g. >60% over JPEG-1), along with features desirable for web distribution and efficient compression of high-quality images.”
My submission, which I dubbed FUIF or the Free Universal Image Format, was one of seven proposals selected to be presented at the 81st JPEG meeting, held in Vancouver, Canada.
That was the first JPEG meeting I participated in, presenting and defending my proposal.
Once all seven proposals were presented we spent the week discussing the pros and cons of each proposal, the subjective evaluation results, and how to get started on JPEG XL.
At the 82nd JPEG meeting in Lisbon, Portugal, it was decided to combine two proposals: my FUIF and Google’s PIK, and make that the starting point for JPEG XL.
Use cases & requirements
At that point, I was promoted to the role of co-chair of the JPEG XL Ad Hoc Group, together with Google’s Jan Wassenberg. The main discussions were on refining the use cases and requirements… and, on how to combine and unify the two proposals into a single image codec.
At the next meeting, held in Geneva, Switzerland, we started writing the formal standard – it was still a very rough ‘working draft’, but it was a good start.
Gradually we transitioned from a situation where the code was the ‘source of truth’ (and the spec draft was describing whatever the code did) to a situation where the spec became the source of truth and the code had to implement what the spec was saying.
The road to ISO
At the 84th JPEG meeting, we took JPEG XL to the next stage, which is ‘Committee Draft’. This is one of the first steps in the ISO standardization process. The next step will be DIS (“Draft International Standard”), and at the end – if all goes well – it will be an International Standard, with the nice number ISO/IEC 18181.
While most of the development work is done now, there is still a lot of tweaking, experimentation and evaluation to be done to make JPEG XL the best image (and animation) codec it can possibly be.
We hope it will become as successful as JPEG, and gain widespread adoption – eventually replacing JPEG, PNG and GIF. To help that transition, we made sure that existing images in one of those three formats can be converted to JPEG XL in a pixel-exact way with guaranteed compression gains.
New JPEG XL encoded images will need only one third of the bytes JPEG needs to reach the same perceptual quality.
We’ve come a long way since the first format and I think most of us can appreciate what these savings and quality improvements mean for the ‘visual web’. These are very exciting times indeed!
About the author
Dr Jon Sneyers, senior image researcher at Cloudinary, invented the Free Lossless Image Format (FLIF). His image processing “Muscles from Brussels” help deliver super-optimised images, super-fast.
It’s a tough one and a big ask — you’re a software company and you need to stay current and to show some level of evidence to suggest that your portfolio, suite and wider stack of software and services is constantly evolving.
Stopping short of a full re-architecting of any codebase or structure, enterprise applications company IFS has used a slightly softer term to describe its current engineering overhaul – so not quite a re-architecting of software, but a reimagining of software.
The company has taken the wraps off its new ‘success service’ offerings – a set of service and support offerings that are engineered to deliver predictable costs, manageable timelines and technology-business outcomes.
That term ‘outcomes’ sounds like marketingspeak, but it’s a real ‘thing’ in enterprise software delivery, because the industry is talking openly about customers increasingly looking to pay only for outcomes-based delivery agreements.
This is where software vendors charge for software based upon the tangible business success that customers can evidence after software has been deployed. IFS CEO Darren Roos spoke to press in breakout sessions at the company’s annual IFS World user conference and said that he has had conversations about full outcomes-based deals that involve a degree of profit sharing… and that ultimately, customers don’t want to do them.
Regardless then, IFS has focused on outcomes with its latest services reimagining.
The company insists that, for too long, software vendors have neglected the importance of ensuring customer success beyond go-live. As such, IFS has introduced two new service offerings aimed specifically at helping customers maximise what their software can do and deliver.
The new product plays are differentiated to cater to the needs of businesses from global enterprise-scale organisations looking to work extensively with a dedicated customer success team, to mid-sized companies that want right-sized application management and support (AMS) on an ongoing basis.
IFS Select & Success
IFS Select is a services framework for customers – the company says it supports everything from data-driven strategic decision-making based on real-time data to ongoing business support, on-site enterprise architects and IT change management.
IFS Success provides a services framework that allows customers to choose the outcome-based service components that they need relevant to their business priorities.
The four pillars of IFS Success are:
- Value Assurance: Understanding the expected business value and running the initiatives needed to unlock it.
- AMS (Application Management Services): Operational and expert application management with ongoing access to IFS as well as quick response and resolution times.
- Safeguarding: Offering customers choice through a network of partners ranging from system integrators and change management specialists to boutique industry technology houses.
- Customer Success Management: For customers using two or more of the components above, IFS will work to ensure the business is served with continuous improvement and enhanced support models.
IFS senior vice president of consulting Stefano Mattiello has explained that his company is continuously improving the ways in which the software is deployed and utilised. He notes that the IFS methodology has been extended to include post-go-live value realisation and value maximisation phases to reflect customers’ evolving business needs.
In addition, heavy investment has been made in the IFS Solution Composer to visualize IFS solutions as well as in the IFS Industry Accelerators to help customers go live better, faster, and adopt the software in the most cost-effective ways.
This is a positive discussion point, surely?
We (arguably) need to change the label “After Sales” to “Beyond Implementation Services” and this is perhaps something of what IFS is pushing towards.
The Computer Weekly Developer Network and Open Source Insider team want to talk code and coding.
But more than that, we want to talk coding across the diversity spectrum… so let’s get the tough part out of the way and talk about the problem.
If all were fair and good in the world, it wouldn’t be an issue of needing to promote the interests of women who code — instead, it should and would be a question of promoting the interests of people who code, some of whom are women.
However, as we stand two decades after the millennium, there is still a gender imbalance in terms of people already working as software engineers and in terms of those going into the profession. So then, we’re going to talk about it and interview a selection of women who are driving forward in the industry.
Cécile Tran is a graduate software engineer working at Freetrade, a company that describes itself as a ‘challenger stockbroker’ that has brought commission-free investing to the UK. The organisation is preparing to launch in Europe in early 2020.
CWDN: What inspired you to get into software development in the first place?
Tran: At a young age, I knew I wanted a career that was ever-changing and constantly challenged me to think outside of the box. I had taken a few basic programming courses in undergrad at EPF, an Engineering School in Sceaux, France. I loved the idea of being able to build and code my own games. After all, there aren’t many careers where you can say playing and building video games is advancing your technical skills. Upon realising how enjoyable coding and programming was, I decided to apply and attend CentraleSupélec, where I specialised in machine learning and robotics. While there I started to enjoy software development even more!
CWDN: When did you realise that this was going to be a full-blown career choice for you?
Tran: I started to realise software engineering was the right choice for me when I interned with Volkswagen’s ERL team in California. While interning I was able to build out and develop core tools for research purposes. As time went on, I was given even more responsibility and had the opportunity to build a mobile application that collected data to help develop a predictive algorithm. I was so excited by the idea that my work would contribute to something bigger and would eventually be used in Volkswagen cars globally. To me, this made the work more meaningful and exciting. I also realised that software engineering could give me the financial stability to work and live abroad, which not every job can provide.
CWDN: What languages, platforms and tools have you gravitated towards and why?
Tran: I’ve mostly been a full-stack developer leaning more towards front-end development. I’ve also developed Java EE applications on Unix based platforms. At Freetrade, I now work around more modern technologies/languages hosted with GCP, using Firebase functions and real-time database. The backend is written in TypeScript and our client apps are written in Swift (iOS) and Kotlin (Android). Currently, I’m leading the development around ISA subscriptions for Android. It’s been a great challenge and learning experience for me!
CWDN: How important do you think it is for us to have diversity (not just gender, but all forms) in software teams in terms of cultivating a collective mindset that is capable of solving diversified problems?
Tran: I believe diversity is a fundamental part of solving problems. Without a variety of diverse team members, skills and backgrounds, problems wouldn’t be solved and we wouldn’t be able to evolve our product offerings. The most enriching environments come with diversity!
CWDN: What has been your greatest software application development challenge and how have you overcome it?
Tran: One of the biggest challenges in my career was when I worked in a consulting firm with clients that used legacy infrastructure that was outdated and rigid. My team and I had to convince the client that despite the cost of the new tools, migrating to a new tech stack would benefit the business in the long-term. After numerous discussions, demos and proof of concept, we were able to change their minds about the major advancements new technology could have for their business.
CWDN: Are we on the road to a 50:50 gender balance in software engineering, or will there always be a mismatch?
Tran: I think we’re definitely on the way to better gender balance, but still a long way off from a 50:50 gender split. I don’t think we should fixate on the balance, instead, we should ensure that the same opportunities are available and can help a person, regardless of gender, grow and succeed in their career. As a female engineer, I can see that there’s still quite a lot of work to be done to change mentalities around women in the tech industry, especially in senior-level roles. While we still have a long way to go, I do think we’re starting to see a shift thanks to ambitious and hard-working women in this industry. I look forward to seeing what the next 5, 10 and 15 years will look like.
CWDN: What role can men take in terms of helping to promote women’s interests in the industry?
Tran: I believe there should be an ongoing effort from everyone to remain open-minded and inclusive in every work setting. This means considering women’s work as equal to their male counterparts, removing any gender-bias. As much as I would love to have more women mentors, I’ve never felt that my gender has prevented me from carrying out my projects or achieving my career goals.
CWDN: If men are from Mars and women are from Venus, then what languages or methodologies separate the two (basic) sexes?
Tran: This is an interesting one. I don’t really think any of the languages or methodologies are specific for either gender. I view each language as a way to learn and master my programming skills. I think over the next few years we’ll see an even greater shift in women and men mastering a variety of languages.
CWDN: If you could give your 21-year old self one piece of advice for success, what would it be?
Tran: As I’m just a few years older now, I would simply tell my younger self to continue to be open to new challenges, learn as much as I can and be open to meeting amazing people through work. I’d like to remind myself to enjoy every success, overcome obstacles and save some time for myself.
The Computer Weekly Developer Network (CWDN) team is at IFS World 2019 in Boston, Massachusetts.
IFS is known for its industrial cloud software deployments with a specific focus in areas including Field Service Management (FSM), Enterprise Asset Management (EAM) and Enterprise Resource Planning (ERP) systems.
CEO Darren Roos staged his keynote presentation in his typically direct South African style to detail where the business has been developing over the last year or so.
Roos brews up
IFS CEO Roos confirmed that this is the biggest IFS event to date, with 40% more attendees in 2019. He spoke openly about how much he had been charged by industry analysts to bring more customer focus to this show.
“We see that many of our customers are not always number one or number two in their industry; instead, they are the up-and-coming firms that are looking to truly disrupt. Because of this we have hinged our event this year around the theme ‘For The Challengers’ because it embodies not just our customers’ positions, but also our own, as IFS now increasingly becomes so much more of a force in the industry,” said Roos.
Today in 2019, IFS invests more than twice as much in R&D as it did three years ago… and the company only this week acquired Field Service Management (FSM) company Astea. This makes IFS the market leader (so claims Roos) in the FSM market.
“By combining with Astea, IFS will expand its global footprint beyond its more than 10,000 customers worldwide, of which 8,000 are in service management. In 2020, IFS anticipates FSM license revenues to grow at more than 40%, approximately 80% of which are forecast to be recurring,” noted IFS, in a press statement.
Looking at what companies are doing today with technology, Roos says what the majority of firms really want is a) faster time to value b) better ease of use and c) both of these factors at a lower Total Cost of Ownership.
With an increasingly digital market for all work practices now being built across the IT industry, Roos spoke of users touching IFS products every day. Stage discussions gravitated around how firms can become so-called ‘challenger brands’ and develop new market lines of business.
A Norwegian paint manufacturing company has used the IFS suite to manage its global delivery across several markets while, importantly, remaining sensitive enough to ‘adapt to local process’ in each market it operates in.
The road to ‘digitisation’ and ‘servitisation’ (i.e. being able to define all aspects of business operations as more composable services… and, fundamentally, the move from selling products to selling services) is the journey that many firms are on today and these two terms define much of what IFS talks about in relation to the development of its suite.
Roos has an up-front direct style. So much so that he apologized directly for jumping around between different capabilities in the (arguably quite rapidly expanding) IFS portfolio. It’s not your usual American ‘whoop, whoop – awesome!’ kind of delivery, which, if anything, makes for a refreshing change.
The firm is bringing more Artificial Intelligence (AI), Augmented Reality (AR) and Internet of Things (IoT) capabilities into its stack… and Roos insisted that he needed to highlight this in his presentation even if this session couldn’t also feature a full technical breakdown.
Between IFS spokespeople and guest speakers at this keynote, the discussion broke down what it might mean to move to so-called business 4.0 style operations. The four cornerstones of this might be summarised as:
- Embracing risk
- Leveraging ecosystems
- Creating exponential value
- Customising on a mass level
Rounding out the technical section of his keynote with a thank-you to partners, Roos encouraged attendees to extract value from this conference and become a challenger in whichever industry they work in.
Today, it’s all about content, right? The web needs content, apps need content, newswires need content and (dare we say it) some press & publishing channels run on a form of journalism that they like to call content.
With so many technical conferences out there called TechEd, Next or Now and so on… it’s interesting to find one that takes a more content-centric approach, but this is content with intelligence.
Abbyy (the company prefers ABBYY for official branding) is hosting its Content IQ Summit to discuss the future of automation and the impact of Robotic Process Automation (RPA) on businesses who are using AI-enabled skills to understand enterprise content and processes.
The company is known for its foundations in document capture and management but now wants to push the envelope and self-style itself as a provider of so-called ‘Digital IQ’ for the enterprise. So is that cheesy marketing terminology, or is there know-how (and real intelligence) behind this take on IQ?
The Computer Weekly Developer Network team is off to Nashville, Tennessee from October 23 – 25, 2019 to find out.
As well as keynote speakers from Forrester and Intellyx, the event itself is staged to share best practices and host a number of hands-on workshops.
Typical attendees will be professionals in positions related to document and information management… but also C-suite execs and all manner of AI architects and the wider spectrum of data developers and other software engineers.
“Organisations are faced with new challenges associated with the rise of the digital workforce and need guidance and solutions on how to successfully marry artificial intelligence with human intelligence,” commented Ulf Persson, CEO of Abbyy. “The Content IQ Summit provides an excellent opportunity to hear from industry leaders and learn hands-on how organizations can equip digital workers with cognitive skills to understand enterprise content and processes to make intelligent business decisions.”
Forrester VP for service enterprise architecture Craig Le Clair will host a session entitled ‘Invisible Robots in the Quiet of the Night – How AI and Automation Will Restructure the Workforce’ during which he will urge attendees to no longer cling to the remnant of the traditional workplace.
The event’s main three tracks are: market insights and business opportunities; process and product insights; and technologies in-practice.
Content IQ & Process IQ
Essentially, what we have here is not just content… it’s a technology proposition to take enterprise content forward and turn that content into actionable business intelligence. Abbyy also talks about the further move forward to the business benefits of incorporating ‘process intelligence’ into organisations.
According to an event press statement, “[To define Process IQ, we must consider how] the lack of visibility into processes is a major roadblock to intelligent automation. ABBYY Timeline neural network-enabled platform provides true understanding of what’s inside business processes, what processes to target for automation or how to monitor these for success following implementation. We call this Process IQ.”
Technology conventions like to have a tagline and Content IQ Summit is no different, the event’s theme is: “Elevate Your Digital Intelligence” [try typing that into the hotel WiFi to get online eh?]… and Abbyy also promises (as does every vendor worth its salt these days) a focus on how all of these technologies impact the customer and user experiences.
As already reported on Computer Weekly, in August of this year Abbyy announced the acquisition of TimelinePI – a developer of a process intelligence platform designed to allow users to understand, monitor and optimise business processes.
This acquisition is an investment by Abbyy into the emerging ‘process mining’ market. The global process analytics market size is expected to grow to USD 1,421.7 million by 2023 according to Research and Markets.
The business development concept from the company itself is a coalescence of two of the main challenges of automation: understanding content (Content Intelligence) and understanding processes (Process IQ).
What else can we expect?
Well, this is going to be Nashville, so expect music content of the country kind, food content of the roasted meat with barbeque sauce kind and a warm welcome of the Southern Tennessee kind.
Attendees should expect all of that… and they should also expect to be polite when making comments about President Trump; he’s popular in those parts.
This is a guest post for the Computer Weekly Developer Network written by Travis Greene in his capacity as director of ITOM at Micro Focus – and this story forms part of a series of posts on CWDN that dig into the fast-growing topic of Robotic Process Automation (RPA).
TechTarget defines Robotic Process Automation (RPA) as the use of software with Artificial Intelligence and Machine Learning capabilities to handle high-volume, repeatable tasks that previously required humans to perform — these tasks can include queries, calculations and maintenance of records and transactions.
Greene writes as follows…
Enterprise Application Integration (EAI) is defined as the use of technologies and services across an enterprise to enable the integration of software applications.
According to TechTarget, “EAI is the task of uniting the databases and workflows associated with business applications to ensure that the business uses the information consistently and that changes to core business data made by one application are correctly reflected in others.”
EAI is supposed to link enterprise applications, such as customer relationship management or supply chain management, without requiring data structure changes.
In modern compute environments, Application Programming Interfaces (APIs) take on this EAI role, but enterprises face a challenge when integrating legacy applications that may not be supported by the original developers or vendors. Nonetheless, these applications often continue to support significant revenue for enterprises and cannot be replaced without significant risk and cost.
So, business workers bear the brunt of the workload, performing swivel-chair integration between these applications through menial tasks such as cutting and pasting order numbers from spreadsheets into web interfaces.
Rather than wait on IT to implement a true EAI solution, business users are taking the matter into their own hands with RPA, automating tasks like a user would, through a user interface.
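The ‘swivel-chair’ task being automated is essentially data re-keying. As a hedged illustration (the column names and payload fields below are invented for the example), an RPA-style bot reduces that task to a mechanical transformation from a spreadsheet export to the structured submissions a human would otherwise type in:

```python
import csv
import io

def build_form_submissions(csv_text):
    """Turn a spreadsheet export into the form payloads a human would
    otherwise re-key by hand. Field names here are hypothetical."""
    reader = csv.DictReader(io.StringIO(csv_text))
    payloads = []
    for row in reader:
        payloads.append({
            "order_number": row["order_id"].strip(),
            "amount": float(row["amount"]),
        })
    return payloads

export = "order_id,amount\nA-1001, 250.00\nA-1002,99.50\n"
for payload in build_form_submissions(export):
    print(payload)
```

A real RPA tool would then drive those payloads into the target application’s screens or API; the point is that the tedious, error-prone middle step no longer needs a human.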
But is this really a revolution in automation, or more of an evolution in EAI?
One way of thinking about this is by analogy to the different approaches to implementing security cameras. Hard-wired cameras (i.e. EAI) are highly stable and secure, but involve higher infrastructural investment, while Wi-Fi cameras (i.e. RPA) are in principle less stable and slower, but are also easily installable and – vitally – more responsive to changing needs.
Beyond RPA screen scrapes
Some capabilities that we associate with RPA have been around for a long time, just in different forms such as screen scrapers and macros. In this sense, RPA can be thought of as an evolution of EAI. And RPA is itself evolving: just as there are ways of making Wi-Fi cameras more secure and reliable, RPA is evolving to meet needs that it previously couldn't.
For example, task-based automation in a User Interface (UI) is helpful, but limited to what can be performed in a UI.
Newer RPA offerings include API and command line interface integration to expand that capability. Taking the evolution even further is integration with business process management software to expand the use cases across even more processes.
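A rough sketch of what that expansion looks like in practice: the same automation step tries a programmatic API first and falls back to a command-line interface, rather than driving a UI. Both the API function and the CLI command here are hypothetical stand-ins (the "CLI" simply invokes Python itself so the example stays self-contained):

```python
# Sketch of an RPA step reaching beyond the UI: the same "look up an
# order" task done via an API call, with a command-line fallback.
# The API backend and CLI command are hypothetical stand-ins.
import json
import subprocess
import sys

def lookup_via_api(order_id):
    # Stand-in for an HTTP client call such as requests.get(...)
    fake_backend = {"A-42": {"status": "shipped"}}
    return fake_backend.get(order_id)

def lookup_via_cli(order_id):
    # Stand-in for shelling out to a vendor CLI tool; invoking Python
    # itself keeps the sketch runnable anywhere.
    out = subprocess.run(
        [sys.executable, "-c",
         f"import json; print(json.dumps({{'id': '{order_id}', 'status': 'unknown'}}))"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)

def lookup(order_id):
    # Prefer the structured API; only shell out when it has no answer.
    return lookup_via_api(order_id) or lookup_via_cli(order_id)

print(lookup("A-42")["status"])  # served by the API path
print(lookup("B-7")["status"])   # falls back to the CLI path
```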
Yet, even as RPA evolves to become more powerful, it risks turning into a maintenance burden like its predecessors. UIs are constantly changing, and this can break the automation if the RPA tool is brittle and unable to handle change automatically. In order for the evolution to continue, RPA must become more resilient, leveraging machine learning and AI to recognise changes and automatically adjust the workflow to accommodate without having to cry for help from a human.
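One way that resilience can be approximated today, without full machine learning, is fuzzy matching: if the stable selector an automation relies on disappears after a UI redesign, the tool falls back to the closest visible label instead of failing outright. The page structures and field names below are illustrative only:

```python
# Sketch of a more resilient automation step: when the primary UI
# selector breaks after a redesign, fall back to fuzzy-matching the
# visible label instead of failing. Page data is hypothetical.
from difflib import SequenceMatcher

def find_field(page, selector, label):
    """Locate a form field by its stable selector, or by the closest
    visible label if the selector no longer exists."""
    if selector in page:
        return page[selector]
    # Fallback: score every field's label against the one we expect.
    return max(
        page.values(),
        key=lambda f: SequenceMatcher(None, f["label"].lower(), label.lower()).ratio(),
    )

old_page = {"#order-no": {"label": "Order number", "value": ""}}
new_page = {"#ord-input": {"label": "Order no.", "value": ""}}  # selector changed

field = find_field(new_page, "#order-no", "Order number")
field["value"] = "A-42"
print(new_page["#ord-input"]["value"])  # → A-42
```

The automation keeps working across the redesign without a human intervening, which is the behaviour the paragraph above argues RPA must make routine.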
RPA certainly benefits business users who are freed from the mundane tasks of integrating what IT cannot.
But to avoid a future stall in adoption and to fulfil the demands of EAI, it must evolve to integrate more broadly and become more resilient.
Self-styled 'continuous intelligence' company Sumo Logic has detailed news that 'digital omnichannel ecommerce' agency Vaimo has adopted its technology to provide security analytics for its services.
Vaimo (rhymes with Flymo) is a provider of Magento and Adobe Commerce Cloud ecommerce services across operations in 15 countries — the company uses its own hybrid cloud and so oversees more than 1,000 servers distributed worldwide alongside an integration with content delivery network provider Cloudflare.
This type of IT stack and framework clearly needs security analytics to gather machine data and lock down exposed channels, especially given the compliance demands of the European Union's General Data Protection Regulation (GDPR) and the standards set out by the Payment Card Industry Security Standards Council.
The company uses Sumo Logic to gather metrics and logs across its cloud operations, carry out analytics and detect potential attacks on its infrastructure.
“With Sumo Logic, we can automatically evaluate and track good customer behaviour on our clients’ websites for security – however this also means we can eliminate tools like CAPTCHAs that irritate or get in the way of human customers carrying out their shopping,” commented Wilko Nienhaus, chief technology officer at Vaimo.
For security, the Vaimo team uses Sumo Logic to automate its analytics processes to reduce the amount of work that was previously manual. It can apply machine learning-based pattern recognition, flag any deviations from normal activity for investigation and make it harder for bad actors to disrupt clients.
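The kind of deviation flagging described here can be sketched very simply: compare current traffic against a learned baseline and surface anything that sits far outside the norm. The data, threshold and method below are illustrative only, not Sumo Logic's actual algorithm:

```python
# Illustrative sketch of pattern-based deviation flagging: request
# counts per minute are compared against a learned baseline, and
# outliers are flagged for investigation. Data and threshold are
# made up for the example; this is not Sumo Logic's algorithm.
from statistics import mean, stdev

def flag_deviations(baseline, current, threshold=3.0):
    """Return the values in `current` that sit more than `threshold`
    standard deviations from the baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in current if abs(x - mu) > threshold * sigma]

# Normal traffic hovers around 100 requests/minute...
baseline = [98, 102, 101, 99, 100, 97, 103, 100]
# ...then a burst appears that could indicate scripted abuse.
current = [101, 99, 480, 100]

print(flag_deviations(baseline, current))  # → [480]
```

Real platforms learn richer baselines (per URL, per client, per time of day), but the principle of flagging departures from normal behaviour is the same.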
“Retailers are going through digital transformation projects and moving more of their businesses over to online channels to be competitive. This makes them able to serve their customers faster, but it opens them up to risk around security issues or vulnerabilities. Being able to see these issues developing ahead of any attack taking place is essential. Sumo Logic’s unified approach and cloud-native platform make it simpler for companies to automate processes around data collation, understanding and security, keeping businesses secure at scale. Companies need data and security to work hand in hand,” said James Campanini, general manager and vice president EMEA at Sumo Logic.
Sumo Logic works with Vaimo's security operations teams at present, but its role within Vaimo is expanding over time as more teams see the potential around analytics for their use cases.
NOTE: CWDN does not typically focus on customer case studies, but our team recently reported on Sumo Logic platform developments here and this story serves as some additional contextual follow up.
This is a contributed piece for the Computer Weekly Developer Network written by Chris Pope, VP of innovation at ServiceNow.
ServiceNow is a cloud-based platform company that specialises in developing software to transform old, manual ways of working into modern digital workflows, so employees and customers get what they need, when they need it.
Chris Pope has qualifications in electronic engineering and is known for his analysis of enterprise-scale software platforms in real-world deployment scenarios in the post-millennial cloud age.
Pope writes as follows…
DevOps was supposed to fix everything, right? Okay, well, maybe not everything, but the coming together of Dev (developers) and Ops (operations) under the umbrella that is DevOps was meant to give us a new way of building software with greater team proximity that would ultimately result in better end products for the user.
Don’t get me wrong, DevOps has done well. But if there is a ‘last mile’ in DevOps that we’ve yet to travel, it actually comes back to people and the way we approach workflows.
Point of paradox
Effective DevOps is still all about technology, obviously. But truly effective DevOps is more about people, process and culture than it is about any single vendor’s fancy DevOps package. I would go so far as to say that most DevOps programmes fail because, paradoxically, they’re too focused on the technology.
I was speaking to a particular company this month that told me how it has embraced Agile development practices through DevOps. The organisation had managed to get itself to two releases per year and was delighted with its DevOps structure, Kanban boards and brown-bagged sandwich team lunches.
You did read that right; I said two releases per year.
I think I said something along the lines of: that’s not Agile, that’s ‘tragile’ (as in tragic), which they didn’t particularly appreciate.
The point is, talking up DevOps and buying in a hundredweight of multi-coloured Post-It Notes is not enough. Getting past two releases a year is only achievable when you show people the value of working differently through a more orchestrated approach.
Intelligent automation appreciation
When employees on both sides of the DevOps coin start to understand what tasks they need to focus on, then they can start to understand what they need to worry about more, and what they need to worry about less.
DevOps allows us to gain huge competitive advantage through automated code functions that happen throughout the software development lifecycle, but automation only happens if people know that the automation advantage is there… otherwise they might carry out the process manually.
This is why I say that DevOps is still a work in progress. When people start to understand where they are and what their place is in the workflow, then they're more readily freed up to start solving real business problems and creating truly new software functionality. Well-executed DevOps practices hinge on a core appreciation of this reality and require fewer people, because systems run more efficiently.
Cloud’s democratic pervasiveness
Although there has been a huge weight of industry discussion in the decade or so that we’ve had DevOps in its current form, we did learn many of the core change management lessons being tabled here way back in the age of the mainframe. The difference now is that the web is ubiquitous and cloud is democratically pervasive. This is generally a good thing, but it does bring with it a new responsibility.
What I mean is that anybody and everybody can deploy to the cloud and this makes things more complicated at the surface level. In the distributed computing world of cloud, well-orchestrated DevOps with intelligent workflow management has to be in place, or we risk flying blind.
It’s a bit like the over-engineering that old school drivers complain about when they look at modern car engines. You used to be able to get your hands dirty and tinker with the mechanics, but you can’t do that anymore. If we’re going to drive software forward now on a more sophisticated internal combustion engine, then we need to have protocols and processing in place to be able to deal with that level of higher engineering.
The last mile of DevOps
If DevOps is lacking anything, it is tooling for insight into the whole development and operations process. Building that won't be easy. Developers hate process. They'd rather be freewheeling: creating wild new ideas that buck convention and playing around with abstract designs that might create the next Twitter.
Like I said at the start, when it comes to DevOps, people are too focused on technology and not focused enough on people and systems of work as well as human aspirations and requirements.
This is the last mile that DevOps has yet to travel. This is the route to being able to gather code artefacts and comply with all the required levels of governance and auditing. This is the way we can contain, corral and coalesce all the creativity that great DevOps teams have the potential to deliver.
Next time somebody tries to sell you a DevOps platform, dashboard or toolset, do me a favour will you, please? Don’t ask them what the software does, how powerful it is or how well it integrates. Ask them where the people fit in. If you can do that, then I’ll treat you to an all-you-can-eat order of Post-It notes and a side of fries.
Editorial disclosure: Adrian Bridgwater has worked on corporate content projects with ServiceNow.