Enterprise IT Watch Blog


December 12, 2017  7:02 PM

Robot-driven programming is the leading edge of development’s new era

Profile: Michael Tidmarsh
Artificial intelligence, Programming


Robotics image via FreeImages

By James Kobielus (@jameskobielus)

Auto-programming will become the centerpiece of enterprise application development in 2018. As we move into the new year, we’ll see more application developers leverage the newest artificial intelligence (AI) approaches to algorithmically generate source code. Before long, human beings will scarcely need to write a single line of original program code.

Another key auto-programming technology that we’ll see more of in the new year is robotic process automation (RPA). What’s new about RPA is that it doesn’t program from the “inside-out,” that is, through the old-fashioned approach of writing application logic in source code and then compiling it into an executable format. Instead, RPA takes an “outside-in” tack, using AI to craft program logic through algorithmic inferences from externally observable application artifacts and behaviors.

Essentially, RPA uses AI-driven software “robots” to infer an application’s underlying logic from its presentation layer, user-interface controls, interaction and messaging flows, and application programming interfaces. Key RPA capabilities include screenscraping of UI presentation elements, optical character recognition of on-screen text, auto-sensing of browser-level control and document object models, recording of human-user keystrokes and clicks, and user-specified flowcharting of UI flows. In addition, the technology increasingly relies on the latest advances in machine learning (ML), deep learning (DL), natural language processing (NLP), and computer vision, either to automatically build a high-fidelity replica of an existing application or to work the same magic from prototypes that users craft at various levels of business or technical depth.
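
To make the record-and-replay idea concrete, here’s a minimal Python sketch of the “outside-in” approach: capture a user’s raw UI events, consolidate them into a tidy script, and replay the script through whatever UI driver is at hand. Everything here (the UIEvent class, the helper names) is an illustrative invention, not any RPA vendor’s actual API:

```python
# A minimal, self-contained sketch of RPA-style "outside-in" inference:
# record a human user's UI events, then infer a replayable script from them.
# All names here (UIEvent, infer_script, replay) are illustrative, not any
# vendor's real API.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class UIEvent:
    """One observed interaction: a click or keystroke against a UI control."""
    action: str      # e.g. "click" or "type"
    selector: str    # e.g. a DOM selector or window/control identifier
    value: str = ""  # text entered, for "type" events


def infer_script(recorded: List[UIEvent]) -> List[UIEvent]:
    """Collapse consecutive keystrokes on the same control into one 'type' step,
    a toy version of how RPA tools turn raw event logs into tidy workflows."""
    script: List[UIEvent] = []
    for ev in recorded:
        if (script and ev.action == "type" and script[-1].action == "type"
                and script[-1].selector == ev.selector):
            script[-1].value += ev.value
        else:
            script.append(UIEvent(ev.action, ev.selector, ev.value))
    return script


def replay(script: List[UIEvent], driver: Callable[[UIEvent], None]) -> None:
    """Replay the inferred steps through whatever UI driver is available."""
    for step in script:
        driver(step)


# Raw events as a recorder might capture them, one per keystroke:
raw = [
    UIEvent("click", "#login"),
    UIEvent("type", "#user", "a"), UIEvent("type", "#user", "l"),
    UIEvent("type", "#user", "i"), UIEvent("type", "#user", "c"),
    UIEvent("type", "#user", "e"),
    UIEvent("click", "#submit"),
]
replay(infer_script(raw), driver=print)  # prints the three consolidated steps
```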

Some disparage RPA as just the latest fad in screenscraping, but it’s far more than that. Like screenscraping, RPA interprets the UI of applications and can infer how to execute application steps exactly as a human user would. But RPA’s potential is far greater, especially if these software robots are resident on desktops, servers, and other application nodes. In this larger picture, RPA infrastructures can be configured to observe, capture, profile, and reproduce any flow of content, context, and control in any application environment. They can also be set up to non-disruptively and automatically detect changes to APIs, document object models, and other observable application architectures, and then trigger changes to the auto-generated application code that interfaces with those features.
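
A hedged sketch of that change-detection loop, again in plain Python: fingerprint a snapshot of an observed DOM/API model and flag regeneration when the fingerprint drifts. The function names are hypothetical, not drawn from any real RPA product:

```python
# An illustrative sketch of the change-detection loop described above:
# fingerprint a snapshot of an externally observable artifact (here, a
# DOM-like model) and flag regeneration when the fingerprint drifts.
# Function names are hypothetical, not any real RPA product's API.

import hashlib
import json
from typing import Optional, Tuple


def fingerprint(observed_model: dict) -> str:
    """Stable hash of an observed DOM/API model (sorted keys give a canonical form)."""
    canonical = json.dumps(observed_model, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()


def check_for_drift(observed_model: dict, last_seen: Optional[str]) -> Tuple[str, bool]:
    """Return the new fingerprint and whether code regeneration should be triggered."""
    current = fingerprint(observed_model)
    return current, (last_seen is not None and current != last_seen)


# First observation of the application's UI model:
dom_v1 = {"form": ["user", "password"], "buttons": ["submit"]}
fp, changed = check_for_drift(dom_v1, None)

# Later observation: a field was added, so the robot should rebuild its bindings.
dom_v2 = {"form": ["user", "password", "otp"], "buttons": ["submit"]}
fp, changed = check_for_drift(dom_v2, fp)
if changed:
    print("observed interface changed; regenerating auto-built bindings")
```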

Clearly, RPA opens some amazing possibilities for enterprises that are trying to boost developer productivity under tight budget and manpower constraints. Imagine a world in which you can simply point an RPA robot at a well-performing workflow that sprang up in ad-hoc fashion and ask it to crystallize that into a repeatable, documented workflow. Now imagine that you can apply RPA to work this same magic across distributed microservices in your private cloud, in complex multi-clouds, or even among edge nodes on the Internet of Things.

For those reasons, this technology might also unnerve people who’ve made their careers writing the application code and orchestration logic that underpins many enterprise applications. In fact, many RPA implementations come into organizations through business operations teams that are frustrated waiting for IT shops to deliver on their complex application requirements and want to take matters into their own hands. Indeed, RPA robots may be able to produce a first cut of these auto-built orchestrations better and faster than an expert human process designer. This is, in fact, how many organizations are using RPA: providing flowcharting tools that both traditional developers and business users can use to graphically monitor and tweak the process definitions that distributed RPA infrastructures automatically generate.

You might regard RPA as the Trojan horse of the “citizen programmer.” One of the technology’s big advantages in legacy environments is that, because it builds applications from external interfaces, it requires little or no change to existing IT systems. Consequently, it’s an easy drop-in to app-dev and process design shops, allowing developers to boost their productivity by implementing lightweight orchestrations among already-built applications. In 2018, we’ll see a fair amount of application development migrate to the “edges” of the business, in the form of full-fledged RPA-based developers in every line of business.

Also in 2018, we’ll see established RPA solution providers become more prominent players in the cloud application development arena. Chief RPA tool vendors include Automation Anywhere, BlackLine, Blue Prism, Kofax, Pegasystems, and UiPath. These solution providers are blurring the already fuzzy line between RPA, business process orchestration, Web content management, and application development. Wikibon expects many of them to be acquisition targets in 2018 for larger, deeper-pocketed providers of cloud application development suites.

Furthermore, the new year will see the AI features of commercial RPA tools grow in sophistication. RPA vendors will leverage technological advances in computer vision, sentiment analysis, topic modeling, and intent-based modeling. Tools for optimizing the AI models in RPA environments will also grow more sophisticated, training the code-generation process through supervised learning on a deepening pool of field-proven code builds.
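
As a toy illustration of that supervised-learning idea, one could label past auto-generated builds as field-proven or failed, featurize them crudely, and fit an off-the-shelf classifier to score new candidates. The features and labels below are invented for illustration; this is a sketch of the concept, not any vendor’s training pipeline:

```python
# A toy, hedged illustration of supervised learning on field-proven builds:
# label past auto-generated builds as proven (1) or failed (0), featurize
# them crudely, and score a new candidate build. Features and labels are
# invented purely for illustration.

from sklearn.linear_model import LogisticRegression

# Each build featurized as [steps_in_workflow, external_calls, retries_needed]:
X = [[3, 1, 0], [12, 6, 4], [5, 2, 1], [20, 9, 7], [4, 1, 0], [15, 8, 5]]
y = [1, 0, 1, 0, 1, 0]  # 1 = ran cleanly in production, 0 = rolled back

model = LogisticRegression().fit(X, y)

candidate = [[6, 2, 1]]  # a newly generated build's features
print(model.predict_proba(candidate)[0][1])  # estimated probability of success
```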

Programming won’t become entirely automated in 2018, or even anytime in our lifetimes. Though algorithms will soon do the bulk of programming work via RPA, ML, and the like, we will still need humans in the loop to certify that the robots are building only the code we need them to build.

December 11, 2017  8:03 AM

TechTarget’s weekly roundup (12/4 – 12/11)

Profile: Michael Tidmarsh
Artificial intelligence, AWS, CLI, Data protection, Ransomware


Backup image via FreeImages

From ransomware to mergers & acquisitions, it was a busy year in the data protection market. Check out this year’s top trends and news in this week’s roundup.

1. Data protection trends: Ransomware, M&A deals dominate news – Paul Crocetti (SearchDataBackup)

Ransomware made international headlines in 2017 backup news, and vendors looked to mitigate its effects. Vendors were also busy with acquisitions and hyper-converged backup.

2. Keyboard data leak exposes millions of personal records – Michael Heller (SearchSecurity)

A keyboard data leak by mobile developer Ai.type exposed millions of personal records through misconfigured MongoDB database settings.

3. With AI-based cloud management tools, context is king – Kristin Knapp (SearchCloudComputing)

While it’s still early days for adoption, IT pros say AI-powered cloud management tools reduce a lot of the grunt work associated with performance and root-cause analysis.

4. Five years later, CLI still rules as an operational interface – Eamon McCarthy Earls (SearchNetworking)

How long will CLI remain a dominant operational interface? That’s one of the key questions network analysts are asking in this week’s blog roundup.

5. AWS AI services toss machine learning keys to dev teams – David Carty (SearchAWS)

AWS re:Invent featured updates to database and internet of things technologies, but new cloud AI services such as SageMaker broaden the appeal of the cloud provider’s AI tool set.


December 4, 2017  9:15 AM

TechTarget’s weekly roundup (11/27 – 12/4)

Profile: Michael Tidmarsh
Artificial intelligence, Deep learning, Human Resources, Oracle, Tintri


Artificial image via FreeImages

What do the Oracle AI moves mean for the vendor and its customers? Find out what one expert thinks in this week’s roundup.

1. Taking stock of Oracle AI application and database moves – David Essex (SearchERP)

Holger Mueller of Constellation Research gives his take on Oracle AI developments, including the Autonomous Database, and explains why the new recruiting module in Oracle HCM Cloud is a big deal.

2. Radiology AI and deep learning take over RSNA 2017 – Shaun Sutner (SearchHealthIT)

AI and deep learning applications were superhot at RSNA 2017, as value-based medical imaging and PACS and VNA systems remained the top topics for imaging professionals.

3. Tintri storage solves healthcare firm’s primary and DR needs – Sonia Lelii (SearchDisasterRecovery)

Cross Country Healthcare standardized its production and data protection storage on Tintri and now has 400 VMs running on the flash platform. Next up: a cloud setup.

4. AU combines talent analytics with HR management – Patrick Thibodeau (SearchHRSoftware)

People analytics can involve quantitative analysis. This is a specialized skill set that universities have been rushing to fill through master’s degree programs in analytics.

5. NSA data leak exposed Army INSCOM project information – Michael Heller (SearchSecurity)

Yet another publicly accessible cloud storage bucket exposed government data; this time it was an NSA data leak which included information on an Army intelligence project.


November 27, 2017  8:50 AM

TechTarget’s weekly roundup (11/20 – 11/27)

Profile: Michael Tidmarsh
Artificial intelligence, Azure, Data breach, Visual Studio


Driving image via FreeImages

What do you think of how Uber handled the breach and response? Check out the details of the data breach in this week’s roundup.

1. Uber breach affected 57 million users, covered up for a year – Michael Heller (SearchSecurity)

The company covered up a 2016 Uber breach affecting data for 57 million users, including a $100,000 payment to the attackers to keep the incident quiet.

2. CA Technologies antes up with AI in mainframe software tools – Ed Scannell (SearchDataCenter)

CA Technologies’ debut of mainframe software tools with AI and machine learning extends the platform’s relevance in a world increasingly focused on cloud and DevOps.

3. Visual Studio Live Share aims to spur developer collaboration – Darryl Taft (SearchSoftwareQuality)

Visual Studio Live Share enables developers to collaborate on code in real time to build safer, higher-quality applications.

4. How to win in the AI era? For now, it’s all about the data – Nicole Laskowski (SearchCIO)

Deep learning pioneer Andrew Ng explains why data, not algorithms, gives companies a first-mover advantage in the current AI era. Plus: the four traits of an AI company.

5. Microsoft Azure cloud database activity takes off at Connect(); – Jack Vaughan (SearchSQLServer)

Microsoft Connect(); 2017 saw the addition of MariaDB and Cassandra to the Azure cloud database lineup. Also discussed: A set of Spark-based analytics services called Azure Databricks.


November 20, 2017  9:29 AM

TechTarget’s weekly roundup (11/13 – 11/20)

Profile: Michael Tidmarsh
Cloud Computing, DevOps, Kaspersky, Kronos, Procurement


Cloud image via FreeImages

How should organizations prepare for cloud computing technologies in 2018? Check out six different routes in this week’s roundup.

1. Cloud computing technology for 2018: Transform or die – Joel Shore (SearchCloudApplications)

Oracle roadshow keynoters count six alternative routes to adopting cloud computing technology, and they note that the best implementations demand a deep understanding of the customer.

2. Kronos Workforce Dimensions stirs HR tech world – Shaun Sutner (SearchHRSoftware)

Workforce management leader Kronos intends to reshape itself in a turbulent HR tech world, say executives behind the vendor’s new mobile-first SaaS system, Workforce Dimensions.

3. Kaspersky sheds more light on Equation Group malware detection – Rob Wright (SearchSecurity)

A lengthy Kaspersky report offers more insight into how the antivirus company discovered Equation Group malware and came to possess classified U.S. government data.

4. Procurement transformation a main focus at CPO Rising Summit – Jim O’Donnell (SearchERP)

At the CPO Rising Summit, procurement experts discussed how procurement and supply chain will involve technologies like blockchain and AI, but some are skeptical this will happen soon.

5. DevOps transformation in large companies calls for IT staff remix – Beth Pariseau (SearchITOperations)

Large enterprises, such as Kaiser Permanente, base their moves toward DevOps practices on organizational changes that force a shift in IT team mindset.


November 13, 2017  8:17 AM

TechTarget’s weekly roundup (11/6 – 11/13)

Profile: Michael Tidmarsh
Cloud Security, DevOps, Equifax, IBM, SQL Server


Encryption image via FreeImages

How do you feel about Equifax CEO Barros not knowing whether customer data is encrypted or not? Check out his testimony in this week’s roundup.

1. Following Equifax breach, CEO doesn’t know if data is encrypted – Madelyn Bacon (SearchSecurity)

News roundup: Following the massive Equifax breach, the CEO said he doesn’t know if customer data is encrypted or not. Plus, flaws were found in IEEE’s P1735 standard, and more.

2. AI’s role in future of DevOps provokes IT industry agita – Beth Pariseau (SearchITOperations)

AIOps has become a white-hot IT buzzword, but whether smart machines can replace, rather than augment, human intelligence is a much thornier question.

3. Microsoft boosts SQL Server machine learning services – Jack Vaughan (SearchSQLServer)

Python and R are among the tools in the SQL Server machine learning toolkit. Native T-SQL scoring is also on the agenda, as uncovered at PASS Summit 2017.

4. Cloud security tools reflect disparate vendor perspectives – Trevor Jones (SearchCloudComputing)

The latest cloud security tools exemplify the major providers’ varied approaches to address user concerns, and protect customers from themselves.

5. IBM Cloud Private pulls from Big Blue’s roots – Darryl K. Taft (SearchCloudApplications)

IBM sticks close to its roots with IBM Cloud Private, which taps Big Blue’s enterprise and middleware strengths to move customers from the data center to private cloud.


November 6, 2017  8:49 AM

TechTarget’s weekly roundup (10/30 – 11/6)

Profile: Michael Tidmarsh
Azure, GDPR, Machine learning, Ransomware, Violin Systems


Storage image via FreeImages

What odds do you give Violin Systems of long-term survival? Learn why the CEO thinks that profits are on the way in this week’s roundup.

1. Violin Systems CEO Abbasi says profits coming in 2018 – Dave Raffo (SearchStorage)

All-flash pioneer Violin emerges from bankruptcy; 2018 plans call for profits, acquisition, NVMe deployments and software-defined scalability.

2. Bad Rabbit ransomware data recovery may be possible – Michael Heller (SearchSecurity)

Security researchers found a way to recover data locked by the Bad Rabbit ransomware without paying, and others said money might not have been the driver of the attacks.

3. ScaleArc brings database load balancing to Azure SQL DB – Jan Stafford (SearchCloudApplications)

This product update explores a new database load-balancing software release, ScaleArc for SQL Server, which is integrated with Microsoft’s Azure SQL Database.

4. Machine learning’s training data is a security vulnerability – Nicole Laskowski (SearchCIO)

Microsoft’s Danah Boyd has a sobering message for CIOs: The data used to train machine learning algorithms is at risk.

5. GDPR requirements put end-user data in the spotlight – Alyssa Provazza (SearchEnterpriseDesktop)

End-user computing technologies can help IT with General Data Protection Regulation compliance, but they aren’t up to snuff when it comes to inventorying data.


October 31, 2017  10:05 AM

TechTarget’s weekly roundup (10/23 – 10/30)

Profile: Michael Tidmarsh
Artificial intelligence, Docker, Machine learning


Artificial image via FreeImages

How will explainable AI change the way your company uses artificial intelligence? Find out how two companies are delivering on this idea in this week’s roundup.

1. Companies want explainable AI, vendors respond – Nicole Laskowski (SearchCIO)

Dispatch from the Strata Data Conference: The push for explainable AI is on, and companies like H2O.ai and Microsoft are looking to deliver.

2. Agility, comradery drive CA Technologies’ strategy turnaround – Ed Scannell (SearchDataCenter)

With its own digital transformation underway, CA Technologies wants to lead legacy mainframe software users into the world of cloud computing and agile development.

3. DHS’ Dragonfly ICS campaign alert isn’t enough, experts say – Michael Heller (SearchSecurity)

The Department of Homeland Security released an alert confirming the Dragonfly ICS cyberattack campaign, but experts said more action is needed to protect critical infrastructure.

4. Channel firms benefit from Docker MTA strategy – Spencer Smith (SearchITChannel)

The Modernize Traditional Applications program, unveiled at DockerCon 2017, is seeing traction among customers interested in adopting container technology, Docker partners said.

5. Amazon, Microsoft crave more machine learning in the cloud – Trevor Jones (SearchCloudComputing)

Gluon, a new open source interface from Microsoft and Amazon, seeks to simplify machine learning in the cloud, as vendors court more of these workloads for their platforms.


October 23, 2017  2:58 PM

The tension between data science professionalism and automation

Profile: Michael Tidmarsh
Data Science

Data image via FreeImages

By James Kobielus (@jameskobielus)

Data science can function as a sustainable business resource only if it’s managed professionally. Regardless of how your enterprise chooses to organize its data science processes, you need professional management.

One approach to professionalizing your data scientists is to establish internal centers of excellence. To the extent that they come into being, centers of excellence will usually be the pet projects of one or more practicing data scientists who seek to bring greater consistency and repeatability to their teams’ practices and procedures.

Of course, professionalism is a two-edged sword. Data scientists are often proud, stubborn, and fiercely independent, a fact that might make them resistant to innovations in how you organize their work. To the extent that data scientists are given free rein to do what they wish and ad-hoc methods prevail, it may be difficult to establish structured, transparent practices within their teams.

As you introduce industrial-grade automation into your data science practice, excessive professionalism may cause sparks to fly. Data scientists may regard the new automation tools as an irritation, an affront to their professional judgment, or even (worst case) an existential threat. The tensions are likely to grow as automation pushes deeper into the machine learning development pipeline, per my recent discussion here.

However, there’s no turning back to manual methods. Automating the data science development pipeline is the key to operating at enterprise scale. Data scientists will be swamped with unmanageable workloads if they don’t begin to offload many formerly manual tasks to automated tooling. Automation can also help control the cost of developing, scoring, validating, and deploying a growing scale and variety of models against ever-expanding big-data collections.

To work in today’s business world, a data scientist must become more like an industrial engineer. In other words, their professional pride must shift toward a 24×7 regimen of building, training, deploying, productionizing, and managing a steady stream of data-driven models. At a minimum, they will need to master the new generation of data science development tools that (see the sketch after this list):

  • Automatically generate customized REST APIs and Docker images around machine-learning models during the promotion and deployment stages;
  • Automatically deploy models for execution into private, public, or hybrid multi-cloud platforms;
  • Automatically scale models’ runtime resource consumption up or down based on changing application requirements;
  • Automatically retrain models using fresh data prior to redeploying them;
  • Automatically keep track of which model version is currently deployed; and
  • Automatically ensure that a sufficiently predictive model is always in live production status.
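
Here is the promised sketch of that retrain-and-promote loop in plain Python. The registry dictionary, accuracy floor, and stub train/validate callables are all assumptions standing in for a real model registry and ML pipeline:

```python
# A minimal sketch of the automation loop from the list above: retrain on
# fresh data, track which version is deployed, and only promote a
# sufficiently predictive model. The registry and threshold are assumed
# stand-ins for a real model registry and CI/CD tooling.

from typing import Callable, Dict

ACCURACY_FLOOR = 0.90  # assumed promotion threshold

registry: Dict[str, dict] = {}  # version -> metadata; stands in for a model registry
deployed_version = None


def retrain_and_maybe_promote(version: str,
                              train: Callable[[], object],
                              validate: Callable[[object], float]) -> None:
    """Retrain, validate, and promote only if the new model clears the floor."""
    global deployed_version
    model = train()
    accuracy = validate(model)
    registry[version] = {"model": model, "accuracy": accuracy}
    if accuracy >= ACCURACY_FLOOR:
        deployed_version = version  # auto-promote the new version
        print(f"promoted {version} (accuracy={accuracy:.2f})")
    else:
        print(f"kept {deployed_version}; {version} below floor ({accuracy:.2f})")


# Stub train/validate callables stand in for a real ML pipeline:
retrain_and_maybe_promote("v1", train=lambda: "model-v1", validate=lambda m: 0.93)
retrain_and_maybe_promote("v2", train=lambda: "model-v2", validate=lambda m: 0.88)
```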

Clearly, these levels of automation will still require expert personnel to set up, monitor, and tweak the repeatable workflows they’re managing. In this new industrial order, the role of the working data scientist will become similar to that of a foreman in a factory that has implemented robotics and computerized numerical control.

If you’re a data science, DevOps, or IT operations professional, you almost certainly have practical insights for automating data-driven business processes. I would love to hear your thoughts. Please join me on Wednesday, November 1, 2:00-3:00 p.m. (Eastern) for the Wikibon CrowdChat “Automating Data Analytics Management to the Max.” You can participate simply by clicking here, logging in with your Twitter handle, and posting your thoughts in an interactive, moderated Q&A format.


October 23, 2017  7:23 AM

TechTarget’s weekly roundup (10/16 – 10/23)

Profile: Michael Tidmarsh
Artificial intelligence, Cisco, Docker, Kubernetes


Cloud image via FreeImages

What’s your biggest challenge working across multiple clouds? Find out how Cisco is dealing with multicloud environments in this week’s roundup.

1. Cisco cloud VP calls out trends in multicloud strategy – Trevor Jones (SearchCloudComputing)

With assets in house or on various public clouds, enterprise multicloud trends have shifted as new models emerge, said Cisco’s cloud czar.

2. KRACK WPA2 flaw might be more hype than risk – Michael Heller (SearchSecurity)

Researchers discover a WPA2 vulnerability and brand it KRACK, but some experts say the early reports overstate the risk of the flaw and downplay the difficulty of an exploit.

3. End-user security requires a shift in corporate culture – Eddie Lockhart (SearchEnterpriseDesktop)

It’s important for everyone in a company to take security seriously, including end users. A big part of that is training.

4. CIOs should lean on AI ‘giants’ for machine learning strategy – Nicole Laskowski (SearchCIO)

Components of AI, such as machine and deep learning, will be part and parcel of every enterprise. When devising a machine learning strategy, CIOs should think of it as the next wave of analytics.

5. Docker with Kubernetes forges new container standard – Beth Pariseau (SearchITOperations)

Docker’s support of Kubernetes alongside Swarm is a big shift for containers. IT pros see the benefits of this integration but question its effect on market competition.

