SOA Talk


May 22, 2009  4:43 PM

Parasoft SOA package addresses business process/system integration testing

Jack Vaughan

Application life-cycle specialist Parasoft recently moved to expand coverage of critical aspects of complex transactions, which extend – in the company’s terms – “through web interfaces, backend services, ESBs, databases, and everything in between.” The result is a considerable update to Parasoft’s SOA Quality Solution line.

May 20, 2009  5:26 PM

Red Hat builds on Drools

Jack Vaughan

Red Hat continues its move up the middleware stack, improving its core rules engine and launching rules-authoring tools that open the door of rules development to business analysts. JBoss Rules builds on the open-source Drools project.

 

The new release is said to include tooling that makes it easier for business-side folks to write rules.

 

How far can easy rule-making go, and when do the business people have to go to the Java heads to really make things happen? What do you think?
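
For a taste of where that divide sits in practice, here is a minimal sketch built on the Drools 5-era Java API; the Order fact, the package names and the “Flag large orders” rule are all hypothetical, invented for illustration. The rule text is the kind of thing an analyst might author (directly, or through guided tooling), while the surrounding plumbing is where the Java heads still come in.

// Order.java (package demo) -- a plain fact class for a rule to reason over (hypothetical).
package demo;

public class Order {
    private final double total;
    private boolean flagged;

    public Order(double total) { this.total = total; }
    public double getTotal()   { return total; }
    public boolean isFlagged() { return flagged; }
    public void setFlagged(boolean flagged) { this.flagged = flagged; }
}

// RulesDemo.java (package demo) -- compiles the rule text and fires it against a fact.
package demo;

import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;

public class RulesDemo {

    // The kind of rule an analyst might author, expressed in the DRL rule language.
    private static final String DRL =
        "package demo\n" +
        "import demo.Order\n" +
        "rule \"Flag large orders\"\n" +
        "when\n" +
        "    $o : Order( total > 10000 )\n" +
        "then\n" +
        "    $o.setFlagged( true );\n" +
        "end\n";

    public static void main(String[] args) {
        // Compile the rule text into a knowledge base.
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newByteArrayResource(DRL.getBytes()), ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());

        // Insert a fact and fire the rules.
        StatefulKnowledgeSession ksession = kbase.newStatefulKnowledgeSession();
        Order order = new Order(25000);
        ksession.insert(order);
        ksession.fireAllRules();
        ksession.dispose();

        System.out.println("Order flagged? " + order.isFlagged()); // prints: true
    }
}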

 


May 19, 2009  2:39 PM

Trends in BPM and modeling

Jack Vaughan

Technologies rarely evolve neatly in straight lines. Instead they bump into one another and influence each other’s directions. Think of a rack of billiard balls when the cue ball strikes! As an example, look at the technologies that converged in IBM’s recent BPM BlueWorks, a modeling tool set for business processes available as a service via the cloud. To top it off, BlueWorks is built in part on IBM’s sMash enterprise mash-up development tooling. In fact, the front end of BPM, the area where processes are modeled, is very active right now, and IBM is far from alone in innovating.


May 14, 2009  4:39 PM

Matsumura: SOA and the world beyond infrastructure silos

Brein Matturro

By Jack Vaughan

We had an opportunity to speak with Miko Matsumura, chief strategist for Software AG, recently. At a time when SOA seems to be emerging again, chastened, if you will, by a near-death experience, it was pleasant to converse with Matsumura, who always has an engaging take on the SOA news of the day.

SOA, he said, has become vital. In this he echoed recent comments of Forrester’s John Rymer, saying SOA may be getting to the point where it is just an accepted part of the IT development program, like “the background radiation in the universe.” Both Rymer and Matsumura were among the speakers at Software AG’s SOA Summit last week in Phoenix.

Like Rymer, Matsumura cited Geoffrey Moore’s vaunted ‘crossing the chasm’ analogy – it holds that once an early majority takes hold of something useful, it becomes normal. To which we’d add: it becomes non-extraordinary and non-magical, and non-fodder for the hypemeisters. It becomes the way people do business.

A big shift, Matsumura said, was very clear at Software AG’s SOA Summit, where there was a feeling that people are getting pragmatic about SOA adoption. He said people are clear about the difference between SOA and the classic model of application integration.

Matsumura said we are effectively moving toward transcending ‘siloed’ infrastructure. “This is about much more than connecting disparate applications,” he said. “It requires moving people and behaviors, not just bits.”

This all may call for a more enlightened approach to SOA governance, he suggested.

“If you connect two silos, you have to establish not just the concepts of interoperability, but also of the concepts of sharing,” said Matsumura.

Although we were all supposed to learn to share in kindergarten, this is never easy. People get very attached to their servers and apps.

“We end up forming these tribal groups around silos. There is rivalry for budget,” said Matsumura. “The great majority of the cost complexity in existing IT has human origins.”

Matsumura said a lot of the best practices around governance that are emerging are centered on behaviors. The end result may resemble B2B networks, he suggested, as companies adopt sharing practices within the company along the lines of sharing practices they have already adopted with outside partners.


May 12, 2009  3:47 PM

Double indemnity: SOA and Franco

Jack Vaughan

Some things just won’t die. These include the “SOA is Dead” debate. Initiated in January, it is still the starting point for many SOA discussions. It reminds one of Spain’s Franco, as depicted years ago on Saturday Night Live.

Generalissimo Francisco Franco lingered near death for weeks before dying in November 1975. His near-death state was daily cited on news reports. After he did die, Saturday Night’s Weekend Update continued to fill its reports with the news that “Francisco Franco is still dead.”

Joe McKendrick winks and refers to the ongoing “SOA is Dead” debate in an interesting post covering Software AG’s SOA Summit 2009 last week in Phoenix. It seems the next phase for SOA actually is ubiquity – which is, apparently, close to a form of death. McKendrick cites John Rymer, Miko Matsumura, and others. Stay tuned.


May 11, 2009  5:21 PM

Trawling for Impact: BPM BlueWorks, IBM CloudBurst

Jack Vaughan

Hey everyone, first things first: Did you give your mother a Private Cloud for Mother’s Day? Ok, now on with the show.

The blogosphere was abuzz with IBM Impact chatter last week. Much of it centered on the company’s announcement of the CloudBurst private cloud appliance. But there was a lot of guff about IBM’s BPM BlueWorks business modeling as a cloud service as well. There is not a full consensus on ‘what the cloud is’ or ‘what BPM in the cloud is,’ by any means. Guess that makes it interesting.


May 7, 2009  4:56 AM

Russell Irwin at IBM Impact: “Right-size first SOA efforts”

Jack Vaughan

Russell Irwin of Standard Life listed SOA savings and benefits accrued over several years. Still, he cautioned software architects to right-size their first SOA efforts. Don’t start with the Holy Grail, he suggested in a presentation at IBM’s Impact Smart SOA Conference 2009 in Las Vegas.


May 6, 2009  7:31 PM

MicroFocus buys Borland and part of Compuware

Brein Matturro

By Jack Vaughan

Packing a one-two punch, COBOL and mainframe modernization specialist MicroFocus today said it will acquire Borland for about $75 million in cash, at the same time as it buys Compuware’s testing and quality assurance software business for about $58 million in cash.

In a statement, MicroFocus estimated that Borland achieved revenues of $172 million last year, while the Compuware Testing and ASQ Business achieved revenues of $74 million in the twelve-month period ending March 31.

The move clearly extends MicroFocus’ position in the software quality and application life cycle management market. Development stalwart Borland struggled in recent years as it launched multiple acquisitions and sought to transform from a desktop developer favorite into a broad-range application life-cycle software provider. The assets MicroFocus buys from Compuware include the former NuMega testing software line.

Analyst Dana Gardner suggests Compuware’s “Quality” portfolio divestiture leaves it more room to pursue modernization efforts in healthcare and other government IT environments.

Funny thing, though: MicroFocus, which scooped up Borland and the quality/test portion of Compuware, is best known as a modernization concern.


May 6, 2009  5:52 PM

SOAttitude: Something different

Jack Vaughan

IBM put on a SOAttitude party for the Impact conference faithful Wednesday night in Las Vegas. It was a welcome reprieve from the often gloomy daily news, and a harkening back to the great techno days of yesteryear.


May 5, 2009  6:54 PM

Control theory, the economy, BPM, event processing and hydraulic computers

Brein Matturro

By Jack Vaughan

I would like to talk today about control theory, the economy, BPM, event processing and some other strangeness.

While reading a recent science journal I chanced upon a story on control theory and its limitations when applied to the world economy. In “Everything Is Under Control” by Brian Hayes in American Scientist [May-June 2009], the author discusses varieties of feedback control and then analyzes how they might apply, over one snapshot of time or another, to the feedback algorithms at work in that system we call the world economy.

This often controversial project – that is, treating the economic system as a process whose behavior can be divined – has never quite worked, and it is under renewed scrutiny given the large-scale economic turmoil of the moment. Put more simply: the world economy is in an uproar, no one seems to know quite why, and few trust computer programming to figure a way out.
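
For readers who haven’t met feedback control, the simplest variety amounts to measuring the gap between where you are and where you want to be, then pushing back in proportion to it. Here is a tiny, purely illustrative sketch; the thermostat framing and every number in it are made up, not taken from Hayes’ article.

// A purely illustrative proportional-feedback sketch; the thermostat framing and
// all of the numbers here are invented for demonstration.
public class FeedbackSketch {
    public static void main(String[] args) {
        double target = 20.0;  // setpoint: the temperature we want
        double temp = 5.0;     // measured value the controller tries to steer
        double gain = 0.3;     // proportional gain: how hard to push per degree of error

        for (int step = 1; step <= 12; step++) {
            double error = target - temp;  // measure the gap
            temp += gain * error;          // toy plant: applied heat raises the temperature directly
            System.out.printf("step %2d: temp = %.2f%n", step, temp);
        }
        // temp closes in on the 20-degree setpoint geometrically; real controllers add
        // integral and derivative terms, and contend with lag, noise and instability.
    }
}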

Anyway, what especially caught my attention in Hayes’ piece was a depiction of MONIAC, for Monetary National Income Analogue Computer, an early analog computer that was based on – get ready if you haven’t heard this one – hydraulics! Yes, water, rather than electrons, was the goop that moved through this system. The MONIAC system bears an unfortunate resemblance to an early toilet. Still, it was a valid stab at a solution for its time.

Which brings us to BPM. At heart it is a cousin of control theory. ‘Let’s figure out the process and put a master control upon it,’ the BPM practitioner, like the MONIAC developer, might say. As patently obvious as it all may seem right now, BPM is full of streams and eddies that have yet to be discerned. There are many questions. How do rules engines interplay with BPM? How will front-end business modeling ever truly connect with ‘up-from-the-stack’ business process execution? Is BPM related to event processing?

Hard questions to answer easily – no doubt. But we recently put the last of these questions to Neil Ward-Dutton (of Macehiter Ward-Dutton Advisors fame), and we share his thoughts here on event processing and BPM – and how tenuous their relation is.

Writes Ward-Dutton:

Event processing might sound like something that you might use BPM technology for, but in practice it’s implemented quite differently because the runtime conditions in play (and hence the design philosophies) are very different.

BPM technology typically focuses on highly structured flows of work involving the coordination of multiple systems and/or people. Although there might be high throughput at runtime (in straight-through processing in financial trading scenarios, for example) those flows can be mapped out in advance and centrally orchestrated at runtime.

Event processing technology, in contrast, is optimized to detect, filter and analyse and then act on events that might occur in unpredictable ways, in unpredictable sequences, and at very high volumes in highly distributed environments.
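
To make Ward-Dutton’s contrast concrete, here is a purely illustrative sketch of our own – plain Java, no actual event-processing engine or BPM suite – of that detect-filter-analyse-act loop: watch a stream of hypothetical price quotes and raise an alert when the price moves sharply inside a sliding one-minute window.

// A purely illustrative event-processing sketch: detect, filter, analyse and act on
// events as they arrive, rather than orchestrating a pre-mapped flow. The Quote event
// and the 5%-in-a-minute condition are hypothetical.
import java.util.ArrayDeque;
import java.util.Deque;

public class EventSketch {

    // A hypothetical event: a price quote with a symbol, price and timestamp.
    static class Quote {
        final String symbol;
        final double price;
        final long timestampMillis;
        Quote(String symbol, double price, long timestampMillis) {
            this.symbol = symbol;
            this.price = price;
            this.timestampMillis = timestampMillis;
        }
    }

    private static final long WINDOW_MILLIS = 60_000; // sliding one-minute window
    private final Deque<Quote> window = new ArrayDeque<Quote>();

    // Events arrive in unpredictable order and volume; each one is examined as it lands.
    public void onEvent(Quote q) {
        if (!"ACME".equals(q.symbol)) {
            return; // filter: only watch one symbol
        }
        window.addLast(q);
        // Drop events that have aged out of the time window.
        while (q.timestampMillis - window.peekFirst().timestampMillis > WINDOW_MILLIS) {
            window.removeFirst();
        }
        // Analyse: has the price moved more than 5% within the window?
        double oldest = window.peekFirst().price;
        double move = Math.abs(q.price - oldest) / oldest;
        if (move > 0.05) {
            act(q, move);
        }
    }

    // Act: in a real deployment this might raise an alert or kick off a business process.
    private void act(Quote q, double move) {
        System.out.printf("Alert: %s moved %.1f%% within a minute (latest %.2f)%n",
                q.symbol, move * 100, q.price);
    }

    public static void main(String[] args) {
        EventSketch sketch = new EventSketch();
        sketch.onEvent(new Quote("ACME", 100.0, 0));       // first observation, no alert
        sketch.onEvent(new Quote("OTHER", 50.0, 10_000));  // filtered out
        sketch.onEvent(new Quote("ACME", 106.5, 30_000));  // >5% move, alert fires
    }
}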

Neil had more to say, but here we’d forward you to the Macehiter Ward-Dutton Advisors Web site for more analyses of the issues of this era. Right on, Neil! As well, you might check out the blog of Brian Hayes. My two cents’ worth: Brian Hayes is something of a national treasure. He continually finds worthy near-philosophical topics in the often dry desert of computing and math. His work can be found in American Scientist and on his blog, http://bit-player.org/.

