December 29, 2010 11:46 PM
Posted by: James Denman, SOA standards
The current state of cloud and API standards is almost an exact match for early SOA and Web services standards, and we expect the standards movement will follow a very similar trend. Hopefully, the cloud standards groups will stand a better chance by learning from the mistakes and successes of the Web services standards.
The discussion of cloud standards at Cloud Camp Boston started by asking the question “What do we want to standardize?” As we looked at standards we found that there are three attributes of apparent concern. These include “API lock-in” (a similar concept to vendor lock-in), migration issues, and the richness or functionality of an API.
One interesting problem with setting standards (for both APIs and services) is the granularity of the work you're standardizing. Some APIs have a very limited scope and affect only a single application with a single purpose. Others address a broad range of applications with any number of different purposes.
December 20, 2010 4:50 PM
Posted by: Jack Vaughan, SOA development
There was a time when Enterprise Application Integration (EAI) was the antithesis of SOA, but EAI is making a bit of a comeback within the SOA firmament. The fact is that it never really went away.
Back in the day people discovered EAI handled pesky real-world problems. You had a (fill-in-the-blank) workstation or what have you in the corner and you needed to connect it to (fill-in-the-blank) process or what have you in the back room … and quickly. A programmer would write to the two APIs and then labor the rest of his or her career maintaining the point-to-point solution.
December 16, 2010 6:15 PM
Posted by: Jack Vaughan, SOA performance management, User story
The growth of Web-connected systems tapping into back-ends has led to a proliferation of services, and that proliferation can lead to increased system loads. The expansion has led development shops to place more emphasis than ever on test and performance tools.
Such an apparent case of “if you build it, they will come” is described by Sergey Sadovnichiy, manager for enterprise solutions at a large Canadian financial concern.
“What was happening was that services, when they were originally built, were few in number and typically had one consumer. Now the number of consumers has gone up dramatically, as well as the number of Web services themselves, and the complexity of the services has risen,” said Sadovnichiy. These are usually large applications, linking enterprise back-ends to the Web.
To deal with the increased volume and complexity, Sadovnichiy and his team have turned to SOAPSonar tools from Crosscheck Networks Inc.
“We do regression tests of each service from the point of view of each consumer type. We now have automated scripts for major consumers,” he said, adding that the scripts can quickly adapt to each use case.
“A Web service may have, for example, 350 elements. But every user will not use all the elements. In each case we can use a different set of scripts.”
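The idea of per-consumer regression scripts can be made concrete. The sketch below is not SOAPSonar itself, just a generic Python illustration of the approach Sadovnichiy describes: the service exposes many elements, but each consumer's script exercises only the subset that consumer actually uses (all element and consumer names here are hypothetical).

```python
# Generic sketch of per-consumer regression scripting (not SOAPSonar).
# A service may expose many elements, but each consumer uses a subset.
SERVICE_ELEMENTS = {"name", "address", "credit_limit", "tax_id"}  # hypothetical

CONSUMER_PROFILES = {
    "web_portal": {"name", "address"},
    "billing":    {"name", "credit_limit", "tax_id"},
}

def build_request(consumer):
    """Build a test request containing only the elements this consumer uses."""
    used = CONSUMER_PROFILES[consumer]
    assert used <= SERVICE_ELEMENTS, "profile references unknown elements"
    return {element: "test-value" for element in sorted(used)}

for consumer in CONSUMER_PROFILES:
    request = build_request(consumer)
    # Each consumer's regression script sends a different, smaller request.
    print(consumer, sorted(request))
```

Maintaining one profile per consumer type, rather than one monolithic test, is what lets the scripts "quickly adapt to each use case."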
Sadovnichiy said SOAPSonar is used for endurance testing and performance testing, along with regression testing. He also reports 100% test coverage, versus earlier risk-based test schemes that covered no more than 20% of the code.
Crosscheck CEO Mamoon Yunus said the industry has reached an inflection point in terms of services. “Services are getting like Web sites in terms of traffic. There are more trading partners talking to more systems,” he said.
Meanwhile, more able and interactive front-ends are creating more traffic. These RESTful elements do not directly employ XML or SOAP. Crosscheck tools measure JSON and JQuery REST element performance along with traditional SOAP and XML performance.
“People are using more widgets – AJAX widgets, JQuery widgets. From the browser now you hit these services directly. It is not application-to-application anymore as XML, SOAP and Web services were at first. They were more a classic ‘machine-to-machine’ thing. Now, it is ‘portal-to-apps.’ The services now are portal driven.”
Yunus said Crosscheck has just released SOAPSonar 6.0. It allows emulation of a virtually unlimited number of concurrent users, and supports demographically disparate loading agents for cloud computing needs.
December 16, 2010 2:49 PM
Posted by: James Denman, SOA infrastructure
OSGi is poised to provide a service platform extensive enough to provide ubiquitous modularity – but effectively creating OSGi bundles is still difficult. The Nimble Distribution seeks to address this and related issues. Paremus, an OSGi-based private cloud computing provider, and Makewave, the company behind the difficult-to-pronounce Knopflerfish OSGi Service Platform, have teamed up to create and support the new software distribution. The companies suggest their service platform can boost adoption of OSGi much as commercially supported “Linux stacks” boosted adoption of Linux. The initial release of the Nimble Distribution includes the Paremus OSGi Shell (Posh), a Unix-like interactive shell and scripting environment, as well as the Nimble Resolver, the engine of the Nimble Distribution.
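Much of what makes bundle creation fiddly is the manifest metadata OSGi requires. As a rough illustration (the bundle and package names here are hypothetical, and this is a generic sketch rather than anything specific to Nimble or Knopflerfish), a minimal bundle's MANIFEST.MF looks like:

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.greeter
Bundle-Version: 1.0.0
Bundle-Activator: com.example.greeter.Activator
Import-Package: org.osgi.framework;version="[1.5,2.0)"
Export-Package: com.example.greeter.api
```

Getting the Import-Package and Export-Package headers right across dozens of interdependent bundles is exactly the kind of resolution work tools like the Nimble Resolver aim to automate.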
December 15, 2010 2:57 PM
Posted by: Kkriz
By Kathleen Kriz
Modern modeling languages are constantly developing and changing – this includes the most prominent, the Unified Modeling Language (UML). A 2.3 update to UML is supported by training and new tools from vendor No Magic, Inc.
Gary Duncanson, President and CEO of No Magic, said UML is a foundation of modern software development.
“UML is a basic knowledge that every architect out there must have,” said Duncanson. “You can’t claim to be an enterprise architect, a system engineer, a software developer that uses model-driven architecture (MDA) [without it].”
“UML is the gateway to all the other related specs, and all these other profiles are built on top of UML whether it’s for a system on a chip to system engineering to enterprise application integration, all those things are built on top of models, and the basic modeling element is UML,” he said.
Clarence Moreland, COO of No Magic, said UML 2.0 improves on predecessors.
“UML 1 to 1.5 didn’t fit the bill because there wasn’t enough granularity,” said Moreland. “The notation, the syntax of UML didn’t map at a low enough granularity to the syntax of the object-oriented programming languages it was designed to support code generation in.”
UML 2.0 is intended to address syntactic and semantic mismatches between object-oriented programming languages and UML, and also broadens its applicability, expanding use from lower-level to higher-level software engineering.
Yet, the primary reason UML 2.0 was created was to support model-driven architecture.
“A big driver for UML 2.0 was to be able to support OMG’s model-driven architecture initiative,” said Moreland. “The existing specification didn’t have the semantic richness necessary to support MDA so that was the primary driver.”
Older versions of UML are still being used, and according to Moreland, all versions of UML are backwards compatible, meaning a tool that supports UML 2.3 can still work with UML 1.5 models.
One of the most prevalent concerns with UML among users is redundancy in the language.
“The biggest problem now is redundancy and also the complexity of the language as it grows to support broader applicability,” said Moreland. “But it is a general purpose modeling language so that’s given rise to domain-specific languages which are for the most part narrower, specialized versions of the UML.”
No Magic is also offering a free training course on UML 2.0 to help people get up to speed on modeling in general and on UML in particular, said Duncanson.
December 7, 2010 4:12 PM
Posted by: Jack Vaughan, web applications
When Doug Crockford created JSON, it became something of an antidote to XML. This was bound to happen, because the issues developers had with XML were so plentiful. JSON, of course, with an API that fit on a business card, was more than a counter-statement. It was a big success in the making. Twitter recently dropped XML from its API, and this caused a few ripples in the XML/JSON blogosphere. Check this bit on XML versus the Web from Ajaxian.
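The "business card" claim is easy to demonstrate. Here is a minimal sketch in Python (using the standard-library json module; the record itself is made up), where the entire serialize-and-parse round trip is essentially two calls:

```python
import json

# A record that would take several nested XML elements to express.
order = {"id": 1138, "customer": "Acme", "items": ["widget", "sprocket"]}

# The whole round-trip API is essentially two calls.
text = json.dumps(order)    # serialize to a JSON string
parsed = json.loads(text)   # parse it back to native structures

print(parsed["items"][1])
```

There is no schema, no namespaces, and no parser configuration, which is a large part of why browser-side developers embraced it.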
December 6, 2010 9:54 PM
Posted by: Jack Vaughan
When you wrap up increasingly sophisticated components to run in specific environments, the complicated hand-crafted scripting can become a burden. With this in mind, Netherlands-based XebiaLabs has created what it calls ‘deployment automation’ tools aimed at handling Java and related middleware.
Deployit integrates with build frameworks like Maven, continuous integration tools like Hudson and Bamboo, as well as with familiar CMDBs. Many tools like this are associated with Agile development, but successful rapid Agile development methods can break down if middleware deployment becomes a bottleneck, said Andrew Phillips, Vice President of Product Development at XebiaLabs.
Cost is an issue too. These days, highly paid developers often end up tasked with day-to-day deployment for application servers, ESBs, message queues and the like.
“The situation with Continuous Integration tools is that 97% of code gets tested every day,” Phillips said. “But then the stuff sits in a repository somewhere. You need Continuous Deployment too.”
Phillips said XebiaLabs’ Deployit software uses a Unified Deployment model to ensure that deployments across different types of middleware are done consistently.
The software works through a graphical interface. “You take your deployment package and you drag it onto an environment,” he said.
With an ESB or portal that people have developed in a staging environment, the tools can extract and transform the deployment package so it can run in a different environment, according to Phillips. The software is described as ‘agentless’ and includes interfaces for tweaking deployments.
December 2, 2010 5:00 PM
Posted by: James Denman, Data integration
Dun & Bradstreet (D&B) has been providing credible credit information since the 1930s. Many businesses and financial institutions rely on information from D&B to make credit decisions, guide their marketing efforts, and supplement their supply chain management. These organizations may soon have the ability to access this information on-demand from the cloud. D&B recently announced D&B360, a data-as-a-service (DaaS) system built on the Informatica Cloud. Informatica provides the back-end integration platform, focusing on on-demand data integration, synchronization, and data quality services.
The DaaS system is intended to allow enterprise software providers to embed D&B data right in their applications, and is designed to integrate with existing software that uses commercial or professional contact data, like CRM or BI software. In addition, the press release announcing D&B360 states that it “integrates relevant, dynamic information from social media and news sources for a complete picture of businesses,” which makes me think the DaaS system will automatically cross-reference credit information about a business with news stories on the business as well as the business’s Facebook information, and maybe even tweets from the CEO.
November 30, 2010 4:45 PM
Posted by: Kkriz
By Jack Vaughan
Let’s face it, sometimes what’s new is old, and – by the same token – what’s old is new. Some recent stories bear out that fact.
Let’s start with James Denman’s article on graph databases. Although it has precursors, the graph database is a relatively new type of store. It is a niche part of the “NoSQL” movement that has been driven by the success of massively scalable Google, Amazon and Facebook applications. My first reaction to NoSQL was negative, as I’d had a chance way back to cover the early object-oriented databases. These were often touted as “RDB killers.” But they never managed to unseat SQL. It took a while, delving into the NoSQL story, to find that, yes, enough has changed to make the SQL alternative worth a look-see by architects. Keep in mind, though, that “NoSQL” does not mean “no SQL” – it means “Not Only SQL.”
XML arose at a time when organizations were only beginning to try to deal with unstructured data. Who knew that interest in ‘random’ text and snippets of data would come to hold sway over interest in the conventional columns and tables that defined data in those days? Unstructured, semantically rich data is now the most interesting to new age data miners, and, as you know, they ‘don’t need no XML’ to parse their way through it. DoD intelligence cullers and the like have been at this quite a while, as writer Colleen Frye’s recent piece on SOA and semantics discusses. The need for such capabilities has spread far beyond national security. Stay tuned for more on this topic.
It may be a footnote – like the passing of a forgotten Hollywood film star of yore – but it is worth noting. The Web Services Interoperability organization (WS-I) shut down this month and handed future work over to OASIS. In its day, WS-I pushed Web services forward with the promise that big vendors would work together to make sure that their tightly coupled solutions had a genuine way of talking to outsiders via XML in a loosely coupled way. They only succeeded up to a point, and opened the door for SOA. Like only a few Hollywood stars in decline, they knew it was time to move on.
Yes, sometimes what’s new is old, and vice versa. But there is no need to take a jaundiced attitude here. Nothing lasts forever, especially in the technology sector.