Like “cloud” before it, “big data” is a nebulous term veiling some actual trends. Google and Amazon have been startling online successes, and much of their achievement seems to stem from massive amounts of Web-based data that they deftly correlate to create powerful views of the customer. Some people see the big data tent coming to cover sports marketing, pizza delivery and more.
But it is not just data at rest that is in question. The need for big data in motion is growing, observers claim. For its part, middleware stalwart Tibco sees big data, coupled with event processing and fast messaging, as a route to greater market penetration.
“We kind of own the big data problem as it relates to real-time events,” Tibco’s Vivek Ranadive told SearchSOA.com on a recent call. He maintains that even common tasks like pizza delivery – granted, for national chains – will be affected by big data. “When customers inadvertently get cold pizza, the company can pick that up,” and make things better with a free pizza, a coupon or what have you.
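Ranadive’s “cold pizza” scenario is, at bottom, event processing: matching a condition over a stream of operational events and reacting in near real time. A minimal sketch of the idea follows; the event shape, threshold, and make-good action are all illustrative assumptions, not Tibco APIs:

```python
from dataclasses import dataclass

@dataclass
class DeliveryEvent:
    order_id: str
    minutes_elapsed: float  # time from order placed to doorstep

def react_to_late_deliveries(events, threshold=45):
    """Yield a make-good action for any delivery past the threshold."""
    for ev in events:
        if ev.minutes_elapsed > threshold:
            # a real event platform would trigger a downstream action here
            yield (ev.order_id, "issue coupon")

stream = [DeliveryEvent("A1", 30), DeliveryEvent("B2", 52)]
actions = list(react_to_late_deliveries(stream))
```

The point of an “eventing” platform is that this matching happens continuously on live traffic, not in a nightly batch report.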
“When you think about big data, it is about running twenty-first-century risk. The planet needs an ‘eventing’ platform,” said Ranadive, author of “The Power of Now” (1999) and “The Two-Second Advantage” (2011).
The Tibco event architecture plays a role in a recent user story on SearchSOA.com, which profiled Matt Rosen of shipping giant OOCL. His story shows how challenging markets can be, and how pivotal well-managed technology is in addressing them.
Shipping companies were in a tough bind when the 2008 downturn struck, and the going was not easier when recession hit big European markets. OOCL’s performance outpaced competitors’, in significant part due to Rosen’s application development team, which enabled more efficient business processes for the global shipper.
Among a host of technologies Rosen’s OOCL crew employed was an event processing engine from Tibco Software. OOCL’s habitat – the shipping industry – is among those that advanced middleware maker Tibco is counting on to take it beyond its Wall Street techno roots. – Jack Vaughan
A recent piece by Stephanie Mann looks at SOA design issues today. After over ten years of SOA, some best practices are still emerging. Among the notables Mann spoke with is Robert Daigneau. With stints heading development at both Fidelity Investments and Monster.com – he now heads Application Development at Slalom Consulting – few have seen more of the evolution of service design patterns than Daigneau.
Daigneau touched upon a most-dreaded pitfall of SOA – here we call it “boiling the ocean.” It is a sort of top-down approach that insists on enumerating a gazillion “services” before a line of code is written. Practicality has moved this approach off the top of SOA practice lists, but there is something very human about it, and it can creep into projects and programs at any moment. Let’s hand the podium over to Daigneau: “If you try to lay it all out there and say, ‘Let’s dream up all the possible services we’ll need,’ that’s the wrong way to do it. There’s always going to be something new you didn’t anticipate, or something you misunderstood because you had too little information. Instead, look at the individual needs of projects and approach it pragmatically from a consumer perspective. Identify and enumerate the services for particular needs; then introduce the services as needed.”
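Daigneau’s consumer-first advice can be pictured in code: rather than speculating an entire catalog of operations up front, expose only what a known consumer needs today and grow the contract as real needs arrive. A purely illustrative Python sketch (the service and operation names are invented for the example):

```python
class OrderService:
    """Starts with the single operation the first consumer asked for."""

    def __init__(self, orders):
        self._orders = orders  # stand-in for a real backing store

    def get_order_status(self, order_id):
        # the one call today's consumer actually needs
        return self._orders[order_id]["status"]

    # Deliberately no cancel_order, split_shipment, etc. -- each
    # operation is added only when a real consumer demonstrates the need.

svc = OrderService({"o1": {"status": "shipped"}})
status = svc.get_order_status("o1")
```

The design choice is the same one Daigneau describes: the interface is driven by demonstrated consumer demand, not by an up-front enumeration exercise.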
When he says “there’s always going to be something you didn’t anticipate,” he touches on something practitioners have learned in the SOA era: there is no final, tightly specified design that everyone will agree on for all time to come. That SOA adjusted to this fact is one reason it has found as much value as it has while cloud computing, big data and mobile computing have come online. Read “Take new approaches to building services with SOA.” For more, stay tuned.
This year, Hewlett-Packard has continued its efforts to stake out a big presence for its tools in the DevOps world – where many organizations see an opportunity to streamline, rationalize and speed up the process of application development and delivery. For instance, the company now offers updated versions of HP Application Lifecycle Management (ALM) and HP Performance Center (PC) along with new Lab Management Automation and Continuous Application Performance Delivery.
Matt Morgan, vice president, Hybrid IT and Cloud Product Marketing, HP Software, says his company has gotten deeply involved in providing DevOps tools because of strong customer demand. He says demand isn’t just among web-focused companies. “A large insurance company that has been a long-term HP customer used to rev applications twice a year but they are now moving to blend development and operations so they can move to a monthly cycle,” he says. Ultimately, he says, consumers are demanding more and better apps and functionality and that, in turn, is driving development cycles across the enterprise. “That is being replicated in every industry,” he says. “Consumers are judging companies by their apps.”
Against that backdrop, Morgan puts DevOps adopters into three categories. At the “top” are Web-oriented companies and mobility companies that started from the ground up with a DevOps kind of approach that supports daily, weekly, or monthly updates. “Search engine companies, Wikipedia, and Zynga are good examples – their whole organization becomes a beta testing site,” he notes.
The second group of companies has not had the same orientation toward DevOps but has “pockets” of new technology adoption where a DevOps approach has been or can be incubated. “A typical example of this kind of company might be an airline where they have hundreds of old apps but they are moving to adopt consumer-facing mobile apps, so in that part of the company they are running those faster cycles,” he notes.
Then, there are all the other companies – the ones that are still operating according to traditional work and development patterns.
“At HP we believe this trend isn’t just about speed and agility; the user is becoming the centerpiece of all design work for software applications,” says Morgan. The implication is that applications can’t and won’t remain “static” any more. There will be a constant demand for upgrades, updates, and adaptations to new business needs. DevOps will be key. – Alan Earls
Web APIs are multiplying as more retailers, media groups, governments and financial services firms start exposing them. At the same time, many companies are still resistant to API management, according to Paolo Malinverno, research vice president at Gartner. The problem with that, he said, is that using APIs is increasingly at the center of what goes on at the “nexus of forces,” Gartner’s term for the convergence of social, mobile, cloud and information. As a result, lack of management could mean serious loss of value.
“It is a fact that the number of APIs grows by the day and, with the explosion of mobile applications, APIs will be used more and more in the future,” Malinverno told a crowd at Gartner’s Application Architecture, Development & Integration Summit this week in Las Vegas.
He noted that daily API calls have skyrocketed into the billions for many well-known companies. Facebook, for example, saw 5 billion API calls per day in October 2009, while Twitter had 13 billion per day in May 2011.
“These companies better know who is calling,” cautioned Malinverno. “They better know how many calls per second they have to field, and they better know what sort of elasticity they need to demand from their cloud platforms to ensure that whoever uses their API is able to use it properly.”
API management is the way to do that, he said. Without it, businesses may lose out on value in their services and their APIs.
“API management is about making an API available on the Web for everybody that you want to use the API—enabling them to call it and get the result they want,” he explained. “Not everybody feels they need API management, but they do. The assessment of the value of the API is a part of API management.”
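Malinverno’s checklist – know who is calling, how many calls per second you must field, and what elasticity to demand – is typically enforced at an API management layer with per-key rate limits. The following is a minimal fixed-window limiter, an illustrative sketch rather than any vendor’s implementation:

```python
import time
from collections import defaultdict

class RateLimiter:
    """Fixed-window limiter: at most `limit` calls per key per window."""

    def __init__(self, limit, window_seconds=1.0):
        self.limit = limit
        self.window = window_seconds
        self.counts = defaultdict(int)  # (key, window index) -> call count

    def allow(self, api_key, now=None):
        now = time.time() if now is None else now
        bucket = (api_key, int(now // self.window))
        self.counts[bucket] += 1
        return self.counts[bucket] <= self.limit

limiter = RateLimiter(limit=2)
# three calls from the same client inside one window
results = [limiter.allow("client-a", now=100.0) for _ in range(3)]
```

Because every decision is keyed by caller identity, the same bookkeeping that throttles traffic also answers the “who is calling, and how often” question that Malinverno says every API provider must be able to answer.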
Malinverno also noted that SOA governance and API management are very closely tied—perhaps even the same. He said SOA governance is “the ability to link a specific intent of your business strategy to the way you develop and operate services.” He advised his audience to build a strong SOA governance strategy together with API management, to create what he called “application services governance.” – Stephanie Mann
IT is changing. And certain companies are going to face the changes better than others, Andy Kyte, Gartner vice president and research fellow, told a crowd at the Application Architecture, Development & Integration (AADI) Summit in Las Vegas. In particular, he said successful IT organizations are going to learn how to effectively manage information technology and meet a growing demand for applications.
“The growth for demand in application services over the next five years is not one or two percent,” he said. “It is massive and exponential. You have massive amounts of legacy applications that need to be modernized; demand is going up; and capacity to meet that demand is not increasing.”
Kyte noted that while user expectations are going up, the ability of businesses to meet those expectations seems to be going down. From the financial crisis to the burgeoning cost of information technology and the need to deliver agile responses faster than ever, many CEOs and CIOs are struggling with how to keep their organizations competitive—or even afloat.
He said companies should follow these disciplines of highly productive IT organizations in order to succeed:
- Break down legacy culture: Instead of having many competitive teams, highly productive IT organizations have lots of teams that are all focused on a common set of objectives. They collaborate and work together toward a shared goal.
- Flatten the application development organizational structure: Create a culture of many equals, without emphasis on titles or positions. This will foster respect, and promote shared ownership and responsibility for methodologies and processes.
- Build the right software: Highly productive IT organizations have good processes for understanding what is really needed, and when. They put effort into the right things. That means insisting on clarity about nonfunctional requirements before technology selection.
Kyte also stressed the importance of taking a holistic view of the entire life of a system, as opposed to fixating on one of its components. – Stephanie Mann
This week in Las Vegas, over 1,000 IT professionals, analysts and practitioners gathered at Caesars Palace to discuss top trends at Gartner’s Application Architecture, Development & Integration (AADI) Summit. The theme of this year’s event is game-changing, an idea centered on what Gartner calls “the nexus of forces.” Gartner’s key point: social, mobile, cloud and information are converging to change technology at every level.
“I believe we all realize that we’re in a pivotal moment in the evolution of technology,” said Gartner group vice president and team manager Jeff Schulman during the summit’s keynote address. “The game is changing.”
Session topics at the event ranged from mobile application development and application integration strategy, to how master data management should drive application architecture. Attendees showed interest in Gartner’s industry perspective on the impact of trends like mobile and cloud at the enterprise level.
“I’m an enterprise architect and I’m here to learn how to prepare our company for the future,” said Dave Bradshaw, an enterprise architect at an insurance company. “I want to learn what Gartner has to say about mobile strategies, the cloud and a little bit about big data.”
Jon Ah You, an IT enterprise application manager at a large oil company, echoed that idea: “My interest is in better understanding mobile strategies, and getting in tune with what’s happening with them in application development,” he said.
Notably, the term ‘big data’ is hard to come by at this year’s AADI event. According to Schulman, that’s no accident. “A lot of the information professionals I’ve talked to don’t trust the term [big data] or don’t like it. The information piece is larger than big data, it’s really about ‘big context’—getting the right info to the right person at the right time.”
Mobile—once second to cloud—took the spotlight during the event’s keynote. When Schulman asked how many attendees had more than two wireless devices with them this week, nearly all hands in the room shot into the air.
Chris Howard, a Gartner chief of research, used this as an example of mobile’s tremendous impact on human behavior and, consequently, on IT.
“You have to create architectures that will deliver the experience to the user—to the device that makes them productive,” he said. “This is really consumerization plus democratization of technology. How prepared are you to deliver this in your environment?” – Stephanie Mann
SOA is far from being the new technology kid on the block. But once it was. Now it is an older kid, and a practical approach to fielding a host of other new technologies. It should not be overstated, but, especially in the SOA services form known as “REST,” SOA is a foundational element of cloud computing, mobile applications and the branch of data integration that is being called operational BI.
Time is winding down on 2012, and we have been going through some reporters’ notebooks. It seems that earlier this year, when we caught up with David Besemer, CTO of Composite Software, he had some interesting comments on SOA’s role now that it is a more mature practice.
“SOA got a lot of attention three or four years ago. Then it seemed to have waned a bit. But while the waning of the hype occurred, there were projects that showed people getting practical use out of services and APIs,” said Besemer. Among the practical uses he pointed to are new types of data integrations.
Besemer, whose special interest is data integration, said the focus is shifting away from a sole preoccupation with the data warehouse. Cloud, big data and analytical appliances got the ball rolling, to the point where services-enabled technologies began eating at the edges of the data warehouse.
Non-technical business imperatives are driving the need for decoupled services in broader and broader swaths of computing. Business imperatives at times call for something faster than a data warehouse. Said Besemer: “All of the members of the enterprise architecture team are struggling to deliver on requests from the business in regard to data sets.” That is, the data the business needs to make decisions.
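Besemer’s point about services eating at the edges of the warehouse is the data-virtualization idea: a decoupled service answers a business question by federating live sources at query time, instead of waiting for a warehouse load. A toy sketch, with in-memory dictionaries standing in for real CRM and order systems (all names here are invented for illustration):

```python
# Two stand-in sources: a CRM system and an order system.
crm = {"c1": {"name": "Acme"}, "c2": {"name": "Globex"}}
orders = [
    {"customer": "c1", "total": 120.0},
    {"customer": "c1", "total": 80.0},
    {"customer": "c2", "total": 40.0},
]

def revenue_by_customer():
    """Federated view: join two live sources at query time,
    without staging either one in a warehouse first."""
    totals = {}
    for order in orders:
        name = crm[order["customer"]]["name"]
        totals[name] = totals.get(name, 0.0) + order["total"]
    return totals

view = revenue_by_customer()
```

The consumer of `revenue_by_customer` never learns where the rows live; that abstraction is exactly the decoupling the SOA era established.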
The name SOA may be heard less frequently these days. But the idea of abstracted, decoupled services is at the heart of the latest data integration advances. – Jack Vaughan
Integration development managers face a broad array of intimidating jobs these days as they are asked to field corporate technology initiatives. Such initiatives are varied, ranging from the meshing of Java and XML efforts to EDI mapping, from marshaling Excel data into XML to supporting SEC-mandated XBRL initiatives, and more. The integration developers those managers oversee have often been left to navigate a vast array of open source utilities to meet these diverse requirements.
But there are also commercial tools available to tackle the problems. Among the tools that help development teams tackle modern integration jobs is Altova’s MissionKit. The latest version of the tool set adds interesting features that target the needs of the day.
Altova’s recently released MissionKit 2013 suite includes updates to its XML Spy tools that offer intelligent assistance for dealing with validation errors; updates to MapForce to support mapping for SQL stored procedures, as well as an enhanced API for integration into Java programs; and updates to its UModel tool that cover UML 2.4 and SysML 1.2. Importantly, UModel and other tools in the suite have improved support for XBRL and its most recent US-GAAP taxonomy, version 2012.
At release time, we talked to long-time industry analyst Peter O’Kelly about the trends driving these tools. Altova’s product line has evolved greatly since planting its original roots as an XML domain tool, said O’Kelly, who served as Altova’s product marketing manager and evangelist.
“It has expanded. It’s not just for people who work with XML,” said O’Kelly who now serves as principal analyst at O’Kelly Assoc.
The toolset tries to buffer developers from underlying complexity, he said, because teams always have to map between legacy and new technologies. It is important to bring users a consistent framework, said O’Kelly.
B2B software company Axway said it will acquire Vordel, the SOA security gateway company that has recently come to include cloud, mobile and social networking support in its offerings. The move will expand Axway’s data governance portfolio, combining Axway’s current managed file, B2B and integration capabilities with Vordel’s API management, SOA governance and identity management technologies.
When WCF started, the focus was on decoupling. That was a basic SOA tenet, and Microsoft, though it was not that big a singer in the SOA choir, took those principles to heart in formulating .NET design patterns. Of course, the REST dialect of SOA has since gained traction. REST can’t exactly be called “decoupled,” because it is so tied to HTTP. In a recent article on SearchSOA.com, we look at updates to WCF, with consideration of ASP.NET’s Web API for REST as well.
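The coupling REST has to HTTP can be made concrete: the contract is HTTP verbs plus resource paths, not a generated service proxy. A framework-free sketch of that verb-and-path dispatch follows; it is illustrative pseudostructure in Python, not WCF or ASP.NET Web API code:

```python
def dispatch(method, path, store):
    """Route an HTTP-style (verb, path) pair to a resource operation."""
    if method == "GET" and path.startswith("/orders/"):
        order_id = path.rsplit("/", 1)[-1]
        if order_id in store:
            return 200, store[order_id]        # 200 OK with the resource
        return 404, None                       # 404 Not Found
    return 405, None                           # 405 Method Not Allowed

store = {"42": {"status": "shipped"}}
code, body = dispatch("GET", "/orders/42", store)
```

Note how HTTP semantics (verbs, paths, status codes) are the interface itself; a client of this style is bound to HTTP in a way a proxy-based WCF client is not.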