Microservices Matters

March 19, 2013  9:48 PM

Another push for HTML5 mobile adoption

Profile: Jack Vaughan

Maxine Giza

Want a challenge? Add the avalanche of enterprise mobile applications to DevOps teams’ already-daunting integration workload, Paul Kopacki says. He’s Sencha’s vice president of marketing, which just released a mobile app integration tool for developers.

Last week, Sencha Inc. of Redwood, Calif., released upgrades to its product line designed to make HTML5 development simpler. The company’s core offerings, Sencha Architect, Sencha Ext JS and Sencha Touch, have been enhanced to make it easier to quickly build HTML5 applications for any platform. A new touch bundle for mobile developers was also launched.

Among the key upgrades to the Sencha Ext JS product is a big-data grid. “There are many more data points people want to build into their applications,” Kopacki said in an interview. “Big-data grids are one of the things customers are asking for and we are delivering in this release.”

Financial data companies are among those that rely on the technology on a daily basis. One Sencha client used the technology to build an app for bond traders. The client company, which tracks a great deal of information, needed greater capacity to take the data and share it with its bond traders at a faster pace.

Online accounting software company Xero has also embraced the technology. The company’s CTO, Craig Walker, said a few years ago that he realized Xero wasn’t delivering a positive mobile experience. “Our experience had been with native development and we wanted to move to Android, etc. The mobile touch framework delivers a lot of functionality up front,” he said during an interview. “What would have normally taken us six to nine months took us three.”

While there has been some debate over HTML5, Sencha has a clear stance on the technology: It’s a big proponent. In fact, the company recently rebuilt the Facebook app in HTML5 to show that developers, not HTML5, were the issue when Mark Zuckerberg abandoned the technology last fall.

“Some people come to HTML5 from a web perspective and fail to see the power of HTML5,” said Kopacki. “If you come at it from an application development perspective, you use the right tools so that HTML5 is powerful, especially for business applications.”

While the debate over HTML5 is sure to rage on, at least for now, some companies are banking on its ability to quickly aid programmers as they integrate old systems with new technology.

January 18, 2013  4:42 PM

Knoernschild’s “Java Application Architecture”: A brilliant breakdown on modularity

Profile: Brein Matturro

Jack Vaughan

Running through the history of computing is a quest for modularity. We curse it when it doesn’t work; we take it for granted when it does. Long ago, software engineers began to seek the equivalent of Lego bits, software modules that could be swapped much like bus boards on a hardware backplane. It’s been a long strange trip.

Modularity has gone through various stages in the modern era, with objects, components and, then, services coming to take the place of Lego pieces in the software world. But even in one of its (somewhat) recent iterations, the service-oriented OSGi Service Platform, the mechanics of software module interaction are not easy for developers or architects to master.

“Java Application Architecture” (Prentice Hall, 2012) by Kirk Knoernschild is one of the more probing books you are likely to find on this subject. Before the past year recedes too far, I would like to take some time to discuss the book, as it is one of the better ones I have read lately.

The book has a straightforward principle, which is to provide guidance for those who might set out to design modular software. In Knoernschild’s terms, it looks at ways you can “minimize dependencies between modules while maximizing a module’s potential reuse.” This is, for one, a major goal of middleware and, for another, a long-time holy grail of software development.

While much of the book portrays garden-variety Java problems, a fair amount of “Java Application Architecture,” which is subtitled “Modularity Patterns with Examples Using OSGi,” also offers a helping of OSGi know-how.

A conversation with Knoernschild disclosed that the book arose from an initial interest in uncovering how to leverage different layers of abstraction – to reach a deeper understanding of software architecture, and gain ease of maintenance. Composition of “Java Application Architecture” happened over many years, and there were discoveries.

“Along the way, the book morphed based on me learning more about how to design large software systems based on the Java system, with JAR files as the principal unit of modularity. Then, in the 2006 time frame, I discovered OSGi,” said Knoernschild. “I started digging into OSGi.”

He said he found that the ideas of OSGi meshed with his own ideas about Java modularity in general. OSGi, for example, looks at JAR files as the main means of reuse, treating a JAR file as a first-class citizen. In the book, he explains how to take a monolithic application, modularize it and eventually bring it under the control of OSGi.
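Knoernschild’s point about the JAR as a first-class citizen is quite concrete in OSGi: a bundle is an ordinary JAR whose manifest declares what it exports to, and imports from, other modules. A minimal sketch of such a manifest (the package names here are illustrative, not from the book):

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.billing
Bundle-Version: 1.0.0
Export-Package: com.example.billing.api;version="1.0.0"
Import-Package: org.osgi.framework;version="[1.5,2.0)"
```

The framework resolves each bundle’s Import-Package declarations against other bundles’ Export-Package declarations at runtime, which is what allows a modularized monolith to be split, versioned and swapped piece by piece.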

At heart, the issues Knoernschild addresses in “Java Application Architecture” are about dealing with complexity. As in Frederick Brooks’ work, you could say Knoernschild’s effort is to separate the accidental complexity from the essential complexity. His thoughtful look at Java modularity is more than just tools and tricks; it is a foundational framework for thinking about problems of software architecture.

“Designing software is hard. It’s hard because breaking up the systems is so difficult,” said Knoernschild. OSGi’s detractors still argue that it is itself too difficult. But experience tells us things are hard for a reason, and while the general drive of software is to make things easier, it is a daily battle to effectively simplify the complex. Knoernschild’s book fights the good fight and could become a valued companion at many developers’ benchtops. All in all, it is a brilliant breakdown on modularity.

January 9, 2013  6:26 PM

Big data tackles cold pizza, or Real-time gets real

Profile: Jack Vaughan

Like “cloud” before it, “big data” is a nebulous term veiling some actual trends. Google and Amazon have been startling online successes, and much of their achievement seems to stem from massive amounts of Web-based data that they deftly correlate to create powerful views of the customer. Some people see the big data tent coming to cover sports marketing, pizza delivery and more.

But it is not just data at rest that is in question. The need for big data in motion is growing, observers claim. For its part, middleware stalwart Tibco sees big data, coupled with event processing and fast messaging, as a route to greater market penetration.

“We kind of own the big data problem as it relates to real-time events,” Tibco’s Vivek Ranadive told SearchSOA.com on a recent call. He maintains that even common tasks like pizza delivery – granted, for national chains – will be affected by big data. “When customers inadvertently get cold pizza, the company can pick that up,” and make things better with a free pizza, a coupon or what have you.

“When you think about big data, it is about running twenty-first-century risk. The planet needs an ‘eventing’ platform,” said Ranadive, author of “The Power of Now” (1999) and “The Two-Second Advantage” (2011).

The Tibco event architecture plays a role in a recent user story on SearchSOA.com. Our site recently profiled shipping giant OOCL’s Matt Rosen, who shows how challenging markets can be and how pivotal well-managed technology is in addressing those markets.

Shipping companies were in a tough bind when the 2008 downturn struck, and the going got no easier when recession hit big European markets. OOCL’s performance outpaced its competitors’, in significant part due to Rosen’s application development team, which enabled more efficient business processes for the global shipper.

Among a host of technologies Rosen’s OOCL crew employed was an event processing engine from Tibco Software. OOCL’s habitat – the shipping industry – is among those that advanced middleware maker Tibco is counting on to take it beyond its Wall Street techno roots. – Jack Vaughan

December 29, 2012  7:23 PM

SOA best practice: prepare for unknown futures

Profile: Jack Vaughan

A recent piece by Stephanie Mann looks at SOA design issues today. After more than ten years of SOA, some best practices are still emerging. Among the notables Mann spoke with is Robert Daigneau. With stints heading development at both Fidelity Investments and Monster.com (he now heads Application Development at Slalom Consulting), there are few who have seen more of the evolution of services design patterns than Daigneau.

Daigneau touched upon a most-dreaded pitfall of SOA, which here we call “boiling the ocean.” It is a sort of top-down approach that must enumerate a gazillion “services” before writing a line of code. Practicality has moved this approach off the top of the SOA practices list, but there is something very human about it, and it can creep into projects and programs at any moment. Let’s hand the podium over to Daigneau: “If you try to lay it all out there and say, ‘Let’s dream up all the possible services we’ll need,’ that’s the wrong way to do it. There’s always going to be something new you didn’t anticipate, or something you misunderstood because you had too little information. Instead, look at the individual needs of projects and approach it pragmatically from a consumer perspective. Identify and enumerate the services for particular needs; then introduce the services as needed.”
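Daigneau’s consumer-first advice can be made concrete in code: rather than cataloging every conceivable service up front, you define only the narrow contract a real consumer needs today and introduce further services as concrete needs arrive. A schematic illustration in Python (all names here are hypothetical, not from Daigneau):

```python
from dataclasses import dataclass
from typing import Protocol


# The one contract today's consumer (say, a billing page) actually
# needs -- not a speculative catalog of every "customer" operation.
@dataclass
class CustomerBalance:
    customer_id: str
    balance_cents: int


class BalanceService(Protocol):
    def balance_for(self, customer_id: str) -> CustomerBalance: ...


# A first, pragmatic implementation. More services are introduced
# only when new consumers present concrete needs.
class LedgerBalanceService:
    def __init__(self, ledger: dict):
        self._ledger = ledger

    def balance_for(self, customer_id: str) -> CustomerBalance:
        return CustomerBalance(customer_id, self._ledger.get(customer_id, 0))


svc: BalanceService = LedgerBalanceService({"c42": 1250})
print(svc.balance_for("c42").balance_cents)  # 1250
```

The structural interface keeps the consumer decoupled from the implementation, so a later, richer service can replace the ledger-backed one without touching the consumer.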

When he says “there’s always going to be something you didn’t anticipate,” he touches on something practitioners have learned in the SOA era: There is no final, fixed approach that everyone will agree on for all time to come. That SOA adjusted to this fact is a reason it has found as much value as it has while cloud computing, big data and mobile computing have come online. Read “Take new approaches to building services with SOA.” For more, stay tuned.

December 4, 2012  7:49 PM

HP sees investment in DevOps as strategic move

Profile: Jack Vaughan

This year, Hewlett-Packard has continued its efforts to stake out a big presence for its tools in the DevOps world – where many organizations see an opportunity to streamline, rationalize and speed up the process of application development and delivery. For instance, the company now offers updated versions of HP Application Lifecycle Management (ALM) and HP Performance Center (PC) along with new Lab Management Automation and Continuous Application Performance Delivery.

Matt Morgan, vice president of Hybrid IT and Cloud Product Marketing at HP Software, says his company has gotten deeply involved in providing DevOps tools because of strong customer demand. He says demand isn’t just among web-focused companies. “A large insurance company that has been a long-term HP customer used to rev applications twice a year but they are now moving to blend development and operations so they can move to a monthly cycle,” he says. Ultimately, he says, consumers are demanding more and better apps and functionality and that, in turn, is driving development cycles across the enterprise. “That is being replicated in every industry,” he says. “Consumers are judging companies by their apps.”

Consequently, Morgan puts DevOps adopters into three categories. At the “top” are Web-oriented companies and mobility companies that started from the ground up with a DevOps kind of approach that supports daily, weekly, or monthly updates. “Search engine companies, Wikipedia, and Zynga are good examples – their whole organization becomes a beta testing site,” he notes.

The second group of companies has not had the same orientation toward DevOps but have “pockets” of new technology adoption where a DevOps approach has been or can be incubated. “A typical example of this kind of company might be an airline where they have hundreds of old apps but they are moving to adopt consumer-facing mobile apps, so in that part of the company they are running those faster cycles,” he notes.

Then, there are all the other companies – the ones that are still operating according to traditional work and development patterns.

“At HP we believe this trend isn’t just about speed and agility; the user is becoming the centerpiece of all design work for software applications,” says Morgan. The implication is that applications can’t and won’t remain “static” any more. There will be a constant demand for upgrades, updates, and adaptations to new business needs. DevOps will be key. -Alan Earls

November 29, 2012  6:34 PM

The value of API management

Profile: Brein Matturro

Web APIs are multiplying as more retailers, media groups, governments and financial services firms start exposing them. At the same time, many companies are still resistant to API management, according to Paolo Malinverno, research vice president at Gartner. The problem with that, he said, is that using APIs is increasingly at the center of what goes on at the “nexus of forces,” Gartner’s term for the convergence of social, mobile, cloud and information. As a result, lack of management could mean serious loss of value.

“It is a fact that the number of APIs grows by the day and, with the explosion of mobile applications, APIs will be used more and more in the future,” Malinverno told a crowd at Gartner’s Application Architecture, Development & Integration Summit this week in Las Vegas.

He noted that daily API calls have skyrocketed into the billions for many well-known companies. Facebook, for example, saw 5 billion API calls per day in October 2009, while Twitter had 13 billion per day in May 2011.

“These companies better know who is calling,” cautioned Malinverno. “They better know how many calls per second they have to field, and they better know what sort of elasticity they need to demand from their cloud platforms to ensure that whoever uses their API is able to use it properly.”

API management is the way to do that, he said. Without it, businesses may lose out on value in their services and their APIs.

“API management is about making an API available on the Web for everybody that you want to use the API—enabling them to call it and get the result they want,” he explained. “Not everybody feels they need API management, but they do. The assessment of the value of the API is a part of API management.”
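The bookkeeping Malinverno describes (knowing who is calling, and how many calls per second must be fielded) is the core of any API management layer. A minimal sketch of that per-caller accounting, using a fixed-window rate limit keyed by API key; this is an illustrative toy, not any vendor’s product, and all names are invented:

```python
import time
from collections import defaultdict


class ApiGateway:
    """Toy API-management front end: identifies callers by API key
    and enforces a per-caller calls-per-second limit (fixed window)."""

    def __init__(self, limit_per_second):
        self.limit = limit_per_second
        self.keys = {}                  # api_key -> owner name
        self.counts = defaultdict(int)  # (api_key, second) -> call count

    def register(self, api_key, owner):
        self.keys[api_key] = owner

    def call(self, api_key, now=None):
        """Return True if the call is admitted, False if rejected."""
        if api_key not in self.keys:    # unknown caller: reject
            return False
        window = int(now if now is not None else time.time())
        self.counts[(api_key, window)] += 1
        return self.counts[(api_key, window)] <= self.limit


gw = ApiGateway(limit_per_second=2)
gw.register("abc123", "mobile-app")
print(gw.call("abc123", now=100))   # True
print(gw.call("abc123", now=100))   # True
print(gw.call("abc123", now=100))   # False (over limit in this window)
print(gw.call("unknown", now=100))  # False (unregistered caller)
```

Real gateways add elasticity, analytics and monetization on top of this, but the two questions answered here, who is calling and how often, are the starting point.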

Malinverno also noted that SOA governance and API management are very closely tied—perhaps even the same. He said SOA governance is “the ability to link a specific intent of your business strategy to the way you develop and operate services.” He advised his audience to build a strong SOA governance strategy together with API management, to create what he called “application services governance.” -Stephanie Mann

November 29, 2012  3:56 AM

Meeting the growing need for apps

Profile: Jack Vaughan

IT is changing. And certain companies are going to face the changes better than others, Andy Kyte, Gartner vice president and research fellow, told a crowd at the Application Architecture, Development & Integration (AADI) Summit in Las Vegas. In particular, he said successful IT organizations are going to learn how to effectively manage information technology and meet a growing demand for applications.

“The growth for demand in application services over the next five years is not one or two percent,” he said. “It is massive and exponential. You have massive amounts of legacy applications that need to be modernized; demand is going up; and capacity to meet that demand is not increasing.”

Kyte noted that while user expectations are going up, the ability of businesses to meet those expectations seems to be going down. From the financial crisis to the burgeoning cost of information technology and the need to deliver agile responses faster than ever, many CEOs and CIOs are struggling with how to keep their organizations competitive—or even afloat.

He said companies should follow these disciplines of highly productive IT organizations in order to succeed:

  1. Break down legacy culture: Instead of having many competitive teams, highly productive IT organizations have lots of teams that are all focused on a common set of objectives. They collaborate and work together toward a shared goal.
  2. Flatten the application development organizational structure: Create a culture of many equals, without emphasis on titles or positions. This will foster respect, and promote shared ownership and responsibility for methodologies and processes.
  3. Build the right software: Highly productive IT organizations have good processes for understanding what is really needed, and when. They put effort into the right things. That means insisting on clarity about nonfunctional requirements before technology selection.

Kyte also stressed the importance of taking a holistic view of the entire life of a system, as opposed to fixating on one of its components. – Stephanie Mann

November 28, 2012  7:06 PM

Gartner AADI takeaway: Disruptive forces change IT game

Profile: Jack Vaughan

This week in Las Vegas, over 1,000 IT professionals, analysts and practitioners gathered at Caesars Palace to discuss top trends at Gartner’s Application Architecture, Development & Integration (AADI) Summit. The theme of this year’s event was game-changing, an idea centered on what Gartner calls “the nexus of forces.” Gartner’s key point: social, mobile, cloud and information are converging to change technology at every level.

“I believe we all realize that we’re in a pivotal moment in the evolution of technology,” said Gartner group vice president and team manager Jeff Schulman during the summit’s keynote address. “The game is changing.”

Session topics at the event ranged from mobile application development and application integration strategy, to how master data management should drive application architecture. Attendees showed interest in Gartner’s industry perspective on the impact of trends like mobile and cloud at the enterprise level.

“I’m an enterprise architect and I’m here to learn how to prepare our company for the future,” said Dave Bradshaw, an enterprise architect at an insurance company. “I want to learn what Gartner has to say about mobile strategies, the cloud and a little bit about big data.”

Jon Ah You, an IT enterprise application manager at a large oil company, echoed that idea: “My interest is in better understanding mobile strategies, and getting in tune with what’s happening with them in application development,” he said.

Notably, the term “big data” was hard to come by at this year’s AADI event. According to Schulman, that’s no accident. “A lot of the information professionals I’ve talked to don’t trust the term [big data] or don’t like it. The information piece is larger than big data, it’s really about ‘big context’—getting the right info to the right person at the right time.”

Mobile, once second to cloud, took the spotlight during the event’s keynote. When Schulman asked how many attendees had more than two wireless devices with them this week, nearly all hands in the room shot into the air.

Chris Howard, a Gartner chief of research, used this as an example of mobile’s tremendous impact on human behavior and, consequently, on IT.

“You have to create architectures that will deliver the experience to the user—to the device that makes them productive,” he said. “This is really consumerization plus democratization of technology. How prepared are you to deliver this in your environment?” – Stephanie Mann

November 28, 2012  2:37 AM

SOA influences modern data integration

Profile: Jack Vaughan

SOA is far from being the new technology kid on the block. But once it was. Now it is an older kid, and a practical approach to fielding a host of other new technologies. It should not be overstated, but, especially in the SOA services form known as “REST,” SOA is a foundational element of cloud computing, mobile applications and the branch of data integration that is being called operational BI.

Time is winding down on 2012, and we were going through some reporter’s notebooks. It seems that earlier this year, when we caught up with David Besemer, chief technology officer of Composite Software, he had some interesting comments on SOA’s role now that it is a more mature practice.

“SOA got a lot of attention three or four years ago. Then it seemed to have waned a bit. But while the waning of the hype occurred, there were projects that showed people getting practical use out of services and APIs,” said Besemer. Among the practical uses he pointed to are new types of data integrations.

Besemer, whose special interest is data integration, said there is a change in focus going on; it is moving things away from a sole preoccupation with the data warehouse. Cloud, big data and analytical appliances got the ball rolling, to the point where services-enabled technologies began eating at the edges of the data warehouse.

Non-technical business imperatives are driving the need for decoupled services in broader and broader swaths of computing. Business imperatives are calling for something faster than a data warehouse at times. Said Besemer: “All of the members of the enterprise architecture team are struggling to deliver on requests from the business in regard to data sets.” It is the data that the business needs to make decisions.

The name SOA may be heard less frequently these days. But the idea of abstracted, decoupled services is at the heart of the latest data integration advances. – Jack Vaughan

November 9, 2012  5:56 PM

Integration tool set improves XBRL support

Profile: Jack Vaughan

Integration development managers face a broad array of intimidating jobs these days as they are asked to field corporate technology initiatives. Such initiatives are varied: They can range from the meshing of Java and XML efforts to EDI mapping, from marshaling Excel data into XML to the support of SEC-mandated XBRL initiatives and more. Integration development team members have often been left to navigate a vast array of open source utilities to deal with these diverse requirements.

But there are also commercial tools available to tackle the problems. Among the tools that help development teams tackle modern integration jobs is Altova’s MissionKit. The latest version of the tool set adds interesting features that target the needs of the day.

Altova’s recently released MissionKit 2013 suite includes updates to its XMLSpy tools that offer intelligent assistance for dealing with validation errors; updates to MapForce to support mapping for SQL stored procedures, as well as an enhanced API for integration into Java programs; and updates to its UModel tool that cover UML 2.4 and SysML 1.2. Importantly, UModel and other tools in the suite have improved support for XBRL and its most recent US-GAAP taxonomy, version 2012.

At release time, we talked to long-time industry analyst Peter O’Kelly about the trends driving these tools. Altova’s product line has evolved greatly since planting its original roots as an XML domain tool, said O’Kelly, who once served as Altova’s product marketing manager and evangelist.

“It has expanded. It’s not just for people who work with XML,” said O’Kelly who now serves as principal analyst at O’Kelly Assoc.

The toolset tries to buffer developers from underlying complexity, he said, because teams always have to map between legacy and new technologies.  It is important to bring users a consistent framework, said O’Kelly.
