SOA Talk


October 2, 2012  9:19 PM

SOASTA buys LogNormal; The goal? Gain mobile end-user point of view

Jack Vaughan

Cloud and mobile testing company SOASTA has acquired LogNormal, maker of Web performance tools. The news accompanies SOASTA’s release of mPulse, a real-time user monitoring tool that itself grew out of an established alliance between the two companies.

The mobile user space differs from that of traditional Web apps, said Tom Lounibos, SOASTA’s CEO, and so requires new types of tools. “On the mobile side, customers are looking at the difference between mobile apps and web apps. It goes beyond pure footprint. It goes to how the [devices] are being used.”

“With mobile, you have people on the move. They are usually interested in one thing, say, a sports score, or a transaction. None of the developers have too much visibility into this,” said Lounibos.

He said mPulse garners performance measures like bandwidth and page load time as well as engagement metrics like bounce, exit and conversion rates. User metrics like user location, device type, carrier speed, and application usage are captured too. Then, mPulse displays the information via an interactive in-memory monitoring dashboard.

Why did SOASTA move to buy LogNormal? “The company has been a leader in real user monitoring,” said Lounibos. In fact, LogNormal principals have been involved in the roots and flowering of Real User Measurement (RUM). Co-founders Buddy Brewer and Philip Tellis had roles in developing Boomerang open source RUM software and other JavaScript-oriented monitoring endeavors.

The LogNormal technology employs JavaScript tagging that lets you capture end-user activity to get a better understanding of how devices are interacting with services. “It captures the real user experience,” said Lounibos. That experiential data is fed back to the SOASTA in-memory analytical engine, he continued, which is at the heart of its flagship CloudTest platform.

Mobile development, clearly, is changing the overall development ethos. Just when some test shops were getting their arms around the idea of monitoring end-user PC interactions, they face new challenges of a mobile nature. Not surprisingly, SOASTA’s Lounibos sees it as a key area of attention, as depicted in a recent posting on sister blog Head in the Clouds. – Jack Vaughan

September 27, 2012  5:56 PM

JavaOne 2012 kicks off

Jack Vaughan

As JavaOne 2012 kicks off in San Francisco on Sept. 30, industry experts and practitioners gather to learn about the latest in Java standards, best practices and developments. New this year are a larger keynote venue at the Masonic Auditorium and an expanded schedule that includes hundreds of technical sessions, hands-on labs and birds-of-a-feather (BOF) sessions.

Issues confronting the JavaOne throng are many. The community is waiting on Java updates that were put on hold during the tumult that ensued as Java originator Sun Microsystems was courted by IBM and eventually acquired by database giant Oracle. Java on the server is under some stress as mobile computing moves to the fore, pushing JavaScript, Java’s faux language cousin, higher in the pecking order. That has led some to question the future of Java.

Meanwhile, a host of new languages are being fielded on the Java Virtual Machine (JVM) platform, giving developers much to digest. Finally, OSGi has edged into production as a means of building modular Java applications – but it is challenged, some would say, by Jigsaw, a newer modularization scheme that apparently will not make the next Java rev. One thing is clear: Oracle is now firmly established as chief Java overseer.

For Java consultant, trainer and JavaOne presenter Venkat Subramaniam, founder of Agile Developer, Inc., JavaOne is a chance to reach developers who need new ways to be effective on the JVM platform. Among the key concerns he cites is concurrency. New multicore chips, he says, require new programmatic ways of working with Java, whether the target platforms are conventional or newer cloud varieties.

On multicore chips, [software] multithreading is on steroids, Subramaniam said. That brings out problems in Java code that went undetected on familiar single-core implementations. Multicore chips bring multiple levels of caches, he added. “Programs that pretend to work correctly get broken. This is not a problem we invited,” he said.
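To make that concrete, here is a minimal sketch (not from Subramaniam’s talk) of the kind of latent visibility bug he describes: a stop flag read in a loop but never declared volatile. On one core the program usually stops cleanly; on multicore hardware, per-core caches and JIT optimizations may keep the stale value visible to the worker thread indefinitely.

public class StopFlag {
    static boolean running = true;   // declaring this 'volatile' restores the visibility guarantee

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(new Runnable() {
            public void run() {
                while (running) {
                    // spin; without volatile, the JIT may hoist this read out of the loop
                }
                System.out.println("Worker saw the flag change and stopped.");
            }
        });
        worker.start();
        Thread.sleep(500);
        running = false;             // this write may never become visible to the worker
        worker.join(2000);
        System.out.println(worker.isAlive()
                ? "Worker is still spinning; the update was never seen."
                : "Clean shutdown.");
    }
}

Declaring the flag volatile, or reaching for the higher-level java.util.concurrent utilities, is the usual remedy.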

For those worried about Java’s future, Subramaniam has relief. “It will continue. It is powerful. But the way we use it is going to change in the future,” he said.

How will it change? Subramaniam suggests Scala, Groovy and JRuby will appear on the JVM with greater frequency to help deal with a new architectural paradigm that places Web applications alongside enterprise apps.

As has been seen at other recent JavaOne events, the JVM and Java as a platform are becoming as important as – or more important than – Java the language.

“I think Java is very strong and healthy but you have to look at all the different languages that are available,” said Kirk Knoernschild, software developer. He points to Groovy, Scala and Clojure, as well as other languages as examples of the new “Java platform” landscape.

“You really need to focus more on the separation of Java as a runtime platform. More and more, we will see organizations using the right language on top of the platform,” said Knoernschild, author of “Java Application Architecture: Modularity Patterns with Examples Using OSGi” [Prentice Hall, 2012]. “This language could be Java. It could be Groovy. It could be Grails.”

Emerging languages on the JVM are covered in various sessions at JavaOne. Under consideration will be improvements in the JDK and support for dynamically typed languages on the JVM. A conference track covers some of the most popular dynamic languages now appearing on the JVM, such as Groovy, JavaScript, JRuby, Kotlin and Scala.
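For a flavor of how the platform hosts more than one language, here is a small sketch using the standard JSR-223 javax.script API that ships with the JDK. The built-in JavaScript engine is used because it needs no extra jars; swapping in Groovy or JRuby is a matter of putting that engine on the classpath and looking it up by name (assumed here, not shown).

import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class PolyglotDemo {
    public static void main(String[] args) throws ScriptException {
        ScriptEngineManager manager = new ScriptEngineManager();
        ScriptEngine js = manager.getEngineByName("JavaScript"); // Rhino or Nashorn, depending on the JDK
        if (js == null) {
            System.out.println("No JavaScript engine bundled with this JDK; add one to the classpath.");
            return;
        }
        Object result = js.eval("var total = 0; for (var i = 1; i <= 10; i++) total += i; total;");
        System.out.println("Script result: " + result); // 55, returned as a Number

        // The same lookup works for other JVM languages, e.g. getEngineByName("groovy"),
        // once the corresponding engine jar is on the classpath.
    }
}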

In fact, the JavaOne event offers over 500 wide-ranging sessions. Some notable choices include “Rapid Robot Programming,” “Building Mobile Apps with HTML5 and Java,” “Going Real-Time: How to Build a Streaming API,” and “How RESTful is Your REST?”

Among the nearly 540 speakers at this year’s JavaOne are the so-called “Java Champions,” a selection of community-nominated technology leaders who will run a series of technical talks and community-building activities at the conference.

Stephen Colebourne, project lead at Joda.org, is one member of that group. His talk, “From Instants to Eras, the Future of Time in Java,” will look ahead to the inclusion of an easy-to-use, expressive API for times and dates in Java SE 8. “Java Champion” Bruno Souza will talk about “101 Ways to Improve Java: Why Developer Participation Matters.” He will lead a community brainstorming session about how developer participation can influence the development of Java technologies. -Stephanie Mann and Jack Vaughan
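As a small taste of the date-and-time work Colebourne’s talk previews, here is a sketch written against the java.time (JSR-310) API targeted at Java SE 8; the names below match what later shipped, but details were still in flux at the time of the conference.

import java.time.Instant;
import java.time.LocalDate;
import java.time.Month;
import java.time.Period;

public class TimeDemo {
    public static void main(String[] args) {
        Instant now = Instant.now();                             // a machine-scale point on the timeline
        LocalDate conferenceStart = LocalDate.of(2012, Month.SEPTEMBER, 30);
        LocalDate today = LocalDate.now();
        Period elapsed = Period.between(conferenceStart, today); // a human-scale amount of time
        System.out.println("Now: " + now);
        System.out.println("Since JavaOne 2012: " + elapsed.getYears() + " years, "
                + elapsed.getMonths() + " months, " + elapsed.getDays() + " days");
    }
}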

 


 


September 20, 2012  6:03 PM

Goodbye to three-tier computing?

Jack Vaughan

Software in the original mainframe days was all glommed together. Why not? Who was looking? Sometimes, reluctantly, some structure came about. Even in the early mid-range days, code was built up into classes, objects and components that were often loosely strung together.

With standard Java and standard Java servers, a fairly strict and familiar three-tier architecture came about. The question to ask now is “Will it last forever?” Like so many things, the fundamental tiers of computing do come up for reconsideration once in a while.

These breezes have been blowing subtly since people first cast about for lighter versions of Enterprise JavaBeans. More recently, Node.js has arisen as a JavaScript alternative to Java on the server side. Increasingly, the client is the object of interest.

Node.js and other browser-influenced technologies seem to encourage software architects to cast their monolithic three-tier components skyward. As these flying components drift back to earth, they may not settle into the same alignment. The sudden near-hegemony of mobile clients is pushing things ahead quickly. A variety of new architectures are brewing.

In some ways, there seems to be a growing reaction to the rule of Java on the server. That view emerges from a look through a reporter’s notebook. Java on the server is not going away, but as James Strachan, now senior software consultant with JBoss, described it in an interview: “The server side is becoming thinner and thinner.” When SearchSOA.com and TheServerSide.com spoke with Strachan earlier this year at the CamelOne event, the topic of Node.js was on the docket, but Strachan was expansive.

He said, looking forward:

The server side might just be Amazon SimpleDB or MongoDB or something; there might not be much of a three-tier architecture anymore.

Meanwhile, with flair, he continued:

… the client side is becoming bigger and more and more complex; it’s real-time now, everyone’s doing Ajax, real-time updates, and people are doing lots of single-page applications – which is when one Web page starts up and the entire app is in there. There are lots of models, containers, relationships and persistence and “yada-yada.”

Strachan notes this is driven largely by mobile applications:

In many ways the browsers won. Almost every mobile platform has Web capabilities inside it – Android, iPhone, iOS all have Web browsers and so forth. So the Web has kind of won … most browsers use JavaScript and HTML 5. Silverlight’s dead, Flash is kind of dying … the browser is really where it’s at … with HTML and JavaScript.

Are the new approaches overblown? Is real change far off? Do you see a shift in emphasis to the client? If so, do you think services or SOA have had a hand in breaking down the status quo? -Jack Vaughan

 


 


September 13, 2012  8:43 PM

FatFractal enters the BaaS fray

Jack Vaughan

What has sometimes been described as mobile middleware has taken a new tack. Now, the idea of Backend as a Service (BaaS) has begun to take off in the mobile application development space. Proponents of BaaS say it helps developers easily build mobile apps, or any other applications connected to a cloud backend. Some of their views suggest a wholly new computer architecture is in the works.

A case in point is FatFractal, a San Francisco-based BaaS provider that launched just this week. The company describes its product as offering native code support for any connected device, along with an events model and declarative security. FatFractal also says it integrates all of those components as lightweight services.

While it may be the newest BaaS player, FatFractal joins a slew of companies already in the field. Its competitors include StackMob, Kinvey, Applicasa and Parse.

Central to FatFractal’s approach is a NoServer module, which takes JSON requests and handles them via a script execution engine and a Create, Read, Update and Delete (CRUD) engine.
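To picture the general shape of such a call, the sketch below creates an object by posting JSON to a REST resource and letting the backend’s CRUD engine persist it. The host, path and payload are invented for illustration and are not FatFractal’s documented API.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class BaasCreateExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint: <app>.<baas-host>/resources/<collection>
        URL url = new URL("https://example-app.example-baas.com/resources/Note");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String json = "{\"title\":\"hello\",\"body\":\"first note\"}";
        OutputStream out = conn.getOutputStream();
        out.write(json.getBytes(StandardCharsets.UTF_8));
        out.close();

        System.out.println("Response code: " + conn.getResponseCode());
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);  // typically the stored object, echoed back with a server-assigned id
        }
        in.close();
    }
}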

FatFractal CMO David Lasner thinks the new approach is needed. “It’s just hard to do a backend in the cloud and make it work,” he said. “The nature of applications is changing and you’re getting thousands of applications that use a lot of data.” – Stephanie Mann


September 7, 2012  3:16 PM

Choice Hotels rethinks IT architecture – employs middleware, SOA, BPM

Jack Vaughan

Hoteliers were among the first businesses to turn to information technology. Now, their once-new foundational computers have become legacy systems that can stand in the way of delivering new products and promotions. These are not always just mainframe systems – once-shiny high-performance mid-range (or larger) Unix systems and even pre-SOA-era application servers can impede business flexibility just as easily.

In recent years, as some systems began to show their age, Choice Hotels International, Inc., which franchises over 6,000 hotels, opted to update with middleware systems from Oracle Corp. Choice Hotels’ move is well along. SOA is part of the journey.

“Several years ago, we did an overall assessment and decided there were too many point-to-point connections,” said Rain Fletcher, vice president of application development and architecture, Choice Hotels. “Maintenance was difficult, and our business needed more velocity in delivering new functionality.”

That is the background to Choice’s selection of Oracle Fusion Middleware, SOA Suite and BPM Suite, he said. Oracle application server software also provides the base for those higher-level stack elements. Fletcher described Choice as “a WebLogic shop,” referring to the former BEA, now Oracle, app server suite.

The legacy systems were becoming “a liability in our ability to execute,” said Fletcher. A re-thinking of IT architecture was needed, he said.

The need for quick technical flexibility is intensely apparent on the Choice Hotels website today. It is dotted with special book-early rate offers, gift card offerings, Privilege Point member specials, downloadable iPad and smartphone apps and more. It enables bookings for Comfort Inn, Quality Inn, Econo Lodge, Sleep Inn and other familiar hotel marquees. All this functionality must be fleet and flexible, and supported by backend systems and middleware.

A big part of Fletcher’s drive is to simplify and standardize where possible. “We wanted to default to one standard,” he said. But multiple systems are a fact of life, one that requires developers to be supple. What is needed is what Fletcher calls “fungibility.”

“We have thirteen different systems. And I don’t want there [to be a need for] ‘tribal knowledge’ of any one of them,” he said.

How does that pan out in operations? Some complexity is unavoidable – but simplicity must be the goal. “Every application server type we have has a different patching policy and security profile,” he said. “I may have four – I don’t want any more.”

SOA and the Oracle SOA Suite have been first steps in gaining the flexibility Fletcher’s organization is looking to achieve, with BPM and modeling deployments to come, he said. Developers go through intensive training in SOA. He said the effort is built around the concept of a “SOA Services Factory.”

“We started with service domain module creation, working with partners to create several high-level service domains,” said Fletcher. Looking forward, he expects to map SOA services to well-defined business processes. The domains provide a framework for the services. As the services mature, they map naturally to how the business thinks and what the business does, he said.



September 6, 2012  6:14 PM

Can stream-based data processing make Hadoop run faster?

Jack Vaughan

The Apache Hadoop distributed data processing framework has benefits and is gaining traction. However, it can have drawbacks. Some organizations find that getting started with Hadoop requires rethinking software architecture and acquiring new data skills.

For some, a problem with Hadoop’s batch-processing model is that it assumes there will be downtime to run the batch between bursts of data acquisition. This is the case for many businesses that operate locally and have a large number of transactions during the day, but very few (if any) at night. If that nightly window is large enough to process the accumulation of data from the previous day, everything goes smoothly. For some businesses, though, that window of downtime is small or nonexistent, and even with Hadoop’s high-powered processing, they take in more data each day than they can process in 24 hours.

For organizations with small windows of acceptable downtime, an approach that adds components of stream-based data processing may help, writes GigaSpaces CTO Nati Shalom in a recent blog on making Hadoop faster. By constantly processing incoming data into useful packets and removing static data that does not need to be processed (or reprocessed), organizations can significantly accelerate their big data batch processes. – James Denman
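As a rough, toy-level illustration of the idea (not GigaSpaces’ implementation), an application could pre-aggregate events as they arrive, so the periodic batch job consumes compact, already-reduced packets instead of re-reading the raw feed.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicLong;

public class StreamPreAggregator {
    private final ConcurrentMap<String, AtomicLong> countsByKey =
            new ConcurrentHashMap<String, AtomicLong>();

    // Called continuously, for every incoming event, during the business day.
    public void onEvent(String key) {
        AtomicLong counter = countsByKey.get(key);
        if (counter == null) {
            AtomicLong fresh = new AtomicLong();
            counter = countsByKey.putIfAbsent(key, fresh);
            if (counter == null) {
                counter = fresh;
            }
        }
        counter.incrementAndGet();
    }

    // Called in the batch window: drain the pre-aggregated packet and reset the counters.
    public Map<String, Long> drain() {
        Map<String, Long> snapshot = new HashMap<String, Long>();
        for (Map.Entry<String, AtomicLong> entry : countsByKey.entrySet()) {
            snapshot.put(entry.getKey(), entry.getValue().getAndSet(0L));
        }
        return snapshot;
    }

    public static void main(String[] args) {
        StreamPreAggregator agg = new StreamPreAggregator();
        agg.onEvent("checkout");
        agg.onEvent("checkout");
        agg.onEvent("search");
        System.out.println(agg.drain()); // e.g. {search=1, checkout=2}
    }
}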


September 5, 2012  11:08 PM

Thomas Erl discusses upcoming SOA, Cloud and Service Technology Symposium

Jack Vaughan

Later this month, experts and authors from around the globe will gather in London for the fifth annual SOA, Cloud and Service Technology Symposium. This year’s conference agenda reflects aspects of the progress of SOA – both subtle and profound.

In reviewing this year’s submissions, Thomas Erl, prominent SOA author, educator and conference chair, saw some vivid trends emerge. “Many sessions are about the convergence of different areas,” said Erl, noting that the original event covered SOA, then SOA and cloud computing, and now has broadened further.

“As you go through all the submissions, you kind of witness an evolution in the industry. It is a reflection as to where the industry itself is going,” he said. As the naming of the event suggests, Erl sees an emerging field that can be called “service technology.”

“In the early days of SOA, people associated SOA with Web services. There was a communications barrier [with people] who thought it was just a way of implementing Web services,” he said. “Now we are seeing many more sessions that look at how [cloud, SOA and services] are applied together, and what the implications are.”

The Symposium, set for Sept. 24-25 at Imperial College, is slated to cover a broad variety of SOA and cloud-related topics as well. Among scheduled sessions are “Lightweight BPM and SOA,” “Moving Applications to the Cloud: Migration Options,” and “The Rise of the Enterprise Service Bus.” Also on the agenda is a series of on-site training and certification workshops. Billed as “bootcamp-style training sessions,” the workshops will provide preparation for a number of industry-recognized certifications, including SOA architect and cloud technology professional programs.

A key aim of the conference is to offer SOA, cloud computing and service technologies practitioners a look at real-world implementations and field-tested industry practices. However, the event will also cover emerging trends and innovations in the space.


August 30, 2012  2:37 PM

Connecting Hadoop distributions to ODBC

Jack Vaughan

As more enterprises set their sights on Hadoop’s capabilities, new products aim to ease Hadoop integration. Progress DataDirect’s Connect XE for ODBC driver for Hadoop Hive is an example. It boasts scalable connectivity for multiple distributions of Hadoop.

Enterprises looking to carry out additional analysis of data contained in a Hadoop-based store need a reliable connection to their existing predictive analytics and business intelligence tools. That can prove challenging, especially when dealing with multiple distributions of Hadoop—stock Apache Hadoop, MapR’s distribution, Cloudera’s distribution of Apache Hadoop and others.

“If I’m an [independent software vendor] and I want to onboard Hadoop as a supportive platform, I can either write a bunch of custom code for each specific flavor of Hadoop that I want to talk to—which has massive cost to it, massive complexity and issues related to support—or I can try to piece together some support matrix with the existing technology that’s out there for connectivity,” said Michael Benedict, vice president and business line manager, Progress DataDirect.

The company’s newest driver provides enterprises with another option. “Customers can plug in our driver under their normal code maps [to] applications that already support ODBC today, and they are able to take advantage of Hadoop for all of their customers,” Benedict explained.
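To picture what “plugging in” looks like from application code, here is a hedged sketch of querying a Hive table through an ODBC data source from Java, using the legacy JDBC-ODBC bridge that shipped with JDK 7 and earlier (it was removed in Java 8). The DSN name and table are hypothetical; a real setup would point the DSN at the vendor’s driver configured for the target Hadoop distribution.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveOdbcQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");            // bridge driver, JDK 7 and earlier only
        Connection conn = DriverManager.getConnection("jdbc:odbc:HadoopHive"); // "HadoopHive" is a hypothetical DSN
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(
                "SELECT page, COUNT(*) AS hits FROM weblogs GROUP BY page");   // "weblogs" is a hypothetical table
        while (rs.next()) {
            System.out.println(rs.getString("page") + " -> " + rs.getLong("hits"));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}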

The driver offers support for several common Hadoop distribution frameworks, including Apache, Cloudera, MapR, and Amazon EMR. At the same time, it provides platform support for Windows, Red Hat, Solaris, SUSE, AIX, and HP-UX. According to Benedict, the release of this driver reflects a growing need to analyze and process big data.

“[Enterprises are] consuming, analyzing and taking action on a much larger set of data than they have in the past,” he explained. “The reason why that’s changed is that, while you could store that data in the past, you just couldn’t really do it cost effectively. Big data/Hadoop allows you to do it in a slightly more cost-effective manner. Plus you’ve got a lot of technology that’s being built around this to enable you to better monetize and take action on data.”

By offering one unified driver, Progress DataDirect says it is filling demand for better connectivity to all the major platforms supporting the major distributions of Hadoop. The driver is set to ship at the end of October; preview access is now available on a limited basis to current customers. -Stephanie Mann


August 27, 2012  5:53 PM

Lightweight scripts bear down on Java ecosystem

Jack Vaughan

In a recent report on the state of Java, IDC analyst Al Hilwa notes that the Java ecosystem is healthy and on a growth trajectory, with more programming languages than ever now hosted on the Java Virtual Machine (JVM). Hilwa, program director for application development software at IDC, gives credit to Oracle for a mostly successful custodianship of Java since its acquisition of Sun Microsystems two years ago.

There are some clouds on the horizon, as could be expected for a language and architecture that has been atop the heap of enterprise middleware for so many years. Writes Hilwa: “Java is under pressure from competing developer ecosystems, including the aggressively managed Microsoft platform and ecosystem and the broader Web ecosystem with its diverse technologies and lightweight scripting languages and frameworks.”

While looming lightweight languages, frameworks and runtimes do portend a new state of Java, Java’s ability to evolve and absorb new technologies has proved remarkable to date. There is reason to believe there is still more to come.


August 20, 2012  8:22 PM

Skills for big data: Hadoop, Pig, Cassandra and more

Jack Vaughan

Q: What is a data scientist? A: It’s a DBA from California. The joke underscores the fact that the world of big data skills right now is pretty much topsy-turvy. If you would like to see a short list of skills associated with big data initiatives, you are out of luck. Try a long list instead.

The skills list – courtesy of the IT skills specialists at Foote Partners LLC – includes Apache Hadoop, MapReduce, HBase, Pig, Hive, Cassandra, MongoDB, CouchDB, XML, Membase, Java, .NET, Ruby, C++ and more.

Further, the ideal candidate needs to be familiar with sophisticated algorithms, analytics, ultra-high-speed computing and statistics – even artificial intelligence. The needs of big data, which arise in part from modern computing’s ability to produce more and more bits and bytes, mean that developers have to hone their skills significantly. Suddenly, SQL-savvy developers have to obtain NoSQL skills.

New technology like Hadoop is so raw that the developer is often forced to create his or her own software tools, which is a skill in itself.  Writes the Foote crew:

Hadoop is an extremely complex system to master and requires intensive developer skills. There is a lack of an effective ecosystem and standards around this open source offering and generally poor tools available for using Hadoop.
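To give a sense of the developer-level plumbing the Foote report alludes to, here is the canonical word-count job written against the org.apache.hadoop.mapreduce API: even Hadoop’s “hello world” requires custom mapper and reducer classes plus explicit job wiring.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);               // emit (word, 1) for every token
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            context.write(key, new IntWritable(sum));   // total count per word
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory, must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}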

Foote warns that there is only more of the same to come, especially as unstructured data from sources such as sensors and social media piles up in the in-box. Note to the big data scientists of tomorrow: Get ready for the deluge! – Jack Vaughan

 

