[ANALYSIS] – Microsoft has made its Oslo design team part of the company’s Data Programmability group. The news came via the blog of Doug Purdy, product unit manager for Oslo. It has been a journey. Continued »
Semiconductor memory advances have powered a new era of portable music and mobile devices. In fact, the general ascent of computer technology has been very much based on cheaper, faster and larger semiconductor memory. The People in the White Smocks are working to ensure that the march continues. Service-oriented architects take note … Continued »
Modern IT architectures like SOA allow and require web applications to change rapidly along with business needs. Some of the major bottlenecks to this sort of agility are configuration errors inside application deployment scripts. When an application is altered, the scripts that deploy it onto the web may need altering as well. But when nobody on the team can remember who wrote the script or what parts need to be rewritten, the downtime can cost enterprises a lot of money.
A newer breed of software is emerging that tries to take the scripting out of deploying and configuring Web applications. One vendor, Phurnace Software, provides an automated framework for deploying Java EE applications. Larry Warnock, president and CEO of Phurnace, said mucking about with deployment scripts needs to be a thing of the past.
Scripting can be a black hole for productivity. “You have to make it very specific to what you’re doing and then there is no feedback,” said Warnock. “It either works or it doesn’t and you don’t know what you did wrong.”
Phurnace is meant to mask many of the complexities of deployment parameters and configuration with a “black box” approach. One of the greatest challenges in deploying Web applications is configuring the application server and reconfiguring it when the app changes. Software like Phurnace Deliver takes the hand coding out of the equation.
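To make the problem concrete, here is a minimal sketch (not Phurnace’s API — all names and settings here are hypothetical) of the per-environment configuration that hand-written deploy scripts tend to hard-code, and that automated deployment tools externalize and check up front, addressing Warnock’s “you don’t know what you did wrong” complaint:

```python
# Hypothetical sketch: validate deployment settings before touching the
# app server, instead of failing silently partway through a deploy.

REQUIRED_KEYS = {"app_server", "datasource_url", "jvm_heap_mb"}

def validate_deploy_config(config: dict) -> list:
    """Return a list of human-readable problems; empty means deployable."""
    problems = [f"missing setting: {k}"
                for k in sorted(REQUIRED_KEYS - config.keys())]
    if config.get("jvm_heap_mb", 0) <= 0:
        problems.append("jvm_heap_mb must be a positive integer")
    return problems

# A config with a hole in it -- the kind of error a hand-rolled script
# would only surface as a mysterious mid-deploy failure.
prod = {"app_server": "websphere-6.1",
        "datasource_url": "jdbc:db2://prod/orders"}
print(validate_deploy_config(prod))
# → ['missing setting: jvm_heap_mb', 'jvm_heap_mb must be a positive integer']
```

The point of the sketch is the feedback loop: every problem is named before anything is deployed, which is precisely what ad hoc scripts fail to do.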
The overnight news is that web application framework specialist SpringSource was scooped up by virtualization giant VMware for about $362 million in cash and equity plus the assumption of some $58 million of unvested stock and options. It seems the industry thrust toward cloud computing is creating stranger and stranger sets of bedfellows… Continued »
The reduced pace of the summer seems to bring out the morbid curiosity in some of the best and brightest of SOA analysis. Last summer, if you recall, SOA began its death march, which culminated at the end of the year (when things are really really slow) with the pronouncement that “SOA is dead…” Continued »
by Jack Vaughan
Sometimes we have to remind ourselves about the obvious things. Business Process Management is about processes. SOA is about architecture. The two have been involved in a tango in recent months, as software architects work with their business-side brethren to make change happen in the organization. On one level, the dance of architecture and process is very familiar. Yet it plays out today in unique ways as you will find in our BPM tutorial.
UPDATED – IBM’s recent move to purchase SPSS has been pegged as an attempt to improve its analytical software portfolio. It is that.
But like other IBM purchases, it is also about installed base. Among a host of pretenders, SPSS is a real software company–but it became a real software company by delivering and successfully updating products customers bought over many years.
It did just enough research and development (and acquisitions) over the years to have a cool story now that analytical software is hot. It also stuck to its knitting over many years when coolness advisors would have led it into relational data stores, OLTP, and so on.
SPSS is a good fit for IBM, which continues to quietly snap up a lot of software companies that are older than startups, chock full of customers, and open to new technology and, in many cases, technology standards. Count Telelogic and ILOG among them. These acquisitions further thin out the ranks of mid-tier software companies–ones that are neither big nor small. Some days it seems like the future of software companies is one of the very big and the very small.
Last month we talked with SPSS. The company is in the news at the moment, as IBM has launched a $1-billion-plus bid to buy the firm.
When we spoke, the topic was XML standards for analyzing data. This is one of the places where SPSS is clearly in the vanguard.
SPSS reps discussed Version 4.0 of the Predictive Model Markup Language (PMML) for statistical and data mining models from the Data Mining Group (DMG). The company is incorporating PMML V.4.0 into upcoming versions of its PASW Modeler (formerly Clementine) data mining workbench and PASW Statistics (formerly SPSS Statistics).
PMML Version 4.0 is an amazing example of what XML (the “X,” recall, is for “extensible”) can do. The newly released version of PMML offers support for time series models, support for multiple models (both segmented models and ensembles of models), and improved preprocessing of data. Preprocessing of data here refers primarily to “outliers,” those dubious pieces of data that work for Malcolm Gladwell but which tend to muck up the usual works of statistical analysis.
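For readers who have not wrestled with outliers, here is an illustrative sketch of one common treatment, the interquartile-range (IQR) rule. To be clear, PMML itself only describes such treatments in XML; this simplified code just shows the underlying idea:

```python
# Illustrative only: flag outliers with the 1.5x-IQR rule of thumb.
# (Uses simple index-based quartiles; statistical packages interpolate.)
def iqr_outliers(values):
    xs = sorted(values)
    n = len(xs)
    q1, q3 = xs[n // 4], xs[(3 * n) // 4]   # rough first/third quartiles
    spread = q3 - q1
    lo, hi = q1 - 1.5 * spread, q3 + 1.5 * spread
    return [v for v in values if v < lo or v > hi]

# The 95 is the Gladwell-friendly data point that muckes up the averages.
print(iqr_outliers([10, 12, 11, 13, 12, 11, 95]))
# → [95]
```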
Going forward, said Jing Shyr, SVP & Chief Statistician at SPSS Inc., PMML will allow corporations to more readily embed statistical models into operational systems that enhance business processes. Shyr takes her role as chief SPSS statistician seriously, but has some fun. “If any number looks suspicious, it’s my fault,” she told us.
What does the Data Mining Group’s Bob Grossman think of the latest moves with PMML, and how these might play out in typical corporations? “With Version 4.0, PMML now handles all of the common use cases that occur when deploying analytic models in practice,” Grossman said in a prepared statement.
We asked for more and he responded via email: “PMML calls multiple models, which include segmented models and ensembles of models. After that the most important new features in PMML are improved support for preprocessing data and for time series data.”
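The multiple-models support Grossman describes can be seen in miniature. Below is a simplified, illustrative PMML 4.0 fragment (the required namespace and child elements are omitted for brevity): a MiningModel whose Segmentation element combines three sub-models by majority vote — an ensemble — parsed here with Python’s standard XML library:

```python
import xml.etree.ElementTree as ET

# Simplified PMML 4.0 fragment: an ensemble of three sub-models
# combined by majority vote. Real PMML documents carry a namespace,
# a data dictionary, and full model definitions inside each Segment.
doc = """
<PMML version="4.0">
  <MiningModel functionName="classification">
    <Segmentation multipleModelMethod="majorityVote">
      <Segment id="1"/>
      <Segment id="2"/>
      <Segment id="3"/>
    </Segmentation>
  </MiningModel>
</PMML>
"""

root = ET.fromstring(doc)
seg = root.find("./MiningModel/Segmentation")
print(seg.get("multipleModelMethod"), len(seg.findall("Segment")))
# → majorityVote 3
```

Because the ensemble is declared in XML rather than in vendor-specific code, any PMML-aware scoring engine can deploy the same model — which is the interoperability point Shyr and Grossman are making.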
By Rob Barry
CloudCamp in Boston last week was clear evidence that enterprises are starting to take a serious look at cloud computing. Granted, a lot of them are looking at cloud the way marketers look at social media. There’s a strong sense that, before long, it will be a dominant force in how business is done online. But just how? Well, that’s where people are still scratching their heads.
A number of vendors lined up to give their take on what’s important in the cloud. EMC said its Atmos product provides a scalable “internal cloud,” Signiant talked about using the cloud to move massive amounts of digital media, and Microsoft prepared 25 slides for a five-minute talk on Azure.
The stories were similar: a CIO said IT has to cut costs, and the cloud’s technology-on-demand approach might be an avenue. Companies are exploring the possibilities but seem timid due to the security issues inherent in sharing databases with others.
One highlight was sitting in a room of architects and consultants discussing their thoughts on the cloud. These folks all flocked together to heatedly discuss how cloud decreases the need for in-house IT operational infrastructure while, at the same time, leading businesses to “subscribe” to much of their technology.
By Mike Pontacoloni
Google App Engine is a straightforward example of cloud computing: You create an application, but your storage, bandwidth, and hosting needs are provided by Google’s computers, not yours. That simplicity holds only on paper, though. Making the move from traditional IT development to cloud-based development comes with challenges.
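To give a flavor of the division of labor: App Engine’s original Python runtime builds on the standard WSGI interface, so the developer’s contribution reduces to a callable like the one below, while Google supplies the servers, scaling, and request routing. This is a minimal, generic WSGI sketch (exercised locally with a synthetic request), not App Engine-specific code:

```python
from wsgiref.util import setup_testing_defaults

# A minimal WSGI application: the developer writes this; the cloud
# platform hosts it, scales it, and routes HTTP requests to it.
def application(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from the cloud"]

# Exercise it locally with a synthetic request -- no server needed.
environ = {}
setup_testing_defaults(environ)

status_holder = {}
def start_response(status, headers):
    status_holder["status"] = status

body = b"".join(application(environ, start_response))
print(status_holder["status"], body.decode())
# → 200 OK Hello from the cloud
```

The catch, as the article notes, is everything outside the callable: storage, sessions, and background work must be rethought for the platform’s services rather than carried over from a traditional server setup.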