In their book, Chandy and Schulte discuss the results of reducing elapsed time for business processes, obtaining contextual information to accompany event information, the nature of event objects, and the changing workplace we can expect as event-driven architecture, now largely the province of Wall Street financial systems, becomes more and more mainstream.
In a recent interview, Caltech professor Chandy told us there is quite a bit more involved. Event processing, he said, is no less than the “stuff of life.” There is plenty of reason to think computerized continuous monitoring, quick response and adjustment will be at the center of how civilization runs in years to come.
“It can even apply to something as basic as water, because you need sensors in water to make sure it’s flowing right and that it’s the right quality. There are increasing issues of contention over water rights and how water is distributed. All of that requires continuous monitoring and response. It’s also important in getting food ‘from farm to fork,’” he said.
And for more on Chandy and Schulte’s book, go to the “Event Processing: Designing IT Systems” page on the McGraw-Hill professional books site.
“One of the things we have all recognized is that for healthcare to improve, we need information to be available where and when it’s needed to those who are authorized to see it,” said Mary Jo Deering, Director for Informatics Dissemination at the National Cancer Institute, and a member of the Office of the National Coordinator (ONC) for Health IT within the Department of Health and Human Services. Deering said governance will be an important factor in helping to move data where it is needed while making certain that protections for security and privacy are provided.
As part of that effort, the ONC for Health IT plans to establish governance standards for the Nationwide Health Information Network (NHIN). Related rulemaking should take a step forward this week with planned public hearings in Washington DC. Deering said the process will take off in mid-November when the NHIN presents the final recommendations to the Health IT policy committee.
Dr. Doug Fridsma, acting director of the Office of Interoperability and Standards, said this new governance effort is taking a SOA-based approach. Companies within the NHIN and other federal partners use SOA to help support data exchange, Fridsma said. “Much of the work is around a service-oriented architecture in the way in which the [data] exchange occurs,” said Fridsma.
NHIN uses different tools related to SOA. One of them, Connect, is an open source initiative in the public domain that helps to promote interoperability in the healthcare system and also enables secure data exchange among healthcare providers and agencies. “Connect takes the specifications that we have for the NHIN and creates the SOA software that is described in those specifications,” said Fridsma.
The fact that there’s a large open source community to help support the exchange is beneficial to the NHIN. According to Fridsma, it has helped to form the foundation for NHIN exchanges, and also helped with the work that’s going on with the presidential initiative to improve health care.
The effort is ongoing. Just last week, the Federal Health Architecture (FHA) sponsored a Connect Code-A-Thon at the Mayo Clinic in Rochester, Minn., to try out Connect software and address related issues. The event included software developers from companies, health networks and universities.
At JavaOne, Oracle indicated it was building Java APIs for JavaFX Script-like binding in JavaFX 2.0. This includes support for high-performance lazy binding. Non-Java languages will also be able to take advantage of this binding library.
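To see what “lazy binding” means in practice, here is a minimal plain-Java sketch of the invalidation-based idea: a bound value is only marked stale when a dependency changes, and is recomputed just once, on the next read. All class and method names here are made up for illustration; this is not Oracle's JavaFX 2.0 binding API, only the concept behind it.

```java
import java.util.function.IntSupplier;

public class LazyBindingSketch {
    // A simple observable integer property with one invalidation listener.
    static class IntProperty {
        private int value;
        private Runnable invalidationListener = () -> {};
        IntProperty(int v) { value = v; }
        int get() { return value; }
        void set(int v) { value = v; invalidationListener.run(); }
        void onInvalidate(Runnable r) { invalidationListener = r; }
    }

    // A lazily evaluated binding: a dependency change only flips a flag;
    // the expression is recomputed on the next get(), not on every change.
    static class LazyBinding {
        private final IntSupplier compute;
        private int cached;
        private boolean valid = false;
        LazyBinding(IntSupplier compute, IntProperty... deps) {
            this.compute = compute;
            for (IntProperty d : deps) d.onInvalidate(() -> valid = false);
        }
        int get() {
            if (!valid) { cached = compute.getAsInt(); valid = true; }
            return cached;
        }
    }

    public static void main(String[] args) {
        IntProperty a = new IntProperty(2);
        IntProperty b = new IntProperty(3);
        LazyBinding sum = new LazyBinding(() -> a.get() + b.get(), a, b);
        System.out.println(sum.get()); // computed now: 5
        a.set(10);                     // binding invalidated, nothing recomputed yet
        System.out.println(sum.get()); // recomputed on demand: 13
    }
}
```

The performance win is that a burst of many property changes costs only flag flips, with a single recomputation when the value is actually read.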
Oracle followed up all this with a proposal to contribute the Sapphire (a jab at SAP?) Java desktop user interface designer to Eclipse (which, we might recall, itself was a jab at Sun).
The slate of new stuff includes Project Prism graphics pipeline acceleration on hardware or via software. It will target DirectX on Windows platforms (both 32- and 64-bit) and OpenGL on other systems. Will HTML5 codecs be a point of contention between Google and Oracle, as Android has been? This is the point where our embedded pundit chimes in: “Time will tell.”
Much of the original Java effort was focused on the client-side, but the server-side has been far more fruitful. The advent of HTML5 could give Java another chance to shine on the front-end, but Oracle likely faces an uphill battle there.
The latest release aims to improve customization and control. It includes the option to maintain separate settings for separate operating modes. Users will also be able to specify which runtime properties will be used when launching programs from the Eclipse-based isCOBOL integrated development environment.
Among other changes, isCOBOL now has a graphical user interface (GUI) which is intended to streamline migration to isCOBOL. The GUI can also be used to move data from indexed file systems to relational databases. Veryant has also made a Graphical Indexed File Editor (GIFE) available. The GIFE is designed to help developers read, modify, add, or delete individual records in an indexed file via a graphical interface.
Well, if the OracleWorld people read the tea leaves right when Oracle rolled out the ExaData data warehouse in a box a couple of years ago, then they weren’t surprised by the heavy dose of hardware at the first day of this year’s OracleWorld. But if they didn’t read those tea leaves, they wandered into a big bundle of surprise at this year’s event. Oracle’s purchase of Sun is shaping up as a sea change for the company led by yachtsman Larry Ellison.
Center stage at OracleWorld was the ExaLogic cloud in a box. It puts together Oracle VMs on Solaris or Linux, clustered WebLogic servers, the Coherence data cache, as well as Oracle JRockit and HotSpot Java Virtual Machines (destined to be combined, oh Java faithful), all optimized and running in an x86-architected box with racks connected by InfiniBand, and employing a clustering scheme that works nicely with ExaData. Oracle spokespeople repeatedly called it a “Middleware Machine.” Their middleware story is suddenly a hardware story, which may take a while for some to digest. What ExaData did for data, ExaLogic is intended to do for logic. “We are making Java sing on hardware,” said one Oracle technologist, citing recent benchmarks.
While it is an extraordinary box – a blinking edition was standing next to several replicas of Iron Man in the Moscone lobby – claims that this is the first middleware machine are slightly inflated. Solace and Tibco have put messaging middleware on specialized hardware chips. Earlier this year IBM added data caching to its DataPower appliance. Oracle’s machine is truly an impressive piece of hardware, however, looking a bit like…well… a mainframe.
While the JavaOne crew across Market might be wondering what parts of the JRockit VM would stay and which parts of the HotSpot VM would go, and what Oracle would eventually do with Java, the Java developer ranks had yet another question to ponder: What would things be like if it had ever occurred to Sun Microsystems to ‘make Java sing on hardware?’
Keep an eye out for more OracleWorld and JavaOne coverage at TechTarget sites such as SearchOracle.com, TheServerSide.com and, of course, SearchSOA.com.
“Some of the people who were using Memcached were coming to us when a server went down and asking how they could get their data back – the answer is, you can’t,” explains James Phillips, a company co-founder. To address this vulnerability, NorthScale designed Membase, a new product that aims to build on the flexibility and speed inherent in Memcached but with an added layer of persistence for the data.
Phillips explains that with Membase the data in system memory is written to a multi-tier storage model that includes SSDs, spinning media, and online storage like Amazon S3. “That means, as data comes in you can cache it and persist it to a durable media and it can be protected if there is a problem. In addition, we have added replication capability so Membase is fully replicated like a database management system,” Phillips explains. Furthermore, since Membase allows you to replicate to any number of nodes, you can use that capability to provide automatic failover in a cluster. Membase also supports rebalancing to provide elastic characteristics, he says.
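The cache-plus-durability idea Phillips describes can be sketched in a few lines of plain Java: reads are served from memory, memcached-style, while every write is also appended to a durable log that is replayed on startup, so data survives a restart. The class, file format, and tab-separated layout are all made up for illustration; this is not Membase's storage engine.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.HashMap;
import java.util.Map;

public class PersistentCacheSketch {
    private final Map<String, String> memory = new HashMap<>();
    private final Path log;

    PersistentCacheSketch(Path log) throws IOException {
        this.log = log;
        // Recovery: replay the append-only log into memory on startup.
        if (Files.exists(log)) {
            for (String line : Files.readAllLines(log)) {
                int tab = line.indexOf('\t');
                memory.put(line.substring(0, tab), line.substring(tab + 1));
            }
        }
    }

    void set(String key, String value) throws IOException {
        memory.put(key, value);  // fast in-memory path
        Files.writeString(log, key + "\t" + value + "\n",
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);  // durable path
    }

    String get(String key) { return memory.get(key); }  // served from memory
}
```

A real system would batch and fsync the writes, compact the log, and layer in the replication Phillips mentions, but the recover-by-replay pattern is the core of why "a server went down" no longer means the data is gone.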
All of this, according to Phillips, is a step toward a new way of looking at handling data. “You hear a lot of talk about Big Data, which focuses on being able to crunch large data sets and provide analytics in a cloud environment. We make the point that the ‘big’ data came from a Big Audience or user base,” says Phillips.
Phillips says that in a real-time system, with large numbers of concurrent users, it is best to have data spread widely rather than concentrated, because that allows the web-based application to request data from many servers at the same time. “Membase is a database that is optimized for real time storage,” and large numbers of users, he says.
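One common way to spread keys widely across servers is a hash ring: each key maps deterministically to one of N nodes, so concurrent requests for different keys naturally fan out to different machines. The sketch below is a generic consistent-hashing toy with invented node names; Membase itself uses its own partitioning scheme, so take this only as an illustration of the spread-the-data idea.

```java
import java.util.Map;
import java.util.TreeMap;

public class HashRingSketch {
    // Sorted map from hash position to node name: the "ring".
    private final TreeMap<Integer, String> ring = new TreeMap<>();

    void addNode(String node) {
        // A few virtual points per node smooth out the key distribution.
        for (int i = 0; i < 3; i++) {
            ring.put((node + "#" + i).hashCode(), node);
        }
    }

    String nodeFor(String key) {
        // Walk clockwise to the first node at or after the key's hash,
        // wrapping around to the start of the ring if necessary.
        Map.Entry<Integer, String> e = ring.ceilingEntry(key.hashCode());
        return (e != null ? e : ring.firstEntry()).getValue();
    }
}
```

Because the mapping is stable, every client computes the same key-to-node answer independently, with no central directory to bottleneck the large concurrent-user workloads Phillips describes.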
Asking the creative questions is behind new approaches to data analytics. As we mentioned in our bit on Big Data analytics, it calls for whole new sets of skills, ones garnered not just from the data side, but also from development teams and applications specialists. We noted the smorgasbord of skills for which to be on the lookout: Clojure, Scala, Python, Hadoop, Java, R, MATLAB, Erlang, Lisp, Cassandra and CouchDB.
A look at recent SearchSOA.com articles on data integration shows it takes multifarious shapes. In an expert Q&A, David Linthicum talks about emerging Web data services. In a brief tutorial on transmitting data, William Brogden outlined some basic methods, including JSON, Google Protocol Buffers and XML. From JBoss World, our staff blogged about changes to Hibernate Persistence Engine functions that move things forward for developers working in the latest version of Java EE. On sister site Ebizq.net, Joe McKendrick blogged about the use of a data layer for SOA, and your reporter looked at the dawning of the age of enterprise data mashup app stores.
At heart, the history of computing has always been about ever better data reporting and analytics. Maybe the first really sticky computer saying was “Garbage in – garbage out.” Let’s remember that Data Processing led to Information Processing which led to what is now called Information Technology. Maybe we are coming back to the beginning of things, with all the present emphasis on data analytics and complex event processing.
Thanks to Forrester’s Randy Heffner for the Picasso quote. Randy peppers his e-mails with an ever-changing series of great quotations that usually seem relevant to exactly what is going on.