Information Technology Management with a Purpose

Apr 16 2012   4:56AM GMT

Are you ready for in-memory computing?

S R Balasubramanian

Over the last few years we have seen phenomenal changes in the technology landscape. New breakthroughs keep taking place, whether in hardware or in software and process ideas, and these have been altering our approach to computing. I am now talking about the latest of these moves: exploiting the computer's main memory to store and process data.

The concept

In the conventional model, we store data on the hard disk, and when we want to perform a task, we pull the relevant data and applications into the computer's main memory, which is where computations happen. The time taken for the whole process therefore depends on the access time for reading data from the disk and the speed with which the data is loaded into the processor and memory. Though disk and channel speeds have improved over time, they are still far lower than today's processor and memory speeds. With the computing needs of enterprises on the increase, accessing data from disks has become a bottleneck.
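To make the contrast concrete, here is a minimal sketch (all names — `query_disk`, `query_memory`, the sample table — are invented for illustration): the same lookup answered by re-scanning a file on disk versus by consulting a table already loaded into memory.

```python
import csv
import os
import tempfile

# A small sample "table" of ids and amounts.
rows = [{"id": str(i), "amount": str(i * 10)} for i in range(1000)]

# Disk-based approach: the data lives in a file, and every query
# re-reads and scans it (I/O-bound).
path = os.path.join(tempfile.mkdtemp(), "sales.csv")
with open(path, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)

def query_disk(target_id):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):      # full file scan on each call
            if row["id"] == target_id:
                return int(row["amount"])

# In-memory approach: the table is loaded once; lookups touch only RAM.
table = {row["id"]: int(row["amount"]) for row in rows}

def query_memory(target_id):
    return table[target_id]                # no disk I/O at query time

print(query_disk("42"), query_memory("42"))  # both print 420
```

The sketch is deliberately tiny, but the shape is the point: the disk path pays I/O cost on every query, while the in-memory path pays it once at load time.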

Going down memory lane, I remember storing data on tapes and cartridges. Then, as disk costs (per GB) came down, disks began to be used in place of tapes. Now, as memory costs fall further, it has become possible to store larger chunks of data in memory.

The concept at work

With the emergence of multi-core processors and the sharp decline in the prices of processors and memory, a few vendors have developed technology that makes it possible even for large enterprises to dispense with hard disks and perform storage and all operations in main memory. This boosts performance enormously compared to systems that retrieve data from hard drives. The pioneers in this area have been the German software vendor SAP and TIBCO.

Curious to know more about this approach, we, at our CIO Association, invited SAP to hold a seminar explaining the concept and their offerings. The seminar was well-received by the audience as SAP gave an introduction to its new BI product, HANA (High-Performance Analytic Appliance). HANA is a composite solution comprising software and an appliance. SAP has tied up with various hardware vendors, including IBM and HP, and the servers come to customers pre-configured. With the complete data residing in main memory, computing gets very fast; SAP claims a ten-fold speed-up on data loading and a hundred-fold speed-up on reporting. That makes real-time reporting, and hence faster decision-making, possible. The data is stored in compressed form using a columnar format (a compression ratio of 6+), and it is split and stored with check bits to ensure recovery in case of system failure.
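Why does columnar storage compress so well? Values within a single column tend to repeat, so even a simple scheme shrinks them dramatically. The sketch below uses run-length encoding as a stand-in — this is an illustration of the general idea, not SAP HANA's actual compression algorithm, and the function names are invented for the example.

```python
def rle_encode(column):
    """Run-length encode a column: [v, v, v, w] -> [[v, 3], [w, 1]]."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1        # extend the current run
        else:
            runs.append([value, 1])  # start a new run
    return runs

def rle_decode(runs):
    """Expand the runs back into the original column."""
    return [value for value, count in runs for _ in range(count)]

# A "region" column from a sales table: long runs of repeated values,
# which is typical when data is stored column by column.
region = ["APAC"] * 500 + ["EMEA"] * 300 + ["AMER"] * 200

encoded = rle_encode(region)
print(len(region), "values stored as", len(encoded), "runs")  # 1000 values, 3 runs
assert rle_decode(encoded) == region  # compression is lossless
```

In a row-oriented layout, those region values would be interleaved with ids, amounts, and dates, breaking up the runs; storing the column contiguously is what makes ratios like the claimed 6+ plausible.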

Use cases

While it is always good to hear about new technologies and their breakthrough capabilities, it is the question of their relevance and usefulness to user organizations that comes to the fore. So let us take a look at where a product like this could help:

  • Significantly faster processing: This could be useful when you want to do a quick profitability analysis in a complex organization with multiple products – you can process in seconds what would otherwise take a few hours. You could analyze the day's sales performance and take appropriate action before the competition wakes up. Similar use cases exist for production scheduling, logistics, etc.
  • Replacing complex data warehouses with in-memory data storage: Data warehouses precompute data into many different aggregates so that the answer is ready when the question comes. With in-memory computing, you don't need those pre-built aggregates; you can calculate on the fly. That means lower infrastructure costs for running large-scale analytics systems.
  • Running workloads you could not run before: You may have been constrained in the past from running complex iterations, including linear programming models and other optimization solutions, because they took enormous time and resources. Various what-if analyses and future-scenario calculations can also find a use here.
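The second use case above — calculating on the fly instead of maintaining pre-built aggregates — can be sketched in a few lines. This is a hypothetical illustration (the sample data and the `profitability_by_product` helper are invented): when all rows sit in memory, a full scan plus aggregation is cheap enough that you no longer need the warehouse-style cube prepared in advance.

```python
from collections import defaultdict

# A small in-memory fact table of sales rows (illustrative data).
sales = [
    {"product": "A", "revenue": 120, "cost": 80},
    {"product": "B", "revenue": 200, "cost": 150},
    {"product": "A", "revenue": 90,  "cost": 60},
]

def profitability_by_product(rows):
    """Aggregate profit per product on the fly with one pass over the rows."""
    profit = defaultdict(int)
    for row in rows:                      # full scan; fast when rows are in RAM
        profit[row["product"]] += row["revenue"] - row["cost"]
    return dict(profit)

print(profitability_by_product(sales))    # {'A': 70, 'B': 50}
```

The same question against a disk-based warehouse would typically be answered from a pre-aggregated table that must be rebuilt whenever the underlying data changes; in memory, the aggregate is simply recomputed each time it is asked for.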

Should we take the plunge?

Well, the technology is here and it works. However, as with all new introductions, organizations would do well to assess their requirements and work out a justification for adopting in-memory computing. The technology will be further fine-tuned as it matures, and we will see more applications moving into memory. The technology is truly transformational; it needs to be watched closely and adopted if the situation so demands.
