Posted by: S R Balasubramanian
breakthrough technology, hypervisor, resource optimization, server virtualization
Virtualization has been talked about for many years, but only during the last few years has it come into large-scale use across organizations and data centers. While large organizations have embraced the technology out of sheer necessity, many small and medium-sized organizations remain outside its ambit. So it makes sense to explain this technology solution and make it easy to understand.
In simple terms, virtualization is a method of running multiple independent virtual operating systems on a single physical computer. It is, in essence, the masking of server resources from server users, including the number and identity of individual physical servers, processors, and operating systems. All server resources are pooled together, and each application or task draws the resources it needs from the pool, making the application or the user believe the task is running on a separate, dedicated server.
How this works is simple to understand. Virtual servers run on top of a middle layer called a ‘hypervisor’, which masks the physical servers from the users and takes over the job of allocating resources to each task from the server pool. To understand further, consider the following statements:
Virtualization enables us to combine servers from multiple generations into the same virtualized server pool.
Virtualization allows a group of inefficient servers to be replaced with a fewer number of machines.
Virtualization is software that allows a single piece of hardware to run multiple operating system images at the same time.
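The pooling idea behind these statements can be sketched in a few lines of code. This is a deliberately simplified toy model, not how a real hypervisor such as VMware ESXi or KVM is implemented; the class and figures are hypothetical, purely to illustrate how a shared pool is carved up among virtual machines:

```python
# Toy model of a hypervisor's resource pool (illustrative only;
# real hypervisors are vastly more sophisticated).

class ResourcePool:
    def __init__(self, cpus, memory_gb):
        self.free_cpus = cpus        # all physical CPUs pooled together
        self.free_mem = memory_gb    # all physical RAM pooled together
        self.vms = {}                # running virtual machines

    def allocate(self, name, cpus, memory_gb):
        """Carve a virtual machine out of the shared pool."""
        if cpus > self.free_cpus or memory_gb > self.free_mem:
            raise RuntimeError(f"pool exhausted, cannot start {name}")
        self.free_cpus -= cpus
        self.free_mem -= memory_gb
        self.vms[name] = (cpus, memory_gb)

    def release(self, name):
        """Return a retired VM's resources to the pool."""
        cpus, mem = self.vms.pop(name)
        self.free_cpus += cpus
        self.free_mem += mem

# Three applications that once needed three physical servers
# now share a single 16-CPU / 64 GB host.
pool = ResourcePool(cpus=16, memory_gb=64)
pool.allocate("erp", cpus=8, memory_gb=32)
pool.allocate("mail", cpus=4, memory_gb=16)
pool.allocate("web", cpus=2, memory_gb=8)
print(pool.free_cpus, pool.free_mem)  # 2 CPUs and 8 GB still free
```

Each application believes it has a dedicated server, while the hypervisor simply does the bookkeeping against one shared pool — which is exactly why idle capacity on one "server" becomes usable by another.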
In the conventional set-up, we keep adding servers as we introduce more applications, and when the numbers grow large the estate becomes unwieldy and difficult to manage. Further, every application consumes resources differently, leaving some servers strained to the full while others sit nearly idle with very little usage. This leads to sub-optimal utilization of server resources, which is undesirable. Adding to this is the difficulty of provisioning space to host so many servers.
If you are thinking of implementing virtualization, I would recommend the following steps:
Clearly assess the requirement of server resources over a three-year horizon, taking into account additional applications that are likely to be introduced, the projected growth of the organization in terms of number of users, and the increasing volume of data to be processed.
Evaluate and select a virtualization software that is appropriate, given the current IT landscape and size of the set-up.
Re-assess the hardware resources and plan the new servers that will make use of this new environment.
Choose an implementation partner who has adequate experience and expertise in the software selected.
Work closely with the partner on an optimal design that best accommodates your applications.
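The sizing exercise in the first step comes down to simple arithmetic. The sketch below is a hypothetical illustration; every figure in it — the growth rate, headroom, and per-VM sizing — is an assumption you would replace with your own projections:

```python
import math

# Rough three-year capacity projection for a virtualized pool.
# All figures below are assumptions, for illustration only.

current_vms = 20        # workloads (VMs) needed today
annual_growth = 0.25    # assume 25% more workloads each year
years = 3
headroom = 0.30         # keep 30% spare capacity for usage spikes

projected_vms = current_vms * (1 + annual_growth) ** years
vms_with_headroom = projected_vms * (1 + headroom)

avg_cpus_per_vm = 2     # assumed average VM size
cpus_per_host = 32      # assumed capacity of each new physical host
hosts_needed = math.ceil(vms_with_headroom * avg_cpus_per_vm / cpus_per_host)

print(round(projected_vms))  # about 39 VMs by year three
print(hosts_needed)          # 4 hosts under these assumptions
```

Even a back-of-the-envelope model like this keeps the conversation with vendors and implementation partners grounded in numbers rather than guesswork.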
A breakthrough technology?
I think it is. It is amazing how researchers broke the paradigm of treating the processor as indivisible and created a mechanism by which the processor lends itself to such manipulation. It is a wonderful solution to leverage, and I hope most of us use it to make optimum use of our resources.