Big players in the virtualization world griped about the absence of performance benchmarks for virtual machines on CIO Talk Radio yesterday and discussed some of the issues surrounding virtualization standards.
Guests on the show included Simon Crosby, chief technology officer of the Virtualization and Management Division at Citrix; Tom Bishop, chief technology officer of BMC Software; Dr. Tim Marsland, Sun Fellow and chief technology officer for the software organization at Sun Microsystems Inc.; and Brian Stevens, chief technology officer and vice president of engineering at Red Hat.
The glaring omission in this lineup: VMware, Inc.
The panelists on CIO Talk Radio didn’t mention VMware by name, but they did complain that some companies aren’t being open with their performance data, preventing the virtualization industry from publishing comparative performance results.
VMware’s licensing agreement for ESX allows users to conduct internal performance testing and benchmarking studies. It also allows those users (but not unauthorized third parties) to publish or publicly disseminate the data, provided that VMware has reviewed and approved the methodology, assumptions and other parameters of the study.
Users who have published benchmark data, such as senior systems engineer Mark Foster, who posted results on his blog, have had to unpublish them because of VMware’s stipulations.
Meanwhile, the SPEC Virtualization Committee has been working to create standard benchmarks for VMs. The committee’s goals are to deliver a benchmark that will model server consolidation of commonly virtualized systems such as application servers, web servers and file servers; provide a means to compare server performance while running a number of VMs; and produce a benchmark designed to scale across a wide range of systems.
SPEC expects these benchmarks to be available by the end of this year, but the timeline is not set in stone, according to the website.
Sun’s Marsland said benchmarking progress has been slow because there isn’t an easy way to define a workload, and a large number of benchmarks are required.
“We are talking about a virtual computer, with lots of aspects that need to be benchmarked,” Marsland said. “Every component that gets virtualized needs to be benchmarked.”
Having an open, standardized way of benchmarking is expected to push virtualization further into the mainstream because it will eliminate false perceptions about performance, panelists said. For instance, “there is the thought that I/O-intensive workloads cannot be virtualized, and the absence of benchmarks prevents us from proving otherwise. It is important for us to have good benchmarks out there,” one panelist on the show said.
Though users look at benchmarks, this type of data is most useful to vendors and OEMs, who can use the performance standards to improve the technology and, of course, market their products.
“More open scrutiny of performance results will help us to improve as an industry overall,” Bishop said. “There are ways to measure performance in non-virtual environments, and people are adapting those techniques to get the most out of their virtualized environments.”
In terms of application performance in virtual environments, the issues differ depending on the data center infrastructure. The network, the servers and the storage all affect performance, said Stevens of Red Hat.
Another problem with virtualization? There are support challenges. If an application running in a VM starts misbehaving, the application vendor may not support it, Crosby said.
Licensing and support in virtual environments have been a major gripe with Oracle, for example, which does not support running its applications on VMware.
“It is a reasonable concern…right now there is irrational market-based control. Some folks are abstaining from supporting certain apps [in virtual environments]. As customers demand support, things will hopefully get rational, by next year I hope,” Crosby said.