Posted by: Randy Kerns
Almost every conversation about storage includes performance. That’s because storage system performance is important for the responsiveness of applications. Most vendors go to great lengths to provide performance data for their storage systems. That data informs decisions about deployment and about how particular applications are used.
But performance information must be credible to be useful in decision making. If the performance data is inaccurate or not applicable to the way the customer will use the system, decision makers will discount that vendor’s performance data in the future. Vendor performance information is greeted with skepticism anyway. Producing inaccurate or inapplicable information quickly turns skepticism into distrust.
For performance information to be useful, the correct performance testing software must be used in a controlled environment that represents the customer applications and configurations. The use case dictates the type of information required, and performance testing software must be capable of reproducing the desired environment. Using the wrong storage exerciser program can give misleading information and misrepresent the performance for a particular application.
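To make the distinction concrete, the following is a minimal sketch (not any vendor's actual tool) of the kind of generic storage exerciser the article describes: it issues random reads and writes at a predefined read/write ratio. The file path, block size, and operation count are illustrative assumptions. Note that the I/O mix it generates is static by construction, which is exactly why it cannot represent a workload whose character shifts over time.

```python
import os
import random
import time

def run_fixed_ratio_exerciser(path, read_ratio=0.7, block_size=4096,
                              total_ops=1000, file_size=1024 * 1024):
    """Issue random reads/writes at a fixed read/write ratio.

    This mimics the style of synthetic workload a generic storage
    exerciser produces: the ratio and block size never change, so the
    storage system sees a steady, predictable I/O mix for the entire
    run -- unlike a real application's shifting workload.
    """
    # Pre-fill a scratch file so reads have data to hit.
    with open(path, "wb") as f:
        f.write(os.urandom(file_size))

    reads = writes = 0
    start = time.perf_counter()
    with open(path, "r+b") as f:
        for _ in range(total_ops):
            offset = random.randrange(0, file_size - block_size)
            f.seek(offset)
            if random.random() < read_ratio:
                f.read(block_size)
                reads += 1
            else:
                f.write(os.urandom(block_size))
                writes += 1
    elapsed = time.perf_counter() - start
    return {"reads": reads, "writes": writes,
            "iops": total_ops / elapsed}
```

A real exerciser would add queue depths, multiple threads, and direct I/O to bypass the page cache; the point here is only that the workload's shape is fixed in advance rather than derived from an application.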
A good example would be performance for a Virtual Desktop Infrastructure (VDI) environment. VDI represents a complex workload for storage that changes quickly. A storage system that can respond to changing workloads would have advantages over one that may be excellent in certain aspects but cannot adapt quickly.
The performance testing of storage for VDI environments must replicate the dynamics of the changing VDI workloads. A standard exerciser program meant to exhibit storage system characteristics by driving I/Os with predefined read/write ratios cannot mimic the actual workload. The only way to get useful information about a storage system’s capabilities in a VDI environment is to play back streams captured from an actual workload against the storage system. This demonstrates the storage system’s ability to adapt to the complexity of the I/O characteristics. Scaling the workload can then show how many virtual desktops the system can support within acceptable parameters.
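The capture-and-replay approach described above can be sketched as follows. This is a simplified illustration, not a production replay tool: the trace format (timestamp, operation, offset, size) and the `io_backend` interface are assumptions for the example. The key difference from the fixed-ratio exerciser is that the read/write mix, block sizes, and burstiness all follow whatever the real workload did, and the `speedup` parameter gives one crude way to scale the load upward.

```python
import time

def replay_trace(trace, io_backend, speedup=1.0):
    """Replay a captured I/O trace against a storage backend.

    trace: list of (timestamp_s, op, offset, size) tuples captured
        from a real workload, e.g. one virtual desktop's I/O stream.
    io_backend: object exposing read(offset, size) and
        write(offset, size) against the system under test.
    speedup: values > 1.0 compress the captured inter-arrival gaps,
        one simple way to scale the offered load.

    Returns the number of operations replayed.
    """
    if not trace:
        return 0
    t0 = trace[0][0]
    start = time.perf_counter()
    completed = 0
    for ts, op, offset, size in trace:
        # Honor the captured inter-arrival time (scaled), so bursts
        # and idle periods from the real workload are preserved.
        target = (ts - t0) / speedup
        delay = target - (time.perf_counter() - start)
        if delay > 0:
            time.sleep(delay)
        if op == "read":
            io_backend.read(offset, size)
        else:
            io_backend.write(offset, size)
        completed += 1
    return completed
```

To estimate how many desktops a system supports, one could replay many captured desktop streams concurrently (one thread or process per stream) and increase the count until response times exceed the acceptable threshold.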
For IT personnel making a strategic decision, evaluating performance requires testing in their own environment, running industry-standard benchmarks specific to the types of applications they use, or using third-party information. Results should only be considered if they are relevant to the application.
Introducing new storage systems into an environment represents risk for IT. The big risk is that the system will not deliver the performance needed. Performance information obtained with relevant testing and test software can help minimize that risk.
(Randy Kerns is Senior Strategist at Evaluator Group, an IT analyst firm.)