I also had the chance to question DataCore’s George Teixeira on the company’s rationale for dubbing its product a “storage hypervisor,” a term I’ve long thought misleading.
If you’re not familiar with DataCore, its product is a fully featured software storage controller that aggregates physical storage media from any vendor into one large pool and manages volumes within it, with features including replication, snapshots and continuous data protection (CDP), unified storage capability, thin provisioning, and RAID striping.
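To make the pooling and thin-provisioning ideas concrete, here is a minimal conceptual sketch in Python. It is purely illustrative — the class, device names and capacities are invented for the example and bear no relation to DataCore’s actual implementation — but it shows the essence of the model: capacity from heterogeneous back ends is aggregated, and a volume’s logical size can exceed the physical space available, with real blocks consumed only as data is written.

```python
# Conceptual sketch of storage pooling and thin provisioning.
# Not DataCore's implementation; all names and figures are invented.

class StoragePool:
    """Aggregates capacity from heterogeneous back-end devices."""

    def __init__(self):
        self.backends = {}   # device name -> capacity in GB
        self.volumes = {}    # volume name -> (logical_gb, written_gb)

    def add_backend(self, name, capacity_gb):
        self.backends[name] = capacity_gb

    @property
    def physical_capacity(self):
        return sum(self.backends.values())

    @property
    def physical_used(self):
        return sum(written for _, written in self.volumes.values())

    def create_volume(self, name, logical_gb):
        # Thin provisioning: the logical size may exceed free physical
        # space; blocks are only consumed as data is actually written.
        self.volumes[name] = (logical_gb, 0)

    def write(self, name, gb):
        logical, written = self.volumes[name]
        if self.physical_used + gb > self.physical_capacity:
            raise RuntimeError("pool out of physical space")
        self.volumes[name] = (logical, written + gb)


pool = StoragePool()
pool.add_backend("vendor_a_array", 500)   # hypothetical branded array
pool.add_backend("whitebox_das", 300)     # hypothetical commodity disk
pool.create_volume("vm_store", 2000)      # logical size > physical pool
pool.write("vm_store", 100)
print(pool.physical_capacity, pool.physical_used)  # 800 100
```

The point of the sketch is the separation of concerns: the consumers of `vm_store` never see which vendor’s hardware holds their blocks, which is the lock-in-breaking property discussed below.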
Item No. 1 of the news is that DataCore has upgraded its SANsymphony-V to Version 9. The product is aimed at enterprises, particularly those that want to establish public or private clouds.
Features of V9 aimed at enterprise requirements include mirroring for high availability, disk migration, automated storage tiering and load balancing.
Cloud-friendly features include storage provisioned automatically to fulfil user catalogue choices, automated creation and breakdown of disk volumes to meet user needs, and scale-up and scale-out capability for performance growth.
Item No. 2 of the news is DataCore’s launch of a pay-as-you-serve cloud service provider licence programme. It does what it says on the tin; you pay DataCore to use its software according to the storage capacity you’re supplying to your customers.
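The mechanics of such a programme are simple enough to sketch. The per-terabyte rate below is entirely hypothetical — the announcement gives no pricing — but it shows why the model lets a provider’s licence costs track the capacity it is actually selling:

```python
# Hypothetical illustration of capacity-based licensing.
# The rate is invented and is not DataCore's actual pricing.
RATE_PER_TB_MONTH = 50.0  # assumed currency units per TB per month

def monthly_licence_fee(provisioned_tb):
    """Fee scales with the capacity the provider serves to customers."""
    return provisioned_tb * RATE_PER_TB_MONTH

print(monthly_licence_fee(120))  # 6000.0
```

A provider serving 120 TB pays for 120 TB; grow to 200 TB and the fee grows with the revenue that capacity earns, rather than being paid up front.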
I like these announcements. While I can’t comment on the details of how well the products work, I can applaud a product that drives a wedge into the hardware/software lock-in that blights a very large chunk of the storage industry. And it seems to me a progressive development to allow service providers to match their storage infrastructure costs to their revenues.
I don’t, however, like the term ‘storage hypervisor,’ which litters DataCore’s publicity.
I understand why the vendor uses it. It’s all about trying to differentiate itself in the market.
I put it to Teixeira that DataCore’s is essentially a storage virtualisation product. He agreed, to an extent. His argument is that it has come from a storage virtualisation lineage, but is now more than that. As he sees it, storage virtualisation has gone through three distinct phases.
First came the logical volume managers like Veritas Volume Manager and StorageTek’s Iceberg. Then came the true storage virtualisation products, the likes of FalconStor’s software products, for example, but also hardware like IBM’s SVC. These he characterises as having relatively basic functionality and a requirement for storage expertise. Against these two he contrasts the latest phase — led by DataCore, of course — and this is the age of the storage hypervisor: a storage virtualisation product, true, but one boasting sophisticated storage management features.
The problem with storage virtualisation as Teixeira sees it is that it has been hijacked as a category by the hardware vendors. It should refer to a software product that’s independent of hardware, he says.
“We’re trying to get people to think again; and like VMware or Hyper-V we want them to think of higher-level supervisory software and interchangeable hardware,” he said.
OK, there’s a logic to this, but it’s arguable that all this does is add confusion. For a start, it’s debatable whether such a product constitutes a hypervisor at all: since IBM first used the term in the mid-1960s, it has meant a supervisory layer overseeing subordinate operating systems, and there’s no way you can call what DataCore oversees OSes.
Then there’s the problem that other vendors are trying to hitch themselves to the hypervisor bandwagon – for example Virsto, which also calls itself a storage hypervisor vendor while selling a read/write logging product that speeds up storage I/O and has nothing to do with storage virtualisation as we know it.
On SearchStorage.co.UK, I really can’t see us ever referring to the likes of DataCore as a storage hypervisor vendor. We need to be able to categorise products clearly, and anything that aggregates disparate physical storage into a single manageable pool will be a storage virtualisation product, whether it comes as hardware or software. We need to do it for the sake of the reader, so he or she can understand the functional essence of the product and compare like with like. Repeating vendor hype would be spreading obfuscation and doing the reader a disservice.
Follow me on Twitter: AntonyAdshead
In the consumer IT world Microsoft was forced, for example, to make the use of Internet Explorer a choice rather than an obligation when you bought a PC/operating system. In the business IT world, the last decade or so has seen the liberation of the operating system from the processing hardware, most notably in the case of Unix flavours/RISC chips and their supersession by Windows and Linux on x86 servers.
Spending time talking to storage vendors at SNW Europe, it’s clear there are competing forces at work in storage too. While some vendors want to sell you the entire software/hardware stack, others want to let you buy point products that allow you to build your own system. They’re like centripetal and centrifugal forces at work – one tends to pull things together, the other flings them apart.
DataCore’s products are one example of the tendency towards a pulling apart of the storage ecosystem in a way that allows the customer greater freedom to choose what they buy. DataCore provides software products — that can reside on a dedicated server or as a virtual appliance on a hypervisor — from which one can build a full-featured storage array with, for example, thin provisioning and CDP. The disk component can comprise direct-attached drives on servers, commodity white-box arrays or existing arrays from storage vendors. DataCore can also auto-tier to anything from server-side cache to the cloud via traditional classes of spinning disk.
And you get all this for a claimed (by DataCore) cost of less than €20,000 for 10 TB of fully featured storage. You’d be hard pushed to get that for less than €100,000 from the storage market leaders, so what’s standing in the way of adoption of software-only storage products?
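Before answering that question, it’s worth spelling out the gap those figures imply. Taking the article’s numbers at face value — DataCore’s claimed €20,000 and a rough €100,000 market-leader figure, both for 10 TB — the back-of-envelope arithmetic is:

```python
# Back-of-envelope comparison using the figures quoted above.
datacore_cost_eur = 20_000    # claimed DataCore cost for 10 TB
incumbent_cost_eur = 100_000  # rough market-leader figure from the text
capacity_tb = 10

sw_per_tb = datacore_cost_eur / capacity_tb
hw_per_tb = incumbent_cost_eur / capacity_tb
print(sw_per_tb, hw_per_tb, hw_per_tb / sw_per_tb)  # 2000.0 10000.0 5.0
```

On those (vendor-supplied, unverified) numbers, that is €2,000 per terabyte against €10,000 — a fivefold difference.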
The key obstacle is the huge marketing machines of the large vendors, said Alex Best, DataCore’s technical business development director. But that won’t last, he said.
“People aren’t dumb, and they know they’re being ripped off every time they need to do a forklift upgrade and go through a six-month deployment project when they must move from array generation A to generation B. They used to swallow this, but they won’t in future. We’re doing for storage what VMware did for servers,” he said.
There’s no doubt that buying integrated storage products is the incumbent method of doing things. And the revenues show it. The likes of EMC, NetApp and HP’s storage division clock up tens of billions of dollars of revenue per year; the software product players added together would barely make a billion.
But then it seems to make perfect sense to allow customers the freedom to choose: to buy the intelligence of storage as a software product and add commodity drives — which are the same in every storage array — as just that, a commodity item, not a branded expansion pack or shelf.
The big storage players clearly think their model is the correct one, that it’s what customers want. For example, IBM-certified storage consultant Kurt Gerecke argued that customers pay the premium for products that lock software to hardware because they want to know the software will work with the hardware and will continue to do so.
“With our products the software that drives them is guaranteed to work with the hardware. The customer wants to know it is tested and will be easy to install. With software products, who will guarantee they work together? It’s a risk assessment, and most customers prefer a turnkey solution and don’t want to invest in complexity. They pay more for IBM than they would for a software-only product, but the extra cost is like an insurance fee,” said Gerecke.
Others say the driver is the channel. Warren Reid, DotHill’s marketing director, said it is the channel that pushes for integrated products. “People in the channel have told us they need a simple product, not one made of many components, with customers going to many places. They want a simple bundle they can quote on easily.”
NetApp, however, shows a slightly different approach. There’s no doubt it is wedded to huge revenues from integrated storage products, but it is also looking to test the waters with software-only products.
John Rollason, EMEA solutions marketing manager for NetApp, said, “There’s not the market there for DIY integration. The market is going the other way; FlexPods show this.” But Rollason also pointed out that NetApp has a limited-availability virtual storage controller for use in VMware hypervisors, the Ontap-V. “Our CTO has said we will do more along those lines,” he said. “If it became something the market wanted, it’d make sense for us to offer it.”
This is one to watch; to see whether a tiny tendency can become a movement.
Follow me on Twitter: AntonyAdshead