November 13, 2013 7:44 AM
Posted by: Dave Raffo
A few days ahead of EMC’s ballyhooed official XtremIO launch, Hitachi Data Systems (HDS) made flash news of its own this week. HDS bumped up the performance, capacity and number of arrays supported by the Hitachi Accelerated Flash (HAF) modules that are the heart of its flash strategy.
HDS first brought out its home-grown flash modules for its enterprise flagship Virtual Storage Platform (VSP) array in November 2012. Last July, it added them for the Hitachi Unified Storage (HUS) VM enterprise unified storage system, which HDS pushes as its preferred all-flash platform.
Today it added 3.2 TB HAF modules, double the capacity of the original modules. The 1.6 TB HAF trays are still available. HDS also rolled out new flash optimization software for HUS VM that it claims can deliver more than 1 million IOPS in one system. The optimization software previously delivered 500,000 IOPS. The flash modules are also now available for the HUS 150 midrange array.
To get 1 million IOPS, customers must use at least four HAF trays plus Hitachi’s flash controller. The code is a separate license.
Each HAF is a 2U tray with a controller and up to 12 flash drives. The 3.2 TB drives bring the total capacity per tray to 38.4 TB. An all-flash HUS VM holds up to eight HAF modules for 308 TB with 3.2 TB drives.
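The capacity figures above are straightforward to verify with back-of-the-envelope math. This sketch uses only the numbers quoted in the article; actual usable capacity after formatting and data-protection overhead would be lower.

```python
# Capacity math for an all-flash HUS VM, from the figures quoted above.
DRIVES_PER_TRAY = 12      # flash drives per 2U HAF tray
DRIVE_TB = 3.2            # capacity of the new 3.2 TB modules
TRAYS_PER_SYSTEM = 8      # maximum HAF trays in an all-flash HUS VM

tray_tb = DRIVES_PER_TRAY * DRIVE_TB      # 38.4 TB per tray
system_tb = tray_tb * TRAYS_PER_SYSTEM    # 307.2 TB, the ~308 TB quoted

print(f"{tray_tb:.1f} TB per tray, {system_tb:.1f} TB per system")
```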
Bob Madaio, HDS senior director of product marketing, said HAF gives HDS an advantage in the flash wars over rivals using traditional third-party SSDs.
“We built a lot of smarts right into the device, such as flash management, wear leveling and garbage collection,” he said. “That’s a big differentiator for us.”
November 11, 2013 2:58 PM
Posted by: Randy Kerns
“We missed you at the [name deleted] Institute for Sports Medicine.” This was the title of an email I received, reminding me of when I blew out a knee skiing years ago, and the subsequent rehab process.
The knee injury was not a positive experience. It was one of those events that you use to gauge different points in your life. Among other things, to me it meant no more runs on black diamond trails. But the email didn’t only bring up personal memories. The introductory paragraph also reminded me of marketing messages I’ve seen recently in the storage industry:
“Do you have something that has been bothering you for a while? Sometimes early intervention of an issue can be addressed conservatively through a variety of treatment options. Don’t let a little problem right now become a big problem later!”
This had several negative implications. The first was that I’d hurt myself again and did not seek treatment. This is playing on the probability that I’ve been injured once so I’m likely to do it again. That’s either a reflection on the activities I’m involved in or that I have a tendency to get hurt. The other is that I have not addressed the problem that results. The final sentence is a thinly veiled prediction that it will get worse.
This type of marketing is not much different from what we see in the storage industry – not to mention the hyper-paranoid marketing of information security. Many storage data protection pitches try to get the potential customer to identify with a recent real-world disaster. If the customers have experienced a data loss or unavailability incident themselves, even better.
There are several approaches taken, depending on the focus and product being sold. The easy association is with data protection and the need to guard against high-profile disasters. Headlines involving IT usually involve bad news and protecting IT from the calamities experienced by others is a great sales approach.
There are also several negative sales approaches used for storage systems. One is to avoid the type of failures that result in data unavailability. Purchasing a more reliable, mature system with the best support available is the answer. There are enough examples to remind potential customers of failures with inadequate storage systems or storage software because their functions were not mature (meaning they lacked extensive field experience in critical environments).
Arguing from the negative works with those that have been injured previously. (I can feel my knee twitch a bit now). But it may not work so well with those who have not. A better approach may be to explain the value a solution brings. In most cases, that value must be explained in economic terms.
Unavailability of information has an economic impact. Looking at the potential impact and what it means requires understanding the customers and their businesses. From there, you have to show the prevention alternatives provided by the product and the economics – cost vs. value. This requires more homework and evaluation, but it gives the customer a better solution and a better understanding. It may not be as quick a sales opportunity as associating with a negative, potentially painful, event but it is probably the best for the customer and leads to building trust and follow-on business.
(Randy Kerns is Senior Strategist at Evaluator Group, an IT analyst firm).
November 8, 2013 4:49 PM
Posted by: Dave Raffo
David Flynn, who built Fusion-io into the early market leader in PCI flash storage and left abruptly in May, this week pulled in $50 million in funding for his new software startup Primary Data.
Flynn and Fusion-io co-founder Rick White haven’t given many specifics on their new company. The funding press release said they plan to bring a product to market in 2014 that will solve problems caused by a “new era of storage that spans from flash to cloud.”
I spoke with Flynn recently and he didn’t get much deeper into product specifics than that, but said “I want to be a vendor to help solve problems coming our way. We want to deal with the hard problems of managing distributed data, performance and capacity.”
He added that these management problems are leading to trends such as putting data in the cloud and on bring-your-own-devices (BYOD). “I think we’re at a tipping point,” Flynn said. “Hardware is either being sent off to the cloud or to a person’s own devices. Running a business, you don’t put in your own server infrastructure now, you send it off to somebody in the cloud.”
He said this is changing the way people buy storage, and who they buy it from. Flynn said he expects a lot more organizations to take the lead from the way companies such as Amazon and Facebook are building their architectures.
“The cloud is forcing a distributed model for storage,” Flynn said. “It’s not centralized any more, it’s distributed in an infrastructure. It’s commodity off-the-shelf industry standard platform storage, not proprietary systems from EMC. You think Amazon uses NetApp or EMC? No way, that’s crazy. And anybody who wants to compete with Amazon is not going to use NetApp or EMC. It’s not cost competitive, and you can’t scale it big enough.
“Facebook is having servers built to their specs by Chinese ODMs, and not the major system vendors anymore.”
As for BYOD, Flynn said IT may still consider it a nuisance but will grow to love it. “Didn’t PCs come about in the same way?” he said. “Administrators held on to their mainframes and they hated PCs. I think the same is true for BYOD. It’s inevitable because of productivity improvements. People are doing most of their work on their own devices — tablets, smartphones, and laptops.”
Flynn said flash is in a similar position as the cloud – it’s here to stay but brings a new set of management issues.
“I think there are still significant challenges with how to get the manageability of traditional storage with the performance of a distributed flash architecture and the capacity of distributed cloud object storage,” Flynn said. “There are still significant challenges with how to deploy and use flash and we are still in the early innings with that. But that problem is not just about flash. It’s about distributed storage architectures. That’s a big problem. Doing something centralized makes it easy. Managing a mainframe was easy compared to managing PCs on everybody’s desktop.”
How will Primary Data try to solve these problems? That’s one of the secrets that will be revealed in 2014.
November 7, 2013 10:27 AM
Posted by: Dave Raffo
Twitter isn’t the only technology IPO this week. Security and backup vendor Barracuda Networks began trading on the New York Stock Exchange Wednesday with promising initial results.
While Barracuda is still more security than storage, CEO BJ Jenkins said backup makes up about one-third of Barracuda’s new business, and is increasing year-over-year at a faster rate than the 26% overall market growth.
He attributes that to Barracuda’s end-to-end data protection approach. While it still sells its backup software standalone, most deals are for integrated appliances along with cloud subscriptions for backup and disaster recovery. Barracuda maintains its own multi-petabyte cloud, and Jenkins said most of its backup appliance customers also use it.
“If you back up into the cloud and have an issue locally, you can spin up a virtual server in our cloud and run your business off a deduped backup copy,” Jenkins said. “This end-to-end offering has made a big difference. Customers used to buy Symantec and some kind of disk and tape, and rotate tapes and do replication for DR.”
Barracuda sells mostly to SMB and mid-range companies, competing primarily with Symantec Backup Exec.
Jenkins, who ran EMC’s backup division before becoming Barracuda CEO in Nov. 2012, said one reason Barracuda went public is to gain more credibility with customers who want to know their security and data protection vendors are stable companies. Unlike flash vendor Violin Memory’s first day as a public company, Barracuda’s price rose in the hours after its IPO. Barracuda began trading at $18 – the low end of its projected range – but its shares closed at $21.55 Wednesday.
“I feel good about the first day of trading,” Jenkins said. “We were fortunate to get out before Twitter. They’ve taken a lot of oxygen out of the air.”
November 6, 2013 12:54 PM
Posted by: Dave Raffo
Coho Data this week pulled in $25 million in funding to expand and market the scale-out storage platform it launched into beta last month.
CTO Andy Warfield said new funding will be used to add features to the Coho DataStream series in 2014. The DataStream is a hybrid storage system that combines a software-controlled switch with PCI flash and hard drives that Coho sees as a building block for companies who want Amazon-style storage.
The original product is file-based. Warfield said the vendor has seen little demand for Fibre Channel or iSCSI but there have been requests for FC over Ethernet (FCoE), so FCoE support will likely follow next year. There will also probably be SMB protocol support coming to go with NFS-only in the original version, and deduplication and replication are on the roadmap list.
Warfield said Coho Data will also announce a product upgrade path next year. “You can expect a more continuous and dynamic approach to upgrades than has historically been the case for storage products.”
As Coho Data prepares to make its product generally available, Warfield said the target audience has shifted from the original plan of marketing to small-to-medium businesses (SMBs).
“We found SMBs had no need for the performance we get from our box,” he said. “But we found larger storage environments in the three-petabyte to 10-petabyte range often had performance pain with their existing enterprise storage.”
Coho DataStream micro arrays ship with 3.2 TB of raw flash (Intel SSD 910 PCI cards) and 36 TB of spinning disk. Warfield said the startup uses a pricing model similar to the Amazon AWS provisioned IOPS model. He said 40,000 IOPS with a three-year support contract costs around $2.50 per GB.
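The per-GB pricing quoted above implies a rough system price. The sketch below is an order-of-magnitude estimate only; the article doesn't say whether the $2.50/GB figure applies to raw or usable capacity, so raw capacity is assumed here.

```python
# Rough, illustrative cost estimate from the article's figures.
# Assumption: the quoted $/GB applies to raw capacity (not stated).
RAW_FLASH_TB = 3.2       # Intel SSD 910 PCIe flash per micro array
RAW_DISK_TB = 36.0       # spinning disk per micro array
PRICE_PER_GB = 2.50      # quoted for 40,000 IOPS w/ 3-year support

raw_gb = (RAW_FLASH_TB + RAW_DISK_TB) * 1000   # decimal TB -> GB
est_price = raw_gb * PRICE_PER_GB

print(f"~${est_price:,.0f} for {raw_gb:,.0f} GB raw")
```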
The B funding round was led by new investor Ignition Partners with previous investor Andreessen Horowitz participating, and brings Coho Data’s total funding to $35 million.
November 5, 2013 11:48 AM
Posted by: Dave Raffo
If you’ve never heard of Load DynamiX, that’s probably because until today the start-up was known as SwiftTest. And if you never heard of SwiftTest, that’s probably because until today it only sold its storage validation software directly to storage vendors.
Along with the name change, Load DynamiX today launched a series of infrastructure and application performance validation appliances for IT organizations. The appliances generate massive loads to stress enterprise storage systems, simulate production workloads and validate new devices before putting them into production.
The appliance models include the 10G Base Series with two 10 Gigabit (10 GigE) ports and support for iSCSI and NAS protocol emulation; the 10G Advanced Series with support for NFS 4, SMB 3, HTTP/S, CDMI and OpenStack Swift protocol emulation on top of the Base Series; FC Series with two 8 Gbps Fibre Channel (FC) ports and FC and iSCSI emulation; and the Unified Series with support for two 10 GigE and two FC ports and iSCSI, NFS, SMB 2, FC and SCSI emulation. List prices are $130,000 for the 10G Base, $225,000 for the 10G Advanced, $95,000 for the FC and $180,000 for the Unified appliances.
All of the appliances include Workload Insight Manager, the software the vendor has made available to storage vendors since 2009.
Load DynamiX VP of marketing Len Rosenthal said EMC, Dell, NetApp and Hitachi Data Systems (HDS) use Workload Insight Manager to test their storage arrays.
Rosenthal said each 2U Load DynamiX appliance has the load generation capabilities of 20 servers, and they emulate the I/O profile of applications. He said the appliances are an alternative to using Iometer with a bank of servers. Unlike Iometer, Load DynamiX simulates metadata.
“We’re about understanding changing workloads,” Rosenthal said. “We get people to simulate workloads before going live.”
Rosenthal said GoDaddy.com used Load DynamiX to validate a hybrid solid-state drive (SSD) storage array, significantly reducing its cost before putting it into production. He also said the Healthcare.gov site fiasco was caused at least in part by a lack of load testing before going live.
If you’ve never heard of the Healthcare.gov fiasco, that’s probably because you’ve been spending too much time trying to get your SAN or NAS up to speed.
November 1, 2013 4:23 PM
Posted by: Dave Raffo
Struggling storage vendors Overland Storage and Tandberg Data today confirmed their plans to combine and try to turn two money-losing businesses into a winner. The companies said they have reached an agreement for Overland to acquire Tandberg in an all-stock transaction.
No purchase price was given, but Tandberg will become a wholly owned subsidiary of Overland. Overland CEO Eric Kelly and CFO Kurt Kalbfleisch will remain in their current roles and COO Randy Gast, Overland’s senior VP of worldwide operations and services, becomes COO of the new company. Cyrus Capital, which bought Tandberg out of bankruptcy in 2009, will get two of seven board positions.
On a conference call to discuss the deal, Kelly said the merged companies had more than $100 million in revenue last year – with around $60 million coming from Tandberg – and combining them provides “a clear path to profitability.”
Both companies have struggled on their own. Along with Tandberg’s bankruptcy, Overland has been losing money for years and its fortunes took a steep downturn after it lost a tape OEM deal with Hewlett-Packard in 2005 that accounted for most of its revenue. Overland has been trying to rebound as a storage systems company since then, although it still sells tape drives and libraries to go with its SAN and NAS systems and disk backup.
Tandberg also sells tape libraries and drives, RDX removable disk, disk backup and low-end NAS. Kelly pointed out that Tandberg’s tape and NAS products are sold into a lower end of the market than Overland’s, with little or no overlap between the product lines.
“The product lines are complementary with minimal overlap,” he said.
Overland executives disclosed in May that they were discussing a merger with Tandberg. Kelly said he hopes the shareholder vote needed to close the deal will come by the end of the year.
November 1, 2013 9:39 AM
Posted by: Dave Raffo
Integrated backup appliance vendor Unitrends has new ownership while management remains the same and vows to move deeper into cloud-based data protection.
Insight Venture Partners completed a majority investment in Unitrends this week, giving it control of the startup best known for integrated backup appliances. Insight general partner Mike Triplett said Unitrends’ management team was one of the things he likes about the company. He said Insight will keep Unitrends CEO Mike Coney and his management team while Triplett and Richard Wells of Insight join the Unitrends board.
“There are three things we like about Unitrends,” Triplett said. “We like that it’s in a large and growing market segment, we like the management team, and the product is head and shoulders above the competition.”
Triplett likes the market so much that he also sits on the board of virtual machine backup software specialist Veeam Software and Acronis – two other Insight investments. He said he’s not concerned about being involved with companies that compete with each other because there is plenty of backup to go around.
“The market is big enough that everyone can prosper and do well,” he said.
Coney became Unitrends CEO in 2009 after working for Acronis and Veritas (now part of Symantec). He said Unitrends has about 260 employees and he expects to grow substantially with the Insight investment. Although it does not disclose revenue and income figures, Unitrends claims it has grown revenue in 19 straight quarters and its year-over-year bookings increased 72% last quarter.
Coney said the vendor will continue to build on its integrated appliance platform, but “the biggest roadmap area for us is the cloud and DR as a service.” He said those plans include selling to managed service providers, offering its appliance customers options for replicating to the cloud for DR, and connecting to public clouds such as Amazon and Microsoft Azure.
He said Unitrends will maintain its focus on the mid-market – companies with 50 to 1,000 customers.
October 30, 2013 12:52 PM
Posted by: Dave Raffo
CommVault went against the grain and reported better-than-expected financial results last quarter. That makes the backup software vendor “public enemy number one” to its larger competitors, according to CEO Bob Hammer.
CommVault’s revenue of $141.9 million last quarter grew 20% from the previous year and six percent over the previous quarter. The revenue figure and the company’s $17.4 million net income beat Wall St. expectations. That comes after EMC, Symantec and IBM all missed expectations, including slow growth or declines in backup software.
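For context, the growth rates above imply the following prior-period figures, which are a quick sanity check derived from the article rather than numbers CommVault reported directly.

```python
# Implied prior-period revenue from the growth rates in the article.
current_rev = 141.9     # $M, last quarter's reported revenue
yoy_growth = 0.20       # 20% year over year
qoq_growth = 0.06       # 6% quarter over quarter

year_ago_rev = current_rev / (1 + yoy_growth)   # ~$118M a year earlier
prior_q_rev = current_rev / (1 + qoq_growth)    # ~$134M the prior quarter

print(f"~${year_ago_rev:.1f}M year-ago quarter, ~${prior_q_rev:.1f}M prior quarter")
```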
Still, CommVault is not immune from problems plaguing the storage industry, such as slow federal government spending and companies’ cautious approach to closing big deals. Most of all, it faces pricing pressure from the big boys of data protection.
When asked if larger competitors Symantec, EMC and IBM are doing anything different competitively, Hammer said they were coming up with “tricky, crazy pricing initiatives” such as deep discounts and product bundling.
“Those guys are completely irrational in their pricing policies,” Hammer said on CommVault’s earnings call with analysts. “We’ve become public enemy number one. So any tricky, crazy pricing initiative they can possibly think of, they throw at customers and we’re pretty savvy in understanding what those are and can parry them pretty well. But that’s their primary weapon. We’re pretty well attuned to what each of these different vendors are doing there and respond accordingly. So my answer to them is, bring it on.”
CommVault has some tricks of its own to play in the form of new features for its Simpana 10 platform. Hammer said CommVault will bolster Simpana 10 “in the very near future” with products including enhanced archiving for Microsoft Exchange and SharePoint, self-service try-and-buy products for SMBs, features for virtual machine administrators and more partners for its IntelliSnap array-based snapshots.
All of that goes with the Reference Copy archive option CommVault added last week, which allows customers to index and classify data and move it to low-cost storage.
Unlike several storage companies, CommVault did not have to reduce its forecast for this quarter although Hammer admitted there are possible pitfalls ahead. Although CommVault reported its revenue from the U.S. federal government increased 43% from last year, Hammer said “We are particularly cautious about U.S. federal government spending due to uncertainty associated with the recent fiscal impasse.” He also said he expects “softness” in big deals of greater than $500,000. “Many in the industry have reported big deal cancellations and pushouts,” he said.
Enterprise deals – which CommVault defines as $100,000 and up – only increased three percent last quarter.
“We understand we’re in a weak environment and also lumpy,” Hammer said. “So when you start getting into possibly seven-figure deals which makes a difference in our performance, we’re just issuing a concern. The positive is that the opportunities are there and the negative is we’re in an environment where those deals get pushed out, and there could be some future problems.”