Although CommVault had a strong quarter to finish 2012, CEO Bob Hammer spent nearly as much time on the backup vendor’s earnings call this week talking about what is coming rather than what happened.
Without going into too much technical detail, Hammer previewed Simpana 10 ahead of its Feb. 25 rollout. He said the next version of the Simpana backup and management software will include current core technologies such as deduplication and snapshots while adding features to help attract customers in new markets.
Hammer said Simpana 10 will include new features for protecting data on mobile devices and analytics — areas that have not been well-served by traditional backup products — and increased capabilities for storing and managing data in private and public clouds.
Some of the features will include next generation archiving, improved integration with shared services, and more comprehensive reporting and monitoring.
“We’re doing all of this in a way nobody else has done it before,” Hammer said in an interview after the earnings call. “We have massive scale and automation and multi-tenancy, and only make one copy for backup and archive.”
Hammer said the changes are needed to reflect the way customers now protect and manage data, which is created much faster and has more regulatory requirements tied to it than ever before.
“Data management technology has to change,” he said. “With mobile data, customers are having problems protecting data because the technologies they’re using today aren’t adequate to back up in an adequate timeframe. They don’t have time to back it up, and when they do, they have difficulty finding it.”
Part of the scalability for Simpana 10 will be inclusion of an object-based repository, Hammer said. He said Simpana will handle metadata separately, the way object-based storage systems do. “And it’s open, so a third party can write to it,” Hammer said. “You can crawl the network and get data in in a different way than we’ve done in the past. You can search the metadata before you search objects, then do a deeper dive on the content itself.”
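The metadata-first search pattern Hammer describes can be sketched in a few lines. This is a toy illustration only, assuming a simple two-pass lookup; the class and method names here are hypothetical, not Simpana APIs:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectRepo:
    """Toy model of an object repository that keeps metadata separate
    from content, loosely following Hammer's description."""
    objects: dict = field(default_factory=dict)    # object id -> content
    metadata: dict = field(default_factory=dict)   # object id -> metadata dict

    def put(self, oid, content, **meta):
        self.objects[oid] = content
        self.metadata[oid] = meta

    def search(self, term):
        # Pass 1: a cheap scan of the metadata index narrows the candidates.
        candidates = [oid for oid, m in self.metadata.items()
                      if any(term in str(v) for v in m.values())]
        # Pass 2: the "deeper dive" reads content only for those candidates.
        return [oid for oid in candidates if term in self.objects[oid]]
```

The point of the two passes is that the metadata index is small and fast to scan, so the expensive content reads happen only for a shortlist of objects.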
CommVault is doing fine with Simpana 9. Its $128 million revenue last quarter was up 24% over the previous year and 8% over the prior quarter. Hammer said he expects to hit $1 billion in annual revenue in a few years, with Simpana 10 the catalyst.
Fusion-io has greatly benefitted from Facebook and Apple for more than two years, with those companies buying tens of millions of dollars’ worth of Fusion-io PCIe flash cards almost every quarter. Now Fusion-io is seeing the downside of having two customers make up most of its business.
Fusion-io dropped its revenue forecast for this quarter to $80 million from a previous estimate of around $137 million because Apple and Facebook won’t make any meaningful purchases this quarter or next. Fusion-io expects to lose from $10 million to $15 million this quarter after making $13.9 million last quarter. Its stock price fell 22 percent Wednesday after it revealed the forecast drop, and fell another 13 percent today.
Last quarter, Facebook bought $41 million of Fusion-io products and Apple bought $19.3 million, making up a little more than half of Fusion-io’s $120.6 million in total revenue. In the previous quarter, Facebook and Apple bought $33 million worth of Fusion-io products apiece.
Now the spigot has been turned off, at least temporarily. Fusion-io CEO David Flynn spent a lot of his earnings call Wednesday assuring analysts that Apple and Facebook will return to their status of prolific flash buyers after the six-month lull.
“This is really about the timing of when they put in new infrastructure, not whether or not Fusion-io is a key part of that infrastructure,” Flynn said.
He said Fusion-io has proven its value to those companies, adding “when they pull the trigger for deploying is not in our control.”
Fusion-io could have other problems in the coming months. EMC is pledging more PCIe-based flash products following its VFCache caching software launched in 2012. And Seagate this week invested $40 million and signed a reseller deal with Fusion-io rival Virident Systems. Seagate will sell Virident PCIe flash products through its OEM partners and channel partners.
Flynn shrugged off the Seagate-Virident news, saying “the business is not driven at the OEMs. The business is driven at the end-user customer. So having companies that build components, really they won’t be at the point where the competition is actually happening, and that’s with the end users.”
On a more optimistic note, Flynn said the ioScale cards Fusion-io launched this month are off to a good start. The high-density, low-cost ioScale cards scale to 3.2 TB and are aimed more at cloud providers and telcos than at enterprises’ mission-critical apps. Flynn said ioScale products were involved in five of the vendor’s 10 orders of $1 million or more last quarter.
Hard-drive giant Seagate is carrying out its flash strategy one piece at a time. This week it added PCIe flash to its portfolio by making a $40 million investment and signing a reseller deal with Virident Systems.
Seagate will sell Virident’s Flash Max II PCIe solid-state cards and will jointly develop new products with Virident. Seagate is Virident’s fourth strategic investor, joining Intel, Cisco and an unidentified large storage vendor that industry sources say is EMC.
Seagate’s flash strategy is to partner rather than develop products on its own. It sells Pulsar solid-state drives (SSDs) through a co-development deal with Samsung and it has invested in flash controller startup DensBits.
“We’ll have a stack of solid state solutions that span from the client all the way through to the flash server high-application data demands of the data center,” Seagate chief sales and marketing officer Albert “Rocky” Pimentel said Monday on Seagate’s earnings call with analysts.
Virident launched its Flash Max II single-level cell and multi-level cell cards in August 2012, ranging in capacities from 550 GB to 2.2 TB. Virident’s main competitor is PCIe flash leader Fusion-io.
In a report on the Seagate-Virident deal, Objective Analysis semiconductor analyst Jim Handy wrote that he expects more than two million PCIe SSDs to ship by 2016. He expects PCIe SSDs to outsell SAS-based SSDs.
Seagate has been criticized for moving slowly in the SSD market, but Handy attributes that to the vendor’s exhaustive testing of flash products before releasing them. “With this move Seagate has moved from being a non-participant in the PCIe SSD space to an enviable position with one of the highest-performing products available,” Handy wrote.
Virident now has $116 million in funding from venture capitalists and strategic investors. It closed a $26 million round in September when Mike Gustafson replaced founder Kumar Ganapathy as CEO.
Actifio bills itself as the “radically simple copy data storage company,” but apparently its pricing and packaging models weren’t radically simple enough. So Actifio 5.1, released today, has new packaging and pricing, including a turnkey solution combining storage and software, a gateway version without storage, and a utility pricing model.
Actifio claims it made 62 sales in the fourth quarter of 2012 for an average price of more than $200,000 with its appliances that change the data protection process. But the pricing could confuse customers used to more traditional backup.
“Customers told us, ‘It can be difficult to comprehend exactly what I’m buying,’” said Andrew Gilman, Actifio’s senior director of global marketing. “So we spent time crafting our packaging. We have pre-packaged configurations that make it easier to understand what’s showing up on the data center floor, and how it can be deployed quickly.”
The Actifio 100T turnkey system comes in eight standard configurations, ranging from 7 TB of protected data for remote offices to 100 TB for enterprises. Customers can stack 100 TB configurations up to 2 PB.
Actifio Gateway lets managed service providers use their existing hardware to build clouds and services based on Actifio’s copy data technology. Its software provides multitenancy and is designed to work on a mix of virtual and physical servers and storage devices.
The utility pricing model lets customers pay as their data volumes grow. It includes software, storage, maintenance and installation, starting at around $4,000 per TB per year for 30 TB and dropping to around $600 per TB in larger volumes.
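The arithmetic behind the utility model is straightforward. In this sketch, only the two per-TB rates come from Actifio’s figures; the function and the exact tier boundaries are illustrative assumptions:

```python
ENTRY_RATE = 4000    # approx. $/TB/year at the 30 TB entry point, per Actifio
VOLUME_RATE = 600    # approx. $/TB/year quoted for larger volumes

def annual_cost(tb, rate_per_tb):
    """Yearly utility-pricing cost for a given protected capacity."""
    return tb * rate_per_tb

# 30 TB at the entry rate comes to about $120,000 per year;
# the same capacity at the high-volume rate would be about $18,000.
```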
Gilman said Actifio began piloting the utility model in late 2012, and it is now available on all models with more than 30 TB. Customers can sign up for three-year or five-year subscriptions.
Gilman said Actifio will let its channel and service provider partners decide whether to use the utility model for their customers.
Actifio also added features in 5.1. They include bare metal restore to enable rebuilds of Windows and Linux hardware, the ability to run non-disruptive failover testing, and the ability to fail back and SyncBack after a disaster by moving only changed application data from the DR site to the primary site.
EqualLogic founder Paula Long has a new storage startup, and she says its goal is to do for data scientists what EqualLogic did for storage administrators.
Long was one of three founders of iSCSI storage pioneer EqualLogic, which proved Ethernet can play a major role in network storage. She teamed with former EqualLogic marketing VP John Joseph a year ago to start DataGravity, which today pulled in $30 million in Series B funding to give it a total of $42 million with no products due for at least a year.
DataGravity CEO Long and president Joseph won’t give too much detail on what the company is up to, except to say it’s about gaining business intelligence from data in an easier way than is possible today.
“At EqualLogic our mission was to put an IT administrator in every storage array so an IT generalist could become a storage expert quickly and not spend a lot of time managing storage,” Long said. “We also went for the all-inclusive approach instead of dim sum although everybody thought we were nuts. We took a complicated solution and made it simple.
“Now we’re taking storage experts and marrying them with analytics and database experts, and deriving what happens when you put all those people together. We’re taking EqualLogic’s core principles of simplicity and everything packaged together, and moving that into delivering data intelligence in a simple way.”
The dim sum approach she referred to is the way the major storage vendors sold all of their software features as separate licenses. EqualLogic included all of that in the price of the array, which fit well with its strategy of offering Ethernet storage that was easier to manage than Fibre Channel and didn’t require expensive FC networking devices. EqualLogic was poised to go public when Dell acquired it for $1.4 billion in 2008.
“Last time, we looked at iSCSI, SAS, SATA, and all the key enabling tools,” Joseph said. “In this company, we’re looking at a different level of the problem. The focus is now on data and the value that data brings to someone’s industry.”
Long said she expects DataGravity to start shipping product in 2014. She did say it would sell devices and not just software. When asked if the systems would be similar to big data analytics products such as Oracle Exadata, EMC Greenplum, and IBM Netezza, she said “I don’t think there’s anybody similar to us. I’ve seen slices of what we’re doing, but not the same solution.”
Venture capital firm Andreessen Horowitz led today’s round, with previous investors Charles River Ventures and General Catalyst Partners also participating. Andreessen Horowitz partner Peter Levine joins DataGravity’s board.
Long said DataGravity went for a B round early to get Andreessen Horowitz and Levine on board. “We’re frugal New Englanders,” she said. “We weren’t planning on raising money until the fourth quarter of this year. But Peter approached us. We thought having him join us early so he can help shape the company would be a great help.”
Levine is the former CEO of XenSource and executive vice president of Veritas Software long before Symantec acquired it. He’s also a current board member of startup Actifio. He said a startup’s leadership plays a key role in his decisions to invest.
“The reason we invested in this company is because of Paula and John,” he said. “No one in storage is more strategic or better able to articulate the storage market than Paula. The concept of turning data into information is quite profound.”
When asked what challenges DataGravity faces, Levine said “Getting a product out is always a challenge. And any time you build a new category, there’s the education and evangelism involved with going after a new market.”
Taking a step back and looking at the new or updated storage systems coming out reinforces trends I see for storage hardware in the industry. By storage hardware, I’m referring to physical system components such as the controller cards, processors and data flow electronics.
The majority of the underlying hardware used in storage comes from standard components. Many of the available storage systems use Intel-based server designs. The value to the vendor from the sale of a storage system comes primarily from the embedded software running on that hardware and associated server software used in conjunction with that storage system.
The embedded software features and capabilities that differentiate one storage system from another are the product characteristics vendors focus on most. For most storage vendors, the software-based features are categorized and put into suites of functionality that carry additional charges over and above the basic system costs.
Storage systems that cannot continue to add unique value gravitate to a price based on the hardware costs with some added margins. Characterized as a “race to the bottom,” the undifferentiated systems are generally referred to as commodity storage.
The view that storage hardware is generally moving to commodity status may be a natural evolution, but it is personally disappointing. Manufacture of the common platform for storage systems has moved to the lowest-cost regions of the world. The equity of low wages is a different topic, but commoditization goes hand in hand with larger-scale production, which magnifies the cost-reduction value.
There is a parallel that comes to mind: the U.S. steel industry. The technology and production processes for the steel industry were developed and refined in the U.S. over time. But production has moved to lower-cost regions of the world based on price pressures. When a manufacturing industry moves, the investment in new production facilities is also used to modernize production.
The surface level similarity between the storage hardware system base and the steel industry has several factors: production moved to follow lowest cost manufacturing, underlying technology was not changing, and investments in the industry were to make manufacturing more efficient.
There still are custom hardware-based storage systems in the industry, but they are a minority, and the investments in new ideas and hardware technologies seem limited. The competitive price pressures and the continued investments required for innovation make it more challenging to sustain a company with novel technology.
(Randy Kerns is Senior Strategist at Evaluator Group, an IT analyst firm).
Symantec executives Wednesday indicated backup and storage would continue to play a big role in its future, especially after a strong fourth quarter for NetBackup.
Symantec folded its earnings report into a two-hour presentation focused mainly on outlining the company’s strategic direction under CEO Stephen Bennett, who replaced Enrique Salem last July.
Symantec executives said NetBackup revenue increased 12% year-over-year, highlighting the company’s overall 4% revenue increase. Bennett and CFO James Beer said NetBackup appliances continue to sell well. They did not mention Backup Exec, which the execs have said needed fixing in the past few earnings calls. Backup Exec had a tough year, following a poorly received upgrade of the SMB backup application at the start of 2012.
Symantec’s storage management product revenue grew 1%, its second straight quarter of growth after several quarters of decline.
Despite the good results, last quarter’s earnings were overshadowed during the presentation by Symantec’s future plans. NetBackup would seem to figure prominently in those plans, especially on appliances.
The execs also said they would simplify licensing and improve technical support for all products. Bennett said he intends to make it easier to get support through other methods than by phone. “We weren’t delivering technical support experiences for our customers that we were proud of,” he said.
The Symantec execs included an object storage platform on its list of 10 product areas it will focus on. It’s not clear if that would be a new platform or object storage built into FileStore. In 2010, Symantec folded development of its nascent S4 object storage file system into FileStore.
We’ll have to wait for details on Symantec’s object storage plans. As Bennett said Wednesday, “We have nothing to announce today” as far as product specifics.
Symantec didn’t announce layoff specifics either, although Bennett said there would be a $275 million severance charge. Bloomberg reported that up to 1,000 jobs will be cut, mostly middle managers. Some will be replaced by new hires in research and development. And Reuters reported that Bennett does not plan to sell any of Symantec’s assets for now.
“We’re going to have bigger, better jobs for fewer people,” Bennett said.
Bennett will actually have one fewer job. He is giving up the chairman role he also held. Symantec split the CEO and chairman roles, naming board member Dan Schulman as the new chairman.
Bennett said he spent much of the last quarter of 2012 on a world tour meeting with customers, employees and investors while putting together the company’s forward strategy. He said the next year or so will be one of transition.
“We have to change the tires while the car is running,” Bennett said. We know which customer he got that idea from.
When startups raise funding, contributors often include strategic investors who hope to benefit from the technology the new company develops. Storage startups usually turn to technology industry giants for strategic investments.
Cloud storage startup Symform is going in a different direction. Symform today said it received a $3 million strategic investment from Second Century Ventures (SCV), a venture capital fund of the National Association of Realtors (NAR).
Seattle-based Symform’s cloud consists of local drive space contributed by its subscribers. So maybe you can say it’s in the realtor business because it sells real estate on customers’ storage.
But the real estate trade association didn’t fund Symform because it considers it a kindred spirit. NAR will make Symform subscriptions available to its 1.1 million members as a membership benefit. The realtors add cloud backup, and Symform adds to its subscription base.
Symform bills itself as a crowd-sourced cloud network. Organizations join the network by contributing extra local drive space in exchange for fast and secure backup. Besides providing backup, the network also synchronizes data from any device.
“We see ourselves as the Skype of data storage,” Symform CEO Matthew Schiltz said.
The startup’s technology encrypts data with 256-bit AES. It breaks data into 64 MB blocks, divides each block into 64 1 MB fragments, and randomly distributes them across separate devices in the Symform cloud storage network. The architecture regenerates and redistributes missing fragments even when devices fail, according to Symform technical documents.
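The block-and-fragment step can be sketched roughly as follows. The block and fragment sizes come from Symform’s documents; the function names and the zero-padding choice are assumptions, and the real service layers encryption, redundancy, and random placement around this step:

```python
BLOCK_SIZE = 64 * 1024 * 1024    # 64 MB blocks, per Symform's documents
FRAGMENT_SIZE = 1024 * 1024      # 64 fragments of 1 MB each

def fragment_block(block):
    """Split one (already encrypted) block into 64 one-MB fragments,
    zero-padding a short final block (the padding scheme is an assumption)."""
    padded = block.ljust(BLOCK_SIZE, b"\x00")
    return [padded[i:i + FRAGMENT_SIZE]
            for i in range(0, BLOCK_SIZE, FRAGMENT_SIZE)]

def fragment_file(data):
    """Break a byte stream into 64 MB blocks, then fragment each block."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    return [fragment_block(b) for b in blocks]
```

Scattering the resulting 1 MB fragments across many unrelated devices is what lets the network tolerate individual device failures: any missing fragment can be regenerated and redistributed from the surviving copies.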
The technology is integrated with NAS storage devices from Synology, Netgear and QNAP. The company claims it has active users in 138 countries, up from 46 at the end of 2011. It also claims more than 7 billion fragments now are stored in the Symform cloud.
The $3 million is officially part of an $11 million Series B funding round Symform announced last April. It initially raised $1.5 million in seed money, so the company has raised a total of $20 million in funding. Its previous investors include Longworth Venture Partners, OVP Venture Partners and WestRiver Capital.
Brocade completed its CEO search this week, hiring Lloyd Carney to replace Mike Klayko.
Klayko said last August that he would step down as soon as the board found a replacement, ending an eight-year tenure as Brocade’s CEO. During that time, Brocade acquired its main storage switch rival McData, outdueled Cisco for the top spot in storage networking revenue and spent $2.6 billion to get into the Ethernet market by acquiring Foundry Networks. But Klayko failed to attract a buyer for Brocade despite a great deal of speculation that the company was for sale several times over the past few years.
If you’re looking for hints on where Carney might take Brocade, two things about his resume stand out. First, he has little storage background and plenty of network experience. Second, he sold off two companies he ran – Xsigo Systems to Oracle last year and Micromuse to IBM in 2005.
Xsigo is the closest Carney has come to a storage company. Xsigo actually did I/O virtualization and was more of a networking play, but did work with storage gear. After Oracle bought Xsigo, it tried to recast its technology as software-defined networking. IBM acquired network management software vendor Micromuse for around $865 million, and Carney stayed with IBM for one year to run the Micromuse division.
He has also been COO at Juniper Networks, president of Nortel’s wireless internet division and a vice president at Bay Networks as well as CEO of his own angel investment firm. Carney obviously knows his way around Silicon Valley, which could help if Brocade puts itself up for sale again. If not, you can expect the vendor to continue its push to become an Ethernet network leader while holding on to the No. 1 Fibre Channel network spot for as long as that market remains lucrative.
There have been plenty of acquisition rumors around Brocade over the years, despite Klayko’s insistence in 2009 that the company was not for sale. Hewlett-Packard and Dell were believed to be considering buying Brocade before they acquired other networking companies, and there has also been talk of private investors buying Brocade.
A two-tier data archiving approach can help free primary storage capacity, reduce expenses from regular data protection, and meet compliance or business requirements for specific data.
A two-tier strategy divides archive data based on the probability of accessing it: one tier is an online archive where data remains directly accessible, and the other is a deep archive where access may be more involved.
An online archive has these characteristics:
- Data can be transparently and directly accessed by users or applications without other intervening processes.
- The time to access data is only nominally affected compared with primary storage, with no impact on users and applications.
- Typically NAS is used because the largest amount of archived data is in the form of files. There may also be support for objects depending on systems in use.
- The online archive has support for compliance requirements such as immutability with versioning of files, audit trails of access to data, and regular integrity checking of the data.
- The storage is much less expensive than primary storage.
- Only changed files are replicated for protection.
- Systems have built-in longevity with automatic, transparent migration to another platform. The migration has no impact on operations or staff.
A deep archive would have different characteristics than the online archive, including:
- Data moved to the deep archive is not expected to be needed again for any normal processing.
- Access from a deep archive may require more time than applications can tolerate.
- Data may be stored in the form of objects with metadata about ownership and retention controls in order to permit massive scaling. The storage could be on local systems or in a cloud service.
- Longevity is handled automatically by the systems or service with transparent migrations.
- Compliance features are fully supported including digital data destruction.
- Protection is automatic with geographically separated replicated copies.
There is justification for a two-tier archive. You can gain large savings by moving data not expected to be used again to the lowest-cost storage without compromising protection, integrity, or longevity. Economic models show the advantage and the compounding value over time as data is retained and more is added. New systems and software that support object storage at very large scale, with transparent migration for longevity, are enabling wider usage. For all of these reasons, a two-tier archive is a good fit for a storage strategy.
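The economics can be illustrated with a toy model. All of the per-TB rates below are hypothetical round numbers chosen for illustration, not figures from the column; the point is only the shape of the savings:

```python
# Hypothetical $/TB/year rates for each tier (illustration only).
PRIMARY_RATE = 2000
ONLINE_ARCHIVE_RATE = 400
DEEP_ARCHIVE_RATE = 50

def yearly_cost(total_tb, online_frac, deep_frac):
    """Annual storage cost with the given fractions of data moved
    to the online archive and deep archive tiers."""
    primary_tb = total_tb * (1 - online_frac - deep_frac)
    return (primary_tb * PRIMARY_RATE
            + total_tb * online_frac * ONLINE_ARCHIVE_RATE
            + total_tb * deep_frac * DEEP_ARCHIVE_RATE)

# 100 TB entirely on primary storage: $200,000/year.
# Move 30% to an online archive and 50% to a deep archive: about $54,500/year.
```

Because archived data keeps accumulating while primary data turns over, the gap between the two curves widens every year, which is the compounding value the column refers to.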
(Randy Kerns is Senior Strategist at Evaluator Group, an IT analyst firm).