Storage Soup


January 26, 2017  12:44 PM

Commvault products rollout promised throughout 2017

Sonia Lelii

Following a quarter of solid revenue growth to end 2016, Commvault Systems Inc. plans a string of product enhancements throughout 2017. The additions are designed to improve Commvault’s performance in the cloud and with software-defined storage and business analytics.

Commvault Wednesday reported $167.8 million in revenue last quarter, a 7% increase from last year. Software revenue of $77.3 million increased 8% year over year, while service revenues of $88.5 million increased 5%. Commvault broke even for the quarter following two straight quarters of losses.

During the earnings call Wednesday, CEO Bob Hammer laid out plans for a Commvault products rollout that will culminate in the Commvault GO 2017 user conference in November.

Hammer said the company plans to add capabilities for business analytics, search and business process automation as part of its strategy to become a full-scale data management player for on-premises and cloud deployments.

“Next month, we will further enhance our offerings with new solutions with industry-leading Web-based UIs and enhanced automation to make it easy for customers to extend data services across the enterprise Commvault solutions,” Hammer said of the Commvault products roadmap. “[We will deliver] some of the key enhancements tied to the journey to the cloud and converged data management.”

The enhancements include new data and application migration capabilities for Oracle applications and the Oracle cloud, big data, fast data and SAP HANA. Commvault already supports Hadoop, Greenplum and IBM’s General Parallel File System.

Products for the AWS cloud

Commvault will also add tools for migrating and cloning data resources to the cloud. These include automated orchestration of compute and storage services for disaster recovery, quality assurance, and development and testing; optimized cloud protection; and recovery offerings within and across clouds to secure data against ransomware risks.

Earlier this week, Commvault added optimized cloud reference architectures for Amazon Web Services (AWS) that will make it easier for customers to implement comprehensive data protection and management in the AWS cloud.

Commvault customers will have the ability to direct data storage to specific AWS services — such as Amazon Simple Storage Service (Amazon S3), Amazon S3 Standard-Infrequent Access and Amazon Glacier for cold storage.
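In AWS terms, that kind of tiering is usually expressed as a lifecycle rule on a bucket, which transitions objects to Standard-Infrequent Access and then Glacier as they age. Below is a minimal sketch using boto3 that shows the underlying S3 mechanism; the bucket name, prefix and day thresholds are illustrative assumptions, not details of Commvault’s integration.

```python
# Sketch of the generic S3 tiering mechanism: a lifecycle rule that
# moves objects to Standard-IA after 30 days and Glacier after 90.
# Bucket name, prefix and thresholds are hypothetical examples.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-backup-data",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```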

Hammer said the amount of data stored using the Commvault software within public environments increased by 250% during 2016.

“When you look at our internal numbers, in both cases, we’ve had strong pull from both AWS and Microsoft Azure,” Hammer said. “The pull from AWS has been stronger, so there’s a higher percentage of customers’ data in AWS, but I will also say that we are gaining a lot of momentum and traction with Microsoft and Azure.”

Hammer said Commvault continues to make progress on its software-defined data service offerings that are in early release.

“More and more of our customers are replacing or planning to replace their current IT infrastructure, with low-cost, flexible, scalable infrastructures, similar to those found in the public cloud,” he said.

“Our teams have been hard at work to embed those cloud-like capabilities directly into the Commvault data platform, so we can ensure the delivery of a new class of active copy management and direct data usage services across an infrastructure built with low-cost, scale-out hardware,” Hammer said.

Other Commvault product upgrades due by the middle of 2017 include new and enhanced enterprise search, file sync-and-share collaboration, cloud-based email and endpoint protection.

Growth dependent on new products

Commvault has been working to dig itself out of a sales slump that began in 2014. Hammer said the company still faces critical challenges, and continued growth depends on its ability to win more large deals. Much of that success will turn on new Commvault product releases.

“Our ability to achieve our growth objectives is dependent on a steady flow of $500,000 and $1-million-plus deals,” he said. “These deals have quarterly revenue and earnings risk due to their complexity and timing. Even with large funnels, large deal closure rates may remain lumpy. In order to achieve our earnings objectives, we need to prudently control expenses in the near-term without jeopardizing our ability to achieve our software growth objectives for our critical technology innovation objectives.”

Commvault added 600 new customers during the quarter, bringing its total customer base to 240,000. Revenue from enterprise deals, defined as sales of more than $100,000 in software, represented 57% of total software revenue, and the number of enterprise deals increased 22% year over year.

January 20, 2017  8:41 AM

Veeam Software cracked $600 million revenue mark in 2016

Sonia Lelii

While much of the storage market is stagnant or down, data protection vendor Veeam Software said it grew revenue 28% in 2016 by expanding its business into enterprises and the cloud.

Veeam, a privately held company, this week reported its financial results for 2016. It claimed $607.4 million in bookings in 2016, which included new license sales and maintenance revenue, compared to $474 million in 2015.

Doug Hazelman, Veeam Software’s vice president for product strategy and chief evangelist, said the bulk of the growth came from its flagship Veeam Availability Suite. The suite handles backup, restore and replication through Veeam Backup and Replication, along with monitoring, reporting and capacity planning in Veeam ONE, for VMware vSphere and Microsoft Hyper-V deployments.

But the Veeam Cloud and Service Provider (VCSP) program, which offers Disaster Recovery as a Service (DRaaS) and Backup as a Service (BaaS), also contributed to the revenue growth, Hazelman said.

The VCSP program generated 79% year-over-year growth in 2016 as Veeam pushed to move upstream into the enterprise. License bookings from enterprise-level customers grew 57% annually.

Veeam reported the VCSP program expanded to more than 14,300 service and cloud providers. The vendor claims 230,000 customers worldwide, and its Veeam Availability Suite protects up to 13.3 million virtual machines, with 1 million virtual machines using the VCSP management product. The company added 50,000 paying customers last year.

“They are not all enterprise customers,” Hazelman said. “It’s (still) a lot of SMB commercial accounts (but) we added 761 enterprise customers in 2016.”

Hazelman said the cloud portion of Veeam’s business helped close many deals. Veeam has four business segments — SMB, commercial accounts, enterprise-level accounts and the cloud.

“The VCSP product is the fastest growing,” Hazelman said. “It’s one of the fastest growing segments. It’s not the biggest in revenue but it’s the fastest growing.”

Last year, Veeam added a fully functional physical server backup product. Veeam started as a virtual machine backup specialist but moved into physical backup at customer request as it expanded into enterprise accounts.

“The physical server did help a lot on closing deals but it didn’t add a lot to the total year number,” he said.

 


January 16, 2017  6:04 PM

Alluxio in-memory storage software gains EMC mindshare

Garry Kranz

In-memory storage startup Alluxio has struck a partnership with Dell EMC.

The news marks Alluxio’s first formal alliance with a North American storage vendor. Alluxio in September integrated its software on Chinese vendor Huawei Technologies’ FusionStore elastic block storage.

The San Mateo-based vendor’s Alluxio Enterprise Edition software will be available on Dell EMC Elastic Cloud Storage (ECS) appliances. ECS is the successor to EMC Atmos object storage.

The exabyte-scale ECS arrays are built using commodity servers and EMC ViPR virtualized storage. Since acquiring EMC, Dell has gradually started moving EMC software-defined storage products to its PowerEdge line.

The ECS private cloud uses active-active architecture to support Hadoop Distributed File System data analytics. Dell EMC sells ECS on turnkey appliances and also as a managed service.

Alluxio Enterprise Edition is a commercial version of the startup’s open source-based virtual distributed in-memory storage software. It allows applications to share data at memory speed across disparate storage systems.

The chief attributes are high availability and high performance. Alluxio allows data to persist in memory on the host to speed real-time data analytics.

Alluxio software is designed to accelerate Apache Spark and other workloads that process data in memory. Storage from disparate data stores is presented in a global namespace.
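Because Alluxio exposes that global namespace through an HDFS-compatible alluxio:// URI, pointing Spark at it amounts to changing a path. A minimal PySpark sketch follows; the master host and port, the dataset path and the column name are hypothetical, and the Spark cluster would also need the Alluxio client jar on its classpath.

```python
# Minimal PySpark sketch: reading through Alluxio's alluxio:// scheme
# so repeated reads are served from the in-memory tier. Hostname,
# port, path and column name are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("alluxio-read-sketch")
    .getOrCreate()
)

# Alluxio presents mounted stores (S3, HDFS, ECS, etc.) under one
# namespace; the first read pulls data in, later reads hit memory.
df = spark.read.parquet("alluxio://alluxio-master:19998/datasets/events")
df.groupBy("event_type").count().show()
```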

EMC is not expected to bundle Alluxio in its software stack. Alluxio CEO Haoyuan Li said the partnership allows EMC to better recommend the in-memory storage to ECS customers that need high performance.

“The benefit we bring is allowing EMC ECS customers to pull data from other storage systems. Previously, you had to move the data manually into the new compute cluster. We automate data movement,” Li said. “We also accelerate end-to-end performance of your applications with our memory-centric architecture, which manages the compute-side storage.”

Partnering with EMC is a feather in the vendor’s cap, although perhaps not as noteworthy as if Alluxio were qualified to run on Dell EMC Isilon scale-out NAS. Big data jobs tend to use HDFS as the underlying substrate, not EMC ECS.

“ECS as a storage platform does not have a huge share of the market at this point, so this partnership won’t have a material impact on Alluxio’s top line. But it’s an interesting partnership and definitely a win for them that could lead to other partnerships with Dell EMC,” said Arun Chandrasekaran, a research vice president at Gartner Inc.


January 11, 2017  4:58 PM

Panzura raises $32 million in funding

Sonia Lelii

Cloud NAS vendor Panzura raised $32 million in Series E equity funding this week to expand its product and give organizations an alternative to what CEO Patrick Harr calls a “dying on-premises model.”

Harr said Panzura will add support for block, Internet of Things and Hadoop interfaces for data analytics to go with its original NAS protocols. He also plans expansion outside the U.S., into the U.K. and Europe.

“We are focusing on scaling our business in two areas,” Harr said. “One is on the channel side and second is the continued expansion of our product portfolio. We are adding additional protocols to consolidate what we view as the dying on-premises model.”

Harr said the on-premises-only storage model will collapse, and he positions Panzura as a cloud-first model for building a hybrid cloud. Since he became Panzura’s CEO in May 2016, the vendor has expanded its hybrid cloud storage controllers and added archiving capabilities. Harr has also hired 18 engineers during his eight months at Panzura.

“We are very much in growth mode,” he said.

Harr said in 2016 Panzura added 100 new enterprise-level customers and expanded its partnerships to include Amazon Web Services, Google, IBM and Microsoft Azure. It also added 26 petabytes of customer enterprise storage.

Panzura’s new Freedom Archive software, which moves infrequently accessed data to public clouds or low-cost, on-premises storage, could bring the vendor into new markets. Target archive markets include healthcare, video surveillance, gas and seismic exploration, and media and entertainment. Freedom Archive is a separate application from Panzura’s flagship Cloud NAS platform, which caches frequently used primary data on-site and moves the rest to the cloud.

Last summer, Panzura launched a new series of cloud storage controllers with more capacity and the ability to expand to handle multiple workloads. The new 5000 hybrid cloud storage controllers replace Panzura’s previous 4000 series.

The E funding round brings Panzura’s total funding to around $90 million. The investment was led by Matrix Partners and joined by Meritech Capital Partners, Opus Capital, Chevron, Western Digital and an undisclosed strategic investor.


January 11, 2017  7:28 AM

Kaminario secures $75 million funding round

Carol Sliwa

All-flash array pioneer Kaminario kicked off 2017 with a cash infusion of $75 million to accelerate global expansion and fuel product support for non-volatile memory express (NVMe) technologies.

Kaminario’s fifth funding round increased its overall total to $218 million since the company launched in 2008. Waterwood Group, a private equity firm based in Hong Kong, led the latest financing effort. Additional investors included Sequoia, Pitango, Lazarus, Silicon Valley Bank and Globespan Capital Partners. Kaminario’s most recent previous funding came in January 2015.

Founder and CEO Dani Golan said Kaminario has more than doubled revenue in each of the last four years. Global expansion and go-to-market efforts will focus on eastern and western Europe, Asia Pacific and the Middle East.

Kaminario will concentrate on incorporating NVMe technologies into the company’s K2 all-flash array. Kaminario currently uses SATA-based 3D TLC NAND flash drives. NVMe-based PCI Express (PCIe) solid-state drives (SSDs) can lower latency and boost performance, and NVMe over Fabrics (NVMe-oF) can extend the benefits across a network.

Golan said Kaminario does not support NVMe SSDs yet because “the price is too high.” He added that the NVMe-oF technology “is not mature enough to run in mission-critical and business-critical environments.”

A handful of new companies are starting to ship products with NVMe drives, but Golan said Kaminario’s NVMe support will probably wait until 2018.

“The ecosystem is not there yet,” he said.

Golan said startups that currently support NVMe use drives directly attached to servers. But, with a mature array platform on the market, Kaminario needs “to drive a full storage software stack over NVMe Fabrics,” he said.

“The big gain is [going to be with] NVMe over Fabrics, because NVMe drives are just media. That’s not interesting. The interesting part is NVMe over Fabrics and NVMe shelves,” Golan said.

Kaminario’s architecture allows customers to add controllers or shelves in any combination, scaling compute separately from storage, Golan said.


December 28, 2016  8:28 AM

Survey says: Business disaster recovery plans need work

Paul Crocetti

While companies may be confident in their disaster recovery strategy, DR planning and testing still have a ways to go, according to results of a survey by data protection vendor Zetta.

Among 403 IT professionals, 88% said they were somewhat or very confident in their disaster recovery. But while 96% said they have some type of DR, 40% responded that their organization lacks a formally documented DR plan. For those with a plan, 40% said they test it once a year, while 28% rarely or never test it.

Similar to Zetta’s findings, a recent TechTarget survey found that companies are generally confident in their business disaster recovery plans. Also similarly, 65% test their DR plan just once a year or less, according to the TechTarget survey.

“Companies need to be more rigorous around how they develop their DR plans,” Zetta CEO Mike Grossman said. That’s especially important given that more than half of the companies in the survey experienced a downtime event in the last five years.

It’s not enough for business disaster recovery plans to ask, “What happens if a hurricane hits?” According to the survey, the most common type of downtime event in the last five years was a power outage: of organizations that had a downtime event in that period, nearly 75% suffered one. Hardware error was the second most common response, at 53%, and human error and virus/malware attacks each registered close to 35%. Only 20% experienced a natural disaster in the last five years.

When people think about disaster recovery, they often think of catastrophic events. “In reality, that’s not what causes the biggest impacts day to day,” Grossman said.

Several outages made news in 2016, including incidents at Delta Air Lines and Southwest. With proper DR testing, organizations can limit the damage such outages cause.

Grossman recommends testing business disaster recovery plans a minimum of once per quarter and preferably once per month.

“Unless you have a continuous process, including testing, you’re not really protected,” Grossman said.

But “not all testing is created equal,” he warned. The more rigorous and real-life, the better.

A lot of companies like to think they’re protected but they’re not, Grossman said.

According to the “State of Disaster Recovery” survey, 55% changed their DR strategy after a downtime event. It’s a positive that companies are paying more attention to the issue but a negative that they underestimate the risk, Grossman said.

For those that made changes following their last downtime event, 55% of organizations changed their DR approach, 55% added DR technology and 39% increased their DR investment. Almost one in four respondents said they increased DR testing.

As IT gets more complex, companies like Zetta need to make DR easier to manage, Grossman said. But how do you take the complexity out of something that’s complicated?

The cloud helps. Zetta is “cloud-first,” Grossman said, with backup in the cloud and failover to the cloud. And security, which has traditionally been a challenge in the cloud, is getting better.

Ninety percent of IT professionals who are using the cloud in their disaster recovery strategy said they are confident in their DR, according to the survey. Seventy-four percent of organizations using only on-premises DR said they are confident in their plans.

“It’s simpler,” Grossman said of the cloud. “It provides better protection.”


December 27, 2016  4:46 PM

Using multiple BDR products doesn’t always translate into faster data recovery

Sonia Lelii

Companies spend a lot of money on disaster recovery products, but that doesn’t always translate into faster data recovery, according to a survey conducted by Quorum titled the “State of Disaster Recovery.”

The report, which surveyed 250 CIOs, CTOs and IT vice presidents, found that 80% of the companies surveyed said it takes more than an hour to recover from a server failure, and 26% said it takes more than two hours. Only 19% said it took less than an hour to recover. Seventy-two percent consider the speed of backup and data recovery “critical.”

“All those backup and disaster recovery (BDR) products aren’t making their recovery any faster,” the report claimed. “While speed is essential for continuity and security… a staggering 80 percent of respondents need more than an hour to recover from a server failure. And it gets worse: more than a quarter need more than two hours.”

Sixty-four percent use more than three different disaster recovery products, with 26% using more than five and fewer than 40% using between one and three.

Moreover, a majority of the respondents said they wanted a method to simplify the management of all the BDR products they are using. Ninety percent of the respondents want to consolidate their disaster recovery solutions into one dashboard.

The report shows that the movement to the cloud has grown. Seventy-five percent of the survey respondents use cloud-based disaster recovery, while 36% use a hybrid model mixing on-premises and cloud DR. Thirty-nine percent use Disaster Recovery as a Service (DRaaS).

Eighty-nine percent have plans for more cloud-based disaster recovery, with 5% stating they have no further plans and 6% saying they “don’t know.”

Disaster recovery products are growing in importance as concerns about security increase. Seventy-seven percent said they have used their disaster recovery products after a security threat event occurred. Fifty-three percent of respondents are more worried about security threats than about hardware failure, backup disk corruption or a natural disaster.

“Natural disasters crashing in on a data center, an employee error or a hardware failure can all pose immense problems for an organization,” the report stated. “But a skilled and willful attack can cripple a brand for years and could cost a literal fortune. Ransomware attacks particularly depend on a team’s inability to recover quickly.”

Companies are diligent about production-level testing: 88% of respondents said they can achieve it with their current disaster recovery products.


December 22, 2016  10:31 AM

Asigra’s data recovery report details how little gets recovered

Paul Crocetti

How much data do you actually recover?

That’s a question that Asigra users answered in a data recovery report.

Featuring statistics gathered from nearly 1,100 organizations across eight sectors between Jan. 1, 2014, and Aug. 1, 2016, the backup and recovery software vendor’s report found that those users recover only about 5% of their data on average.

“People really don’t recover a lot of data,” said Eran Farajun, executive vice president at Asigra. “Ultimately they’re paying like they recover all their data.”

Farajun compared the situation to what many experience with cable bills – customers often pay for hundreds of stations but don’t watch all of them.

Broken up by industry, manufacturing and energy recovered the most, averaging about 6%, according to the data recovery report. Public sector and health care recovered the least, at about 2%.

Users picked file-level data as the most common type restored.

The most common reason cited for a data restoration request was to access a previous generation of data, selected by 52% of users. Ransomware was a major cause of that need, Farajun said.

The second most common reason for data recovery was user error or accidental deletion, with 13%. A lost or stolen device was third with 10%. Interestingly, disaster was only picked by 6% of respondents, according to the data recovery report.

Asigra is working on improving cybersecurity and how it can best combine with data protection, Farajun said. In the face of the growing threat of ransomware, Farajun also suggested organizations educate their employees, have strong anti-virus protection and back up their data.

The average size of a recovery across all sectors was 13 GB.

Farajun described cost as the bane of a company’s relationship with its backup vendor.

“Mostly [companies] don’t feel they can do anything about it,” Farajun said. “You can do something about it.”

In 2013, Asigra launched its Recovery License Model and now almost all of the vendor’s customers use it. Pricing is based on the percentage of data recovered over the course of a contractual term, with a ceiling of 25%.
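As a back-of-the-envelope illustration of how recovery-based pricing works, here is a short sketch. Only the 25% ceiling comes from Asigra’s published model; the per-GB rate and capacities are made-up numbers for illustration.

```python
# Illustrative sketch in the spirit of Asigra's Recovery License
# Model: the fee scales with the share of protected data actually
# recovered during the term, capped at 25%. Rate is hypothetical.

def recovery_license_fee(protected_gb, recovered_gb,
                         rate_per_gb=0.50, ceiling=0.25):
    """Fee = protected capacity * rate * recovery percentage (capped)."""
    recovery_pct = min(recovered_gb / protected_gb, ceiling)
    return protected_gb * rate_per_gb * recovery_pct

# A customer protecting 100 TB who recovers 5% (the report's average)
# pays a fifth of what a customer hitting the 25% ceiling would.
print(recovery_license_fee(100_000, 5_000))   # 2500.0 at 5% recovered
print(recovery_license_fee(100_000, 40_000))  # 12500.0, capped at 25%
```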

Asigra did a healthy amount of research before launching the model. It looked into other markets, such as the music and telecommunications industries, and assorted “fair-pay” cases. Music customers, for example, can now buy one-song downloads vs. an entire album that they may not listen to in its entirety.

“What happened?” Farajun said. “People bought boatloads and boatloads of songs.”

Asigra was nervous when undertaking the new model. It anticipated a three-year dip, but revenue started to go up after 12 months, Farajun said.

So why hasn’t this model caught on more in the backup market?

“There’s no incentive for software vendors to reduce their prices,” Farajun said. “We’re trying to price based on fairness.”

Farajun said the data recovery report vindicates the vendor’s underlying premise.

“People don’t recover nearly as much as they think they do and they overpay for their backup software.”


December 21, 2016  7:29 PM

Druva Cloud Platform zeros in on inactive data

Garry Kranz

Data protection provider Druva has launched platform-as-a-service capabilities to support indexed search queries of data across local and public cloud storage.

The Druva Cloud Platform is designed to help enterprises better manage and use information related to analytics, compliance, e-discovery and records retention. More than 30 RESTful APIs are included to allow third-party vendors to access data sets in Druva storage.

The API calls allow disparate information management applications to pull data directly from the Druva inSync and Phoenix storage platforms. Druva cloud storage uses Amazon Web Services or Microsoft Azure as a target destination for inactive data that companies need to keep for legal and regulatory reasons.

Global source-side data deduplication creates a single gold copy in the cloud. The Druva cloud technology takes point-in-time snapshots of queried data and applies advanced encryption. Changed data blocks are synchronized to deduplicated data sets in Amazon Web Services or Microsoft Azure.
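Source-side deduplication in general works by hashing data blocks on the client and shipping only blocks the store has not already seen. The sketch below illustrates that generic technique, not Druva’s actual implementation; the block size and hash choice are assumptions.

```python
# Generic source-side block dedup sketch: hash fixed-size blocks
# locally and upload only novel ones. Unchanged data is referenced
# by hash, keeping the cloud copy a single deduplicated set.
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks (illustrative choice)

def changed_blocks(path, known_hashes):
    """Yield (hash, block) pairs for blocks not already stored."""
    with open(path, "rb") as f:
        while block := f.read(BLOCK_SIZE):
            digest = hashlib.sha256(block).hexdigest()
            if digest not in known_hashes:
                known_hashes.add(digest)
                yield digest, block
```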

The APIs allow disparate information management systems to communicate directly with Druva to improve data hygiene, said Dave Packer, a Druva vice president of product and corporate marketing.

“We designed Druva Cloud Platform so your data doesn’t have to traverse across corporate networks. We take care behind the scenes to ensure handoffs occur accordingly, without taxing internal systems,” Packer said.

Druva’s SaaS pricing is based on deduplicated data and starts at $6 per user per month.


December 21, 2016  2:54 PM

Panzura wants to provide archiving freedom, for a price

Dave Raffo

Cloud NAS vendor Panzura is expanding into archiving.

The vendor today made available Freedom Archive software that moves infrequently accessed data to public clouds or low-cost on-premises storage.

Panzura CEO Patrick Harr describes Freedom Archive as storage for “long-term unstructured archived data that now sits on-premises on traditional NAS or tape libraries. The key thing is, it’s for active data.”

Harr said target markets include healthcare, video surveillance, gas and seismic exploration and media and entertainment.

Freedom Archive is a separate application from Panzura’s flagship Cloud NAS platform, which caches frequently used primary data on-site and moves the rest to the cloud. Freedom Archive is available on a physical appliance or as software only. It uses caching algorithms and a smart policy manager to identify cooler data and move it from on-premises storage to the cloud. Freedom Archive compresses, deduplicates and encrypts data at rest and in flight.
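The general shape of such a policy is simple: scan the namespace, flag files untouched for longer than some threshold as “cool,” and queue them for the archive tier. A minimal sketch of that idea follows, with a made-up 90-day threshold; Panzura’s actual caching algorithms and policy engine are not described in detail here.

```python
# Sketch of an age-based tiering policy of the kind described:
# files whose last access is older than the cutoff become
# candidates for the archive tier. Threshold is hypothetical.
import time
from pathlib import Path

COOL_AFTER_DAYS = 90  # illustrative threshold

def cool_files(root, cool_after_days=COOL_AFTER_DAYS):
    """Yield file paths not accessed within the threshold."""
    cutoff = time.time() - cool_after_days * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path

# Example: list candidates for the archive tier under /data/projects.
for f in cool_files("/data/projects"):
    print(f)
```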

Freedom Archive supports the Amazon Web Services, Microsoft Azure, Google and IBM Cloud public clouds, and private object storage from IBM, Hitachi Data Systems and Dell EMC. Customers can download the software from Panzura’s website. Pricing begins at less than two cents per GB per month, which does not include public cloud subscriptions. There is a 30-day free trial period for the software. IBM Cloud is offering 10 TB of free storage for 30 days, and AWS will give 10 TB of free storage for 14 days to Freedom Archive customers.

Harr said Chevron, American College of Radiology, NBC Universal, Time Warner Cable, and law enforcement agencies already use Freedom Archive. The product became generally available today.

An expansion into archiving was among the goals Harr laid out when he became Panzura CEO earlier this year.

Harr emphasized Freedom Archive is for active data rather than cold data that rarely if ever needs to be accessed. That means Panzura is not competing with public cloud services such as Amazon Glacier, Microsoft Azure Cool Blob and Google Coldline storage.

“This is complementary to what Google, Azure and AWS do, not competitive,” he said. “Glacier is not for active data, and it’s extremely expensive to pull data back from Glacier. Ours is a hybrid cloud where you still have a performant nature to your data.

“Chevron has to access data in real-time instead of waiting for a slow response that doesn’t meet the business need. In the medical space, you don’t want to have to wait when you pull back an MRI.”

