Keeping data safe when employees leave a company comes down to fundamental data protection best practices: backup, archiving and encryption. Yet barely half of the organizations that took part in a recent survey have a plan that ensures data can be recovered if an employee changes or deletes it on the way out the door.
Osterman Research conducted a survey of 187 IT and human resources professionals in October 2016 and released the findings this month. The results show organizations are generally not prepared for data theft protection issues with departing employees, said Osterman Research president Michael Osterman. The report found that fewer than three in five organizations have a backup and recovery platform that ensures data can be recovered if an employee maliciously changes or deletes data before giving notice to leave.
“They know what to do, they’re just not doing it very much,” Osterman said.
Osterman suggested organizations should develop a plan for this issue and nail down who’s in charge of ensuring sensitive data is protected.
The report found that 69% of the business organizations surveyed had suffered significant data or knowledge loss from employees who had left.
Those employees may not have taken data maliciously. According to the report, there are three reasons employees leave with corporate data: They do it inadvertently; they don’t feel that it’s wrong; or they do it with malicious intent.
Mobilizing mobile protection
The BYOD movement has complicated matters. For example, an employee can create content on a personal mobile device and store it in a personal Dropbox account or another cloud-based system. That content never hits the corporate server.
“Get control over that kind of content,” Osterman said. One way to do that is to replace personal devices with ones managed by IT.
Virtual desktops can also aid data theft protection. Because they store no data locally, virtual desktops make it more difficult for employees to misappropriate data, the report said.
The report stressed it is important that “every mobile device can be remotely wiped” so former employees don’t have access to the content.
“Enterprise-approved apps and any associated offline content can be remotely wiped, even if the device is personally owned,” the report said.
Backup, archive, encrypt
A proliferation of cloud applications also makes it harder to recover employee data.
“While IT has the ability to properly back up all of the systems to which it has access, a significant proportion of corporate content, when stored in personally managed repositories, is not under IT’s control,” the report said. “Office 365, as well as most cloud application providers, do not provide backup and recovery services in a holistic manner, and so organizations can have a false sense [of] security about the data that is managed by their end users.”
To maintain complete visibility of sensitive corporate data across all endpoints, cloud applications and other storage repositories, the report suggests deploying a content archiving system.
“Email archiving is the logical and best first place to start the process of content archiving, but other data types — such as files, social media content, text messages, web pages and other content — should also be considered for archiving as well,” the report said.
The data theft protection report advocates encrypting data in transit, at rest and in use, regardless of its location. In addition to manual encryption, Osterman Research recommends encryption that automatically scans content based on policy and then encrypts it appropriately.
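The policy-based scan the report describes can be sketched in a few lines. This is an illustrative example only, not the mechanism of any product mentioned here: the policy names and regex rules are invented, and a real system would hand the matches to an encryption engine rather than stop at classification.

```python
import re

# Hypothetical policies: regex patterns that mark content as sensitive.
# Both rule names and patterns are assumptions for illustration.
POLICIES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
}

def needs_encryption(content: str) -> list[str]:
    """Return the names of policies the content matches,
    i.e. the reasons it should be routed to encryption."""
    return [name for name, pattern in POLICIES.items()
            if pattern.search(content)]
```

In a deployment, anything returning a non-empty list would be encrypted automatically instead of relying on users to encrypt by hand.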
“Encryption alone can prevent much of the data loss that occurs when employees leave a company,” the report said.
Report ‘hit a nerve’
In a fairly decent economy, approximately one in four employees will leave a company in a year, Osterman said.
An Osterman Research client originally suggested the organization undertake the data theft protection report.
“I think it hit a nerve with a lot of companies,” Osterman said.
The sponsors of the report were Archive360, Druva, Intralinks, OpenText, Sonian, Spanning, SyncHR and VMware.
The fundamental goals of the report were to make people more aware of the issue and what can happen if they are not careful with data, and to raise awareness about backing up data and archiving, Osterman said.
Quantum’s scale-out storage business is growing like a weed, with the help of a large weed grower.
You have to love a market where deals include tape plus disk, and range from law enforcement to legal marijuana merchants. The Quantum video surveillance storage business last quarter included all of that.
Quantum CEO Jon Gacek said the company closed the most video surveillance deals ever last quarter. Running through a list of large wins, he included police departments in Canada and India, as well as smaller law enforcement agencies and “a company focused on [the] emerging cannabis growth market, where surveillance of the facility is critical.”
Each large Quantum video surveillance deal included StorNext software, disk plus tape, “reinforcing the power of our tiered storage value and expertise,” Gacek said on Quantum’s earnings call Wednesday.
Flash, dense drives push disk backup deals
Quantum’s disk-based backup revenue grew 17% year over year to $22.9 million. That success came after the release of the enterprise DXi6900-S deduplication library, which uses flash to speed up data ingest. The 6900-S also includes Seagate 8 TB self-encrypting hard disk drives. Gacek said DXi libraries won seven-figure deals at an Asian taxation department and a European insurance company, plus other large deals at a U.S. telecom and a European supermarket chain.
“It’s a combination of flash that handles metadata and 8 terabyte drives that give it density. Nothing else looks like it,” Gacek said of the DXi6900-S.
Scale-out (StorNext) revenue increased 12% to $39.8 million, including Quantum video surveillance deals. Scale-out storage also includes media and entertainment, and technical workflows such as unstructured data archiving. Quantum claimed more than 100 new scale-out customers and a win rate of nearly 70% in scale-out tiered storage for the quarter.
Total data protection revenue, including tape, increased 3% to $83.1 million despite a small drop in tape automation.
Overall, Quantum’s revenue of $133.4 million for the quarter increased $5.4 million over last year, and its $5 million profit follows a slight loss a year ago.
Gacek forecast revenue of $120 million to $125 million this quarter, which is Quantum’s fiscal fourth quarter. “We are teed up for a good one next quarter, but I am not using superlatives like great and fantastic yet, which I think we have potential for,” he said.
Quantum video surveillance, archiving deals include tape
Part of Gacek’s reason for optimism is new uses for tape in cloud archiving.
“We believe there is a shift in tape usage to the archive scale-out, cloud-like architectures,” Gacek said. “And I think you are going to see tape media usage go up quite dramatically as an archive use case.”
More legalized marijuana might help as well.
Following a quarter of solid revenue growth to end 2016, Commvault Systems Inc. plans a string of product enhancements throughout 2017. The additions are designed to improve Commvault’s performance in the cloud, and with software-defined storage and business analytics.
Commvault Wednesday reported $167.8 million in revenue last quarter, a 7% increase from last year. Software revenue of $77.3 million increased 8% year over year, while service revenues of $88.5 million increased 5%. Commvault broke even for the quarter following two straight quarters of losses.
During the earnings call Wednesday, CEO Bob Hammer laid out plans for a Commvault products rollout that will culminate in the Commvault GO 2017 user conference in November.
Hammer said the company plans to add capabilities for business analytics, search and business process automation as part of its strategy to become a full-scale data management player for on-premises and cloud deployments.
“Next month, we will further enhance our offerings with new solutions with industry-leading Web-based UIs and enhanced automation to make it easy for customers to extend data services across the enterprise Commvault solutions,” Hammer said of the Commvault products roadmap. “[We will deliver] some of the key enhancements tied to the journey to the cloud and converged data management.”
The enhancements include new data and application migration capabilities for Oracle applications and the Oracle cloud, big data, fast data and SAP HANA. Commvault already supports Hadoop, Greenplum and IBM’s General Parallel File System.
Products for the AWS cloud
Commvault will also add tools for migrating and cloning data resources to the cloud. These include automated orchestration of compute and storage services for disaster recovery, quality assurance, and development and testing, as well as optimized cloud protection and recovery offerings within and across clouds to secure data against ransomware.
Earlier this week, Commvault added optimized cloud reference architectures for Amazon Web Services (AWS) that will make it easier for customers to implement comprehensive data protection and management in the AWS cloud.
Commvault customers will have the ability to direct data storage to specific AWS services — such as Amazon Simple Storage Service (Amazon S3), Amazon S3 Standard-Infrequent Access and Amazon Glacier for cold storage.
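Directing data to those S3 tiers is typically done with a bucket lifecycle configuration. The sketch below is a generic AWS example, not a Commvault setting: the bucket name, prefix and day thresholds are assumptions for illustration.

```python
# Illustrative S3 lifecycle policy that tiers objects under "backups/"
# from S3 Standard to Standard-Infrequent Access after 30 days and to
# Glacier after 90. All values here are assumed, not vendor defaults.
lifecycle = {
    "Rules": [
        {
            "ID": "tier-backup-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "backups/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# Applying it would use boto3 (not run here):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket", LifecycleConfiguration=lifecycle)
```

Once the rule is in place, S3 moves aging backup objects to the cheaper tiers automatically, with no application changes.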
Hammer said the amount of data stored using the Commvault software within public environments increased by 250% during 2016.
“When you look at our internal numbers, in both cases, we’ve had strong pull from both AWS and Microsoft Azure,” Hammer said. “The pull from AWS has been stronger, so there’s a higher percentage of customers’ data in AWS, but I will also say that we are gaining a lot of momentum and traction with Microsoft and Azure.”
Hammer said Commvault continues to make progress on its software-defined data service offerings that are in early release.
“More and more of our customers are replacing or planning to replace their current IT infrastructure, with low-cost, flexible, scalable infrastructures, similar to those found in the public cloud,” he said.
“Our teams have been hard at work to embed those cloud-like capabilities directly into the Commvault data platform, so we can ensure the delivery of a new class of active, copy management and direct data usage services across an infrastructure built with low-cost, scale-out hardware,” Hammer said.
Other upgrades to Commvault products include new and enhanced enterprise search, file sync-and-share collaboration, cloud-based email and endpoint protection during the middle of 2017.
Growth dependent on new products
Commvault has been working to dig itself out of a sales slump that began in 2014. Hammer said the company still faces some critical challenges, and continued growth depends on its ability to win more large deals. A lot of its success will turn on releases of new Commvault products.
“Our ability to achieve our growth objectives is dependent on a steady flow of $500,000 and $1-million-plus deals,” he said. “These deals have quarterly revenue and earnings risk due to their complexity and timing. Even with large funnels, large deal closure rates may remain lumpy. In order to achieve our earnings objectives, we need to prudently control expenses in the near-term without jeopardizing our ability to achieve our software growth objectives for our critical technology innovation objectives.”
Commvault added 600 new customers during the quarter, bringing its total customer base to 240,000. Revenue from enterprise deals, defined as sales of more than $100,000 in software, represented 57% of total software revenue, and the number of enterprise deals increased 22% year over year.
While much of the storage market is stagnant or down, data protection vendor Veeam Software said it grew revenue 28% in 2016 by expanding its business into enterprises and the cloud.
Veeam, a privately held company, this week reported its financial results for 2016. It claimed $607.4 million in bookings in 2016, which included new license sales and maintenance revenue, compared to $474 million in 2015.
Doug Hazelman, Veeam Software’s vice president for product strategy and chief evangelist, said the bulk of the growth came from its flagship Veeam Availability Suite. The suite handles backup, restore and replication through Veeam Backup and Replication, along with monitoring, reporting and capacity planning in Veeam ONE, for VMware vSphere and Microsoft Hyper-V deployments.
The Veeam Cloud and Service Provider (VCSP) program, which offers disaster recovery as a service (DRaaS) and backup as a service (BaaS), also contributed to the revenue growth, Hazelman said.
VCSP generated 79% year-over-year growth in 2016 as Veeam pushed upstream into the enterprise. License bookings from enterprise-level customers grew 57% annually.
Veeam reported the VCSP program expanded to more than 14,300 service and cloud providers. The vendor claims 230,000 customers worldwide, and its Veeam Availability Suite protects up to 13.3 million virtual machines, with 1 million virtual machines using the VCSP management product. The company added 50,000 paying customers last year.
“They are not all enterprise customers,” Hazelman said. “It’s (still) a lot of SMB commercial accounts (but) we added 761 enterprise customers in 2016.”
Hazelman said the cloud portion of Veeam’s business helped close many deals. Veeam has four business segments — SMB, commercial accounts, enterprise-level accounts and the cloud.
“The VCSP product is the fastest growing,” Hazelman said. “It’s one of the fastest growing segments. It’s not the biggest in revenue but it’s the fastest growing.”
Last year Veeam added a fully functional physical server backup product. Veeam started as a virtual machine backup specialist but moved into physical backup at customer request as it pushed into enterprise accounts.
“The physical server did help a lot on closing deals but it didn’t add a lot to the total year number,” he said.
In-memory storage startup Alluxio has struck a partnership with Dell EMC.
The news marks Alluxio’s first formal alliance with a North American storage vendor. Alluxio in September integrated its software on Chinese vendor Huawei Technologies’ FusionStore elastic block storage.
The San Mateo-based vendor’s Alluxio Enterprise Edition software will be available on Dell EMC Elastic Cloud Storage (ECS) appliances. ECS is the successor to EMC Atmos object storage.
The exabyte-scale ECS arrays are built using commodity servers and EMC ViPR virtualized storage. Since acquiring EMC, Dell gradually has started moving EMC software-defined storage products to its PowerEdge line.
The ECS private cloud uses active-active architecture to support Hadoop Distributed File System data analytics. Dell EMC sells ECS on turnkey appliances and also as a managed service.
Alluxio Enterprise Edition is a commercial version of the startup’s open source-based virtual distributed in-memory storage software. It allows applications to share data at memory speed across disparate storage systems.
The chief attributes are high availability and high performance. Alluxio allows data to persist in a host to speed real-time data analytics.
Alluxio software is designed to accelerate Apache Spark and other workloads that process data in memory. Storage from disparate data stores is presented in a global namespace.
EMC is not expected to bundle Alluxio in its software stack. Alluxio CEO Haoyuan Li said the partnership allows EMC to better recommend the in-memory storage to ECS customers that need high performance.
“The benefit we bring is allowing EMC ECS customers to pull data from other storage systems. Previously, you had to either move the data manually into the new compute cluster. We automate data movement,” Li said. “We also accelerate end-to-end performance of your applications with our memory-centric architecture, which manages the compute-side storage.”
Partnering with EMC is a feather in the vendor’s cap, although perhaps not as noteworthy as if Alluxio were qualified to run on Dell EMC Isilon scale-out NAS. Big data jobs tend to use HDFS as the underlying substrate, not EMC ECS.
“ECS as a storage platform does not have a huge share of the market at this point, so this partnership won’t have a material impact on Alluxio’s top line. But it’s an interesting partnership and definitely a win for them that could lead to other partnerships with Dell EMC,” said Arun Chandrasekaran, a research vice president at Gartner Inc.
Cloud NAS vendor Panzura raised $32 million in Series E equity funding this week to expand its product and give organizations an alternative to what CEO Patrick Harr calls a “dying on-premises model.”
Harr said Panzura will add support for block, Internet of Things and Hadoop interfaces for data analytics to go with its original NAS protocols. He also plans expansion outside the U.S., into the U.K. and Europe.
“We are focusing on scaling our business in two areas,” Harr said. “One is on the channel side, and the second is the continued expansion of our product portfolio. We are adding additional protocols to consolidate what we view as the dying on-premises model.”
Harr said the on-premises-only storage model will collapse, and he offers Panzura as a cloud-first model for building a hybrid cloud. Since he became Panzura’s CEO in May 2016, the vendor has expanded its hybrid cloud storage controllers and added archiving capabilities. He said he has also hired 18 engineers during his eight months at Panzura.
“We are very much in growth mode,” he said.
Harr said in 2016 Panzura added 100 new enterprise-level customers and expanded its partnerships to include Amazon Web Services, Google, IBM and Microsoft Azure. It also added 26 petabytes of customer enterprise storage.
Panzura’s new Freedom Archive software, which moves infrequently accessed data to public clouds or low-cost, on-premises storage, could bring the vendor into new markets. Target archive markets include healthcare, video surveillance, gas and seismic exploration, and media and entertainment. Freedom Archive is a separate application from Panzura’s flagship Cloud NAS platform, which caches frequently used primary data on-site and moves the rest to the cloud.
Last summer, Panzura launched a new series of cloud storage controllers with more capacity and the ability to expand to handle multiple workloads. The new 5000 hybrid cloud storage controllers replace Panzura’s previous 4000 series.
The Series E round brings Panzura’s total funding to around $90 million. The investment was led by Matrix Partners and joined by Meritech Capital Partners, Opus Capital, Chevron, Western Digital and an undisclosed strategic investor.
All-flash array pioneer Kaminario kicked off 2017 with a cash infusion of $75 million to accelerate global expansion and fuel product support for non-volatile memory express (NVMe) technologies.
Kaminario’s fifth funding round increased its overall total to $218 million since the company launched in 2008. Waterwood Group, a private equity firm based in Hong Kong, led the latest financing effort. Additional investors included Sequoia, Pitango, Lazarus, Silicon Valley Bank and Globespan Capital Partners. Kaminario’s most recent previous funding came in January 2015.
Founder and CEO Dani Golan said Kaminario has more than doubled revenue in each of the last four years. Global expansion and go-to-market efforts will focus on eastern and western Europe, Asia Pacific and the Middle East.
Kaminario will concentrate on incorporating NVMe technologies into the company’s K2 all-flash array. Kaminario currently uses SATA-based 3D TLC NAND flash drives. NVMe-based PCI Express (PCIe) solid-state drives (SSDs) can lower latency and boost performance, and NVMe over Fabrics (NVMe-oF) can extend the benefits across a network.
Golan said Kaminario does not support NVMe SSDs yet because “the price is too high.” He added that the NVMe-oF technology “is not mature enough to run in mission-critical and business-critical environments.”
A handful of new companies are starting to ship products with NVMe drives, but Golan said Kaminario’s NVMe support will probably wait until 2018.
“The ecosystem is not there yet,” he said.
Golan said startups that currently support NVMe use drives directly attached to servers. But, with a mature array platform on the market, Kaminario needs “to drive a full storage software stack over NVMe Fabrics,” he said.
“The big gain is [going to be with] NVMe over Fabrics, because NVMe drives are just media. That’s not interesting. The interesting part is NVMe over Fabrics and NVMe shelves,” Golan said.
Kaminario’s architecture allows customers to add controllers or shelves in any combination, scaling compute separately from storage, Golan said.
While companies may be confident in their disaster recovery strategy, DR planning and testing still have a way to go, according to results of a survey by data protection vendor Zetta.
Among 403 IT professionals, 88% said they were somewhat or very confident in their disaster recovery. But while 96% said they have some type of DR, 40% responded that their organization lacks a formally documented DR plan. For those with a plan, 40% said they test it once a year, while 28% rarely or never test it.
Similar to Zetta’s findings, a recent TechTarget survey found that companies are generally confident in their business disaster recovery plans. Also similarly, 65% test their DR plan just once a year or less, according to the TechTarget survey.
“Companies need to be more rigorous around how they develop their DR plans,” Zetta CEO Mike Grossman said. That’s especially important given that more than half of the companies in the survey experienced a downtime event in the last five years.
It’s not enough for business disaster recovery plans to ask, “What happens if a hurricane hits?” According to the survey, the most common type of downtime event for an organization in the last five years was a power outage. For those that had a downtime event in that time period, nearly 75% said their organization suffered a power outage. Only 20% experienced a natural disaster in the last five years. A hardware error was the second most common response, with 53%. Both a human error and a virus/malware attack registered at close to 35%.
When people think about disaster recovery, they often think of catastrophic events. “In reality, that’s not what causes the biggest impacts day to day,” Grossman said.
Grossman recommends testing business disaster recovery plans a minimum of once per quarter and preferably once per month.
“Unless you have a continuous process, including testing, you’re not really protected,” Grossman said.
But “not all testing is created equal,” he warned. The more rigorous and real-life, the better.
A lot of companies like to think they’re protected but they’re not, Grossman said.
According to the “State of Disaster Recovery” survey, 55% changed their DR strategy after a downtime event. It’s a positive that companies are paying more attention to the issue but a negative that they underestimate the risk, Grossman said.
For those that made changes following their last downtime event, 55% of organizations changed their DR approach, 55% added DR technology and 39% increased their DR investment. Almost 1 in 4 respondents said they increased DR testing.
As IT gets more complex, companies like Zetta need to make DR easier to manage, Grossman said. But how do you take the complexity out of something that’s complicated?
The cloud helps. Zetta is “cloud-first,” Grossman said, with backup in the cloud and failover to the cloud. And security, which has traditionally been a challenge in the cloud, is getting better.
Ninety percent of IT professionals who are using the cloud in their disaster recovery strategy said they are confident in their DR, according to the survey. Seventy-four percent of organizations using only on-premises DR said they are confident in their plans.
“It’s simpler,” Grossman said of the cloud. “It provides better protection.”
Companies spend a lot of money on disaster recovery solutions but that doesn’t translate into faster data recovery, according to a survey conducted by Quorum titled the “State of Disaster Recovery.”
The report, which surveyed 250 CIOs, CTOs and IT vice presidents, found that 80% of the companies surveyed claimed it takes more than an hour to recover from a server failure, and 26% said it takes more than two hours for data recovery. Only 19% said it took less than an hour to recover. Seventy-two percent consider the speed of backup and data recovery “critical.”
“All those backup and disaster recovery (BDR) products aren’t making their recovery any faster,” the report claimed. “While speed is essential for continuity and security… a staggering 80 percent of respondents need more than an hour to recover from a server failure. And it gets worse: more than a quarter need more than two hours.”
Sixty-four percent use more than three different disaster recovery solutions, with 26% using more than five and fewer than 40% using between one and three different disaster recovery products.
Moreover, a majority of the respondents said they wanted a method to simplify the management of all the BDR products they are using. Ninety percent of the respondents want to consolidate their disaster recovery solutions into one dashboard.
The report shows that the movement to the cloud has grown. Seventy-five percent of the survey respondents use cloud-based disaster recovery solutions, while 36% use a hybrid model mixing on-premises and cloud DR. Thirty-nine percent use disaster recovery as a service (DRaaS).
Eighty-nine percent have plans for more cloud-based disaster recovery solutions, with 5% stating they have no further plans and 6% stating they “don’t know.”
Disaster recovery products are growing in importance as concerns about security increase. Seventy-seven percent said they have used their disaster recovery solutions after a security threat event occurred. Fifty-three percent of respondents are worried about security threats, compared with concerns about hardware failure, backup disk corruption or a natural disaster.
“Natural disasters crashing in on a data center, an employee error or a hardware failure can all pose immense problems for an organization,” the report stated. “But a skilled and willful attack can cripple a brand for years and could cost a literal fortune. Ransomware attacks particularly depend on a team’s inability to recover quickly.”
Companies are diligent about production-level testing with their current disaster recovery products; 88% of respondents said they can achieve it with their current DR.
How much data do you actually recover?
That’s a question that Asigra users answered in a data recovery report.
Featuring statistics gathered from nearly 1,100 organizations across eight sectors, between Jan. 1, 2014 and August 1, 2016, the backup and recovery software vendor’s report found that those users only recover about 5% of their data on average.
“People really don’t recover a lot of data,” said Eran Farajun, executive vice president at Asigra. “Ultimately they’re paying like they recover all their data.”
Farajun compared the situation to what many experience with cable bills – customers often pay for hundreds of stations but don’t watch all of them.
Broken up by industry, manufacturing and energy recovered the most, averaging about 6%, according to the data recovery report. Public sector and health care recovered the least, at about 2%.
Users picked file-level data as the most common type restored.
The most common reason cited for a data restoration request was to access a previous generation of data, selected by 52% of users. Ransomware was a major cause of that need, Farajun said.
The second most common reason for data recovery was user error or accidental deletion, with 13%. A lost or stolen device was third with 10%. Interestingly, disaster was only picked by 6% of respondents, according to the data recovery report.
Asigra is working on improving cybersecurity and how it can best combine with data protection, Farajun said. In the face of the growing threat of ransomware, Farajun also suggested organizations educate their employees, have strong anti-virus protection and back up their data.
The average size of a recovery across all sectors was 13 GB.
Farajun described cost as the bane of a company’s relationship with its backup vendor.
“Mostly [companies] don’t feel they can do anything about it,” Farajun said. “You can do something about it.”
In 2013, Asigra launched its Recovery License Model and now almost all of the vendor’s customers use it. Pricing is based on the percentage of data recovered over the course of a contractual term, with a ceiling of 25%.
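The arithmetic behind that model is simple to illustrate. The sketch below is a hypothetical reading of recovery-based pricing, using only the two figures given here (the 25% ceiling and the 5% average recovery rate); actual Asigra contract terms and rates are not public in this article.

```python
CEILING_PCT = 25.0  # contractual cap on the billable recovery percentage

def billable_pct(recovered_gb: float, protected_gb: float) -> float:
    """Share of protected data that is billed, capped at the ceiling.
    Inputs are illustrative; real contracts measure this over a term."""
    if protected_gb <= 0:
        return 0.0
    pct = 100.0 * recovered_gb / protected_gb
    return min(pct, CEILING_PCT)
```

For example, an organization protecting 10 TB that recovers 500 GB over the term — the 5% average the report found — would be billed on 5% of capacity rather than on the full 10 TB, while a heavy recoverer would never be billed above the 25% cap.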
Asigra did a healthy amount of research before launching the model. It looked into other markets, such as the music and telecommunications industries, and assorted “fair-pay” cases. Music customers, for example, can now buy one-song downloads vs. an entire album that they may not listen to in its entirety.
“What happened?” Farajun said. “People bought boatloads and boatloads of songs.”
Asigra was nervous when undertaking the new model. It anticipated a three-year dip, but revenue started to go up after 12 months, Farajun said.
So why hasn’t this model caught on more in the backup market?
“There’s no incentive for software vendors to reduce their prices,” Farajun said. “We’re trying to price based on fairness.”
Farajun said the data recovery report vindicates the vendor’s underlying premise.
“People don’t recover nearly as much as they think they do and they overpay for their backup software.”