Jamie Lerner’s first 90 days as Quantum CEO confirmed the view of the company he formed when he took over. Discussions with customers convinced Lerner that Quantum’s greatest strength and greatest opportunity lie in storage for managing rich media data and video.
Now he has a solid plan for going forward: to complement Quantum’s StorNext file system technology with software features developed internally and through acquisition. Lerner said Quantum will also reorganize its sales team around specific solutions rather than geographies.
“What Oracle is to data management and what Cisco is to networking, that’s what Quantum is to rich media data,” Lerner said in an exclusive interview. “We see ourselves as the leader in infrastructure for managing rich media and video.”
Lerner said Quantum will add storage services intended specifically for rich media and video. These services differ from storage features for traditional IT applications.
“We have storage, policy management and tiering technology,” Lerner said. “Over the next six to 18 months, we will layer on data services for video. Traditional data services – deduplication, compression, snapshots, clones, replication – are rarely used on video. With video you need a totally different set of data services. You need to search not just by keyword, but by image. You need deep media catalogues to know what media assets you have, what form they’re in, who has edited them. And you need a lot of analytics for video surveillance. Are people on video having an argument or holding a weapon? Has someone left a bag for a long period? Those are the data services needed for video.”
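The surveillance analytics Lerner describes come down to policies evaluated over detected objects. As a rough illustration of one such service, the sketch below flags an object (such as a bag) that has stayed stationary past a dwell-time threshold. All names and the threshold are hypothetical; a real system would feed detections from a computer vision model rather than a dictionary.

```python
# Hypothetical sketch of an "abandoned object" check, one of the video data
# services described above. Not Quantum's implementation.

DWELL_THRESHOLD_S = 120  # flag anything stationary for two minutes

def flag_abandoned(detections, now, threshold_s=DWELL_THRESHOLD_S):
    """detections: {object_id: first_seen_timestamp} for stationary objects."""
    return [oid for oid, first_seen in detections.items()
            if now - first_seen >= threshold_s]

# A bag first seen at t=0 is flagged at t=150; one first seen at t=100 is not.
print(flag_abandoned({"bag-1": 0, "bag-2": 100}, now=150))  # ['bag-1']
```

In practice the timestamps would come from frame-by-frame object tracking, and the alert would route to an operator console.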
Lerner said Quantum will make core architectural changes by layering software modules on its current product. Those new modules will mostly be subscription based and cloud-hosted. He also said Quantum is likely to become a more aggressive acquirer of smaller storage companies.
“You’ll see a combination of tuck-in acquisitions to add features, skills and capabilities, and we’ll likely buy some technologies that are complete standalone entities,” he said. “They’ll be mostly software and cloud in nature, and heavy in rich media data services – analytics, a search catalog and other areas that will bolster our ability to handle petabytes of rich media.”
Lerner said he sees Quantum’s tape products retaining a large role in long-term archiving and cloud storage. The new Quantum CEO said the vendor will continue to sell its DXi disk-backup appliances, while its dedupe capabilities will be woven into other storage platforms as a feature. But the main focus of development will be around StorNext and storage for rich media and video.
“Customers have figured out how to manage Oracle, they’ve figured out how to manage their email, but they are really struggling when incorporating video into their business,” Lerner said. “That’s the fastest segment of data growth.”
When Lerner became Quantum CEO in July, the vendor was knee-deep in an internal accounting probe to find the cause of financial reporting irregularities. The probe is now complete. The main issue it found was that Quantum recorded revenue earlier than it should have, with approximately $25 million to $35 million of prematurely recognized revenue as of June 30, 2018. “We expect it to be good revenue but it was recorded too soon,” Lerner said.
Quantum detailed those findings in an SEC filing in September. Now it is restating past quarters to place the revenue in the right periods. The Quantum CEO said the restatements are not expected to affect cash flow.
He said he expects the restatements to wrap up by the end of the year, so Quantum can begin filing its quarterly earnings reports again in early 2019.
“Most of the deep concern phase is behind us,” he said. “Now we’re putting in place new loans and accounting procedures. We’re on the down slope of most of the unfortunate things that have happened to this company over the last couple of years.”
LAS VEGAS -- NetApp expects about 5,000 customers and partners to gather here Monday as it lays out a roadmap for flash-enabled AI and cloud applications.
NetApp Insight 2018 marks the fifth year the event has been open to analysts, OEMs and the media. Prior to 2014, NetApp used Insight exclusively as technical training for data center managers. In 2017, NetApp Insight got off to a late start, as opening day was postponed following a mass shooting event on the Las Vegas Strip.
The three-day event will cover how the NetApp Data Fabric technologies extend to broader cloud use cases, said Kris Newton, a NetApp vice president of corporate communications and investor relations.
“We know that pretty much every organization, at some level, is thinking through AI and the cloud. We’ll have lots of discussion around how our customers can optimize their move to AI and the cloud, and see real results,” Newton said.
AI use cases cut across verticals, Newton said, spurring demand for faster storage and more efficient configurations to ingest data.
“AI puts pressure on your storage. You need storage that’s lightning-fast. You can’t wait around for your storage to respond,” Newton said.
Although she didn’t reveal details, Newton said NetApp Insight will highlight the role NVMe flash and storage class memory technologies play in a modernized data center. NetApp this year added an NVMe-based model to its All Flash FAS (AFF) Series arrays, mirroring moves by rivals such as Dell EMC. The NVMe version of FAS allows customers to upgrade an existing FAS array by upgrading the OnTap operating system.
NVMe storage uses PCI Express to send traffic directly to CPUs, providing faster data transfer than traditional iSCSI command hops.
Sales of NVMe all-flash arrays will generate about $500 million in 2018, according to a report by analyst firm IDC, based in Framingham, Mass.
Storage arrays that extend NVMe from the back end to application hosts are sometimes known as rack-scale flash. NetApp’s AFF with NVMe technically doesn’t fit that definition: NVMe runs only on the front end, allowing customers to continue using SAS SSDs on the back end.
It wouldn’t be a NetApp Insight conference without product news. NetApp hinted it would reveal upgrades to its OnTap-based storage for converged systems, file storage and object platforms, as well as deeper integration for multicloud support.
Another point of interest will be any new details forthcoming on NetApp’s recent joint venture in China with server maker Lenovo. Under that deal, Lenovo will sell NetApp storage under its ThinkSystem brand.
Ctera Networks CEO Liran Eshel said his cloud file system company became cash flow positive this year, but it grabbed $30 million in new funding to grow as part of a booming market.
Ctera Networks raised $30 million in Series D growth equity funding to expand its global sales and delivery organization, especially in Southeast Asia and Singapore, and continue development of its enterprise file services technology. The latest financing round boosted the startup’s overall total to $100 million since 2008.
Ctera sells enterprise file software designed to cache active data on premises and shift colder data, in compressed and encrypted form, to object storage located in private and public clouds. In addition to translating data from file-to-object format, the software offers additional capabilities such as authentication, orchestration, synchronization and sharing.
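The caching and tiering pattern described above can be sketched as a simple policy: keep recently accessed files on premises, and push colder files to object storage in compressed form. The function names and the 30-day cutoff below are illustrative, not Ctera’s actual policy, and the encryption step is omitted for brevity.

```python
import zlib

# Illustrative cloud-gateway tiering policy: files untouched for 30 days are
# compressed and moved to object storage. A hypothetical sketch, not Ctera's code.
COLD_AFTER_S = 30 * 24 * 3600

def tier_cold_files(files, now, put_object):
    """files: {path: (last_access_ts, data_bytes)}. Returns paths tiered out."""
    tiered = []
    for path, (last_access, data) in files.items():
        if now - last_access > COLD_AFTER_S:
            put_object(path, zlib.compress(data))  # compress before upload
            tiered.append(path)
    return tiered
```

A simulated object store can stand in for the cloud bucket: pass `put_object=lambda key, blob: bucket.__setitem__(key, blob)` with `bucket` a dict, and cold files land there compressed while hot files stay local.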
Eshel said profitability is not Ctera’s top priority now. Neither is an IPO, although Eshel said “it’s definitely something we’re looking at.”
“We are investing significantly and will continue to invest in order to get more high growth and reach more customers,” Eshel said. “We could have just remained cash flow positive and be happy with where we are. But we think there’s much more in this market, and there’s much more land grabbing to be done. That’s why we will need to invest.”
Ctera customers have the option to use their own hardware or buy cloud gateway appliances that package the software. Ctera Networks introduced more powerful new HC Series Edge Filers on Dell and Hewlett Packard Enterprise (HPE) servers last summer.
“We are able to cover additional use cases and workloads that were traditionally solved by NAS systems. Now you could replace them with a more powerful cloud gateway,” Eshel said, claiming the new HC Series Edge Filers are doing well.
Eshel said Ctera Networks generally sells its software or gateways as part of deals with other infrastructure providers. He said the company often works with vendors such as Cisco Systems, Dell EMC, HPE and IBM.
“The bigger part of our business today comes from these infrastructure providers while we go to the market with complete solutions,” Eshel said. Ctera also has strategic reselling agreements with HPE and IBM.
Ctera Networks claims to have more than doubled its enterprise software subscription revenue during the last year. The company sells to cloud providers and enterprises, and its software is currently deployed in more than 200 private clouds, according to Eshel. Some of Ctera’s largest customers include McDonald’s, WPP, and the U.S. Department of Defense.
Eshel said the new funding would finance Ctera’s ongoing work to connect hyper-converged systems to a cloud file system. Ctera’s research and development arm is based in Israel, and the company’s sales headquarters is in New York.
Ctera’s competition in the cloud gateway space includes Nasuni and Panzura, though both of those vendors have also expanded their product lines with capabilities beyond file-to-object protocol translation.
Israel-based Red Dot Capital Partners led Ctera’s Series D funding round. Red Dot receives its funding from Temasek Holdings, an investment company owned by the Singapore government. Additional investors included Singtel Innov8, the VC arm of the Singapore-based Singtel Group telecommunications company. Also participating in Ctera’s Series D round were previous investors Benchmark Capital, Bessemer Venture Partners, Cisco, Venrock, Vintage Investment Partners and Viola Group.
Other recent funding rounds in the cloud market include $94 million for file and object storage vendor Cloudian, $75 million for cloud file sharing and content collaboration specialist Egnyte, $68 million for public cloud storage provider Wasabi Technologies, and $60 million for hybrid cloud computing and data management startup Datrium.
Quest backup has vaulted into the Office 365 workspace.
NetVault Backup 12.1 includes a plug-in that enables full and incremental backup and recovery of Office 365 Exchange Online mailboxes. Customers can back up to the cloud and on premises. They can restore individual, shared and resource mailboxes. The Office 365 plug-in provides flexible restore options and customers can restore only the data they need.
Quest built the plug-in with the Microsoft Graph API. While other vendors may be using old scripting, Quest is using new technology pushed by Microsoft, said Adrian Moir, senior consultant of product management.
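Quest doesn’t publish the plug-in’s internals, but the Microsoft Graph API it builds on exposes mailbox data over plain HTTPS, including delta queries that return only messages changed since a prior sync token -- the building block for incremental backup. The sketch below shows the kind of request URL involved; the helper name is hypothetical.

```python
# Illustrative sketch of a Microsoft Graph delta query URL for mailbox sync.
# A full sync omits the token; passing the token from the previous run makes
# the next request incremental. Helper name is hypothetical, not Quest's code.

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def delta_url(user_id, folder="inbox", delta_token=None):
    """Full sync when delta_token is None; incremental sync otherwise."""
    url = f"{GRAPH_BASE}/users/{user_id}/mailFolders/{folder}/messages/delta"
    if delta_token:
        url += f"?$deltatoken={delta_token}"
    return url

print(delta_url("alice@example.com"))
```

In a real client, the request would carry an OAuth bearer token, and the `@odata.deltaLink` in each response would supply the token for the next incremental pass.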
“It allows us to grow across the Microsoft platform a lot faster,” Moir said.
Quest backup customers can restore emails, attachments, contacts and calendars.
Good timing for Office 365 backup
Don McNaughton, vice president of sales for Quest reseller HorizonTek, said many customers are using Office 365 and need backup for the SaaS app. Adding the backup support enables NetVault to remain a single data protection offering for those customers on Office 365. Standout features include the full or incremental backup options, full mailbox recovery and granular recovery, he said.
“So the timing was good,” McNaughton said.
Customers “want everything done in one place,” Moir said. That protection includes cloud and on-premises workloads, as well as hybrid approaches.
The Quest backup update builds on what the vendor launched with its NetVault 12.0 release, which aimed for more enterprise adoption. Moir said he expects Quest to add more technology focused on Office 365.
Competition includes some vendors purely focused on SaaS backup and others that incorporate it as part of an overall data protection platform.
“It’s a crowded market. Trying to differentiate is never easy,” Moir said, adding that he feels the Quest backup product’s flexibility, API incorporation, scalability and ease of use are standouts.
Beyond backup for Office 365
The NetVault Backup update also provides a multi-tenant architecture for managed service providers. In addition, an updated VMware plug-in adds vSphere 6.7 support.
McNaughton said HorizonTek is still analyzing the potential benefits of the other updates to 12.1 beyond the Office 365 backup.
McNaughton’s company has been a Quest partner since 2010. HorizonTek has been selling NetVault for about 20 years, long before the product joined the Quest backup portfolio. Quest acquired the NetVault platform from BakBone in 2010.
“After all this time, I’m still very happy introducing it to my customers,” McNaughton said. “NetVault has done a great job keeping up as technologies come out.”
Quest backup is on top of major trends in the industry, he said, including cloud integration and keeping everything under a single pane of glass.
McNaughton said he also likes how well NetVault integrates with Quest’s new QoreStor software-defined product as well as other secondary storage platforms.
Quest claims thousands of NetVault customers.
What are high availability applications if they’re not highly available?
According to a report released this month by SIOS, in partnership with ActualTech Media, one-quarter of respondents say their high availability applications fail every month. Only 5% said they never suffer an availability failure.
“An organization’s highly available applications are generally the ones that ensure that a business remains in operation. Such systems can range from order-taking systems to CRM databases to anything that keeps employees, customers and partners working with you,” the report said. “… The news is mixed when it comes to how well HA applications are supported.”
The report, “The State of Application High Availability,” gathered responses from 390 IT professionals in the United States and focused on tier-1 mission-critical applications, including Oracle, Microsoft SQL Server and SAP/HANA.
Twenty-six percent said their availability service fails at least once a month.
“This is a difficult statistic to grasp, as it would seem that there’s a fundamental flaw somewhere that needs to be corrected,” the report said. “Fortunately, not everyone is faring this badly.”
Among the others in the 95% that said they suffer failures in high availability applications, 28% said it happens every three to six months, 16% said every six to 12 months and 25% said once per year or less.
High availability requires expertise, said Jerry Melnick, president and CEO of SIOS, a software company that manages and protects business-critical applications. That includes getting the right software to match requirements, getting the system configured correctly, plus discipline and management in how organizations approach the cloud, he said.
Is high availability up in the cloud?
As with many other workloads, organizations are exploring the cloud for high availability applications.
“Modern organizations are embracing the hybrid cloud and making strategic decisions around where to operate critical workloads,” the report said. “But not everyone is keen on moving applications into an off-premises environment.”
Twelve percent of respondents have not moved any high availability applications to the cloud. Twenty-four percent are running more than half of those applications in the cloud.
“Putting all those pieces together … requires a higher set of IT skills,” Melnick said.
Once an organization gets there, though, the cloud can help streamline high availability operations.
“The cloud offers a unique opportunity to cost effectively get to disaster recovery and handle disaster recovery scenarios,” Melnick said.
Sixty percent of organizations that haven’t made the full move to the cloud said they prefer to keep high availability applications on premises where they have more control over the infrastructure.
Melnick said he thinks some of those respondents will eventually move to the cloud.
Datrium’s latest $60 million funding will fuel its hybrid cloud computing and data management product line and business expansion into Europe.
The Series D funding round boosted the Sunnyvale, California-based startup’s overall total to $170 million since 2012. New CEO Tim Page closed the round as he tries to pivot the company from its SMB and midmarket roots to enterprise sales of Datrium DVX.
Former CEO Brian Biles, a Datrium founder who is now chief product officer, said the startup is having a great quarter, and Page has “re-energized a lot of our focus on go-to-market.” Page’s experience includes building out an enterprise sales organization while COO at VCE, the VMware-Cisco-EMC joint venture that produced Vblock converged infrastructure systems.
Datrium DVX first hit the market in early 2016 with server-based flash cache to accelerate data reads and separate data nodes for back end storage. DVX software orchestrates and manages data placement between the Datrium Compute Nodes and Data Nodes and provides storage features such as inline deduplication, compression, snapshots, replication, and encryption.
Separate Compute and Data Nodes
Datrium now pitches its on-premises DVX as converging “tier 1 hyper-converged infrastructure (HCI) with scale-out backup and cloud disaster recovery (DR).” But Datrium DVX is not HCI in the classic sense with virtualization, compute, and storage in the same box. The Datrium DVX system’s Compute Nodes cache active data on the front end, and separate Data Nodes store information on the back end, enabling customers to scale performance and capacity independently.
Customers have the option to buy Datrium Compute Nodes, supply their own servers, or use a combination of the two, so long as they’re equipped with solid-state drives (SSDs) to cache data. The compute nodes support VMware, Red Hat and CentOS virtual machines. Disk- or flash-based Datrium Data Node appliances handle the backend storage.
This year, Datrium added a software-as-a-service Cloud DVX option to back up data in Amazon Web Services (AWS) and CloudShift software for disaster recovery orchestration. The company claimed that more than 30% of its new customers adopted Cloud DVX within the first three months of its availability. Biles said Cloud DVX could lower backup costs in AWS because Datrium globally deduplicates data.
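Global deduplication, which Biles credits for lowering cloud backup costs, stores each unique chunk of data once no matter how many backups reference it. A minimal content-addressed sketch follows; it uses fixed-size chunks for simplicity, whereas production systems like DVX typically use more sophisticated chunking, and none of the names here are Datrium’s.

```python
import hashlib

# Illustrative content-addressed deduplication: chunks are keyed by SHA-256,
# so repeated data costs no extra storage. A sketch, not Datrium's algorithm.
CHUNK = 4096

def backup(data, store):
    """Split data into chunks, store unique ones by hash, return a recipe."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # only new chunks consume space
        recipe.append(digest)
    return recipe

store = {}
backup(b"A" * 8192, store)  # two identical chunks -> one stored
backup(b"A" * 8192, store)  # a second, identical backup adds nothing
print(len(store))  # 1
```

The recipe (the ordered list of digests) is all that each backup needs to retain; restoring is a matter of concatenating the referenced chunks.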
Biles characterized Datrium’s Series D funding as a “standard round” that will help to grow all parts of the company. He said Datrium currently operates in the United States and, to a lesser degree, in Canada and Japan, and the company plans to expand to Europe next year. Datrium has more than 150 employees and more than 200 customers, according to company sources.
“We have good momentum now, but we want to keep feeding that,” Biles said. He offered no estimate on when the company might become cash-flow positive. “A lot depends on the next couple of years of sales acceleration.”
Samsung Catalyst Fund led the latest funding round, with additional backing from Icon Ventures and prior investors NEA and Lightspeed Venture Partners. Icon’s Michael Mullany, a former VP of marketing and products at VMware, joined Datrium’s board of directors.
Dell EMC extended its lead over Nutanix in hyper-converged systems sales in the second quarter, although Nutanix crept ahead of Dell-owned VMware into first when the market is measured by HCI software.
That was the verdict from IDC in its worldwide converged systems tracker report released last night.
IDC measures the hyper-converged infrastructure (HCI) market two ways: by the brand of the systems and by the vendor whose software provides the core hyper-converged capabilities. Dell-owned technologies led both HCI market categories in the first quarter with Nutanix second in both. Nutanix, which moved to a software-centric reporting model earlier this year and is getting out of the hardware business, jumped up in software revenue but lost ground to Dell EMC in systems.
Overall, IDC said the HCI market grew 78% year-over-year to $1.5 billion in the second quarter. Dell EMC’s $419 million in revenue gave it 28.8% share. That represented 95.2% year-over-year growth, outgrowing the market. Nutanix placed second in branded revenue with $275.3 million, up 48.5% year-over-year and basically flat from its first-quarter branded revenue of $273 million. Nutanix had 18.9% of the branded revenue, down from 22.7% a year ago and 22.2% in the first quarter of 2018.
On the software side, Nutanix revenue grew 88.9% year-over-year to $498 million, or 34.2% of the HCI market. It slipped past VMware, which grew 97% year-over-year to $496 million and 34.1% share. IDC considers Nutanix and VMware in a statistical tie because their shares are within one percentage point. VMware’s share jumped from 30.9% in the second quarter of 2017 to 34.1% a year later, but dropped from 37.2% in the first quarter of 2018, while Nutanix slipped less, from 35.2% to 34.2%, allowing it to catch VMware. Dell did receive part of Nutanix’s revenue gains, however, because the Dell EMC XC platform uses Nutanix software through an OEM deal.
Dell had $79 million in HCI software revenue, putting it in a statistical tie with Cisco ($77 million) and Hewlett Packard Enterprise ($72 million). Dell had 5.4% share, Cisco 5.3% and HPE 4.9% -- all within one percentage point. Because Cisco and HPE sell their software on their own servers, they had the same revenue and share in systems as in HCI software. HPE had the largest year-over-year growth of any systems vendor, increasing 119.4%. Cisco grew more since the first quarter, however, jumping from $60 million to $77 million and increasing share from 4.9% to 5.3%. HPE lost share quarter-over-quarter, slipping from 5% to 4.9% even as its revenue rose from $61 million to $72 million.
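The shares IDC reports can be sanity-checked from the revenue figures in the story, since share is simply vendor revenue over total market revenue. The quoted $1.5 billion total is rounded, so the check below derives an implied total from Dell EMC’s numbers.

```python
# Sanity-checking the IDC share math from the figures quoted above.
dell_rev, dell_share = 419.0, 0.288       # $M and share, Q2 2018
total = dell_rev / dell_share             # implied market size, ~$1,455M
nutanix_share = 275.3 / total             # Nutanix branded-systems share
print(round(total), round(nutanix_share * 100, 1))  # 1455 18.9
```

The implied ~$1,455 million total rounds to the $1.5 billion IDC cites, and Nutanix’s $275.3 million works out to the 18.9% branded share reported.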
Hyper-convergence was the only one of the three converged markets that increased year-over-year. The certified reference systems/integrated infrastructure market declined 13.9% year-over-year to $1.3 billion in revenue. Integrated platform sales slipped to $729 million for a 12.5% decline. Dell led the certified reference systems market with $640 million, with No. 2 Cisco/NetApp at $481 million. Oracle led in integrated platforms with $441 million and 60.4% share. HCI is also now the largest of the three converged markets for the first time.
NetApp launched its Data Fabric architecture to adapt its storage to manage applications built for the cloud. Container orchestration had largely been missing from Data Fabric, but the vendor has taken a step to plug the gap.
NetApp has acquired Seattle-based StackPointCloud for an undisclosed sum. StackPointCloud has developed a Kubernetes-based control plane, Stackpoint.io, to federate trusted clusters and sync persistent storage containers among public cloud providers.
The first fruit of the merger is the NetApp Kubernetes Service, which the vendor claims will allow customers to launch a Kubernetes cluster in three clicks and scale it to hundreds of users. NetApp said it will levy a surcharge of 20% of the overall compute cost for the cluster to cover deployment, maintenance and upgrades. That equates to about $200 on $1,000 of overall compute.
The NetApp Kubernetes Service engine will allow customers to deploy containers at scale from a single user interface with underlying NetApp storage, said Anthony Lye, a NetApp senior vice president of cloud data services.
The Cloud Native Computing Foundation took over management of Kubernetes development earlier this year from Google. Docker Inc. popularized container deployments with its Docker Swarm orchestration management. Other open-source container tools include Apache Mesos and Red Hat OpenShift.
NetApp customers will still be able to use their preferred deployment framework, but Lye said Kubernetes is “the clear winner” among container operating systems.
He said Stackpoint completes the work NetApp started with its open source dynamic container-provisioning project, codenamed Trident. NetApp Kubernetes Service is available immediately.
Lye said his internal development teams were using the Stackpoint engine to deploy NetApp storage infrastructure at global cloud data centers run by Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure. In addition to the big three, StackPointCloud supports Digital Ocean and Packet clouds.
“My engineers were telling me this was the best thing they’d ever seen, plus the market was telling us that storage and containers need to go together and (enterprises) are using multiple clouds. Those three reasons led us to make the acquisition,” Lye said.
The DevOps trend has been fueled by container virtualization for writing cloud-native applications with specialized microservices. Linux-based containers also are gaining attention for the ability to “lift and shift” traditional legacy applications to hybrid cloud environments. Unlike a virtual machine, a container does not require a hosted copy of a full operating system.
Designed on Kubernetes Storage Classes, NetApp Trident was developed to simplify persistent-volume provisioning for OnTap-based storage, SolidFire and E Series arrays. Lye said the NetApp Kubernetes Service allows developers to run canary environments to test new applications with mixed nodes of graphics processing units and regular CPUs.
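In the StorageClass model Trident builds on, an application requests storage through a PersistentVolumeClaim that names a class, and the provisioner behind that class creates the backing volume on the array. The sketch below builds such a claim as a plain manifest; the class name “ontap-gold” is illustrative, not an actual NetApp class.

```python
# Illustrative PersistentVolumeClaim manifest of the kind a developer submits
# when a dynamic provisioner such as Trident backs a StorageClass. The class
# name "ontap-gold" is a hypothetical example.

def pvc_manifest(name, storage_class, size_gi):
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class,  # routes the request to the provisioner
            "resources": {"requests": {"storage": f"{size_gi}Gi"}},
        },
    }

print(pvc_manifest("app-data", "ontap-gold", 100)["spec"]["storageClassName"])
```

Submitting the equivalent YAML with `kubectl apply` would trigger the provisioner to carve out a 100 Gi volume and bind it to the claim.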
StackPointCloud launched in 2014 with bootstrapped funding. The transaction brings CEO Matt Baldwin to NetApp, along with an undisclosed number of StackPointCloud employees.
Stackpoint integration will start with NetApp HCI hyper-converged infrastructure and FlexPod converged systems. The plan is to extend NetApp Kubernetes Service across all of NetApp’s storage, Lye said. “Our strategy is to continue to build tighter connections between our cloud protocols and containers and extend the control plane from the public clouds down to support NetApp HCI or NetApp’s private clouds.”
Newcomer Wasabi Technologies will try to build up its brand recognition and take on the Big Three public cloud providers after raising $68 million in Series B funding.
“You don’t go up against Microsoft, Google and Amazon with pocket change,” said David Friend, Wasabi CEO and founder.
The Boston-based cloud storage provider launched in 2017 with about $8.5 million in its coffers from a 2016 Series A funding round, when the company was known as BlueArchive. Wasabi added another $10.8 million a few months later through a convertible note that folded into the recently closed $68 million Series B round.
Friend said Wasabi Technologies needs senior sales staff with expertise in vertical markets such as genomics, media and entertainment, and surveillance. The startup also plans to add in-house sales representatives to handle the growing volume of calls. Friend said Wasabi currently employs about 45 and could have 60 to 65 staffers by year’s end.
Friend wants to expand quickly to make it hard for newer competitors to get into the market. He said he took the same approach while CEO at Carbonite, one of the early cloud storage success stories in the consumer and SMB space.
“We’ve got about 3,500 paying customers, so it’s time to really turn up the heat and start building a sales force and the Wasabi brand and all that sort of stuff,” Friend said. “I wanted to start ramping up now that I feel comfortable that the technology is really solid and I could spend a buck on marketing and get four or five bucks back in terms of customer value.”
Wasabi Technologies claims business is growing at a rate of 5% to 10% per week. The startup stores data on its own equipment at colocation facilities in Ashburn, Virginia, and Hillsboro, Oregon. Friend said a new European data center will operate the same way when it opens in the fourth quarter. He declined to disclose its location.
Friend said he knows what only about 100 or 200 of Wasabi’s 3,500 customers are doing with the cloud storage. He said the smallest customer might store a few TB, and the largest has 5 PB to 10 PB, with plans to expand to 50 PB to 100 PB.
Wasabi Technologies claims its cloud storage is cheaper and faster than Amazon’s Simple Storage Service, with no egress charges to extract data. Friend said most large customers with significant IT budgets shift data from huge tape libraries or on-premises storage reaching end of life.
Wasabi’s biggest customers are Hollywood movie studios and other media and entertainment companies, Friend said. With those customers in mind, the startup introduced a Direct Connect option in Hollywood and San Jose, California, to transfer data via a high-speed, dedicated pipe. Wasabi Technologies also offers Ball Transfer Appliances that large customers can fill up and ship back, similar to Amazon’s Snowball option.
Who ponied up for Wasabi Technologies?
Instead of typical venture capital financing, Wasabi’s new funding comes from individual investors and family-run firms such as Forestay Capital, the technology fund of Swiss entrepreneur Ernesto Bertarelli. Friend said 117 investors contributed to Wasabi’s Series B round, including many repeat backers from the Series A funding.
“I started out expecting to raise more like $40 million, but even at $68 [million], I had to turn a whole bunch of people away. People were just flocking in. It was unbelievable,” he said.
New Wasabi investor Bertarelli is worth $8.7 billion, according to Forbes. His family sold biotech firm Serono to Merck for more than $13 billion in 2007, and launched the Waypoint Group five years later. Forestay Capital is Waypoint’s tech fund.
Friend said Wasabi’s investors are engaged and helpful. “Most of them are self-made people who have built businesses of their own, and they’re excited about being part of the company,” he said. “They are opening doors for us at customer sites. In a couple of cases, they’ve helped us recruit some senior people.”
I can’t tell you exactly how many storage products launched in the past year, but I know it was in the hundreds, more than I can count. That’s because hardly a day goes by when I don’t receive a briefing, a press release or a pitch for a briefing from a storage vendor. The rest of the TechTarget storage editorial staff can tell you the same.
I do know how many storage products won Storage Magazine/SearchStorage Storage Products of the Year awards last year: 14. That’s not many, considering all the new products that ship in a year.
Now it’s time to start judging the hundreds of products that came out in 2018, and pick the 14 or so that deserve the honor this year. If one of those products is yours, you can enter it for consideration by our judging panel, made up of our editors, independent storage analysts and end users. You can find the entry form here. The deadline has been extended to Friday, Sept. 28, 2018, at 5:00 p.m. The form also includes the judging criteria and tips for completing the form if you’re new to this. This is the 17th year we’ve been giving these awards, so many of you have been through this before and know how prestigious they are.
Check out this year’s categories: Storage Arrays, Software-Defined/Cloud Storage, Storage Management Tools, Backup and Disaster Recovery Software/Services and Backup and Disaster Recovery Hardware. That pretty much covers the gamut of storage products, so you should be able to find your category. But you have to be in it to win it, so make sure you don’t miss the deadline and get shut out.