A Cohesity executive wants to make something clear about the company’s self-described product line.
“Hyper-converged secondary storage is not a trend,” chief marketing officer Lynn Lucas said. “It’s a category.”
The numbers in Cohesity's most recent revenue report back up that statement. The vendor reported a 600% year-over-year increase in 2017 sales revenue, and over the past eight months its new customer count doubled. The company does not report exact revenue or customer figures, though Patrick Rogers, its vice president of marketing and product management, said Cohesity storage has hundreds of customers.
Cohesity is entering its third year of selling products and aims to be the platform for converging all non-primary storage. Its customers include a dozen Fortune 500 companies. New customers include the U.S. Department of Health and Human Services, the U.S. Department of Energy, the U.S. Air Force and the University of California at Santa Barbara. In addition, international business produced more than 30% of bookings in 2017.
Hyper-convergence initially focused on primary storage. When Cohesity and fellow startup Rubrik emerged, analysts called their products converged secondary storage because they planned to handle all non-primary workloads. But with hyper-converged primary and converged secondary storage both growing healthily, the newcomers call their systems hyper-converged secondary storage.
“[Hyper-converged secondary storage] is a category and the enterprise is continuing to adopt it,” Lucas said.
Customers in financial services, media and entertainment, health care and high tech are each using Cohesity storage for at least 1 PB of data, Rogers said, and one customer has 4 PB on Cohesity systems. A petabyte works out to roughly 50 nodes, or about 20 TB per node.
The vendor also claims most of its customers use its cloud services. Cohesity has partnerships with Amazon, Google and Microsoft.
Cohesity recently launched its DataPlatform Cloud Edition, which now runs on Amazon Web Services and Microsoft Azure. DataPlatform serves as the underlying file system that manages storage across the Cohesity nodes, handling features such as data deduplication, compression, encryption and tiering.
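Deduplication of the kind DataPlatform performs is commonly built on content-addressed chunking: data is split into fixed-size chunks, each chunk is hashed, and duplicate chunks are stored only once while files keep a "recipe" of chunk references. The sketch below is a generic illustration of that technique, not Cohesity's implementation; the class name and chunk size are invented for the example.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: each unique chunk is kept exactly once."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}  # sha256 digest -> chunk bytes

    def write(self, data):
        """Store data; return the list of chunk digests (the file's 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # duplicate chunks stored once
            recipe.append(digest)
        return recipe

    def read(self, recipe):
        """Reassemble data from a recipe of chunk digests."""
        return b"".join(self.chunks[d] for d in recipe)

store = DedupStore()
r1 = store.write(b"A" * 8192)  # two identical 4 KB chunks
r2 = store.write(b"A" * 4096)  # the same chunk written again
assert store.read(r1) == b"A" * 8192
print(len(store.chunks))       # 1 -- all three logical chunks share one stored copy
```

Real systems layer compression and encryption beneath this and typically use variable-size, content-defined chunk boundaries so that inserts don't shift every subsequent chunk.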
Cohesity’s early competition came from Rubrik, which bills itself as “The Cloud Data Management Company” and raised $180 million in a 2017 round for a total of $292 million in funding. But larger companies are also getting into the game, including backup vendor Commvault with its HyperScale platform that Cisco also sells as ScaleProtect on the Cisco Unified Computing System.
Cohesity getting hyped for growth
In April 2017, Cohesity raised more than $90 million in a funding round that included investments from Google, Hewlett Packard Enterprise (HPE) and Cisco. Cohesity has raised more than $160 million in three funding rounds.
Cohesity doubled its workforce over the last year. It now has more than 300 employees, including 100 hired in the last few months, and plans to hire more. Its new San Jose, Calif. headquarters was built with room for growth, Lucas said.
One of the new hires is Lucas, who started recently as Cohesity’s first CMO after previous stints with Veritas Technologies and Cisco. Cohesity says she was hired to strengthen the leadership team, increase customers and accelerate company growth.
Cohesity also recently hired former NetApp president Rob Salmon as its first president and COO.
Strategy for the year ahead includes tackling all forms of secondary storage beyond data protection uses — such as test and development and analytics — and increasing adoption of cloud infrastructure, Rogers said.
Challenges include spreading the word about hyper-converged secondary storage. In addition, customers have more data but want to spend less.
“Hyper-converged secondary storage is a part of the enterprise data center that hasn’t been consolidated,” Rogers said, which presents an opportunity for Cohesity.
Acquisitions are not on the table, Rogers said.
“The horsepower here from an engineering talent perspective is really unbelievable,” Lucas said, citing founder and CEO Mohit Aron, who co-founded hyper-converged pioneer Nutanix. One of Aron’s goals is to converge data protection in a similar manner to the way Nutanix converges primary data.
Cohesity also has some important strategic partners, including Cisco and HPE, Rogers said. HPE, for example, is reselling pre-configured, scale-out Cohesity storage combined with HPE’s enterprise-class servers and network switches.
Aided by gains in cognitive analytics, cloud and storage, IBM revenue has finally returned to growth after nearly six years of declines.
IBM last week closed its 2017 fiscal year by posting quarterly revenue of $22.5 billion, up 4%. Revenue for the full year was flat at $79.1 billion. The vendor used the earnings call to formally introduce longtime IBM executive James Kavanaugh as its new CFO.
“This is … four quarters in a row (that) we’ve grown storage. And that’s been based on the great work our storage team has done (in) repositioning the portfolio, leveraging and growing share in flash. But it’s also about software-defined and also … object storage that will continue” to provide growth markets, Kavanaugh said.
Cloud initiatives generated IBM revenue of $17 billion, up 27% on a currency-adjusted basis. The revenue figure includes $9.3 billion from software-as-a-service offerings and nearly $8 billion in hardware, software and services to help companies build IBM-based private clouds. Sales of IBM storage hardware jumped 8%, although IBM does not break down revenue by individual storage products.
The overall results snapped IBM’s string of 23 consecutive quarters of year-over-year revenue declines. The return to growth stems from an IBM strategy shift intended to “significantly improve the trajectory” in burgeoning sectors during the past year, said Martin Schroeter, IBM senior vice president of global markets.
“Back in July, we planted the flag for our businesses and we pointed to an improved trajectory in the second half. As we look back on the year, we (were able to) significantly improve the trajectory in our revenue and our gross margin performance. We did this by ramping up our cloud and as-a-service offerings and by continuing to reinvent,” Schroeter said, pointing to the new IBM Z mainframe and AI-focused Power9 processor.
IBM revenue is broken into four segments:
- Cognitive Solutions: $5.4 billion, up 3%
- Global Business Services: $4.2 billion, up 1%
- Technology Services/Cloud: $9.2 billion, down 1%
- Systems (including storage): $3.3 billion, up 32%
A fifth segment, global financing, helps customers underwrite the sale of used IBM equipment. Those activities produced $450 million last quarter.
Schroeter said fourth-quarter cloud revenue of $5.5 billion was up 27% on a currency-adjusted basis. The IBM Systems group, which includes system hardware and operating system software, produced $3.3 billion, up 28%.
IBM storage revenue has been a bright spot of late. Fueled in part by surging demand for all-flash storage, the latest results mark the fourth consecutive quarter of storage growth. Revenue from the IBM Z mainframe rose more than 70%, thanks to the pervasive encryption included in the latest version of the product. Systems hardware sales overall jumped 35%, offset by flat OS software revenue.
“We gained share in a very competitive market while holding margins stable. We had double-digit growth in our high-end hardware products for the quarter, which reflects the demand for flash as well as the capacity increase linked to mainframe demand. Our all-flash array offerings once again grew at a strong double-digit rate and faster than the high-growth all-flash market,” Schroeter said.
Increased flash sales are linked to IBM Watson cognitive computing. Schroeter said IBM added more than 1,000 customers to its array of Watson-linked verticals, which include Watson Financial, Watson IoT and Watson Health. In addition, IBM and the Massachusetts Institute of Technology (MIT) formed the MIT-IBM Watson AI Lab to advance AI-based research.
Schroeter said IBM cloud and cognitive analytics are being integrated in more offerings as part of its Strategic Imperatives framework introduced in 2015, which now accounts for 46% of all IBM revenue.
Western Digital released a firmware update last year to address critical backdoor security vulnerabilities in its My Cloud NAS products, but the company this week acknowledged that more security issues with the devices still need to be addressed through firmware updates.
Western Digital addressed the My Cloud NAS security issues on certain models in a corporate blog post that was updated Tuesday. It stated that hackers could exploit default settings under several conditions: if they have access to the owner’s local network, if the owner has enabled Dashboard Cloud Access on certain models, or if the owner has enabled additional port forwarding to the My Cloud device.
“To mitigate the issue, we strongly recommend that My Cloud owners who have made such changes disable the Dashboard Cloud Access and ensure their router and My Cloud device are secure by disabling additional port-forwarding functionalities,” the blog post update said. “All affected My Cloud owners should restrict local network guest access only to people they trust. We are working on a firmware update for this issue and will make it available on our support download site as soon as possible.”
The NAS devices are popular among home users and small businesses. Affected models include My Cloud, My Cloud Mirror, My Cloud Gen2, My Cloud PR2100, My Cloud PR4100, My Cloud EX2 Ultra, My Cloud EX2 and EX4, and My Cloud EX2100, EX4100, DL2100 and DL4100.
James Bercegay, a Gulftech researcher, alerted Western Digital to two NAS security flaws in June 2017. One flaw left My Cloud devices vulnerable to unrestricted file uploads via the multi_uploadify.php script, which was protected “with faulty logic.”
“This gives you root access to the box,” Bercegay said in an interview Wednesday. “As the file is uploading, it gets written to the disk with the permissions of root. This was code left in there by accident. … It makes a request through a non-existent file name of ‘mydlink.cgi.’ You can load up any file you want.”
Bercegay said he also discovered the NAS products have a hardcoded backdoor vulnerability via a single file called NAS_Sharing.cgi, which bad actors can use to gain control of the system to steal data and spread malware. The backdoor vulnerability gives access just by using the username “mydlinkBRionyg” and password “abc12345cba.”
“It gives you complete control,” he said. “You are the ultimate, super user on the device. It means you are God on that machine.”
Western Digital, one of the largest hard drive manufacturers in the world, requested the standard 90-day grace period to deal with the issues before disclosure, but it took the company six months to release firmware update v2.30.172, which addressed the remote access bugs.
“The triviality of exploiting this issue makes it very dangerous and even wormable,” Bercegay wrote in a Jan. 4 report on the Gulftech website. “Not only that, but users locked to the LAN are not safe either. An attacker could literally take over your WD My Cloud by just having you visit a website where an embedded iframe or img tag makes a request to the vulnerable device using one of the many predictable default hostnames for the WD My Cloud, such as ‘wdmycloud’ and ‘wdmycloudmirror.’”
Western Digital responded to an interview inquiry regarding the NAS security issues with an email:
“Minor issues are being addressed in future updates,” the company stated in the email. “Additionally, the My Cloud Home model architecturally is designed new from the ground up. We are not aware of any vulnerability to the security issues listed in the respective reports.”
Bercegay said companies should be able to promptly respond to security issues, but sometimes are slow to do so.
“It does not take long to do these things,” he said. “They just don’t prioritize it. There is a lot of bureaucracy and red tape, especially when it comes to security. (These problems happen in the first place) because of sloppy coding. It’s like 1999 all over again.”
Western Digital received a Pwnie Award in 2016 as the vendor that responded most poorly to a security issue.
Users of Dell EMC data protection are being urged to quickly patch three security flaws that could hijack Avamar-based products.
The vulnerabilities revolve around Avamar Installation Manager, a common component in Dell EMC Avamar Server, NetWorker Virtual Edition and Dell EMC’s Integrated Data Protection Appliance (IDPA).
The Dell EMC data protection vulnerabilities were discovered by Digital Defense Inc. (DDI), a San Antonio, Texas, firm that performs vulnerability assessments and penetration tests on behalf of customers in financial services and other regulated industries.
DDI said that, used in combination, the zero-day exploits could allow unauthorized users to modify the configuration file and gain root access to Dell EMC backup copies. Security fixes are available for download from Dell EMC to credentialed enterprise customers.
Dell EMC Avamar data protection also powers VMware vSphere Data Protection. VMware has issued a security advisory.
The vulnerability considered most serious is an authentication bypass vector. It potentially allows a hacker to obtain an authenticated session via a basic POST request to the Avamar server. No specific knowledge of the targeted Avamar server, such as user credentials or passwords, is required to generate a session ID.
Other identified vulnerabilities include bugs that allow authenticated users to download or upload arbitrary files with root access. Used in combination, the three Avamar-related security holes could fully compromise Dell EMC data protection systems.
DDI alerted Dell EMC to the findings late last year. The two vendors do not have a contractual relationship. Mike Cotton, DDI’s senior vice president of engineering, said the firm downloaded a virtual instance of Avamar Virtual Edition 7.4 as part of routine bench testing.
“We started looking under the covers for any vectors we could use against the remote appliance,” Cotton said.
Dell EMC notified customers of the software fixes on Friday. Spokesman Kevin Kempskie said no Dell EMC data protection systems are known to have been affected.
EMC acquired Avamar in 2006, primarily for its data deduplication technology. Dell EMC NetWorker Virtual Edition is a software platform that backs up data on multiple operating systems to a variety of targets. Dell EMC IDPA marked the vendor’s first integrated disk-based hardware appliance, a departure from selling its backup software on Data Domain appliances.
As businesses look to increasingly digitize their information, data management and data access are top concerns, according to a recent survey.
“You can’t be a digital business if you don’t have your arms around data,” said Bill Wohl, chief communications officer at Commvault, which produced the survey in partnership with Quadrant Strategies.
Two-thirds of respondents estimated that their company has access to half of its data or less, according to the survey of 1,200 IT executives and personnel in Europe, Asia and North America. Wohl said that lack of access stood out and is a hindrance to digital transformation planning.
Fifty-one percent of IT executives and personnel said “better data collection and management” is essential to their company’s success over the next few years, according to the survey. Forty-nine percent said “new tools to analyze increasingly sophisticated data” is essential.
Ransomware, GDPR increase data management urgency
The rise in ransomware is moving data management planning to the front of the line.
“We’ve moved from a period where the risk of an attack is a question. It’s going to happen,” Wohl said. “It requires a determined effort on the part of IT management.”
Only 31% of IT executives and personnel said their company is well-prepared for ransomware, according to the survey. Thirty-four percent said they are ready for “a project that involves bringing together all the company’s data.”
Getting data under common management, which helps recovery from ransomware, is one of Wohl’s recommendations for digital transformation planning. Though the transformation process can seem overwhelming, bringing disparate data sets under common management — so an organization has a complete picture of the business — is an important first step.
The European Union’s General Data Protection Regulation, which goes into effect in May, is also propelling data management to the forefront. Forty-seven percent of IT executives and personnel said their company is well-prepared for implementation of GDPR.
How to get ahead with your data
Wohl also suggested narrowing the focus of IT and decreasing the number of vendors used. A typical organization may have multiple backup processes in place, for example.
Automating data management is another important step in digital transformation planning. Automation and vendor consolidation free up staff time for more forward-looking work on advancement and innovation.
Wohl pointed to insurance and investment firm Great-West Financial as an example of an organization undergoing a successful digital transformation. As part of its planning, the company is moving away from old business models, shifting customer interactions to the web and mobile devices instead of the paperwork historically prevalent in the industry.
According to the survey, though, IT executives and personnel are more focused on day-to-day operations than innovation. Forty-nine percent of IT executives said running the company network is a high priority, while 34% said understanding and preparing for innovation in the industry is a top concern.
And IT executives are more confident than other personnel that their departments are ready for digital transformation planning. For example, 41% of executives said they are comfortable their IT department can successfully understand and prepare for innovation in the industry, while 29% of personnel said the same. Thirty-nine percent of executives said IT can successfully use data science for big data analytics, while 29% of personnel said the same.
Recent IDC research shows hyper-converged platforms are quickly catching up to the converged systems market.
For the third quarter of 2017, global revenue from converged systems increased nearly 11% to $2.99 billion, according to the latest IDC Tracker market data. Most of that growth came from hyper-converged systems revenue, which jumped 68% year over year, from $597 million in the third quarter of 2016 to $1 billion last quarter.
Speeding the transition was a wave of consolidation and new entrants, including the first full operating year for the merged Dell EMC and NetApp’s foray into HCI storage based on SolidFire all-flash arrays.
IDC defines converged systems as those that combine servers, disk storage, networking and management software. The IT analyst firm divides the converged systems market into three segments: hyper-converged infrastructure (HCI), integrated platforms, and vendor-validated reference architectures for private clouds. IDC Tracker focuses on systems that are branded, certified, designed and supported by vendors. Systems implemented through integrators or partners are excluded.
Hyper-converged platforms consolidate computing, networking and storage in a single virtualized appliance. Dell EMC overtook HCI pioneer Nutanix, generating $307 million in revenue, up 158%, to improve its market share to 30.6%, up from 20% a year ago.
The Dell EMC HCI storage portfolio includes the VxRail appliance, the rack-scale VxRack system and XC Series family. The XC Series bundles Nutanix software on Dell PowerEdge servers. That partnership started in 2014.
Nutanix captured nearly 21% of the market, down nearly a percentage point year over year, although its revenue grew 61% to $207.4 million. Dell EMC and Nutanix combined to capture 51% of overall HCI sales.
Cisco muscled past Hewlett Packard Enterprise to claim third place. The network hardware vendor has made strides since entering the market in 2016 with its HyperFlex hyper-converged systems. After posting $10.2 million in HCI revenue a year ago, Cisco HyperFlex sales skyrocketed 545% to $65.7 million, good for 6.6% market share. The Cisco hyper-converged platforms are sold as a minimum three-node cluster of HyperFlex HX220c M5 all-flash or hybrid nodes integrated by Cisco UCS fabric interconnects.
HPE’s HCI storage system revenue grew 144%, with third-quarter sales of $35.6 million, up from $14.6 million a year ago. The vendor last year launched the HPE HyperConverged 380 appliance to go after VMware customers, and followed that by acquiring early Nutanix rival SimpliVity in January for $650 million.
IDC’s figures suggest there is room for other HCI storage competitors to stake a claim in the market. Nearly 39% of the HCI sales were attributed to other vendors, totaling more than $386 million.
Dell EMC, NetApp battle on converged reference designs
NetApp was late to the HCI market, making its foray in October with a hyper-converged platform based on SolidFire all-flash arrays. But according to IDC, NetApp is the only top-five storage vendor to post positive revenue growth in certified reference systems and integrated architecture. That market grew just 1.55% year over year, to $1.44 billion.
The NetApp FlexPod converged infrastructure program combines NetApp storage with Cisco UCS servers and switching. Cisco-NetApp reference architecture produced $486 million, representing one-year revenue growth of 56%.
NetApp improved its share in converged systems reference architecture to 33.6%, up from 22%. Dell EMC still leads the pack with $697 million, but its sales and market share declined. Dell EMC controls 48% of the market for converged systems reference architecture, down from 51% last year, with category revenue slipping 4% from $729 million a year earlier. HPE experienced a larger drop, with sales tumbling 42% to $149 million.
IDC said integrated platform sales comprise about 18% of the converged systems market. Oracle dominated the segment with $20.4 million and nearly 46% market share.
Here’s something you rarely hear from high tech companies today:
“We’re a hardware company in our heart and soul.”
That is how executive vice president Brett Davis introduced iXsystems during a press tour in early December at the NAS vendor’s San Jose, Calif., headquarters. The company sells open source-based TrueNAS enterprise hardware and FreeNAS desktop systems.
Self-identifying as a hardware company in this software-defined world is only one reason why iXsystems seems out of place in Silicon Valley. The vendor also bootstrapped its financing, turning a profit without accepting outside investment.
“We’re private, profitable and self-funded,” Davis said. “Our heritage goes back to the ‘90s. We’ve just been here. We say we’re unique, but you can say we’re weird.”
But it’s the hardware tag that provides most of the weirdness these days. The iXsystems headquarters includes a manufacturing facility in the back, where Davis said the 130-person company can fulfill 3,000 orders in a day.
The vendor claims more than 4,000 customers, including Sony, NBC, Duke University and NASA.
But the iXsystems strategy of bundling open-source storage software on commodity hardware isn’t that unusual. Plenty of others do that and label it software-defined storage. Only iXsystems boasts that it’s a hardware company, even though its value comes from open-source projects.
The company’s roots date to Berkeley Software Design Inc. (BSDi), which started in 1991. iXsystems founders Mike Lauth and Matt Olander acquired the hardware business from BSDi in 2002. Lauth has been CEO and Olander CTO since then. From the start, iXsystems was heavily involved in the FreeBSD project, and it leads the FreeNAS storage and TrueOS desktop open-source operating system projects.
Davis said 70% of iXsystems appliances are custom configurations. The vendor uses Intel, AMD and ARM chips inside. The systems support VMware, Microsoft Hyper-V, Citrix, KVM and OpenStack virtualization software, Hadoop, Docker and MySQL data and container platforms, and FreeNAS, FreeBSD, CentOS, Red Hat Linux and Ubuntu open source software.
“We’ve been doing open source since way before it was cool,” Davis said. “We give away the number one software-defined storage (FreeNAS), but software and hardware are inseparable.”
iXsystems re-sold storage systems from Dot Hill, Infortrend and others in the late 2000s, but had to rely on those vendors for support. Now iXsystems provides end-to-end support for its storage. The company acquired the FreeNAS project in late 2009, and then spent two years re-writing the operating system before making it commercially available. iXsystems ported the OpenZFS open-source enterprise file system to FreeNAS and TrueNAS. That gave FreeNAS and TrueNAS file, block and object support, triple parity RAID, support for flash and unlimited instant snapshots.
FreeNAS and TrueNAS are both unified storage systems, and TrueNAS also supports Fibre Channel networking. TrueNAS competes with the likes of Dell EMC VNXe and Unity, NetApp FAS, Hewlett Packard Enterprise’s MSA 2040 and Nimble arrays, and Western Digital’s Tegile platform.
“We can run virtual machines and containers in FreeNAS,” Davis said. “We have the capabilities to do it, and we have our own hypervisor. But it’s a competitive space, and we have other plans.”
HyTrust has revived DataGravity’s data-aware storage technology, six months after scooping up the startup’s assets.
Cloud infrastructure specialist HyTrust today launched the CloudAdvisor automated framework to detect, classify and protect compliance-sensitive data in multiple clouds and software-defined data centers. The CloudAdvisor virtual appliance integrates DataGravity’s data analytics and data tagging.
HyTrust CloudAdvisor continuously monitors content and notifies users of suspicious activities. It scans and classifies data stored in multiple clouds and physical data centers based on the data’s value. Automated policy enforcement provides data protection.
Target customers include data centers with a high volume of unstructured data storage and firms in regulated industries.
HyTrust CEO Eric Chiu said DataGravity’s automated data classification and data discovery provide key ingredients in CloudAdvisor.
“We took the DataGravity technology and repackaged it as a software appliance, starting with things like virtual machines and backup copies. Our goal is to define, detect and defend unstructured data, which is proliferating to the point that companies really don’t know where it exists,” Chiu said.
Chiu said one HyTrust customer lacked specific insight into the type of data in its files.
“We did a scan of their VMs and found about 10,000 credit-card numbers and Social Security numbers in a public share,” he said. “The customer had no idea they were just sitting there. I think that’s going to be par for the course” as companies store increasingly vast data sets.
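A scan like the one Chiu describes reduces to pattern matching over file contents. The sketch below is a deliberately naive illustration, not HyTrust's scanner; real classifiers validate card numbers with the Luhn checksum and weigh surrounding context to cut false positives.

```python
import re

# Simplistic patterns for illustration only -- these will over-match in practice.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),  # 16 digits, optional separators
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US Social Security format
}

def scan(text):
    """Return a count of matches for each sensitive-data pattern in text."""
    return {name: len(pattern.findall(text)) for name, pattern in PATTERNS.items()}

sample = "card: 4111 1111 1111 1111, ssn: 123-45-6789"
print(scan(sample))  # {'credit_card': 1, 'ssn': 1}
```

A production tool would run this kind of matcher across mounted VM disks and file shares, then feed the hit counts into classification and policy enforcement.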
Using the cloud as a storage tier took a few years of getting used to, but companies have decided to use multiple hybrid clouds to reduce storage costs and boost disaster recovery. The multi-cloud approach is expected to gain further acceptance in response to the General Data Protection Regulation, which takes effect in European Union countries in 2018.
Ex-EqualLogic executives Paula Long and John Joseph launched DataGravity in 2014. The company emerged from stealth with data-aware Discovery Series hybrid arrays that combined metadata analytics with advanced data discovery, indexing and governance.
Long and Joseph dropped the hardware arrays in favor of a software-defined storage business model in 2015, but the move came too late. By that time, artificial intelligence, cognitive computing and machine learning were already taking hold. HyTrust swooped in to pick up DataGravity technology in an asset sale in June.
Flash storage vendor Tintri has had an inauspicious start to life as a public company. On Wednesday, during only its second earnings call, there were hints that Tintri may already be on life support.
Tintri added 80 new customer logos last quarter, but that won’t allay fears that the newly minted public company will struggle to outrun mounting losses. Despite recent restructuring, including cutting about 80 jobs last quarter, Tintri is rapidly burning through cash and will need supplemental capital to stay afloat.
CEO Ken Klein said all options are on the table, including entertaining potential buyers. “We are exploring strategic options available to the company to deliver value to shareholders, including the sale of the company” and further optimizations to hasten positive cash flow, Klein said.
Investors predictably pulled back on the news, with Tintri shares tumbling 13% at Thursday’s open to $4.50.
Tintri’s adjusted loss per share of 79 cents was within its guidance, but quarter-over-quarter revenue fell 6% to $31.77 million. Tintri had forecast $36.5 million in revenue for the quarter. Tintri’s year-to-date revenue of $97 million is up 15% year over year.
Tintri carries an accumulated deficit of $439.2 million, up $101 million from last quarter. The revenue miss was blamed on “delayed and reduced purchases” of Tintri cloud storage gear. Tintri booked, but did not close, a number of deals ranging from $400,000 to $1 million.
“Our third quarter revenue was impacted by continued headwinds we have faced since our IPO in June,” Klein said.
Those deals remain in the pipeline, with “cautious” revenue guidance next quarter in the range of $25 million to $27 million, Tintri CFO Ian Halifax said.
Tintri cloud arrays package flash storage and web-scale software to design private cloud infrastructure that mimics the performance of the public cloud. Tintri Enterprise Cloud Series 6000 arrays, rolled out in September, are gaining traction with users, Klein said. The EC 6000 accounted for half of the vendor’s $22.8 million in product revenue last quarter. The vendor also rolled out the T1000 platform for remote branch office and departmental deployments.
Software sales accounted for 16% of total product revenue ($3.64 million), up 2%. The increase was fueled in part by the launch of Tintri Cloud Connector, an S3-compatible integration that allows customers to tie local Tintri storage to Amazon Web Services and IBM Cloud Object Storage. Tintri also includes predictive analytics for sizing compute and storage.
What will Tintri do next?
At the time of its initial public offering in June, Tintri claimed its revenue soared 150% between 2015 and 2017, although debt and expenses kept the balance sheet in the red. Tintri’s IPO also came at a time when investors started to sour on storage infrastructure equities.
Lukewarm interest forced Tintri to revise its share price from $11 to $7 per share. It wound up netting proceeds of about $60 million, slightly less than half its initial $109 million target.
The Tintri cloud hardware model faces a changing competitive landscape. More and more hardware-centric vendors are shifting to software-defined storage services. Klein said Tintri plans to stay the course for now.
“The feedback from our customers, particularly the use cases that we are addressing, is that we have the right model,” Klein said.
Toshiba plans to deliver storage node software designed to extend the high performance and low latency benefits of NVMe-based solid-state drives over a network fabric.
Toshiba Memory America’s non-volatile memory express over fabrics (NVMe-oF) target software is due in the first quarter of 2018. The University of New Hampshire Interoperability Lab recently certified the unnamed Toshiba storage software with RDMA over Converged Ethernet (RoCE) network interface cards (NICs) in the storage node.
The Toshiba software runs in a target storage server and virtualizes the NVMe-based PCI Express (PCIe) solid-state drives (SSDs) in the box into a single pool, according to Joel Dedrick, a system architect for NVMe-oF at Toshiba. He said the software would enable popular data center orchestration systems to provision the NVMe SSDs and manage the drives, their wear and various other functions.
“Our goal here is to make the world a better place for NVMe,” he said.
Vendors such as E8 Storage, Mangstor and Vexata bundle software on NVMe hardware, but few standalone software applications for NVMe exist. Toshiba’s software competition will include startup Excelero, whose NVMesh product also virtualizes and pools NVMe-based SSDs and aims to enable applications to access them at high speed and low latency over a network fabric. Excelero cites its patented Remote Direct Drive Access (RDDA) technology as a performance differentiator. Dedrick said Toshiba’s storage node software takes a different architectural approach, and the company’s expertise in managing physical flash will also set its product apart.
Impetus for Toshiba storage software
Dedrick said Toshiba decided to get into the storage software business because it views NVMe over Fabrics as “an enormously important development” that will spur data centers to convert from traditional SCSI to latency-lowering NVMe to transfer data between clients and storage devices.
SCSI was designed with hard disk drives (HDDs) in mind, and newer NVMe targets faster solid-state storage, using a more streamlined command set to process I/O requests. NVMe requires less than half the number of CPU instructions that SCSI does with SAS drives. NVMe also supports 64,000 commands in a single message queue and as many as 65,535 I/O queues, whereas a SAS device typically supports a maximum of 256 commands in one queue.
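Taken at face value, those spec limits imply an enormous gap in outstanding-command capacity between the two protocols. The arithmetic below is purely illustrative; real controllers expose far fewer queues than the maximum, typically one pair per CPU core.

```python
# Illustrative arithmetic from the queue limits cited above.
nvme_max_queues = 65_535   # maximum I/O queues per NVMe controller
nvme_queue_depth = 64_000  # commands per queue, per the figure above
sas_queue_depth = 256      # typical single-queue command limit for a SAS device

nvme_total = nvme_max_queues * nvme_queue_depth
print(f"NVMe outstanding commands: {nvme_total:,}")        # 4,194,240,000
print(f"Ratio vs. SAS: {nvme_total // sas_queue_depth:,}x")  # 16,383,750x
```

The per-core queue pairs are what let NVMe avoid the lock contention of a single shared SCSI queue, which is where much of the latency saving comes from.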
NVMe over Fabrics is designed to extend the latency-lowering, performance-enhancing advantages of NVMe over a network fabric. Toshiba recommends 100 Gigabit Ethernet for deployments with multiple NVMe-based SSDs in the storage node, although no minimum speed is required.
“The larger the number of drives that you aggregate in a single place, the bigger the network pipe is going to want to be going in and out of there,” Dedrick said.
The new Toshiba storage node software will target enterprises with high-performance databases, according to Dedrick.
Dedrick said Toshiba plans to test and certify servers that run its storage node software, and potential enterprise customers could request it through their OEMs. He said Toshiba also plans to license the software to ODM/OEM partners, which could sell it as a value-added option for their standard offerings.
Toshiba did not disclose pricing for the storage node software.