Storage Soup

November 6, 2018  4:55 PM

Maxta aims to bring AI to HCI with MxIQ

Dave Raffo

Maxta is bringing predictive analytics into hyper-convergence.

Maxta Hyperconvergence software runs on x86 hardware using VMware or Red Hat Enterprise Virtualization hypervisors to create hyper-converged clusters. Today it officially launched MxIQ analytics, which is designed to work similarly to the predictive analytics that have become popular on storage arrays.

Unlike performance analytics running on storage arrays, though, MxIQ analyzes logs on servers, hypervisors and networking devices as well as storage.

“We heard from partners and customers that they were flying blind, they didn’t know when they were running out of capacity or performance,” Maxta VP of product management Kiran Sreenivasamurthy said. “They did not have hard data to understand the behavior of a cluster.”

MxIQ is a free feature built into Maxta Hyperconvergence software, but Sreenivasamurthy said Maxta will consider charging for advanced features planned in future releases. For now, all new Maxta customers and those who upgrade to the latest software version (3.4.1) have access to Maxta MxIQ.

“In the future, we’re looking at applying changes at customers’ sites using recommendations based on machine learning,” Sreenivasamurthy said. “We’ll react to changes that we see and anomalies that we detect.”

Maxta MxIQ looks at compute, storage, network and virtualization under management of Maxta software to determine and forecast system health and availability. Sreenivasamurthy said MxIQ can tell customers if a drive is close to failing, whether new components are compatible with existing hardware, and if performance issues are caused by storage, compute or networking. The software sends capacity and availability alerts and shows usage trends. MxIQ uses statistics from its entire customer base to make its forecasts.

Maxta MxIQ consists of software that runs on AWS or in a customer’s data center and agents that install on all servers in a cluster. The agents collect information on the servers and send it to the software running on AWS or in the data center.

Sreenivasamurthy said customers can opt out of sharing their information from one cluster, an individual server or from any of their servers. He said at least 10 customers have used the software during its early release program.

MxIQ runs in active mode on one server in each cluster and in passive mode on the other servers. If the active server fails, one of the passive servers in that cluster becomes active.
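The active/passive arrangement described above can be sketched in a few lines. This is a minimal illustration of the general pattern (one active reporting agent per cluster, with promotion of a surviving passive agent on failure); the class names and election logic are assumptions for illustration, not Maxta's implementation.

```python
# Hypothetical sketch of an active/passive agent failover pattern.
# Names and the promotion rule are illustrative assumptions.

class ClusterAgent:
    def __init__(self, name):
        self.name = name
        self.alive = True
        self.active = False

class AgentCluster:
    """Keeps exactly one live agent in active mode; the rest stay passive."""
    def __init__(self, names):
        self.agents = [ClusterAgent(n) for n in names]
        self.agents[0].active = True  # first agent starts as the active reporter

    def active_agent(self):
        # Return the current active agent, or None if it has failed.
        return next((a for a in self.agents if a.active and a.alive), None)

    def health_check(self):
        # If the active agent has failed, promote the first surviving passive agent.
        if self.active_agent() is None:
            for agent in self.agents:
                if agent.alive:
                    agent.active = True
                    return agent
        return self.active_agent()

cluster = AgentCluster(["server-1", "server-2", "server-3"])
cluster.agents[0].alive = False  # simulate failure of the active server
promoted = cluster.health_check()
print(promoted.name)  # server-2 takes over as active
```

In a real deployment the surviving agents would need to agree on the promotion (for example via a quorum), but the single-active-reporter shape is the same.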

MxIQ has three levels of user privilege – customer, partner and admin. Partners are service providers or other Maxta partners who sell its hyper-converged software. Maxta MxIQ shows them all of their end-user customer clusters. The admin is the person supporting the software, either the end customer or partner.

November 1, 2018  9:36 AM

Nutanix shines Beam into private cloud

Dave Raffo

Nutanix is extending the Beam capability it launched six months ago for Amazon Web Services and Microsoft Azure into private clouds running on Nutanix hyper-converged infrastructure.

Nutanix Beam is a software-as-a-service (SaaS) offering that gives Nutanix HCI customers visibility into their costs across public clouds and now on-premises deployments.  Beam serves as a multi-cloud management dashboard that shows how applications are consuming cloud resources. Its new release also adds the ability to analyze cloud consumption trends to help plan future spending, and show customers the cost of Nutanix clusters and software licenses allocated to each cluster.

Greg Smith, Nutanix VP of product marketing, called the Beam extension “an important milestone for Nutanix in building our product portfolio.” He said the vendor intended to extend the reach of Beam from the start, and the API-driven application can eventually expand to more clouds such as Google as well as non-Nutanix on-prem systems.

“This is consistent with our vision and product strategy to provide the framework customers need to manage resources across multiple clouds,” he said.

Smith said early Nutanix Beam users have expressed interest in extending it to on-premises appliances.

“We have a lot of requests for this,” he said. “It’s one thing we need to do to sell aggressively and our customers need to deploy aggressively into larger Nutanix environments.  They wanted that integration. Most customers are using Nutanix in a private cloud but also have applications running in public clouds. What they wanted was a global view.”

Smith said the new functionality for Beam is in early access beginning today and is expected to be generally available within a few weeks.

Nutanix picked up Beam technology from its March acquisition of Minjar. Minjar sold Beam as its Botmetric service for AWS and Azure.

Nutanix unveiled Beam at the Nutanix .NEXT customer show in May, along with Nutanix Flow networking and Nutanix Era application management.

October 31, 2018  2:39 PM

Pivot3 spins larger HCI deals

Dave Raffo

If Pivot3 is a valid indicator, the hyper-converged market is not only adding customers rapidly, it’s also moving into far bigger deals.

Pivot3 was among the first hyper-converged vendors, and the private company now competes with the largest storage and server vendors. Pivot3 claims its third-quarter bookings grew 50% over the same period last year. More impressively, its average deal size in the quarter nearly doubled from the 2017 average. Pivot3 said its deal size increased 95%, as 86% of new bookings came from large enterprises and more than 70% of new customers deployed multiple workloads on Pivot3 Acuity HCI products.

Bruce Milne, Pivot3’s chief marketing officer, said the vendor last quarter landed its biggest deal ever, a federal government contract. Milne said Pivot3’s background selling storage for video surveillance is paying off, as cities around the world are buying its HCI systems to deploy analytics and applications such as facial and license plate recognition software.

“More people are deploying an entire platform as opposed to a single application,” Milne said. “As market acceptance of hyper-convergence accelerates, larger companies are also realizing they’ve been missing out.”

Milne said Pivot3 has more than 2,500 customers. It has raised $253 million in venture funding, but none since early 2016.

Milne said Pivot3 is “making good progress” towards profitability but remains in growth mode. He said Pivot3’s investors are happy with its progress, and the company is adopting good business practices. He said Pivot3 is aiming for profitability in late 2019. “We won’t predict a specific quarter [to hit break-even], so we can continue to invest in innovation,” Milne said. “We’re competing with big vendors and we don’t have a huge amount of pressure to become profitable. We want to do it for our own purposes. We’ve flattened our spending and increased bookings. It’s a good trajectory to be on.”

Milne said Pivot3 will add business policy management capabilities to its Acuity product by the end of 2018, extending its ability to manage data services and applications on the platform. He said Pivot3 will also enable the ability to move workloads to and from public clouds, the way it does now for data.

That follows a few hardware releases in recent months. Pivot3 in August released a ruggedized system aimed primarily at military and intelligence organizations in the field, and formalized a partnership with Lenovo to sell its HCI software on ThinkSystem servers.

Milne said a large deal with the city of Bogota, Colombia, involving Pivot3 and Lenovo helped bring about the formal partnership.

“That cemented an opportunity for smart cities, internet of things and edge computing with Lenovo, and was the catalyst for a larger joint go-to-market deal,” he said.

October 31, 2018  8:50 AM

Commvault staggers under weight of ‘massive changes’

Dave Raffo

While going through a transition period, data protection vendor Commvault’s sales are still taking a hit.

Commvault’s revenue for last quarter of $169.1 million fell $10 million below Wall Street expectations based on the vendor’s previous forecast. Overall revenue decreased four percent from the previous quarter and ticked up just one percent over the same quarter last year. Product revenue of $69.5 million decreased seven percent from the previous quarter and dropped three percent year-over-year.

Commvault did post an $891,000 profit after losing more than $1 million last year, but that was due largely to staff cuts and reduced spending. Commvault reduced its work force by around seven percent over the last six months, and finished September with 2,644 employees.

Commvault has failed to hit its revenue goals for four straight quarters. The company is searching for a new CEO to replace Bob Hammer, who is stepping down following criticism from activist investor Elliott Management earlier this year. It is also shifting to a subscription pricing model, and it revamped its product lineup this year to try to make it simpler to sell all its various data protection and management products. All of this is part of the Commvault Advance initiative adopted after Elliott’s stinging critique of company management. That includes a re-branding of products under a Commvault Complete program.

Hammer, who plans to remain on the Commvault board after he steps down as CEO, had little to say about the search for his replacement. He said the CEO search is “well under way. The search committee is making good progress.”

When an analyst on the call pressed him for more of an update, Hammer repeated “the search committee is making very good progress on the CEO search.”

Commvault never put a timeframe on its CEO search, but industry sources say the board hoped to have a replacement before this month’s Commvault GO user conference. Now, sources say, Commvault may not have a replacement for Hammer before the end of 2018.

Bill Wohl, Commvault’s chief communications officer, said the vendor never set a goal to have a new CEO by GO. Wohl also said Commvault Advance was planned long before management received Elliott’s letter, although the plan went into place several weeks after Elliott made the letter public.

“We are exactly where we planned to be as far as the CEO goes,” Wohl said.

Commvault’s transition comes as its largest rivals, Veritas and Dell EMC, come out of corporate restructuring of their own. Commvault also faces formidable well-funded startups Cohesity and Rubrik, and Veeam is making inroads in its move to the enterprise.

Commvault is going full speed ahead with its Commvault Advance and Commvault Complete programs, though. Hammer and CFO Brian Carolan frequently used the word “disruption” Tuesday on the company’s earnings call but claimed Commvault is headed in the right direction.

“While there was a higher level of disruption than we had anticipated, the most significant changes are now largely completed and we are focused on go-forward execution,” Hammer said.

Commvault executives spent much of the earnings call walking through the long-term financial benefits of moving to subscription pricing. They characterized subscription pricing as “repeatable” revenue compared to its historic perpetual licensing.

“While we are not satisfied with our revenue performance, we are seeing strong early momentum from our Commvault Advance initiatives, and are excited about our accelerating subscription revenue,” Hammer said.

Commvault forecast approximately $181 million in revenue for this quarter and $189 million next quarter, putting its fiscal year total at $715 million. Those numbers are based on software sales of approximately $82 million this quarter and $86.5 million next quarter. Hammer characterized the guidance as a “conservative near-term outlook” that represents modest year-over-year growth. He said Commvault’s goal is to grow revenue nine percent next year.

“The whole foundation of Commvault Complete was not to try to make band-aid changes,” Hammer said. “It makes fundamental changes in our products, pricing, routes to market, and a much more efficient cost structure. So internally, there is a lot of optimism. I really think we’ve done this the right way, although it had some intended risks as we made these massive changes.”

October 30, 2018  2:04 PM

Veeam CEO shakeup: Co-founder in, McKay out

Paul Crocetti

Veeam Software has reshuffled its executive team, promoting co-founder Andrei Baronov to CEO, while Peter McKay is no longer with the data protection company after 2 ½ years as an executive, including more than a year as co-CEO and president.

In addition to the Veeam CEO switch, co-founder Ratmir Timashev is now executive vice president of worldwide sales and marketing, and William Largent is executive vice president of operations. Both Timashev and Largent will report to Baronov, according to Veeam.

“That was Peter McKay’s decision, driven by a desire to pursue some other opportunities,” Timashev said Tuesday about McKay’s departure, shortly after Veeam announced the changes publicly. “It was a little bit of a surprise to me.”

McKay joined the company as COO and president in July 2016, following a stint in management at VMware. He had shared Veeam CEO duties with Baronov since May 2017.

As co-CEOs, McKay led Veeam’s “go-to-market,” finance and human resources functions, while Baronov oversaw research and development, as well as product management.

Timashev said he and Largent will split McKay’s work. Timashev will focus on sales and marketing, while Largent will work in finance, corporate governance and human resources.

Earlier this month, Veeam said the third quarter was its 41st consecutive quarter of double-digit growth. Veeam previously reported $827 million in annual bookings revenue in 2017, an increase of 36% year over year. Veeam claims 320,000 customers and aims to become a billion-dollar company.


In September, Veeam said human error caused a database with 4.5 million unique email addresses to be accessible to third parties for two weeks. McKay apologized for the breach. Timashev said the issue has been resolved and McKay’s departure was not related to the breach.

Veeam was founded in 2006 as a virtual backup provider, but has since added physical and cloud protection, as well as data management capabilities. It has also expanded its customer base by taking on more enterprise customers.

“I’m very happy with the progress we’ve made,” especially on the enterprise front, Timashev said.

Timashev said he does not expect major changes in company strategy following the Veeam CEO shift.

On his LinkedIn page, McKay said he was honored to lead Veeam and influence its customers, partners and employees.

“As I enter the next phase of my career, I’m going to miss being a part of this exciting team, but I’m also looking forward to the next stage,” McKay wrote in a LinkedIn post Tuesday. “Oftentimes as we drive towards a goal, we can lose touch with those who are the most important in our lives. I’m looking forward to the chance to focus on some downtime to reenergize my body as well as spend quality time with my family and friends who have been patient with me over the years.”

Timashev was Veeam CEO from 2006 to 2016, when Largent replaced him. McKay and Baronov became co-CEOs a year later, when Largent moved into the role of chairman of the Finance and Compensation Committees.

Timashev remained active in Veeam after stepping down as CEO, most recently serving as senior vice president of marketing and corporate development.

Check back with SearchDataBackup for more on the Veeam CEO news.

October 30, 2018  7:28 AM

NetApp Insight: Yes, NetApp still sells hardware

Dave Raffo
NetApp, Storage

At the end of the final keynote at NetApp Insight last week, founder Dave Hitz decided to add some sanity lest people get the wrong idea about the vendor’s strategy.

Hitz summed up the message at Insight to that point as “Cloud, cloud, cloud, cloud.”

“You could be forgiven at the end of Day One to think, ‘Do they even think about on-prem anymore?'” he asked.

Hitz said a customer asked him, “How long before you think NetApp stops selling hardware?” He replied: “Dude, that is not the goal. We believe if we can help people get the data into the cloud that they want in the cloud, we will gain share on-prem. They’ll buy our on-prem gear because we make it easy to get to the cloud and back from the cloud. They want to be able to manage back and forth. It’s not our goal to stop selling on-prem.”

Hitz, who helped start NetApp in 1992, summed up its current strategy as, “not your grandfather’s NetApp.”

That is by design. NetApp CEO George Kurian said the vendor must adapt to keep up with the changing technology landscape. NetApp still derives an overwhelming majority of its revenue from on-premise storage but pushes a Data Fabric that includes a wide variety of cloud storage. The goal is to enable identical management of storage in the cloud and on-premises.

“Business as usual in IT will not enable success during this digital transformation,” Kurian said. “This transformation requires you to change the way the business and IT work together.”

Joel Reich, who has Kurian’s old job as executive vice president of product operations, said in the early days of NetApp’s Data Fabric focus, “people thought we were crazy, embracing the thing [cloud] that would kill us. They thought it was a crazy strategy. But we’ve re-positioned NetApp around the Data Fabric, telling a distinct story to inspire innovation and lead with the cloud.”

NetApp upgraded its flagship OnTap operating system at Insight. It brought out version 9.5 with a new Max Data product that helps take advantage of persistent memory in servers. But most of the product rollouts at the show were public or private cloud related.

They include StorageGRID SG6060 object storage, NetApp Cloud Insights, Azure NetApp Files, Cloud Volumes Service, Cloud Volumes OnTap, SaaS Backup for Office 365, NetApp Data Availability Services and NetApp Kubernetes as a Service (NKS).

Another change for NetApp is the way it has embraced technology from acquisitions. For much of its history, NetApp was considered bad with acquisitions. Clustered NAS startup Spinnaker Networks was Exhibit A. It took NetApp more than a decade to embed Spinnaker’s technology into OnTap. A knock on NetApp was that it couldn’t master any technology outside of OnTap.

But its current “cloud-connected flash” portfolio includes flash technology from SolidFire (NetApp HCI is built on it), StorageGRID object storage from Bycast, Cloud Insights resource management from Onaro, cloud backup from Riverbed, persistent memory storage (Max Data) from Plexistor and container orchestration (NKS) from StackPointCloud.

If NetApp succeeds in the cloud, it will be on the back of technologies it acquired.

“This is about the new NetApp and what we’re becoming,” NetApp CMO Jean English said. “It’s a journey. The legacy of who NetApp has always been is a starting point.”

October 29, 2018  8:52 AM

Security patch issued for Arcserve Unified Data Protection

Paul Crocetti

Arcserve Unified Data Protection customers are being told to patch the backup platform after a security provider found issues that could leave data unprotected.

The four vulnerabilities in Arcserve UDP could compromise sensitive data through access to credentials, phishing attacks and the ability for a hacker to read files without authentication from the hosting system, according to Digital Defense, the company that discovered the problems.

Digital Defense, based in San Antonio, reached out to Arcserve with technical details of the vulnerabilities, said Mike Cotton, senior vice president of engineering at the security provider, which disclosed its findings publicly last week.

“[We] walked them through scenarios with how attackers can exploit the vulnerabilities in question,” Cotton wrote in an email. “Their team was extremely professional and they were very proactive in wanting to understand where the vulnerabilities were and how precisely to fix them.”

The vulnerabilities affect Arcserve Unified Data Protection 6.5, updates 3 and 4. Update 4 launched last month. UDP, Arcserve’s flagship product, features backup, recovery, automated testing, granular reporting and hardware snapshot support.

Arcserve Unified Data Protection customers can download a patch from Arcserve Support and reach out to the company to address any outstanding questions or concerns, the vendor said. Arcserve, based in Eden Prairie, Minn., also provided manual fix application instructions.

“Arcserve is committed to developing data protection solutions that meet the highest security standards to protect our partners, customers and, most importantly, their data,” the data protection vendor said in a statement. “We welcome reports from security researchers and experts so we can quickly and efficiently address any vulnerabilities, which was done by our incident response team in this case.”

Cotton said installing Arcserve’s patch is the best way to address these particular flaws.

“More generally, undertaking controlled network access strategies to limit access to the administrative interfaces of key backup systems can further harden installations such as this,” Cotton wrote.

Digital Defense regularly works with vendors regarding the disclosure of zero-day vulnerabilities. When the company’s Vulnerability Research Team finds issues and validates them, it contacts the affected vendor and helps with remediation actions.

Digital Defense has found vulnerabilities in other major backup products, including Dell EMC’s Avamar in 2017, but Cotton said this is the first time the company has worked with Arcserve.

“We believe they’ve addressed the flaws in question for these vulnerabilities,” Cotton wrote, “so no further action is necessary for them.”

October 25, 2018  3:58 PM

Lenovo investment pushes Scale Computing closer to edge

Dave Raffo
Scale Computing

Scale Computing CEO Jeff Ready said the hyper-converged vendor was on the edge of profitability before it decided to expand its edge computing sales.

Scale today said it has secured $21.2 million in funding, with another $13.6 million coming by the end of the year. The $34.8 million in new funding – led by strategic partner Lenovo – brings Scale’s total funding to $95.8 million.

It was Scale’s first funding since an $18 million round in 2015, and Ready said he was working to bring the Indianapolis-based hyper-converged pioneer into the black.

“I’m not from Silicon Valley,” he said. “I want to run this as a real business. I was on a plan that said, ‘Let’s make a profitable company.’ We were close. But we’ve decided to take a step back and invest more and push out profitability out a little bit, probably around a year. This opportunity was not expected two years ago.”

Scale started selling non-VMware based hyper-convergence to SMBs in 2012, putting it on a separate track from most early hyper-converged players. That is still the bulk of its business, and Ready said Scale has more than 3,000 customer deployments. But it has discovered a lucrative market of selling to companies in industries such as retail, healthcare and manufacturing, which have many small sites with little or no IT staff. Ready said Scale’s software and appliances are a good fit in these edge sites because they are easy to manage and have self-healing capabilities.

“SMB is still our bread and butter, and it’s why we’ve ended up positioned so well for edge environments,” he said. “They look like most of our SMB customers, with zero, one or maybe two people on site. They need applications to run and take care of themselves and prefer something that encompasses the entire infrastructure stack. They need a system that runs itself, they have no people or expertise locally to babysit them.”

There is one big difference between SMB and edge customers, though. The companies with many small sites also need central management for all of them. Ready said Scale manages hundreds of clusters as one large storage pool.

Scale isn’t alone on the edge, though. VMware claims it signed a deal with a retailer to install vSAN hyper-converged software on Dell EMC VxRail appliances across 1,200 stores this year.

Ready said he has considered VMware his main competition from the start, mainly because Scale gives customers the option of not using VMware.

“VMware inevitably will be our main competitors,” he said. “I’ve always considered VMware to be the main competitor.”

Ready declined to say how many employees Scale has, but said it’s “in the hundreds” and will add at least one hundred more over the next year. Many of those will come in the sales, marketing and channel teams. He is also counting on Lenovo to expand Scale’s market reach. Scale and Lenovo this month launched a formal partnership to sell an appliance with Lenovo servers running Scale’s HC3 Edge platform.

Allos Ventures, a Scale investor since its Series A round, also participated in the Series F funding. Ready said most of Scale’s other previous investors are also involved in the current round.

October 22, 2018  8:25 AM

Quantum CEO ready to acquire to expand StorNext

Dave Raffo
Quantum, StorNext

Jamie Lerner’s first 90 days as Quantum CEO confirmed the idea he had about the company when he took over. Discussions with customers convinced Lerner that Quantum’s greatest strength and greatest opportunity is storage for managing rich media data and video.

Now he has a solid plan for going forward: to complement Quantum’s StorNext file system technology with software features developed internally and through acquisition. Lerner said Quantum will also redesign its sales team more around specific solutions than geographies.

“What Oracle is to data management and what Cisco is to networking, that’s what Quantum is to rich media data,” Lerner said in an exclusive interview. “We see ourselves as the leader in infrastructure for managing rich media and video.”

Lerner said Quantum will add storage services intended specifically for that rich media and video. These services differ from storage features for traditional IT applications.

“We have storage, policy management and tiering technology,” Lerner said. “Over the next six to 18 months, we will layer on data services for video. Traditional data services – deduplication, compression, snapshots, clones, replication – are rarely used on video. With video you need a totally different set of data services. You need to search not just by keyword, but by image. You need deep media catalogues to know what media assets you have, what form they’re in, who has edited them. And you need a lot of analytics for video surveillance. Are people on video having an argument or holding a weapon? Has someone left a bag for a long period? Those are the data services needed for video.”

Lerner said Quantum will make core architectural changes by layering software modules on its current product. Those new modules will mostly be subscription based and cloud-hosted. He also said Quantum is likely to become a more aggressive acquirer of smaller storage companies.

“You’ll see a combination of tuck-in acquisitions to add features, skills and capabilities, and we’ll likely buy some technologies that are complete standalone entities,” he said. “They’ll be mostly software and cloud in nature, and heavy in rich media data services – analytics, a search catalog and other areas that will bolster our ability to handle petabytes of rich media.”

Lerner said he sees Quantum’s tape products retaining a large role in long-term archiving and cloud storage. The new Quantum CEO said the vendor will continue to sell its DXi disk-backup appliances, while its dedupe capabilities will be woven into other storage platforms as a feature. But the main focus of development will be around StorNext and storage for rich media and video.

“Customers have figured out how to manage Oracle, they’ve figured out how to manage their email, but they are really struggling when incorporating video into their business,” Lerner said. “That’s the fastest segment of data growth.”

When Lerner became Quantum CEO in July, the vendor was knee-deep in an internal accounting probe to find the cause of financial reporting irregularities. The probe is now complete. The main issue it found was that Quantum recorded revenue earlier than it should have, with approximately $25 million to $35 million of prematurely recognized revenue as of June 30, 2018. “We expect it to be good revenue but it was recorded too soon,” Lerner said.

Quantum detailed those findings in an SEC filing in September. Now it is restating past quarters to place the revenue in the right periods. The Quantum CEO said the restatements are not expected to affect cash flow.

He said he expects the restatements to wrap up by the end of the year, so Quantum can begin filing its quarterly earnings reports again in early 2019.

“Most of the deep concern phase is behind us,” he said. “Now we’re putting in place new loans and accounting procedures. We’re on the down slope of most of the unfortunate things that have happened to this company over the last couple of years.”


October 19, 2018  5:26 PM

NetApp Insight 2018: A primer on storage for AI, flash-enabled cloud

Garry Kranz

LAS VEGAS -- NetApp expects about 5,000 customers and partners to gather here Monday as it lays out a roadmap for flash-enabled AI and cloud applications.

NetApp Insight 2018 marks the fifth year the event has been open to analysts, OEMs and the media. Prior to 2014, NetApp used Insight exclusively as technical training for data center managers. In 2017, NetApp Insight got off to a late start, as opening day was postponed following a mass shooting event on the Las Vegas Strip.

The three-day event will cover how the NetApp Data Fabric technologies extend to broader cloud use cases, said Kris Newton, a NetApp vice president of corporate communications and investor relations.

“We know that pretty much every organization, at some level, is thinking through AI and the cloud. We’ll have lots of discussion around how our customers can optimize their move to AI and the cloud, and see real results,” Newton said.

AI use cases cut across verticals, Newton said, spurring demand for faster storage and more efficient configurations to ingest data.

“AI puts pressure on your storage. You need storage that’s lightning-fast. You can’t wait around for your storage to respond,” Newton said.

Although she didn’t reveal details, Newton said NetApp Insight will highlight the role NVMe flash and storage class memory technologies play in a modernized data center. NetApp this year added an NVMe-based model to its All Flash FAS (AFF) Series arrays, mirroring similar moves by rival Dell EMC and others. The NVMe version of FAS allows customers to upgrade an existing FAS array by upgrading the OnTap operating system.

NVMe storage uses PCI Express to send traffic directly to CPUs, providing faster data transfer than traditional iSCSI command hops.

Sales of NVMe all-flash arrays will generate about $500 million in 2018, according to a report by analyst firm IDC, based in Framingham, Mass.

Storage arrays that extend NVMe from the back end to application hosts are sometimes known as rack-scale flash. NetApp’s AFF with NVMe technically doesn’t fit that definition, since NVMe runs only on the front end; the design lets customers continue using SAS SSDs on the back end.

It wouldn’t be a NetApp Insight conference without product news. NetApp hinted it would reveal upgrades to its OnTap-based storage for converged systems, file storage and object platforms, as well as deeper integration for multicloud support.

Another point of interest will be any new details forthcoming on NetApp’s recent joint venture in China with server maker Lenovo. Under that deal, Lenovo will sell NetApp storage under its ThinkSystem brand.
