Within the next few weeks, Storage Soup is moving to IT Knowledge Exchange, a TechTarget site where IT pros can ask or answer technical questions or follow dozens of IT blogs hosted there.
We’re moving our blog there to bring you closer to your peers in the storage industry.
The content of Storage Soup won’t change. Only our address will change — and we’ll automatically redirect you there when the change happens.
Once we move, be sure to bookmark the new link, and if you’re into RSS, subscribe to us using your favorite feed reader.
Backup software vendor Asigra is looking to “hold storage vendors’ feet to the fire” with a new free tool it will be offering to customers to validate IOPS on disk-based backup hardware. According to Asigra executive vice president Eran Farajun, sometimes customers end up buying systems to support disk-based backup that are overkill thanks to persuasive salespeople, and others “think they can get away with a cheaper version” and under-buy.
The new tool generates read/write loads for different file system configurations and user profiles to simulate the work that Asigra’s backup software would place on the system. Users can simulate configurations of 300, 500 and 1000 sites. “Maybe [getting adequate backup performance out of a disk array] means you switch from Fibre Channel to SATA drives, or you go from a 32-bit to a 64-bit NAS head,” Farajun said. Results from the I/O simulator are generally ready in a week or two depending on workload, and can be fed into Excel spreadsheets or Crystal Reports.
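Asigra hasn’t published how its simulator works, but the general shape of an I/O validator can be sketched in a few lines. Everything below — the function name, the parameters, and the in-memory stand-ins for disk reads and writes — is hypothetical; a real tool would issue actual I/O against the target backup array:

```python
import random
import time

def simulate_iops(num_sites, ops_per_site=100, read_ratio=0.7, block_size=4096):
    """Generate a synthetic read/write load and report measured ops/sec.

    Illustrative only: the 'reads' and 'writes' here touch an in-memory
    buffer, where a real validator would hit the storage hardware.
    """
    buf = bytearray(block_size)
    total_ops = num_sites * ops_per_site
    start = time.perf_counter()
    for _ in range(total_ops):
        if random.random() < read_ratio:
            _ = bytes(buf)                 # stand-in for a block read
        else:
            buf[0] = (buf[0] + 1) % 256    # stand-in for a block write
    elapsed = time.perf_counter() - start
    return total_ops / elapsed

# The three site counts the tool is said to support:
for sites in (300, 500, 1000):
    print(f"{sites} sites: ~{simulate_iops(sites):,.0f} ops/sec (in-memory stand-in)")
```

The interesting part of a real validator is the workload profile (read/write mix, block size, concurrency per site), which is exactly what the parameters above parameterize.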
Vendors already post results of benchmark testing on sites like SPEC.org, but according to Farajun, “Most of the time, those numbers aren’t that helpful–they’re optimal, utopian, statistical numbers. It’s like going to buy a Camry and the salesperson tells you how fast the NASCAR version can go on a closed course with a professional driver.”
Enterprise Strategy Group analyst Bob Laliberte said he saw this move being less about taking on storage vendors and more about Asigra attempting to boost its software’s appeal in the market. “Everyone’s trying to do what they can to show value and value add in this economy,” Laliberte said, adding that one potential application for the new tool would be among service providers who could pass along any cost savings realized by paring down backup infrastructures to customers.
Many in the storage industry are wondering about the fate of the Sun StorageTek business following Sun’s revelation of its umpteenth restructuring last week. In a press release issued Friday, Sun said it will be laying off between 5,000 and 6,000 employees, or 15% to 18% of its workforce, in an effort to save $700 million to $800 million in the next fiscal year.
Sun’s continually dismal earnings reports (it reported a revenue decline of $1.7 billion for the most recent quarter) have already led to speculation that the company will be taken private or sell off parts of its business. But the buzz in the industry is intensifying with this latest layoff because of the restructuring’s keen focus on open source software, which is where Sun has been turning its storage efforts as well.
The elephant left in the room is the “traditional” storage business, most of it acquired for $4.1 billion with StorageTek three years ago.
Sun’s storage business now consists mainly of tape from StorageTek and open storage. But Sun is primarily interested in developing its own ideas and making its own way. CEO Jonathan Schwartz made clear on the earnings call that open storage would be the focus going forward. “We believe we can expect strong growth in open storage as the adoption of ZFS continues and the need for open systems becomes ever more critical for customers seeking better performance at a lower price than traditional NAS,” he said.
Now, sources with inside knowledge of Sun’s storage strategy point to the realignment of key executives to focus on software as a confirmation that Sun is getting ready to pull the plug on the traditional storage business. Sun shifted Anil Gadre from chief marketing officer to the head of a new software-focused business unit, and moved the Solaris, Virtualization, and Systems Management Software divisions under Systems executive vice president John Fowler. One source said the shift in focus can’t bode well for the traditional business.
“Sun’s key systems and marketing execs are all now in charge of software business units – or they have left the company, like Andy Bechtolsheim,” said the industry insider.
According to this source, “the facts are that Sun can’t sustain its current business given its current fiscal performance. Sun’s core expertise and strategy is Solaris, file systems, [and] systems software. All indications are that Sun will continue to invest here, but economically the company must divest or sell other assets not critical to Sun’s future or core competency…the traditional storage business is clearly packaged for a sell off.”
One shareholder, Southeastern Asset Management, which took a 21% stake in Sun last week, has also stated publicly it views Sun as a software company. Reuters reported that Southeastern “said it might go around the technology company’s board to talk to ‘third parties’ about alternatives,” though that story also notes that a buyer for all of Sun as a whole is unlikely. However, “one business that Sun could sell fairly easily is StorageTek, a data storage business that it bought in 2005 for $4.1 billion. Bankers estimated it to be worth $750 million to $1 billion today,” the Reuters report adds.
If Sun is looking to sell off what’s left of StorageTek, who will buy? While $750 million to $1 billion would be a fire-sale price compared with what Sun paid, it’s still a hefty sum for anyone to pay for tape in the current financial climate. Unless there’s some surprise white knight waiting in the wings to take on a tape storage business in this kind of economy, it could still be back to the drawing board for Sun…again.
At the end of a busy Monday, Symantec revealed that John Thompson is giving up his CEO post and “retiring” to the role of Chairman of the Board, with COO Enrique Salem taking over as CEO.
I was somewhat surprised by this move. I met Thompson at Symantec Vision this June and found him sharp and personable. His age, 59, also seems a bit young for retirement.
However, financial bloggers have seen this differently. For example, 24/7 Wall Street.com put Thompson on its list of “CEO’s to Go” in 2008, with a detailed explanation of the ways Thompson and Symantec have run afoul of Wall Street. The complaints are mostly due to a stubbornly low stock price, attributable in part to the Veritas and Altiris acquisitions. “Wall Street hated the change of strategy [with Veritas] and still dislikes it. To us, the storage play and the data security play makes sense. But money talks and the money is against this merger even after two-years,” the 24/7 hit-list article stated.
In response to today’s news, 24/7 Wall St. author Jon C. Ogg writes, “this was with some mixed emotion because we have heard such great things about him and believed him to be a high-caliber person. Because we thought well of him, despite his company’s share performance, we said it isn’t too late for Thompson and we think there is a real shot that he’[d] be more valuable to keep as Chairman with a new CEO rather than an outright revolution.”
Sigh. I don’t know about you, but right about now, I am a bit sick of hearing about Wall Street. Maybe Thompson felt the same way.
The Planet, which hosts servers and applications for its customers, is adding a cloud-based storage service to its offerings through a new partnership with cloud storage provider Nirvanix. Nirvanix came out of stealth last fall, claiming it could offer better performance than established cloud players like Amazon S3. The reason: it is constructing a global network of storage “nodes” modeled on Web content delivery systems, moving the storage and processing power closer to the user to cut network latency.
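The CDN analogy boils down to a simple rule: route each user to the node with the lowest network latency. A toy sketch of that selection logic — the node names and RTT figures below are made up, and a real system would probe the network rather than read a table:

```python
def pick_nearest_node(nodes, measure_rtt):
    """Return the storage node with the lowest round-trip time.

    `measure_rtt(node)` is a stand-in for a real ping/HTTP probe.
    """
    return min(nodes, key=measure_rtt)

# Hypothetical RTTs in milliseconds, standing in for live measurements:
rtts = {"dallas": 35.0, "frankfurt": 110.0, "tokyo": 180.0}
print(pick_nearest_node(rtts, rtts.get))  # → dallas
```

Adding The Planet’s Texas data center as a node means one more entry in that table, and a lower best-case RTT for users near it.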
With this agreement, The Planet is making one of its Texas data centers a new Nirvanix node, bringing the storage cloud to its existing hosting customers. The Planet will also be opening up the new infrastructure to non-hosted, storage-only customers in 1Q 2009, according to Rob Walters, general manager of the storage and data protection business unit at The Planet.
The Planet will join an already crowded cloud storage market that includes big players like EMC and Amazon. According to Walters, one differentiator between The Planet’s and Amazon’s cloud offerings is that The Planet will not charge for “requests” to the cloud file system on top of capacity and bandwidth charges, as Amazon does for some of its services. “Stripping it down to bandwidth and capacity is a big piece of our value-add,” he said.
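To see why dropping per-request fees matters for request-heavy workloads, consider a back-of-the-envelope bill. Every rate below is invented for illustration — these are not either vendor’s actual prices:

```python
def monthly_cost(gb_stored, gb_transferred, requests,
                 per_gb_stored, per_gb_transfer, per_1000_requests=0.0):
    """Illustrative cloud storage bill; all rates here are hypothetical."""
    return (gb_stored * per_gb_stored
            + gb_transferred * per_gb_transfer
            + requests / 1000 * per_1000_requests)

# Same capacity and bandwidth rates, with and without a per-request fee,
# for a workload making two million requests a month:
flat = monthly_cost(500, 200, 2_000_000, 0.15, 0.10)
with_requests = monthly_cost(500, 200, 2_000_000, 0.15, 0.10, 0.01)
print(f"bandwidth + capacity only: ${flat:.2f}")
print(f"with per-request fees:     ${with_requests:.2f}")
```

With many small files (as backup workloads often produce), the request count grows much faster than the capacity, which is exactly when the two pricing models diverge.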
Can you hear the Decho? That’s what EMC’s calling its newly launched subsidiary, Decho (short for Digital Echo) Corp. Decho will be a combination of two formerly separate EMC businesses: Mozy Inc. and Pi Corp.
Decho will continue to offer consumers and businesses Mozy-branded online backup services, including Mozy Enterprise, though Decho’s focus is on individual (read: consumer) data. “Another example of the trend Gartner has called the ‘consumerization of IT,’ ” a spokesperson wrote to me in an email this morning. (Although I wonder what the outlook is on that concept, given what else has been happening in consumer products lately.)
According to the announcement, Decho will also introduce new cloud-based services for individuals over time. These services will likely be based on Pi’s still-stealthy IP, which at the time of EMC’s acquisition of Pi was described as cloud middleware that makes data accessible using multiple types of endpoint devices.
Right now the Pi still seems to be under wraps (cooling on a windowsill, maybe?), since no details are forthcoming about what future Decho services might be. Quoth the EMC spokesperson, “[Future services] will be focused on helping people do things with their data once it’s in the cloud. The new services will address the growing challenge people have as the amount of their data increases in volume and value, as it persists over time and is spread across multiple devices and services.”
CA jumped into the software as a service (SaaS) game by launching three offerings at CA World. The SaaS offerings include a disaster recovery/business continuity service called CA Instant Recovery On Demand, which is built on technology acquired when CA bought XOsoft in 2006.
CA will sell the service through resellers and other channel partners. A participating reseller will establish a VPN connection between the customer and CA, and use that to automatically fail over a server that goes down. The service supports Microsoft Exchange, SQL Server and IIS, as well as Oracle applications.
Instant Recovery on Demand costs around $900 per server for a one-year subscription.
Adam Famularo, CA’s general manager for recovery management and data modeling, expects the service to appeal mostly to SMBs because larger organizations are more likely to use the XOsoft packaged software for high availability and replication. “If an enterprise customer says ‘We love this model, too,’ they can buy it,” he says. “But most enterprises want to buy it as a product.”
Famularo says he sees the service more for common server problems than for large disasters. “It’s not just for hurricane season, but for everyday problems,” he says.
Since I’ve been in the storage industry, I can’t think of another product that generated as much hype and interest for as long as EMC’s “Maui,” now rechristened Atmos for its general release.
Now that there’s actual technical nitty-gritty to get into about the product after a year of talk, the Internet is having a field day. There are a few follow-up resources with great details about the product springing up that I want to point out in case you missed them:
- VMware senior director of strategic alliances Chad Sakac has an intricately detailed post up about how VMware’s vCloud initiative fits in with Atmos, and the differences between what VMware’s doing in the cloud and what Atmos is trying to accomplish.
- Another EMCer, Dave Graham, covers the details of the hardware (aka “Hulk”) that’s being paired with Atmos.
- The Storage Architect has questions.
- Finally, StorageMojo’s Robin Harris has links to all the relevant technical whitepapers for anyone looking to truly geek out to their heart’s content. Harris also points out the highlights for those who don’t make whitepapers typical bedside reading.
I have been skeptical about cloud computing in the past, and remain that way to a certain extent. While I have no doubt about the ability of storage vendors and IT departments to put together huge, booming, honking service provider infrastructures, I still think that those pesky details about how users–especially users with a lot of data–are going to get their data there and back in a reasonable period of time have not been addressed. Some of my colleagues, like SearchServerVirtualization’s Alex Barrett, have seen this movie before, and “[wonder] why hosting providers think that cloud storage will succeed when storage service providers (SSPs) of the late 1990s were such a blatant failure?”
For many, the answer to that question is that today’s technology has advanced since the 1990s. Chris Gladwin, founder of Atmos competitor Cleversafe, told me that he’s had the idea for his cloud offering for years, but the average Ethernet network was nowhere near up to speed. Now it’s more like ‘near.’ And just yesterday I wrote about one startup, Linxter, that’s trying to do something about one piece of the networking equation. It may be that the technology’s finally ready, even if the idea doesn’t look much different.
It has also been suggested to me by IDC’s Rick Villars that the storage industry thinks of the cloud as an effort to replace all or part of an enterprise data center, but that cloud service providers actually represent a net new mass of storage consumers. It might be that both skeptics and cloud computing evangelists are exactly right, but they’re talking about differing markets.
When it comes to the enterprise, though, I think the cloud will have its uses–not necessarily in deployments but in giving rise to products like Atmos. The enterprise has been looking for a global, autonomic, simple, logical means of moving data based on its business value, not to mention cheap, simple, commodity storage hardware components, for quite a few years now. I’ve written stories before prompted by the wistful look in enterprise storage admins’ eyes when they hear about automation like Compellent’s Data Progression feature for tiered storage. Whether or not the external cloud business takes off like people think it will, it looks like some longstanding enterprise storage wish list items might still be checked off.
Most of the time, when concepts like the cloud come along, they’re discussed first from a 30,000-foot, theoretical point of view. As they take shape, though, pragmatic nuances come into play. After looking at a map, you still have to get from point A to point B.
The wider economy is dampening some appetites for innovation, but the cloud is rolling on, and new companies are popping up to solve some of the logistical problems presented by its evolution. For one thing, the frailties of today’s Internet networks come up a lot in the more pragmatic discussions of the cloud. While companies like EMC and Sun are offering advanced kits to service providers, customers still face limited bandwidth and sometimes lossy networks when uploading their data to the cloud.
I met with one new company looking to address these issues last week. Called Linxter, it’s looking to sell software to service providers that places an agent at each end of the wire between cloud and customer. The customer-side agent would be embedded into whatever software the service provider already has customers use. These paired agents then act as a kind of universal adapter for sending data over the network to the cloud data center, reducing protocol chatter, improving latency, and allowing on-demand and scheduled sending and receiving of data. They automatically retransmit data if a connection is lost, picking back up where the transmission dropped off. This can be key for sending backup streams to the cloud, for example. The software will also be pre-packaged with various devices so that cloud service providers don’t have to re-engineer software for each new endpoint.
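Linxter hasn’t published its protocol, but the resume-where-you-dropped-off behavior can be sketched generically: track the last acknowledged offset and retry from there instead of restarting the stream. All names here are hypothetical, and `send_chunk` stands in for the agent’s real network call:

```python
def resumable_send(data, send_chunk, chunk_size=4096, max_retries=5):
    """Send `data` in chunks, resuming from the last acknowledged offset
    when the connection drops. `send_chunk(offset, chunk)` may raise
    ConnectionError to signal a lost link."""
    offset = 0
    failures = 0
    while offset < len(data):
        chunk = data[offset:offset + chunk_size]
        try:
            send_chunk(offset, chunk)
            offset += len(chunk)   # advance only after a confirmed send
            failures = 0
        except ConnectionError:
            failures += 1
            if failures > max_retries:
                raise              # give up after repeated drops
            # otherwise loop and retry the same offset
    return offset

# Usage: a flaky sender that drops the connection once, mid-stream.
received = bytearray()
state = {"dropped": False}

def flaky_send(offset, chunk):
    if offset >= 8192 and not state["dropped"]:
        state["dropped"] = True
        raise ConnectionError("link lost")
    received[offset:offset + len(chunk)] = chunk

payload = bytes(range(256)) * 64   # 16 KB test payload
sent = resumable_send(payload, flaky_send)
assert bytes(received) == payload and sent == len(payload)
```

For a multi-gigabyte backup stream over a flaky link, the difference between resuming at the last good offset and restarting from byte zero is the whole value proposition.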
According to founder and CEO Jason Milgram, the product’s third public beta was released Oct. 3. The first commercial release will be available on the company’s website in mid-December. “The communication layer is a very high level skill set, and our company has those skills,” Milgram said. “Our technology takes care of the complexity.”
To date the company has been funded to the tune of $3 million by angel investors and is pursuing a channel/partner sales strategy. Milgram couldn’t name any partners, but said there will be at least 10 listed once the commercial release comes out. Of the approximately 50 public beta participants so far, most are systems integrators and ISVs. There have been at least 300 downloads of the Linxter middleware since May, he said.
The Boston Globe reported this morning that an unencrypted backup tape containing personal information on some 21,000 clients of the school’s legal clinic has been lost. According to the newspaper, the tape was lost by a technician who was transporting it on the subway.
The Globe story also reports:
To prevent a similar occurrence in the future, the law school is encrypting the center’s computer servers and backup tapes for a higher level of protection beyond the password. It has bought a new tape library with a bar-code reader for better inventory control and hired a professional courier service to transport the backup tapes.
School spokesperson Robert London told me this afternoon that the Globe story “gives the impression” that the law school has determined where and how the tape was lost, but that’s not the case. “It’s possible it was lost in transit on the MBTA, but it could have been lost after it reached our campus,” he said. The Globe story does not cite a specific source for that information.
London added that the tape was coming from a remote office that was about to become the last branch of the law school to deploy tape encryption, and said the rest of the school’s facilities already have encryption in place. To lose a backup tape from that particular system was “just bad timing and bad luck,” he said.