For those of you still following the NetApp / Sun soap opera with popcorn at the ready, here’s the latest: the whole case, including all suits and counter-suits between the two parties, will be heard not in Texas, where NetApp originally filed the suit, but by a “mutually agreed-upon judge” in Southern California, where both companies are based. The first matter before that judge will be a re-examination request on the patents NetApp claims Sun violated.
Here’s where blogs come into play again, as they have, frequently and at times weirdly, throughout this case. According to a letter Sun sent to members of the press today:
Reexams have been filed on the NetApp WAFL patents that purportedly cover concepts such as copy on write, snapshot and writable snapshot. There is a significant amount of prior art describing this technology that was not in front of the US patent office when it first examined these patents. In just one example, the early innovation by Mendel Rosenblum and John Ousterhout on Log Structured File Systems, applauded in a NetApp blog: http://blogs.netapp.com/dave/2007/09/vmwares-founder.html as being inspirational to the founders, was not considered by the patent office in the examination of the NetApp patents.
With this notice, Sun is hoping we’ll think they have NetApp right where they want them. But if there’s one irrefutable truth that’s come out of this whole saga so far, it’s that outside the offices of a few key people, nobody really knows what’s going on in this feud.
I will say, however, that the change of venue does seem like a concession on NetApp’s part, one that Sun, of course, isn’t letting slip by: “We are pleased that Network Appliance agreed to Sun’s request and retracted its imprudent choice of venue for this litigation.” (Prior to the change in venue, NetApp had pointed out that a number of cases like this one have been tried in Texas, including IBM’s suit against Amazon, regardless of where the parties in the suits have been based.)
Who knows what the new year will hold in store for these two. But we suggest stocking up on your Orville Redenbacher’s over the holidays.
When you buy a product or service how much of your purchasing decision is influenced by brand? And how much of that influence is because of that brand’s size?
I’ve taken a hard look at my purchasing habits, and I love to talk about brands with my friends. My eyes were opened as to how easily swayed I was by big names and the perceived strengths of a brand. How much did and does brand really mean to me? Well, I own a Sony TV even after they put rootkits on my CDs … twice … and had a bunch of exploding batteries. I own a Microsoft Xbox and Xbox 360 even though they overheat and shut down (the red rings existed on the original Xbox too). I own two … yes, two … Chevy vehicles (Corvette runs in the bloodlines, man, I’m tellin’ you!!!). These and others were pointed out to me by people who call themselves my friends, in about as brutal a way as they possibly could. You know what I mean if you’ve ever taken a friend clothes shopping.
With every effort not to sound egotistical, I consider myself an intelligent buyer. I make what I like to believe are intelligent, well-informed buying decisions (thank you, Google!!) in my personal life, but more so in my professional life. I get paid to make intelligent decisions about the vendors and products that get brought into our environment, not to mention that I have to live with them while they are here, so I take my professional recommendations very seriously. I research, test, and research some more, and especially considering I’m suggesting that my salary, bonus, or both be spent on said purchase, I tend to be really rigorous.
A little background on my brand history: I purchased, ran, and actually liked IBM’s OS/2. I bought a Panasonic DVD-RAM drive. I own the Sega Dreamcast, which was arguably a better machine than the Sony PlayStation 2; not only that, Sega was a household name when it came to video games.
I could go on. Remember the Apple Newton? Yup, I had one. How about Psion!? Had one of those too (Series 5mx). All of the above, on their own merits, were technically superior to their competition; they had a great brand name behind them, and yet… they all got destroyed in the marketplace. The one that surprised me the most was Sega. Do you remember all those commercials that screamed SEGA! at the end? They got it, they knew brand was key, and then they blew it, but I’ll save that one for another blog :). Meanwhile, upstarts like Microsoft, Palm, Plextor and Sony went on to dominate their respective markets.
Why was that (aside from the fact that I owned one)?
And what does all this have to do with storage, you are probably asking yourselves. Well… how do you purchase storage products? Does brand play a part in your decision, or do you buy based strictly on technical specifications?
Another, even harder question: how do you sell an unknown brand to your supervisor? While you may be comfortable buying from a relatively unheard-of brand, how does the person signing the check feel about it? Do you feel comfortable putting your reputation on the line based on someone else’s reputation, or lack thereof?
Take a look at the examples I mentioned above (Apple, IBM, Psion, Sony, Panasonic, and Sega) and how they all failed. They were the number one or two brand names, everyone knew who they were (Psion mostly in Europe), and yet their products failed miserably. Can we be so sure that the adage “you can’t get fired for buying IBM” is true?
I’ve heard comments like: “My boss never lets me buy the incumbent, because the number two player will work harder for our business,” or “I want one number to call, one throat to choke, buy all our stuff from one place whether it’s the best of breed or not, they didn’t get to be number one by accident”.
Personally speaking, brand still plays a role in my purchasing decisions. I’m not nearly as obdurate (great SAT word submission) about brand as I used to be, though. I make a pointed effort to research what is available: I lean on my peers, and I try to make it out to trade shows geared to the specific segment I’m looking at. I know that keeping up with all the new companies is pretty much a full-time job on its own, but I do try to make that job a part of my job. With storage exploding the way it has in the last few years, it has become even more difficult, but equally important, to keep track of all the companies offering storage products and solutions as they pack more features into their products.
Currently we are evaluating vendors for multiple storage-related projects. Whether or not all the projects will get funded is another issue, but the fact is, whittling the field down is difficult when management doesn’t have as tight a grasp on the players. Gartner, Gomez, IDC and others do a decent job of providing executive-level synopses of who the vendors are and what segments and sub-segments they target, but that doesn’t go far enough to help me when I have to sell an unknown vendor in a market space that hasn’t even been fully fleshed out yet, much less reached any sort of maturity. I have to educate on not only the product but the vendor as well. Sometimes it is a fight there is no way to win.
Some will raise arguments about trust between upper management and their workforce, but we all know they only trust consultants ;-). I’ll elaborate on the specific issues I faced with the various projects in a future post, but remember: brand and perception are inextricably tied. Our industry depends heavily on perception. If you don’t believe me, throw this statement out to three different execs (*I know I’m starting a bar fight somewhere*): “Hard drives are a more reliable, cost-effective solution than tape as a backup medium: fact or perception?”
I think the scene in The Matrix Reloaded (the second one) where Neo talks with the Architect is a great mnemonic (pun intended), or maybe example is a better word. Either way, the basic point is that everything is about choice: convincing Neo to choose 23 people and rebuild Zion is like choosing the incumbent instead of Trinity, the upstart vendor.
The glitz and glamour of new product releases tend to overshadow the rather mundane task of performing firmware upgrades on storage systems. However, administrators who take the time to keep their storage systems up-to-date with the latest and greatest patches may find they can avoid some FC SAN “gotchas,” as well as discover some hidden gems that vendors are packaging in their latest firmware releases.
Prompting my thoughts on this topic was a recent conversation with a storage architect who had inherited an FC SAN where the firmware on the storage systems was two major releases back. The older code on these storage systems was becoming a problem: other devices on the SAN (switches, virtual tape libraries, and servers) had newer firmware with new features, but to take full advantage of those features, the storage systems also needed newer code.
I discussed this topic with EMC, partly because the storage systems in question were EMC Clariions, but also because I know from personal experience that EMC releases firmware updates on a fairly regular basis.
In the case of its Clariions, EMC comes out with a major release every 9 to 12 months that includes major new functions. For instance, its December 2006 code release for the Clariion included a new proactive hot spare feature for improved high availability and a Quality of Service feature as a licensable add-on. Its August 2007 Clariion major release added new security features as well as iSCSI enhancements like native replication.
Another interesting feature included in the update is the Software Assistant. This tool scans the Clariion prior to starting a firmware upgrade and recommends which code an administrator should load on the system. The Software Assistant also performs a high-availability check before actually starting the upgrade, to confirm that the firmware upgrade can be completed without unexpectedly taking the system offline.
EMC recommends to customers that they install major firmware releases for its Clariions shortly after they are released (within 3 to 4 months).
However, there is a more pressing reason to keep firmware current: upgrades must be applied sequentially. If a Clariion system is two generations old, customers may need to upgrade to the intermediate release before upgrading to the newest one. Though this is generally not a big deal, it does add to the length of time needed to perform the firmware upgrade and makes it more difficult to back out of an upgrade should something go awry.
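As a rough illustration of why falling behind matters, the sequential upgrade logic can be sketched in a few lines of Python. The release identifiers here are hypothetical, not actual Clariion code levels:

```python
# Ordered list of major firmware releases, oldest to newest (hypothetical names).
RELEASES = ["r19", "r22", "r24", "r26"]

def upgrade_path(current, target):
    """Return the ordered list of releases to apply, one step at a time.

    Firmware must be applied sequentially, so every intermediate
    release between `current` and `target` is part of the path.
    """
    i, j = RELEASES.index(current), RELEASES.index(target)
    if j <= i:
        return []  # already at or past the target; nothing to apply
    return RELEASES[i + 1 : j + 1]

# A system two generations back must pass through the intermediate release,
# which lengthens the maintenance window and complicates any rollback:
print(upgrade_path("r22", "r26"))  # → ['r24', 'r26']
```

The point of the sketch is simply that the further behind a system falls, the longer the chain of mandatory intermediate steps becomes.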
Over the last few weeks, storage insiders have been abuzz with speculation that a merger between HP and Symantec is imminent. Whether such talks are occurring, I cannot definitively say, but if a deal does occur, the whole corporate world might as well kiss goodbye any hopes it had of creating and managing a heterogeneous storage environment.
Obviously, I’m exaggerating a bit. Kissing heterogeneity goodbye won’t happen the day such a deal is signed (if it occurs), and it probably won’t ever completely happen. HP and Symantec will likely both pledge that heterogeneous support will remain part of their product roadmaps, and that is likely true. However, one can almost bet that when it comes time to prioritize which storage products are tested first with future releases of Symantec’s Veritas storage software, HP’s storage products will find their way to the head of the line.
More disconcerting is what Symantec’s acquisition by HP (or whoever they are acquired by or merge with) would mean for the future of heterogeneous storage environments in general. At one time, Symantec was on the vanguard of supporting an enterprise heterogeneous storage environment. Yet, now no one is really shocked or even appears overly concerned when Symantec is mentioned as a candidate for an acquisition or merger by what is traditionally considered a storage hardware vendor.
This mindset is testimony to changing user concerns and priorities. It used to be that storage hardware was the primary cost in a user’s data center. Not anymore. Now, it is the management of the storage hardware, even if a user buys all of the hardware from the same storage vendor.
The complexity of managing storage hardware from multiple vendors has become mind-boggling. While at one time it may have been worthwhile to spend the extra time and money to verify that an HP-UX server worked with an IBM storage system, it is questionable whether that is still the case. Instead, I sense an increased willingness on the part of users to pay a premium to buy all of their storage hardware and software from one vendor and avoid checking the multiple support matrices that heterogeneous environments require.
The looming acquisition or merger of Symantec, regardless of by whom, signals the re-emergence of an old systems-management philosophy. Companies no longer want a one-trick pony for their storage management needs, even if that one-trick pony manages heterogeneous storage environments. Instead, more companies appear to want a return to simpler times, buying all of their storage hardware and software from one vendor so that it all works nicely together. Let’s just hope that if companies have to revert to this philosophy, it works better this time than it did in the past.
This was our first Storage Decisions conference in the hilly city built on a fault line, and that meant a fresh crop of Storage Decisions attendees and happenings.
Sun held a “trends and innovation” dinner for press and analysts concurrent with the show on Monday night (it wasn’t affiliated). About two dozen Sun execs and their audience sat down to a gourmet repast at San Francisco’s trendy Absinthe restaurant. Execs and Sun reps included Chief Technology Officer and Executive Vice President of Research and Development Greg Papadopoulos, CIO Bob Worrall, Executive Vice President of Systems John Fowler, and distinguished engineer Subodh Bapat.
As always, Sun was articulating grand visions of the future. “The storage marketplace is about to undergo its most rapid set of changes possibly ever–it will change the economic fortunes of a number of companies,” Fowler predicted (Sun is hoping this will hold true in a positive direction for its storage products). Cost per capacity will be “one-tenth of what you see today.”
Like fellow large players IBM and EMC, both of whom have recently acquired storage service-provider companies, and Symantec, which is preparing a software-as-a-service (SaaS) backup offering, Sun is keen on outsourcing as well. Eventually, according to Bapat, there will be only a few “really big computers” in the world, run by companies like Microsoft and Google in “mega data centers” like Google’s famed farm of PCs. Sun would also like to become a service provider itself, but its real focus is on selling equipment into those service-provider data centers. Sun was already part of a similar build-out in the telecom industry in past years, though it was also pointed out that companies like Google have already done their build-outs just fine without Sun.
Meanwhile, new “mega data centers” are beginning to spring up, including a new 500,000 square-foot, 50-megawatt behemoth being built for a national lab set to open next year, according to Bapat. “50 megawatts is bigger than a small city would consume,” Bapat said. “Utilities are going to become a real problem.”
Bapat also predicted that within the next year, a major data center failure will “cause major national effects, and bring forward the importance of data centers as national assets.”
Sun loves to look out 15 years, but ask about the next 15 months and it’s a trickier question. Sun’s recently announced partnership with Dell is part of its attempt to position itself better in the market; Sun will also be going after service provider customers such as SmugMug, according to Worrall, and developing server-farm products with its partners at research universities. How that’ll translate into specific products and sales remains largely unclear.
Sun is on to something when it comes to Fowler’s prediction about the pace of change over the next year, according to Taneja Group founder Arun Taneja. “We’re in such a vibrant market right now,” he said. “I have never seen so much change and innovation happening all at once, ever.”
Everybody’s favorite user-blogger Tory Skyers was Mr. Storage Decisions this year, presenting on the storage issues posed by new mobile devices and participating in a user panel on storage management. Skyers warned users not to overlook the trend toward iPhones and home servers. “An executive buys a home server, plugs in his laptop at home, and the home server asks, ‘wanna back it up?’ Then his kid comes home with the Trojan du jour and suddenly your company’s data is in the Eastern bloc somewhere.”
Data leakage flows both ways in the mobile world, he added, with mobile devices blurring the line between personal and corporate data repositories. “So mp3s and AVIs and maybe even that Trojan find their way to the laptop, which finds its way to your data center, which finds its way to your SAN and your network.” Tory gave some how-tos on controlling that flow of information on both sides of the equation, including “using social networks in your work environment to enforce policy”: specifically, a “Page of Shame” for violators of company storage policies pertaining to mp3s and the like, and strategically placed rumors of “someone getting busted” for violating policies. He recommended tools like Desktop Authority and PowerFuse for content filtering and executable monitoring for contraband files, using open-source and free Microsoft tools to create document templates for data classification, and SurfControl Mobile Filter to restrict access to websites and protocols on company machines even when users are off the network and VPN. Desktop Authority and PowerFuse will also restrict which mobile devices can be plugged into a corporate machine: a USB mouse will get through, but not a thumb drive or iPod.
“This is a better alternative to sealing your USB ports with epoxy,” something Tory said he’d been asked to do before (by an exec who then realized he had no way to plug a mouse in to a $2500 machine).
In the course of his presentation, Tory also referenced the following tidbit from CNN: customs and border guards can confiscate anyone’s laptop without any grounds for suspicion and copy all the information held within it. Terrifying.
On Wednesday users gathered for a peer discussion on virtualization that turned up some interesting things, including–be still our hearts–an actual, living, breathing, Invista user (we wanted to take his picture). Very few of those present have actually deployed storage virtualization and those considering storage virtualization tools were also in the minority among this group. “I’m wondering what the benefits are that other people have seen to virtualization, what the return is,” said one user.
The majority of users saying they’d begun virtualizing are using HDS. Almost all users with storage virtualization in place said they used it to front other arrays from the same vendor, with the exception of migrating data from decommissioned storage. “You just don’t want to get into finger-pointing with the different vendors,” according to one attendee.
Three months after filing to become a public company, NAS vendor BlueArc has pushed back its scheduled IPO until 2008, according to industry sources.
Citing SEC regulations, BlueArc declined to comment on its IPO schedule. But several industry and financial analysts familiar with the company say its bankers have decided to hold off on going public. BlueArc filed for its IPO Sept. 7, and it normally takes a company about three months from filing to begin trading shares as a public company. But BlueArc has yet to set its expected share range or go on the roadshow that precedes an IPO. There is usually a gap of at least two weeks between setting the share range and the IPO, which means BlueArc would run smack into the holiday season if it decided to go public by the end of 2007.
It’s not clear why BlueArc decided to wait, but it’s likely that the company and its bankers anticipate a lower price for shares than they originally expected if they go public now.
“I’m sure market conditions haven’t helped,” said a banker for a securities firm who is not involved with BlueArc’s IPO.
A financial analyst who follows storage said investors “are doing more due diligence on IPOs now,” and said it could hurt BlueArc that the stock prices of storage systems companies Isilon and Compellent have dropped drastically since their recent IPOs. The analyst said BlueArc might want to show another quarter of solid growth to help its case. There is also the possibility of an acquisition: IP SAN vendor EqualLogic had filed for an IPO before Dell scooped it up for $1.4 billion last month. Storage insiders agree that Hitachi Data Systems is the most likely suitor because it has an OEM deal to sell BlueArc NAS systems and an equity stake in the company.
Even without BlueArc, 2007 was an active year for storage IPOs. Pure storage companies Compellent, 3PAR and Data Domain all went public, along with EMC’s partial spinoff of server virtualization company VMware and InfiniBand suppliers Mellanox and Voltaire.
Just when I think I have heard every reason for keeping data on tape, another argument emerges. The latest is that tape is more energy efficient than disk.
My first real insight into this came a few weeks ago when I was speaking to Spectra Logic’s director of technical marketing, Molly Rector, who had just returned to Denver after meetings with Spectra Logic channel partners, resellers and users in the New York and Boston area. The feedback from her meetings was that some data centers in the Northeast were running low on power and no longer able to obtain more. In these cases, the shortage of power was forcing customers to choose tape because it was more energy efficient than disk, even though they wanted to buy disk for their backup environments.
While it may be true that tape consumes less power than disk, it is disconcerting that some companies find themselves in this predicament of needing to choose tape over disk because of something as seemingly preventable as an inadequate supply of power.
Keeping data on tape costs businesses in ways that are sometimes hard to measure. Legal discovery, the personnel needed to manage tape, and moving and storing tapes offsite all add to the costs of tape management, and they consume power in more subtle ways as well. To conclude that the choice between disk and tape needs to start and stop with a company’s rate of energy consumption seems a bit archaic to me.
Tape may consume less power than disk, but that does not necessarily make tape a better choice. Disk and tape are both choices that companies need to have available to them and either one, if managed properly and looked at from a total cost of ownership, can save companies money and cut energy consumption in the process.
Companies in this situation are facing some hard choices in the near term, as the question is less about disk versus tape and more about whether it is time to change how, and even where, they manage their data. In the Northeast, it appears some companies have already waited too long to decide: when the number of outlets left in the wall dictates what storage media they need to buy, the only choices left are unpleasant ones.
In doing some research recently on the problems associated with recovering data from old tapes, I found out that a similar set of problems exist when trying to recover data stored on old disks. This problem becomes especially pronounced if a company unplugs an old disk drive and puts it on the shelf or keeps it in production too long.
The problem companies are more likely to encounter when storing a disk drive on the shelf is not necessarily data degradation on the platter but mechanical failure of the parts within the drive itself. Greg Schulz, the lead analyst with Minneapolis-based StorageIO, finds that the lubricants of the mechanical parts within the disk drive can settle. This can cause the drive to malfunction when the company attempts to power it up again for the first time in a long time.
Jim Reinert, VP of disaster recovery for Kroll Ontrack, a worldwide provider of data recovery services, says that the largest problem Kroll encounters with trying to recover data from old disk drives is repairing and replacing defective mechanical parts inside the disk drive. Motors failing and electronic circuit boards going bad are just some of the components Kroll has had to repair before it can recover the data from the drive. This situation requires Kroll to find an exact match for the defective part, usually on the used market.
Of course, mechanical problems can also occur while the computer system is still in use. Reinert finds that some of the toughest data to recover is found on older, proprietary computer systems that are in use but break. Typically found in manufacturing and production environments, these are older computer systems that control a piece of equipment that everyone uses but no one manages. As a result, the data is not backed up, nor does anyone know who created the application or how it runs.
So, what’s the best way to protect data on old disk drives? The best and simplest way is to avoid keeping data on old drives at all and migrate it to newer ones. Kroll Ontrack classifies disk drives more than five years old as “old,” since by that point drive warranties have usually expired and replacement parts are out of production.
Schulz is a little less dogmatic about the five-year cutoff. He finds that drives up to seven or eight years old are probably OK, depending on the conditions in which they were stored or how they are used in production. He suggests spinning them up on a regular basis (once every three to six months), though he agrees that as drives age, administrators should migrate the data to newer ones.
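For illustration only, the rules of thumb above could be captured in a small script. The thresholds and function names here are my own paraphrase of the guidance from Kroll Ontrack and Schulz, not anyone’s actual tooling:

```python
# Sketch of the aging policy described above: flag drives past the
# five-year "old" cutoff for migration, and schedule a spin-up if a
# shelved drive hasn't been powered on in six months (the upper end
# of Schulz's three-to-six-month suggestion).

from datetime import date, timedelta

OLD_AFTER_YEARS = 5                      # warranty/parts-availability cutoff
SPIN_UP_INTERVAL = timedelta(days=180)   # roughly six months

def drive_actions(manufactured, last_spin_up, today=None):
    """Return a list of maintenance actions for a stored drive."""
    today = today or date.today()
    actions = []
    if (today - manufactured).days > OLD_AFTER_YEARS * 365:
        actions.append("migrate data to newer drive")
    if today - last_spin_up > SPIN_UP_INTERVAL:
        actions.append("spin up and verify")
    return actions

# A drive from mid-2001 that hasn't spun up since January:
print(drive_actions(date(2001, 6, 1), date(2007, 1, 1),
                    today=date(2007, 11, 1)))
# → ['migrate data to newer drive', 'spin up and verify']
```

Nothing about this is sophisticated; the point is that the policy is simple enough that even a modest inventory script can keep shelved drives from quietly aging past recoverability.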
If a disk drive has already failed or you come across one of indeterminate age or condition and you don’t know what data is on it or its value to your business, your best bet is probably to send it to a data recovery specialist and keep your fingers crossed.
As Dell proved when it decided to drop $1.4 billion on EqualLogic earlier this month, large storage acquisitions have not gone away just because startups have found a lucrative IPO path and EMC is taking an M&A breather to integrate its new toys.
Hewlett-Packard, a company a lot of people thought was getting ready to exit the storage business a few years back, is now the most likely to add to its storage portfolio through acquisition. Over the past few years HP has picked up AppIQ, OuterBay, PolyServe and Opsware, and more is expected.
There has been persistent talk about possible HP deals of varying sizes: small (email archiving startup Mimosa Systems), medium (IPO-eyeing iSCSI vendor LeftHand Networks) and blockbuster (struggling security-storage giant Symantec). While some of these rumors have swirled for months and are growing stale, don’t be surprised to see HP pull the trigger on at least one deal between now and its Dec. 11 Analyst Day. And storage is high on HP’s list of priorities these days.
“Storage is a place that we have interest in growing our position,” HP CEO Mark Hurd said during the company’s Monday evening earnings conference call.
When asked specifically about storage acquisitions, Hurd refused to give details, “other than to say we continue to have a filter of something that makes strategic sense, it makes financial sense, and we can actually run and operate it.”
Hurd went on to talk about the importance of data storage in today’s corporate world, calling it a key attribute in the process of creating, moving, processing, visualizing, and printing content.
Hurd’s comments came after HP reported positive signs of storage growth after several disappointing quarters. Most of that growth came in the midrange and low-end. HP’s 7% increase in storage and 17% growth in its midrange EVA systems were on par with numbers recently reported by EMC and Network Appliance and well ahead of IBM’s storage performance last quarter. Hurd said he was also happy with the performance of low-end MSA systems.
The biggest storage disappointment for HP was that tape revenue declined, as did the high-end storage systems business that HP sells through an OEM deal with Hitachi Data Systems.
“There is still much room for improvement,” Hurd said of storage. “We still have a tape business that is not growing the way we would like and the high end is still behaving more like the mainframe market, as opposed to the mid-range market and the lower end of the storage market.”
Now we’ll see what HP does to, as Hurd put it, “grow its position.”
It was a treat for natives of the Boston area to go to the Museum of Science–most of us who hail from Massachusetts agreed the place is a staple of our childhoods. I haven’t been there in about 15 years, but was both surprised and delighted to find that most of the main exhibit areas I saw haven’t changed at all (leading one analyst to crack wise about EMC’s selection of venue for an “Innovation” day).
Here’s a photo of some of the usual suspects attentively listening to Joe Tucci give his entire PR staff heart attacks by revealing the code names of four new products set to be announced next year. (Said Tucci of the meeting during which objections to the pre-announcement were registered: “I asked, who do I see about getting that policy changed? And the arguments ceased.”)
Last but not least, here’s the newly appointed president of the compliance and archiving division at EMC (formerly chief development officer) Mark Lewis demonstrating a new Documentum interface called Media Space.
It shows a computer-generated sketch of a hypothetical video game console, demonstrating how the program can be used to collaborate on images.