Storage Magazine’s recent piece “Making the case for solid-state storage” brought up some interesting points regarding the future of storage technology.
First, will solid-state storage finally bring an end to data loss resulting from drive failures? As we’ve all learned the hard way, fewer moving parts generally means greater reliability. Speed is another factor, with performance gains of seven to nine times over traditional storage. Beyond processing efficiency, convenience, user productivity and so on, such speed improvements can have an enormous impact on disaster recovery and business continuity efforts. Perhaps this will be the deciding factor that pushes management to get on board with DR/BC. Or perhaps it’ll be an excuse: “If we can use solid-state storage devices to recover our systems much more quickly, what’s our incentive to build out an elaborate DR/BC program?”
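To put rough numbers on that claim (these are illustrative assumptions, not benchmarks): if a conventional array restores data at around 100 MB/s, a seven- to nine-fold speedup shrinks a full 1TB restore from almost three hours to well under half an hour.

```python
# Back-of-the-envelope restore times. The 100 MB/s baseline and the
# 1TB restore size are assumptions for illustration only.
data_gb = 1000            # size of the restore, in GB
baseline_mb_s = 100       # assumed conventional restore throughput
for speedup in (1, 7, 9):
    hours = data_gb * 1024 / (baseline_mb_s * speedup) / 3600
    print(f"{speedup}x: {hours:.1f} hours")
# 1x: 2.8 hours, 7x: 0.4 hours, 9x: 0.3 hours
```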
It’s funny, but I remember studying the inner workings of solid-state memory while getting my undergraduate degree in the early 1990s, well before “solid state” as we know it today was cool. Maybe it’s just my perception, but this technology seems a bit late to the game. I like what solid-state storage stands for and brings to the table, but given where we are in other areas of technology, it’s just not all that exciting.
Maybe my perception will change once I get my hands on some of this hardware.
In the news release announcing the 2005 merger of Symantec and Veritas, the new entity was said to be “uniquely positioned to deliver information security and availability solutions across all platforms, from the desktop to the data center.” Symantec’s Chairman and CEO, John W. Thompson, said its “customers [were] looking to reduce the complexity and cost of managing their IT infrastructure and drive efficiency with fewer suppliers.”
If that still holds true, customers got more of what they want—or fewer of what they don’t want—with yesterday’s acquisition of humyo by Trend Micro. The pairing mirrors the ongoing convergence of data storage and data security.
From the Trend Micro website:
The acquisition of humyo aligns nicely with Trend Micro’s strategy to deliver industry leading cloud security technology and products to stop threats where they emerge, on the internet, and provide security for and from the cloud.
Trend Micro is number three behind Symantec and McAfee in the antivirus and security sector, and it’s a safe bet that this acquisition is an attempt to expand its offerings and therefore its customer base. After all, if you’re already offering security for the cloud, why not go the extra step and provide the cloud as well? One-stop shopping seems to be the trend in IT storage these days, as does the ever-blurring line between storage and security. From MSPmentor’s Joe Panettieri:
In some cases, the walls between online storage and security are fading away or disappearing entirely. For instance, CA Technologies is reinventing its security and storage lineup to embrace managed services pricing. Symantec is expected to take similar steps soon, according to online forum chatter. And master MSPs like Ingram Micro Seismic and Virtual Administrator are blending storage and security services for VARs and MSPs.
What are your thoughts on this acquisition and its implications for the storage industry? Is cloud storage replacing cloud backups? Share your thoughts in the comments section or email me directly.
Acronis has a pretty neat—and free—tool called Drive Monitor that’s worth checking out. The premise of the program—“to increase customer awareness about the health of their disk drives and to encourage them to back up their data in order to survive a disk failure”—struck a chord when I saw the offer. It seems a little backwards, but I believe the Acronis folks have taken a good approach.
It’s true—many people tend not to take backups seriously unless or until it appears that something bad is about to happen. I’ve seen a lot of businesses banking on the reliability and uptime of critical drives and storage systems while ignoring the maintenance and oversight such systems require. This mindset is directly related to the common information privacy and security excuses you hear from management, such as:
- “We don’t have anything the bad guys would want.”
- “We have a firewall, encryption, and anti-virus, so we’re good.”
- “Our auditor told us we’re in compliance with the important federal regulations.”
I don’t know which is the more fundamental information security control—using strong passwords or backing up data—but it’s remarkable how many businesses overlook both.
The way I see it, at least vendors such as Acronis see the need and are offering preventative tools to keep bad things from happening. Sure, we can’t force data backups on others, but at least the resources are there for those looking for a solution.
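In that spirit, here’s a minimal do-it-yourself spot check along the lines of what Drive Monitor automates. It’s only a sketch: it assumes a Linux host with smartmontools installed (and root privileges), and /dev/sda is a placeholder for whatever drive you care about.

```python
# Poll a drive's SMART health status via smartctl (from smartmontools).
# Typically requires root; /dev/sda is a placeholder device.
import subprocess

result = subprocess.run(
    ["smartctl", "-H", "/dev/sda"],
    capture_output=True, text=True,
)
print(result.stdout)
if "PASSED" not in result.stdout:
    print("WARNING: health check did not pass -- verify your backups now!")
```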
As TechTarget’s Storage Decisions conference wrapped up today, I was perusing the conference site and noticed something peculiar about the sessions. There are sessions on backup. There are sessions on disaster recovery. There’s even one on ITIL. But nowhere could I find anything on storage security.
Sure, being a security guy, I’m biased in my approach, but if storage security is not being discussed at such a high-profile conference, where is it being discussed? Well, perhaps there’s some coverage at RSA, CSI, and related security shows, but I wouldn’t think those are the shows where storage admins are hanging out.
My point is, there’s still a disconnect between the perceived risks of storage systems and the actual risks. Based on the storage-related vulnerabilities disclosed in the past 12 months alone (search the word “storage” here), there are obviously some things that should concern any given business. Storage is more than boring old disk drives; it’s applications, operating systems, and firmware that present a relatively broad attack surface on the network. What is your organization doing about it?
Kevin Beaver is an independent information security consultant, keynote speaker, and expert witness with Principle Logic, LLC and a contributor to the IT Watch Blog.
A short blurb in the Israeli daily Haaretz has the storage world talking: IBM is negotiating to acquire data storage startup Storewize for $140 million. Rather than offering storage directly, Storewize boosts existing network-attached storage with a drop-in appliance that compresses data before it’s stored and decompresses it on access, much as the more widely deployed WAN optimization technology works on the wire. Storewize claims it can reduce storage utilization by 50% to 90%, shrinking an organization’s data center footprint, cooling costs and necessary hardware all in one fell swoop.
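To see how that sort of figure is calculated, here’s a quick sketch using Python’s built-in zlib. A real appliance compresses transparently in the data path; this only illustrates the arithmetic behind a claimed reduction percentage, with sample.dat standing in for any representative data file.

```python
# Compress a sample file and report the storage reduction percentage.
# Results vary enormously with the data: databases and text compress
# well, while already-compressed media barely shrinks at all.
import zlib

with open("sample.dat", "rb") as f:   # placeholder for your own data
    raw = f.read()

compressed = zlib.compress(raw, 6)    # moderate compression level
saving = 100 * (1 - len(compressed) / len(raw))
print(f"{len(raw)} -> {len(compressed)} bytes ({saving:.0f}% reduction)")
```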
With public customers like Polycom reducing storage needs upwards of 60% on everything from disaster recovery backups to Oracle databases, it’s no surprise IBM took notice, offering Storewize’s STN-6000p series through its business partner program.
Organizations could certainly use any opportunity to cut down on storage demands: While costs per gigabyte continue to drop, facilities costs for data centers remain high and the amount of data that needs to be stored has increased six-fold in the past four years with no signs of slowing.
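For perspective, that growth rate compounds quickly: six-fold over four years works out to roughly 57 percent per year.

```python
# "Six-fold in four years" implies a compound annual growth rate of:
annual_growth = 6 ** (1 / 4) - 1
print(f"{annual_growth:.0%}")  # ~57% per year
```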
Are we seeing the beginnings of a storage compression war, just as we saw a storage deduplication war start three years ago? Quite possibly: The Register reports that Storewize competitor Exar will likely get a boost as other storage vendors look to beef up their own optimization offerings. If war is imminent, vendors and users alike might take Arun Taneja’s advice from last time around, when he warned that while the need was very real, the results and benefits of the technology of the day needed to be fully vetted:
So, be prepared to see a barrage of data coming your way. I am suggesting to the vendor community that they run their products using a common dataset to identify the differences in approaches. I think you should insist on it. Without that, the onus is on you to convert their internal benchmarks to how it might perform in your environment. You may even need to try the system using your own data.
What do you think? Are storage compression appliances going to be standard issue in your upcoming storage deployments, and can you trust the public case studies to apply to your situation? Let me know in the comments or e-mail me directly.
Just because a business is considered small doesn’t make its data storage any less complex or crucial to success. Cisco’s Wednesday release of its Small Business NSS 300 Series Smart Storage line, specifically geared toward businesses with fewer than 100 employees, is a step in the right direction for the previously neglected SMB sector, reports IT News. The release follows the lead of big names like Netgear and HP, who have set the “larger industry trend of networking or data center companies attempting to be all things to their customers.”
IDC Analyst Ray Boggs attributes it to the “blurring of a lot of this technology where storage is associated with security, which is associated with the Internet, which then gets you into networking.”
The Nitty Gritty
- The series offers two-, four- and six-bay desktop network storage boxes with up to 12TB of capacity based on 2TB SATA drives.
- The storage boxes can be configured as NAS or iSCSI target devices (see the sketch after this list for what attaching an iSCSI initiator looks like).
- It’ll run you between $900 and $6,000, depending on capacity and functionality.
- An accompanying service plan, Cisco Small Business Pro Service, will run you about $150 and includes three years of tech support: software updates, hardware replacement, and call support.
- The series includes software for setting up a Web server and designing a website, on-disk data encryption to protect against compromised hard drives, and the WordPress blogging platform.
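For the iSCSI option noted above, here’s a rough sketch of what attaching a Linux initiator to one of these boxes might look like. It assumes open-iscsi is installed; the portal address and target IQN are hypothetical placeholders, not actual NSS 300 values.

```python
# Discover and log in to an iSCSI target using open-iscsi's iscsiadm.
# PORTAL and TARGET are made-up examples; substitute your own values.
import subprocess

PORTAL = "192.168.1.50"                         # storage box's IP (example)
TARGET = "iqn.2010-06.com.example:nss300.lun0"  # hypothetical target IQN

# List the targets the portal offers, then log in to the one we want.
subprocess.run(["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", PORTAL], check=True)
subprocess.run(["iscsiadm", "-m", "node", "-T", TARGET, "-p", PORTAL, "--login"], check=True)
# The LUN now shows up as a local block device (e.g., /dev/sdb), ready to
# partition and format like any direct-attached disk.
```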
What is your small business currently using as its storage platform? What are your hopes for the future of SMB storage solutions? Let me know in the comments or email me directly at Melanie@ITKnowledgeExchange.com.
Is tape storage dead? Our community members have strong opinions. According to Aguacer0:
Tape will never die, due to compliance issues. It is being replaced as near-line backup due to performance and capacity. Nowadays you will see backups being executed to disk first and then to tape (disk-to-disk-to-tape, or D-D-T). Now, with technologies like dedupe and VTL, tape is moving from the second step to the fourth step, for archival purposes. But tape will never die.
Mrdenny agrees, speculating that tape will last as long as off-site archives are needed. Meandyou uses tape for data backup: virtual tape for onsite storage and cartridge tape for offsite. ToddN2000 prefers optical media to tape, due to its superior durability.
Dee101 weighs in that migrating all of the data already committed to tape would be far too costly an undertaking, especially in a market of quickly evolving disk standards. That didn’t stop Dee101 from switching his own archives to disk recently in order to “be prepared to continue to evolve on disk.”
Petkoa doesn’t like tape, period. Instead, his company relies on disks for quick restore backups and optical storage for “long-term irrevocable backups.” Tapes – “slow and overwritable” – present a major liability “if (maybe when?) [they are] hacked.”
What are your experiences with tape storage? Have you migrated to disk or another form of data backup and archiving that you prefer? Let us know in the comments section or over at the forum, then check out HP’s blog post outlining six reasons why tape is here to stay.
Microsoft’s TechEd 2010 North America — the conference for “cutting-edge insights and expertise that will make life easier for you (and everyone else) at work” — kicked off on June 7th and is running through June 10th in New Orleans. Some of TechTarget’s top editorial bloggers are down there, soaking up what Microsoft’s dishing out, and we’ve compiled some of the highlights of their highlights right here.
Brendan Cournoyer hit on the high points of TechEd over at the Windows Server Notebook blog. First up, the keynote speech, which he said was dominated by cloud talk, with the main theme of “how IT professionals can extend the tools and data they currently use on premise (Active Directory, System Center Products, etc.) to cloud computing environments.” Bob Muglia, president of Microsoft’s Server and Tools Business, called the server the “heart of the cloud” and articulated that the “company’s goal is to bring all the features of Windows Server and SQL Server to Azure.”
The June 7th announcement of the Service Pack 1 beta for Windows 7 and Windows Server 2008 R2, slated for release at the end of July, created quite a buzz as well. Ed Tittel over at Windows Enterprise Desktop speculated that Microsoft is emphasizing SP1’s lack of new and specific features to discourage users from waiting for its release, encouraging them instead to adopt Windows 7 now. Apparently the features arriving in July’s SP1 are already available through Windows Update. Cournoyer noted that SP1 will have “Dynamic Memory for Hyper-V and RemoteFX functionality for VDI.”
Other updates and releases from the Windows Server Notebook:
- Windows Server AppFabric is now available.
- Senior marketing manager for mobile communications August Valdez gave a demo of Windows Phone 7’s features and capabilities. He described it as “basically a portable phone-version of the Office suite.”
- Internet Explorer 9 also got a mention; it will include “new ‘graphics acceleration’ features to take advantage of the latest standards and scenarios for Web browsing.”
- The Windows Intune beta was also announced, a platform for PC management tasks such as inventory, patching, and updates. Is it “a glimpse of the future,” as promised by Muglia?
- The cloud made an appearance again during Amir Netz’s presentation of Microsoft PowerPivot, supposedly where “business intelligence can be combined with — you guessed it — the cloud.” The demonstration included data made available through SharePoint 2010 and SQL Server Analysis Services and integrated with Bing Maps.
For more updates straight from Tech Heads in NOLA, check out their Twitter feed @TechEd_NA or follow Brendan’s coverage at @WindowsTT. Are you at TechEd? What are some of your favorite highlights? Let me know in the comments or email me directly at Melanie@ITKnowledgeExchange.com.
Feel stuck at a legacy company fighting smaller, nimbler competitors? Want to know how your IT shop is supposed to deliver the latest SaaS and cloud solutions when half your stores still run the original cloud software, AS/400? Today’s guest post by Keith Morrow, former CIO of Blockbuster and 7-Eleven and current president of K. Morrow Associates, might help give you the insights you need to succeed. Check back: We’ll be carrying more of Keith’s writing soon.
Since the arrival of online commerce 15 years ago, there have been few technology trends with the potential to revolutionize the retail industry like the ones we see in mobile computing, social networking, and cloud computing. As retail executives, we are challenged to deliver innovative applications that satisfy customers’ demand for commerce content and transactional ability from anywhere, on any always-connected consumer device: smartphones, set-top boxes, and even internet-connected picture frames. Furthermore, thanks to the Great Recession, the notoriously limited retail IT budget is now even tighter, so we have to deliver these applications at little or no additional cost.
As someone who has been in the same boat, I want to offer, over a series of posts, several practical insights for creating innovative, API-enabled applications. I believe the time to take action is now, and if we do things smartly, a limited budget or an aggressive timeline should not be an impediment.
Find the Perfect Entry Point, Not the Perfect Roadmap
The company that I worked for rented movies and games through retail outlets. We needed to act fast to counter the challenges posed by our competitors who were delivering the goods by mail and digitally via the web and set-top boxes.
Instead of doing lengthy research to discover the perfect product roadmap, we decided to focus on enabling developers and partners to innovate on our core service. We quickly launched a very focused API service as an entry point, and developed a simple iPhone app as a proof-of-concept for building an application on that API.
Then, working with our partners, we quickly created and released an application that customers could use from their mobile devices or set-top boxes. With it, customers could search for movie titles, browse recommendations, and decide whether to watch a movie immediately through their set-top box or browser, or pick it up from the closest retail outlet.
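To make the “focused entry point” idea concrete, here’s a minimal sketch of what such an API service could look like. The endpoint, fields, and catalog below are hypothetical illustrations, not the API we actually shipped.

```python
# A deliberately tiny "entry point" API: a single title-search endpoint
# over the core catalog. Everything here is illustrative.
from flask import Flask, jsonify, request

app = Flask(__name__)

CATALOG = [
    {"id": 1, "title": "Example Movie A", "formats": ["store", "stream"]},
    {"id": 2, "title": "Example Movie B", "formats": ["store"]},
]

@app.route("/v1/titles")
def search_titles():
    """Return catalog entries whose titles match the ?q= query string."""
    q = request.args.get("q", "").lower()
    hits = [t for t in CATALOG if q in t["title"].lower()]
    return jsonify({"query": q, "results": hits})

if __name__ == "__main__":
    app.run(port=8080)
```

The point of keeping it this narrow is that an iPhone app, a set-top box, and a partner’s website can all consume the same endpoint, so every new client builds on the same small service rather than waiting on a grand roadmap.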
The most critical decision was simply not to wait too long to decide. Once customers were using the initial app, we closely monitored the feedback they gave us, which in turn allowed us to iterate and improve on the initial entry point.
In my next entry, I’ll look into ways that retailers can shed a conservative, traditional mindset and embrace new approaches to deploying apps.
As they say, incentive makes perfect, and now there’s incentive—in addition to increased efficiency and lower cost—to green up those data centers. The Environmental Protection Agency has announced “that stand-alone data centers or buildings that house them can now receive the agency’s official recognition for being in the top percentile of the most energy efficient buildings in the category,” as reported at DatacenterDynamics.
With increased and ongoing effort to hone methods for making data centers greener by improving power management, the Energy Star rating creates a more tangible way to measure who’s ahead of the game and who’s falling behind: in other words, whose Power Usage Effectiveness (PUE) lands in the most efficient quartile and whose doesn’t. Perhaps it’s a last-ditch effort to counter the exponential growth of data and, with it, of data centers and energy consumption. From the Energy Star website:
In its 2007 Report to Congress on Server and Data Center Energy Efficiency Opportunities [PDF], EPA estimated that the nation’s servers and data centers consumed about 61 billion kilowatt-hours (kWh) in 2006 (1.5 percent of total U.S. electricity consumption) for a total electricity cost of about $4.5 billion. As one of the fastest growing sectors, national energy consumption by servers and data centers could nearly double by 2011 to more than 100 billion kWh, representing a $7.4 billion annual electricity cost. However, there is significant potential for energy efficiency improvements in data centers.
The new rating relies on the Power Usage Effectiveness metric, developed by The Green Grid, and piggybacks on the agency’s existing commercial buildings rating. There is also a ratings system in place for hardware such as servers, with a system for rating data center storage equipment still under development.
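For reference, The Green Grid defines PUE as total facility power divided by the power delivered to IT equipment, so a perfectly efficient facility would score 1.0. A quick illustration with made-up figures:

```python
# PUE = total facility power / IT equipment power (Green Grid definition).
# The input figures below are invented for illustration.
total_facility_kw = 1500.0  # everything entering the data center
it_equipment_kw = 1000.0    # power reaching servers, storage, and network gear
pue = total_facility_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")   # 1.50 here; lower is better, 1.0 is the floor
```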
Where do you expect your data center to rate on the EPA’s new energy efficiency scale? What have you done or what are you doing to improve?