According to Aguacer0:
Tape will never die, due to compliance issues. It is being replaced as near-line backup due to performance and capacity. Nowadays you will see backups being executed to disk first and then to tape (D-D-T). Now with technologies like dedupe and VTL, tape is moving from the 2nd step to the 4th step, for archival purposes. But tape will never die.
Mrdenny agrees, speculating that tape will last as long as off-site archives are needed. Meandyou uses tape for data backup: virtual tape for onsite storage and cartridge tape for offsite. ToddN2000 prefers optical media to tape, due to its superior durability.
Dee101 weighs in that the cost of replacing all of the data already allocated to tape is far too weighty an undertaking, especially in a market of quickly evolving disk standards. That didn’t stop Dee101 from switching his own archives to disk recently in order to “be prepared to continue to evolve on disk.”
Petkoa doesn’t like tape, period. Instead, his company relies on disks for quick restore backups and optical storage for “long-term irrevocable backups.” Tapes – “slow and overwritable” – present a major liability “if (maybe when?) [they are] hacked.”
What are your experiences with tape storage? Have you migrated to disk or another form of data backup and archiving that you prefer? Let us know in the comments section or over at the forum, then check out HP’s blog post outlining six reasons why tape is here to stay.
Microsoft’s TechEd 2010 North America — the conference for “cutting-edge insights and expertise that will make life easier for you (and everyone else) at work” — kicked off on June 7th and is running through June 10th in New Orleans. Some of TechTarget’s top editorial bloggers are down there, soaking up what Microsoft’s dishing out, and we’ve compiled some of the highlights of their highlights right here.
Brendan Cournoyer hit on the high points of TechEd over at the Windows Server Notebook blog. First up, the keynote speech, which he said was dominated by cloud talk with the main theme of “how IT professionals can extend the tools and data they currently use on premise (Active Directory, System Center products, etc.) to cloud computing environments.” Bob Muglia, president of Microsoft’s Server and Tools Business, called the server the “heart of the cloud,” and said the “company’s goal is to bring all the features of Windows Server and SQL Server to Azure.”
The June 7th announcement of the beta of Service Pack 1 for Windows 7 and Windows Server 2008 R2, slated for release at the end of July, created quite a buzz as well. Ed Tittel over at Windows Enterprise Desktop speculated that Microsoft is emphasizing SP1's lack of new and specific features to discourage users from waiting for its release, encouraging them to adopt Windows 7 now. Apparently the features arriving in July's SP1 are already available through Windows Update. Cournoyer noted that SP1 will include “Dynamic Memory for Hyper-V and RemoteFX functionality for VDI.”
Other updates and releases from the Windows Server Notebook:
- Windows Server AppFabric is now available.
- August Valdez, senior marketing manager for mobile communications, gave a demo of Windows Phone 7's features and capabilities. He described it as “basically a portable phone-version of the Office suite.”
- Internet Explorer 9 also got a mention; it will include “new ‘graphics acceleration’ features to take advantage of the latest standards and scenarios for Web browsing.”
- Windows Intune, now in beta, was also announced: a platform for PC management tasks such as updating, inventory, and patching. Is it “a glimpse of the future,” as Muglia promised?
- The cloud made an appearance again during Amir Netz’s presentation of Microsoft PowerPivot, supposedly where “business intelligence can be combined with — you guessed it — the cloud.” The demonstration included data made available through SharePoint 2010 and SQL Server Analysis Services and integrated with Bing Maps.
For more updates straight from Tech Heads in NOLA, check out their Twitter feed @TechEd_NA or follow Brendan’s coverage at @WindowsTT. Are you at TechEd? What are some of your favorite highlights? Let me know in the comments or email me directly at Melanie@ITKnowledgeExchange.com.
Feel stuck at a legacy company fighting smaller, nimbler competitors? Want to know how your IT shop is supposed to deliver the latest SaaS and cloud solutions when half your stores still run the original cloud software, AS/400? Today’s guest post by Keith Morrow, former CIO of Blockbuster and 7-Eleven and current president of K. Morrow Associates, might help give you the insights you need to succeed. Check back: We’ll be carrying more of Keith’s writing soon.
Since the arrival of online commerce 15 years ago, there have been few technology trends with the potential to revolutionize the retail industry like the ones we see in mobile computing, social networking, and cloud computing. As retail executives, we are challenged to deliver innovative applications that satisfy the customers’ demand for commerce content and transactional ability from anywhere, using any of the always-connected consumer devices: smart phones, set-top boxes, and even Internet-connected picture frames. Furthermore, due to the Great Recession, the notoriously limited retail IT budget is now even tighter, so we have to deliver these applications at little or no additional cost.
As someone who has been in the same boat, I want to offer, over a series of posts, several practical insights for creating innovative, API-enabled applications. I believe the time to take action is now, and if we do things smartly, limited budget or aggressive timeline should not be an impediment.
Find the Perfect Entry Point, Not the Perfect Roadmap
The company that I worked for rented movies and games through retail outlets. We needed to act fast to counter the challenges posed by our competitors who were delivering the goods by mail and digitally via the web and set-top boxes.
Instead of doing lengthy research to discover the perfect product roadmap, we decided to focus on enabling developers and partners to innovate on our core service. We quickly launched a very focused API service as an entry point, and developed a simple iPhone app as a proof-of-concept for building an application on that API.
Then, working with our partners, we quickly created and released an application that customers can use from their mobile devices or set-top boxes. With it, customers can search for movie titles, browse recommendations, and decide whether to watch a movie immediately through their set-top box or browser, or pick it up from the closest retail outlet.
The most critical decision was simply not to wait too long to decide. Once customers were using the initial app, we collected their feedback, monitored it closely, and used it to iterate on and improve the initial entry point.
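A focused entry-point API of this kind can start very small. The sketch below is purely illustrative (the catalog, field names, and functions are hypothetical, not the actual service): a search-and-recommend core that a mobile or set-top client could sit on top of.

```python
# Hypothetical in-memory catalog; a real service would back this
# with a database and expose these functions as HTTP endpoints.
CATALOG = [
    {"title": "Heat",  "genre": "crime",  "in_store": True,  "streamable": True},
    {"title": "Up",    "genre": "family", "in_store": True,  "streamable": False},
    {"title": "Alien", "genre": "sci-fi", "in_store": False, "streamable": True},
]

def search(query):
    """Case-insensitive substring match over titles."""
    q = query.lower()
    return [m for m in CATALOG if q in m["title"].lower()]

def recommend(genre):
    """Recommend titles in a genre that the customer can get right now."""
    return [m["title"] for m in CATALOG
            if m["genre"] == genre and (m["streamable"] or m["in_store"])]

def fulfillment_options(title):
    """How can the customer watch: stream immediately, or pick up in store?"""
    movie = next((m for m in CATALOG if m["title"] == title), None)
    if movie is None:
        return []
    options = []
    if movie["streamable"]:
        options.append("stream")
    if movie["in_store"]:
        options.append("pickup")
    return options
```

The point of keeping the surface this narrow is exactly the one made above: partners can build against three calls on day one, and the API can grow once real usage data arrives.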
In my next entry, I’ll look into ways that retailers can shed a more conservative, traditional mindset and embrace new ways for deploying new apps.
As they say, incentive makes perfect, and now there’s incentive—in addition to increased efficiency and lower cost—to green up those data centers. The Environmental Protection Agency has announced “that stand-alone data centers or buildings that house them can now receive the agency’s official recognition for being in the top percentile of the most energy efficient buildings in the category,” as reported at DatacenterDynamics.
With increased and ongoing efforts to make data centers greener by improving power management, the Energy Star rating creates a more tangible way to measure who’s ahead of the game and who’s falling behind, i.e., whose buildings don’t score in the top 25 percent for energy performance. Perhaps it’s a last-ditch effort to counter the exponential growth in data, and thus of data centers and energy consumption. From the Energy Star website:
In its 2007 Report to Congress on Server and Data Center Energy Efficiency Opportunities [PDF], EPA estimated that the nation’s servers and data centers consumed about 61 billion kilowatt-hours (kWh) in 2006 (1.5 percent of total U.S. electricity consumption) for a total electricity cost of about $4.5 billion. As one of the fastest growing sectors, national energy consumption by servers and data centers could nearly double by 2011 to more than 100 billion kWh, representing a $7.4 billion annual electricity cost. However, there is significant potential for energy efficiency improvements in data centers.
The rating piggybacks on the agency’s already-existing commercial buildings rating and relies on the Power Usage Effectiveness (PUE) metric developed by The Green Grid. There is also a ratings system in place for hardware such as servers, with a system for rating data center storage equipment in development.
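PUE itself is simple arithmetic: total facility energy divided by the energy that actually reaches the IT equipment, so 1.0 is the theoretical ideal. A minimal sketch, with illustrative numbers:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    1.0 is ideal; every point above it is cooling, distribution loss,
    lighting, and other overhead."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative: a PUE of 2.0 means only half the power drawn by the
# building is doing computing work.
print(pue(1_000_000, 500_000))  # 2.0
```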
Where do you expect your data center to rate on the EPA’s new energy efficiency scale? What have you done or what are you doing to improve?
When you hear the word “storage,” what typically comes to mind? Likely NAS, SANs, DAS, data centers and so on, right? Well, there’s another component to storage that tends to get overlooked…at least in the context of oversight, security, and compliance. That is: mobile storage. From smartphones to external hard drives to iPads and beyond, there’s easily as much storage capacity on your business’s mobile systems as there is in the traditional storage environment. That’s a big deal.
So how are you keeping that storage environment under wraps? Management will often proclaim, “We have a policy against placing sensitive information on mobile devices,” and go on to say, “We trust our users to do the right thing.” This is all fine and dandy on paper, but in reality, any given business is bearing a lot of risk due to unprotected – or underprotected – mobile storage. If I were a betting man – and I am – I’d venture that 95% of all storage-related risk is in your users’ hands, literally.
Keep this in mind when securing your storage systems; there’s a big payoff to be had if you do. Otherwise, when someone – such as a business partner, auditor, regulator, or opposing legal counsel – asks you if your storage environment is secure, what can you truthfully say?
For further reading, check out the pieces I’ve written for TechTarget on mobile storage security.
I’m in the middle of writing a whitepaper on data protection for CSOs, and it occurred to me just how often storage systems are overlooked in security testing. The typical security assessment covers servers, workstations, mobile devices, databases, Web applications, WiFi, and network infrastructure systems. You rarely see or hear of anyone specifically scoping storage systems. Why is this? Do people just assume they’re secure because they run on a hardware appliance, or because they paid a gazillion dollars for them and surely someone thought about security along the way?
The reality is, if it has an on/off switch and an IP address, it’s fair game on the network. Not only do high-end NAS and SAN storage systems meet these criteria, but they also have other attack surfaces – especially Web interfaces – that make them that much more susceptible to attack. Unfortunately, such IPs and URLs may or may not be tested during any given internal vulnerability assessment depending on the scope and how deep the tester looks.
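The first step in bringing those boxes into scope is just confirming what answers on the network. Here's a minimal sketch of that idea; the port list is an illustrative assumption (real arrays vary by vendor), and a proper assessment would of course go far beyond a TCP probe:

```python
import socket

# Ports where management web interfaces commonly live; an
# illustrative list, not an authoritative inventory.
MGMT_PORTS = {80: "http", 443: "https", 8080: "http", 8443: "https"}

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def management_urls(host, ports=MGMT_PORTS):
    """Build candidate management-interface URLs for ports that answer,
    for a vulnerability scanner or a human to follow up on."""
    return [f"{scheme}://{host}:{port}/"
            for port, scheme in sorted(ports.items())
            if port_open(host, port)]
```

Feeding every storage controller IP through something like this, and handing the resulting URLs to whoever runs your internal vulnerability assessment, ensures those interfaces at least get looked at.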
Whether you do it yourself or hire an independent information security consultant, when it comes time to scope your next security assessment, be sure to include your storage environment. If you don’t find the weaknesses, surely a bored or malicious insider will. Better to be proactive for something so critical to your business.
It’s a dark day for networking professionals, particularly the Cisco fans in their ranks: Apple’s iOS 4 will make their lives harder in more ways than one, but Cisco itself doesn’t seem to care.
While millions of gadgeteers worldwide oohed and aahed at the iPhone and iPad’s latest features, including video calling and better ads, networking professionals groaned, because finding information about their bread and butter, Cisco’s ubiquitous router and switch operating system IOS, just got harder.
As if it wasn’t bad enough that the iPhone and iPad are both notorious for causing trouble on campus networks, now Cisco fans face sifting through Apple-related information when trying to troubleshoot their hardware or just keep up on the latest news.
At first, many (including myself) thought this might be a swipe by Apple at Cisco, after the latter tried enforcing its iPhone trademark when Apple first launched its mobile device line. This time, however, Apple did it all legal-like, licensing rights to the term “iOS” from Cisco.
Looks like Cisco fans better brush up on their advanced search queries if they want to stay ahead of the Apple fanboys and girls.
The power of electricity has long been known, but it is only in recent years, with a focus on both going greener and saving money, that power management has again taken high priority among large organizations. This guest post from Pam Seale, product marketing manager with Absolute Software, walks through some critical steps for keeping power consumption in line.
Few organizations will dispute the value of power management policies. Not only is energy conservation an important part of environmental stewardship, but by defining when inactive computers switch to a lower power setting or turn off, organizations can easily reduce costs and please the bottom line.
It seems, however, that even fewer organizations are certain how to implement power management policies that both make sense for their work environment and are easily enforceable. So what are some of the key things to keep in mind when designing a power management policy?
Understand your current power reality. To find out where efficiencies can be gained and to establish a benchmark for measuring success, you need to know how much power your computers are currently consuming. The fastest way to collect this information is via a power management product: typically an installed agent that reports detailed metrics on energy use for each device and the overall fleet. You can also enlist the help of a power management ROI calculator to determine how much you can potentially save.
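The math behind such an ROI calculator is back-of-the-envelope stuff. The sketch below is a hypothetical illustration; the wattages, idle hours, and electricity rate are assumptions, not figures from any vendor's tool:

```python
def annual_savings_usd(machines, idle_watts, sleep_watts,
                       idle_hours_per_day, rate_per_kwh, days=365):
    """Estimate yearly savings from putting idle machines to sleep
    instead of leaving them at full idle draw. All inputs are
    assumptions you should replace with measured numbers."""
    saved_watts = idle_watts - sleep_watts
    kwh_saved = machines * saved_watts * idle_hours_per_day * days / 1000
    return kwh_saved * rate_per_kwh

# Illustrative: 1,000 PCs drawing 80 W idle vs. 3 W asleep,
# idle 14 hours a day, at $0.10 per kWh.
print(round(annual_savings_usd(1000, 80, 3, 14, 0.10)))  # 39347
```

Even with conservative inputs, the per-fleet numbers make it clear why the bottom line, not just stewardship, drives these policies.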
Note policy metric considerations.
- Work habits of internal teams: Power management schedules should be flexible to account for various users’ work hours; management products should allow you to define groups to which unique metrics can be applied.
- Power source settings: A battery-powered computer should probably be set to power down or hibernate after a shorter duration of inactivity than a device that is plugged into the wall. Power management policies should acknowledge this.
- Flexible actions: To accommodate diverse users, power management actions should be flexible: log out, hibernate, sleep modes, shut down, etc. Action triggers (how long a device must be inactive to activate them) should be equally accommodating.
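Taken together, the considerations above reduce to a simple decision rule: pick an action from how long the device has been idle and what's powering it. A hypothetical sketch (the thresholds are illustrative; a real product would make them configurable per user group):

```python
def power_action(idle_minutes, on_battery):
    """Map inactivity and power source to an action.
    Battery-powered devices step down sooner, per the policy
    consideration above; thresholds are illustrative."""
    sleep_after = 10 if on_battery else 20        # minutes of inactivity
    hibernate_after = 30 if on_battery else 60
    if idle_minutes >= hibernate_after:
        return "hibernate"
    if idle_minutes >= sleep_after:
        return "sleep"
    return "stay awake"
```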
Refine results. Power management tools should allow you to examine both current and historical power use, power on time, etc. Comparing these metrics will reveal where greater efficiency can be achieved.
Take advantage of rebates. Government and provider rebates, grants and subsidies are available to organizations that implement computer power management policies, and can cover some or all of the costs of your power management tools.
With the right tools in place and a basic understanding of your organization’s power use, power management policies need not be a daunting task. A number of products are available today to help simplify the design and management of your power policies; it’s simply a matter of finding the solution that supports the above capabilities and best fits your organization’s needs.
For more information on power management policies, and to learn about the power management capabilities of Absolute Manage – a cross-platform computer lifecycle management tool by Absolute Software – visit www.absolute.com/power or view the Absolute Software blog.
Even if your company were immune to being wiped out by a natural disaster, you’d have thwarted only about one percent of the threats against your data. Don’t get left in the lurch when freak – or not-so-freak – accidents strike; back it up. For the ins, outs, pros, cons – and other short, plural words – of storage best practices, check out these blogs:
- StorageRap: Marc Farley is an author and blogger, covering topics such as storage networking, technology, applications and markets.
- Storagezilla: Mark Twomey is zero parts lizard and all parts EMC whiz. This is his personal blog, however, and EMC is in no way involved with what he blogs about.
- The Storage Anarchist: Another chip off the EMC block, Barry Burke shares his personal politics regarding all things storage.
- Storage Soup: The editors at SearchStorage.com bring you a “lighthearted review of the latest industry chatter, trends and products in the storage arena.” Bon appétit!
- Pack Rat: Stephen Foskett is a self-proclaimed pack rat—in his personal life. Professionally, he’s a vendor-independent storage consultant.
- The Storage Alchemist: Storage technology industry leader Steve Kenniston blogs about his experiences in the storage community.
- The Storage Effect: Seagate’s blog dedicated specifically to all things storage.
- Online Storage Optimization: Carter George, co-founder of Ocarina Networks, provides industry commentary and info on Ocarina’s “unique storage optimization technology.”
- Chuck’s Blog: An EMC insider himself, Chuck Hollis blogs on all things technology including, you guessed it, storage.
- StorageMojo: The founder of TechnoQWAN LLC, Robin Harris also blogs at his ZDnet blog, Storage Bits.
According to Friday’s IDC Worldwide Quarterly Disk Storage Systems Tracker, as reported by IT News’ Lucas Mearian, external disk storage sales grew 17.1 percent year over year, with $5 billion in revenue in Q1 of 2010. The total disk storage systems market posted revenue of $6.7 billion (18.8 percent growth) and shipped 3,397 petabytes of capacity, up 55.2 percent.
For a bit of perspective, this comes after a 4 percent drop in sales last quarter for external disk storage systems.
IDC analyst Steve Scully seems hopeful that this is indicative of what’s to come, that “people are looking to increase their IT spend,” but it’s too soon to tell.
The NAS market is the star of this story as companies attempt to manage the exponential growth of unstructured data. The other Cinderella is NetApp, whose unified storage products helped it jump from fourth place to a tie with IBM for second in revenue share after 47 percent growth.
A Whole Bunch of Numbers
EMC leads the NAS market with 45.1 percent revenue share; NetApp trails right behind with 26.9 percent. The iSCSI SAN market experienced a revenue growth of 45.7 percent; Dell’s at the forefront with 36.9 percent and NetApp next with 14.4 percent.
The whole networked disk storage market is up 26.3 percent, with EMC up front at 28.7 percent revenue share, followed by, you guessed it, NetApp at 13.7 percent.
Despite Scully’s hesitant optimism, these numbers hint that storage is climbing the priority list, which could mean increased IT budgets and spending overall.
What are you and your company allocating to solutions for storage in 2010? What are your predictions for the key players in storage solutions for the remaining quarters?