It’s another big week here in Boston with the approach of the USENIX Annual Technical Conference 2010, and the buzz around Microsoft Research’s upcoming announcements all includes one word: “cloud.” So what exactly do they have up their sleeves?
With the growing number of players in the cloud market, including Microsoft’s own Windows Azure Platform, Microsoft Research has come to the rescue with a framework to help customers select a cloud provider [PDF]. Its benchmarking tools predict an application’s cost and performance once deployed, allowing for more detailed comparisons among cloud services.
This technology aims to eliminate the disparity between how applications perform under light versus heavy workloads by scheduling requests more intelligently. Rather than sending every request the moment it occurs, Stout [PDF] will “[treat] scalable key-value storage as a shared resource requiring congestion control.” Under light workloads, requests are sent immediately; under heavier workloads, Stout automatically batches them to prevent queuing delays.
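Stout’s details are in the paper, but the core idea (widen the batching window as the store gets congested, shrink it when it keeps up) can be sketched roughly. Everything below, from the class name to the AIMD-style constants and the 50 ms latency target, is hypothetical illustration, not Stout’s actual algorithm:

```python
class AdaptiveBatcher:
    """Toy sketch of congestion-aware request batching (not Stout's real
    implementation): widen the batching window when observed storage
    latency rises, shrink it back when the store is keeping up."""

    def __init__(self, send, min_delay=0.0, max_delay=0.200):
        self.send = send            # callable taking a list of requests
        self.delay = min_delay      # current batching window, in seconds
        self.min_delay = min_delay
        self.max_delay = max_delay
        self.pending = []

    def observe_latency(self, latency, target=0.050):
        # AIMD-style adjustment: back off multiplicatively under
        # congestion, recover additively when latency looks healthy.
        if latency > target:
            self.delay = min(self.max_delay, max(0.010, self.delay * 2))
        else:
            self.delay = max(self.min_delay, self.delay - 0.005)

    def submit(self, request):
        self.pending.append(request)
        if self.delay == self.min_delay:
            self.flush()            # light load: send immediately

    def flush(self):
        # Called by a timer every `delay` seconds under heavier load.
        if self.pending:
            batch, self.pending = self.pending, []
            self.send(batch)
```

Under light load the window stays at zero and every request goes out immediately; once latency climbs, requests accumulate and get sent as one batch per window, which is the queuing-delay trade-off the paper describes.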
Utility Coprocessor, or Ucop
This middleware delivers “dramatic speedups of parallelizable, CPU-bound desktop applications using utility computing clusters in the cloud.” The team has built a Linux-based prototype that allows a single cluster to serve everyone, requiring only that the users and the cluster run the same major kernel version. Learn more about how they manipulated client configurations in their white paper [PDF].
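For intuition on how far a “utility computing cluster” can push a CPU-bound desktop app, Amdahl’s law gives the standard upper bound on speedup. This is generic back-of-envelope reasoning, not anything from the UCop paper:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Amdahl's law: the maximum speedup when only part of a job
    parallelizes. With fraction p parallel over n workers, speedup is
    bounded by 1 / ((1 - p) + p / n), no matter how big the cluster."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)
```

So an app that is 95% parallelizable tops out around a 17x speedup on 100 cloud workers, which is why the paper restricts its claim to “parallelizable, CPU-bound” workloads.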
This—performance isolation for cloud datacenter networks [PDF]—is the answer to the lack of control over how networks are shared, which opens cloud applications to interference and unpredictable performance: an “edge-based solution that achieves max-min fairness across tenant VMs by sending traffic through congestion-controlled, hypervisor-to-hypervisor tunnels.”
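“Max-min fairness” has a precise meaning: raise every tenant’s share equally until the link is full, cap anyone who wants less than an equal share at their demand, and redistribute the leftover. A rough sketch of that allocation rule (the fairness goal the paper names, not its hypervisor-to-hypervisor tunnel mechanism) in Python:

```python
def max_min_fair(capacity, demands):
    """Compute a max-min fair allocation of link capacity among tenants.
    demands maps tenant -> requested bandwidth. Repeatedly split the
    remaining capacity equally among unsatisfied tenants; tenants who
    need less than their share are capped at their demand, and the
    surplus is redistributed in the next round."""
    alloc = {t: 0.0 for t in demands}
    remaining = dict(demands)       # unmet demand per tenant
    cap = float(capacity)
    while remaining and cap > 1e-9:
        share = cap / len(remaining)
        cap = 0.0
        satisfied = []
        for t, d in remaining.items():
            give = min(share, d)
            alloc[t] += give
            remaining[t] = d - give
            if remaining[t] <= 1e-9:
                satisfied.append(t)
            cap += share - give     # surplus to redistribute
        for t in satisfied:
            del remaining[t]
    return alloc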
Check out the USENIX ATC 2010 website for further information on the conference, speakers and announcements to come. Are you planning to attend the conference? Give us updates and opinions in the comments section or email me directly.
If you thought your data was getting crowded in your storage center, you haven’t seen the cloud storage market lately. Verizon dealt itself into the game on Tuesday with its Verizon Cloud Storage storage-as-a-service offering. Distinguishing itself from consumer-focused services such as Amazon’s S3, Verizon promises enterprises superior value and access speed by combining its global data network with storage capacity. According to IT News, Verizon’s first nodes are set to go live in the U.S. in October; until then, it will jumpstart the service in July using Nirvanix’s Storage Delivery Network. The service is pay-as-you-go, starting at $0.25 per gigabyte per month, with volume discounts.
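At the published $0.25/GB/month starting rate, budgeting is simple arithmetic. The discount tiers in this sketch are invented purely for illustration, since Verizon hasn’t published its volume pricing:

```python
def monthly_cost(gb, tiers=((1024, 0.25), (10240, 0.22), (float("inf"), 0.18))):
    """Back-of-envelope tiered storage pricing. Only the $0.25/GB base
    rate comes from Verizon's announcement; the tier breakpoints and
    discounted rates here are hypothetical. Each tier is (upper bound
    in GB, price per GB/month) and applies to the GB within it."""
    cost, prev_cap = 0.0, 0
    for cap, rate in tiers:
        if gb <= prev_cap:
            break
        billable = min(gb, cap) - prev_cap  # GB that fall in this tier
        cost += billable * rate
        prev_cap = cap
    return cost
```

So 500 GB runs $125/month at the base rate, and under these made-up tiers a 2 TB customer would pay the base rate on the first terabyte and the discounted rate beyond it.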
Despite the concerns enterprises have over security, reliability, and data recovery and retrieval, big names like EMC and AT&T entered the arena last year, joining Amazon’s S3 with similar offerings. And though more carriers may join the game, Verizon is highlighting the benefits of being on its global IP network: data access with fewer network hops, along with Verizon’s security infrastructure.
Customers will be able to access data using:
- Application programming interfaces
- Third-party applications and backup agents
- Standard CIFS
- Standard NFS
- Browser-based portals for data management between Nirvanix and Verizon’s data centers
The service is paired with Verizon’s new consulting arm, Verizon Data Retention Services, which guides businesses through developing storage policies to best suit their needs and operations.
What are your thoughts on the cloud storage bandwagon? What are your concerns surrounding switching your storage to the cloud with a new service like Verizon’s SaaS?
If you’re like me, you’ve yelled, cussed, and screamed over frustrations brought on by backup software. Let me share my experiences.
First off, I remember, back in the late 80s/early 90s, using a Colorado drive (remember those?!) with DOS-based software that just worked. Besides the occasional hardware burp, I knew I could set my 250MB system to back up overnight and by dawn it’d likely be finished.
Flash forward a few years to the mid-to-late 1990s and what did we have? More complexity. Sure, products like Arcserve and BackupExec had lots of features, but they also brought tons of complexity. I just needed to run backups, but half the time I couldn’t figure out how, with all the bells and whistles getting in the way. Perhaps I should’ve taken a class. On top of that, the software was extremely unstable and unreliable. Growing pains of that era, I suppose. I miss the simplicity of the networks, OSs, and so on from those days but certainly don’t miss all the headaches backup software brought on.
Flash forward to today: I don’t hear about such problems. Have the vendors finally gotten it? I started witnessing this six or eight years ago with a cool new disk-imaging software company called Acronis and its product True Image. Acronis not only brought innovation to the table (being able to image a drive while the OS was running), but it helped me realize that backups can indeed be simple again. I thought Acronis True Image picked up where Ghost and Drive Image left off, and they’ve continually innovated to the point where backups just work. No fuss, no muss, just good old reliable backups.
According to the recent SearchStorage.com quality awards, Acronis is holding its own against the big guys now, which is pretty impressive. It shows how a small company willing to innovate can take on the establishment. I love free enterprise! Don’t take this the wrong way: I’m no spokesperson. However, when I find software that works well, I’m going to sing its praises and tell others about it.
Speaking of innovation, check out this graphical depiction of the history of computer storage. Very cool stuff that makes you realize how far we’ve come.
Kevin Beaver is an independent information security consultant, keynote speaker, and expert witness with Principle Logic, LLC and a contributor to the IT Watch Blog.
In light of this month’s theme, Storage in 2010, we’ve compiled some books on data storage and more energy-efficient IT operations. Have you read one of these titles or did we leave a great book out? Let me know at Melanie@ITKnowledgeExchange.com and we’ll add it to our list!
- The Green and Virtual Data Center by Greg Schulz: (CRC/Auerbach Publications, 2009) Rather than focusing on the political side of the green movement, Schulz highlights the realistic and technologically possible ways to upgrade to or create a more sustainable data center.
- Grow a Greener Data Center by Douglas Alger: (Cisco Press, 2009) “A guide to building and operating energy-efficient, ecologically sensitive IT and facilities infrastructure.”
- Information Storage and Management: Storing, Managing and Protecting Digital Information by EMC: (Wiley, 2009) A wide angle view of “concepts, principles, and deployment considerations – rather than product specifics.”
- Storage, Data and Information Systems by John Wilkes, Christopher Hoover, Beth Keer and Pankaj Mehra: (Hewlett-Packard Laboratories, 2008) Described as a “refreshingly straightforward introduction to an important area,” check this title out if you want the facts in an easy-to-understand format.
- Storage Area Networks for Dummies, 2nd Edition by Christopher Poelker and Alex Nikitin: (2009) From the basic “What are SANs?” SANs for Dummies addresses all of your questions to help you navigate your way through the subject.
- Storage Virtualization: Technologies for Simplifying Data Storage and Management by Tom Clark: (Addison-Wesley Professional, 2005) An objective and technical look at one of the saviors of storage managers.
- Storage Networks Explained, 2nd Edition by Ulf Troppens, Rainer Erkens, Wolfgang Mueller-Friedt, Rainer Wolafka and Nis Haustein: (Wiley, 2009) For the ins and outs of SANs – the uses, forms and benefits – check out this book. A good fit for anyone planning or operating IT systems.
What books or guides have helped you figure out your storage needs and best practices?
In preparation for my session at the upcoming Gartner Security Conference, I’ve been reviewing Intel’s Anti-Theft Technology. Have you seen it? It’s pretty neat and is a unique approach to the mobile security dilemma.
Basically, Intel is starting to integrate laptop security into its 2010 Intel Core hardware, which promises to:
- Detect suspicious behavior that could indicate someone trying to break into the computer.
- Guard your hardware even if your hard drive is removed, replaced or reformatted.
- Restore operation when (if) the laptop is recovered.
Intel claims the technology will work even if someone re-images the system, changes the boot order, installs a new drive, or keeps the system off the Internet.
Now this is change we can believe in!
I’ve always thought that unless and until vendors integrate controls such as Intel’s Anti-Theft Technology and factory-enabled drive encryption, we’re going to keep seeing a ridiculous number of mobile security breaches. Sure, these technologies won’t run themselves, but having them built in will dramatically increase the chances that they’ll actually be used.
I’ll give it a few more years, but I think my continual ranting about mobile security may eventually come to an end.
Kevin Beaver is an independent information security consultant, keynote speaker, and expert witness with Principle Logic, LLC and a contributor to the IT Watch Blog.
After a sleepy first quarter, storage area network (SAN) switch and adapter sales are booming as storage professionals look to get a grip on explosive data growth, according to numbers crunched by Infonetics Research’s Michael Howard.
“In the first quarter of 2010, Fibre Channel and Fibre Channel over Ethernet SAN switch sales were unseasonably up, thanks mainly to Cisco, while SAN adapter sales were seasonally down, creating a flat sequential quarter for the overall storage network equipment market,” he wrote in a recent research note. “However, the market is up 24% from this quarter a year ago, and, with data and video being created at unprecedented levels and concentrated in data centers, companies will continue to invest in SAN switches to reduce complexity on the data center floor.”
And Howard thinks the SAN love is here to stay, projecting the market will grow to $6.5 billion by 2014, more than double its 2009 revenues. The technology has been particularly popular in the North American market, where video production and transmission have pushed media moguls to find a better way to manage their storage farms, but with more corporate video training, communications, and HD video circulating digitally, I wouldn’t be surprised to see much of the rest of the world follow fast.
It’s Storage in 2010 Month here at IT Knowledge Exchange, so we thought we’d share what some of our bloggers around the community block are saying about storage. Have a storage topic you’d like covered? Let me know in the comments or email me directly at Melanie@ITKnowledgeExchange.com. Enjoy!
- Fibre Channel or iSCSI? Raj Perumal answers the age-old storage question over at The musings of an IT Consultant blog.
- An open source clustered NAS deep dive: Storage Soup’s Beth Pariseau brings us an in-depth Q&A with Gluster CTO and co-founder, Anand Babu Periasamy, on their development of an “open-source, software-only, scale out NAS system for unstructured data.”
- How do I know if I should be using RAID 5 or not? Mrdenny explores the numbers of the RAID question over at SQL Server with Mr. Denny.
- Symantec addresses concerns and re-org rumors: Beth Pariseau of Storage Soup delves into the speculation that Symantec’s Storage and Availability Management group was planning to “divest entirely from the storage business.”
- Powering up and down a storage array may not be done in the order you think: Get step-by-step best practices for powering up and down your storage array from SQL Server with Mr. Denny.
- Let the primary data reduction deals begin: Get another perspective on IBM’s acquisition of Storwize from Dave Raffo at Storage Soup.
And as always, check back at the Enterprise IT Watch blog for the latest on our monthly themes and topics.
Storage Magazine’s recent piece “Making the case for solid-state storage” brought up some interesting points regarding the future of storage technology.
First, will solid-state storage finally bring an end to data loss from drive failures? As we’ve all learned the hard way, fewer moving parts generally means greater reliability. Speed is another factor, with performance gains of seven to nine times over traditional storage. Beyond processing efficiency, convenience, and user productivity, such speed improvements can have an enormous impact on disaster recovery and business continuity efforts. Perhaps this will be the factor that finally pushes management to get on board with DR/BC. Or perhaps it’ll become an excuse: “If we can use solid-state storage devices to recover our systems much more quickly, what’s our incentive to build out an elaborate DR/BC program?”
It’s funny: I remember studying the inner workings of solid-state memory while getting my undergraduate degree in the early 1990s, way before “solid state” as we know it today was cool. Maybe it’s just my perception, but this technology seems a bit late to the game. I like what solid-state storage stands for and brings to the table, but given where we are in other areas of technology, it’s just not all that exciting.
Maybe my perception will change once I get my hands on some of this hardware.
In the news release announcing the 2005 merger of Symantec and Veritas, the new entity was said to be “uniquely positioned to deliver information security and availability solutions across all platforms, from the desktop to the data center.” Symantec’s Chairman and CEO, John W. Thompson, speculated that its “customers [were] looking to reduce the complexity and cost of managing their IT infrastructure and drive efficiency with fewer suppliers.”
If that still holds true, customers got more of what they want (and less of what they don’t) in yesterday’s acquisition of humyo by Trend Micro. The pairing mirrors the ongoing convergence of data storage and data security.
From the Trend Micro website:
The acquisition of humyo aligns nicely with Trend Micro’s strategy to deliver industry leading cloud security technology and products to stop threats where they emerge, on the internet, and provide security for and from the cloud.
Trend Micro ranks third behind Symantec and McAfee in the antivirus and security sector, and it’s a safe bet that this acquisition is an attempt to expand its offerings, and therefore its customer base. After all, if you’re already offering security for the cloud, why not go the extra step and provide the cloud as well? One-stop shopping seems to be the trend in IT storage these days, as does the ever-blurring line between storage and security. From MSPmentor’s Joe Panettieri:
In some cases, the walls between online storage and security are fading away or disappearing entirely. For instance, CA Technologies is reinventing its security and storage lineup to embrace managed services pricing. Symantec is expected to take similar steps soon, according to online forum chatter. And master MSPs like Ingram Micro Seismic and Virtual Administrator are blending storage and security services for VARs and MSPs.
What are your thoughts on this acquisition and its implications for the storage industry? Is cloud storage replacing cloud backups? Share your thoughts in the comments section or email me directly.
Acronis has a pretty neat—and free—tool called Drive Monitor that’s worth checking out. The premise of the program—”to increase customer awareness about the health of their disk drives and to encourage them to back up their data in order to survive a disk failure”—struck a chord when I saw the offer. It seems a little backwards, but I believe the Acronis folks have taken a good approach.
It’s true—many people tend not to take backups seriously unless or until it appears that something bad is about to happen. I’ve seen a lot of businesses banking on the reliability and uptime of critical drives and storage systems while ignoring the maintenance and oversight such systems require. This mindset is directly related to the common information privacy and security excuses you hear from management, such as:
- “We don’t have anything the bad guys would want.”
- “We have a firewall, encryption, and anti-virus, so we’re good.”
- “Our auditor told us we’re in compliance with the important federal regulations.”
I don’t know which is the more primitive of information security controls—using strong passwords or backing up data—but it’s really interesting how many businesses overlook both.
The way I see it, at least vendors such as Acronis see the need and are offering preventive tools to keep bad things from happening. Sure, we can’t force data backups on others, but at least resources exist for those looking for a solution.