We’ve got the second installment of Keith Morrow’s three-part series, No Time, No Budget, and No People? No Problem! Straight from the former CIO of Blockbuster and 7-Eleven and current president of K. Morrow Associates, learn how acting like a start-up and maximizing the assets you already have can save you money and precious time when deploying applications in the cloud. Check back soon for part three!
Since the arrival of online commerce 15 years ago, few technology trends have had the potential to revolutionize the retail industry like mobile computing, social networking, and cloud computing. Today’s piece looks at how retailers can shed a conservative, traditional mindset and embrace new ways of deploying apps, delving into practical insights for creating innovative, API-enabled applications. More specifically, it explains how leveraging the move to the cloud can be the smartest way to build out an API strategy.
Think Modular and Act Entrepreneurial, on the Cloud
Many retailers are very conservative when it comes to technology adoption, and they tend to closely control where new apps are deployed. Due to our limited budget, we had no choice but to embrace a new way of deploying our new apps.
Had we done it the old way, we would have acquired and configured the database, application, and web servers ourselves. We would have had to negotiate a long-term hosting agreement worth millions of dollars, and the agreement would have had to go through a lengthy legal and executive approval process. Instead we acted like a startup and launched our API service and the API-enabled applications on the cloud, outside the confines of our firewall, with the help of a technology partner. We bought capacity only to the level we needed, and as the services gained customer adoption, we added more. With this strategy, we avoided high upfront fixed costs and turned them into variable expenses.
Don’t Build Everything from Scratch
Some retail technologists see any initiative as an opportunity to re-engineer and rebuild. We didn’t have that luxury. We also realized that we already had valuable digital assets and enabling applications available, in-house or externally through our partners. The constantly updated movie library was already there. Our store locator engine was built. We had a transaction engine and a payment gateway. What we needed to do was create a common API service layer that would enable new applications to access those services consistently, for many more customers (millions), and in a way that we could monitor analytically for future improvements.
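In spirit, that common service layer is just a thin facade that routes every client call to an existing backend service and records the call for later analytics. Here is a minimal, hypothetical sketch of the idea; the endpoint names and service objects are illustrative, not the actual implementation:

```python
# Hypothetical sketch of a common API service layer that fronts
# existing in-house services (catalog, store locator, payments).
# All names here are illustrative, not the actual implementation.

class ApiServiceLayer:
    """Routes API calls to pre-existing backend services and logs
    every call so traffic can be analyzed later."""

    def __init__(self, catalog, locator, payments):
        self.catalog = catalog      # existing movie-library service
        self.locator = locator      # existing store-locator engine
        self.payments = payments    # existing payment gateway
        self.metrics = []           # request log for analytics

    def handle(self, endpoint, **params):
        routes = {
            "titles/search": self.catalog.search,
            "stores/nearby": self.locator.nearby,
            "checkout/pay": self.payments.charge,
        }
        self.metrics.append(endpoint)   # every call is measurable
        return routes[endpoint](**params)
```

The point of the design is that mobile apps, the website, and partner integrations all talk to one consistent surface, while the pre-existing engines underneath stay untouched.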
We looked outside of our organization and found a SaaS vendor whose technology enabled us to create this API service layer quickly, get it up and running on the cloud, and use analytical reporting tools to monitor traffic and conversion data. We also used the same graphical user interface designs across different consumer devices, making only minor tweaks for usability. The key is to leverage existing solutions to accelerate time to market before your customers leave you. Without technology from this vendor, it would have taken us five to ten times as long to deliver what we wanted to.
In my third and final part of this series, I’ll discuss the strategic benefits that can result from extending the reach of APIs to developers and partners.
Every silver lining has to have a cloud attached to it, and a headline from Xinhuanet’s Sci & Tech section provides just that. The silver lining? Technology implanted in a human to improve quality of life. The cloud? As with most exciting and cutting-edge technology, a lack of proper security. Thus Dr. Mark Gasson, a British scientist, has laid claim to being the first man infected with a computer virus.
Before you grab your yellow outbreak suit and throw your computers out a window, full disclosure: He infected himself. Dr. Gasson set out to demonstrate the danger of further development of medical devices such as pacemakers and cochlear implants without equal development of security.
“With the benefits of this type of technology come risks,” Dr. Gasson told Xinhuanet. “We may improve ourselves in some way but much like improvements with other technologies, mobile phones for example, they become vulnerable to risks, such as security problems and computer viruses.”
The chip implanted in the doctor’s wrist allowed him access to secure buildings and his mobile phone. Once he contaminated the chip, the planted virus was able to pass on to external control systems. This discovery, and the ease with which the experiment was executed, was cause for concern for Professor Rafael Capurro of Germany’s Steinbeis-Transfer-Institute of Information Ethics. He weighed the pros and cons of implant surveillance, telling the BBC: “Surveillance can be part of medical care, but if someone wants to do harm to you, it could be a problem.”
Both Dr. Gasson and Professor Capurro shared their findings and concerns at Australia’s International Symposium for Technology and Society this month.
With security always an underlying concern in all areas of technology, what is your take on the quest for security as developed as the technology it protects? Is Dr. Gasson’s experiment just another case of preaching to the choir, or will it take a threat to human well-being, rather than just data, to finally shape up the standards for security?
HP’s TechForum 2010 in Las Vegas kicked off Monday and runs through Thursday, June 24. This year’s hot topic? Converged infrastructure.
Yesterday’s big announcement was StoreOnce, HP’s new deduplication software. The software, now available for HP’s new D2D4312 disk backup appliance, promises to minimize every component of deduplication: hardware requirements, storage capacity, and memory and disk I/O requirements.
HP StoreOnce’s Deduplication Bells and Whistles
In keeping with the old adage “less is more,” HP StoreOnce aims to strip down the deduplication process for more efficient data protection and storage. What exactly does the new software promise?
- Deployable at multiple points of converged infrastructure.
- Deduplicate fewer times and gain more control over data growth.
- Use the same software across backup clients, virtual and inline appliances and scale-out storage systems.
From the HP StoreOnce news release [PDF]:
“HP StoreOnce provides a significantly more efficient method for managing and protecting data while maximizing utilization of storage capacity than competing data deduplication offerings.”
In a recent InformationWeek survey, only 24% of respondents reported using data deduplication technology, while 32% reported evaluating the option and 10% gave a definite no to deduplication. Which group do you belong to? Do HP’s announcements have any effect on your current practices or evaluation of deduplication?
If you are planning to adopt deduplication with HP, prepare for a 20% improvement in inline deduplication performance and the ability to spend about 95% less on storage capacity. The software is available in all HP StorageWorks D2D backup systems, with offerings suitable for everything from SMBs to midsize data centers. The D2D4312, for midsize data centers, boasts scalable capacity up to 48 terabytes and “enables clients to consolidate backup of multiple servers in a single process.” Compared to tape backup and offsite archiving, HP’s D2D4312 can save you up to $2 million for a multisite deployment. Not only that, but no matter your storage interface, HP promises simple deployment into your existing backup process.
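For readers new to the concept, the core mechanism behind any deduplication product is the same textbook idea (this sketch is generic and assumed, not HP’s StoreOnce algorithm): split data into chunks, fingerprint each chunk with a hash, and store each unique chunk only once.

```python
import hashlib

# Generic illustration of chunk-level deduplication (not StoreOnce):
# identical chunks are detected by hash and stored a single time.

class DedupStore:
    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}            # hash -> chunk bytes, stored once

    def write(self, data: bytes):
        """Store data; return the list of chunk hashes (the 'recipe')
        needed to reassemble it later."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)   # skip duplicates
            recipe.append(digest)
        return recipe

    def read(self, recipe):
        """Reassemble the original data from its recipe."""
        return b"".join(self.chunks[h] for h in recipe)
```

Writing the same bytes repeatedly stores one chunk plus a short recipe, which is where the capacity savings come from; real products layer variable-size chunking and collision handling on top of this basic scheme.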
What’s your take on data deduplication? Is HP StoreOnce the revolution it promises to be?
Salesforce may be one of the darlings of the cloud computing world, but it has a lot of baggage to get over if its latest venture, Chatter, is to succeed. The new communication platform, which rolled out of beta and into general availability today, mixes a bit of Facebook, a dash of Twitter, and a trusted enterprise platform in the hopes of boosting enterprise collaboration and helping Salesforce gain enterprise traction beyond its wildly devoted sales and customer service customers.
The service has already been in beta with dozens of customers trying it out, and the early returns are good: A reported 22% uptick in productivity and over 55 ChatterExchange apps available for out-of-the-box integration.
We were actually leaked some demo screenshots straight from the Chatter team itself as it brainstormed with similar software groups, and the stream is reproduced, unaltered, below:
Team USA isn’t the only one being robbed at the World Cup: The global game is bringing spam, viruses and phishing attacks to offices around the globe, with network congestion serving as icing on the cake, particularly during game time.
Major events have long been fodder for such attacks and troubles, but Internet hot-spot watchers have been surprised by the magnitude of the World Cup’s impact on office life. According to a statement from Cisco’s Spencer Parker, a product manager:
“… employees are actively taking an interest in the World Cup during working hours. Employees could be watching live streaming of these football matches on their PCs, checking the score during the matches, or even listening to the games. As a result we have seen this significant uplift in Web traffic at the precise times that the matches are taking place.”
And how significant is that uptick? Cisco ScanSafe customers have seen web usage jump 27% globally during World Cup games. If SalesForce becomes SalesStop this Wednesday at 10:00 am ET, now you know why.
But productivity and network traffic aren’t the only victims: Cisco also estimated 3 billion World Cup spam messages went out on June 11th, and some of them held nastier payloads than others. Websense’s Security Labs Blog deconstructed a typical nastygram spam, promising the latest World Cup scandal:
Tragically, rather than the tantalizing scandal – compromising shots of the ever popular WAGs, perhaps? – the attachment only includes URL trickery leading to a compromised webpage and a viral payload.
Game on, but make sure your staff knows the risks.
I saw this chat regarding storage, the cloud, and data protection and it reminded me of how nauseated I get when I hear about all the great new ways that the cloud is going to save those of us in IT from the evils of the world.
Be it in the cloud, in your data center, or in cousin Willy’s basement, the same data protection principles apply to storage systems. The reality is:
- No cloud vendor can offer risk-free storage services.
- No SAS 70 audit report is going to tell the whole story.
- No contract or SLA is going to keep your business out of hot water, or out of the headlines, when the confidentiality, integrity, or availability of your data is compromised.
- A marketing spin can be put on anything.
If a storage device is on the network and a human being is somehow involved in its setup, ongoing management, and maintenance, you can bet your bottom dollar that there’s going to be risk. Cloud or not, do yourself and your business a favor and understand what you’re getting into before you jump on the bandwagon.
For further reading on the risks and realities of cloud backup, check this out:
Kevin Beaver is an independent information security consultant, keynote speaker, and expert witness with Principle Logic, LLC and a contributor to the IT Watch Blog.
It’s another big week here in Boston with the approach of the USENIX Annual Technical Conference 2010, and all the buzz around Microsoft Research’s upcoming announcements includes the word “cloud.” So what exactly do they have up their sleeves?
With the growing number of players in the cloud market, including Microsoft’s own Windows Azure Platform, Microsoft Research has come to the rescue with a framework to help customers select a cloud provider [PDF]. Its benchmarking tools help predict an application’s cost and performance once deployed, allowing for more detailed comparisons among cloud services.
Stout

The purpose of this technology is to close the gap between how applications perform under light versus heavy workloads by batching requests more intelligently. Rather than immediately sending all requests as they occur, Stout [PDF] will “[treat] scalable key-value storage as a shared resource requiring congestion control.” Under lighter workloads, requests are sent immediately, while under heavier workloads Stout auto-batches requests to prevent queuing delays.
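The batching idea can be illustrated with a toy sketch (my own simplification, not Stout’s actual algorithm): estimate the request arrival rate, send immediately while it stays low, and coalesce requests into batches once it climbs.

```python
import time

# Toy illustration of adaptive batching as congestion control, in the
# spirit of Stout but NOT its actual implementation: light load ships
# each request immediately; heavy load waits for a full batch so one
# round trip carries many requests.

class AdaptiveBatcher:
    def __init__(self, send, heavy_rate=100.0, batch_size=8, start=0.0):
        self.send = send              # callback taking a list of requests
        self.heavy_rate = heavy_rate  # requests/sec considered "heavy"
        self.batch_size = batch_size
        self.pending = []
        self.last = start
        self.rate = 0.0               # smoothed arrival-rate estimate

    def submit(self, request, now=None):
        if now is None:
            now = time.monotonic()
        dt = max(now - self.last, 1e-9)
        self.rate = 0.9 * self.rate + 0.1 / dt  # exponential smoothing
        self.last = now
        self.pending.append(request)
        # Light load: flush right away. Heavy load: wait for a full batch.
        if self.rate < self.heavy_rate or len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.send(self.pending)
            self.pending = []
```

Under this scheme slow traffic sees no added latency, while bursts are absorbed into batches instead of piling up as queuing delay at the storage tier.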
Utility Coprocessor, or Ucop
The purpose of this middleware is to deliver “dramatic speedups of parallelizable, CPU-bound desktop applications using utility computing clusters in the cloud.” They’ve created a Linux-based prototype that allows a single cluster to serve everyone, requiring only that the users and the cluster run the same major kernel version. Learn more about how they manipulated client configurations in their white paper [PDF].
Performance Isolation for Cloud Datacenter Networks

This framework [PDF] is the answer to the lack of control over how networks are shared, which opens cloud applications to interference and unpredictable performance: an “edge-based solution that achieves max-min fairness across tenant VMs by sending traffic through congestion-controlled, hypervisor-to-hypervisor tunnels.”
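Max-min fairness itself, the objective those tunnels enforce, is easy to illustrate: satisfy the smallest demands first, then split whatever capacity remains equally among the tenants that still want more. A small sketch of that computation (the classic progressive-filling calculation, not the paper’s mechanism):

```python
# Classic max-min fair allocation of one link's capacity among
# tenants; illustrative of the fairness objective only, not the
# paper's hypervisor-tunnel mechanism.

def max_min_allocation(capacity, demands):
    """Return each tenant's max-min fair share of a link.
    demands: dict mapping tenant -> requested bandwidth."""
    alloc = {}
    remaining = float(capacity)
    # Handle the smallest demands first; recompute the equal share of
    # what's left at every step.
    pending = sorted(demands, key=demands.get)
    while pending:
        share = remaining / len(pending)
        t = pending.pop(0)
        if demands[t] <= share:
            alloc[t] = float(demands[t])   # fully satisfied tenant
        else:
            alloc[t] = share               # capped at the fair share
        remaining -= alloc[t]
    return alloc
```

With 10 units of capacity and demands of 2, 4, and 10, the allocation comes out to 2, 4, and 4: no tenant can be given more without taking capacity from a tenant that already has less.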
Check out the USENIX ATC 2010 website for further information on the conference, speakers and announcements to come. Are you planning to attend the conference? Give us updates and opinions in the comments section or email me directly.
If you thought your data was getting crowded in your storage center, you haven’t seen the cloud storage market lately. Verizon dealt itself into the game on Tuesday with Verizon Cloud Storage, its Storage-as-a-Service offering. Distinguishing itself from consumer-focused services such as Amazon’s S3, Verizon promises to provide enterprises with superior value and access speed by combining its global data network with storage capacity. According to IT News, Verizon’s first nodes are set to go live in the U.S. in October; until then, it will jumpstart the service in July using Nirvanix’s Storage Delivery Network. The service is pay-as-you-go, starting at $0.25 per gigabyte per month, with volume discounts.
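At pay-as-you-go rates, the bill is simple arithmetic. A back-of-the-envelope sketch at the quoted $0.25 per gigabyte per month follows; the discount tiers are made up for illustration, since Verizon’s actual volume pricing isn’t published here:

```python
# Illustrative pay-as-you-go storage billing at the quoted
# $0.25/GB/month list rate. The volume-discount tiers are
# hypothetical, not Verizon's published pricing.

def monthly_cost(gigabytes, base_rate=0.25,
                 tiers=((1000, 0.9), (10000, 0.8))):
    """Cost in dollars, applying a flat discounted rate once usage
    crosses a tier threshold (assumed pricing model)."""
    rate = base_rate
    for threshold, multiplier in tiers:
        if gigabytes >= threshold:
            rate = base_rate * multiplier
    return gigabytes * rate

# 500 GB at the list rate:
# monthly_cost(500) -> 125.0
```

The appeal over a long-term hosting contract is exactly this shape: cost tracks usage month to month instead of being fixed up front.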
Despite concerns enterprises have over security, reliability and data recovery and retrieval, big names like EMC and AT&T entered the arena last year along with Amazon’s S3 with similar offerings. And though more carriers may join the game, Verizon is highlighting the benefits of being on its global IP network: data access with fewer network jumps along with Verizon’s security infrastructure.
Customers will be able to access data using:
- Application programming interfaces
- Third-party applications and backup agents
- Standard CIFS
- Standard NFS
- Browser-based portals for data management between Nirvanix and Verizon’s data centers
The service is paired with Verizon’s new consulting offering, Verizon Data Retention Services, which helps guide businesses through storage policy development to best suit their needs and operations.
What are your thoughts on the cloud storage bandwagon? What are your concerns surrounding switching your storage to the cloud with a new service like Verizon’s SaaS?
If you’re like me, you’ve yelled, cussed, and screamed over frustrations brought on by backup software. Let me share my experiences.
First off, I remember, back in the late 80s/early 90s, using a Colorado drive (remember those?!) with DOS-based software that just worked. Besides the occasional hardware burp, I knew I could set my 250MB system to back up overnight, and by dawn it’d likely be finished.
Flash forward a few years to the mid-to-late 1990s and what did we have? More complexity. Sure, products like Arcserve and BackupExec had lots of features, but they also had tons of complexity. I just needed to run backups, but half the time I couldn’t figure out how with all the bells and whistles getting in the way. Perhaps I should’ve taken a class. On top of that, the software was extremely unstable and unreliable. Growing pains in that era, I suppose. I miss the simplicity of networks, OSs, and so on from those days, but I certainly don’t miss all the headaches backup software brought on.
Flash forward to today: I don’t hear about such problems. Have the vendors finally gotten it? I first witnessed this shift 6 or 8 years ago with a cool new disk-imaging software company called Acronis and its product TrueImage. Acronis not only brought innovation to the table (imaging a drive while the OS was running), but also helped me realize that backups can indeed be simple again. I thought Acronis TrueImage picked up where Ghost and Drive Image left off, and they’ve continually innovated to the point where backups just work. No fuss, no muss, just good old reliable backups.
According to the recent SearchStorage.com quality awards, Acronis is holding its own against the big guys now, which is pretty impressive. It shows how a small company willing to innovate can take on the establishment. I love free enterprise! Don’t take this the wrong way, I’m no spokesperson. However, when I find software that works well I’m going to sing its praises and tell others about it.
Speaking of innovation, check out this graphical depiction of the history of computer storage. Very cool stuff that makes you realize how far we’ve come.
Kevin Beaver is an independent information security consultant, keynote speaker, and expert witness with Principle Logic, LLC and a contributor to the IT Watch Blog.
In light of this month’s theme, Storage in 2010, we’ve compiled some books on data storage and more energy-efficient IT operations. Have you read one of these titles or did we leave a great book out? Let me know at Melanie@ITKnowledgeExchange.com and we’ll add it to our list!
- The Green and Virtual Data Center by Greg Schulz: (CRC/Auerbach Publications, 2009) Rather than focusing on the political side of the green movement, Schulz highlights the realistic and technologically possible ways to upgrade to or create a more sustainable data center.
- Grow a Greener Data Center by Douglas Alger: (Cisco Press, 2009) “A guide to building and operating energy-efficient, ecologically sensitive IT and facilities infrastructure.”
- Information Storage and Management: Storing, Managing and Protecting Digital Information by EMC: (Wiley, 2009) A wide angle view of “concepts, principles, and deployment considerations – rather than product specifics.”
- Storage, Data and Information Systems by John Wilkes, Christopher Hoover, Beth Keer and Pankaj Mehra: (Hewlett-Packard Laboratories, 2008) Described as a “refreshingly straightforward introduction to an important area,” check this title out if you want the facts in an easy-to-understand format.
- Storage Area Networks for Dummies, 2nd Edition by Christopher Poelker and Alex Nikitin: (2009) From the basic “What are SANs?” SANs for Dummies addresses all of your questions to help you navigate your way through the subject.
- Storage Virtualization: Technologies for Simplifying Data Storage and Management by Tom Clark: (Addison-Wesley Professional, 2005) An objective and technical look at one of the saviors of storage managers.
- Storage Networks Explained, 2nd Edition by Ulf Troppens, Rainer Erkens, Wolfgang Mueller-Friedt, Rainer Wolafka and Nis Haustein: (Wiley, 2009) For the ins and outs of SANs – the uses, forms and benefits – check out this book. For anyone in IT systems planning or operating.
What books or guides have helped you figure out your storage needs and best practices?