In this line of work, I can’t help but encounter an awful lot of IT buzzwords. I even write a newsletter dedicated to picking the latest and greatest of ’em. Thankfully, one of the guiding principles of WhatIs.com has always been to decode the tech jargon and spin, and to explain what something is, who invented it, how it works and why it might be important. Some concepts, like Web 2.0 or SOA, can be a bit tricky to tackle, nearly defying definition. That brings me to the subject of this post: cloud computing. The meme isn’t coming from just one company or tech visionary trying to gain momentum, either.
Google’s Eric Schmidt is talking about cloud computing, in the context of search and advertising networks — and in the wake of a (predicted) partnership with Apple as the good folks at Cupertino revamp a somewhat tired .Mac offering before the iPhone debuts, it’s not hard to see why.
George Gilder (one of those tech visionaries, without a doubt) thinks the desktop is dead and extols the coming age of the Internet cloud in the pages of Wired, certainly no stranger to cyber-utopian manifestos. In “Information Factories,” he makes quite a case for the upcoming “Petabyte Age”:
We’re all petaphiles now, plugged into a world of petabytes, petaops, petaflops. Mouthing the prefix peta (signifying numbers of the magnitude 10 to the 15th power, a million billion) and the Latin verb petere (to search), we are doubly petacentric in our peregrinations through the hypertrophic network cloud.
Dell has established a cloud computing page for data centers, capitalizing on the trend.
Amazon.com takes it one step further, offering up the Amazon Elastic Compute Cloud, or Amazon EC2, “a Web-based service that allows business subscribers to run application programs in the Amazon.com computing environment. The EC2 can serve as a practically unlimited set of virtual machines.” Jason Kolb thinks that this is an important idea to track, one that will “completely change the face of hosting and how we look at servers.”
I can’t help but think cloud computing is simply utility computing repackaged in a more attractive (if gauzy) name. Tech evangelists, marketers and CEOs may prefer to talk about computing resources “in the cloud” rather than as humdrum “utilities.”
For reference’s sake, WhatIs.com defines utility computing as:
…a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed, and charges them for specific usage rather than a flat rate. Like other types of on-demand computing (such as grid computing), the utility model seeks to maximize the efficient use of resources and/or minimize associated costs.
The word utility is used to make an analogy to other services, such as electrical power, that seek to meet fluctuating customer needs, and charge for the resources based on usage rather than on a flat-rate basis. This approach, sometimes known as pay-per-use or metered services is becoming increasingly common in enterprise computing and is sometimes used for the consumer market as well, for Internet service, Web site access, file sharing, and other applications.
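The pay-per-use idea in that definition is easy to make concrete. Here’s a minimal sketch of metered billing versus a flat monthly fee — all of the rates and usage figures below are made up purely for illustration:

```python
# Minimal sketch of metered (pay-per-use) billing vs. a flat rate.
# All rates and usage numbers are hypothetical, for illustration only.

def metered_charge(usage, rates):
    """Charge for each resource based on actual usage, not a flat fee."""
    return sum(usage[resource] * rates[resource] for resource in usage)

# Hypothetical per-unit rates: CPU-hours, GB stored, GB transferred
rates = {"cpu_hours": 0.10, "storage_gb": 0.15, "transfer_gb": 0.20}

# One month of actual usage for a small customer
usage = {"cpu_hours": 120, "storage_gb": 50, "transfer_gb": 30}

flat_rate = 50.00  # what a traditional flat monthly fee might look like

bill = metered_charge(usage, rates)
print(f"Metered bill: ${bill:.2f}")  # 120*0.10 + 50*0.15 + 30*0.20 = $25.50
print(f"Flat rate:    ${flat_rate:.2f}")
```

In a light month the metered customer pays half the flat rate; in a heavy month the flat-rate customer wins. That fluctuation is exactly the electricity analogy the definition is reaching for.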
What do you think? Is on-demand computing from the likes of Google, Apple, Dell and Amazon the future of the Web? Or is it an outgrowth of the utility computing that Sun and HP have been experimenting with for some time? (The Wikipedians seem to agree, at least today, that utility computing = cloud computing.) Will small businesses and organizations eventually never buy or see the servers they use, or need to worry about supporting that expensive hardware? Is “the cloud” just hype — or do we need to write up a definition?