The Troposphere

Meteorology for the cloud computing world


November 12, 2008  10:13 PM

Cloud tidbits from the 451 Group Client Conference



Posted by: JoMaitland
cloud computing

Duncan Johnston-Watt, the CEO of stealthy upstart Cloudsoft Corp., gave the first presentation on his new company at the 451 Group’s Client Conference in Boston today.

“I like to say you’re not doing more with less; you’re doing nothing,” he said, speaking of the shift to cloud computing. “You will own less and less of the infrastructure and care only about what business services you need.”

Cloudsoft is building a product that uses patented mediated-routing IP from Enigmatec Corp., Johnston-Watt’s previous company. Called MarketMaker, the product will provide mediation as a service for service providers looking to get into the online auction, betting, brokerage and bookings businesses. “You want to be able to move mediation from one cloud to another to fulfill orders … to move order books around to provide a guaranteed performance level and predictable behavior,” he said.

The name of the company, which implies it intends to be the Microsoft of the cloud, is no accident. “What is the Microsoft Office of the cloud … what are the essential business services you need from the cloud?” Johnston-Watt said. That’s where he believes cloud computing will get interesting.

Meanwhile, keeping it real, William Fellows, principal analyst and co-founder of the 451 Group, said cultural and organizational issues concerning power, trust, control and ownership are the biggest barriers to the adoption of cloud services by enterprise IT. He also believes that the contractual language is not there yet for service-level agreements that meet compliance regulations. But Fellows insisted that IT should not dismiss the trend. “Just understand what it’s good for.”

November 10, 2008  5:12 PM

EMC takes wraps off Atmos cloud plans



Posted by: JoMaitland
cloud computing, Cloud storage, EMC Atmos

EMC Corp. says it has a handful of Web 2.0 service providers using its new Atmos cloud-optimized storage (COS) product, but none of them were ready to discuss it today. So for now, Atmos is an interesting technology announcement waiting for a reality check from customers.

And while EMC is focused on selling this to service providers initially, it does believe there’s an enterprise play down the line for media and entertainment, life sciences, and oil and gas companies interested in building private clouds. Somewhat confusingly, EMC also hinted at plans to eventually become a service provider itself, which may cause some channel tension; for now, though, Atmos is a product only.

Here’s a taste of what EMC claims it will do. Atmos is a globally distributed file system (code-named Maui) that runs on purpose-built EMC hardware (code-named Hulk).

The software automatically distributes data, placing it on nodes across a network according to user-defined policies. These policies dictate what level of replication, versioning, compression, deduplication and disk-drive spin-down a particular piece of data should have as it resides in the cloud. Depending on how important the information is, there might be one, five or 10 copies of it around the world, for example.
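
EMC hasn’t published the policy format, but conceptually each policy pairs a data classification with replication and lifecycle settings. Here’s a minimal sketch of the idea; every field name below is invented for illustration, not actual Atmos syntax:

```python
# Hypothetical sketch of policy-driven placement, loosely modeled on
# EMC's description of Atmos. Every field name here is invented.

POLICIES = {
    # "importance" tier -> replication and lifecycle settings
    "critical": {"replicas": 10, "versioning": True,  "compress": False, "spin_down": False},
    "standard": {"replicas": 5,  "versioning": True,  "compress": True,  "spin_down": False},
    "archive":  {"replicas": 1,  "versioning": False, "compress": True,  "spin_down": True},
}

def place(object_id, tier, nodes):
    """Pick target nodes for an object according to its policy tier."""
    policy = POLICIES[tier]
    # Naive placement: spread replicas across the first N distinct nodes.
    return {"object": object_id, "targets": nodes[: policy["replicas"]], "policy": policy}

print(place("scan-001.tif", "standard",
            ["bos-1", "lon-1", "tok-1", "nyc-2", "sfo-3", "fra-1"]))
```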

The closest thing out there today that resembles Atmos is Cleversafe.org.

Atmos also provides Web service application programming interfaces, including Representational State Transfer (REST) and Simple Object Access Protocol (SOAP), as well as Common Internet File System and Network File System support for integration with file services. It offers a unified namespace, browser-based admin tools and multitenant support, so multiple applications can be served from the same infrastructure without commingling data. And Simple Network Management Protocol support provides a plugin to existing reporting tools on top of the reports and alerts Atmos already offers, according to EMC.
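
To give a flavor of what REST access to an object store like this looks like, here’s a hypothetical sketch; the host, path and headers are invented for illustration and aren’t EMC’s actual wire protocol:

```python
# Hypothetical REST interaction with an Atmos-style object store. The
# host, path and headers are invented for illustration; EMC's actual
# wire protocol may differ.
import http.client

conn = http.client.HTTPConnection("atmos.example.com")
with open("report.pdf", "rb") as f:
    conn.request(
        "POST", "/rest/objects",
        body=f.read(),
        headers={
            "Content-Type": "application/pdf",
            "x-policy-tier": "standard",  # invented header naming the placement policy
        },
    )
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # ID/URL of the newly stored object
```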

The software ships on purpose-built hardware available in 120TB, 240TB or 360TB configurations. [Editor’s note: The National Center for Atmospheric Research has an archive already several petabytes in size; it would need at least three of these boxes per petabyte just to hold its existing data. In other words, 360TB is large, but not that large by today’s standards.]

There’s also a fit with VMware as Atmos can run on a VMware image, although Mike Feinberg, the senior VP of the cloud infrastructure group at EMC, says users don’t need VMware to use Atmos.

EMC did not announce pricing details today either, except to say that it’ll be competitive with existing petabyte-scale JBOD-type offerings.


November 5, 2008  10:01 PM

Cloud computing allowed you to read an 1851 New York Times article online



Posted by: Mark Fontecchio
Amazon EC2, Amazon S3

Nicholas Carr recounts the story of the New York Times trying to get its archives all online, and using cloud computing to do it.

The short version: The NYT scanned in all of its articles, resulting in four terabytes’ worth of TIFF files. They wanted to convert them all to PDFs but weren’t capable of doing it in-house. So NYT software programmer Derek Gottfrid sent them all to Amazon’s Simple Storage Service (S3), wrote some code to process them on Amazon’s Elastic Compute Cloud (EC2), and voila, a day later it was done.
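
For flavor, here’s a rough sketch of what such a job looks like. This isn’t the Times’ actual code; the bucket names and the use of ImageMagick are assumptions, but the shape is the same: list the scans, convert each one, upload the result.

```python
# A minimal sketch of the kind of job described above, not the Times'
# actual code: bucket names and the use of ImageMagick are assumptions.
import subprocess
import boto3

s3 = boto3.client("s3")
SRC, DST = "nyt-tiff-scans", "nyt-pdf-output"   # hypothetical bucket names

for page in s3.get_paginator("list_objects_v2").paginate(Bucket=SRC):
    for obj in page.get("Contents", []):
        key = obj["Key"]                          # e.g. "1851/09/18/page-1.tif"
        s3.download_file(SRC, key, "page.tif")
        # ImageMagick performs the actual TIFF -> PDF conversion.
        subprocess.run(["convert", "page.tif", "page.pdf"], check=True)
        s3.upload_file("page.pdf", DST, key.rsplit(".", 1)[0] + ".pdf")

# Run one such loop on each of 100 EC2 instances and the 4TB job finishes
# in about a day: 100 machines x 24 hours x $0.10/hour = the $240 bill.
```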

The total cost for the computing job? Gottfrid told me that the entire EC2 bill came to $240. (That’s 10 cents per computer-hour times 100 computers times 24 hours; there were no bandwidth charges since all the data transfers took place within Amazon’s system – from S3 to EC2 and back.)

If it wasn’t for the cloud, Gottfrid told me, the Times may well have abandoned the effort. Doing the conversion would have either taken a whole lot of time or a whole lot of money, and it would have been a big pain in the ass. With the cloud, though, it was fast, easy, and cheap, and it only required a single employee to pull it off. “The self-service nature of EC2 is incredibly powerful,” says Gottfrid. “It is often taken for granted but it is a real democratizing force in lowering the barriers.”

Which brings Carr to his main point: Cloud computing will be important for what it makes possible that can’t be done today. Up to now, most people have focused on how to transfer their current IT infrastructure into the cloud. But cloud computing will make its mark by opening up avenues that were previously closed, or not even built yet.

But as one commenter stated, moving existing infrastructure is naturally going to be the first focus, as enterprises are worried about their current infrastructure, not necessarily the new tasks the cloud could tackle. It will take a more long-term visionary within the company (such as the chief technology officer) to figure out which new trenches to dig.


November 3, 2008  2:54 PM

Rackable Systems CloudRack designed for cloud computing



Posted by: Bridget Botelho
cloud computing, Cloud storage, CloudRack, DataCenter, Rackable Systems

Fremont, Calif.-based Rackable Systems Inc. is catering to cloud computing environments with CloudRack, a new server rack designed specifically for them, the company announced October 30.

This new product from Rackable is one of many we are seeing from vendors trying to design new equipment, or repurpose existing equipment, for cloud computing environments, which are characterized by a large number of server nodes in scalable data centers providing Software as a Service (SaaS) to users.

According to Saeed Atashie, director of server products at Rackable, CloudRack was created with the density and power efficiency that cloud environments demand.

CloudRack is a 44U cabinet that supports up to 88 servers, 176 processors from either AMD or Intel, 704 cores and 352TB of storage, with up to eight 3.5-inch drives per board (four drives per CPU). It is designed to be power efficient and easy to service, according to Rackable.
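
Those spec-sheet numbers hang together arithmetically; here’s a quick back-of-the-envelope check (the 500GB per-drive figure is inferred from the totals, not quoted by Rackable):

```python
# Back-of-the-envelope check on the CloudRack spec sheet. The 500GB
# per-drive figure is inferred from the totals, not quoted by Rackable.
servers = 88
processors = servers * 2      # 176 (two sockets per board)
cores = processors * 4        # 704, i.e. quad-core parts
drives = servers * 8          # eight 3.5" drives per board
storage_tb = drives * 0.5     # implies ~500GB drives -> 352TB per cabinet
print(processors, cores, drives, storage_tb)  # 176 704 704 352.0
```
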
“CloudRack is designed from the ground up with cloud customers’ needs and buying behavior in mind,” Atashie said. “In comparison, a number of our competitors design for a general-purpose (one-size-fits-all) server market and then try to position these products in the cloud computing market.”

Rackable also announced servers for HPC and cloud environments back in June: the XE2208, with twice the density of existing Rackable Systems servers. Rackable is focusing products on the cloud computing market because it is “the latest industry mega-trend,” Atashie said. Other companies focusing products on the cloud include IBM, VMware, HP and Intel.

Atashie said Rackable already has customers lined up for CloudRack but would not disclose any names. In general, CloudRack will appeal to companies using cloud computing or high-performance computing, the company reported.

The Rackable Systems CloudRack CR1000 model can be built to order. More information about specific configurations, pricing or Rackable Systems’ build-to-order model is available on Rackable Systems’ website.


October 29, 2008  6:12 PM

Cloud storage: What a difference a decade makes



Posted by: Alex Barrett
cloud computing, Cloud storage, Nirvanix, Rackspace, The Planet

When it comes to all the varieties of cloud services out there, cloud storage gets a lot of love from hosting providers such as Rackspace and The Planet, which have both made cloud-related storage moves of late.

But the skeptic in me wonders why hosting providers think cloud storage will succeed when the storage service providers (SSPs) of the late 1990s were such a blatant failure. I’m talking about companies like the dearly departed StorageNetworks, which rose to IPO stardom in 2000, only to shutter its doors two years later.

For one thing, said Rob Walters, The Planet’s general manager for data protection and storage, there’s a big difference between the storage used by the SSPs of yore and today’s cloud providers. “The old SSPs used hardware like the EMC Symmetrix, the economics of which just didn’t work out,” he said. Cloud storage providers, on the other hand, rely heavily on commercial off-the-shelf (COTS) hardware, replicated ad nauseam, to get decent reliability and performance.

To that end, The Planet struck a deal last month with Nirvanix, a cloud storage provider that has written its own distributed “Storage Delivery Network” (SDN) and a cloud-based virtual storage gateway, Nirvanix CloudNAS, which runs on commodity Dell hardware. As part of the deal, The Planet’s customers can tap into Nirvanix storage resources, and The Planet will act as one of the replicated nodes in Nirvanix’s geographically distributed SDN.

People are also looking to store data today that has different performance needs than what the SSPs proposed to house, said Urvish Vashi, general manager of The Planet’s dedicated hosting business: namely backup and archive data, plus Web 2.0 data like photographs and streaming video files. With these data types, “I/O to the disk isn’t the limiting factor, it’s I/O to the network.” In other words, it doesn’t matter if you store this data on a dog of a slow drive, because access to it is limited by an even slower network.
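
Vashi’s point is easy to put numbers on. A quick, illustrative comparison (both rates below are assumptions, not measurements) shows even a slow drive outrunning a modest WAN link by a wide margin:

```python
# Rough illustration of "I/O to the network is the limiting factor".
# Both rates below are illustrative assumptions, not measured figures.
file_gb = 1.0
disk_mb_per_s = 50.0     # even a slow commodity drive
wan_mbit_per_s = 10.0    # a modest WAN uplink

disk_seconds = file_gb * 1024 / disk_mb_per_s
wan_seconds = file_gb * 1024 * 8 / wan_mbit_per_s
print(f"disk read: {disk_seconds:.0f}s, network transfer: {wan_seconds:.0f}s")
# disk read: 20s, network transfer: 819s -- the drive sits idle either way
```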

And then there’s the fact that things are just different now. Whereas 10 years ago public dialogue centered on security and privacy, people nowadays publish and expose every detail of their lives on blogs or sites like MySpace and Facebook. Taking that idea one step further, the idea of hosting data on shared infrastructure just doesn’t faze companies the way it used to, Vashi said. “It’s less of an unusual choice than it used to be.”

I’m still skeptical, but willing to suspend disbelief.


October 27, 2008  10:28 PM

Microsoft launches Azure for cloud computing



Posted by: Leah Rosin
cloud computing, Microsoft Azure

Following on the heels of an IDC report predicting that cloud computing will capture IT spending growth over the next five years, another major player came to the cloud game on Monday. During a keynote speech at the Microsoft Professional Developers Conference 2008 (PDC2008), Ray Ozzie, Microsoft Corp.’s chief software architect, announced Windows Azure, the cloud-based service foundation underlying its Azure Services Platform.

The Azure platform combines cloud-based developer capabilities with storage, computational and networking infrastructure services hosted in Microsoft’s global data center network, letting developers deploy applications in the cloud or on-premises across a range of business and consumer scenarios. A limited community technology preview (CTP) of the Azure Services Platform was initially made available to PDC2008 attendees.

“Today marks a turning point for Microsoft and the development community,” Ozzie said. “We have introduced a game-changing set of technologies that will bring new opportunities to Web developers and business developers alike. The Azure Services Platform, built from the ground up to be consistent with Microsoft’s commitment to openness and interoperability, promises to transform the way businesses operate and how consumers access their information and experience the Web. Most important, it gives our customers the power of choice to deploy applications in cloud-based Internet services or through on-premises servers, or to combine them in any way that makes the most sense for the needs of their business.”

The key components of Azure are summarized below:

• Windows Azure for service hosting and management, low-level scalable storage, computation and networking

• Microsoft SQL Services for database services and reporting

• Microsoft .NET Services that are service-based implementations of familiar .NET Framework concepts such as workflow and access control

• Live Services for a consistent way for users to store, share and synchronize documents, photos, files and information across their PCs, phones, PC applications and websites

• Microsoft SharePoint Services and Microsoft Dynamics CRM Services for business content, collaboration and rapid solution development in the cloud

Nicholas Carr shared some of the nitty-gritty details:

During its preview stage, Windows Azure will be available for free to developers. Once the platform launches commercially – and, according to Ozzie, Microsoft will be “intentionally conservative” in rolling out the full platform – pricing will be based on a user’s actual consumption of CPU time (per hour), bandwidth (per gigabyte), storage (per gigabyte) and transactions. The actual fee structure has not been released, though Ozzie says it will be “competitive with the marketplace” and will vary based on different available service levels.

Now, it’s not horribly shocking that Microsoft has joined the movement to the cloud. But it’s a bit amusing because a lot of the cloud effort has been generated by those anti-Windows programmers, looking to share applications that directly compete with the Microsoft product suite. As I read through David Chappell’s Azure white paper, I couldn’t help but chuckle when I read this: “The Windows Azure compute service is based, of course, on Windows.”


October 22, 2008  9:18 PM

Rackspace: From managed hosting to cloud hosting



Posted by: Alex Barrett
cloud computing, Mosso, Rackspace, Storage, Virtualization, VMware, VPS, Xen

In an effort to wrap my mind around this cloud computing stuff, I watched the webcast of Rackspace’s cloud computing launch today, where the company laid out its plans to move from simple managed hosting provider to cloud provider extraordinaire, taking on Amazon Elastic Compute Cloud, or EC2, and Simple Storage Service, or S3, in the process.

Rackspace’s plan centers on acquisition, partnership and expanding its existing Mosso Web hosting product into three broad offerings: Cloud Sites website hosting, Cloud Files storage service, and Cloud Servers virtual private servers.

On the acquisition side, Rackspace has acquired Jungle Disk, a cloud-based desktop storage and backup provider that has thus far relied on Amazon’s S3. It also acquired Slicehost, a provider of Xen-based virtual private servers (VPSs) that claims 11,000 customers and 15,000 virtual servers.

As far as new Mosso offerings go, the new Cloud Files service will come in at $0.15 per GB of replicated data, or, if the data is distributed across a content delivery network (CDN), at $0.22 per GB. CDN capabilities come by way of a partnership with Limelight Networks Inc.

Also as part of Cloud Files, Rackspace will partner with Sonian Networks to provide cloud-based email archiving starting at $3/mailbox.

Coming soon, Cloud Servers is Mosso’s new name for Slicehost’s VPS offering. Under Slicehost, the service starts at $20/month for a virtual Xen server with 256MB of RAM, 10GB of storage and 100GB of bandwidth. “Slices” scale up to 15.5GB of RAM, 620GB of storage and 2,000GB of bandwidth for $800/month.
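
Using the published per-gigabyte rates, here’s a quick cost sketch; whether the rates are billed per gigabyte per month is my assumption, not something Rackspace specified:

```python
# Quick cost sketch from the published Cloud Files rates. Whether these
# are billed per gigabyte per month is an assumption, not a Rackspace quote.
REPLICATED = 0.15   # $/GB, replicated storage
CDN = 0.22          # $/GB, distributed via the Limelight CDN

def monthly_cost(gb, on_cdn=False):
    return gb * (CDN if on_cdn else REPLICATED)

print(f"${monthly_cost(500):.2f}")               # $75.00 for 500GB replicated
print(f"${monthly_cost(500, on_cdn=True):.2f}")  # $110.00 with CDN distribution
```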

When it comes to the Xen-based Slicehost — aka Cloud Servers — I should note that Mosso is a longtime VMware customer that has publicly pondered the viability of the relationship as it expands its services. It will be interesting to see whether this acquisition signals a break from VMware or whether it will continue to use VMware as the underpinning of its Cloud Sites offering. Rackspace, care to comment?

On another note, Slicehost is one of many hosting providers that use open source Xen as the basis of their cloud offerings. Presumably, it’s also the kind of company to which Simon Crosby, CTO of Citrix Systems Inc., referred when Citrix announced XenServer Cloud Edition and Citrix Cloud Center (C3) at VMworld 2008.

At the time, Crosby said that luring these hosting providers into Citrix support contracts was a huge priority. “Trivially, we looked around and found a couple hundred hosted IT infrastructure providers using open source Xen,” he said. “XenServer Cloud Edition is intended to win greenfield accounts but also to bring the open source Xen guys back home.” XenServer Cloud Edition boasts features like the ability to run Windows guests and commercial support.

One final thought: If any of you find this whole cloud computing thing a bit, ahem, nebulous, Lew Moorman, Rackspace’s chief strategy officer, made an interesting distinction between different types of cloud offerings. “Cloud apps,” Moorman said, are what we used to think of as Software as a Service (SaaS); “cloud hosting,” meanwhile, refers to pooled external compute resources. And of course, there’s cloud storage. Rackspace, it seems, will offer all three.


October 22, 2008  4:20 PM

Five common cloud computing fears



Posted by: Lauren Horwitz
cloud computing

This blog was written by Caroline Hunter, assistant editor of SearchEnterpriseLinux.com.

Don’t let fears about cloud computing prevent you from investigating the technology’s potential to reduce costs and free up space at your company. Below, we explore five common concerns enterprises express about adopting cloud computing.

1. Proprietary exploitation. As Richard Stallman has suggested, cloud computing might be largely a marketing ploy to get data center managers to spend money on unproven technology. At this point, there is some question about whether the cloud can provide concrete benefits commensurate with its price tag. Open standards for clouds are a reason not to reject the cloud just yet, because they let you choose how to use it. In charging for a narrow definition of cloud services, proprietary cloud vendors often eliminate that flexibility.

2. Lack of transparency. Just as clouds are hard to see into, your critical system resources stored within a cloud are difficult to locate. It’s better to have a clunky, secure set of resources than a cheaper, airy one that might float into someone else’s hands, and public/private access rights to clouds are far from established. Ensuring compliance with existing security standards is also difficult in the cloud: there are no established standards for providers to enforce, and an attacker can modify sensitive data within one.

3. Wrestling with cloud standards. If you’ve figured out solutions to the first two hurdles, the struggle is not over. Other companies and users may have different ideas about how you should administer the cloud, and may formalize those ideas into standards. Now you’re noncompliant. But noncompliant with what? Amazon.com CTO Werner Vogels doesn’t like the term cloud because it can mean so many different things. So before you decide to be a cloud for Halloween, check out the study by the browser company Opera: much of the Web has escaped standards compliance as well. So perhaps the problem is how standards are created rather than how they are enforced.

4. Integration into your existing environment. You dodged the marketing trap, secured your resources and committed to a cloud configuration that works for you. But now you’ll have to figure out how to integrate the cloud into your existing environment and how to manage it. Like cloud computing itself, this is mostly uncharted territory, but there are tools available to help you out, such as Kaavo, OpenQRM and RightScale.

5. Loss of service. This month, concern about service outages came true for a handful of Google users who lost access to their outsourced Gmail accounts for nearly 24 hours. Unfortunately, having your services run elsewhere can leave you helpless to get them back.

The choice to employ cloud computing brings a lot of uncertainty, and some reason to be wary about the safety of your business’s resources in the cloud. And because cloud computing is emerging and largely unproven, it’s still up to you to determine whether the sky is falling or it’s just a passing cloud.


October 21, 2008  8:52 PM

rPath offers Cloud Computing Adoption Model, webinar



Posted by: Bridget Botelho
cloud computing, rPath, Virtualization

Jake Sorofman, vice president of marketing for rPath, which provides technology for virtualizing software applications and managing cloud and virtualized environments, has offered a five-step strategy for easing into cloud computing, along with a webinar on the topic on October 23.

“Cloud Computing promises to reduce operating costs by increasing infrastructure utilization and reducing server sprawl; to reduce the cost of software consumption by allowing business lines to align cost with value received; and to dramatically improve business agility by compressing deployment cycles and time to value for application functionality,” Sorofman wrote. “It’s no surprise that cloud has attracted dozens of new entrants and forced incumbent vendors to articulate their own cloud strategy.”

For companies interested in moving their infrastructure into the cloud but unsure where to begin, rPath’s Cloud Computing Adoption Model offers a set of clear guidelines to help move the process along.

The model is a graduated, step-by-step approach to adopting cloud technologies that should help cut through the hype and lay out a clear game plan, moving incrementally toward the cloud without putting projects, budgets and careers at risk, according to Sorofman.

Loosely modeled after the Capability Maturity Model (CMM) from the Software Engineering Institute (SEI) at Carnegie Mellon University, the Cloud Computing Adoption Model proposes five steps:

• Level 1: Virtualization. The first level of cloud adoption employs hypervisor-based infrastructure and application virtualization technologies for seamless portability of applications and shared server infrastructure.

• Level 2: Cloud Experimentation. Virtualization is taken to a cloud model, either internally or externally, using Amazon Elastic Compute Cloud (EC2) for compute capacity and as the reference architecture.

• Level 3: Cloud Foundations. Governance, controls, procedures, policies, and best practices begin to form around the development and deployment of cloud applications. Initially, Level 3 efforts focus on internal, non-mission critical applications.

• Level 4: Cloud Advancement. Governance foundations allow organizations to scale up the volume of cloud applications through broad-based deployments in the cloud.

• Level 5: Cloud Actualization. Dynamic workload balancing across multiple utility clouds. Applications are distributed based on cloud capacity, cost, proximity to users and other criteria (a toy sketch of the idea follows below).
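
As a toy illustration of what Level 5 might mean in practice, here’s a sketch that scores candidate clouds on cost, spare capacity and proximity, then places a workload on the best one; the clouds, figures and weights are all invented:

```python
# Toy sketch of Level 5 "cloud actualization": score candidate clouds on
# cost, spare capacity and proximity, then place the workload on the best
# one. The clouds, figures and weights are all invented for illustration.
CLOUDS = [
    {"name": "cloud-a", "cost_per_hr": 0.10, "free_capacity": 0.60, "latency_ms": 40},
    {"name": "cloud-b", "cost_per_hr": 0.08, "free_capacity": 0.20, "latency_ms": 120},
    {"name": "cloud-c", "cost_per_hr": 0.12, "free_capacity": 0.80, "latency_ms": 25},
]

def score(cloud):
    # More spare capacity is better; higher cost and latency are worse.
    return (cloud["free_capacity"]
            - 2.0 * cloud["cost_per_hr"]
            - 0.001 * cloud["latency_ms"])

best = max(CLOUDS, key=score)
print("placing workload on", best["name"])  # cloud-c with these numbers
```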

“At the end of the day, architectural innovations like cloud have transformational potential for enterprises. But the reality is that transformation can’t happen overnight — and it certainly can’t happen without a plan,” Sorofman wrote. “While the Cloud Computing Adoption Model may not represent a panacea for enterprise cloud computing, it does provide a context for thinking strategically about the pace, pattern and sequence of investments and returns that will set organizations on a pragmatic path to cloud.”

rPath is also hosting a webinar on this topic, “The Pragmatist’s Guide to Cloud Computing: A 5-Step Framework for Achieving the Strategic Value of Cloud Computing While Delivering Real ROI Along the Way,” on Thursday, October 23, 2008, at 11:00 a.m. Pacific time/2:00 p.m. Eastern time.

The webinar will feature guest speakers Jeff Barr, senior Amazon Web Services evangelist; Frank Gillett, vice president and principal analyst, Forrester Research; Jeff Schneider, CEO of MomentumSI; and Billy Marshall, founder and chief strategy officer, rPath. All registrants will receive a complimentary copy of “The rPath 5-Step Framework” along with the Forrester Research Paper, “Future View: The New Tech Ecosystems of Cloud, Cloud Services and Cloud Computing.” To register for the webinar, visit rPath’s website.

