The Troposphere


March 16, 2009  3:21 PM

The Rackspace/Mosso PCI Debate

John Willis

A few weeks ago Rackspace made an announcement about hosting the first PCI compliant cloud solution. PCI is short for the Payment Card Industry Data Security Standard, a worldwide security standard for merchants who store, process or transmit credit card holder data. Rackspace’s Cloud Sites (formerly called Mosso) was used to enable the online merchant, The Spreadsheet Store, to move to the cloud without compromising the security of its online transactions (i.e., PCI compliance). What should have been a great success story for the Rackspace/Mosso team turned into a bit of a PR debacle.

Some of the cloud security experts and thought leaders took exception to the Rackspace/Mosso announcement, titled “Cloud Hosting is Secure for Take-off: Mosso Enables The Spreadsheet Store, an Online Merchant, to become PCI Compliant”, and called out Rackspace/Mosso on its bold claim of being the first cloud provider to offer PCI compliance. Craig Balding, an IT security practitioner and cloud expert, was the first blogger to point it out, in his article “What Does PCI Compliance in the Cloud Really Mean?”:

Mosso/Rackspace recently announced they have “PCI enabled” a Cloud Site’s customer that needed to accept online credit card payments in return for goods (i.e. a merchant).

However, the website hosted on Mosso’s Cloud, doesn’t actually receive, store, process, transmit any data that falls under the requirements of PCI.

Or to put it another way, its ‘compliance’ through not actually needing to be…

Craig goes on to say that Rackspace’s “PCI How To” document is just an “implementation of an age-old Internet architecture that involves redirecting customers wishing to pay for the contents of their online basket to an approved and compliant online payment gateway.”
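That age-old architecture is easy to picture in code. Below is a minimal, hypothetical WSGI sketch of the redirect pattern: the cloud-hosted shop never touches card data, and checkout simply bounces the browser to a compliant hosted payment page. The gateway URL and its parameters are made up for illustration and are not Authorize.Net’s actual API.

    # Sketch of the redirect pattern: card data never hits this server.
    from wsgiref.simple_server import make_server
    from urllib.parse import urlencode

    GATEWAY_URL = "https://payments.example.com/checkout"  # the PCI-scoped party

    def app(environ, start_response):
        if environ["PATH_INFO"] == "/checkout":
            # Pass only order metadata; the gateway collects the card number.
            qs = urlencode({"order_id": "1234", "amount": "19.99"})
            start_response("302 Found", [("Location", GATEWAY_URL + "?" + qs)])
            return [b""]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Cart page: no card data handled here\n"]

    if __name__ == "__main__":
        make_server("localhost", 8000, app).serve_forever()

Under that design the merchant’s own PCI scope shrinks dramatically, which is exactly the point the critics were making.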

Christopher Hoff, another cloud and security expert, also objects to the aforementioned Rackspace/Mosso PCI hype, stating the following in his blog post “How To Be PCI Compliant in the Cloud…”:

So after all of those lofty words relating to “…preparing the Cloud for…online transactions,” what you can decipher is that Mosso doesn’t seem to provide services to The Spreadsheet Store which are actually in scope for PCI in the first place!*

The Spreadsheet store redirects that functionality to a third party card processor!

So what this really means is if you utilize a Cloud based offering and don’t traffic in data that is within PCI scope and instead re-direct/use someone else’s service to process and store credit card data, then it’s much easier to become PCI compliant. Um, duh.

Ben Cherian, on his own blog, goes so far as to call the Rackspace/Mosso antics a trick:

When I saw this, I wondered how it was possible, but as I read closer it became clear that it was just a trick! It seems that their “PCI-compliant” solution requires Mosso not to store any information that requires PCI compliance. Instead they offload the burden of compliance to a third-party payment gateway (Authorize.Net).

Keeping it real, however, Greg Hrncir, the Director of Operations at Mosso, shot back with the following comment on Craig’s blog:

The truth is that we are the first Cloud, that we know of, that enabled its Cloud customers to gain PCI compliance using multiple technologies. The future of Cloud technologies is full of these types of hybrid solutions that combine the best of both worlds. The goal for a customer and online merchant, is to get PCI compliance, not be purist in terms of technology. On line merchants want to leverage the Cloud for scaling, and this is a good way to do it by combining both worlds.

In summary, I think they were all right. Craig, Chris, and Ben were perfectly within bounds to call out the Rackspace/Mosso hype, and in doing so they did a brilliant job educating us on what PCI really means in or outside of a cloud. However, Greg Hrncir also has a point: what Mosso did was a first move, and as a hybrid model it lays the building blocks for otherwise roadblocked initiatives. In my opinion, what Rackspace has done is significant from a cloud industry standpoint; however, as cloud leaders they should have used a little more discretion in their announcement. With all the hype already associated with cloud computing, it is important for the leaders in this space to keep the discussion grounded. Still, this reminds me of an old friend of mine who, every time he got into a fight, would stick out his chin and say “hit me.” In the Mosso PCI debate, it looks like Mosso got hit.

March 6, 2009  12:54 PM

Cloud computing and teenage sex…

Jo Maitland

What do cloud computing and teenage sex have in common?

Everyone talks about it, few actually do it, and even fewer get it right, according to the following story.

Check it out:

http://web2.sys-con.com/node/862933


March 6, 2009  12:26 PM

Drug company calls out cloud security concerns

Jo Maitland

Eli Lilly and Co. tapped into Amazon Web Services to crunch a vast chunk of data associated with the development of a new drug. Its research time collapsed from three months to two hours, a huge advantage in the highly competitive pharmaceutical business.

The company repatriated the data over a secure line that connected end-to-end with Amazon. But the firm found there was no way to prove that all its data had left the Amazon cloud. It had to take Amazon’s word for it, which raised security concerns.

Read the full story here:

http://searchsecurity.techtarget.co.uk/news/article/0,289142,sid180_gci1348807,00.html


February 12, 2009  5:00 PM

IBM pushes cloud plans, partnerships

Jo Maitland

If anyone can solve the problems associated with selling cloud computing to enterprises, it should be IBM. Today the company announced a step in that direction with a slew of new offerings and partnerships to build cloud services for businesses.

The key challenges to overcome are:

    1) Identity management – who sees my application and data in the cloud? Security and regulatory requirements are crucial.
    2) Which workloads and applications are appropriate for cloud computing?
    3) How does my application in the cloud get access to my data, which is still stored onsite?

IBM doesn’t have all of these answers yet, but says it is working with customers and partners to understand these issues. Among today’s announcements:

    Elizabeth Arden, Nexxera, The United States Golf Association, and Indigo Bio Systems sign on as new IBM cloud computing customers
    IBM Global Services will offer data protection software “as a service” through the cloud, in addition to a new IBM cloud environment for businesses to safely test applications
    First live demonstration of a global “overflow cloud” – IBM and Juniper Networks to install hybrid cloud capabilities across IBM’s worldwide Cloud Labs for customer engagements. This is to let users bridge between private clouds and IBM’s public cloud offerings to turn up resources as needed.
    At 13 worldwide cloud centers, IBM offers server capacity on demand, online data protection, and Lotus e-mail and collaboration software.
    IBM Rational AppScan 7.8 lets users continuously monitor the Web services they publish into the cloud to check that they are secure, compliant and meet business policies.
    Service Management Center for Cloud Computing contains a set of offerings including Tivoli Provisioning Manager 7.1 and the new Tivoli Service Automation Manager, to automate the deployment and management of private clouds.
    Finally, IBM said it will launch a Tivoli Storage as a Service offering through its Business Continuity & Resiliency Services cloud. It won’t be available until late 2009; users will be able to consume Tivoli data protection technologies via a cloud and pay for only what they use. EMC and Symantec already offer these kinds of services.

From these announcements it looks like IBM will be able to help businesses figure out which workloads to shift into the cloud, but there are no details yet on how it will ensure identity management, security and compliance.


February 3, 2009  2:57 PM

Amazon claims 400,000 web services users

Jo Maitland

An industry insider close to Amazon’s Web Services (AWS) business unit told us the company claims to have 400,000 customers using its web services offering.

AWS includes EC2, the compute-on-demand offering; S3, the hosted storage service; SimpleDB, for hosted databases; Simple Queue Service (SQS), a communication channel for developers to store messages; and CloudFront, a content delivery network.
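For anyone who hasn’t touched these services, here is a minimal sketch of what driving AWS from code looks like, using the open source boto library. This assumes AWS credentials are set in your environment; the bucket and queue names are made up for illustration.

    # Store an object in S3, then push a message through SQS, via boto.
    import boto

    s3 = boto.connect_s3()                    # reads AWS keys from the environment
    bucket = s3.create_bucket('my-example-bucket-20090203')
    key = bucket.new_key('hello.txt')
    key.set_contents_from_string('Hello from the cloud')

    sqs = boto.connect_sqs()
    queue = sqs.create_queue('example-queue')
    queue.write(queue.new_message('work item #1'))
    print(queue.count())                      # approximate number of queued messages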

Amazon has not publicly discussed much detail about its customers and how they are using AWS. For instance, of these 400,000 users, how many are using both EC2 and S3, just S3 or just EC2? Is anyone using SimpleDB or CloudFront yet? How many of these users were one-time customers? My hunch is that the 400,000 number includes any customer that has touched AWS, regardless of whether they are still using it.

In conversations with IT users, it’s clear they are interested in these services but need more reference cases on how to use them. A great success story goes a long way.

During a webinar on cloud computing today, James Staten, principal analyst at Forrester Research, said enterprises need more transparency from EC2 to show that it can meet SLAs. “The predictability [of the service] is not good enough for business,” he said, noting that EC2 had two lengthy outages in 2008. Small businesses and gaming and entertainment companies are the biggest adopters of EC2, he said. The former can’t afford to build their own datacenters, while gaming and movie companies require extra infrastructure around the release of new games and movies, which can be set up and torn down as needed.

Staten said enterprises are using cloud services like EC2 for R&D projects, quick promotions, partner integration and collaboration, and new ventures. He called for more companies to share how they are using these services and recommended that IT shops begin to experiment with them. Staten suggested endorsing one or two clouds as “IT approved” and establishing an internal policy for using these services. He also urged IT organizations to tell cloud providers what they want and what matters most to them. Secure enterprise links, standards, SLA expectations, levels of support (24/7 phone support, for example)? My guess would be all of the above. If you’d rather, I can hammer on the vendors, so let me know.


January 22, 2009  8:10 PM

VMware touts benefits of private cloud computing, VDC-OS

Bridget Botelho

VMware, Inc. is on a mission to show companies that they can get the benefits of cloud computing without handing their mission-critical applications over to an outside provider: with the upcoming Virtual Data Center Operating System (VDC-OS), IT will be able to create secure, private cloud environments.

The yet-to-be-released VDC-OS represents the evolution of VMware Infrastructure; the platform, due for release sometime this year, will transform traditional data centers into internal cloud environments. The business case for creating a private cloud is less complexity in the data center; software like VDC-OS will virtualize and automate systems to the point that there is less ‘knob turning’ and more time spent on tasks that improve the business, said VMware Sr. Director of Product Marketing Bogomil Balkansky.

“Too much of IT budgets are spent on management tasks and keeping the lights on, instead of on tasks that actually improve business,” Balkansky said. “Infrastructure complexities should not get in the way of this, but they do.”

While external clouds like Amazon EC2 offer the same benefits as internal clouds, VMware is betting that large enterprises won’t send their mission-critical applications outside the four walls of their data centers to these providers. Instead, they will want to create private cloud compute infrastructures using software like VDC-OS.

“There are security challenges with public clouds; enterprises don’t trust [outsiders] with their customer and financial data,” Balkansky said.  “We want to transfer the notion of cloud computing to internal data center operations.”

VMware is also hosting a webinar on January 29 about Internal Cloud Computing, if you want to hear more on this.

Balkansky said private cloud computing environments will gain traction in large data centers, but that could just be a self-serving prophecy. After all, most public cloud providers won’t pay for VMware software, using free and open source Xen instead; hence, VMware has nowhere to go but inside the enterprises that already know and love it.

While VMware is on a private cloud advocacy mission, as the largest virtualization provider on the planet it can’t ignore the need to play well with public clouds. That’s where VMware’s vCloud initiative comes into play; it will eventually allow VMware users to move their virtual machines on demand between their datacenters and cloud service providers, and more than 200 partners have signed up to support vCloud so far, Balkansky said.


January 19, 2009  7:53 PM

Cloud hype extends to NIC cards

Jo Maitland

Hifn says its new Express DS4100 NIC card is “optimized for the cloud”. What’s next? Cables? Batteries? My desk?

The problem with the cloud is speed in terms of uploads and downloads, says Hifn’s PR person. “Try uploading a terabyte to the cloud and see how long it takes.” He has a point there, but I think it takes more than a slick NIC to fix this problem.
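A back-of-the-envelope calculation shows just how good a point it is. This rough sketch ignores protocol overhead and assumes the link runs flat out:

    # Time to move 1 TB at common line rates, ignoring protocol overhead.
    TB_IN_BITS = 1e12 * 8

    for name, mbps in [("T1 (1.5 Mbps)", 1.5), ("100 Mbps", 100.0), ("1 Gbps", 1000.0)]:
        hours = TB_IN_BITS / (mbps * 1e6) / 3600
        print("%-14s ~%.1f hours" % (name, hours))
    # T1: ~1481 hours (two months); 100 Mbps: ~22 hours; 1 Gbps: ~2.2 hours

A faster NIC only helps at the very last hop; the wide-area link is usually the bottleneck.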

The exact speeds and feeds of the DS4100 will not be available until the official release next week, but ballpark pricing will be $1,000 per card. It also supports virtualization and service-oriented architectures, just in case you need the whole ball of yarn.


January 8, 2009  2:57 AM

Sun buys Q-Layer a tad early

Jo Maitland

Sun Microsystems snapped up a key piece of cloud-enabling technology via its acquisition of Belgium-based Q-Layer this week, but it’s way ahead of most enterprise IT shops, which are not ready for private clouds just yet.

Data from The 451 Group, published in October 2008, showed that 84% of its IT client base, several hundred large enterprises worldwide, have no plans to deploy internal, on-premise cloud computing.

Intergenia, a hosting company in Germany, is the only public Q-Layer customer.

Q-Layer is focused on the orchestration layer above the hypervisor and supports VMware, Xen, Microsoft and Sun. Its NephOS software is designed to run on virtual and physical servers, storage and networks, abstracting the components in each layer behind a uniform set of actions (e.g., create machine, reboot, backup, restore, start, stop). The software translates these actions to the underlying physical or virtual technology, so IT admins manage a virtual view that is automatically mapped to whatever sits beneath it.
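The pattern is easy to picture in code. Here is a minimal, hypothetical sketch of such a uniform-action layer; none of these class or method names are Q-Layer’s actual NephOS API.

    # Hypothetical sketch: one action vocabulary, many backends.
    class Driver(object):
        """Backend-specific implementation of the uniform actions."""
        def create(self, vm_id, spec): raise NotImplementedError
        def reboot(self, vm_id): raise NotImplementedError

    class XenDriver(Driver):
        def create(self, vm_id, spec): print("xen: create %s %s" % (vm_id, spec))
        def reboot(self, vm_id): print("xen: reboot %s" % vm_id)

    class VMwareDriver(Driver):
        def create(self, vm_id, spec): print("vmware: create %s %s" % (vm_id, spec))
        def reboot(self, vm_id): print("vmware: reboot %s" % vm_id)

    class Orchestrator(object):
        """Admins act on a virtual view; actions map to the backing driver."""
        def __init__(self):
            self.drivers = {"xen": XenDriver(), "vmware": VMwareDriver()}
            self.placement = {}  # vm_id -> backend name

        def create(self, vm_id, backend, spec):
            self.placement[vm_id] = backend
            self.drivers[backend].create(vm_id, spec)

        def reboot(self, vm_id):
            self.drivers[self.placement[vm_id]].reboot(vm_id)

    orch = Orchestrator()
    orch.create("web01", "xen", {"cpus": 2, "ram_gb": 4})
    orch.reboot("web01")

Adding a physical-server backend, or start/stop/backup actions, is just more of the same; the virtual view the admin works with never changes.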

Other companies in this space include 3Tera, Enomaly, Eucalyptus, DynamicOps, Arjuna and Cassatt.

It sounds like great technology, which is typical of Sun, as is the timing. Sun’s track record of acquiring, and even building, great technology way ahead of market adoption is second to none. The Q-Layer deal looks to be in keeping with the technology-focused company we know and love. Let’s hope IT shops are in a position to try this kit out sooner rather than later, and that Sun finally gets a break.


December 8, 2008  3:34 PM

Gartner VP predicts thousands of clouds

Bridget Botelho

Gartner vice president and distinguished analyst Tom Bittman spoke with us about the IT industry evolution led by virtualization and cloud computing, and why big players like VMware may not be the virtualization software of choice in the cloud.

Since virtualization is the foundation of cloud computing, clouds are the next logical step for virtualization vendors like VMware and Citrix Systems. But Bittman said that if these vendors don’t make pricing changes, cloud platform providers like Google and Amazon won’t use them.

“Cloud computing is a wide open market, dominated by open source Xen. It is a market that is there for the taking, and for VMware that would require a significantly different pricing model,” said Bittman, who also blogs about virtualization and cloud computing. “Sun and Citrix could get a major foothold in the cloud market as well, if they get their act together.”

VMware has taken steps toward becoming cloud friendly with its vCloud initiative, but vCloud is limiting because the provider has to use VMware, Bittman said. Microsoft also has its own cloud service, Azure, supported by Hyper-V.

Microsoft will probably try to turn Azure into a platform for ISVs to build software as a service, so “in a lot of ways, they are trying to build a platform for a cloud,” Bittman said. But, “there is no reason Windows will be a prominent player in the cloud…[because providers] like Amazon EC2 don’t care what the OS is; all they care about is what is being provided.”

The future of clouds: more providers, fewer OSes

Today, cloud computing is dominated by a small number of large providers, but in the years ahead there will probably be ecosystems built around those islands: Software as a Service (SaaS) built upon the existing clouds, and the sharing of resources between cloud providers, Bittman said. He also expects fragmentation from today’s few general cloud platforms into many specialty cloud providers, with applications and infrastructures that cater to specific industries, like healthcare, that have specific compliance requirements.

“We will see a growth to thousands of cloud providers and they won’t want to write their own software using Xen; they will want to buy software and that is where companies like Sun could make a play,” Bittman said.

Cloud computing is also changing the game when it comes to operating systems; the concept of the meta-OS (like VMware’s Virtual Data Center OS) is changing the paradigm of one OS per physical server, Bittman said. “The old idea is you build one platform to manage one box, but if I have 10,000 boxes, I don’t want 10,000 OSes managing everything independently,” Bittman said. “If I turn an OS into a dumb container, it can work in a much more distributed way, like Microsoft’s Azure, which is essentially Windows 2008 sprinkled all throughout the data center. This is changing the way we look at OSes going forward.”

Cloud computing has the power to change the IT industry because of what it offers companies: flexibility and agility, Bittman said.

“Most infrastructures today focus on cost, but we are beginning to see a focus shift towards agility. People are using [cloud environments] not because of the cost savings, but because it is flexible. The ability to make changes according to demand quickly is becoming a more important factor for data centers,” Bittman said.


December 4, 2008  3:17 PM

Should Amazon EC2 follow Moore’s Law?

Mark Fontecchio

According to the economic side of Moore’s Law, processing power gets cheaper every year because vendors can pack more of it into the same amount of space. Should cloud computing follow Moore’s Law?

Let’s take a look at Amazon’s computing offering in the cloud, the Elastic Compute Cloud (EC2). EC2 is a web service that lets customers rent Amazon servers on which they can host their own applications. There are different price levels, called “instances.” The basic one is 10 cents an hour and, according to Amazon, is equivalent to a 32-bit system with 1.7 GB of memory, 1 EC2 Compute Unit and 160 GB of instance storage.
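For scale, here is the quick arithmetic (my own, not Amazon’s) on what that rate means for an instance left running around the clock:

    # What 10 cents per instance-hour adds up to.
    hourly = 0.10
    print("monthly: $%.2f" % (hourly * 24 * 30))   # $72.00
    print("yearly:  $%.2f" % (hourly * 24 * 365))  # $876.00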

So what is an EC2 Compute Unit? According to Amazon, it is equal to a 1.0-1.2 GHz 2007 Opteron or Xeon processor. When EC2 first came out in 2006, one EC2 Compute Unit was equivalent to an “early-2006 1.7 GHz Xeon processor” according to Amazon documentation on EC2 Instance Types.

(Two odd things about this: There’s a 20% difference between a 1.0 GHz processor and a 1.2 GHz processor, so what gives? And if a 1.0-1.2 GHz processor in 2007 is equivalent to a 1.7 GHz processor in 2006, why change the definition at all?)

The price of that basic instance has stayed the same since 2006 at 10 cents an hour, but you are getting the same amount of processing power now that you were in 2006. So it is not following the economic portion of Moore’s Law.

“You can now get a quad-core server for the same price you could get a single-core server in 2006. But cloud computing is not taking advantage of Moore’s Law,” said Raj Dutt, the CEO of Voxel Dot Net, a New York-based hosting company. “It’s the same price for the same amount of processing power.”

The question is whether it should. Clay Ryder, president of analyst firm Sageza Group, doesn’t necessarily think so. He sees EC2 and other cloud computing products as based on a different pricing scheme. Whereas servers are sold on a model that includes the cost of time, materials and markup, cloud computing is more of a value-based pricing model, and the two are not the same.

Ryder likened it to owning a car compared to renting or leasing one. When you buy a car, you’re paying for the cost of the materials to build it, the labor it took to build it (time) and any markup to make a profit. When you rent one, you pay for a service.

“There is a lot of value in the Amazon approach,” he said. “You can turn it off and turn it on, and there’s no long-term cost to you, and that is an intangible value.”

Ryder hits on something here. When you pay for EC2, you’re not just paying for the server hardware. You’re also paying for the data center infrastructure around it (land acquisition, building costs, power generation, chillers, racks, etc.) and the cost of labor it takes to maintain that infrastructure and the servers. If you have your own data center and your own people, you don’t pay for that when you go buy a dozen servers from Dell.

Some might still argue that at least a portion of the cost should be subject to Moore’s Law. After all, Amazon is charging the same price for the same amount of processing power, even though that processing power is getting cheaper for them to buy.
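To put a number on that argument: if compute per dollar roughly doubled every two years, a common economic reading of Moore’s Law and purely an illustrative assumption here, the same dime should buy noticeably more horsepower each year:

    # Illustrative only: assume compute per dollar doubles every two years.
    base_year = 2006
    for year in (2006, 2007, 2008, 2009):
        factor = 2 ** ((year - base_year) / 2.0)
        print("%d: %.2fx the 2006 compute per dollar" % (year, factor))
    # 2009 -> ~2.83x the compute the same 10 cents bought in 2006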

Then again, maybe Amazon has already factored in Moore’s Law, but has also factored in the increasing cost of labor, electricity, and materials to build a data center to run those servers. So in the end, it’s all a wash, and the price stays the same.

