The Troposphere


December 12, 2012  7:24 PM

HP Cloud Compute undercuts Amazon, too

Beth Pariseau

It turns out that a headline I wrote for SearchCloudComputing.com last week about the new HP Cloud Compute service was only half right.

The headline calls out that HP has undercut Rackspace with its per-hour pricing of 4 cents (Rackspace’s on-demand offering is priced at 6 cents per hour). But as pointed out today by @stackgeek on Twitter, that headline, as well as these paragraphs in the story:

Prices for Hewlett-Packard’s (HP) Cloud Compute service start at four cents per hour for a small instance with 1 GB of RAM – 2 cents lower than the price for Rackspace’s 1 GB instance. Amazon Web Services’ (AWS) Reserved Small Instance, which comes with 1.7 GB of memory, costs 3.9 cents per hour.

 While AWS still offers the best deal, HP’s pricing for the fledgling service might attract Rackspace customers.

…are not totally accurate.

I re-checked the Amazon pricing page, and @stackgeek is correct in pointing out the $69 one-time upfront cost for a small, lightly utilized one-year Reserved Instance. While the 3.9-cent hourly cost without this fee would make Amazon cheapest on a yearly basis, at $341.64, the fee makes Amazon’s cost per year $410.64. HP’s cost per year at 4 cents an hour is $350.40 and Rackspace’s price per year at 6 cents an hour is $525.60.

@stackgeek is also correct in pointing out that the original comparison was between Amazon’s Reserved Instance and HP and Rackspace’s on-demand instances. The pricing for an On-Demand Instance on Amazon is 6.5 cents per hour, which is half a cent more expensive than Rackspace. That makes the pricing for an Amazon On-Demand instance $47.45 a month, or $569.40 per year.
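
For anyone who wants to re-check the arithmetic, here’s a minimal Python sketch that reproduces the annual figures above from the published hourly rates; the $69 reserved-instance fee is the only non-hourly component.

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_cost(cents_per_hour, upfront_fee=0.0):
    """Yearly cost in dollars for a given hourly rate plus any one-time fee."""
    return cents_per_hour / 100 * HOURS_PER_YEAR + upfront_fee

print("HP Cloud Compute 1 GB: $%.2f" % annual_cost(4.0))                    # $350.40
print("Rackspace 1 GB:        $%.2f" % annual_cost(6.0))                    # $525.60
print("AWS Reserved Small:    $%.2f" % annual_cost(3.9, upfront_fee=69.0))  # $410.64
print("AWS On-Demand Small:   $%.2f" % annual_cost(6.5))                    # $569.40
```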

So, HP’s new service actually undercuts both Rackspace and Amazon on price. And Amazon Web Services is more expensive than Rackspace’s Cloud Servers as well.

I regret the errors.

December 4, 2012  3:49 PM

Cloud services: If not now, when?

Margie Semilof

 

The notion of using cloud-based services still terrifies enterprise IT pros, even though such services have advanced in both quality and variety for years. IT pros remain frozen by the specter of losing control of data, security breaches and random service outages. Some of these reasons may be losing validity.

In stark contrast to that fact, however, was the smashing success Amazon Web Services (AWS) had last week with re:Invent, its first conference. In the six years since AWS launched, cloud services have been increasingly embraced by start-ups and media delivery companies, along with a slew of forward-thinking developers.

Amazon executives, such as CTO Werner Vogels, Senior Vice President Andy Jassy and founder Jeff Bezos, brought their value proposition to the masses in person. We’ve heard their case for cloud services before: You can trade capital expense for variable expense, pay lower variable costs, stop guessing at capacity needs and have apps set up in minutes.

They hammered on traditional IT vendors, claiming the economics of AWS are disruptive to the HPs, Dells, Microsofts and Oracles of the world. Amazon is a high-volume, low-margin business, and that’s not a game the old guard can play, they said. They may have a point. With the possible exception of IBM, most of the longtime stalwarts of the industry, many of them reliant on hardware and more traditional services, have yet to present a compelling cloud services strategy that would make their largest customers disregard AWS.

For instance, you want security? AWS has all the standard security certifications, and the company can help implement them with a bigger and better team than you have.

But still, many IT shops have the same reasons for not moving forward. They are risk averse, they don’t throw things out, legacy apps are hard to move and there are few examples of traditional enterprises that have made the leap. Amazon trots out its old standby, Netflix, along with more recent enterprises, including NASDAQ, NASA, the United States Tennis Association, McGraw-Hill and Novartis, to name a few. Comcast, for instance, has been reinventing parts of its business using AWS, spending two years quietly refashioning its media delivery network.

Real enterprises still look for accounts they can relate to. At the conference, Amazon touted its prized new public customer Pinterest — not exactly a revenue-generating machine that supports a legacy back end.

And try to dig up a customer reference on the exhibit floor. One energetic marketing manager brightened when I asked for a name. He offered up Grindr.com. If you don’t know who they are, look them up. Hint: It’s not a competitor to the Subway chain of sandwich shops.

Not too far into the future, enterprises will absorb the cultural changes associated with cloud adoption. Amazon with AWS will be a winner, and some of its competitors will win as well. In the 1990s, we had the likes of Gates and Ballmer, Ellison and others, armed with “kill the competitor” strategies that generated high margins. This is a different game. For example, at re:Invent, AWS followed Google when it dropped the price of its cloud storage by 25%. The next morning, someone from Hitachi told me his customers asked, “Why does storage have to cost so much?” Is it getting hot in here or what?

In data-driven 21st century architectures, apps and processes must be automated as business shifts, because IT shops will be determined not to get stuck with hardware — and software — limitations. Security is integrated from the ground up. There are new attitudes. Failure is not an option? Forget that. AWS CTO Vogels says to regard failure as normal. Failure is always around the corner. Embrace it. It’s the new black.

The main thing is that if you don’t constrain yourself up front, you will build more successful architectures. It will take more time, which is fine, because this team, like another team from Seattle that forged new ground 30 years ago, takes the long view. Amazon, and its competitors, look increasingly inevitable.

Margie Semilof is editorial director of the Data Center & Virtualization media group.


November 6, 2012  7:41 PM

Cloud-based collaboration replaces Office Space mentality of enterprise

Caitlin White

 

The modern office no longer looks like Office Space, with a staff of office drones tied to their cubicle desks, working from an office-provided desktop every day. In a world of iPads, BlackBerrys, Androids, iPhones and laptops, employees are accessing information from everywhere, giving cloud-based collaboration a clear cue to make its entrance.

While some enterprises say they’re still preparing for the bring-your-own-device (BYOD) era to hit, the truth is it’s already here, whether they’re prepared or not. According to a report by Juniper Research, 150 million people use personal mobile devices for work. That number is set to more than double by 2014.

The rise of the global worker is complemented by a shift toward a services economy, said TJ Keitt, senior analyst at Forrester, a global research and advisory firm based in Cambridge, Mass. Automation that comes from new technologies, such as cloud computing, opens the doors for not only global workers but for the introduction of more creative jobs, such as consulting. And these creative jobs require more communication, collaboration and flexibility in working hours.

“Cloud collaboration is not just about being a different delivery mechanism, it’s about what you’re enabling in your workforce,” said Keitt in a Webinar last week.

A 2012 Forrester survey showed that agility — not cost-savings — was the primary reason companies gave for adopting Software as a Service (SaaS).

TechTarget’s 2012 cloud adoption survey echoed this finding, with 60% of survey respondents using public cloud because it offered increased availability.

Businesses have used collaboration tools primarily for two reasons: to reduce overhead costs and to improve communication among the workforce. Collaboration software lets employees communicate without needing to travel, and it can reduce the number of in-house staff a company needs, both of which cut overhead costs. Those savings, plus the ability to more easily dispense and share information, make collaboration tools a boon to many businesses.

And companies can better capitalize on these benefits by moving collaboration to the cloud, Keitt argues.

“Cloud is a natural home for collaboration technology because of the confluence of employee mobility, globalization and innovation networks, which are changing the nature of business,” said Keitt.

Examples of cloud collaboration software making waves in enterprise IT include Google Apps for Business, GoToMeeting by Citrix Systems Inc. and IBM’s SmartCloud for Social Business.

But will enterprises’ hesitance to adopt cloud undermine the benefits of collaboration software?

Despite lingering concerns about security, compliance and vendor lock-in, TechTarget’s survey shows a growing comfort with cloud services. Of the 1,500 IT pros surveyed, 61% reported they currently use cloud services.

This growing ease with cloud could be good news for enterprises. The rise of the global worker may mean increased access to information for employees, but it could also mean consumers are empowered by information.

In an era when a company’s mistake or a disappointing product can spread through social media like a social disease, the ability to quickly and efficiently communicate with customers could be a solid differentiator. Cloud-based collaboration software could match the changing tides in business, but cloud vendors have to work to overcome persistent qualms about cloud services if they are to make major advances in the enterprise.

Caitlin White is associate site editor for SearchCloudComputing.com. Contact her at cwhite@techtarget.com.


October 18, 2012  12:34 AM

VMware CTO pledges new OpenStack management for vSphere

Beth Pariseau

VMware CTO Steve Herrod spoke to attendees at OpenStack Summit Wednesday.

VMware will work on a buffed-up compute driver for OpenStack’s Nova project which will allow OpenStack to manage advanced features of vSphere, according to VMware CTO Steve Herrod.

This means that despite the direct competition between OpenStack and VMware’s vCloud Director, VMware will allow OpenStack management tools to more easily manage vSphere virtual machines.

It’s a new olive branch extended to a suspicious OpenStack community by newcomer VMware, which has previously made clear that its proprietary cloud management tools will be able to wrap themselves around OpenStack clouds; this is the first time VMware has actively participated in allowing its hypervisor to be subject to management by another cloud platform.

Citing VMware properties including Spring, RabbitMQ, Linux, Hyperic, and Cloud Foundry, and bearing gifts in the form of hundreds of free copies of VMware Fusion, Herrod played up VMware’s open source street cred in a presentation to a skeptical but standing-room-only crowd at OpenStack Summit on Wednesday.

“We are not strictly a closed source company, we’re not strictly an open source company, we’re a blend of both,” he said.

There’s currently a compute driver within Nova, but it’s “pretty dumb,” Herrod said – essentially it allows users to create vSphere VMs and run them.

With a new driver written by VMware will come support for VMware HA and live migration, Herrod said.

According to a later presentation by VMware staff engineer Sean Chen, the new driver will also include the ability to launch OVF disk images, use a VNC console to manage VMs, attach and detach iSCSI volumes, get guest information, conduct host operations, assign VLANs, link VMware with Quantum, and create custom VMware image properties for OpenStack’s Glance image management utility.
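
For readers less familiar with OpenStack internals, a Nova compute driver is essentially a Python class that exposes a standard set of lifecycle operations to the rest of the cloud platform. The sketch below is purely illustrative; it is not VMware’s driver, and the class and method names are hypothetical, but it shows the general shape of the operations the new driver is expected to cover.

```python
# Illustrative only: a simplified stand-in for a Nova compute driver, not
# VMware's actual code. Method names mirror the operations described above;
# the signatures are hypothetical.

class IllustrativeVSphereDriver:
    """Sketch of the operations a richer vSphere compute driver would expose."""

    def spawn(self, instance_name, image_ref):
        """Create and power on a vSphere VM from an image (e.g., an OVF)."""
        print(f"creating VM {instance_name} from image {image_ref}")

    def live_migrate(self, instance_name, dest_host):
        """Move a running VM to another host with no downtime (vMotion-style)."""
        print(f"live-migrating {instance_name} to {dest_host}")

    def attach_volume(self, instance_name, iscsi_target):
        """Attach an iSCSI volume to a running VM."""
        print(f"attaching {iscsi_target} to {instance_name}")

    def get_vnc_console(self, instance_name):
        """Return connection details for a VNC console session on the VM."""
        return {"host": "vcenter.example.com", "port": 5901}
```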

Herrod also hinted that VMware is exploring ways to integrate the Open vSwitch, used by network virtualization subsidiary Nicira, into the vSphere platform, possibly as a replacement for the existing VMware virtual switch.

“We are looking quite seriously at what aspect of the Open vSwitch to merge and have interoperating in vSphere environments,” he said.

Attendees at the conference weren’t necessarily about to fall into VMware’s outstretched arms, though Herrod’s presentation piqued their interest somewhat.

One VMware user from a communications company in Texas said he has yet to decide whether to use a vCloud or OpenStack environment for giving developers access to virtual machines.

“There’s more than one way to skin this cat,” he said.

Another attendee working for a major service provider mused that OpenStack, with its Quantum network virtualization features, might allow for better portability of vSphere VMs between private and public clouds.


August 28, 2012  4:39 PM

Seven minutes of terror in cloud performance testing

Fernanda Laspe

Millions of viewers tuned in to NASA’s website to watch streamed live coverage of its ‘Curiosity’ rover landing on the surface of Mars earlier this month, and though it all went off without a hitch, a server outage or a website blip could have done some serious damage to NASA’s reputation.

It was an ambitious project to say the least, and NASA knew its site would be hit with possibly its highest-ever volume of website traffic during those seven nail-biting minutes. So how did it ensure everything ran smoothly with so much at stake? The space program turned to SOASTA’s cloud testing software.

The NASA and SOASTA collaboration came about as a referral, of sorts, from folks at Amazon Web Services (AWS), a SOASTA technology partner. And with an already hefty bill of $25 million riding on the project, NASA wanted an audience and wanted to guarantee that audience saw an uninterrupted stream of the landing.

Often, a company’s reputation and the contents of its wallet are at stake.

“When Knight Capital crashed, it caused them to lose $16 million per minute just because they were down,” said Tom Lounibos, CEO of SOASTA. “If Twitter is down, it costs advertisers $25 million per minute.”

It really is about anticipating failure — imagining worst-case scenarios — so that when the actual moment comes, companies are ready to face adversity and deal with it. SOASTA used its predictive analysis software, GlobalTest, to imitate traffic conditions on NASA’s website three days before the Curiosity rover landing.

Predictive analysis allows you to understand when something could fail and why it might happen. “We are in the business of adding more intelligence to the process,” Lounibos said. “We go through a lot of what-if situations with predictive analysis.”

Some what-if situations in the NASA project consisted of load testing to help understand what might happen if there is an unexpected spike in traffic, or when back-end services require more capacity. By doing simulations and observing data, SOASTA can predict the effects on infrastructure, a Web application and the database, so that companies can optimize a website or applications to accommodate these changes.

NASA’s biggest issue was it could not predict how many people were going to watch the landing, Lounibos said. “We were able to help predict how much server capacity NASA would need,” he added.
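
SOASTA hasn’t published its models, but the basic what-if exercise described here is easy to illustrate: assume a per-viewer request rate and a per-server capacity, sweep over possible audience sizes, and see how many servers each scenario needs. Every number in the sketch below is made up for illustration; none of them are NASA’s or SOASTA’s actual figures.

```python
import math

# Purely illustrative assumptions -- not NASA's or SOASTA's real numbers.
REQUESTS_PER_VIEWER_PER_SEC = 0.5   # assumed polling/streaming request rate
CAPACITY_PER_SERVER = 2000          # assumed requests/sec a single server handles

def servers_needed(concurrent_viewers, headroom=0.3):
    """Servers required for a viewer count, with spare capacity for spikes."""
    load = concurrent_viewers * REQUESTS_PER_VIEWER_PER_SEC
    return math.ceil(load * (1 + headroom) / CAPACITY_PER_SERVER)

# Sweep a range of what-if audience sizes.
for viewers in (250_000, 1_000_000, 5_000_000, 10_000_000):
    print(f"{viewers:>10,} viewers -> {servers_needed(viewers)} servers")
```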

SOASTA also helped NASA prepare for a failure scenario by simulating an outage on a portion of Web servers and proving that failover plans were indeed effective.

“When you’re streaming for millions of people you can’t afford to have failure because there is only one first,” Lounibos concluded.

Fernanda Laspe is the editorial assistant for SearchCloudComputing.com.


June 5, 2012  7:04 PM

Behind the curtain of Microsoft’s Azure song and dance

Stuart Johnston

 

Windows Azure customers anxious to learn what Microsoft has been hiding behind its back can finally exhale later this week in San Francisco.

 

One key piece of the Azure update is support for what Microsoft calls “Persistent Virtual Machine (VM) Roles,” which will let Windows Azure customers run legacy applications in VMs. That includes running Linux, sources said.

 

Another capability is a Web hosting framework codenamed “Antares” that will provide a fine-grained Web app-hosting service aimed at customers who don’t see Azure as an economical platform for webpage hosting.

 

But will Microsoft be able to deliver those features sooner rather than later? Not in a single iteration, one source said. Instead of pulling off the “All singing, all dancing” vision Microsoft would like to promise, it’s more likely the company will need at least two iterations to achieve the basics.

 

Of course, now that the Windows 8 Release Preview is available there is sure to be a Windows Azure demo on tablets and mobile devices at the event.

 

Another key trend to watch for, sources said, is an increased focus on hybrid clouds.

 

Over the short to mid-term, Microsoft aims to achieve “write once, run anywhere” capabilities for Windows Azure, to borrow the Java slogan. Customers want to be able to run their applications in the data center, in the cloud, or as a hybrid of the two, interchangeably. And they want to be able to do so without rewriting any code or worrying about vendor lock-in.

 

The best way to do that seems simple enough — run applications on the same API on both platforms — Windows Azure and Windows Server 2012. That might not be as easy as it sounds, though.

 


Windows Azure numbers lower than Amazon

Just as important as what Microsoft says, however, is what Microsoft doesn’t say. That may be telling when it comes to judging the relative veracity and importance of plans and promises at the Meet Windows Azure event, which will be streamed.

 

Microsoft has been notably quiet about Windows Azure’s status for more than a year. That may be because sales of Windows Azure have been disappointing to date. Windows Azure has garnered fewer than 100,000 customers so far, according to the research firm Directions On Microsoft, based in Kirkland, Wash.

 

That’s far lower than industry estimates for market leader Amazon Web Services.

 

In some respects, it’s the same struggle Microsoft has gone through before. How can the company and its products remain relevant in a computing universe that is constantly changing?

 

The event will likely resemble many previous Microsoft marketing splashes, with system integrators, application developers, resellers and other partners lined up to show solidarity for the company’s strategy du jour.

 

Again, when Thursday rolls around, remember to listen closely for what doesn’t get said as well as what does.

 

Stuart J. Johnston is Senior News Writer for SearchCloudComputing.com. Contact him at sjohnston@techtarget.com.


May 9, 2012  12:53 PM

Is Microsoft jettisoning Azure name?

Stuart Johnston

“If it is true, it’s pants-on-head retarded.”

That’s how Tier 1 analyst Carl Brooks described reports this week that Microsoft will drop “Azure” from the branding of its public cloud offering.

“Azure is a dynamite brand — it’s almost a byword, like Amazon is, for a certain kind of cloud infrastructure, and in a very positive way,” Brooks said. “They’d be nuts to drop it and I’m hard pressed to understand any potential benefit.”

As it turns out, Brooks was right; Microsoft isn’t that irrational — although sometimes it might seem that way. The confusion began when a popular tech blog got wind that the software titan had sent out an email to Azure subscribers advising them that it’s cutting “Azure” from the names of a bunch of Azure services.

“In the coming weeks, we will update the Windows Azure Service names,” the message said. “These are only name changes: Your prices for Windows Azure are not impacted,” according to the email quoted in the blog post.

What had occurred, however, was less than meets the eye. The changes are to Azure’s “billing portal,” another tech blog revealed, and don’t affect the overall naming of Azure services.

After several hours of silence, Microsoft did finally issue an official clarification. “Microsoft continues to invest in the Windows Azure brand and we are committed to delivering an open and flexible cloud platform that enables customers to take advantage of the cloud. The brand is not going away.”

That’s a good thing. “It would be like dropping ‘Exchange’ in favor of ‘Microsoft Email Server’,” Brooks added, calling the excitement “a tempest in a teapot.”


March 8, 2012  8:11 PM

Yup, your cloud hunch was right

Michelle Boisvert

Everything you’ve read about who is using cloud computing and why is pretty much true, so says at least one industry study.

According to a recent Cloud Industry Forum survey of 400 public and private companies of varying sizes, flexibility is the number one reason U.S. companies adopted the technology in 2011. Cost savings eked out second place.

Of the 31% of respondents who listed flexibility as the top reason for adopting cloud computing services, the majority were SMBs, ranging from tiny companies with up to 20 employees to those with 100 to 999 employees (40% and 41%, respectively). Such companies tend to have limited in-house technical resources, and cloud offers self-service capabilities, on-demand scalability and the ability to quickly launch new services that might otherwise be delayed or pushed to the back burner completely.

Big companies with more than 5,000 employees (28% of respondents), on the other hand, looked to cloud services primarily to save money. And now the tables have turned slightly on who’s driving cloud services adoption. When cloud computing first started to catch on, business users were waving their flags for all things cloud. But once IT bigwigs — CTOs and CIOs — caught wind of cloud’s potential cost-cutting benefits, they started pushing for it too, according to Andy Burton, chairman of the Cloud Industry Forum (CIF) and CEO of Rise.

The ability to use cloud technology to launch a completely new service was a draw for 22% of respondents, while only 8% looked to cloud to either offset a lack of internal IT or because it was seen as a low-cost project.

Cloud = happiness for most adopters

Companies that jumped into cloud in 2011 must be seeing its benefits; 94% of respondents who adopted cloud have plans to expand cloud services in the next 12 months, according to CIF. The targeted apps? Email, asset management and security. Email and data storage applications will see the biggest push to the cloud in the next year, at 50% and 45%, respectively.

Burton said really big companies have moved resource workloads such as storage to the cloud because they know they can save money there. Smaller companies stick with simple apps like email. 

Once the warm and fuzzies pass, cloud concerns set in

Setting aside their love for cloud technology, plenty of IT pros are still nervous about trusting their data to others.  Top worries were data privacy and data security (56% and 53%, respectively). But these apprehensions will only cause companies to hesitate on adoption, not dismiss the idea completely.

“This may limit what companies put into the cloud and it will slow adoption rates,” Burton said. “People still have a tendency to want to know where their data resides.”

U.S. companies have made the boldest moves to the cloud, with adoption at 76% of those surveyed versus 53% of U.K. respondents. That may have much to do with EU data privacy laws that give end users the right to anonymity. Basically, a service provider has to give users the ability to remove content, and cloud services providers can’t guarantee that yet.

One surprise: In the U.S. cloud market, the largest companies are the least concerned about this. According to the study, those least comfortable with privacy issues in the cloud are small private companies and public organizations.

Rise, the channel division of Fasthosts Internet Group with headquarters in the U.S. and U.K., was the sponsor of CIF’s “USA Cloud Adoption & Trends in 2012” survey.


February 16, 2012  2:30 AM

Making cloud viral in your enterprise

Michelle Boisvert

IT teams understand the cloud model and are trying to realize its economic benefits.

But what really drives cloud computing is end users’ expectation to have access to everything, all of the time, according to Geva Perry, author of the blog Thinking Out Cloud, at the Cloud Connect conference in Santa Clara, Calif., this week.

The consumerization and democratization of IT, plus the trend of “millennial entitlement” (a younger end-user base that expects everything to just work and to be connected and accessible from anywhere), make the cloud more relevant than ever, Perry said.

“Cloud is on-demand, it’s there, it has low upfront costs and that makes it easy for folks to adopt it,” Perry said. He claims enterprise IT has warmed to cloud as well, as IT pros find ways to make it work by minimizing friction, creating self-service and building and designing products in a way that encourages use.

After cloud makes its way into the enterprise, how can IT teams keep applications running seamlessly while still protecting consumers and end users? Plan, test and prepare for the worst.

Bill Gillis, director of eHealth Technologies at Beth Israel Deaconess in Boston, relies on virtual patching. “Our website [BIDMC.org] is attacked every 10 seconds, 24 hours a day,” said Gillis. And those attacks are only increasing. The health care provider relies on TrendMicro’s Deep Security app to secure its cloud, which includes a network of 1,500 physicians.

And as Beth Israel Deaconess grows to include more physician networks — and it will, as it expects to increase to 500 practices by the end of this year — Gillis plans to run a mix of public and private clouds as well as virtual desktops to help control end points. “So we will just basically provide a URL to our physicians and it’s full virtualization.”

Don’t fear a cloud failure, prepare for it

The need for cloud managers to prepare was advice echoed all day at the conference. “Complexity always increases. Latency defects accumulate and will cause crazy failures to happen,” said Jesse Robbins, cofounder of Opscode.

Sure, outages happen. Robbins’ advice? Adopt resilience engineering, a practice often used in industries such as aviation, space transportation, health care and manufacturing, in which IT failures could be catastrophic to human life. The first step to do this is to “automate all the things.”

By allowing the cloud to run as automated as possible, IT staff can quickly see where failures will occur. Involve all departments in testing and load balancing. Gone are the days when IT simply threw things over the wall for testing. The DevOps culture is now, and it has its benefits in cloud.

Only after all teams are on board can cloud admins focus on reliability, specifically mean time to fail (MTTF) and not just mean time to recover. Remember, failures will happen eventually. “Automate all the things, test what you do and press the buttons,” Robbins concluded.
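
A quick aside on those two metrics: as a rough rule of thumb, steady-state availability works out to MTTF divided by MTTF plus MTTR, so recovering faster can buy back the availability lost to failing more often. The numbers below are hypothetical and only illustrate that trade-off.

```python
def availability(mttf_hours, mttr_hours):
    """Steady-state availability from mean time to fail and mean time to recover."""
    return mttf_hours / (mttf_hours + mttr_hours)

# Hypothetical numbers: failing twice as often but recovering ten times faster
# still yields better availability.
print(f"MTTF 1,000 h, MTTR 1.0 h -> {availability(1000, 1.0):.4%}")  # 99.9001%
print(f"MTTF   500 h, MTTR 0.1 h -> {availability(500, 0.1):.4%}")   # 99.9800%
```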


February 9, 2012  6:30 PM

VMware claims three-fold jump in ‘vCloud Powered’ clouds

Stuart Johnston

VMware has dominated the server virtualization marketplace since the early days — so why is it still so far behind in the cloud computing space?

In recent years, VMware has been pushing to stay even with other cloud competitors, with the release of products such as vCloud Director. In fact, the virtualization giant recently bragged up its burgeoning presence in cloud land. But how much is hype versus potential?

Tuesday, the company took another incremental step toward a more cohesive cloud strategy when it announced that this quarter it will ship vCloud Integration Manager (vCIM) — a toolset that enables third-party cloud resellers to self-provision cloud services to their customers without involving manual processes or intervention from VMware techs.

The idea is to cut the time and hassle required to configure, deliver and manage vCloud Director-based clouds for services and applications vendors, providing quicker monetization for a key segment of the cloud market. VMware vCIM will integrate with other VMware components, including vCloud Director and vSphere, as well as vShield Edge and vCenter Chargeback Manager.

Additionally, vCIM will provide a REST-based application programming interface (API) that ties into the service provider’s back office systems, including CRM and billing.
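
VMware didn’t share API details, so the snippet below is a made-up sketch of what such a back-office integration might look like: an HTTP POST asking the provisioning API to set up a virtual data center for a newly signed customer. The endpoint, payload fields and authentication scheme are all assumptions, not documented vCIM behavior.

```python
import json
import urllib.request

# Entirely hypothetical endpoint and payload -- the real vCIM API may differ.
VCIM_URL = "https://vcim.example.com/api/provisioning/requests"

order = {
    "customer_id": "CRM-10042",      # identifier pulled from the provider's CRM
    "plan": "small-vdc",             # catalog entry mapping to vCPU/RAM/storage
    "billing_account": "ACCT-7781",  # account the usage should be charged to
}

request = urllib.request.Request(
    VCIM_URL,
    data=json.dumps(order).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <token>"},
    method="POST",
)
# urllib.request.urlopen(request)  # not called here: the endpoint is fictional
```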

“[vCIM] is taking a provisioning request and automatically generating all that’s required to configure a new virtual data center,” Mathew Lodge, senior director of cloud services at VMware, said.

Meanwhile, the company claims to be making headway in the growing cloud marketplace, with more than 90 services providers now offering “vCloud Powered” services in some 19 countries. That’s triple the number the company could boast at the end of last year’s third quarter, according to Lodge.

VMware may be on the right path, from one analyst’s viewpoint.

“I believe that in order for VMware to spur more service provider adoption of [its] vCloud Powered stack — and to improve the quality of the service provider implementations that are vCloud Powered — the vCIM component is an important, useful element,” Lydia Leong, research vice president at Gartner, said.

But is it possible that some of VMware’s celebrations may be a bit premature?

“VMware has signed many service providers to [its] vCloud Powered program, but many of those service providers haven’t launched offerings yet,” Leong said. “While VMware-based solutions are getting strong adoption from mid-market and enterprise customers, especially for hosted private cloud solutions, the growth of Amazon Web Services in particular has dwarfed the VMware-virtualized market,” she added.

That’s not to say VMware is too late to come from behind, however.

“We’re early in the adoption cycle still, and VMware’s strong foothold in the internal data center should enable it to drive adoption of service provider clouds based on its technologies,” Leong said.

