The Troposphere


August 28, 2012  4:39 PM

Seven minutes of terror in cloud performance testing

Fernanda Laspe Profile: Fernanda Laspe


Millions of viewers tuned in to NASA’s website to watch live streaming coverage of its ‘Curiosity’ rover landing on the surface of Mars earlier this month, and though it all went off without a hitch, a server outage or a website blip could have done some serious damage to NASA’s reputation.

It was an ambitious project to say the least, and NASA knew its site would be hit with possibly the highest volume of traffic in its history during those seven nail-biting minutes. So how did it ensure everything ran smoothly with so much at stake? The space agency turned to SOASTA’s cloud testing software.

The NASA and SOASTA collaboration came about as a referral, of sorts, from folks at Amazon Web Services (AWS), a SOASTA technology partner. And with an already hefty bill of $25 million riding on the project, NASA wanted an audience and wanted to guarantee that audience saw an uninterrupted stream of the landing.

Often, a company’s reputation and the contents of its wallet are at stake.

“When Knight Capital crashed, it caused them to lose $16 million per minute just because they were down,” said Tom Lounibos, CEO of SOASTA. “If Twitter is down, it costs advertisers $25 million per minute.”

It really is about anticipating failure — imagining worst-case scenarios — so that when the actual moment comes, companies are ready to face adversity and deal with it. SOASTA used its predictive analysis software, GlobalTest, to imitate traffic conditions on NASA’s website three days before the Curiosity rover landing.

Predictive analysis allows you to understand when something could fail and why it might happen. “We are in the business of adding more intelligence to the process,” Lounibos said. “We go through a lot of what-if situations with predictive analysis.”

Some what-if scenarios in the NASA project consisted of load testing to help understand what might happen if there were an unexpected spike in traffic, or when back-end services required more capacity. By running simulations and observing the data, SOASTA can predict the effects on the infrastructure, the Web application and the database, so that companies can optimize a website or application to accommodate these changes.
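To make those what-if scenarios concrete, here is a minimal load-spike sketch in Python. It is a generic illustration only, not SOASTA’s software or the actual NASA test; the target URL and concurrency levels are placeholders.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "https://example.com/"  # placeholder; point at a staging copy of the site


def timed_get(_):
    """Fetch the page once and return the latency in seconds, or None on failure."""
    start = time.monotonic()
    try:
        with urlopen(TARGET, timeout=10) as resp:
            resp.read()
        return time.monotonic() - start
    except OSError:
        return None  # timeouts and connection errors count as failures


# Step up concurrency to imitate an unexpected spike in traffic.
for concurrency in (10, 50, 200):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(timed_get, range(concurrency)))
    ok = [r for r in results if r is not None]
    if ok:
        print(f"{concurrency:>4} clients: {len(ok)} succeeded, worst latency {max(ok):.2f}s")
    else:
        print(f"{concurrency:>4} clients: all requests failed")
```

Watching how the failure count and worst-case latency grow as concurrency rises is the simplest version of the capacity question NASA needed answered.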

NASA’s biggest issue was it could not predict how many people were going to watch the landing, Lounibos said. “We were able to help predict how much server capacity NASA would need,” he added.

SOASTA also helped NASA prepare for a failure scenario by simulating an outage on a portion of Web servers and proving that failover plans were indeed effective.
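Here is an equally hedged toy version of that failover drill: knock one server out of a round-robin pool mid-run and confirm the survivors absorb its share of the traffic. The server names and request counts are invented for the example.

```python
from itertools import cycle

# True = healthy; names and counts are invented for the example
pool = {"web-1": True, "web-2": True, "web-3": True}
served = {name: 0 for name in pool}
rotation = cycle(pool)


def route():
    """Round-robin across healthy servers, the way a load balancer's health check would."""
    for _ in range(len(pool)):
        server = next(rotation)
        if pool[server]:
            served[server] += 1
            return server
    raise RuntimeError("total outage: no healthy servers left")


for i in range(300):
    if i == 100:
        pool["web-2"] = False  # simulated outage a third of the way through the run
    route()

print(served)  # web-1 and web-3 should have absorbed web-2's share
```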

“When you’re streaming for millions of people you can’t afford to have failure because there is only one first,” Lounibos concluded.

Fernanda Laspe is the editorial assistant for SearchCloudComputing.com.

June 5, 2012  7:04 PM

Behind the curtain of Microsoft’s Azure song and dance

Stuart Johnston Profile: Stuart Johnston


Windows Azure customers anxious to learn what Microsoft has been hiding behind its back can finally exhale later this week in San Francisco.


One key piece of the Azure update is support for what Microsoft calls “Persistent Virtual Machine (VM) Roles,” which will let Windows Azure customers run legacy applications in VMs. That includes running Linux, sources said.


Another capability is a Web hosting framework codenamed “Antares” that will provide a fine-grained Web app-hosting service aimed at customers who don’t see Azure as an economical platform for webpage hosting.


But will Microsoft be able to deliver those features sooner rather than later? Not in a single iteration, one source said. Instead of pulling off the “all singing, all dancing” vision Microsoft would like to promise, the company will more likely need at least two iterations to achieve the basics.


Of course, now that the Windows 8 Release Preview is available there is sure to be a Windows Azure demo on tablets and mobile devices at the event.


Another key trend to watch for, sources said, is an increased focus on hybrid clouds.


Over the short to mid-term, Microsoft aims to achieve “write once, run anywhere” capabilities for Windows Azure, to borrow the Java slogan. Customers want to be able to run their applications in the data center, in the cloud, or as a hybrid of the two, interchangeably. And they want to be able to do so without rewriting any code or worrying about vendor lock-in.


The best way to do that seems simple enough — run applications on the same API on both platforms — Windows Azure and Windows Server 2012. That might not be as easy as it sounds, though.



Windows Azure numbers lower than Amazon’s

Just as important as what Microsoft says, however, is what it doesn’t say. That may be telling when it comes to judging the relative veracity and importance of the plans and promises made at the Meet Windows Azure event, which will be streamed.


Microsoft has been notably quiet about Windows Azure’s status for more than a year. That may be because sales of Windows Azure have been disappointing to date. Windows Azure has garnered fewer than 100,000 customers so far, according to the research firm Directions on Microsoft, based in Kirkland, Wash.


That’s considerably lower than industry estimates for market leader Amazon Web Services.


In some respects, it’s the same struggle Microsoft has gone through before. How can the company and its products remain relevant in a computing universe that is constantly changing?


The event will likely resemble many previous Microsoft marketing splashes, with system integrators, application developers, resellers and other partners lined up to show solidarity for the company’s strategy du jour.


Again, when Thursday rolls around, remember to listen closely for what doesn’t get said as well as what does.


Stuart J. Johnston is Senior News Writer for SearchCloudComputing.com. Contact him at sjohnston@techtarget.com.


May 9, 2012  12:53 PM

Is Microsoft jettisoning Azure name?

Stuart Johnston Profile: Stuart Johnston

“If it is true, it’s pants-on-head retarded.”

That’s how Tier1 Research analyst Carl Brooks described reports this week that Microsoft will drop “Azure” from the branding of its public cloud offering.

“Azure is a dynamite brand — it’s almost a byword, like Amazon is, for a certain kind of cloud infrastructure, and in a very positive way,” Brooks said. “They’d be nuts to drop it and I’m hard pressed to understand any potential benefit.”

As it turns out, Brooks was right; Microsoft isn’t that irrational — although sometimes it might seem that way. The confusion began when a popular tech blog got wind that the software titan had sent out an email to Azure subscribers advising them that it’s cutting “Azure” from the names of a bunch of Azure services.

“In the coming weeks, we will update the Windows Azure Service names,” the message said. “These are only name changes: Your prices for Windows Azure are not impacted,” according to the email quoted in the blog post.

What had occurred, however, was less than meets the eye. The changes are to Azure’s “billing portal,” another tech blog revealed, and don’t affect the overall naming of Azure services.

After several hours of silence, Microsoft did finally issue an official clarification. “Microsoft continues to invest in the Windows Azure brand and we are committed to delivering an open and flexible cloud platform that enables customers to take advantage of the cloud. The brand is not going away.”

That’s a good thing. “It would be like dropping ‘Exchange’ in favor of ‘Microsoft Email Server’,” Brooks added, calling the excitement “a tempest in a teapot.”


March 8, 2012  8:11 PM

Yup, your cloud hunch was right

Michelle Boisvert Profile: Michelle Boisvert

Everything you’ve read about who is using cloud computing and why is pretty much true, so says at least one industry study.

According to a recent Cloud Industry Forum survey of 400 public and private companies of varying sizes, flexibility is the number one reason U.S. companies adopted the technology in 2011. Cost savings eked out second place.

Of the 31% of respondents who listed flexibility as the top reason for adopting cloud computing services, the majority were SMBs, ranging from tiny companies with up to 20 employees to those with 100 to 999 employees (40% and 41%, respectively). Such companies tend to have limited in-house technical resources, and cloud offers self-service capabilities, on-demand scalability and the ability to quickly launch new services that might otherwise be delayed or pushed to the back burner completely.

Big companies with more than 5,000 employees (28% of respondents), on the other hand, looked to cloud services primarily to save money. And now the tables have turned slightly on who’s driving cloud services adoption. When cloud computing first started to catch on, business users were waving their flags for all things cloud. But once IT bigwigs — CTOs and CIOs — caught wind of cloud’s potential cost-cutting benefits, they started pushing for it too, according to Andy Burton, chairman of the Cloud Industry Forum (CIF) and CEO of Rise.

The ability to use cloud technology to launch a completely new service was a draw for 22% of respondents, while only 8% turned to cloud either to offset a lack of internal IT or because it was seen as a low-cost option.
Cloud = happiness for most adopters
Companies that jumped into cloud in 2011 must be seeing its benefits; 94% of respondents who adopted cloud have plans to expand cloud services in the next 12 months, according to CIF. The targeted apps? Email, asset management and security. Email and data storage applications will see the biggest push to the cloud in the next year, at 50% and 45%, respectively.

Burton said really big companies have moved resource workloads such as storage to the cloud because they know they can save money there. Smaller companies stick with simple apps like email. 
Once the warm and fuzzies pass, cloud concerns set in
Setting aside their love for cloud technology, plenty of IT pros are still nervous about entrusting their data to others. Top worries were data privacy and data security (56% and 53%, respectively). But these apprehensions will only cause companies to hesitate on adoption, not dismiss the idea completely.

“This may limit what companies put into the cloud and it will slow adoption rates,” Burton said. “People still have a tendency to want to know where their data resides.”

U.S. companies have made the boldest moves to the cloud, with adoption rates of 76% of those surveyed versus 53% of U.K. respondents. That may have much to do with EU data privacy laws that give end users the right to anonymity: a service provider has to give users the ability to remove content, and cloud services providers can’t guarantee that yet.

One surprise: in the U.S. cloud market, the largest companies are the least concerned about this. According to the study, those least comfortable with privacy issues in the cloud are small private companies and public organizations.

Rise, the channel division of Fasthosts Internet Group with headquarters in the U.S. and U.K., was the sponsor of CIF’s “USA Cloud Adoption & Trends in 2012” survey.


February 16, 2012  2:30 AM

Making cloud viral in your enterprise

Michelle Boisvert Profile: Michelle Boisvert

IT teams understand the cloud model and are trying to realize its economic benefits.

But what really drives cloud computing is end users’ expectation of access to everything, all of the time, according to Geva Perry, author of the blog Thinking Out Cloud, speaking at the Cloud Connect conference in Santa Clara, Calif., this week.

The consumerization and democratization of IT, along with the trend of “millennial entitlement” (a younger end-user base that expects everything to just work and to be connected and accessible from anywhere), make the cloud more relevant than ever, Perry said.

“Cloud is on-demand, it’s there, it has low upfront costs and that makes it easy for folks to adopt it,” Perry said. He claims enterprise IT has warmed to cloud as well, as IT pros find ways to make it work by minimizing friction, creating self-service and building and designing products in a way that encourages use.

After cloud makes its way into the enterprise, how can IT teams keep applications running seamlessly while still protecting consumers and end users? Plan, test and prepare for the worst.

Bill Gillis, director of eHealth Technologies at Beth Israel Deaconess in Boston, relies on virtual patching. “Our website [BIDMC.org] is attacked every 10 seconds, 24 hours a day,” said Gillis. And those attacks are only increasing. The health care provider relies on Trend Micro’s Deep Security app to secure its cloud, which includes a network of 1,500 physicians.

And as Beth Israel Deaconess grows to include more physician networks — and it will, as it expects to reach 500 practices by the end of this year — Gillis plans to move to a mix of public and private clouds as well as virtual desktops to help control end points. “So we will just basically provide a URL to our physicians and it’s full virtualization.”

Don’t fear a cloud failure, prepare for it

The need for cloud managers to prepare was advice echoed all day at the conference. “Complexity always increases. Latent defects accumulate and will cause crazy failures to happen,” said Jesse Robbins, cofounder of Opscode.

Sure, outages happen. Robbins’ advice? Adopt resilience engineering, a practice often used in industries such as aviation, space transportation, health care and manufacturing, in which IT failures could be catastrophic to human life. The first step is to “automate all the things.”

By running the cloud as automated as possible, IT staff can quickly see where failures will occur. Involve all departments in testing and load balancing. Gone are the days when IT simply threw things over the wall for testing. The DevOps culture is here, and it has its benefits in the cloud.

Only after all teams are on board can cloud admins focus on reliability, specifically mean time to fail (MTTF) and not just mean time to recover (MTTR). Remember, failures will happen eventually. “Automate all the things, test what you do and press the buttons,” Robbins concluded.
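For readers who want the MTTF/MTTR distinction concrete, here is a small worked example in Python; the incident timestamps are made up for illustration.

```python
from datetime import datetime as dt

# (failure start, recovery) pairs for one hypothetical service
incidents = [
    (dt(2012, 1, 3, 2, 10), dt(2012, 1, 3, 2, 40)),
    (dt(2012, 1, 19, 14, 0), dt(2012, 1, 19, 14, 12)),
    (dt(2012, 2, 7, 23, 30), dt(2012, 2, 8, 0, 45)),
]

# Mean time to recover: average outage duration
mttr = sum((end - start).total_seconds() for start, end in incidents) / len(incidents)

# Mean time to fail: average uptime between one recovery and the next failure
uptimes = [
    (incidents[i + 1][0] - incidents[i][1]).total_seconds()
    for i in range(len(incidents) - 1)
]
mttf = sum(uptimes) / len(uptimes)

print(f"MTTR: {mttr / 60:.0f} minutes, MTTF: {mttf / 3600:.0f} hours")
```

Pushing MTTF up means outages come less often; pushing MTTR down means each one hurts less. Robbins’ point is that a resilient shop tracks both.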


February 9, 2012  6:30 PM

VMware claims three-fold jump in ‘vCloud Powered’ clouds

Stuart Johnston Profile: Stuart Johnston

VMware has dominated the server virtualization marketplace since the early days — so why is it still so far behind in the cloud computing space?

In recent years, VMware has been pushing to stay even with other cloud competitors, with the release of products such as vCloud Director. In fact, the virtualization giant recently bragged about its burgeoning presence in cloud land. But how much is hype versus potential?

Tuesday, the company took another incremental step toward a more cohesive cloud strategy when it announced that this quarter it will ship vCloud Integration Manager (vCIM) — a toolset that enables third-party cloud resellers to self-provision cloud services for their customers without manual processes or intervention from VMware techs.

The idea is to cut the time and hassle required to configure, deliver and manage vCloud Director-based clouds for services and applications vendors, providing quicker monetization for a key segment of the cloud market. VMware vCIM will integrate with other VMware components, including vCloud Director and vSphere, as well as vShield Edge and vCenter Chargeback Manager.

Additionally, vCIM will provide a REST-based application programming interface (API) that ties into the service provider’s back office systems, including CRM and billing.
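As a sketch of what such an integration call might look like in practice, the Python snippet below posts a provisioning request over REST. The endpoint, payload fields and token are hypothetical placeholders, not vCIM’s documented API.

```python
import json
from urllib import request

# Hypothetical payload tying a CRM customer record to a new virtual data center
payload = {
    "customer_id": "acme-42",   # from the provider's CRM
    "template": "small-vdc",    # virtual data center sizing template
    "billing_plan": "monthly",  # hook for the billing system
}

req = request.Request(
    "https://provider.example.com/api/provision",  # placeholder URL
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",  # placeholder credential
    },
    method="POST",
)

with request.urlopen(req) as resp:
    print(json.load(resp))  # e.g., an order ID to track the new virtual data center
```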

“[vCIM] is taking a provisioning request and automatically generating all that’s required to configure a new virtual data center,” Mathew Lodge, senior director of cloud services at VMware, said.

Meanwhile, the company claims to be making headway in the growing cloud marketplace, with more than 90 service providers now offering “vCloud Powered” services in some 19 countries. That’s triple the number the company could boast at the end of last year’s third quarter, according to Lodge.

VMware may be on the right path, from one analyst’s viewpoint.

“I believe that in order for VMware to spur more service provider adoption of [its] vCloud Powered stack — and to improve the quality of the service provider implementations that are vCloud Powered — the vCIM component is an important, useful element,” Lydia Leong, research vice president at Gartner, said.

But is it possible that some of VMware’s celebrations may be a bit premature?

“VMware has signed many service providers to [its] vCloud Powered program, but many of those service providers haven’t launched offerings yet,” Leong said. “While VMware-based solutions are getting strong adoption from mid-market and enterprise customers, especially for hosted private cloud solutions, the growth of Amazon Web Services in particular has dwarfed the VMware-virtualized market,” she added.

That’s not to say VMware is too late to come from behind, however.

“We’re early in the adoption cycle still, and VMware’s strong foothold in the internal data center should enable it to drive adoption of service provider clouds based on its technologies,” Leong said.


February 3, 2012  7:30 PM

OpenStack may drop Hyper-V from next release

Stuart Johnston Profile: Stuart Johnston

There’s been some hand-wringing since late January, when the open source cloud platform OpenStack suggested cutting “dead wood” from the pending next release — and that includes Microsoft’s Hyper-V.

Sure, vendors have shown excitement over the future of OpenStack and Hyper-V, but having Microsoft’s virtualization hypervisor in OpenStack doesn’t seem to matter to enterprise IT today, according to one cloud analyst.

The crux is that not many OpenStack shops use Hyper-V at present, and vice versa, said Carl Brooks, analyst for infrastructure services at Tier1 Research.

“It’s a completely minor deal … the hypervisor support isn’t a big deal, functionally. Most users for the foreseeable future are [going to] stick with OVF [Open Virtualization Format] or similar,” he added.

Additionally, users can still run Windows as a guest operating system with OpenStack, said Chad Keck, senior director of sales at cloud-based hosting provider AppFog, who worked on OpenStack. “I don’t know anyone who is using OpenStack that is also leveraging Hyper-V,” Keck added.

That didn’t stop the discussion from getting a little shrill.

In a post to the OpenStack team mailing list, release manager Thierry Carrez described the project’s Hyper-V support as “known broken and unmaintained.”

“It sounds like a good moment to consider removing deprecated, known-buggy-and-unmaintained or useless feature code from the Essex tree,” Carrez’s post continued.

Microsoft said, however, it’s not giving up on support for the project and stressed its commitment to resolve current issues with Hyper-V and OpenStack.

Even if Microsoft drops the ball, there is little reason to worry, Brooks noted.

“By the time OpenStack is ready for prime time, it’ll probably support Hyper-V again,” he added. “If not, it will happen in a twinkling of an eye as soon as someone finds a good reason.”

Beth Pariseau also contributed to this article.



January 27, 2012  9:17 PM

Does Microsoft have the urge to merge Azure and System Center platforms?

Ed Scannell Profile: Ed Scannell

Laying out its Microsoft Product Roadmap for 2012 this week, an analyst at market researcher Directions on Microsoft said he believes the company will bring the System Center management suite and Windows Azure closer together over the next few years, to the point where the two will likely merge into a single platform.

Evidence of this tighter relationship can be seen in the upcoming System Center 2012 suite, due in early spring, which has new features supporting a number of capabilities in Azure. System Center and Azure won’t be the only two getting cozier: Microsoft will also enrich Windows Server to work more hand-in-glove with Azure, said Rob Helm, managing vice president of Directions on Microsoft.

“System Center will continue its reach toward Windows Azure, with Virtual Machine Manager (contained in System Center 2012) already gaining the ability to manage some Azure resources. I think Windows Server will also gain the ability to run Azure’s unique services for things like storage and authentication. This way, if something deployed [on Azure] is not working out or there are security concerns, users can bring them over to Windows Server,” Helm said.

Continuing on what he sees for Azure in 2012, Helm said the cloud platform will receive two important updates this year – updates he originally expected in 2011 – that will make it more compatible with Windows Server and let customers deploy applications with significantly lower upfront costs. The first will be the VM roles feature, which will allow the platform to run Hyper-V virtual machines.

The second will be the delivery of Application Virtualization, better known as Server App-V, which will allow Azure to run Windows Server components it can’t today, making it easier to bring up server applications, Helm said. He added that in the second half of this year, Microsoft itself will put server-based apps on Azure, namely some of its Dynamics applications such as Dynamics NAV.

As Azure gains the ability to host virtual machines, Helm predicts it will generally function as an Infrastructure as a Service (IaaS) offering, not just as a Platform as a Service (PaaS). This evolution will bring it more directly into competition with Amazon Web Services.

“I think you will gradually see Amazon Web Services and Azure converge in terms of their capabilities,” Helm said.

Let us know what you think about this story; email Ed Scannell at escannell@techtarget.com.


January 20, 2012  5:22 PM

Is the cloud Linux country?

Alex Barrett Profile: Alex Barrett

VMware and virtualization changed the face of enterprise IT. And cloud computing — in some form or another — promises to do the same.

What shape will the cloud take? It’s still too early to say for sure, but my gut tells me the cloud will be inextricably linked with Linux-inspired tools, applications and operational philosophies.

The Web 2.0 and cloud set is dominated by mainstays of the Linux ecosystem: programming languages (Ruby and Python), operating system-provisioning tools (Cobbler and Foreman), configuration management and automation frameworks (Puppet and Chef) and monitoring suites (Nagios and Zabbix). Linux folks, who lament Windows’ cost, security and lack of programmability, also dominate the emerging DevOps movement.

In a roundabout way, a new Linux Foundation survey confirms my suspicions: New instances of Linux — and that has to describe anything remotely cloud-like — are overwhelmingly going toward new applications. In the past two years, the survey found, 71.6% of new Linux deployments went to brand-new applications and greenfield deployments, versus 38.5% and 34.5% of new Linux instances derived from Windows and Unix migrations, respectively (the figures overlap because respondents could cite more than one category). It’s hard to change horses midstream, but less so when you’re still on the riverbank.

What kinds of new workloads are IT shops deploying on Linux? Big data, for one. Organizations that plan to add servers to support big data workloads will use Linux over Windows by a two-to-one margin (71.8% vs. 35.9%). Given big data’s open source and Linux heritage, that’s not entirely surprising, but it’s still quite telling.

Meanwhile, in the short term, the big names in cloud are hedging their bets.

Amazon, for example, recently extended its Amazon Web Services Free Usage Tier to Windows Server 2003 R2, 2008 and 2008 R2, providing developers up to 750 hours of testing time per month (enough to keep one instance running around the clock, since even a 31-day month has only 744 hours) for up to one year. The service was previously limited to Linux Amazon Machine Images, and it should be a boon to enterprise developers testing multi-tier apps that run on mixed platforms.

But at the same time, Microsoft itself is set to begin offering Linux instances on Azure, making it possible to move existing Linux apps to Redmond’s Platform as a Service (PaaS) rather than rebuilding them from scratch. I would have loved to have been a fly on the wall in that meeting.

Of course, Windows still dominates the data center. In the third quarter of 2011, Windows servers represented 49.7% of all factory revenue, compared to 18.6% for Linux servers, according to the IDC Worldwide Quarterly Server Tracker. But Linux server growth outpaced that of Windows by a healthy margin, 12.3% compared to 5.3% for Windows. Linux won’t overtake Windows anytime soon, but with cloud on the horizon, the wind is at its back.


January 9, 2012  8:41 PM

Windows Azure cloud to embrace Linux OS

Stuart Johnston Profile: Stuart Johnston

Microsoft often has been seen as opposed to any operating system that isn’t Windows — particularly Linux. However, Redmond has been changing its attitude, in some cases even going out of its way to make room at the table for the open source OS.

In fact, if recent rumors are borne out, the company will soon add Linux to the list of OSes that the Windows Azure public cloud platform supports.

According to reports from Microsoft watcher Mary Jo Foley, Microsoft is adding support for Linux in addition to Windows Server in Windows Azure’s so-called Virtual Machine (VM) role, along with other upcoming changes to its Windows Azure public cloud offering.

It will do that in part to meet the demands of larger customers, who have apparently been leaning on the company over the fact that heterogeneous data centers are the rule, not the exception. Linux is a fact of life, not something to be ignored, even in the cloud.

Additionally, and perhaps a little ironically, Azure does not support several key Microsoft applications, including SharePoint Server, SQL Server, Small Business Server and Terminal Server.

The VM role has been in beta for months. It provides an easy and quick way to move an application onto Azure by simply loading it as a Virtual Hard Disk (VHD) image into a VM role. Microsoft points to the VM role as a way to run legacy applications on Azure.

However, the VM role doesn’t currently persist application state, nor does it support Linux.

Microsoft architects had apparently expected customers to build their applications on Azure’s Platform as a Service (PaaS) APIs. But writing apps from scratch is more work than running them in VMs.

“If Microsoft makes VMs stateless and even lets Linux VMs load, it would address some of [its] issues with Amazon [and other PaaS purveyors],” said Rob Sanfilippo, research vice president at analyst firm Directions on Microsoft.

If this is true, the move could help Microsoft’s public cloud story with enterprise IT.

“It’s the first non-Windows server supported by Azure [and] it broadens their offering …. If you really want to get the most out of Azure, a lot of organizations really just want to move their applications to the cloud,” Sanfilippo added.

The updated VM role capability with support for Linux and preserving application state is set to go into community technology preview, or CTP, in late March, said Foley.

Microsoft declined to comment on pending Azure futures and has not made any announcements regarding hosting Linux on Azure.

