The Troposphere


February 27, 2013  7:39 PM

Rackspace puts Object Rocket in its pocket

Beth Pariseau

Rackspace is jockeying for position aboard the advanced infrastructure services bandwagon with its acquisition today of Object Rocket, a MongoDB Database as a Service provider.

Rackspace is hardly the first to offer Database as a Service (DBaaS) or even MongoDB as a service — other vendors, such as Amazon Web Services (AWS) and SoftLayer, have beaten it to the punch. But the newly combined companies claim the Object Rocket service is faster and offers more predictable performance than competitors.

This notion is further detailed in a blog post by Rackspace’s DevOps team that features performance benchmarks; the post claims that Object Rocket’s service showed more consistent throughput and consistently low latency compared with two other MongoDB services running on — you guessed it — Amazon.

“It’s basically become a standard see-saw,” is how Carl Brooks, an analyst at 451 Group, summed things up today. “AWS launches a crazy, interesting new service, customers try it out, find the performance is non-deterministic, another provider says, ‘Hey, I can do that a jillion times better on my iron!’ and the world continues to revolve.”

For those not in the know, MongoDB is among the most rapidly growing NoSQL databases, which are mainly used in large Web applications that require rapid access to objects or documents that are delivered whole, rather than assembled out of pieces located in the rows and columns of a traditional relational database. It’s used by eBay, Disney, FourSquare and other household names, though none of those blue chips are Object Rocket customers, specifically.
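To make the document-versus-relational distinction concrete, here is a deliberately simplified Python sketch; a plain dict stands in for the document store rather than a real MongoDB client, and the table and field names are invented for illustration:

```python
import sqlite3

# Relational model: the "user" record is assembled at query time from
# rows in two tables via a join.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE checkins (user_id INTEGER, venue TEXT);
    INSERT INTO users VALUES (1, 'ada');
    INSERT INTO checkins VALUES (1, 'cafe'), (1, 'library');
""")
rows = db.execute(
    "SELECT u.name, c.venue FROM users u JOIN checkins c ON c.user_id = u.id"
).fetchall()

# Document model: the same record is stored and retrieved whole, in one
# read, which is roughly how a MongoDB client hands back a document.
documents = {
    1: {"name": "ada", "checkins": [{"venue": "cafe"}, {"venue": "library"}]}
}
doc = documents[1]  # one lookup returns the complete object
```

The join has to touch two tables to reconstruct the record, while the document lookup returns it in a single read; for large Web apps serving whole objects, that is the appeal.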

It’s unclear so far how Rackspace’s customer base will respond. A few calls I put in today to IT pros who run on Rackspace produced long pauses and a disinclination to comment on the new offering until more is known about it.

Could this acquisition be filling in a corner case, rather than appealing to the meat-and-potatoes enterprises among Rackspace’s customers? Time, of course, will tell…

February 25, 2013  7:12 PM

Bare-metal approach may ease cautious enterprises into cloud

Ed Scannell

The idea of migrating existing workloads to public clouds got a bit of a boost earlier this month when Racemi disclosed its migration software would support SoftLayer’s popular CloudLayer platform. Racemi sweetened the deal with a time-limited offer of $99 per migration.

Rather than cost savings, the more important aspect of the deal is SoftLayer’s “bare-metal cloud” approach, which lets customers customize their hardware infrastructure, from processors to storage to high-speed networking. What also enhances the company’s offering is that CloudLayer doesn’t require any lengthy contractual commitment from customers, who pay only for the resources they need.

IT shops commonly voice objections about moving workloads to the public cloud for several reasons — from security risks to unknown ROI factors to the paranoia of moving their mission-critical data outside their four walls. These fears often turn into inertia, delaying decisions to move to the public cloud indefinitely.

SoftLayer’s approach could give some customers more confidence to move forward by providing more control over which hardware infrastructure to choose and how much they want to pay for it.

The bare-metal cloud approach takes the hypervisor out of the mix, which can increase the raw processing power of customers’ hardware infrastructure. It may help not only those suffering from inertia about moving to public clouds, but also give incentive to the growing number of shops with existing cloud implementations that are thinking about handling “big data” and large databases there.

CloudLayer is made up of virtual servers, remote storage and a content delivery network. Each CloudLayer service can work in standalone mode or be integrated with a number of dedicated servers and automated services using one private network and management system.

“We think it can be this easy to create and control a hybrid computing environment, for instance, that is interoperable,” said Marc Jones, vice president of product innovation at SoftLayer.

What might also boost user confidence in this approach is that SoftLayer, one of the largest privately held cloud infrastructure providers with 13 data centers, every once in a while picks off a few Amazon Web Services customers.

“We don’t aggressively try to steal Amazon customers, but we do pick some up, particularly those that have performance issues where they need high disk I/O and higher network speeds,” Jones said.

For its part, Racemi updated its Cloud Path Software as a Service (SaaS) offering and DynaCenter on-premises software to support physical and virtual server migrations to CloudLayer. Customers can also automatically migrate cloud instances from other cloud providers to CloudLayer.


December 12, 2012  7:24 PM

HP Cloud Compute undercuts Amazon, too

Beth Pariseau

It turns out that a headline I wrote for SearchCloudComputing.com last week about the new HP Cloud Compute service was only half right.

The headline calls out that HP has undercut Rackspace with its per-hour pricing of 4 cents (Rackspace’s on-demand offering is priced at 6 cents per hour). But as pointed out today by @stackgeek on Twitter, that headline, as well as these paragraphs in the story:

Prices for Hewlett-Packard’s (HP) Cloud Compute service start at four cents per hour for a small instance with 1 GB of RAM – 2 cents lower than the price for Rackspace’s 1 GB instance. Amazon Web Services’ (AWS) Reserved Small Instance, which comes with 1.7 GB of memory, costs 3.9 cents per hour.

 While AWS still offers the best deal, HP’s pricing for the fledgling service might attract Rackspace customers.

…are not totally accurate.

I re-checked the Amazon pricing page, and @stackgeek is correct in pointing out the $69 one-time upfront cost for a small, lightly utilized one-year Reserved Instance. While the 3.9-cent hourly cost without this fee would make Amazon cheapest on a yearly basis, at $341.64, the fee makes Amazon’s cost per year $410.64. HP’s cost per year at 4 cents an hour is $350.40 and Rackspace’s price per year at 6 cents an hour is $525.60.

@stackgeek is also correct in pointing out that the original comparison was between Amazon’s Reserved Instance and HP and Rackspace’s on-demand instances. The pricing for an On-Demand Instance on Amazon is 6.5 cents per hour, which is half a cent more expensive than Rackspace. That makes the pricing for an Amazon On-Demand instance $47.45 a month, or $569.40 per year.
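Since the correction comes down to arithmetic, a short Python sketch (using the hourly rates and the one-time upfront fee as published at the time) reproduces the yearly figures:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 billable hours in a year

# (hourly rate in dollars, one-time upfront fee) per offering
offers = {
    "AWS Reserved (1-yr, light)": (0.039, 69.00),
    "AWS On-Demand":              (0.065, 0.00),
    "HP Cloud Compute":           (0.040, 0.00),
    "Rackspace Cloud Servers":    (0.060, 0.00),
}

for name, (hourly, upfront) in offers.items():
    yearly = hourly * HOURS_PER_YEAR + upfront
    print(f"{name}: ${yearly:,.2f} per year")
# With the $69 fee included, HP comes out cheapest on a yearly basis and
# AWS On-Demand is the most expensive of the four.
```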

So, HP’s new service actually undercuts both Rackspace and Amazon on price. And Amazon Web Services is more expensive than Rackspace’s Cloud Servers as well.

I regret the errors.


December 4, 2012  3:49 PM

Cloud services: If not now, when?

Margie Semilof

 

The notion of using cloud-based services still terrifies enterprise IT pros, even though such services have advanced in both quality and variety for years. IT pros remain frozen by the specter of losing control of data, security breaches and random service outages. Some of these reasons may be losing validity.

In stark contrast to that fact, however, was the smashing success Amazon Web Services (AWS) had last week with re:Invent, its first conference. In the six years since AWS launched, cloud services have been increasingly embraced by start-ups and media delivery companies, along with a slew of forward-thinking developers.

Amazon executives, such as CTO Werner Vogels, Senior Vice President Andy Jassy and founder Jeff Bezos, brought their value proposition to the masses in person. We’ve heard their case for cloud services before: You can turn capital expense into variable expense, pay less for it, stop guessing at your capacity needs and have apps set up in minutes.

They hammered on traditional IT vendors, claiming the economics of AWS are disruptive to the HPs, Dells, Microsofts and Oracles of the world. Amazon is a high-volume, low-margin business, and that’s not a game the old guard can play, they said. They may have a point. With the possible exception of IBM, most of the longtime stalwarts of the industry — many so reliant on hardware and more traditional services — have yet to present a compelling cloud services strategy that would make their largest customers disregard AWS.

For instance, you want security? AWS has all the standard security certifications, and the company can help implement them with a bigger and better team than you have.

But still, many IT shops have the same reasons for not moving forward. They are risk averse, they don’t throw things out, legacy apps are hard to move and there are few examples of traditional enterprises that have made the leap. Amazon trots out its old standby, Netflix, along with more recent enterprise customers, including NASDAQ, NASA, the United States Tennis Association, McGraw-Hill and Novartis, to name a few. Comcast, for instance, has been reinventing parts of its business using AWS, spending two years quietly refashioning its media delivery network.

Real enterprises still look for accounts they can relate to. At the conference, Amazon touted its prized new public customer Pinterest — not exactly a revenue-generating machine that supports a legacy back end.

And try to dig up a customer reference on the exhibit floor. One energetic marketing manager brightened when I asked for a name. He offered up Grindr.com. If you don’t know who they are, look them up. Hint: It’s not a competitor to the Subway chain of sandwich shops.

Not too far into the future, enterprises will absorb the cultural changes associated with cloud adoption. Amazon with AWS will be a winner, and some of its competitors will win as well. In the 1990s, we had the likes of Gates and Ballmer, Ellison and others, armed with “kill the competitor” strategies that generated high margins. This is a different game. For example, at re:Invent, AWS followed Google in dropping the price of its cloud storage by 25%. The next morning, someone from Hitachi told me his customers asked, “Why does storage have to cost so much?” Is it getting hot in here or what?

In data-driven 21st century architectures, apps and processes must be automated as the business shifts, because IT shops will be determined not to get stuck with hardware — and software — limitations. Security is integrated from the ground up. There are new attitudes. Failure is not an option? Forget that. AWS CTO Vogels says to regard failure as normal. Failure is always around the corner. Embrace it. It’s the new black.

The main thing is that if you don’t constrain yourself up front, you will build more successful architectures. It will take more time, which is fine, because this team, like another team from Seattle that forged new ground 30 years ago, takes the long view. Amazon, and its competitors, look increasingly inevitable.

Margie Semilof is editorial director of the Data Center & Virtualization media group.


November 6, 2012  7:41 PM

Cloud-based collaboration replaces Office Space mentality of enterprise

Caitlin White

 

The modern office no longer looks like Office Space, with a staff of office drones tied to their cubicle desks, working from an office-provided desktop every day.  In a world of iPads, Blackberries, Androids, iPhones and laptops, employees are accessing information from everywhere, giving cloud-based collaboration a clear cue to make its entrance.

While some enterprises say they’re still preparing for the bring-your-own-device (BYOD) era to hit, the truth is it’s already here, whether they’re prepared or not. According to a report by Juniper Research, 150 million people use personal mobile devices for work. That number is set to more than double by 2014.

The rise of the global worker is complemented by a shift toward a services economy, said TJ Keitt, senior analyst at Forrester, a global research and advisory firm based in Cambridge, Mass. Automation that comes from new technologies, such as cloud computing, opens the doors for not only global workers but for the introduction of more creative jobs, such as consulting. And these creative jobs require more communication, collaboration and flexibility in working hours.

“Cloud collaboration is not just about being a different delivery mechanism, it’s about what you’re enabling in your workforce,” said Keitt in a Webinar last week.

A 2012 Forrester survey showed that agility — not cost-savings — was the primary reason companies gave for adopting Software as a Service (SaaS).

TechTarget’s 2012 cloud adoption survey echoed this finding, with 60% of survey respondents using public cloud because it offered increased availability.

Businesses have used collaboration tools primarily for two reasons: to reduce overhead costs and to improve communication among the workforce. Collaboration software lets employees communicate without needing to travel, and it can reduce the number of in-house staff a company needs, both of which cut overhead. Lower costs, plus the ability to more easily dispense and share information, make collaboration tools a boon to many businesses.

And companies can better capitalize on these benefits by moving collaboration to the cloud, Keitt argues.

“Cloud is a natural home for collaboration technology because of the confluence of employee mobility, globalization and innovation networks, which are changing the nature of business,” said Keitt.

Examples of cloud collaboration software making waves in enterprise IT include Google Apps for Business, GoToMeeting by Citrix Systems Inc. and IBM’s SmartCloud for Social Business.

But will enterprises’ hesitance to adopt cloud undermine the benefits of collaboration software?

Despite lingering concerns about security, compliance and vendor lock-in, TechTarget’s survey shows a growing comfort with cloud services: 61% of the 1,500 IT pros surveyed reported they currently use cloud services.

This growing ease with cloud could be good news for enterprises. The rise of the global worker may mean increased access to information for employees, but it could also mean consumers are empowered by information.

In an era when a company’s mistake or a disappointing product could spread through social media like a social disease, the ability to quickly and efficiently communicate with customers could be a solid differentiator. Cloud-based collaboration software could match the changing tides in business, but cloud vendors have to work to overcome persistent qualms about cloud services if they are to make major advances in the enterprise.

Caitlin White is associate site editor for SearchCloudComputing.com. Contact her at cwhite@techtarget.com.


October 18, 2012  12:34 AM

VMware CTO pledges new OpenStack management for vSphere

Beth Pariseau

VMware CTO Steve Herrod spoke to attendees at OpenStack Summit Wednesday.

VMware will work on a buffed-up compute driver for OpenStack’s Nova project which will allow OpenStack to manage advanced features of vSphere, according to VMware CTO Steve Herrod.

This means that despite the direct competition between OpenStack and VMware’s vCloud Director, VMware will allow OpenStack management tools to more easily manage vSphere virtual machines.

It’s a new olive branch extended to a suspicious OpenStack community by newcomer VMware, which has previously made clear that its proprietary cloud management tools will be able to wrap themselves around OpenStack clouds; this is the first time VMware has actively participated in allowing its hypervisor to be subject to management by another cloud platform.

Citing VMware properties including Spring, RabbitMQ, Linux, Hyperic, and Cloud Foundry, and bearing gifts in the form of hundreds of free copies of VMware Fusion, Herrod played up VMware’s open source street cred in a presentation to a skeptical but standing-room-only crowd at OpenStack Summit on Wednesday.

“We are not strictly a closed source company, we’re not strictly an open source company, we’re a blend of both,” he said.

There’s currently a compute driver within Nova, but it’s “pretty dumb,” Herrod said – essentially it allows users to create vSphere VMs and run them.

With a new driver written by VMware will come support for VMware HA and live migration, Herrod said.

According to a later presentation by VMware staff engineer Sean Chen, the new driver will also include the ability to launch OVF disk images, use a VNC console to manage VMs, attach and detach iSCSI volumes, get guest information, conduct host operations, assign VLANs, link VMware with Quantum, and create custom VMware image properties for OpenStack’s Glance image management utility.
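For readers unfamiliar with Nova’s plumbing, a compute driver is essentially a Python class that translates Nova’s generic VM lifecycle calls into hypervisor-specific operations. The skeleton below is a hypothetical, heavily simplified illustration of the shape such a vSphere driver takes; the class, method and session names are invented for this sketch and are not VMware’s actual code:

```python
class VSphereComputeDriver:
    """Hypothetical sketch of a Nova-style compute driver for vSphere.

    Nova invokes generic methods such as spawn(); the driver's job is to
    translate them into vCenter operations. The `session` object here is
    an assumed wrapper around the vCenter API, not a real library.
    """

    def __init__(self, session):
        self._session = session

    def spawn(self, instance, image_ref, network_info):
        # Create a VM from a Glance image reference and power it on.
        vm = self._session.create_vm(instance["name"], image_ref, network_info)
        self._session.power_on(vm)
        return vm

    def live_migration(self, instance, dest_host):
        # The sort of "advanced feature" the richer driver would expose:
        # move the VM to another host without downtime (vMotion-style).
        self._session.vmotion(instance["name"], dest_host)

    def attach_volume(self, instance, iscsi_target):
        # Attach an iSCSI volume, per the feature list above.
        self._session.attach_iscsi(instance["name"], iscsi_target)
```

The “pretty dumb” driver Herrod described covers little more than spawn; the richer methods are what the promised rewrite would add.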

Herrod also hinted that VMware is exploring ways to integrate the Open vSwitch, used by network virtualization subsidiary Nicira, into the vSphere platform, possibly as a replacement for the existing VMware virtual switch.

“We are looking quite seriously at what aspect of the Open vSwitch to merge and have interoperating in vSphere environments,” he said.

Attendees at the conference weren’t necessarily about to fall into VMware’s outstretched arms, though Herrod’s presentation piqued their interest somewhat.

One VMware user from a communications company in Texas said he has yet to decide whether to use a vCloud or OpenStack environment to give developers access to virtual machines.

“There’s more than one way to skin this cat,” he said.

Another attendee working for a major service provider mused that OpenStack, with its Quantum network virtualization features, might allow for better portability of vSphere VMs between private and public clouds.


August 28, 2012  4:39 PM

Seven minutes of terror in cloud performance testing

Fernanda Laspe

Millions of viewers tuned in to NASA’s website this month to watch live streamed coverage of its ‘Curiosity’ rover landing on the surface of Mars. Though it all went off without a hitch, a server outage or a website blip could have done serious damage to NASA’s reputation.

It was an ambitious project, to say the least, and NASA knew its site would be hit with possibly its highest-ever website traffic during those seven nail-biting minutes. So how did it ensure everything ran smoothly with so much at stake? The space agency turned to SOASTA’s cloud testing software.

The NASA and SOASTA collaboration came about as a referral, of sorts, from folks at Amazon Web Services (AWS), a SOASTA technology partner. And with an already hefty bill of $25 million riding on the project, NASA wanted an audience and wanted to guarantee that audience saw an uninterrupted stream of the landing.

Often, a company’s reputation and the contents of its wallet are at stake.

“When Knight Capital crashed, it caused them to lose $16 million per minute just because they were down,” said Tom Lounibos, CEO of SOASTA. “If Twitter is down, it costs advertisers $25 million per minute.”

It really is about anticipating failure — imagining worst-case scenarios — so that when the actual moment comes, companies are ready to face adversity and deal with it. SOASTA used its predictive analysis software, GlobalTest, to imitate traffic conditions on NASA’s website three days before the Curiosity rover landing.

Predictive analysis lets you understand when something could fail and why. “We are in the business of adding more intelligence to the process,” Lounibos said. “We go through a lot of what-if situations with predictive analysis.”

Some what-if situations in the NASA project consisted of load testing to help understand what might happen if there is an unexpected spike in traffic, or when back-end services require more capacity. By doing simulations and observing data, SOASTA can predict the effects on infrastructure, a Web application and the database, so that companies can optimize a website or applications to accommodate these changes.
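Stripped to its essentials, a load test fires concurrent requests at a site and records the resulting latencies. The toy Python sketch below (no relation to SOASTA’s GlobalTest, and with a stubbed request function standing in for real HTTP) shows the basic shape of the measurement:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Stand-in for a real HTTP request; swap in urllib or similar."""
    time.sleep(0.01)  # simulate roughly 10 ms of server latency
    return 200

def load_test(url, total_requests=200, concurrency=20):
    latencies = []  # list.append is thread-safe in CPython

    def one_request(_):
        start = time.perf_counter()
        status = fetch(url)
        latencies.append(time.perf_counter() - start)
        return status

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(one_request, range(total_requests)))

    latencies.sort()
    return {
        "ok": statuses.count(200),                    # successful responses
        "median_s": statistics.median(latencies),     # typical latency
        "p95_s": latencies[int(0.95 * len(latencies))],  # tail latency
    }

report = load_test("https://example.invalid/live-stream")
```

Ramping `concurrency` and `total_requests` upward while watching the tail latency is the crude version of the spike-in-traffic what-if described above.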

NASA’s biggest issue was it could not predict how many people were going to watch the landing, Lounibos said. “We were able to help predict how much server capacity NASA would need,” he added.

SOASTA also helped NASA prepare for a failure scenario by simulating an outage on a portion of Web servers and proving that failover plans were indeed effective.

“When you’re streaming for millions of people you can’t afford to have failure because there is only one first,” Lounibos concluded.

Fernanda Laspe is the editorial assistant for SearchCloudComputing.com.


June 5, 2012  7:04 PM

Behind the curtain of Microsoft’s Azure song and dance

Stuart Johnston

 

Windows Azure customers anxious to learn what Microsoft has been hiding behind its back can finally exhale later this week in San Francisco.

 

One key piece of the Azure update is support for what Microsoft calls “Persistent Virtual Machine (VM) Roles,” which will let Windows Azure customers run legacy applications in VMs. That includes running Linux, sources said.

 

Another capability is a Web hosting framework, codenamed “Antares,” that will provide a fine-grained Web app hosting service aimed at customers who don’t see Azure as an economical platform for webpage hosting.

 

But will Microsoft be able to deliver those features sooner rather than later? Not in a single iteration, one source said. Instead of pulling off the “All singing, all dancing” vision Microsoft would like to promise, it’s more likely the company will need at least two iterations to achieve the basics.

 

Of course, now that the Windows 8 Release Preview is available there is sure to be a Windows Azure demo on tablets and mobile devices at the event.

 

Another key trend to watch for, sources said, is an increased focus on hybrid clouds.

 

Over the short to mid-term, Microsoft aims to achieve “write once and run anywhere” capabilities for Windows Azure, to borrow the Java slogan. Customers want to be able to run their applications in the data center, in the cloud, or as a hybrid of the two, interchangeably. And they want to do so without rewriting any code or worrying about vendor lock-in.

 

The best way to do that seems simple enough — run applications on the same API on both platforms — Windows Azure and Windows Server 2012. That might not be as easy as it sounds, though.

 


Windows Azure numbers lower than Amazon

Just as important as what Microsoft says, however, is what Microsoft doesn’t say. That may be telling when it comes to judging the relative veracity and importance of plans and promises at the Meet Windows Azure event, which will be streamed.

 

Microsoft has been notably quiet about Windows Azure’s status for more than a year. That may be because sales of Windows Azure have been disappointing to date. Windows Azure has garnered fewer than 100,000 customers so far, according to the research firm Directions On Microsoft, based in Kirkland, Wash.

 

That’s considerably lower than industry estimates for market leader Amazon Web Services.

 

In some respects, it’s the same struggle Microsoft has gone through before. How can the company and its products remain relevant in a computing universe that is constantly changing?

 

The event will likely resemble many previous Microsoft marketing splashes, with system integrators, application developers, resellers and other partners lined up to show solidarity for the company’s strategy du jour.

 

Again, when Thursday rolls around, remember to listen closely for what doesn’t get said as well as what does.

 

Stuart J. Johnston is Senior News Writer for SearchCloudComputing.com. Contact him at sjohnston@techtarget.com.


May 9, 2012  12:53 PM

Is Microsoft jettisoning Azure name?

Stuart Johnston

“If it is true, it’s pants-on-head retarded.”

That’s how Tier 1 analyst Carl Brooks described reports this week that Microsoft will drop “Azure” from the branding of its public cloud offering.

“Azure is a dynamite brand — it’s almost a byword, like Amazon is, for a certain kind of cloud infrastructure, and in a very positive way,” Brooks said. “They’d be nuts to drop it and I’m hard pressed to understand any potential benefit.”

As it turns out, Brooks was right; Microsoft isn’t that irrational — although sometimes it might seem that way. The confusion began when a popular tech blog got wind that the software titan had sent out an email to Azure subscribers advising them that it’s cutting “Azure” from the names of a bunch of Azure services.

“In the coming weeks, we will update the Windows Azure Service names,” the message said. “These are only name changes: Your prices for Windows Azure are not impacted,” according to the email quoted in the blog post.

What had occurred, however, was less than meets the eye. The changes are to Azure’s “billing portal,” another tech blog revealed, and don’t affect the overall naming of Azure services.

After several hours of silence, Microsoft did finally issue an official clarification. “Microsoft continues to invest in the Windows Azure brand and we are committed to delivering an open and flexible cloud platform that enables customers to take advantage of the cloud. The brand is not going away.”

That’s a good thing. “It would be like dropping ‘Exchange’ in favor of ‘Microsoft Email Server’,” Brooks added, calling the excitement “a tempest in a teapot.”


March 8, 2012  8:11 PM

Yup, your cloud hunch was right

Michelle Boisvert

Everything you’ve read about who is using cloud computing and why is pretty much true, so says at least one industry study.

According to a recent Cloud Industry Forum survey of 400 public and private companies of varying sizes, flexibility is the number one reason U.S. companies adopted the technology in 2011. Cost savings eked out second place.

Of the 31% of respondents who listed flexibility as the top reason for adopting cloud computing services, the majority were SMBs — from tiny companies with up to 20 employees to those with 100 to 999 employees (40% and 41%, respectively). Such companies tend to have limited in-house technical resources, and cloud offers self-service capabilities, on-demand scalability and the ability to quickly launch new services that might otherwise be delayed or pushed to the backburner completely.

Big companies with more than 5,000 employees (28% of respondents), on the other hand, looked to cloud services to save money. And now the tables have turned slightly on who’s driving cloud services adoption. When cloud computing first started to catch on, business users were waving their flags for all things cloud. But once IT bigwigs — CTOs and CIOs — caught wind of cloud’s potential cost-cutting benefits, they started pushing for it too, according to Andy Burton, chairman of the Cloud Industry Forum (CIF) and CEO of Rise.

The ability to use cloud technology to launch a completely new service was a draw for 22% of respondents, while only 8% looked to cloud to either offset a lack of internal IT or because it was seen as a low-cost project.

Cloud = happiness for most adopters

Companies that jumped into cloud in 2011 must be seeing its benefits; 94% of respondents who adopted cloud have plans to expand cloud services in the next 12 months, according to CIF. The targeted apps? Email, asset management and security. Email and data storage applications will see the biggest push to the cloud in the next year, at 50% and 45%, respectively.

Burton said really big companies have moved resource workloads such as storage to the cloud because they know they can save money there. Smaller companies stick with simple apps like email. 

Once the warm and fuzzies pass, cloud concerns set in

Setting aside their love for cloud technology, plenty of IT pros are still nervous about entrusting their data to others. Top worries were data privacy and data security (56% and 53%, respectively). But these apprehensions will only cause companies to hesitate on adoption, not dismiss the idea completely.

“This may limit what companies put into the cloud and it will slow adoption rates,” Burton said. “People still have a tendency to want to know where their data resides.”

U.S. companies have made the boldest moves to the cloud: adoption stands at 76% of U.S. respondents versus 53% of U.K. respondents. That may have much to do with EU data privacy laws that give end users the right to anonymity. Basically, a service provider has to give users the ability to remove content, and cloud services providers can’t guarantee that yet.

One surprise: in the U.S. cloud market, the largest companies are the least concerned about this. According to the study, those least comfortable with privacy issues in the cloud are small private companies and public organizations.

Rise, the channel division of Fasthosts Internet Group with headquarters in the U.S. and U.K., was the sponsor of CIF’s “USA Cloud Adoption & Trends in 2012” survey.

