September 26, 2011 10:28 PM
Posted by: JoMaitland
Cloud First policy, Steven VanRoekel, Vivek Kundra
Former federal CIO Vivek Kundra has been slammed by IT pros working for the government for his “Cloud First” policy, according to a survey by MeriTalk, an online IT community for U.S. government workers.
The survey of 174 federal IT pros was conducted in August 2011 at the MeriTalk Innovation Nation forum, six months after Kundra’s resignation.
“Vivek’s tenure … was like a bottle of champagne — seems like a great idea, exciting start, but the plan’s unclear, and the next morning you wake up with the same problems and a sore head,” said Steve O’Keeffe, founder, MeriTalk. The firm has presented its findings to Steven VanRoekel, Kundra’s replacement.
The feds supported Kundra’s initiatives, but said timing, funding and conflicting mandates made it impossible to carry them out, according to O’Keeffe. Kundra placed heavy emphasis on modernizing IT infrastructure, spending on which he said soaked up $19 billion per year of the approximately $70 billion federal IT budget.
While the majority of federal IT professionals (71%) believe Vivek Kundra made a significant impact while in office and credit his vision as his greatest strength, the study revealed that top challenges under Kundra included lack of funding to fulfill mandates (59%), conflicting mandates (44%) and unrealistic goals/mandates (41%). When asked to vote on the three most important priorities for the new federal CIO, respondents said:
Reduce the number of mandates and conflicting mandates (60%)
Reassess goals/timelines to make success attainable (53%)
Listen to feedback/counsel from IT operations (46%)
According to the study, 92% of feds believe cloud is a good idea for federal IT, but just 29% are following the administration’s mandated “Cloud First” policy, and 42% say they are taking a “wait-and-see” approach to cloud. Respondents cite numerous challenges, including security issues (64%), cultural issues (36%) and budget constraints (36%), as barriers to cloud computing.
Almost all feds (95%) also vote for data center consolidation, although the majority (70%) say federal agencies will not be able to eliminate the mandated 800 data centers by 2015. Respondents do anticipate realizing savings from their data center consolidation efforts, with most (74%) estimating the federal government can save at least $75 million overall. Respondents acknowledge, however, that investment is needed — 85% say Feds will not realize data center savings without new investment.
When it comes to cyber security, respondents unanimously agreed threats have increased in the last year (100% say yes). Feds say the most important priorities for cyber security going forward are: securing federal networks (68%), critical infrastructure protection (56%) and privacy protection (36%). However, feds say funding to meet these priorities is, on average, 41% short. Further, feds are unclear who owns cyber security, highlighting a leadership vacuum.
September 21, 2011 10:23 PM
Posted by: JoMaitland
Autodesk on Amazon
One of the biggest challenges in computer-based design is the amount of processing power it takes to simulate how designs will perform in the real world.
Autodesk Inc., maker of the popular design software AutoCAD, will launch a suite of applications in the next two weeks for visualization, optimization and collaboration on Amazon’s cloud, reducing the computational overhead for users.
The company hopes to be a bridge to the cloud for its existing customers and also to attract smaller design firms that cannot afford big compute farms for 3D visualization.
“Our cloud services will open up these capabilities to more companies,” said Dr. Andrew Anagnost, VP of Web services at Autodesk. “We can do all that processing for them.”
An optimization service will run simulations and show the best result and a collaboration service will crunch data for specific users in a workflow model. In typical cloud fashion, Autodesk will offer a free subscription for a limited amount of capacity, with more capacity for a fee. The company didn’t release exact pricing.
Autodesk has plenty of experience running Software as a Service. Its Buzzsaw online data management tool for the construction industry is over a decade old and taught the company a few lessons. It was spun out and then back in and is currently run from Autodesk’s internal servers.
“There are benefits to that, but absolute problems too around scaling up dynamically … You can’t do it with internal infrastructure,” Anagnost said. Autodesk expects to push elements of Buzzsaw out to the cloud in an experimental way. “As long as the customer will see no difference, it will go to the cloud,” he said. “The lines will blur around what’s on the desktop and what’s in the cloud.”
Anagnost expects all Autodesk’s software will have online versions within three years.
The biggest limitations to its new services will be bandwidth and security. Smaller firms may not have enough pipe to upload data to the cloud. And security in the cloud, or the lack thereof, continues to be a worry for many companies.
August 25, 2011 3:49 PM
Posted by: JoMaitland
Verizon buys CloudSwitch
Verizon’s acquisition of cloud computing software company CloudSwitch is a smart move by the telecom giant as enterprises look for hybrid cloud offerings. But will Verizon continue the startup’s support of clouds other than its own, and for how long?
CloudSwitch’s software lets users move applications, or workloads, between company data centers and the cloud without changing the application or the infrastructure layer. This notion of hybrid cloud, or connecting on-premises IT with public cloud services, turns out to be the preferred approach for most companies considering cloud computing.
CloudSwitch has proven its software is an enabler of this model and has a dozen or so large enterprises, including Novartis and Biogen, using its product to move workloads to the cloud and back in-house, if necessary.
But a key selling point for CloudSwitch has been its multi-cloud, multi-hypervisor strategy, and many users have downloaded its software to test Amazon Web Services. Verizon has been pretty gentle so far with its acquisition of Terremark, which it bought for more than a billion dollars last year, but will that gentleness extend to a small software acquisition?
“We will remain open and support multiple clouds,” said Ellen Rubin, founder and VP of products at CloudSwitch. “It’s resting on our relationships with Terremark … they told us not once but a dozen times that they want us to stay open.”
On the surface it’s a little weird to see Verizon buying a software company, but the cloud is all about developing APIs that let users request services, so it makes sense at a high level. The CloudSwitch product will be part of Verizon’s Terremark division, under Kerry Bailey, group president of Terremark Worldwide.
July 18, 2011 4:54 PM
Posted by: CarlBrooks
hey look some data, high density systems, licensing hell, private cloud, VMware cloud, vRAM license changes
It’s now clear that VMware did the math on its customers before settling on the new vRAM+socket licensing scheme. But did it unintentionally screw itself on a booming trend among those same customers?
Survey data from TechTarget’s Data Center Decisions questionnaire, which may be the industry’s largest impartial and sponsorship-free annual survey, with more than 1,000 IT professionals responding, both confirms VMware’s rationale behind the move, and points to a new trend that may indicate a future stumble on its part. The survey data will be published later this week.
VMware CTO Steve Herrod told us in an interview last week that VMware knew the changes would be deleterious to some customers but that VMware’s internal accounting showed that it wouldn’t be more than 10-20% of customers, and it was worth it to simplify licensing for the other 80%. We have some very strong circumstantial evidence that Herrod was right on the money, but historical data shows there’s a twist.
The survey says that 11% of new server buys are going to ship with 128 GB of RAM, and another 11% with more than 128 GB. Depending on how many sockets those boxes have, that’s at least 11% of server buyers, and potentially more, who are going to feel the “OMFG VMware licensing” kick in. So Herrod was right on.
BUT, here’s the kicker: the number of people buying high-RAM-density servers in both categories doubled (!) over last year, from 5.75% and 5.29% in 2010. 99% growth is a trend that’s hard to ignore.
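The doubling claim checks out against the survey’s own figures; a quick back-of-envelope in Python (percentages taken from the survey data quoted above):

```python
# 2010: share of new server buys shipping with 128 GB of RAM, and with more
share_2010 = 5.75 + 5.29          # 11.04% combined

# 2011: both categories came in at 11%
share_2011 = 11.0 + 11.0          # 22% combined

# year-over-year growth in the combined high-RAM share
growth_pct = (share_2011 - share_2010) / share_2010 * 100
print(f"year-over-year growth: {growth_pct:.0f}%")  # roughly 99%
```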
Coupled with data about efforts and intentions to build private cloud environments, it’s pretty clear that a lot of IT shops are fully intending to build out cloud-style environments that can show a consolidated economy of scale. But the new licensing means that if you don’t have enough CPU sockets to go with your ocean of RAM, you’re going to get bit hard.
I am taking it as axiomatic here that pretty much all new server buys with a ridiculous amount of RAM per box are intended for virtualization; that VMware’s 85% market share ensures that those buyers are VMware users and not the outliers on Xen or whatever; and that the trend towards trying to consolidate into bigger and bigger boxes will continue to skyrocket.
VMware has posted a fairly silly ‘clarification’ about the new licensing in which it attempts to convince the public that users are confused over whether the licensing is about the amount of physical RAM or the amount of vRAM. Nobody is confused about that; that’s why there’s a “v” on there.
It’s the tying of the licenses to CPU sockets that’s causing the heartburn and the outrage from a vocal minority, since so many buyers (twice as many as last year; will it double again this year?) are choosing servers with socket/RAM configs that fall outside the licensing norms. Forward-thinking users, appreciative of commodity server power and the examples of massively consolidated, massively virtualized cloud computing environments, are being told, in effect, that they will be penalized for trying to achieve as much efficiency as technology will allow them.
And in case anyone is wondering, VMware cloud providers under the VMware Service Provider Program (VSPP) have been bound by vRAM licensing for some time now. But they’re not limited by socket, like enterprise users are.
Chew on that one for a while as you think about building out a private cloud on vSphere, enterprise IT guys and gals.
In the meantime, here is an incomplete roundup of blogs, back-of-envelope calculations, license calculators and reactions to the vRAM+socket scheme.
July 12, 2011 7:05 PM
Posted by: JoMaitland
VMware license hell, vRAM license, vSphere 5
VMware’s controversial licensing and pricing changes in vSphere 5, leaked today, are positively uncloud-like when it comes to cost, casting a shadow over the new features and functions in the product.
Offering pooled RAM as a licensing component instead of charging for physical RAM per host will take away some of the complexity of licensing in a virtual environment, but it will increase the cost, according to some analysts and expert bloggers. According to this post, Enterprise vSphere 5 requires an additional license for every 32 GB of RAM.
Practically speaking, this may not mean much for a lot of VMware users, and will actually benefit many; anyone running multi-core CPUs in servers with less than 64 GB of RAM at a standard complement of 10 VMs per physical host might actually see their license pool shrink, something akin to the sun moving backward, according to many VMware users. This covers many kinds of data center operations, from normal workaday servers to blade clusters of many shapes and sizes.
However, this licensing scheme carries a sharp prejudice against the increasingly common practice of packing commodity servers with massive amounts of RAM for high-memory multitenancy and in-memory applications.
For example, provisioning an Exchange server with 64 GB of RAM is fairly standard; a hosted Exchange provider might run dozens of Exchange VMs across a few machines and a giant pool of RAM, and that operator is royally screwed. Likewise anyone running a content management or distribution application, or anything with large caching/forwarding requirements.
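To see why that operator is screwed, here is a hypothetical sketch of the arithmetic, assuming the 32 GB-of-vRAM-per-license Enterprise entitlement reported above; the function is illustrative, not VMware’s published calculator:

```python
import math

def licenses_needed(sockets: int, vram_gb: int, gb_per_license: int = 32) -> int:
    """vSphere 5-style count (as reported): at least one license per CPU
    socket, and enough licenses for the pooled vRAM entitlement to cover
    all allocated VM memory."""
    return max(sockets, math.ceil(vram_gb / gb_per_license))

# Old per-socket scheme: a 2-socket box cost 2 licenses regardless of RAM.
# New scheme: 2 sockets hosting four 64 GB Exchange VMs (256 GB of vRAM):
print(licenses_needed(sockets=2, vram_gb=256))  # 8 licenses, 4x the socket count
```

Under the old per-socket model the same box needed only two licenses; the RAM-heavy configuration quadruples the bill, which is exactly the heartburn described above.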
That’s a dominant model in the cloud world, less so in the enterprise, but enterprises are rapidly adopting cloud computing tricks and techniques. Did VMware make the wrong calculation in favoring its current majority customer base over the customer base it’s probably going to have (or not, if the licensing remains biased this way) in a few years?
VMware CEO Paul Maritz said that to get to cloud, users have to have this kind of licensing in order to scale, but that doesn’t jibe with the success Amazon Web Services has had. AWS pays no internal licensing fees; its stack is built on open source, and it’s the most proven, scalable cloud on the planet.
Microsoft bundles Hyper-V free with Windows Server. Virtualizing mission critical applications on “free stuff from Microsoft” was never a super attractive option for IT pros, but if the option is an order of magnitude jump in your VMware licenses, that could change.
The question going forward for cloud-style users will be whether the features and functions in VMware’s software are enough to justify the extra cost.
Most of today’s news was around vSphere 5, but the company also announced vCloud Director 1.5, which now includes the capability to support linked clones. This cuts the time to provision VMs to five seconds, VMware claimed, and reduces the storage costs associated with these VMs, because storage is thinly provisioned, meaning capacity is allocated only when actually used.
July 12, 2011 1:41 PM
Posted by: CarlBrooks
Citrix buys cloud.com, cloud acquisitions, fifth cloud wheel, nice exit
Citrix will acquire platform startup Cloud.com to bolster its private cloud product line.
Cloud.com is a three-year-old software startup that makes a cloud computing platform: management and messaging software to turn a commodity hardware stack of servers and virtualization into a cloud platform, like Amazon Web Services. Citrix will be integrating Cloud.com into its cloud product lineup, which is basically VDI, GoToWhatever and unicorn snot at the moment.
The Cupertino-based ex-startup is one of the more mature cloud platform gambits, like Eucalyptus. It was already a Citrix partner and has a string of decent names in its badge collection; online game giant Zynga is a customer, currently in the process of building out its own private cloud after living on AWS for some time, and it’s using Cloud.com’s CloudStack in both cases, a telling example of the software’s versatility.
The deal is a nice exit for Cloud.com, founded by virtualization veteran Sheng Liang; it was rapidly able to collect and deliver on big customers and apparently the platform is solid, technologically. But it’s a bigger deal for Citrix, which has been, if not exactly sinking under the waves, not making a great deal of progress.
Now it has a valid, honest-to-goodness cloud stack to offer users, one that’s compatible to a T with Citrix’s Xen hypervisor and the crowd-pleasing OpenStack project. Cloud.com has said it is merging its CloudStack code base with OpenStack, so Citrix gets a double whammy with this buy. This won’t rock the cloud boat all that much; mostly it means Citrix isn’t doomed to being a spare fifth wheel on the HP, IBM, Microsoft and EMC/Cisco/VMware cloud wagon.
July 6, 2011 5:51 PM
Posted by: Beth Pariseau
VMware PaaS
A new plugin has been developed that connects part of VMware’s vFabric middleware to its Cloud Foundry Platform as a Service (PaaS) offering, according to a post last week on the SpringSource Hyperic blog.
vFabric Hyperic, which monitors performance in custom Web applications, can now be integrated through the plugin into Cloud Foundry’s VMC command-line interface to monitor applications running on the PaaS platform. Features include auto-discovery, event tracking and metrics collection on Cloud Foundry system and account usage, as well as Cloud Foundry provisioned services. The new integration will also allow for starting, stopping and restarting Cloud Foundry applications, updating reserved memory, and scaling up or down by one application instance to meet performance demands.
Meanwhile, Hyperic is just one part of vFabric; other components include the Apache Tomcat-based tc Server; RabbitMQ messaging; GemFire distributed data management; and the vFabric Enterprise Ready Server for Web server load balancing. There have been hints from VMware that RabbitMQ will also make its way onto the Cloud Foundry platform — the Hyperic blog post refers to RabbitMQ, “once available,” as a provisioned service the Hyperic plugin will be able to manage.
But there have been hints about RabbitMQ since the launch of Cloud Foundry, and actual integration has yet to see the light of day. GemFire is another application that could lend itself to cloud-based deployment and development, and broadly, VMware says it would be the ‘natural evolution’ for such offerings to become services offered on the Cloud Foundry platform. But the devil’s in the details, and a detailed strategy for integration between the overall vFabric and Cloud Foundry platforms has yet to be publicly voiced by VMware.
Instead, with the latest release of vFabric, version 5, VMware deepened integration between vFabric and the vSphere hypervisor, rather than with Cloud Foundry — users can now change the ‘identity’ of VMs running different components of vFabric within a given block of vSphere licenses according to demand, and vSphere’s dynamic memory feature has been added to tc Server.
In the spirit of true open source, which Cloud Foundry aims to be, it would be helpful if VMware published a roadmap of its integration plans, which would give confidence to developers interested in using the platform. Instead, as it stands today, Cloud Foundry has an experimental air (in Paul Maritz’s words at the Structure conference last month, it’s a “calculated risk” at this point) and VMware could at least theoretically pull the plug on it at any time.
June 30, 2011 11:01 PM
Posted by: CarlBrooks
cloud providers, Dimension Data buy, MSPs doing cloud, Opsource acquisition
OpSource has been bought by ICT and IT services giant Dimension Data. This tells us several important things about the cloud computing market when we look at some of the details. It’s mostly positive unless you’re a private cloud cultist or one of the vendor giants enabling private cloud cargo cults in various areas of IT.
OpSource likely made out well here, too; Informa analyst Camille Mendler said NTT, which now owns Dimension Data and was a 5% equity investor in OpSource, is well known for piling up money to get what it wants. “NTT was an early investor in OpSource years ago. They always pay top dollar (see DiData price, which turned off other suitors),” Mendler said in a message.
Mendler also pointed out the real significance of the buy: the largest providers are moving to consolidate their delivery arms and their channel around cloud products, because that’s where the action is right now. Amazon Web Services is an outlier and private cloud in the enterprise is in its infancy, but service providers in every area are in a wholesale migration toward building and delivering cloud computing environments. OpSource already runs in some NTT data center floor space, and DiData has a massive SP/MSP customer base, which is OpSource’s true strength as well.
DiData is already actively engaged with customers that are doing cloudy stuff, said Mendler, and it basically threw up its hands and bought the best provider-focused cloud platform and service provider it could find. “There’s a white label angle, not just enterprise,” she said.
And it’s not the only cloud deal for providers by NTT, either: it bought a controlling interest in an Australian MSP with a cloud platform in May. Gathering in OpSource means NTT has a serious stake in most of the world when it comes to the next wave of public and hosted cloud providers.
Well, DiData is a huge firm in IT services. It has all the expertise and software it would ever need, but instead of developing a platform or an IaaS to sell to customers, it bought one outright and is starting up a cloud services business unit to sell it pretty much as is. That means, as has been pointed out so many times before, that building a cloud is hard work, and quite distinct from the well-understood data center architectures around virtualization and automation as we used to know them.
It also means there was a pressing need for a functioning cloud business today, or more likely yesterday. “Essentially, what Dimension has said is ‘nothing changes with OpSource,’” said OpSource CTO John Rowell.
Rowell’s a bit giddy; he said that with access to DiData’s partnerships and customers, OpSource gets a fast track to global infrastructure growth in a way it couldn’t manage before. “We believe we can go head to head with Amazon and we’ll be better than them,” he said. He might not be far off, at least in the MSP sector; OpSource does have a few pieces of the puzzle AWS doesn’t, like a working support system, mature networking (mature networking features in the cloud = hosting circa 1999) and a very slick interface that is pig easy to use or extend.
Overall, though, it tells us the real action is behind the scenes for enterprise IT: cloud computing is on fire in the service provider world, while it’s still mostly smoke in the enterprise world.
June 28, 2011 8:34 PM
Posted by: CarlBrooks
BPOS by any other name would smell like hosted Exchange, Microsoft cloud, What is Office365, What Office365 is not.
Office 365 is live. Read the fluff here, or watch videos all day of Microsoft SMB customers. But what exactly is Office 365? Let us hew to established tradition for “WTF is this thing” stories and start with what it is NOT:
Office 365 is not Microsoft Office software. Not Word, Excel, PowerPoint or Outlook. If you do not have those things, signing up for Office 365 will not get them for you (you can buy them at the same time you sign up, however).
It is not compatible with Microsoft Office 2003. You need to be on Office 2007 or better, because Office 365 needs Office “Open” XML (OOXML) to do most of the neato-burrito online stuff. The Microsoft how-tos (LGT to the guide for enterprise) say it will pretty much work with MSO07 or MSO10, but you will need MSO10 Professional Plus to use all the Office 365 features.
It is not an Exchange server, a Communications (now Lync) server or a SharePoint server. It is also not like anything you would consider a hosted Exchange server, nor is it an online email/app suite like Gmail or Zoho. It is not a browser-based online service.
It is not cross platform. This is for Windows and Internet Explorer. It lives on ActiveX and Silverlight.
It has nothing whatsoever to do with mobile devices or mobile apps, except for delivering Exchange mail and using SharePoint Mobile (one of those things is very useful; the other one is SharePoint Mobile).
It is not Google Docs.
It is definitely not iCloud.
What it is:
Office 365 is a replacement for your Exchange and SharePoint servers that comes as a monthly subscription service from Microsoft, plus an add-on software pack for your Office installation. It also runs Communications Server (now Lync) as a service, but I’m not sure anyone’s ever actually used Lync. It’s not like hosted versions of these products, nor is it like running them yourself: Microsoft does 100% of the admin, and you get zero access except to an identity and management layer for adding and managing users and mailboxes to some extent. This is the cloud computing part of Office 365. It is better known as BPOS.
Inboxes are 25 GB and message size limits are 25 MB; the signal benefit here is that you will now only intensely annoy the recipients of your 25 MB emails, and no longer your IT admin as well. Admins everywhere are chuckling in anticipated schadenfreude at the thought of Microsoft operators, instead of them, trying to unstick Exchange queues full of 25 MB attachments going to 25 GB mailboxes.
Office 365 lets you send email from yourdomain.com and not you.microsoft.com; it supposedly will do single sign-on if you let it sync with your AD. It requires Active Directory Federation Services 2.0, so Windows 2003 Server support is out the window for that feature.
It DOES NOT integrate any further than syncing users, addresses and the Global Address List. You CANNOT UNSYNC your AD, and it is in no way, shape or form a tool for managing ADs. You’ll still do all user management from your domain server, and you’ll manage Office 365 users on Office 365, unless you migrate completely to Office 365 and stop using local directory services (because all you use your ADs for is email, RIGHT?). Microsoft says you can do a standard cut-over or partial migration if you want to stick your entire email infrastructure in the Microsoft cloud.
The add-on part is a download called the “Office desktop setup.” Run it on each machine that will use Office 365 after installing Office 2007 or 2010. Once you’ve done that and set up your users, they can use Office WebApps to edit and share .doc files from their PC in a browser. Apparently it’s not too hot on mobile devices, though.
That’s what Office 365 is, in sum. Is it a Gmail/Google Apps killer? Not at any entry point that is not equal to “free,” it’s not. It’s also clearly not set up to be used the same way. Where’s SkyDrive, by the way? Where’s the cross-browser support?
Is it pretty cool, and does it do neat stuff, like real-time collabo on documents and websites with multiple editors (no more email chains of “pls rvw chnges and snd back asap thx attached”; hooray!)? Sure. And like it or lump it, the world pretty much runs on Office.
But will it upheave the Office desktop landscape? Not even a little bit.