Ahead in the Clouds


September 5, 2014  1:16 PM

Want cloud success? Eat your greens!

Archana Venkatraman Profile: Archana Venkatraman
Broccoli, food, Microsoft, Parmigiano-Reggiano, Vegetable

Cloud computing is becoming the default option for delivering IT services, but to reap all the benefits of the cloud, enterprises must do the boring stuff first.

On Thursday, I attended a Westminster eForum seminar on the future of cloud computing, where speakers ranging from analysts, legal experts and industry association heads to cloud vendors and public sector professionals held very interesting conversations about cloud adoption, its risks and its future.

Boring but necessary! (Photo credit: Wikipedia)

When experts said cloud can be secure and cost-effective and can lead to innovation, it did not raise any eyebrows among the delegates. This suggests to me that users are fully convinced of cloud’s benefits.

But even then, some cloud projects backfire. Why?

The excitement of cloud is leading enterprises to overlook the boring work they need to do beforehand to yield the full benefits of the cloud. Ovum analyst Gary Barnett illustrated this best in his (PowerPoint-free!) session. Here is an article where Gary shares examples of cloud projects that have failed.

“My mum made sure I ate my broccoli before I got my pudding,” Gary said. But in the cloud world, no one’s eating the broccoli, he said.

“If you don’t clean up your data before putting it on the cloud platform, you will have cloudy rubbish.” He also pointed out that some users are finding cloud expensive because they are not building proper policies and guidelines around its use.

Experts at the seminar insisted cloud is a secure way of doing IT and that cloud breaches usually happen because of users’ “silly and predictable passwords” and their lack of awareness. Gary urged enterprises to educate users on the risks of predictable passwords.
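Gary’s point lends itself to a simple illustration. The hypothetical check below rejects passwords that are too short or that appear on a small deny-list of predictable choices; the word list and length threshold are my own assumptions, not anything prescribed at the seminar.

```python
# Minimal sketch of a predictable-password check (illustrative only).
# The deny-list and length threshold are assumptions, not a standard.

COMMON_PASSWORDS = {
    "password", "123456", "12345678", "qwerty", "letmein",
    "dragon", "111111", "abc123", "iloveyou", "admin",
}

def is_predictable(password: str, min_length: int = 10) -> bool:
    """Return True if the password is too short or on the deny-list."""
    candidate = password.lower().strip()
    if len(candidate) < min_length:
        return True
    if candidate in COMMON_PASSWORDS:
        return True
    # Also reject trivial variations such as 'Password123!'.
    stripped = candidate.rstrip("0123456789!")
    return stripped in COMMON_PASSWORDS

if __name__ == "__main__":
    for pwd in ["Password123!", "correct horse battery staple"]:
        verdict = "predictable" if is_predictable(pwd) else "ok"
        print(pwd, "->", verdict)
```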

“No one loves the boring stuff. But just like you have to eat your greens, you have to do all the boring stuff before adopting the cloud. Otherwise you’re just transferring onsite mess offsite,” Gary said.

The “Eat your greens” theme continued throughout the seminar, and the floor roared with laughter when Microsoft’s cloud director Maurice Martin said: “In my case, the greens were the cabbages, broccoli was too posh.”

 

September 1, 2014  1:30 PM

Five questions you must ask your cloud provider

Archana Venkatraman Profile: Archana Venkatraman
Cloud Computing, Encryption, FISMA, Google, Payment Card Industry Data Security Standard, United Kingdom

One of the main barriers to cloud adoption is data privacy. This is an issue because, for the majority of cloud providers, EU/EEA and US data privacy and information security standards are minefields that are very difficult to cross. That is because their focus has been on the ease of use and functionality of their services, rather than on the all-important data privacy, information security, data integrity and reliability requirements that come with providing these services responsibly.

But, when looking through the plethora of cloud service providers, you can immediately sort the ‘wheat from the chaff’ once you start drilling down into the data privacy, information security, data integrity and reliability capabilities offered to ensure the protection of your and your customers’ data.

In this guest blog post, Mike McAlpen, the executive director of security & compliance and data privacy officer at 8×8 Solutions, outlines the questions cloud users must ask their providers before signing a contract.


Have you chosen the right cloud services provider?
– Mike McAlpen


By asking your cloud services provider the following questions, you will be on the way to knowing whether you can entrust your data to its care.

  • Compliance with EU/EEA data privacy standards

The most important question is whether your provider can supply third-party verification/audit assurance of its compliance with EU/EEA and/or US data privacy standards. It is not enough for the provider simply to produce this verification/audit assurance; it must show that it has fully implemented the UK Top 20 Critical Security Controls for Cyber Defence and/or ISO 27001, and/or rigorous US standards such as the Federal Information Security Management Act (FISMA) and the PCI-DSS v3.0 security standard.

If this verification/audit assurance is not available, your business is at peril of failing to meet EU/EEA and/or US standards.

In the US, many EU/EEA countries and other jurisdictions, it can be a criminal offence if a breach of personal data privacy occurs and an individual employee or senior manager, depending on the circumstances of the breach, is deemed to be responsible.

  • Onward Transfer of Data

Does your provider work with third-party suppliers in order to deliver the cloud services it offers? If so, you must check that it has contracts in place with those third-party suppliers providing assurance that they are, and will continue to be, compliant with EU/EEA and/or US standards.

  • Data Encryption

Does the cloud solutions vendor provide the capability to encrypt sensitive data when it is being transferred across the internet and, importantly, again when it is ‘at rest’ (i.e. stored by your cloud services provider, or in files on a computer, laptop, USB flash drive or other electronic media)?
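As a minimal sketch of what at-rest encryption involves (encryption in transit is typically handled by TLS), the example below uses the Python cryptography library’s Fernet recipe to encrypt a record before it is handed to a storage provider. The key handling shown is an illustrative assumption, not a production design, which would use a key management service.

```python
# Sketch: symmetric at-rest encryption with the 'cryptography' library's
# Fernet recipe (pip install cryptography). Key handling is illustrative
# only; a real deployment would use a KMS or HSM.
from cryptography.fernet import Fernet

# Generate and retain a key; losing it makes the data unrecoverable.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"customer: Jane Doe, card ending 4242"

ciphertext = fernet.encrypt(record)     # safe to store with a provider
plaintext = fernet.decrypt(ciphertext)  # requires the key you retained

assert plaintext == record
print(ciphertext.decode()[:40] + "...")
```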

  • The Right to be Forgotten

Has your provider’s solution been engineered to enable it to identify and associate each user’s personal data? It must also provide the capability for each user to view and modify this personal data. In addition, if the user wishes this data to be deleted, the provider must be able to erase all of that person’s personal data completely without affecting anyone else’s data.
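One common engineering answer to this requirement is ‘crypto-shredding’: keep a separate encryption key per user, so destroying that key renders the user’s data unrecoverable without touching anyone else’s records. The sketch below is a minimal illustration of that idea, not a description of any particular provider’s implementation.

```python
# Sketch of crypto-shredding: one key per user, stored separately from
# the encrypted records. Deleting the key 'forgets' that user entirely.
from cryptography.fernet import Fernet

user_keys: dict[str, bytes] = {}      # per-user keys (a KMS in practice)
records: dict[str, list[bytes]] = {}  # per-user encrypted records

def store(user: str, data: bytes) -> None:
    if user not in user_keys:
        user_keys[user] = Fernet.generate_key()
    records.setdefault(user, []).append(Fernet(user_keys[user]).encrypt(data))

def view(user: str) -> list[bytes]:
    f = Fernet(user_keys[user])
    return [f.decrypt(token) for token in records.get(user, [])]

def forget(user: str) -> None:
    """Erase one user completely; other users are unaffected."""
    user_keys.pop(user, None)
    records.pop(user, None)

store("alice", b"alice@example.com")
store("bob", b"bob@example.com")
forget("alice")
print(view("bob"))  # bob's data is still readable; alice's is gone
```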

  • Service Level Agreements (SLAs)

Outside of compliance with data privacy standards, another key issue is asking your provider how you will determine, and then document within your services contract, the required service level agreements (SLAs). It is no use whatsoever having the cloud services you have always wanted if you have no way of measuring or monitoring whether they are actually being delivered to an acceptable level, or if there are no financial penalties for non-compliance.

If your provider cannot answer “yes” to the above questions and you cannot agree to mutually acceptable SLAs – look for another provider!
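To make the SLA point concrete, here is a small sketch of how monthly availability might be computed and compared against a contracted uptime target. The 99.9% target and the service-credit tiers are assumptions for illustration, not terms from any real contract.

```python
# Sketch: compare measured availability against a contracted SLA target.
# The 99.9% target and credit tiers are illustrative assumptions.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def availability_pct(downtime_minutes: float) -> float:
    """Percentage of the month the service was up."""
    return 100.0 * (1 - downtime_minutes / MINUTES_PER_MONTH)

def service_credit(measured_pct: float, target_pct: float = 99.9) -> int:
    """Illustrative credit (% of monthly fee) for missing the target."""
    if measured_pct >= target_pct:
        return 0
    if measured_pct >= 99.0:
        return 10
    return 25

downtime = 300.0  # e.g. a single five-hour outage
pct = availability_pct(downtime)
print(f"availability: {pct:.2f}%, credit: {service_credit(pct)}% of fee")
```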


August 25, 2014  11:09 PM

VMworld 2014: What happened on Day 1

Archana Venkatraman Profile: Archana Venkatraman
Cloud Computing, Google, Open vSwitch, OpenStack, vCloud, VMware, VMware vSphere

On Day 1 of its annual conference VMworld 2014 themed “No Limits”, VMware unveiled its strategies around open cloud platform OpenStack and around container technology Kubernetes. It also launched new tools to extend its software-defined datacentre and hybrid cloud offerings.

Open software-defined datacentre

One of the significant announcements was VMware Integrated OpenStack, a service that gives enterprises (especially SMBs) the flexibility to build a software-defined datacentre on any technology platform (VMware or not).

The VMware Integrated OpenStack distribution is aimed at helping customers repatriate workloads from “unmanageable and insecure public clouds”. Take that, AWS.

Container technology and VMware infrastructures; Kubernetes collaboration

VMware is collaborating with Docker, Google and Pivotal to allow enterprises to run and manage container-based applications on its platforms.

At the annual conference, VMware said it has joined the Kubernetes community and will make Kubernetes’ patterns, APIs and tools available to enterprises. Kubernetes, currently in pre-production beta, is an open-source implementation of container cluster management.

With Google, VMware’s efforts will focus on bringing Kubernetes’ pod-based networking model to Open vSwitch to enable multi-cloud integration of Kubernetes.

“Not only will deep integration with the VMware product line bring the benefits of Kubernetes to enterprise customers, but their commitment to invest in the core open source platform will benefit users running containers,” said Joerg Heilig, VP of engineering, Google Cloud Platform. “Together, our work will bring VMware and Google Cloud Platform closer together as container-based technologies become mainstream.”

With Docker, it will collaborate to enable Docker Engine on VMware workflows. It will also work to improve interoperability between Docker Hub and VMware vCloud Air, VMware vCenter Server and VMware vCloud Automation Center.

New hybrid cloud capabilities

At VMworld, VMware released new hybrid cloud service capabilities and a new line-up of third-party mobile application services. The new capabilities include vCloud Air Virtual Private Cloud OnDemand, which offers customers on-demand access to vCloud Air. Another capability – VMware vCloud Air Object Storage – is aimed at providing users with scalable storage options for unstructured data. It will enable customers to scale easily to petabytes and pay only for what they use, according to the company.

It also launched mobile development services within VMware vCloud Air’s service catalogue.

Management as a service offerings

VMware also released two new IT management tools under its vRealize brand – for managing a software-defined datacentre and public cloud infrastructure services (IaaS).

VMware vRealize Air Automation is a cloud management tool that allows users to automate the delivery of application and infrastructure services while maintaining compliance with IT policies.

Meanwhile, VMware vRealize Operations Insight offers performance management, capacity optimisation and real-time log analytics. The tool also extends operations management beyond vSphere to an enterprise’s entire IT infrastructure – another sign that VMware is opening up its ecosystem to accommodate other virtualisation platforms.

Partnership with Dell on software-defined services

VMware has extended its collaboration with Dell to combine its NSX network virtualisation platform with the latter’s converged infrastructure products.

“Global organisations are adopting the software-defined datacentre as an open, agile, secure and efficient architecture to simplify IT and transition to the hybrid cloud,” said Raghu Raghuram, executive vice president, SDDC division, VMware. “The software-defined datacentre enables open innovation at speeds that cannot be matched in the hardware-defined world. As partners, VMware and Dell will advance networking in the SDDC, and collaborate to make advanced network virtualisation available to mutual customers.” 

Partnership with HP on hybrid cloud

VMware and HP have extended their collaboration to give momentum to users’ SDDC and hybrid cloud adoption. As part of the partnership, HP Helion OpenStack will support enterprise-class VMware virtualisation technologies.

The companies will also make a standalone HP-VMware networking solution generally available. Together, these collaborative efforts could help simplify the adoption of the software-defined datacentre and hybrid cloud with less risk, greater operational efficiency and lower costs.

All in all, it looks like VMware is opening up to competing platforms and warming to open source technologies, but retaining its standoffish traits when it comes to public cloud services.


August 20, 2014  3:53 PM

Microsoft Azure goes down for users across multiple regions, including Europe and Asia

Archana Venkatraman Profile: Archana Venkatraman
Cloud Computing, Microsoft, Satya Nadella, StorSimple, Windows Azure

Just when I thought to myself that cloud services must be improving – fewer outages have been reported this year than last – Microsoft’s Azure cloud service went down for many users, including European ones, earlier this week.

Microsoft’s Azure status page currently displays a chirpy: 

All good!

Everything is running great.

It also displays a bright green check beside its core Azure platform components, such as Active Directory, and popular cloud services, including its SQL Databases and storage services.

A snoop into its history page, however, shows that all wasn’t good aboard Azure on Monday and Tuesday. Users experienced full service interruptions and performance degradation across several services, including StorSimple, storage services, website services, backup and recovery, and virtual machine offerings.

For a brief period on Tuesday, August 19th, a subset of its customers in West Europe and North Europe using Virtual Machines, SQL Database, Cloud Services and Storage were unable to access Azure resources or perform management operations. Users accessing Azure’s Website cloud services in North Europe also faced connectivity issues.

WELCOME TO Microsoft® (Photo credit: Wikipedia)

The previous day, some of its customers across multiple regions were unable to connect to Azure Services such as Cloud Services, Virtual Machines, Websites, Automation, Service Bus, Backup, Site Recovery, HDInsight, Mobile Services, and StorSimple. 

Some of the services were down for almost five hours.

This week’s global outage follows last week’s (August 14th) Azure outage, when users across multiple regions experienced a full service interruption to Visual Studio Online. The news doesn’t bode well for CEO Satya Nadella’s “cloud-first” strategy.

Here is a detailed report on Azure’s latest datacentre outage.

Well, I may have tempted fate. Resilience and reliability are two words I’ll use sparingly to describe public cloud services. 


August 18, 2014  10:54 AM

PUE – the benevolent culprit in the datacentre

Archana Venkatraman Profile: Archana Venkatraman
Data Center, energy, PUE, SCADA, ScienceLogic

Internet of Things, big data and social media are all creating an insatiable demand for scalable, sophisticated and agile IT resources, making datacentres a true utility. This is prompting big tech and telecom companies to drift a bit from their core competency and build their own customised datacentres – take Telefonica’s €420m investment in its new Madrid datacentre.

But the mind-boggling growth of computing infrastructure is occurring amid shocking increases in energy prices. Datacentres consume up to 3% of global electricity and produce 200 million metric tonnes of carbon dioxide, at an annual cost of $60bn. No wonder IT energy efficiency is a primary concern for everyone from CFOs to climate scientists.

In this guest blog post, Dave Wagner, TeamQuest’s director of market development, who has 30 years of experience in the capacity management space, explains why enterprises must not be too hung up on PUE alone to measure their datacentre efficiency.

Measuring datacentre productivity? Go beyond PUE
– by Dave Wagner

In their relentless pursuit of cost effectiveness, companies measure datacentre efficiency with power usage effectiveness (PUE). The metric divides the total amount of power coming onto the datacentre floor by how much of that power is actually used by the computing equipment:

PUE = total facility energy / IT equipment energy

PUE is a necessary but not a sufficient indicator for gauging the costs associated with running or leasing datacentres.

While PUE is a detailed measure of datacentre electrical efficiency, it is one of several elements that actually determine total efficiency. In the bigger picture, focus should be on more holistic and accurate measures of business productivity, not solely on efficient use of electricity.

Gartner analyst Cameron Haight has talked about how a very large technology company owns the most efficient datacentre in the world, with a PUE of 1.06. This basically means that 94% of every watt that comes onto the floor actually gets to processing equipment. But this remarkably efficient PUE achievement does not tell us what the company does with all of that power, or how much total work is accomplished. If all that power is going to servers that are switched on but essentially idling, not accomplishing any useful work, what does PUE really tell us? Actual efficiency in terms of doing real-world work could be nearly zero even when the PUE metric, taken in isolation, indicates a well-run datacentre.

Datacenter (Photo credit: Wikipedia)

Boiled down, what companies end up measuring with PUE is how efficiently they are moving electricity around within the datacentre.

By some estimates, many datacentres are only using 10-15% of their electricity to power servers that are actually computing something. Companies should minimise costs and energy use, but nobody invests in a company solely based on how efficiently it moves electricity.
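To make this concrete, the sketch below uses made-up figures to compute PUE and then discounts it by server utilisation to estimate how much of the incoming power is doing useful work. The utilisation figure and the ‘useful-work share’ are illustrative assumptions, not an industry metric.

```python
# Sketch: why a good PUE can still hide a wasteful datacentre.
# All figures are made up for illustration.

total_facility_kw = 1060.0  # power entering the datacentre floor
it_equipment_kw = 1000.0    # power reaching servers, storage, network

pue = total_facility_kw / it_equipment_kw
print(f"PUE: {pue:.2f}")    # 1.06, a world-class score

# But if the servers are mostly idle, little of that power does work.
server_utilisation = 0.12   # 12% of server capacity doing real work

useful_share = (it_equipment_kw / total_facility_kw) * server_utilisation
print(f"share of incoming power doing useful work: {useful_share:.1%}")
# ~11.3% - a 'well-run' facility by PUE, mostly powering idle machines
```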

Datacentres are built and maintained for their computing capacity, and for the business work that can be done on them. I recommend correlating computing and power-efficiency metrics with the amount of useful work done and with customer or end-user satisfaction metrics. When these factors are optimised in a continuous fashion, true optimisation can be realised.

I’ve talked about addressing power and thermal challenges in datacentres for over a decade, and have seen progress made – recent statistics show a promising slowdown in datacentre power consumption rates in the US and Europe due to successful efficiency initiatives. Significant improvements in datacentre integration have helped IT managers control the different variables of a computing system, maximising efficiency and preventing over- or under-provisioning, both having obvious negative consequences.

An integrated approach to planning and managing datacentres enables IT to automate and optimise performance, power and component management, with the goal of efficiently balancing workloads, response times and resource utilisation against business changes. Just as the IT side analyses the relationships between the components of the stack – networking, server, compute and applications – the business side of the equation must always be an integral part of these analyses. Companies should always ask how much work they are accomplishing with the IT resources they have – unfortunately, that is often easier said than done. In the majority of datacentres and connected enterprises, the promise of continuous optimisation has not been fully realised, leaving plenty of room for improvement.

As datacentres grow in size and capabilities, so must the tools used to manage them. Advanced analytics have become essential to bridging IT and business demands, starting with relatively simple correlative and descriptive methods and progressing through predictive to prescriptive approaches. Predictive analytics are uniquely suited to understanding the nonlinear nature of virtualised datacentre environments.

These advanced analytic approaches enable enterprises to combine IT and non-IT metrics in such a powerful way that the data generated by the networked computing stack can become the basis for automated and embedded business intelligence. In the most sophisticated scenarios, analytics and machine learning algorithms can be applied in such a way that the datacentre learns from itself and generates insight and models for decision-making, approaching the level of artificial intelligence.

 


August 7, 2014  11:01 PM

What’s making Oregon the datacentre capital

Archana Venkatraman Profile: Archana Venkatraman
Facebook, Google, Hillsboro, Intel, Oregon

I am just back from Oregon, where I attended a workshop at Intel’s Hillsboro campus. What amazed me the most – apart from the most delicious Peruvian cuisine I had in Portland, of course – is Intel’s large presence in the area and the number of big datacentres in Oregon.

Intel is the biggest employer in the region and has multiple vast campuses there. It even has its own airport in Hillsboro, Oregon, from where it operates regular flights for employees to its Santa Clara headquarters. Several flights, each carrying up to 40 Intel employees, operate every day. The hotel I stayed at in Hillsboro told me that on any given day, about 70% of the people it serves are Intel-related.

Apart from Intel almost hijacking Oregon with its presence, the state is also home to many datacentre facilities. Facebook (its Prineville datacentre), Google (its first datacentre, The Dalles), Amazon (Boardman), Apple (also Prineville) and Fortune Datacentres (Hillsboro) all have large facilities in Oregon.

Here’s why:

Cost

One of the primary reasons many tech giants consider Oregon as the home for their datacentres is lower costs. Oregon does not have a sales tax, which means computer products, building materials and services are cheaper than elsewhere in the US. In addition, power – a main datacentre money-guzzler – is cheaper in Oregon. Furthermore, the local government lures tech giants with incentives such as tax breaks and subsidies. All these factors attract datacentre investment.

Prineville, Oregon (Photo credit: Wikipedia)

Talented workforce

Because of the tech culture of the region, many professionals develop server management and virtualisation skills. The emphasis on IT skills in the universities and Silicon Valley’s investment in regular training workshops make the workforce in the area more talented and skilled for datacentre management.

Climate

Oregon’s weather is comparatively mild. This makes the tricky task of datacentre cooling a little bit easier. It is simpler to devise cooling strategies for a facility when the ambient temperature does not vary widely. Oregon does not get baking hot like Texas or Kansas in the summer, nor does it get overwhelmingly snowed under in winter.

Connectivity

The vast stretches of fibre-optic cable that run even across Oregon’s mountains, lakes and deserts provide fast connections with millisecond latency. The state’s proximity to Silicon Valley is another draw for datacentre investment.

Geography, stability and security

Big cloud and IT service providers love political and economic stability and physical security, and Oregon gives them all three. The region is not prone to natural disasters such as volcanic eruptions, earthquakes or hurricanes – another big attraction for datacentre builders. Take Iceland, for instance: despite its promise of 100% green geothermal energy and fibre-optic connections to mainland Europe, many IT providers hesitate to set up datacentres there because of its vulnerability to natural disasters.

Oregon has seismically stable soil and, as part of the west coast, little to no lightning risk – lightning being one of the major causes of outages in the US.

As Google, which opened The Dalles in 2006 by investing $1.2bn, says, Oregon has the “right combination of energy infrastructure, developable land, and available workforce for the datacentre”.

I wonder what Oregon’s equivalent in Europe would be?


July 28, 2014  3:44 PM

AWS is not the only pretty one in the room anymore

Archana Venkatraman Profile: Archana Venkatraman
Amazon, AWS, Barbie, Google, IBM, Jeff Bezos, Microsoft, OpenStack

It may be too early to conclude that the party at AWS towers is over, but the cloud provider is definitely feeling the heat of competition and the commodity cloud price wars, its quarterly earnings report showed.


Amazon’s net sales increased 23% to $19.34bn, but it reported a second-quarter net loss of $126m and warned that sales could slow in the current quarter. Amazon’s business segment that includes AWS also saw growth drop to 38% year on year, after consistent growth rates of between 50% and 60% over the last two years.

Beautiful Bride Barbie – OOAK reroot (Photo credit: RomitaGirl67)

I still remember how Amazon founder Jeff Bezos, at the first ever (2012) AWS re:Invent conference in Vegas, said that a high-margin business is not the right one for AWS.   


There is no incentive to be efficient for businesses operating on high margins because they would make profits anyway, said Bezos.

“Operating a low-margin business is harder,” he said adding that the AWS business model is very similar to the retailer’s Kindle business model – where the money is not made when the device is sold, but when people use it and keep buying services for it. 

But the price cuts – which are becoming more frequent and deeper (65% cheaper) and driven more by market forces than by internal decisions – are becoming its biggest problem. Since 2008, AWS has slashed cloud services prices 42 times.

AWS has been leading the public cloud price war almost over-zealously, but other behemoths, including Microsoft and Google, which have equally deep pockets, have been quick to undercut one another in the race to the bottom on cloud services pricing.

Although the cloud market is still growing rapidly, AWS is finding that its share of the larger pie is shrinking, even while its user numbers are still growing. It looks like the growth is not enough to offset the price cuts – and this must be where the problem lies. Customers love discounts and price cuts, but investors don’t.

“With Microsoft and Google apparently now serious about this market, AWS finally has credible competitors,” says Gartner’s public cloud expert Lydia Leong.

In May 2014, Synergy Research Group explained how Microsoft has grown its cloud infrastructure services “remarkably in the last year and is now pulling away from the pack of operators chasing Amazon”.

“AWS is likely to continue to dominate this market for years, but the market direction is no longer as thoroughly in its control,” Leong says.

AWS is no longer the only pretty one in the room. It is having to make space for Google Cloud Platform, Microsoft Azure, OpenStack and IBM SoftLayer, and also for ferociously emerging players such as Digital Ocean and Profitbricks.


July 22, 2014  11:36 PM

Azure brings sunshine to Microsoft’s lacklustre earnings. And how!

Archana Venkatraman Profile: Archana Venkatraman
Gartner, Google, Microsoft, microsoft office 365, nokia, Satya Nadella

Satya Nadella is going to be a happy man, as his “mobile-first, cloud-first” strategy is gathering momentum. Microsoft’s cloud business has reported triple-digit year-on-year growth, the company’s earnings report for the fourth quarter ended June 30, 2014 showed.

Microsoft’s commercial cloud revenue grew 147% with an annualised run rate that exceeds $4.4bn (£2.58bn) even as the company’s overall profit was down 7%. 
“I’m proud that our aggressive move to the cloud is paying off,” said chief exec Nadella.

Satya Nadella, Microsoft CEO (Photo credit: tecnomovida)

Other cloud highlights of the Azure provider’s results included an 11% revenue growth in its Windows volume licensing sales and similar double-digit revenue growth for server products, including Azure, SQL Server and System Center.
Its Office 365 Home and Personal subscribers totaled more than 5.6 million, adding more than 1 million subscribers again this quarter. 
 “We are thrilled with the tremendous momentum of our cloud offerings with Office 365 and Azure both growing over 100% again,” said Kevin Turner, chief operating officer at Microsoft. 
 As Gartner’s research vice president, Merv Adrian told me, “In what was clearly a well-planned posture of demonstrating his command of the whole portfolio, Nadella delivered a strong, visionary picture of Microsoft’s ‘Digital work and life experiences’ stressing the power of its portfolio in enterprise offerings old and new.”

“There was good news in enterprise business — from SQL Server, from “All-up Dynamics” growth, with CRM nearly doubling, and with a commitment to expand Azure footprint and capacity, launch new services and deliver more hybrid cloud tiering,” Merv thinks.

While cloud offered a ray of sunshine to the company’s earnings, Microsoft blamed the Nokia acquisition for the dent in its profits.
Microsoft’s profit for the quarter April to June 2014 was $4.6bn (£2.7bn), compared with $4.97bn for the same period last year. The company said the Nokia division, which it completed acquiring in April, lost $692m.
 Last week, Microsoft said it will cut 18,000 jobs – more than 12,000 jobs related to the Nokia phone business division alone. This “restructuring plan to streamline and simplify its operations” is the most severe job cut in the company’s 39-year history. 
Microsoft laid claim to impressive cloud revenues in the first quarter of 2014 too, with analysts insisting that the software giant is “now pulling away from the pack of operators chasing Amazon”.
AWS was the lone leader in Gartner’s magic quadrant until June this year, when Microsoft joined its arch-rival in the leaders’ quadrant. AWS is beginning to face significant competition from Microsoft in the traditional business market, and from Google in the cloud-native market, noted Gartner analysts Lydia Leong, Douglas Toombs, Bob Gill, Gregor Petri and Tiny Haynes.
The biggest takeaway from Microsoft’s earnings announced today is that it is indeed crushing it in cloud sales and riding the cloud momentum.


July 18, 2014  4:45 PM

Cloud-first? Cabinet Office seeks £700m datacentre partner for ‘top secret’ data

Archana Venkatraman Profile: Archana Venkatraman
Cabinet office, Cloud Computing, Data Center, Home Office, Rackspace

The Cabinet Office and GDS (Government Digital Service) have issued a service contract notice seeking a private partner that can provide datacentre colocation services to handle UK government’s information classified as “official”, “secret” and “top secret”.

The government has earmarked up to £700m for the four-year datacentre infrastructure agreement.

“The operating environment is to be capable of housing computer infrastructure that initially handles information with UK Government security classification ‘official’ but there may be a future requirement for Data Centre Colocation Services that handle information with ‘secret’ and ‘top secret’ security classification,” the government document read. “The provision of secret and top secret [information] would be subject to separate security accreditation and security classification,” it added.

The facilities partner must be able to subscribe for a majority shareholding (up to 75% less one share) in the new private limited company established by the Cabinet Office to provide Data Centre Colocation Services – DatacentreCo.

But under the government’s cloud-first policy, many existing and new applications will move to the public cloud over the next few years. The Cabinet Office’s cloud-first strategy, announced last year, mandated the cloud as the first choice for all new IT purchases in government.

The new potentially £700m datacentre will host ‘legacy’ applications “not suitable or not ready for cloud hosting or for which conversion to cloud readiness would be uneconomic,” the document read.

Cabinet Office, 70 Whitehall, London (next to Downing Street) (Photo credit: Wikipedia)

The Cabinet Office wants the full spectrum of datacentre services – rack space, power facilities, network and security. The datacentre hosting the official and secret information will be spread across an area of 350 sq. metres hosting 150 standard 42U racks. This sounds like a modular datacentre requirement.

And it wants “at least two separate [facility] locations subject to appropriate minimum separation requirements”.  

Also on the government wish-list are datacentre compliance with security requirements, scalability, a proven track record over the last three years, performance certificates and specific latency performance requirements (less than 0.5 milliseconds) – to cater to the requirements of the initial users: the Department for Work and Pensions, the Home Office and the Highways Agency.

The main aim is to have a datacentre facility that is high-quality, efficient, scalable and transparent, delivered on a service-based (‘utility’) model – basically cloud-like, but not the cloud.

How long do you reckon we’ll have to wait before the government declares “serious over-capacity in datacentres” like it did in 2011?


July 11, 2014  4:18 PM

Cloud’s Hollywood moment – as a villain in Cameron Diaz’s Sex Tape

Archana Venkatraman Profile: Archana Venkatraman
Uncategorized

For those still wondering if cloud computing is really mainstream – even Hollywood thinks so. Cameron Diaz’s rom-com Sex Tape, releasing next Friday, is all about the dangers of the cloud.

Cameron Diaz (Photo credit: Wikipedia)

The movie stars Diaz and Jason Segel as a couple making a sex tape in an attempt to spice up their boring lives. The video inevitably makes it to the cloud through Segel’s iPad, on which it was filmed. The movie tracks how the couple desperately try to get the video off the cloud while embarrassingly fielding comments from their parents, bosses and even the mailman, who have all seen it.

Here’s some of the dialogue between Diaz (as Annie) and Segel (as Jay):

Annie: (walks in) Honey, that sounds familiar, is that our…

Jay: You know the Cloud?

Annie: Stares ominously before yelling F@#$.

Jay: It went up! It went up to the cloud

Annie: And you can’t get it down from the cloud?

Jay: Nobody understands the cloud. It’s a f#$@ing mystery.

Whether they succeed in wiping their content off the cloud, we’ll know only on 18th July. But it looks like a big struggle, with Jay and Annie taking desperate measures like nicking devices belonging to their friends and families and even breaking network infrastructure to get the tape off the cloud.

Maybe Jay and Annie are showing, in a satirical manner, how the cloud is a one-way street – easy to get something up (even inadvertently), but damn hard to get it off!

Here’s the trailer of Sex Tape starring Cameron Diaz, Jason Segel and The Cloud:

 


