Data center facilities pro


May 4, 2009  1:37 PM

Manos explains move to Digital Realty Trust

Mark Fontecchio

Michael Manos, the former data center pro at Microsoft, lays out his reasons for moving to Digital Realty Trust in a blog post this morning.

There are two main ones: 1) Digital is in a position to effect change in the data center industry in a major way; and 2) it’s led by forward-looking people who are passionate about the industry. Those sound like good reasons to me. Manos expands on the “passionate” point:

Let me reiterate that passionate point a moment, this is not some real estate company looking to make a quick buck on mission critical space. I have seen enough of those in my career. This is a firm focused on educating the market, driving innovation in application of technology, and near zealot commitment on driving efficiencies for their customers. Whether it’s their frequent webinars, their industry speaking engagements, or personal conversations they are dedicated to this space, and dedicated on informing their customers. Even when we have disagreed on topics or issues in the past, it’s always a great respectful conversation. In a nutshell, they GET IT.

Digital certainly is out at industry events, speaking, running booths, and being a strong presence. And they do hold regular webinars — once every couple of months, or more often — a few of which I’ve listened in on and found helpful. The company is considered the largest data center real estate firm in the world, with a presence all over the U.S. and Europe (you can see all their locations here).

Manos also lauded Digital for being big on quick data center deployment, something he worked hard on with Christian Belady while at Microsoft. The modular data center is a hot topic now, particularly among large Web companies and colocation businesses that view the data center as a set of Lego building blocks for assembling capacity as quickly as possible.

Manos revealed that one of his goals at Digital Realty Trust will be to “develop the ability to deliver data center capacity start to finish in 16 weeks.” That’s less than four months. The normal data center planning and construction process usually takes four to six times longer (roughly 15 to 22 months), so that goal is worth keeping a close eye on.

April 21, 2009  2:58 PM

Redefining a “green” data center

Mark Fontecchio

It is all well and good to talk about building green data centers here in the United States, but what is happening overseas?

Though parts of the video teleconference at The Uptime Institute’s show last week were dull, it offered some genuine insights into how different parts of the world run their data centers. The video conference had participants in New York, San Francisco, Italy, and South Africa, among other places. When talk turned to data center energy efficiency, the folks in South Africa had some interesting things to say.

Some countries in Africa have unreliable power grids. Nigeria, one nation mentioned during the conference, sticks out. Many data centers there run 24 hours a day, 7 days a week, on diesel generators. That’s right: diesel generators. It’s the only way they can guarantee uptime. Talk about running up your carbon footprint.
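
For a rough sense of scale, here’s a back-of-envelope sketch of what running a facility entirely on diesel looks like in carbon terms. Every input is my own illustrative assumption, not a figure from the conference:

```python
# Back-of-envelope CO2 estimate for a data center running 24x7 on diesel.
# Every input is an illustrative assumption, not a figure from the conference.

it_load_kw = 500            # assumed IT load for a midsize facility
pue = 2.0                   # assumed power usage effectiveness
hours_per_year = 24 * 365   # running around the clock

# A diesel genset burns very roughly 0.3 liters of fuel per kWh generated,
# and burning a liter of diesel emits about 2.68 kg of CO2.
liters_per_kwh = 0.3
kg_co2_per_liter = 2.68

total_kwh = it_load_kw * pue * hours_per_year
tons_co2 = total_kwh * liters_per_kwh * kg_co2_per_liter / 1000

print(f"{total_kwh:,.0f} kWh/year -> roughly {tons_co2:,.0f} metric tons of CO2")
```

On those assumptions, a single midsize site comes out around 7,000 metric tons of CO2 a year, which puts the South African comment in perspective.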

As the guy from South Africa said, “it goes totally against the green movement we’re talking about.”

But Robert Aldrich, the senior manager of the efficiency assurance program at Cisco, said it just opens up other opportunities for electricity generation. Solar power, for example, becomes a much more attractive option for Nigerian data centers.


April 20, 2009  7:51 PM

NetApp the “unnamed customer” with $1.4M rebate

Mark Fontecchio

Another update to the story on data center utility rebate programs: I just heard from Joe Miller, the facilities operations manager for NetApp in North Carolina. Miller told me that NetApp is the “unnamed customer” PG&E’s Mark Bramfitt mentioned when talking about a $1.4 million rebate the utility gave to a customer last year.

NetApp received the money for the design of a new data center in Sunnyvale, Calif. Energy efficiency measures taken at the facility include airside economizers, flywheel UPSes, a variable-speed primary chiller plant, and more efficient transformers. PG&E estimated at the time that the NetApp data center design would save about 11 million kilowatt-hours of energy every year, which equates to savings of about $1.2 million annually and a carbon footprint reduction of almost 3,400 tons.
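
Those numbers hang together, for what it’s worth. Here’s a quick sanity check; the implied rates are my own back-of-envelope inference, not anything PG&E published:

```python
# Sanity check on the PG&E estimates. The implied rates below are inferred
# from the published totals, not numbers PG&E stated directly.

kwh_saved = 11_000_000        # estimated annual energy savings
dollars_saved = 1_200_000     # estimated annual cost savings
tons_co2_saved = 3_400        # estimated annual carbon reduction

implied_rate = dollars_saved / kwh_saved                 # ~$0.109/kWh
implied_intensity = tons_co2_saved * 1000 / kwh_saved    # ~0.31 kg CO2/kWh

print(f"Implied electricity rate: ${implied_rate:.3f}/kWh")
print(f"Implied grid carbon intensity: {implied_intensity:.2f} kg CO2/kWh")
```

Both implied figures are plausible for Northern California, which suggests the estimates are internally consistent.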

Thanks to Miller for identifying this “unnamed customer.”


April 20, 2009  12:37 PM

Utility: Ratepayers wouldn’t stand for data center rebate program

Mark Fontecchio

Following my story on data center utility rebates, I got the following response from Chris Johnston, the national critical facilities chief engineer for Syska Hennessy Group, which does a lot of data center design and construction work:

Some thoughts about this process.

  1. This rebate money is not PG&E’s (Pacific Gas & Electric) money – that money can only come from its shareholders, and you can bet that the shareholders aren’t paying for this. The rebate money is paid out of the rate base, so Microsoft, NetApp, Bank of America and similar users in the same rate class paid for the rebate.
  2. The next time another user in the eBay and KP (Kaiser Permanente) rate class gets a rebate, eBay and KP get to pay for it.
  3. Eventually, the smarter users who participate get an incentive that is paid for by the not-so-smart users.
  4. This works in PG&E’s service area now, but wouldn’t work elsewhere in the U.S. I talked with a major utility elsewhere and they said that their ratepayers wouldn’t stand for something like this.
  5. The more virtualization is done, the less a data center remains a constant load.

In particular, I find it interesting that some utilities say they wouldn’t even be able to implement such a program because their ratepayers wouldn’t put up with it. That could be quite a roadblock to data center efficiency in some areas.


April 13, 2009  9:51 PM

Google on data center efficiency: Stop making excuses

Mark Fontecchio

NEW YORK — Bill Weihl, Google’s green energy czar, told a group of data center operators here that some of them need to stop making excuses for not improving their facilities’ energy efficiency.

After years of secrecy around how its data centers operate, Google has now drawn back the curtain to show how efficient its facilities are. But during a panel discussion at The Uptime Institute’s conference in New York today, some questioned whether all data centers should be cut from the same mold.

In particular, the question was whether the data center power usage effectiveness (PUE) of some businesses — financial institutions, for example — should be compared to those of search engines such as Google.

“Should a bank have the same PUE as a search engine?” asked Ken Brill, Uptime founder and executive director. “The answer is no.”

The reasoning is that banking and financial applications require a higher level of uptime than search queries, and thus more redundancy, which leads to lower efficiency. But Weihl questioned that logic.
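
For context, PUE is total facility power divided by IT power, so redundant UPS and cooling gear that draws electricity without doing computing work pushes the number up. Here’s a minimal sketch of that arithmetic, with load figures invented purely for illustration:

```python
# PUE = total facility power / IT equipment power.
# The load figures are invented for illustration, not measurements
# from Google or any other facility.

def pue(it_kw: float, cooling_kw: float, power_losses_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return (it_kw + cooling_kw + power_losses_kw) / it_kw

# A lean site with minimal redundancy in power and cooling.
lean = pue(it_kw=1000, cooling_kw=150, power_losses_kw=50)        # 1.20

# A heavily redundant site: duplicated UPS and cooling paths draw
# power even when idle, inflating the ratio.
redundant = pue(it_kw=1000, cooling_kw=400, power_losses_kw=200)  # 1.60

print(f"Lean PUE: {lean:.2f}, redundant PUE: {redundant:.2f}")
```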

“We actually have some Sarbanes-Oxley requirements,” he said. “We’re not just a search engine company. We also run very reliable data centers that I think any data center operator here would be proud to run.”

Weihl later added that the discussion sounded like “people making excuses for why the EPA or DOE should not push hard for a standard because, hey, we’re different.”

“To me, not to be too combative, but that sounds like an excuse for not doing better.”

Currently the federal government is working on developing an Energy Star rating for data centers. Michael Zatz, the manager of the Energy Star commercial buildings program, sees the potential for different categories of data centers, but would prefer that those categories be defined by what kind of work the data centers perform, and not necessarily by what industry they’re in or how they identify themselves.


April 8, 2009  11:31 PM

Michael Manos leaving Microsoft to join Digital Realty Trust

Mark Fontecchio

Michael Manos, Microsoft’s senior director of data center services, will leave the company to join Digital Realty Trust’s data center operations.

Manos has been a big part of Microsoft opening its data centers to public view and advocating for measuring energy consumption. He was also one of the leaders of Microsoft’s containerized data center strategy, and introduced a conceptual open-air data center design. But not everything is rosy in the Microsoft data center world. The company announced earlier this year that it was slowing construction of its Chicago and Iowa facilities, and cutting data center capital expenditures by $300 million.

At Digital Realty Trust, Manos will serve as the senior vice president of technical services, overseeing data center construction and design worldwide. Digital is the largest landlord of data center properties in the world.

In a press release, Digital CEO Michael Foust said that Manos “was a tremendous innovator at Microsoft, and his role here at Digital Realty Trust will have an even bigger impact by enabling our customers to directly take advantage of his broad expertise in data center architecture, construction and operations.”

Manos is expected to join Digital Realty Trust in early May. In the release, he is quoted as saying that Digital “has assembled an incredibly talented team of experts in datacenter construction, energy efficiency and operations who have been on the front line of modernizing the way people around the world design and run datacenters. Joining Digital Realty Trust gives me a unique opportunity to play a significant role in creating this new vision of the datacenter and bringing those advancements to the industry as a whole.”

Chris Crosby is also a senior VP of technical services at Digital, and has been the public face of the company at industry shows, giving many technical presentations. Once Manos joins Digital, the two will work side by side.

It is unclear whether or how Microsoft will fill the vacancy Manos leaves behind.


April 6, 2009  6:25 PM

Sorting through the Google data center summit hype

Mark Fontecchio

Over the past week, there has been a lot of discussion online about the Google data center energy summit held at the company’s Mountain View facility last week. In particular, there has been a flurry of activity pointing to a video tour inside a Google container-based data center. But let’s step back and take a look at what information was actually new.

Container-based data center

Google confirmed what everyone already knew — that the company has a container-based data center. Robert X. Cringely reported this back in November 2005, so it’s not exactly news. But it’s the first time Google actually confirmed the rumors and showed a sneak peek inside. You can take a look at the video below, taken by Data Center Knowledge:

[Video: a look inside a Google container-based data center: http://www.youtube.com/v/bs3Et540-_s]

Pretty cool stuff. But as James Hamilton, an engineer at Amazon, wrote in a recent blog post, it’s interesting to note that Google built this container-based data center but then never returned to that design. In fact, Hamilton thinks the data center design isn’t optimized for shipping containers.


March 18, 2009  1:28 AM

Update on AFCOM Data Center World figures

Mark Fontecchio

Just now I received a call updating me on the attendance figures from AFCOM’s Data Center World. As it turns out, there were 822 attendees, about 95% of whom were end users. That makes about 781 end users, within the 700-800 range I wrote about last week in a post about the AFCOM Data Center World show.

Also, there were 275 exhibiting companies and 345 booths, according to an AFCOM spokesperson. The spokesperson didn’t know how many exhibitor personnel there were, but from talking to people from AFCOM and vendors at the show, I heard there were more than 900. That makes sense, as it works out to a little more than three people per exhibiting company.

I also said in my original post that only one of about a dozen AFCOM board members is an end user. I was wrong. The representatives from Intel and Nortel work for vendor companies but are themselves data center users, and I left them out. There are actually three end users on the board.

Needless to say, the AFCOM leadership is upset because I said the show was vendor-heavy, but the fact is that it was. Of the 40 educational sessions in four tracks — best practices, data center management, emerging technology, and facilities/greening — 35 were run by vendors or consultants.

In a down economy, I realize there’s a higher chance the balance will tip toward vendors. But I did hear from users who thought there was too much vendor presence at the show, and not enough end-user presence, especially in the educational sessions. I stand by my hope that AFCOM gets more end users to run sessions at future shows.


March 16, 2009  2:41 PM

Multi-tiered data centers

Mark Fontecchio

Building a data center with one level of redundancy across the facility might seem foolish to some. Do your Web servers need to have the same uptime as your mainframe?

And expensive, too. Peter Gross, vice president of critical facilities at Hewlett-Packard and CEO of EYP Mission Critical Facilities before HP bought the firm, said that a Tier 4 data center costs about $3,500 per square foot, while a Tier 1 facility costs about $1,000 per square foot.

“With a 50,000-square-foot data center, the cost of a Tier 4 would be about $180 million,” he said. “But if you could do half at Tier 4 and half at Tier 2, it might cost $140 million. So you could save $40 million right there.”

That is the idea behind a new offering from HP and EYP: the ability to build multi-tiered data centers. Gross said the concept consists of building modular data center blocks, each with its own level of redundancy. You can connect them in different backup power configurations, such as N+1 or 2N, and thus provision your data center at a finer grain, according to what suits your business.
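
Gross’s arithmetic is easy to reproduce. Here’s a minimal sketch; note that he gave only the Tier 1 and Tier 4 rates, so the Tier 2 rate below is my assumption, chosen to make his $140 million figure work out:

```python
# Rough cost model for a mixed-tier build, following Gross's example.
# Only the Tier 1 and Tier 4 per-square-foot rates came from Gross;
# the Tier 2 rate is assumed to make his $140M example work out.

COST_PER_SQFT = {1: 1_000, 2: 2_100, 4: 3_500}

def build_cost(blocks):
    """Total cost for a mapping of tier level -> square footage."""
    return sum(COST_PER_SQFT[tier] * sqft for tier, sqft in blocks.items())

all_tier4 = build_cost({4: 50_000})         # $175M -- "about $180 million"
mixed = build_cost({4: 25_000, 2: 25_000})  # $140M

print(f"All Tier 4: ${all_tier4/1e6:.0f}M, mixed Tier 4/2: ${mixed/1e6:.0f}M")
```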

Though HP is calling it a multi-tier specification, the redundancy definitions won’t have anything to do with The Uptime Institute’s tier classification system, the de facto standard on data center reliability. Gross said they’re using the word “tiers” simply because it’s what data center operators are used to hearing. I suspect, though, that using the word might confuse potential customers.

Gross also said that the redundancy levels will not be restricted to the four-level system from Uptime. Instead, a customer will tell HP EYP, for example, that it wants a 2% failure probability in the next five years for this portion of the data center, and a 7% failure probability in the next five years for that portion of the facility. Then HP EYP will go off and design a data center to that specification, and (presumably) draw up some kind of contract that solidifies it.
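
To see how redundancy choices map to that kind of failure-probability target, here’s a toy reliability model. The per-unit failure probability is invented, failures are assumed independent, and real designs involve far more detailed engineering, but it shows the shape of the tradeoff:

```python
# Toy model: probability that a bank of generators can't carry the load.
# Assumes independent unit failures; the per-unit probability is invented
# for illustration and is not an HP EYP figure.
from math import comb

def bank_failure_prob(n_needed, n_installed, p_unit):
    """P(fewer than n_needed units available), independent failures."""
    p_up = 1 - p_unit
    # Sum P(exactly k units available) for every k below the requirement.
    return sum(
        comb(n_installed, k) * p_up**k * p_unit**(n_installed - k)
        for k in range(n_needed)
    )

p = 0.10  # assumed chance a single unit fails when called on
print(f"N   (2 needed, 2 installed): {bank_failure_prob(2, 2, p):.3f}")   # 0.190
print(f"N+1 (2 needed, 3 installed): {bank_failure_prob(2, 3, p):.3f}")   # 0.028
print(f"2N  (2 needed, 4 installed): {bank_failure_prob(2, 4, p):.4f}")   # 0.0037
```

Each added layer of redundancy cuts the failure probability by a factor of about seven in this toy example, which is why a target like 2% versus 7% translates fairly directly into how much backup gear gets built.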


March 12, 2009  11:43 PM

Sorting through data center utility rebates

Mark Fontecchio

There’s a good chance that your utility company has an incentive program in place if you boost your data center energy efficiency. But sorting through the programs can be a hassle, and so can the back-and-forth with the utility company.

In addition, said Adam Fairbanks, vice president of data center services for Bluestone Energy Services, data center managers have enough on their plates. They’re more concerned with keeping their facilities online than with hunting down utility rebates.

Bluestone is a consulting firm that serves as a middleman between data centers and utility companies, looking to land rebates for its data center clients. By coming into the building and doing an audit, Bluestone can determine whether a more detailed technical assistance study is needed, and then report on how much money you can save through the utility’s rebate program.

Fairbanks said that utilities will usually pay for half of this assistance study, and in the end, the utility company itself sends the end user a letter detailing how much of a rebate the utility will give if you move forward with energy efficiency measures.

Data center utility rebate programs from Pacific Gas & Electric have made the most headlines in the data center world, but Fairbanks said utilities up and down the East Coast, in the Midwest and on the West Coast offer incentive programs. Not all of them are as targeted as PG&E’s, which, for example, has one specifically for server virtualization projects. Most simply give a rebate based on how much power you can prove you’ll save through an energy efficiency project.
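
As a rough illustration of how such a rebate might be sized, here’s a sketch of a virtualization project. The incentive rate and all project figures are my assumptions, not any utility’s published terms:

```python
# Rough sizing of a custom efficiency rebate for a virtualization project.
# The incentive rate and all project figures are assumptions, not any
# utility's published terms.

servers_retired = 200         # physical servers removed via virtualization
watts_per_server = 400        # assumed average draw per server (IT load)
pue = 1.8                     # each IT watt carries cooling/power overhead
hours_per_year = 24 * 365
incentive_per_kwh = 0.08      # assumed one-time incentive, $ per kWh saved

kwh_saved = servers_retired * watts_per_server * pue * hours_per_year / 1000
rebate = kwh_saved * incentive_per_kwh

print(f"Estimated savings: {kwh_saved:,.0f} kWh/year")
print(f"Estimated one-time rebate: ${rebate:,.0f}")
```

On those assumptions, retiring 200 servers saves about 1.26 million kWh a year and earns a rebate in the neighborhood of $100,000.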

Is hiring Bluestone or another consultant worth it? It could be, both by sparing you the headache and if Bluestone can show that the rebate will net you more than you pay Bluestone in the first place. Then again, if you have the in-house expertise and knowledge to do it yourself, go for it. Either way, exploring your utility’s rebate programs is a great way to pick up some extra savings.

