Data center temperature policies

20 pts.
Tags:
Cisco
Cooling systems
Data Center administration
Data center cooling
Data Center in 2010
Data Center management
Data Center solution
I am involved with an energy efficiency project where the server rooms in 15 buildings lost their servers when a network was installed. These rooms now hold only one to three routers and switches each. The buildings are used only four days a week, and we want to implement building-wide cooling standards of 74 degrees occupied with a setback to 85 degrees during unoccupied periods. Do you know of any problem with these rooms having the same cooling set-points as the rest of the spaces in the building? The client seems insistent that these rooms be kept at 72 degrees (citing TIA-569-B), but that could be because that's the temperature they were kept at when the rooms still housed servers. This could be an example of "we always did it that way, so we always will."

Google searches yield plenty regarding temperature in rooms with servers in them but nothing for rooms without. Cisco's online equipment specs for routers and switches cite maximum operating temperatures of 104 degrees. It's in the desert, where humidity is typically 4 to 14%.

What is the risk of letting routers and switches experience swings from 74 to 85 degrees? I tried asking Cisco, but that proved an exercise in futility.
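For what it's worth, here is a minimal sanity-check sketch in Python comparing the proposed set-points against the 104-degree maximum cited above. The 32-104 F (0-40 C) operating range is an assumption based on typical published specs, and the script is illustrative, not a Cisco recommendation:

```python
# Compare proposed room set-points against an assumed Cisco operating
# range topping out at 104 F (40 C), per typical published equipment specs.

CISCO_MAX_F = 104.0  # assumed maximum operating temperature

def f_to_c(temp_f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

setpoints_f = {
    "occupied": 74.0,
    "unoccupied setback": 85.0,
    "client request": 72.0,
}

for label, temp_f in setpoints_f.items():
    margin = CISCO_MAX_F - temp_f
    print(f"{label}: {temp_f:.0f} F ({f_to_c(temp_f):.1f} C), "
          f"{margin:.0f} F below the assumed {CISCO_MAX_F:.0f} F maximum")
```

Even the 85-degree setback leaves roughly 19 degrees of headroom against the cited maximum.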



Software/Hardware used:
Cisco

Answer Wiki


From the poster of the question:
Manufacturers of IT equipment have acknowledged the need for energy efficiency even in IT facilities by incorporating algorithms to vary the fan speed in their units. If ambient temperature increases, the fans maintain component temperature by simply moving more air across the components. As noted in ASHRAE's 2008 guide, page 3, discussing the high-side limit: "The concern that increasing the IT inlet air temperature might have a significant effect on reliability is not well founded. An increase in inlet temperature does not necessarily mean an increase in component temperature."
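To illustrate the kind of behavior ASHRAE is describing, here is a minimal proportional fan-speed sketch in Python. The base temperature, gain, and toy thermal model are all invented for illustration and do not reflect any vendor's actual firmware:

```python
# Illustrative proportional fan-speed control: as inlet temperature rises,
# fan speed (airflow) rises, holding component temperature roughly flat.
# Every constant here is made up for demonstration purposes.

BASE_INLET_F = 68.0      # inlet temperature at which fans idle at minimum speed
COMPONENT_RISE_F = 30.0  # component rise above inlet at minimum airflow

def fan_speed_pct(inlet_f: float, gain: float = 5.0) -> float:
    """Fan speed grows proportionally with inlet temperature above the base."""
    return min(100.0, 30.0 + gain * max(0.0, inlet_f - BASE_INLET_F))

def component_temp_f(inlet_f: float) -> float:
    """Toy thermal model: more airflow shrinks the component-over-inlet rise."""
    airflow_factor = fan_speed_pct(inlet_f) / 30.0  # relative to minimum speed
    return inlet_f + COMPONENT_RISE_F / airflow_factor

for inlet in (74.0, 80.0, 85.0):
    print(f"inlet {inlet:.0f} F -> fans {fan_speed_pct(inlet):.0f}%, "
          f"component ~{component_temp_f(inlet):.0f} F")
```

In this toy model the component temperature stays within a few degrees even as the inlet climbs from 74 to 85 degrees, which is exactly the distinction ASHRAE draws between inlet temperature and component temperature.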

Google "ASHRAE Extended Environmental Envelope August 1 2008", or google "ASHRAE AND IT Room Temperatures".

The existing buildings in the project vary in age, shell construction, insulation attributes, and HVAC equipment capacity. Some have corrugated metal exteriors with varying ceiling heights to accommodate varied trade and craft activities. Built for specific purposes, many were provided with only a modicum of heating or cooling capacity, yielding modest comfort relative to winter or summer conditions outside, but they were never intended to achieve or maintain office temperatures, let alone server room temperatures. Some buildings have conditioned office space built out inside the shell, and some of these do have cooling and heating separate from the modicum provided for the trades and crafts people, but even some of these situate the communications equipment room in such a way that a constant temperature was not achievable in the past and remains unachievable now. Indeed, at one building with a porous exterior shell and minimal cooling equipment, data loggers indicate that when it was 98 degrees outside on an unoccupied weekend, the swamp coolers maintained only 92 degrees next to the data equipment! It has been that way for a few years now.
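As an aside, logger observations like that one are easy to summarize. Here is a minimal sketch, assuming the logger exports hourly Fahrenheit readings as a plain list; the sample values below are invented:

```python
# Summarize data-logger readings: peak temperature and hours above a threshold.
# The readings below are invented examples; substitute the real logger export.

THRESHOLD_F = 85.0  # the proposed unoccupied setback

readings_f = [78, 81, 84, 88, 90, 92, 91, 87, 83, 80]  # hourly samples

peak = max(readings_f)
hours_above = sum(1 for t in readings_f if t > THRESHOLD_F)

print(f"peak: {peak} F")
print(f"hours above {THRESHOLD_F:.0f} F: {hours_above} of {len(readings_f)}")
```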

Given these existing-condition realities at the facilities, the client already has practical experience operating communications room equipment through temperature fluctuations. If there have been no failures, it is likely because of the manufacturers' built-in durability at high temperatures or because, as ASHRAE states, on-board algorithms increase fan speed in the equipment to maintain component temperature.

Whatdya’ think?

Discuss This Question: 3 Replies

 
  • Labnuke99
    One thing to remember is that the room temperature is not going to reflect the internal temperature of the servers, routers, etc. So the 85-degree extreme sounds very high for these types of devices. Servers and drive arrays like their temperatures kept as steady as possible, and cycling temperatures up and down can cause premature failures. I would recommend setting a constant temperature for these spaces where servers are in use.
  • jinteik
    I recommend that any equipment (servers, firewalls, routers, switches, etc.) always be kept in a cool room. I agree with Labnuke99 that room temperature will not be the same as internal hardware temperature. With a cooler room, I believe your devices will also last longer. If the room is hot, a server can shut itself down when it reaches its maximum temperature, too.
  • Labnuke99
    I would say it depends on the end-user organization's tolerance for equipment failure and downtime. What are the MTBF and MTTR? What is the recovery point objective (RPO) as well? (See the availability sketch below for how MTBF and MTTR relate to uptime.)
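For context on the MTBF/MTTR point above: steady-state availability is commonly estimated as MTBF / (MTBF + MTTR). A minimal sketch, with the hour figures invented as examples rather than taken from any vendor's data:

```python
# Steady-state availability from MTBF and MTTR: MTBF / (MTBF + MTTR).
# The hour figures below are invented examples, not vendor data.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time the equipment is expected to be up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

mtbf = 200_000.0  # example mean time between failures, in hours
mttr = 4.0        # example mean time to repair, in hours

a = availability(mtbf, mttr)
print(f"availability: {a:.6f} ({a * 100:.4f}%)")
print(f"expected downtime: {(1 - a) * 8760:.1f} hours per year")
```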
