Maximum Operating Temperature Of iSeries / P Series Servers

Tags:
Data center cooling
iSeries data center
Raised floors
I work in a Data Center with a raised floor. There is a 10 ton unit that blows air from the ceiling and a 5 ton unit that blows air only under the floor. For a backup unit, we had a 5 ton unit that blows air from the ceiling, but that unit broke and facilities refuses to repair it, since the plan is to replace it with a 10 or 15 ton unit (which may take months of red tape before it is finally installed).

Over the years the heat load in this Data Center has grown tremendously. When the 10 ton unit failed a few months ago and only the 5 ton backup was working, the temperature in this room shot up to 90 degrees. So if the 10 ton unit fails now (without a working backup), the temperature will surely go over 100 degrees.

My question is: at what temperature can the iSeries and P Series operate without damaging the hardware? Of course we would open the floor tiles, open the doors, and bring in large fans and portable A/C units, but our iSeries and P Series servers share the room with a massive server farm (over 100 servers), and a lot of these are blade servers that generate heat like a furnace. My biggest concern is this happening over a weekend, when I won't be able to get to the room in time to save the Data Center. My server guy thinks the servers will survive even if the temperature hits 140. I think it's safe to say he is dead wrong. So, at what temperature can we expect hardware problems to occur?
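To put the tonnage in perspective, here is a quick back-of-the-envelope check: one ton of cooling removes roughly 12,000 BTU/hr, or about 3.5 kW. The sketch below is illustrative only; the per-server wattage is an assumed figure, not a measurement from this room, so substitute real power readings before relying on it.

```python
# Back-of-the-envelope cooling check. The server count comes from the
# question above; the average draw per server is an ASSUMED figure for
# illustration -- measure your own racks for a real number.

TON_TO_KW = 3.517          # 1 ton of refrigeration ~ 12,000 BTU/hr ~ 3.517 kW

def cooling_kw(tons: float) -> float:
    """Heat-removal capacity of an A/C unit, in kW."""
    return tons * TON_TO_KW

servers = 100              # "over 100 servers" per the question
avg_watts_per_server = 350 # ASSUMPTION: rough average for 1U/blade servers

heat_load_kw = servers * avg_watts_per_server / 1000.0

print(f"Estimated IT heat load: {heat_load_kw:.1f} kW")
print(f"10 ton primary unit:    {cooling_kw(10):.1f} kW")
print(f"5 ton backup unit:      {cooling_kw(5):.1f} kW")
# Under these assumptions the 5 ton unit (~17.6 kW) covers only about
# half of a ~35 kW load, which is consistent with the 90-degree spike
# described when the 10 ton unit went down.
```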

Answer Wiki


As a facilities manager for over 20 years, and not knowing your business requirements, structures or politics, my first thought is that facilities needs a kick in the bum for two reasons. First, they are not providing an environment that allows your systems to run in the required redundant mode (if critically required), whether due to insufficient capacity or to not making the required short-term repairs.
Second, from your description the computer room air conditioning design is not efficient and not the right system for a modern (or last 10 years) data centre. Because you have both under-floor supplied air and ceiling supplied air (unless it is a Liebert type or similar correctly designed and constructed hot/cold aisle system), the air flows are competing with each other and creating a mixed air temperature, instead of a correct cold supply to the servers and a correct hot air return to the A/C units, with the minimum amount of hot and cold air mixing.
While the mixed air may still remove the total heat from the room, the actual air reaching the servers is hotter than it needs to be, so efficiency drops; and when you have A/C unit failures you have less ability to manage the higher temperatures, because the mixed air entering the servers is hotter than it would be if the air streams were correctly separated. In other words, less redundant ability to handle failures.
In relation to server temperatures, what really matters is the operating temperature of the electronic components on the cards in the servers. Once these reach about 60degC (140degF) they are fried. The server's ability to remove heat from those components comes down to the server design and the amount of air flow its fans provide.
In practice I have found that nearly all servers are OK until the air entering the server reaches 32degC (about 90degF). A lot of servers now specify supply temperatures of up to 38 to 40degC (100 to 104degF).
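Since those inlet-air limits are the practical thresholds, a small watchdog that reads the inlet sensor and alerts before you reach them can buy time on a weekend. A minimal sketch, assuming ipmitool is installed and the BMC exposes a sensor named "Ambient Temp" (sensor names vary by vendor, so check your own `ipmitool sensor` output); the thresholds mirror the figures in the answer above:

```python
# Minimal inlet-temperature watchdog. ASSUMPTIONS: ipmitool is installed,
# the BMC reports an ambient/inlet sensor under the name below, and the
# 32C / 38C thresholds (per the answer above) suit your hardware.
import subprocess
import time

SENSOR_NAME = "Ambient Temp"   # ASSUMPTION: vendor-specific sensor name
WARN_C = 32.0                  # most servers are OK up to about here
CRIT_C = 38.0                  # lower end of many supply-air spec ceilings

def read_inlet_temp() -> float | None:
    """Parse the inlet temperature (deg C) out of `ipmitool sensor`."""
    out = subprocess.run(["ipmitool", "sensor"],
                         capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        fields = [f.strip() for f in line.split("|")]
        if fields and fields[0] == SENSOR_NAME:
            try:
                return float(fields[1])
            except (IndexError, ValueError):
                return None
    return None

while True:
    temp = read_inlet_temp()
    if temp is None:
        print("WARNING: could not read inlet sensor")
    elif temp >= CRIT_C:
        print(f"CRITICAL: inlet {temp:.1f}C -- start shutting servers down")
    elif temp >= WARN_C:
        print(f"WARNING: inlet {temp:.1f}C -- above typical safe limit")
    time.sleep(60)   # poll once a minute
```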
One thing that may help slightly is to try to increase the air blown through the server.

Discuss This Question: 1 Reply

 
  • Batman47
    Thank you Dave. I don't think we have to worry about an air mixing problem from the 5 ton unit, because you can't even feel the air blowing under the floor through the perforated tiles. The vendor recommended the smaller unit blow under the floor to prevent a 'dead rat' smell. People laughed at that, but when the 5 ton unit failed he was absolutely correct. Our primary unit works well, especially when the 2nd stage kicks in (2nd compressor).

    We do have hot and cold aisles: the cold air blows in from the ceiling unobstructed (registers removed) down in front of the server racks. Of course, there are many ways to improve this; we have one blade server rack with no return, so that hot air is indeed mixed into the cold.

    Our biggest problem right now is not having redundancy at all. We spent $100,000 on new units a few years ago, and now they are talking about a redundant system totalling 15 tons blowing from the ceiling. I can see that becoming the primary in a few short years, and then we are back to not having sufficient redundancy again. Crazy, huh?

    In the meantime I'm on standby like a volunteer fireman, waiting for a disaster until this company wakes up and provides sufficient redundant cooling. The key right now will probably be getting the server guys in here to shut down as many servers as possible (see the sketch below). I do believe 100 degrees for an extended period spells disaster. Thanks for your input.
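For the "shut down as many servers as possible" plan, a pre-agreed priority list beats improvising in an emergency. A hedged sketch, assuming passwordless SSH key access to each host; the hostnames and the shutdown order are placeholders to replace with your own, worked out with the server team in advance:

```python
# Emergency load-shedding sketch: power off non-critical servers in a
# pre-agreed order. ASSUMPTIONS: hostnames are placeholders, and an SSH
# key authorized to run shutdown exists on each host.
import subprocess

# ASSUMPTION: least-critical hosts first, agreed with the server team
SHED_ORDER = [
    "test-blade-01",
    "test-blade-02",
    "dev-web-01",
    # ... production hosts only as a last resort
]

def shutdown(host: str) -> bool:
    """Issue a clean poweroff over SSH; return True on apparent success."""
    result = subprocess.run(
        ["ssh", "-o", "ConnectTimeout=10", f"root@{host}",
         "shutdown -h now"],
        capture_output=True, text=True)
    return result.returncode == 0

for host in SHED_ORDER:
    ok = shutdown(host)
    print(f"{host}: {'powered off' if ok else 'FAILED -- check manually'}")
```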
