Google used machine learning to parse the many data inputs from its data center operations, aiming to break through a plateau in energy efficiency as measured by its power usage effectiveness (PUE).
In a white paper describing the effort to push PUE below 1.12, Google data center engineer Jim Gao wrote that the machine learning approach does what humans cannot: model all possible operating configurations and predict the one that minimizes energy use in a given setting.
According to Google's program, the 19 interrelated factors that affect energy usage are as follows:
- Total server IT load (kW)
- Total campus core network room IT load (kW)
- Total number of process water pumps (PWPs) running
- Mean PWP variable frequency drive (VFD) speed (%)
- Total number of condenser water pumps (CWP) running
- Mean CWP VFD speed (%)
- Total number of cooling towers running
- Mean cooling tower leaving water temperature set point
- Total number of chillers running
- Total number of dry coolers running
- Total number of chilled water injection pumps running
- Mean chilled water injection pump set point temperature
- Mean heat exchanger approach temperature
- Outside air wet bulb temperature
- Outside air dry bulb temperature
- Outside air enthalpy (kJ/kg)
- Outside air relative humidity (%)
- Outdoor wind speed
- Outdoor wind direction
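To make the model's inputs concrete, the factors above can be collected into a single feature vector. The names below are paraphrased from the list; this encoding is a hypothetical illustration, not Google's actual schema.

```python
# Paraphrased names for the 19 inputs listed above. The naming and
# ordering are hypothetical, not Google's actual feature schema.
PUE_FEATURES = [
    "server_it_load_kw",
    "network_room_it_load_kw",
    "process_water_pumps_running",
    "mean_pwp_vfd_speed_pct",
    "condenser_water_pumps_running",
    "mean_cwp_vfd_speed_pct",
    "cooling_towers_running",
    "mean_tower_leaving_water_setpoint_temp",
    "chillers_running",
    "dry_coolers_running",
    "chilled_water_injection_pumps_running",
    "mean_injection_pump_setpoint_temp",
    "mean_heat_exchanger_approach_temp",
    "outside_air_wet_bulb_temp",
    "outside_air_dry_bulb_temp",
    "outside_air_enthalpy_kj_per_kg",
    "outside_air_relative_humidity_pct",
    "outdoor_wind_speed",
    "outdoor_wind_direction",
]

# A snapshot of plant state is then one 19-element vector, one value
# per feature, sampled from the facility's sensors.
assert len(PUE_FEATURES) == 19
```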
Gao states: “A typical large-scale [data center] generates millions of data points across thousands of sensors every day, yet this data is rarely used for applications other than monitoring purposes.” Machine learning can capture nonlinear interactions among these factors that traditional engineering formulas miss.
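The kind of model this implies can be sketched in a few lines: a small neural network that maps the 19 operating parameters to a predicted PUE. This is an illustrative stand-in trained on synthetic data, not Google's actual model; the paper itself describes the real approach.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 19   # the 19 interrelated factors listed above
n_samples = 512

# Synthetic operating snapshots, normalized to [0, 1]. Real inputs would
# be sensor readings (IT load, pump counts, VFD speeds, weather, ...).
X = rng.random((n_samples, N_FEATURES))

# Synthetic PUE target: a nonlinear mix of the features plus noise,
# centered near a realistic PUE of about 1.1.
true_w = rng.normal(size=N_FEATURES)
y = 1.1 + 0.05 * np.tanh(X @ true_w) + 0.005 * rng.normal(size=n_samples)

# One hidden layer with tanh activation, trained by plain gradient
# descent on mean squared error.
hidden = 16
W1 = rng.normal(scale=0.1, size=(N_FEATURES, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=hidden)
b2 = 1.1  # start the output bias near a typical PUE value

lr = 0.05
for step in range(2000):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    pred = H @ W2 + b2            # predicted PUE per snapshot
    err = pred - y
    # Backpropagate the mean-squared-error gradients.
    gW2 = H.T @ err / n_samples
    gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)
    gW1 = X.T @ dH / n_samples
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.6f}")
```

The design point this sketch illustrates is the one in the article: because the hidden layer is nonlinear, the model can represent interactions among pump counts, set points, and weather that a fixed engineering formula cannot.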
Read the paper here.