Kenneth Brill, executive director of the Uptime Institute, has an article over on Forbes.com in which he shares five tips for cutting operating expenses related to cooling in the datacenter. They include the following:
1. Set temperature and relative-humidity control set points correctly on cooling units. Deeming this the "lowest hanging fruit," Brill states that cold intake air "is actually bad for reliability because it causes water to condensate inside the hardware." Rather than setting your temperature at a chilly 59 degrees Fahrenheit, he asserts that the "desired cooling unit leaving temperature would be 70 degrees F."
[ For additional tips on cutting costs in your datacenter, please read "Beat the datacenter heat, cheap." ]
2. Don't run more cooling units than you need. According to Uptime research, datacenters run, on average, three times more cooling units than necessary. On top of that, he writes, "computer rooms with the most excessive cooling had the highest percentage of hot spots." Matching your cooling capacity to your actual heat load is ideal: it increases cooling stability while saving energy.
3. Ensure that your cooling units are delivering at their rated capacity. There are various reasons why your cooling systems aren't operating at the level outlined by the manufacturer, Brill writes. It could be due to incorrect piping, plugged filters, or stuck throttling valves. He recommends that you bring in a third party, rather than your current contractor, to perform the investigation.
4. "Deliver cold air where it is most needed." Rather than randomly mixing hot and cold air in your datacenter, Brill prescribes (among other things) setting up hot and cold aisles. The difference in the temperatures between the two aisles should be at least 10 degrees Fahrenheit. "A small difference indicates significant air mixing, reduces cooling capacity and efficiency," he writes.
5. "Eliminate dehumidification and humidification." Brill writes that a restricted amount of outside air should be coming into a computer room. As such, he advises that "there should be no need for de-humidification or humidification." To address this, he suggests performing an inventory of all the cooling units in your datacenter to determine which have humidification or dehumidification lights on. Being in either mode can be a huge waste of energy, as well as "a significant water leakage risk."
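Several of these tips boil down to simple threshold checks against cooling-unit telemetry. As a minimal sketch, here is how you might audit readings against the rules of thumb above: leaving air near 70 degrees F (tip 1), at least a 10-degree hot/cold aisle difference (tip 4), and no unit stuck in humidification or dehumidification mode (tip 5). The function names, sensor fields, and example readings are all hypothetical; real monitoring systems expose this data differently.

```python
# Hypothetical thresholds taken from the tips above.
AISLE_DELTA_MIN_F = 10.0       # tip 4: hot/cold aisle delta should be >= 10 F
LEAVING_TEMP_TARGET_F = 70.0   # tip 1: desired cooling-unit leaving temperature

def audit_unit(name, leaving_temp_f, humidifying, dehumidifying):
    """Return a list of warnings for one cooling unit (hypothetical fields)."""
    warnings = []
    if leaving_temp_f < LEAVING_TEMP_TARGET_F:
        warnings.append(
            f"{name}: leaving air at {leaving_temp_f} F is colder than the "
            f"{LEAVING_TEMP_TARGET_F} F target (wasted energy, condensation risk)")
    if humidifying or dehumidifying:
        warnings.append(
            f"{name}: (de)humidification mode active, an energy waste and leak risk")
    return warnings

def audit_aisles(cold_aisle_f, hot_aisle_f):
    """Flag significant air mixing when the aisle temperature delta is small."""
    delta = hot_aisle_f - cold_aisle_f
    if delta < AISLE_DELTA_MIN_F:
        return [f"aisle delta {delta:.1f} F < {AISLE_DELTA_MIN_F} F: "
                "significant hot/cold air mixing likely"]
    return []

# Made-up example readings for illustration:
issues = audit_unit("CRAC-1", 59.0, humidifying=True, dehumidifying=False)
issues += audit_aisles(cold_aisle_f=68.0, hot_aisle_f=74.0)
for warning in issues:
    print(warning)
```

Run against the example readings, the sketch flags the 59-degree setpoint, the active humidification mode, and the 6-degree aisle delta, mirroring the waste and mixing problems Brill describes.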
Notably, this advice would seem not to apply to datacenter operators that use outside air, or "free cooling," to chill their datacenters.
I've just skimmed the surface of Brill's advice. I suggest you read it in its entirety.