“There’s really nobody on the IT side that’s seeing an electricity bill and saying, ‘Gee, I’m responsible for making that meter spin,’ ” says Richard Hodges, principal of GreenIT, a consultancy that advises clients on how to reduce IT power costs.
Bill Clifford, CEO of Aperture Technologies, a supplier of software that helps manage datacenters, agrees. His advice to IT managers: “Go find out who your facilities liaison is and become really good friends. A smart CIO today is going to want to have those types of people on their team to anticipate needs and not simply react to problems.”
The biggest problem with most cooling systems is that datacenters typically have far more cooling capacity than they need, says Neil Rasmussen, CTO of American Power Conversion, which provides products and services for powering datacenters. He says many datacenters use gear that’s rated for three times as many servers as they’re actually serving. “A lot of people think, ‘I’ll invest in power and cooling for the future,’ ” he says. “In the meantime, you have this big 8,000-horsepower engine and it’s burning fuel.”
Overbuilding made more sense in past decades, when installing an air-conditioning system could require tearing out entire building walls. Today’s gear is more modular, allowing designers to add units gradually, as they are needed.
Another strategy to reduce cooling costs is to abandon the traditional room-oriented approach, in which cold air passes through ducts under a datacenter floor to cool the areas surrounding a bank of servers. Because the air coming out of the vent mixes with much warmer air in the room, this method requires cooling systems to be set to 45 degrees Fahrenheit or lower, drawing a considerable amount of power.
Instead, Rasmussen encourages the adoption of rack-oriented cooling, which blows cold air directly into server racks, so there is less opportunity for it to mix with the ambient air. Under the rack-oriented approach, air is typically cooled to only 70 degrees, significantly reducing the load on the cooling system. Rack-oriented cooling has the additional benefit of delivering air that has more moisture in it, in many cases eliminating the expense of powering and housing humidifiers.
Other cooling remedies include simple changes to a datacenter’s floor layout so that the exhaust vents of servers in one row aren’t blowing into the intake vents of the next row — a problem Rasmussen estimates plagues as many as 30 percent of datacenters. He also suggests that companies in cold climates use cooling systems that take advantage of low outdoor temperatures. So-called economizer settings can save a bundle by drawing on cold air from outside, eliminating the need to run compressors.
There are other ways to tame the power monster in the datacenter. One possibility that may not be as far off as many think is the use of DC (direct current) to power datacenters.