Turning up the heat
A more controversial green computing initiative involves reducing the energy demands of cooling systems by raising the maximum air intake temperature on equipment racks to as high as 80.6 degrees Fahrenheit, the upper limit that Technical Committee 9.9 of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommended in 2011. But today, just 7 percent of all data centers run at 75 degrees or higher, according to a recent Uptime Institute survey. The idea of raising the temperature may seem anathema to many data center managers, but some organizations are slowly inching up their thermostats.
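The temperatures in this article are in Fahrenheit; for readers who work in Celsius, the ASHRAE figure converts cleanly using the standard formula, as this quick sketch shows:

```python
def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

# ASHRAE's 2011 recommended upper intake limit of 80.6 F is exactly 27 C.
print(round(f_to_c(80.6), 1))  # 27.0
# The 75 F threshold cited in the Uptime Institute survey:
print(round(f_to_c(75.0), 1))  # 23.9
```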
At Earth Rangers, a nonprofit focused on environmental education, IT systems director Rob DiStefano began raising temperatures in a small data center in 2010, moving from the high 60s to 77 degrees, but stopped there when network storage temperature alarms went off. "Storage units are the biggest heat monster in the room," he says. He could have reconfigured the alarms from the factory defaults, but the idea made him uncomfortable. "We didn't want to risk it," DiStefano says. And with intake air temperatures at 77 degrees, air temperatures at the back of the racks were getting uncomfortably warm, says Andy Schonberger, director of the Earth Rangers center.
For his part, Humphries raised the temperature in the FedEx Colorado Springs data center by 5 degrees. He declined to say where the temperature is now set, but he says if he set the temperature at 76 degrees on the intake side of the racks, the temperature in the hot aisle would top 100 degrees. "Sending in someone to replace servers in 100-degree heat is not what we want," he says. Humphries says the law of diminishing returns kicks in as you approach the upper range of the ASHRAE limit: Fans run longer and equipment works harder, and adding heat containment would have gone against FedEx's commitment to simplicity in the data center. But raising the temperature 5 degrees in Colorado Springs yielded cost savings that are significant enough for the organization to begin phasing in a similar change at another major data center in Tennessee.
Savings also added up at Raytheon, which raised temperatures in the network distribution rooms in its Tucson, Ariz., facility from 65 to 75 degrees without running into problems. That step alone saved 112,000 kilowatt-hours per month -- enough energy to power 100 homes, according to Moore. Raytheon has expanded the initiative to other facilities, but savings vary depending on location, total power use and other variables.
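The Raytheon figures can be sanity-checked with simple arithmetic. The sketch below uses only the numbers quoted in the article; the implied per-home consumption is derived, not stated in the text:

```python
# Figures reported for Raytheon's Tucson, Ariz., facility.
monthly_savings_kwh = 112_000  # kWh saved per month from the 65-to-75-degree change
homes_powered = 100            # homes the article says that energy could power

# Implied consumption per home, per month.
kwh_per_home_per_month = monthly_savings_kwh / homes_powered
print(kwh_per_home_per_month)  # 1120.0
```

An implied 1,120 kWh per home per month is in the same ballpark as commonly cited U.S. residential averages, so the two figures are consistent with each other.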
Roger Schmidt, an IBM fellow and chief engineer on data center efficiency, recommends that Web 2.0 and lower-tier data centers push the needle closer to the 80.6-degree mark, but he says that even Tier 1 data centers in risk-averse industries such as banking can safely ease the mercury up to 75 degrees.
Another underappreciated strategy is to set up instrumentation in the data center that lets administrators monitor and manage both temperature and power use. Most IT organizations still don't do this, according to Gartner. Schonberger advises building a business case for this by tracking the half-dozen pieces of equipment that are your company's biggest energy consumers. "It doesn't save you any money, but it allows you to prioritize," he says.
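Schonberger's suggestion, ranking equipment by power draw to decide where efficiency work will pay off, amounts to a simple sort over metered readings. A minimal illustration follows; the equipment names and wattages are hypothetical, and in a real deployment the readings would come from smart PDUs or DCIM instrumentation rather than a hard-coded table:

```python
# Hypothetical power readings (watts) keyed by equipment name.
# Real values would be pulled from metering instrumentation.
readings = {
    "storage-array-01": 5400,
    "storage-array-02": 5100,
    "blade-chassis-02": 4800,
    "crac-unit-03": 7500,
    "ups-room-a": 2200,
    "tape-library-01": 1100,
    "core-switch-01": 900,
    "rack-server-14": 650,
}

# Rank by draw and keep the half-dozen biggest consumers --
# the short list Schonberger suggests tracking to set priorities.
top_six = sorted(readings.items(), key=lambda kv: kv[1], reverse=True)[:6]

for name, watts in top_six:
    print(f"{name}: {watts} W")
```

As Schonberger notes, the list itself saves nothing; its value is telling you which equipment to tackle first.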