Datacenters explore novel ways to cut energy use

Air-flow management and aisle containment systems among the ideas offered by datacenter operators trying to tackle rising energy costs at their facilities

Putting datacenters on decommissioned ships and reusing hot water from cooling systems to fill the town swimming pool were among the wackier ideas floated at the Data Center Energy Summit on Thursday.

Datacenter operators came together to compare notes about the best ways to tackle rising energy consumption at their facilities. Ideas ranged from the exotic to the down-to-earth, such as improving air-flow management and using outside air in colder climates to cool equipment.

After a brief lull a few years ago, a new wave of datacenter construction and expansion is under way, stretching the power and cooling capacities of existing facilities and putting pressure on utilities' electrical grids, speakers here in Santa Clara, Calif., said.

Subodh Bapat, vice president for energy efficiency at Sun Microsystems, described a perfect storm of factors that are forcing datacenters to become more power efficient.

The electricity consumed by microprocessors is increasing by 16 percent per year as they become more powerful, he said, which contributes to a 14 percent increase in the power consumed by each new generation of servers. At the same time, energy prices in the United States have increased by about 12 percent on average for the past three years and are expected to keep climbing.
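
To see how those trends compound, here is a quick back-of-the-envelope sketch. The growth rates are the ones Bapat cited; the baseline figures (1,000 servers at 500 W, $0.10 per kWh) and the one-server-generation-per-year assumption are illustrative, not numbers from the summit.

```python
# Compounding the growth rates Bapat cited. The baseline figures and the
# one-server-generation-per-year assumption are illustrative only.
SERVER_POWER_GROWTH = 0.14   # power increase per server generation
ENERGY_PRICE_GROWTH = 0.12   # average annual U.S. electricity price increase

watts_per_server = 500.0     # assumed baseline draw per server
price_per_kwh = 0.10         # assumed baseline electricity price
num_servers = 1_000

for year in range(1, 4):
    watts_per_server *= 1 + SERVER_POWER_GROWTH
    price_per_kwh *= 1 + ENERGY_PRICE_GROWTH
    annual_kwh = watts_per_server * num_servers * 24 * 365 / 1_000
    print(f"Year {year}: {annual_kwh:,.0f} kWh, ${annual_kwh * price_per_kwh:,.0f}")
```

Under those assumptions, the annual power bill roughly doubles in three years.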

That's forcing some datacenters to consider unusual solutions. One large health-care center is looking at reusing hot water expelled by its cooling systems to do its laundry, Bapat said. A hosting company in the Northeast is freezing water overnight, when electricity is cheaper, and then blowing air over the ice during the day to provide cold air for its cooling systems.

Another company hopes to put portable datacenters on ships docked at port, giving it "the biggest heat sink in the world [the ocean] to get rid of waste heat," Bapat said.

Most people at the summit were looking for more down-to-earth solutions. Several large datacenter operators offered them, discussing results from pilot tests designed to show the real-world savings from various energy-saving techniques.

One of the most effective is better air-flow management so that cold air pumped in to cool equipment doesn't mix with hot exhaust air coming out, said Bill Tschudi of the Lawrence Berkeley National Laboratory.

Many datacenters use alternating hot and cold aisles to keep warm and cold air separate, but that method is only partially effective because the air mixes in the spaces above the aisles, Tschudi said.

Oracle tested "hot-aisle containment" at a datacenter in Austin, Texas, which involves building an enclosure around server racks so that the hot exhaust can be siphoned off. Cooling systems account for as much as half the energy consumed by some datacenters, said Mukesh Khattar, who heads Oracle's energy efforts.

The hot-aisle containment allowed Oracle to reduce the fan speed in its cooling system by 75 percent, which cut the fan's power consumption by 40 percent, Khattar said. Oracle installed a variable frequency drive to control the fan and recouped the investment in nine months, he said.
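
Variable frequency drives pay back quickly because of the fan affinity laws: a fan's power draw varies roughly with the cube of its speed. A minimal sketch of the ideal curve follows; the 10 kW baseline is a made-up figure, and measured savings like Oracle's come in below the ideal because of drive losses and minimum airflow requirements.

```python
# Fan affinity law: power draw scales roughly with the cube of speed.
# The 10 kW baseline is an illustrative assumption.
BASELINE_POWER_KW = 10.0

for speed in (1.0, 0.9, 0.75, 0.5):
    power_kw = BASELINE_POWER_KW * speed ** 3
    print(f"Speed {speed:.0%}: {power_kw:.1f} kW ({1 - speed ** 3:.0%} less power)")
```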

Yahoo tested cold-aisle containment, then installed a wireless sensor network to monitor temperature and humidity around the room. The network allowed Yahoo to gradually raise the temperature in its datacenter without creating hot spots that could damage equipment. The setup can cut cooling energy costs by 25 percent, said Christina Page, head of Yahoo's energy and climate strategy.
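
The control logic behind that approach is simple: keep raising the cooling setpoint while every sensor in the room stays comfortably below the equipment's safe limit, and back off the moment a hot spot starts to form. The sketch below is hypothetical; the thresholds and step size are not Yahoo's actual values.

```python
# Simplified sensor-driven setpoint control. All thresholds are
# hypothetical, not Yahoo's actual operating values.
SAFE_LIMIT_F = 90.0   # assumed hot-spot limit for the equipment
MARGIN_F = 5.0        # back off before any sensor nears the limit
STEP_F = 1.0          # adjust the setpoint one degree at a time

def adjust_setpoint(setpoint_f, sensor_temps_f):
    """Return the next cooling setpoint given the latest sensor readings."""
    if max(sensor_temps_f) >= SAFE_LIMIT_F - MARGIN_F:
        return setpoint_f - STEP_F   # a hot spot is forming; cool down
    return setpoint_f + STEP_F       # headroom everywhere; save energy

# Example: readings from four sensors scattered across the aisles.
print(adjust_setpoint(68.0, [71.2, 73.5, 70.8, 74.1]))  # -> 69.0
```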

There are up-front capital costs to consider. Sensor networks cost $6 to $8 per square foot to implement, said Troy Mitchell, sales director with SynapSense, which sells the sensor equipment. Doing hot-aisle containment in a big, 43,000-square-foot datacenter like Yahoo's would cost $250,000 to $300,000, he said.
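
The arithmetic behind those quotes is straightforward; the sketch below combines Mitchell's figures with an assumed annual cooling bill, which is illustrative rather than a number anyone cited.

```python
# Capital-cost and payback arithmetic from the figures quoted at the
# summit. The $1M annual cooling bill is an assumption for illustration.
floor_sqft = 43_000
sensor_cost = (6 * floor_sqft, 8 * floor_sqft)   # $6-$8 per square foot
containment_cost = (250_000, 300_000)            # Mitchell's containment quote

annual_savings = 1_000_000 * 0.25   # Yahoo's reported 25% cooling savings

print(f"Sensors: ${sensor_cost[0]:,}-${sensor_cost[1]:,}")
low = (sensor_cost[0] + containment_cost[0]) / annual_savings
high = (sensor_cost[1] + containment_cost[1]) / annual_savings
print(f"Payback: {low:.1f}-{high:.1f} years")
```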

The containment systems must not interfere with sprinkler systems. Yahoo used flame-retardant PVC connected to the racks with "fusible links" that would collapse at 130 degrees Fahrenheit in the event of a fire, allowing the sprinklers to operate, Mitchell said.

Another case study looked at air-side economizing, which is "basically just a fancy name for opening the windows" and using outside air for cooling, Bapat said. The air must be filtered and dehumidified, but in cooler climates like San Francisco's, it can be used for most of the year, he said.
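
In control terms, an economizer is just a decision the building system makes continuously: if the outside air is cool and dry enough once filtered, use it; otherwise fall back to mechanical cooling. A simplified sketch with hypothetical thresholds:

```python
# Simplified air-side economizer decision. Thresholds are hypothetical.
MAX_OUTSIDE_TEMP_F = 65.0     # cool enough to use directly
MAX_RELATIVE_HUMIDITY = 0.60  # humidity ceiling after filtering

def use_outside_air(temp_f, rel_humidity):
    """Return True when outside air can replace mechanical cooling."""
    return temp_f <= MAX_OUTSIDE_TEMP_F and rel_humidity <= MAX_RELATIVE_HUMIDITY

# San Francisco on a typical day vs. a rare hot afternoon.
print(use_outside_air(58.0, 0.55))  # -> True
print(use_outside_air(82.0, 0.40))  # -> False
```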

Dean Nelson of Sun said datacenters should consider raising their overall temperatures. Sun tested modular cooling systems on five-year-old servers, and the servers operated without problems even when aisle temperatures reached 85 degrees Fahrenheit.

"It makes me wonder," he said, "why are we running our cold aisles at 65 degrees?"

Accenture is publishing the results from the case studies on its Web site, along with an overview that compares their effectiveness.
