Hurricane Electric's new energy-efficient data center isn't much to look at. In fact, at first blush, it's tough to distinguish any appreciable differences between the newly opened facility, housed in an old Apple manufacturing plant in Fremont, Calif., and the company's first data center a mile and a half down the road -- even though the latter was built some 10 years ago.
Yet in designing the newest addition to its data center family, Hurricane Electric, which offers colocation services and hosts the world's largest IPv6-native Internet backbone, drew on lessons learned developing and running its first facility over the years. Those lessons are reflected in its approach to selecting power, cooling, and IT equipment; laying out server racks; setting the temperature of the facility; and staggering equipment installation.
In terms of similarities, both data centers boast row after row of server racks, housed in what look like large, identical, dark-blue lockers. Each has its own number and combination lock. Server racks in both facilities are installed in an energy-efficient hot-aisle/cold-aisle arrangement, which keeps hot exhaust air from mixing with cold intake air. Both facilities are comfortably warm, not meat-locker-cold like a traditional data center.
Free cooling today, liquid cooling tomorrow?
Yet there are certainly differences. Among them, the first data center has a traditional raised floor, designed for cold air to blow upward to cool machines. Phases one and two of the new facility, which offers a total of 208,000 square feet of floor space, have no raised floor; racks rest on the concrete floor, and cool air comes down from above, delivered via McQuay Maverick II Rooftop HVAC systems.
The new cooling system, which Hurricane has also installed in its original data center, takes a significantly more energy-efficient approach to cooling than constantly spinning fans that push artificially chilled air up from the floor. The rooftop systems use variable-frequency drives, so fans spin only as fast as needed. More important from an energy-savings perspective, the systems run in full-economizer mode, which means much of the data center's cooling is free, compliments of Fremont's mild climate.
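The economizer decision described above can be sketched in a few lines. This is a hypothetical illustration of airside-economizer control logic, not Hurricane Electric's or McQuay's actual control scheme; the setpoint and thresholds are invented for the example.

```python
# Hypothetical sketch of airside-economizer control: when outside air is
# cool enough, use it directly instead of running compressors. Thresholds
# are illustrative, not actual Maverick II setpoints.

def cooling_mode(outside_temp_c: float, supply_setpoint_c: float = 24.0) -> str:
    """Pick a cooling mode based on outdoor temperature."""
    if outside_temp_c <= supply_setpoint_c - 2.0:
        return "full economizer"     # free cooling: outside air alone
    elif outside_temp_c <= supply_setpoint_c:
        return "partial economizer"  # blend outside air with mechanical cooling
    return "mechanical"              # compressors carry the full load

print(cooling_mode(18.0))  # a mild Fremont day -> full economizer
print(cooling_mode(30.0))  # a hot afternoon -> mechanical
```

The more hours the outside air sits below the supply setpoint, the more cooling is free -- which is why a mild climate like Fremont's matters so much to the economics.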
The data center's warmer climate still surprises and even concerns some clients, said Martin Levy, Hurricane's director of IPv6 strategy. To reassure customers that their machines aren't going to overheat, Hurricane has hung small signs beside the doors to the data center, explaining that the temperature and humidity are in line with ASHRAE's guidelines and that the approach saves energy. (The signs play up the green angle but don't allude to the cost savings, which presumably benefit customers' pocketbooks.)
In discussing cooling, Levy mourned the passing of SprayCool, a company that developed an innovative liquid cooling technology. SprayCool devised a non-corrosive, non-conductive liquid that, when sprayed on, say, a hot CPU, would evaporate and instantly cool the processor, thus theoretically eliminating the need for power-hungry fans.
Levy suggested the company made some business mistakes -- such as basing itself in Spokane, Wash., instead of Silicon Valley -- and that its efforts also were thwarted by Intel rolling out more energy-efficient chips and by data center operators' reluctance to permit liquid near their machines. But that sort of liquid cooling is "the wave of the future," Levy predicts. "[It] will come back with a vengeance."
Stuck with AC power
For power delivery, after much deliberation and research, Hurricane Electric went with Eaton's 9395 UPS system, which operates at 99 percent efficiency in Energy Saver Mode. That is, the UPS has built-in intelligence to adjust to power demands and deliver the load as efficiently as possible. Only when power is lost, or when it falls outside pre-specified voltage or frequency limits, do its rectifier and inverter kick in.
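That switching logic can be sketched as a simple state decision. This is a hedged illustration of how an eco-mode UPS behaves in general; the nominal values and tolerances below are invented for the example and are not Eaton's actual 9395 specifications.

```python
# Hedged sketch of eco-mode UPS behavior: stay on the efficient bypass path
# while utility power is within tolerance, and switch to the rectifier/
# inverter (double conversion) only when it drifts out. Limits are
# illustrative, not Eaton's published figures.

def ups_path(voltage: float, frequency: float,
             v_nominal: float = 480.0, f_nominal: float = 60.0,
             v_tol: float = 0.10, f_tol: float = 0.05) -> str:
    if voltage == 0.0:
        return "battery"            # utility lost: inverter carries the load
    v_ok = abs(voltage - v_nominal) <= v_tol * v_nominal
    f_ok = abs(frequency - f_nominal) <= f_tol * f_nominal
    if v_ok and f_ok:
        return "bypass"             # ~99 percent efficient energy-saver path
    return "double conversion"      # rectifier + inverter condition the power

print(ups_path(478.0, 60.01))  # clean utility power -> bypass
print(ups_path(400.0, 60.0))   # sagging voltage -> double conversion
```

The efficiency win comes from spending nearly all hours on the bypass path, where the UPS isn't converting power at all, only watching it.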
I asked Levy why Hurricane went with a traditional AC-based power delivery system instead of a DC-based system, which arguably is more energy efficient because it requires fewer conversions as power travels from the wall to the rack. His reply: "It's hard to use [DC] when you have customers bringing in their own equipment. If I were a Facebook or a Google, going the homogeneous route, I'd use it in a heartbeat. It's a great win."
[ Syracuse University's newly opened Green Data Center runs on DC: http://www.infoworld.com/d/green-it/syracuse-university-enlists-dc-power-liquid-cooling-green-data-center-749?source=fssr ]
Up and Atom
During the tour, Levy was quite keen to show me the innards of one of the servers the company uses for customers, who tend to run lightweight Web apps. The servers in question are from Super Micro, running Atom processors from Intel. Yes, that Atom, the processor made famous for powering lightweight computers like netbooks. Levy said the servers are extraordinarily efficient, requiring a mere 37 watts, in part because they have no fans or other superfluous components. (Google claims to use bare-bones servers as well to save on energy costs.) Compare that to the 280-plus watts necessary for a more traditional 1U server.
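The gap between those two wattage figures compounds quickly over a year of continuous operation. A back-of-the-envelope calculation, assuming 24/7 operation and an illustrative $0.12/kWh electricity rate (the rate is my assumption, not a figure from Hurricane):

```python
# Back-of-the-envelope check of the numbers above: energy saved per server
# by running a 37 W Atom box instead of a 280 W 1U server, assuming 24/7
# operation and an assumed $0.12/kWh electricity rate.

HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12  # assumed for illustration, not from the article

atom_kwh = 37 / 1000 * HOURS_PER_YEAR     # ~324 kWh/year
legacy_kwh = 280 / 1000 * HOURS_PER_YEAR  # ~2,453 kWh/year
saved_kwh = legacy_kwh - atom_kwh

print(f"saved per server: {saved_kwh:.0f} kWh/year "
      f"(~${saved_kwh * RATE_PER_KWH:.0f}/year)")
# -> saved per server: 2129 kWh/year (~$255/year)
```

Multiply that per-server figure across thousands of machines -- and remember the cooling load that each avoided watt also avoids -- and the appeal of the bare-bones approach is clear.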
Slow, steady, and simple
Even though companies are reportedly clamoring for data center space, Hurricane Electric isn't working 24/7 to pack every square foot of its new facility with powered-on infrastructure -- a traditional tactic employed by data center operators in the past. Rather, the company is steadily adding and turning on power and cooling in line with demand, a measured approach that saves money.
For example, there are currently just two backup power generators, even though the facility will ultimately need 14 when fully occupied. "If this was 1999, we would buy all 14 generators today," Levy said.
Hurricane Electric is in the process of developing tools in-house to monitor power consumption and efficiency in its data center, rather than going with a toolset from an existing vendor. Levy describes having that sort of power-consumption information available for customers as a competitive differentiator (though Hurricane certainly isn't alone here). The idea is to give customers granular, detailed information about how efficiently they're using the power and infrastructure they're paying for -- as well as guidance as to how to make better use of it all, such as replacing older servers with new, more energy-efficient variants.
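The kind of per-customer report the article describes might look something like the sketch below. Hurricane's in-house tooling isn't public, so the rack names, meter readings, utilization threshold, and report format here are all hypothetical.

```python
# Illustrative sketch of a per-rack power-efficiency report of the kind
# described above. All names, numbers, and the 40 percent threshold are
# hypothetical; this is not Hurricane Electric's actual tooling.

def rack_report(name: str, metered_kw: float, provisioned_kw: float) -> str:
    """Summarize how much of a rack's provisioned power is actually drawn."""
    utilization = metered_kw / provisioned_kw
    advice = ("consider consolidating onto newer, more efficient servers"
              if utilization < 0.40 else "power utilization looks healthy")
    return (f"{name}: drawing {metered_kw:.1f} kW of {provisioned_kw:.1f} kW "
            f"provisioned ({utilization:.0%}) -- {advice}")

print(rack_report("cab-1138", 1.4, 5.0))  # lightly used rack
print(rack_report("cab-2217", 4.2, 5.0))  # well-utilized rack
```

The point of such a report is the guidance line at the end: turning raw meter readings into a concrete suggestion, such as retiring an older, underutilized server.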
Notably, the company doesn't employ temperature-mapping technology to pinpoint hot spots in real time. Hot spots aren't a problem, according to Levy, because of the low-density design of the facility.
Speaking of power, Hurricane has planned for the data center to run out of space at the same time it runs out of power capacity, thus avoiding the unenviable position in which other data center operators have been stuck. Solutions to that problem have included virtualization, which doesn't work well for a colocation provider whose customers decide how their gear is used. Another approach, which adds power capacity and can reduce energy costs, is to invest in solar panels, microturbines, or other systems for generating power on-site. Levy dismisses this as not worthwhile, in part because of the added complexity it would introduce.
In fact, Hurricane doesn't even use cages to separate groups of racks from one another, a feature some colocation providers offer in the name of security, so that a miscreant or a competitor can't wander up to your machines to tamper with or spy on them. Notably, in the first phase of the new data center, Hurricane set up racks in 11 similarly designed rooms, each with its own door and each housing 80 cabinets arranged in six rows. The approach didn't yield enough benefits to warrant separating the equipment in that manner, so in the current phase, Hurricane is simply lining up row after identical row in a vast warehouse-like room.
Maintaining simplicity, in fact, is one of Hurricane's unwritten tenets, which explains why there's absolutely nothing flashy -- or even particularly eye-catching -- about the facility. Rather, the company has figured out a basic recipe that works and is running (or rather, marching steadily) with it.
This story, "Simplicity yields efficiency at new Hurricane Electric data center," was originally published at InfoWorld.com. Follow the latest developments in green IT at InfoWorld.com.