Modular, containerized datacenters sold by vendors such as IBM, Sun, and Rackable Systems pack storage and hundreds, sometimes thousands, of servers into a single large shipping container with its own cooling system. Microsoft, using Rackable containers, is building a datacenter outside Chicago that will house more than 150 containers, each holding 1,000 to 2,000 servers. Google, not to be outdone, secured a patent last year for a modular datacenter that includes "an intermodal shipping container and computing systems mounted within the container."
To hear some people tell it, containerized datacenters are far easier to set up than a traditional datacenter, easier to manage, and more power-efficient. Securing permits should be simpler too, depending on local building regulations. Who wouldn't want one?
If a business has a choice between buying a shipping container full of servers, and building a datacenter from the ground up, it's a no-brainer, says Geoffrey Noer, a vice president at Rackable, which sells the ICE Cube Modular Data Center.
"We don't believe there's a good reason to go the traditional route the vast majority of the time," he says.
But that is not the consensus view by any stretch of the imagination. Claims about efficiency are overstated, according to some observers.
IBM touts a "modular" approach to datacenter construction, taking advantage of standardized designs and predefined components, but that doesn't have to be in a container. "We're a huge supporter of modular. We're a limited supporter of container-based datacenters," says Steve Sams, vice president of IBM Global Technology Services.
Containers are efficient because they pack lots of servers into a small space, and use standardized designs with modular components, he says. But you can deploy storage and servers with the same level of density inside a building, he notes.
Container vendors often tout 40 to 80 percent savings on cooling costs. But according to Sams, "in almost all cases they're comparing a highly dense [container] to a low-density [traditional data center]."
Containers also eliminate an economy of scale in cooling that traditional datacenters enjoy, according to Sams. Just as it's more efficient to cool an apartment complex with 100 living units than it is to cool 100 separate houses, it's more cost-effective to cool one huge datacenter than many small ones, he says. Because a container's air conditioning system is sealed inside along with the servers and storage, that economy of scale is impossible to achieve, he notes.
Gartner analyst Rakesh Kumar says it will take a bit of creative marketing for vendors to convince customers that containers are inherently more efficient than regular datacenters. Gartner is still analyzing the data, but as of now Kumar says, "I don't think energy consumption will necessarily be an advantage."
That doesn't mean there aren't any advantages, however. A container can be up and running within two or three months, eliminating lengthy building and permitting times. But if you need an instant boost in capacity, why not just go to a hosting provider, Kumar asks.
"We don't think it's going to become a mainstream solution," he says. "We're struggling to find real benefits."
Kumar sees the containers being more suited to Internet-based, "hyper-scale" companies such as Google, Amazon, and Microsoft. Containerized datacenters offer scalability in big chunks, if you're willing to buy more containers. But they don't offer scalability inside each container once it has been filled, he says.
Container vendors tout various benefits, of course. Each container is almost fully self-contained, Rackable's Noer says: chilled water, power, and networking are the only connections it needs from the outside world. Rackable containers, which can be fitted with as many as 22,400 processing cores in 2,800 servers, are watertight and come with locks, alarms, and LoJack-like tracking units. Sun's Modular Data Center can survive an earthquake -- the company made sure of that by testing it on one of the world's largest shake tables at the University of California, San Diego.
A fully equipped Rackable ICE Cube costs several million dollars, mostly for the servers themselves, Noer says. The container pays for itself with lower electricity costs due to an innovative Rackable design that maximizes server density, Noer says.
But it's still too early to tell whether containerized datacenters are the way of the future. "We're just at the cusp of broad adoption," Noer says.
Potential use cases for containers include disaster recovery, remote locations like military bases, or big IT hosting companies that would prefer not to build brick-and-mortar datacenters, Kumar says.
A TV crew that follows sporting events may want a mobile datacenter, says Robert Bunger, director of business development for American Power Conversion. APC doesn't sell its portable datacenter, but in 2004, it built one into a tractor-trailer as a proof-of-concept. It was resilient. "We pulled that trailer all over the country" for demos, Bunger notes.
But APC isn't seeing much demand, except in limited cases. For example, a business that needs an immediate capacity upgrade but is also planning to move its datacenter in a year might want a container because it would be easier to move than individual servers and storage boxes.
UC San Diego bought two of Sun's Modular Data Centers. One goal is to contain the cost of storing and processing rapidly increasing amounts of data, says Tom DeFanti, principal investigator of the school's GreenLight energy efficiency research project. But it will take time to see whether the container approach is more efficient. "The whole idea is to create an experiment to see if we can get more work per watt," DeFanti says.
The Modular Data Center is not as convenient to maintain as a regular computer room, because there is so little space to maneuver inside, he says. But "it seems to me to be an extremely well-designed and thought-out system," DeFanti says. "It gives us a way of dealing with the exploding amount of scientific computing that we need to do."
Beware vendor lock-in
Before purchasing a containerized datacenter, enterprises should consider several issues related to manageability and flexibility. Vendors often want you to fill the containers with only their servers, Kumar notes. Besides limiting flexibility at the time of purchase, this raises the question of what happens when those servers reach end-of-life. Will you need the vendor to rip out the servers and put new ones in, once again limiting your choice of technology?
"At the moment, most vendors will fill their containers only with their servers," Kumar says.
IBM, however, says it uses industry-standard racks in its portable datacenter, allowing customers to buy whatever technology they like. DeFanti says Sun's Modular Data Center gives him the flexibility to buy a heterogeneous mix of servers and storage. Rackable, though, steers customers toward either its own servers or IBM BladeCenter machines through a partnership with IBM.
"I think vendors are learning that people want more flexibility," DeFanti says.
Another consideration is failover capability, says Lee Kirby, who provides site assessments, datacenter designs, and other services as the general manager of Lee Technologies. If one container goes down, its work must be transferred to another. Server virtualization will help provide this failover capability, and will also make it easier to manage distributed containerized datacenters -- an important consideration for customers who want to distribute computing power and have it reside as close to users as possible, Kirby says.
"I think it is key that the combination of virtualization and distributed infrastructure produce a container that can be out of service without impacting the application as a whole," Kirby says.
This story, "Google, Microsoft spark interest in modular datacenters" was originally published by NetworkWorld.