I've been surprised at the way fairly traditional companies have embraced the cloud -- but haven't always reaped its benefits. For most, the payoff has been relatively small and confined to the infrastructure layer.
The thing is, most of the benefits of IaaS (infrastructure as a service) have already been realized through virtualization, which during the last decade cut costs by countering the infrastructure equivalent of Conway's law: Every department wanted its own server or, worse, its own server farm. Virtualization provided those departments with the illusion of dedicated infrastructure, while at the same time enabling management to pool resources and centralize IT operations.
This resulted in some great cost savings. Granted, the software was expensive -- I frequently heard people report that a VM cost 80 percent of its "metal" equivalent -- but those figures usually didn't count the ancillary costs, such as power, cooling, and floor space, where virtualization saved even more. It was a pretty good deal.
Nonetheless, we often saw companies that had adopted virtualization deploy small virtual server farms by hand, with no automation. Basically, it was the modern version of the IT guy loading Windows onto a bunch of machines from a stack of floppies.
Why IaaS is no panacea
Although the benefits vary based on the services offered, IaaS tends to be a small step up from in-house virtualization. Management in particular has a tendency to get all excited about the idea of "outsourcing to the cloud." Soon it becomes clear that maintaining cloud infrastructure takes nearly as much work as maintaining it in-house -- plus, in many cases, specialized knowledge of a particular cloud service's peculiarities.