The news that AT&T has joined the rapidly growing ranks of cloud computing providers reinforces the argument that the latest IT outsourcing model is well on its way to becoming a classic disruptive technology.
By enabling datacenter operators to "publish" computing resources -- such as servers, storage, and network connectivity -- cloud computing provides a pay-by-consumption scalable service that's usually free of long-term contracts and is typically application- and OS-independent. The approach also eliminates the need to install any on-site hardware or software.
Currently dominated by Amazon.com and several small startups, cloud computing is increasingly attracting the interest of industry giants, including Google, IBM, and now AT&T. "Everyone and their dog will be in cloud computing next year," predicts Rebecca Wettemann, vice president of research at Nucleus Research, a technology research firm.
Yet James Staten, an infrastructure and operations analyst at Forrester Research, warns that prospective adopters need to tread carefully in a market that he describes as both immature and evolving. Staten notes that service offerings and service levels vary widely between cloud vendors. "Shop around," he advises. "We're already seeing big differences in cloud offerings."
To help cut through the confusion, here's a rundown of some major cloud providers -- both current and planned -- all offering resources that go beyond basic services such as SaaS (software as a service) applications and Web hosting:
3Tera: appliance-driven virtual servers
3Tera's AppLogic is a grid engine that has evolved over time into a full-fledged cloud computing environment. The company says its offering is designed to enable datacenters to replace expensive and hard-to-integrate IT infrastructure -- such as firewalls, load balancers, servers, and SANs -- with virtual appliances. Each appliance runs in its own virtual environment.
AppLogic combines servers into a scalable grid that's managed as a single system via a browser or secure shell. According to 3Tera, datacenters can add or remove servers on the fly, monitor hardware, manage user credentials, reboot servers, install software, build virtual appliances, back up the system, repair damaged storage volumes, inspect logs, and perform every other management task from a single point of control, all while the system is running.
Amazon.com: As-you-need-them basic IT resources
Amazon was an early cloud computing proponent, and the company now has one of the market's longest menus of services. Amazon's core cloud offering, the Elastic Compute Cloud (EC2), provides a virtualized infrastructure designed to deliver scalable compute, storage, and communication facilities.
Amazon's cloud computing arsenal also includes the Simple Storage Service (S3), a persistent storage system; SimpleDB, a remotely accessible database; and the Simple Queue Service (SQS), a message queue that also serves as an agent for tying together distributed applications built on the EC2, S3, and SimpleDB combo.
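The glue role SQS plays between tiers can be sketched locally. Here, Python's standard-library queue.Queue stands in for the remote queue (real SQS calls go over HTTP to a named queue), and the function names are hypothetical, chosen only to illustrate the pattern:

```python
import queue

# Stand-in for an SQS queue: in the real service, messages are sent to and
# received from a named queue over HTTP rather than a local object.
work_queue = queue.Queue()

def web_tier_enqueue(job_id, payload):
    """Producer: a front-end server (say, on EC2) posts a job for later processing."""
    work_queue.put({"job_id": job_id, "payload": payload})

def worker_tier_drain():
    """Consumer: a processing tier pulls and handles jobs until the queue is empty."""
    handled = []
    while not work_queue.empty():
        msg = work_queue.get()
        handled.append(msg["job_id"])  # e.g. read inputs from S3, write results back
        work_queue.task_done()
    return handled

web_tier_enqueue(1, "resize image A")
web_tier_enqueue(2, "resize image B")
print(worker_tier_drain())  # → [1, 2]
```

The point of the queue is decoupling: the producing and consuming tiers never talk to each other directly, so either side can scale up or down independently.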
AT&T: Scalable hosting in a managed network
AT&T Synaptic Hosting aims to give datacenters the ability to manage applications, compute resources on servers, and stored data elastically, so they can scale up or down as needed. The hosted platform provides dynamic security and storage capabilities as well as a Web portal to manage capacity, conduct maintenance, and monitor network service and performance.
AT&T has long offered hosting services, but not ones that could scale up or down as needed. AT&T's resources and services run within its own network, rather than across datacenters linked via the public Internet, which the company claims provides more certainty over service levels.
Google: Resources for small businesses and developers
Google already offers several cloud-based services, such as e-mail and storage, for consumers, as well as the App Engine development and provisioning platform for individual developers. The company's logical next step, given its vast infrastructure resources, would be a move into the enterprise cloud market.
"There's not that much difference between the enterprise cloud and the consumer cloud," Google CEO Eric Schmidt said last May during an appearance in Los Angeles with IBM chief Sam Palmisano, as the companies announced a joint cloud computing initiative. Over the next year or so, Google and IBM plan to roll out a worldwide network of servers for a cloud computing infrastructure. The IBM-Google cloud runs on Linux-based machines using Xen virtualization and Apache Hadoop, an open source implementation of the Google File System. Provisioning is automatic, courtesy of IBM's Tivoli Provisioning Manager.
IBM: A platform for your "internal" cloud
Aside from its Google venture, IBM is focusing its cloud strategy on "Blue Cloud," a series of cloud computing offerings that will enable computing across a distributed, globally accessible fabric of resources, rather than on local machines or remote server farms. Built on IBM's massive-scale computing initiatives, Blue Cloud aims to give datacenters the ability to establish their own cloud computing architecture to handle the enormous data-processing power required for video, social networking, and other Web 2.0 technologies.
Initially, the Blue Cloud technology must be deployed internally at each organization, essentially as the foundation for an "internal" cloud. The Blue Cloud platform, running on IBM BladeCenters with Power and x86 processors and Tivoli service management software, dynamically provisions and allocates resources as an application's workload fluctuates. Blue Cloud is billed as a more distributed computing architecture than is typically found in enterprise datacenters, and it is based on Hadoop. Over time, IBM expects to offer Blue Cloud resources on demand, in the provisioned style of Amazon.com and AT&T.
IBM also provides hosting services for SaaS providers, including SAP and SuccessFactors.
Sun Microsystems: An on-demand grid, and perhaps more
With its "the network is the computer" mantra, Sun provided much of the inspiration for the cloud computing movement. And its Sun Grid compute utility was one of the first on-demand cloud offerings, providing access to compute and storage resources optimized for parallel-processing applications.
The company also has a research venture dubbed "Project Caroline" meant to provide a configurable pool of virtualized compute, storage, and networking resources to small and medium-size SaaS providers, so they don't need to develop their own infrastructure. There have been recent reports that Sun is planning to turn Project Caroline into a full-blown business, but there's been no official word from the company yet.
Terremark Worldwide: Resource pool for on-demand servers
The Terremark Enterprise Cloud is designed to give datacenters an Internet-optimized computing infrastructure. Enterprise Cloud clients buy a dedicated resource pool of processing, memory, storage, and networking, from which they can deploy servers on demand. A Web portal allows servers to be dynamically provisioned from that pre-allocated pool of dedicated computing resources. Terremark promises that its cloud servers behave exactly like their physical counterparts, allowing applications to run without modification.
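The buy-a-pool, carve-out-servers model Terremark describes can be sketched with a toy allocator. The class, method names, and capacity figures below are hypothetical illustrations, not Terremark's actual interface:

```python
class ResourcePool:
    """Toy model of a dedicated resource pool from which virtual servers are carved."""

    def __init__(self, cpu_ghz, ram_gb):
        self.cpu_ghz = cpu_ghz   # unallocated CPU capacity
        self.ram_gb = ram_gb     # unallocated memory
        self.servers = {}        # name -> (cpu_ghz, ram_gb) of deployed servers

    def deploy(self, name, cpu_ghz, ram_gb):
        """Provision a server on demand, if the pool still has capacity."""
        if cpu_ghz > self.cpu_ghz or ram_gb > self.ram_gb:
            raise RuntimeError("pool exhausted: expand the resource pool first")
        self.cpu_ghz -= cpu_ghz
        self.ram_gb -= ram_gb
        self.servers[name] = (cpu_ghz, ram_gb)

    def destroy(self, name):
        """Tear a server down, returning its share to the pool."""
        cpu_ghz, ram_gb = self.servers.pop(name)
        self.cpu_ghz += cpu_ghz
        self.ram_gb += ram_gb

pool = ResourcePool(cpu_ghz=10.0, ram_gb=32)
pool.deploy("web01", cpu_ghz=2.0, ram_gb=4)
pool.deploy("db01", cpu_ghz=4.0, ram_gb=16)
print(pool.cpu_ghz, pool.ram_gb)  # → 4.0 12
```

The key design point is that the client pays for the pool, not per server: deploying and destroying servers only moves capacity inside an allotment that is already reserved.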
XCalibre Communications: Self-provisioned virtual servers
Described by some observers as Europe's answer to Amazon's EC2, Scotland-based XCalibre's FlexiScale provides self-provisioning of virtual dedicated servers via a control panel or API. Persistent storage is based on a fully virtualized high-end SAN/NAS back end.