A datacenter with a mind of its own — or more accurately, a brain stem of its own that would regulate the datacenter equivalents of heart rate, body temperature, and so on. That's the wacky notion IBM proposed when it unveiled its autonomic computing initiative in 2001.
Of the initiative's four pillars, which also included self-configuration, self-optimization, and self-protection, it was self-healing — the idea that hardware or software could detect and fix its own problems — that created the most buzz. The idea was that IBM would sprinkle autonomic-computing fairy dust on a host of products, which would then work together to reduce maintenance costs and optimize datacenter utilization without human intervention.
Ask IBM today, and it will hotly deny that autonomic computing is dead. Instead it will point to this product enhancement (DB2, WebSphere, Tivoli) or that standard (Web Services Distributed Management, IT Service Management). But look closely, and you’ll note that products such as IBM's Log and Trace Analyzer have been grandfathered in. How autonomic is that?
The fact is that virtualization has stolen much of the initiative's value-prop thunder: namely, resource optimization and efficient virtual server management. True, that still involves humans. But would any enterprise really want a datacenter with reptilian rule over itself?
-- Eric Knorr
How do you see autonomic computing and similar self-responsive computing initiatives affecting the enterprise long term?