Technologies that push the envelope of the plausible capture our curiosity almost as quickly as the would-be crackpots who dare to concoct them become targets of our derision.
Tinkering along the fringe of possibility, hoping to solve the impossible or apply another's discovery to a real-world problem, these free thinkers navigate a razor-thin edge between crackpot and visionary. They transform our suspicion into admiration when their ideas are authenticated with technical advances that reshape how we view and interact with the world.
IT is no stranger to this spirit of experimentation. An industry in constant flux, IT is pushed forward by innovative ideas that yield advantage when applied to real-world scenarios. Sure, not every revolutionary proposal sets the IT world afire. But for every dozen paper-based storage clunkers, there's an ARPAnet to rewrite IT history -- itself a timeline of what-were-they-thinkings and who-would-have-thoughts.
It's in that tenor that we take a level-headed look at 12 technologies that have a history of raising eyebrows and suspicions. We assess the potential each has for transforming the future of the enterprise.
1. Superconducting computing
How about petaflops performance to keep that enterprise really humming? Superconducting circuits -- which have zero electrical resistance and therefore dissipate virtually no heat -- would certainly free you from any thermal limits on clock frequencies. But who has the funds to cool these circuits with liquid helium, as required? That is, of course, assuming someone comes up with the extremely complex schemes necessary to interface this circuitry with the room-temperature components of an operable computer.
Of all the technologies proposed in the past 50 years, superconducting computing stands out as psychoceramic. IBM's program, started in the late 1960s, was canceled by the early 1980s, and the attempt by Japan's Ministry of International Trade and Industry to develop a superconducting mainframe was dropped in the mid-1990s. Both resulted in clock frequencies of only a few gigahertz.
Yet the dream persists in the form of the HTMT (Hybrid Technology Multi-Threaded) program, which takes advantage of superconducting rapid single-flux quantum logic and should eventually scale to about 100GHz. Its proposed NUMA (non-uniform memory access) architecture uses superconducting processors and data buffers, cryo-SRAM (static RAM) semiconductor buffers, semiconductor DRAM main memory, and optical holographic storage in its quest for petaflops performance. Its chief obstacle? A clock cycle that will be shorter than the time it takes to transmit a signal through an entire chip.
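To put that obstacle in perspective, here's a rough back-of-envelope sketch -- ours, not the HTMT team's -- assuming an illustrative 1cm die and signals moving at roughly half the speed of light:

```python
# Back-of-envelope sketch of the HTMT timing obstacle: at ~100GHz, a clock
# cycle is shorter than the time a signal needs to cross the chip.
# The die size and signal speed below are illustrative assumptions.

CLOCK_HZ = 100e9          # ~100GHz RSFQ clock cited above
CHIP_SPAN_M = 0.01        # assume a die roughly 1cm across
SIGNAL_SPEED_M_S = 1.5e8  # assume signals travel at about half light speed

cycle_time = 1 / CLOCK_HZ                      # ~10 picoseconds
transit_time = CHIP_SPAN_M / SIGNAL_SPEED_M_S  # ~67 picoseconds

print(f"Clock cycle:      {cycle_time * 1e12:.0f} ps")
print(f"Cross-chip delay: {transit_time * 1e12:.0f} ps")
print(f"Cycles in flight: {transit_time / cycle_time:.1f}")
```

By this estimate, a signal is in flight for a half-dozen or more clock ticks before it reaches the far side of the chip -- latency the architecture has to hide rather than eliminate.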
So, unless you're the National Security Agency, which has asked for $400 million to build an HTMT-based prototype, don't hold your breath waiting for superconducting's benefits. In fact, the expected long-term impact of superconducting on the enterprise remains in range of absolute zero.
-- Martin Heller
2. Solid-state drives
Solid-state storage devices -- both RAM-based and NAND (Not And) flash-based -- have long held promise as worthwhile alternatives to conventional disk drives, despite the healthy dose of skepticism they inspire. The devices are by no means new, but their integration into IT will happen only when the technologies fulfill their potential and go mainstream.
Volatility and cost have been the Achilles' heel of external RAM-based devices for the past decade. Most come equipped with standard DIMMs, batteries, and possibly hard drives, all connected to a SCSI bus. And the more advanced models can run without power long enough to move data residing on the RAM to the internal disks, ensuring nothing is lost. Extremely expensive, the devices promise speed advantages that, until recently, were losing ground to faster SCSI and SAS drives. Recent advances, however, suggest RAM-based storage devices may pay off eventually.
As for flash-based solid-state devices, early problems -- such as slow write speeds and a finite number of writes per sector -- persist. Advances in flash technology, though, have reduced these negatives. NAND-based devices are now being introduced in sizes that make them feasible for use in high-end laptops and, presumably, servers. Samsung's latest offerings include 32GB and 64GB SSD (solid-state disk) drives with IDE and SATA interfaces. At $1,800 for the 32GB version, they're certainly not cheap, but as volume increases, pricing will come down. These drives aren't nearly the speed demons their RAM-based counterparts are, but their read latency is significantly lower than that of standard hard drives.
The state of the solid-state art may not be ready for widespread enterprise adoption yet, but it's certainly closer than skeptics think.
-- Paul Venezia
3. Autonomic computing
A datacenter with a mind of its own -- or more accurately, a brain stem of its own that would regulate the datacenter equivalents of heart rate, body temperature, and so on. That's the wacky notion IBM proposed when it unveiled its autonomic computing initiative in 2001.
Of the initiative's four pillars, which included self-configuration, self-optimization, and self-protection, it was self-healing -- the idea that hardware or software could detect problems and fix itself -- that created the most buzz. The idea was that IBM would sprinkle autonomic-computing fairy dust on a host of products, which would then work together to reduce maintenance costs and optimize datacenter utilization without human intervention.
Ask IBM today, and it will hotly deny that autonomic computing is dead. Instead it will point to this product enhancement (DB2, WebSphere, Tivoli) or that standard (Web Services Distributed Management, IT Service Management). But look closely, and you'll note that products such as IBM's Log and Trace Analyzer have been grandfathered in. How autonomic is that?
The fact is that virtualization has stolen much of the initiative's value-prop thunder: namely, resource optimization and efficient virtual server management. True, that still involves humans. But would any enterprise really want a datacenter with reptilian rule over itself?
-- Eric Knorr
4. DC power
The warm, humming bricks that convert AC from the wall into the DC used by electronics are finally drawing some much-deserved attention -- from datacenter engineers hoping to save money by wasting less energy. The waste must often be paid for twice: first to power the equipment, then to run the air conditioner that removes the heat produced. One solution is a central power supply that distributes DC directly to rack-mounted computers. But will cutting out the converters catch on, or is the buzz surrounding DC to the datacenter destined to fizzle?
Researchers at the Department of Energy's Lawrence Berkeley National Laboratory have built a prototype rack filled with computers that run directly off 380-volt DC. Bill Tschudi, principal investigator at the lab, says that the system uses 15 percent less power than do servers equipped with today's most efficient power supplies -- and that there can be even greater savings when replacing the older models still in use in most enterprises. If the server room requires cooling, as it does everywhere except in northern regions in the winter, the savings can double, because the air-conditioning bill also can be cut by 15 percent.
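Tschudi's doubling claim is easier to see with a little arithmetic. The sketch below is ours, not the lab's, and the assumption that cooling draws roughly as much power as the IT load is illustrative:

```python
# Rough arithmetic behind "the savings can double" -- not LBNL's model.
# The cooling-to-IT ratio below is an illustrative assumption.

it_load_kw = 100.0     # hypothetical server load for one room
dc_savings = 0.15      # ~15 percent savings cited by Tschudi
cooling_ratio = 1.0    # assume cooling draws about as much as the IT load

server_saved = it_load_kw * dc_savings                   # 15 kW at the racks
cooling_saved = it_load_kw * cooling_ratio * dc_savings  # ~15% less heat to remove
total_saved = server_saved + cooling_saved

print(f"Server savings:  {server_saved:.0f} kW")
print(f"Cooling savings: {cooling_saved:.0f} kW")
print(f"Total savings:   {total_saved:.0f} kW -- double the server-only figure")
```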
Others are working on bringing additional DC savings to the enterprise. Nextek Power, for instance, is building a system that integrates the traditional power grid, rooftop solar panels, and computer hardware using DC power. Choosing this standard avoids the inefficiencies of converting the DC produced by the panels to AC, then back to DC when it reaches the computers.
"It's a big opportunity, because we've shown that there's big energy savings," Tschudi says of the prospects of DC. "And it's also got more reliability because there are fewer points of failure."
Cost savings? Reliability? The prospects for DC to the datacenter are looking up.
-- Peter Wayner
5. Holographic and phase-change storage
What enterprise wouldn't benefit from a terabyte USB dongle on every key chain and every episode of Magnum, P.I. on a single disc? Thanks to phase-change memory and holographic storage, today's pipe dream is shaping up to be tomorrow's reality.
Currently under development by IBM, Macronix, and Qimonda, phase-change storage is being touted as 500 times faster and an order of magnitude smaller than traditional "floating gate" flash technology. Whereas flash memory works by trapping electrons, phase-change memory achieves its speed by heating a chalcogenide alloy, switching its phase between crystalline and amorphous states.
This technology could prove critical in embedded computing apps, as memory cell degradation has forced many appliance developers to add expensive NVRAM (nonvolatile RAM) to store configuration information, rather than risk premature flash failure. Once realized, it could dramatically drive down the cost of appliances and push new capabilities into enterprise handsets.
Holographic storage, on the other hand, could quickly change the way we think about CDs and DVDs. So quickly, in fact, that enterprise archiving may bypass slow-to-ship dual-layer optical drives altogether and head straight to holographic optical.
InPhase Technologies is already shipping engineering prototypes of a holographic disc storage system with 60 times the storage capacity of today's DVDs. The advent of 3-D optical storage could herald the era of sending a copy of your entire corporate database off-site affordably. Think of what holographic storage could do for personal records portability: a durable card in your wallet that holds your entire medical file.
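For a sense of scale, a quick calculation -- ours, using the 4.7GB single-layer DVD as the baseline -- puts that 60x figure at roughly 280GB per disc:

```python
# Quick arithmetic on the "60 times today's DVDs" claim, assuming the
# 4.7GB single-layer DVD as the baseline (our assumption, not InPhase's spec).

dvd_gb = 4.7
holographic_gb = 60 * dvd_gb
print(f"~{holographic_gb:.0f}GB per holographic disc")  # roughly 282GB
```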
Regardless of which technology ships first, the enterprise is likely to benefit from both soon. Maybe those data crystals from Babylon 5 aren't so far off after all.
-- Brian Chee
6. Artificial intelligence
Few terms carry as much emotional and technical baggage as AI (artificial intelligence). And while science-fiction authors probe AI's metaphysical boundaries, researchers are producing practical results. We may not have a robot for every task, but we do have cell phones that respond to our voice, data-mining tools that optimize vast industries, and thousands of other measurable ways AI-influenced computing enhances how the enterprise gets work done.
That said, AI itself remains elusive, and the measure of AI's position on the enterprise crackpot scale depends wholly on where you set the goals. Restricted to applying templates and well-defined theorems to sets of data with precise definitions, computers are, after all, becoming very adept at using statistics to make educated guesses about the world. And though speech-recognition software, for example, may not hear the actual message, whatever that means, it does know that a certain pattern of sounds and frequencies almost always corresponds to a particular word.
Greg Hager, a professor of computer science researching machine vision at Johns Hopkins University, says, "For a long time, people thought that the way you would solve those problems was to understand how people would solve those things and then write a program that would do what people do."
That approach has yet to produce much success, but as Hager points out, less sophisticated, more statistical algorithms that take educated guesses are becoming increasingly accurate. Some of the best algorithms for recognizing objects in images, for instance, look for salient features, waiting until enough key points are recognized. Such an algorithm could recognize a Ford sedan from multiple angles but wouldn't be smart enough to use that experience to recognize a Chevrolet.
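The flavor of that statistical approach can be captured in a few lines of toy code -- not Hager's, and with plain strings standing in for the local image descriptors a real vision system would extract:

```python
# Toy illustration of the "enough key points" idea: declare a match once
# enough of a template's salient features show up in the scene. The feature
# names and the threshold are made up for illustration.

def recognize(template_features: set, scene_features: set, min_matches: int = 12) -> bool:
    """Report a sighting when enough of the template's salient features
    appear in the scene -- no deeper 'understanding' required."""
    return len(template_features & scene_features) >= min_matches

# A template built from one Ford sedan matches that sedan from many angles...
ford_template = {f"ford_keypoint_{i}" for i in range(40)}
street_scene = {f"ford_keypoint_{i}" for i in range(25)} | {"lamppost", "curb"}
print(recognize(ford_template, street_scene))   # True

# ...but those features say nothing about a Chevrolet.
chevy_scene = {f"chevy_keypoint_{i}" for i in range(25)}
print(recognize(ford_template, chevy_scene))    # False
```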
"It's a paper-thin notion of intelligence," Hager says, but one that's still useful in many basic cases. Expect the enterprise to benefit from similar AI-inspired computing paradigms and technologies in the very near future.
-- Peter Wayner
7. E-books
Remember the paperless office? If so, you may recall a close cousin: the e-book, which promised access to entire libraries of documents in easily readable formats -- an obvious boon to the enterprise knowledge worker on the go. Like many ideas that debuted midway through the dot-com boom, it failed spectacularly.
And yet a visit to Sony's Connect eBooks suggests that rumors of the e-book's demise have been exaggerated. For a cool $350, you can pick up the Sony Reader and start collecting from more than 11,000 titles.
But what does a shelf's worth of Michael Crichton in your pocket have to do with the enterprise? Not unlike the path to adoption taken by many devices permeating today's mobile enterprise, the e-book's "proof of concept" phase will play out on the consumer stage. And it may just be copyright protection and distribution -- rather than any paper vs. LCD debate -- that determines the technology's long-term prospects.
"Another issue, besides the prohibitive cost and cumbersome nature of e-documents, concerns the vast portion of the contracts that were signed and agreed upon before e-books came onto the scene,says litigation lawyer Esther Lim, a partner at Finnegan Henderson. "That raises questions not just in terms of what rights the user has, but what rights the publisher has vis-a-vis the copyright holder."