12 crackpot tech ideas that could transform the enterprise

These technologies straddle the divide between harebrained and brilliant as they promise to shake the pillars of tomorrow's enterprise

Technologies that push the envelope of the plausible capture our curiosity almost as quickly as the would-be crackpots who dare to concoct them become targets of our derision.

Tinkering along the fringe of possibility, hoping to solve the impossible or apply another's discovery to a real-world problem, these free thinkers navigate a razor-thin edge between crackpot and visionary. They transform our suspicion into admiration when their ideas are authenticated with technical advances that reshape how we view and interact with the world.


IT is no stranger to this spirit of experimentation. An industry in constant flux, IT is pushed forward by innovative ideas that yield advantage when applied to real-world scenarios. Sure, not every revolutionary proposal sets the IT world afire. But for every dozen paper-based storage clunkers, there's an ARPAnet to rewrite IT history -- itself a timeline of what-were-they-thinkings and who-would-have-thoughts.

It's in that tenor that we take a level-headed look at 12 technologies that have a history of raising eyebrows and suspicions. We assess the potential each has for transforming the future of the enterprise.

1. Superconducting computing

How about petaflops performance to keep that enterprise really humming? Superconducting circuits -- which have zero electrical resistance and therefore generate essentially no heat -- would certainly free you from any thermal limits on clock frequencies. But who has the funds to cool these circuits with liquid helium as required? That is, of course, assuming someone comes up with the extremely complex schemes necessary to interface this circuitry with the room-temperature components of an operable computer.

Of all the technologies proposed in the past 50 years, superconducting computing stands out as psychoceramic. IBM's program, started in the late 1960s, was canceled by the early 1980s, and Japan's Ministry of International Trade and Industry's attempt to develop a superconducting mainframe was dropped in the mid-1990s. Both resulted in clock frequencies of only a few gigahertz.

Yet the dream persists in the form of the HTMT (Hybrid Technology Multi-Threaded) program, which takes advantage of superconducting rapid single-flux quantum logic and should eventually scale to about 100GHz. Its proposed NUMA (non-uniform memory access) architecture uses superconducting processors and data buffers, cryo-SRAM (static RAM) semiconductor buffers, semiconductor DRAM main memory, and optical holographic storage in its quest for petaflops performance. Its chief obstacle? A clock cycle that will be shorter than the time it takes to transmit a signal through an entire chip.
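
To see why the clock cycle becomes the chief obstacle, consider a back-of-the-envelope calculation. The 100GHz figure comes from the HTMT target above; the chip size and signal speed below are illustrative assumptions, not HTMT specifications.

```python
# Back-of-the-envelope sketch: at HTMT-class clock rates, a signal cannot
# cross the chip within one cycle. Chip size and propagation speed are
# illustrative assumptions, not HTMT specifications.

clock_hz = 100e9                 # ~100GHz target cited for RSFQ logic
cycle_s = 1.0 / clock_hz         # one clock period: 10 picoseconds

chip_width_m = 0.01              # assume a 1 cm die
signal_speed_m_s = 0.5 * 3e8     # assume signals travel at ~half the speed of light

traversal_s = chip_width_m / signal_speed_m_s

print(f"Clock period:         {cycle_s * 1e12:.1f} ps")
print(f"Cross-chip traversal: {traversal_s * 1e12:.1f} ps")
print(f"Cycles per traversal: {traversal_s / cycle_s:.1f}")
# With these assumptions the signal needs ~67 ps to cross the die --
# roughly seven clock cycles -- so the chip can never be globally synchronous.
```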

So, unless you're the National Security Agency, which has asked for $400 million to build an HTMT-based prototype, don't hold your breath waiting for superconducting's benefits. In fact, the expected long-term impact of superconducting on the enterprise remains in range of absolute zero.

-- Martin Heller

2. Solid-state drives

Solid-state storage devices -- both RAM-based and NAND (Not And) flash-based -- have held promise as worthwhile alternatives to conventional disk drives for some time, despite the healthy dose of skepticism they inspire. The technology is by no means new, but its integration into IT will happen only when it fulfills its potential and goes mainstream.

Volatility and cost have been the Achilles' heel of external RAM-based devices for the past decade. Most come equipped with standard DIMMs, batteries, and possibly hard drives, all connected to a SCSI bus. And the more advanced models can run without power long enough to move data residing on the RAM to the internal disks, ensuring nothing is lost. Extremely expensive, the devices promise speed advantages that, until recently, were losing ground to faster SCSI and SAS drives. Recent advances, however, suggest RAM-based storage devices may pay off eventually.

As for flash-based solid-state devices, early problems -- such as slow write speeds and a finite number of writes per sector -- persist. Advances in flash technology, though, have reduced these negatives. NAND-based devices are now being introduced in sizes that make them feasible for use in high-end laptops and, presumably, servers. Samsung's latest offerings include 32GB and 64GB SSD (solid-state disk) drives with IDE and SATA interfaces. At $1,800 for the 32GB version, they're certainly not cheap, but as volume increases, pricing will come down. These drives aren't nearly the speed demons their RAM-based counterparts are, but their read latency is significantly faster than that of standard hard drives.
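
A rough latency comparison shows why that read-latency claim matters. The figures below are ballpark assumptions for a 7,200-rpm disk and a circa-2007 NAND device, not vendor specifications.

```python
# Ballpark random-read latency comparison. All numbers are illustrative
# assumptions, not measured specs for any particular drive.

# 7,200-rpm hard drive: average rotational latency is half a revolution,
# plus an assumed ~8.5 ms average seek.
rpm = 7200
rotational_latency_ms = (60_000 / rpm) / 2      # ~4.2 ms
seek_ms = 8.5                                    # assumed average seek time
hdd_read_ms = rotational_latency_ms + seek_ms    # ~12.7 ms total

# NAND flash SSD: no moving parts; assume ~0.1 ms random-read latency.
ssd_read_ms = 0.1

print(f"HDD random read:   ~{hdd_read_ms:.1f} ms")
print(f"SSD random read:   ~{ssd_read_ms:.1f} ms")
print(f"Latency advantage: ~{hdd_read_ms / ssd_read_ms:.0f}x")
```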

The state of the solid-state art may not be ready for widespread enterprise adoption yet, but it's certainly closer than skeptics think.

-- Paul Venezia

3. Autonomic computing

A datacenter with a mind of its own -- or more accurately, a brain stem of its own that would regulate the datacenter equivalents of heart rate, body temperature, and so on. That's the wacky notion IBM proposed when it unveiled its autonomic computing initiative in 2001.

Of the initiative's four pillars, which included self-configuration, self-optimization, and self-protection, it was self-healing -- the idea that hardware or software could detect problems and fix itself -- that created the most buzz. The idea was that IBM would sprinkle autonomic-computing fairy dust on a host of products, which would then work together to reduce maintenance costs and optimize datacenter utilization without human intervention.

Ask IBM today, and it will hotly deny that autonomic computing is dead. Instead it will point to this product enhancement (DB2, WebSphere, Tivoli) or that standard (Web Services Distributed Management, IT Service Management). But look closely, and you'll note that products such as IBM's Log and Trace Analyzer have been grandfathered in. How autonomic is that?

The fact is that virtualization has stolen much of the initiative's value-prop thunder: namely, resource optimization and efficient virtual server management. True, that still involves humans. But would any enterprise really want a datacenter with reptilian rule over itself?

-- Eric Knorr

4. DC power

The warm, humming bricks that convert AC from the wall to the DC used by electronics are finally drawing some much-deserved attention -- from datacenter engineers hoping to save money by wasting less energy. The waste must often be paid for twice: first to power equipment, then to run the air conditioner to remove the heat produced. One solution is to create a central power supply that distributes DC directly to rack-mounted computers. But will cutting out converters catch on, or is the buzz surrounding DC to the datacenter destined to fizzle?

Researchers at the Department of Energy's Lawrence Berkeley National Laboratory have built a prototype rack filled with computers that run directly off 380-volt DC. Bill Tschudi, principal investigator at the lab, says that the system uses 15 percent less power than do servers equipped with today's most efficient power supplies -- and that there can be even greater savings when replacing the older models still in use in most enterprises. If the server room requires cooling, as it does everywhere except in northern regions in the winter, the savings can double, because the air-conditioning bill also can be cut by 15 percent.
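
A quick calculation shows how the savings can double once cooling is included. The 15 percent figure comes from Tschudi; the assumption that cooling draws roughly as much power as the IT load itself is ours, for illustration.

```python
# Sketch of the "savings can double" claim. The 15% figure is from the
# article; the assumption that cooling consumes roughly as much energy
# as the IT load itself is illustrative.

it_load_kw = 100.0          # assumed IT load
cooling_kw = 100.0          # assume cooling draws about the same (PUE ~2)
dc_savings = 0.15           # 15% less power with 380-volt DC distribution

it_saved = it_load_kw * dc_savings            # 15 kW saved on servers
cooling_saved = cooling_kw * dc_savings       # less heat, so ~15% less cooling
total_saved = it_saved + cooling_saved        # 30 kW -- double the IT-only savings

print(f"IT-only savings:       {it_saved:.0f} kW")
print(f"With cooling included: {total_saved:.0f} kW")
```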

Others are working on bringing additional DC savings to the enterprise. Nextek Power, for instance, is building a system that integrates the traditional power grid, rooftop solar panels, and computer hardware using DC power. Choosing this standard avoids the inefficiencies of converting the DC produced by the panels to AC, then back to DC when it reaches the computers.

"It's a big opportunity, because we've shown that there's big energy savings," Tschudi says of the prospects of DC. "And it's also got more reliability because there are fewer points of failure."

Cost savings? Reliability? The prospects for DC to the datacenter are looking up.

-- Peter Wayner

5. Holographic and phase-change storage

What enterprise wouldn't benefit from a terabyte USB dongle on every key chain and every episode of Magnum, P.I. on a single disc? Thanks to phase-change memory and holographic storage, today's pipe dream is shaping up to be tomorrow's reality.

Currently under development by IBM, Macronix, and Qimonda, phase-change storage is being touted as 500 times faster and an order of magnitude smaller than traditional "floating gate" flash technology. Whereas flash memory involves the trapping of electrons, phase-change memory achieves its speed by heating a chalcogenide alloy, switching its phase between crystalline and amorphous states.

This technology could prove critical in embedded computing apps, as memory cell degradation has forced many appliance developers to add expensive NVRAM (nonvolatile RAM) to store configuration information, rather than risk premature flash failure. Once realized, it could dramatically drive down the cost of appliances and push new capabilities into enterprise handsets.

Holographic storage, on the other hand, could quickly change the way we think about CDs and DVDs. So quickly, in fact, that enterprise archiving may bypass slow-to-ship dual-layer optical drives altogether and head straight to holographic optical.

InPhase Technologies is already shipping engineering prototypes of a holographic disc storage system with 60 times the storage capacity of today's DVDs. The advent of 3-D optical storage could herald the era of sending a copy of your entire corporate database off-site affordably. Think of what holographic storage could do for personal records portability: a durable ID card that contains your entire medical file in your wallet.
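
For a sense of scale, 60 times the capacity of a standard DVD works out to roughly 280GB per disc; the 4.7GB single-layer baseline below is an assumption about which DVD format is meant.

```python
# Rough capacity estimate; assumes "today's DVDs" means a 4.7GB single-layer disc.
dvd_gb = 4.7
holographic_gb = dvd_gb * 60
print(f"~{holographic_gb:.0f} GB per holographic disc")   # ~282 GB
```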

Regardless of which technology ships first, the enterprise is likely to benefit from both soon. Maybe those data crystals from Babylon 5 aren't so far off after all.

-- Brian Chee

6. Artificial intelligence

Few terms carry as much emotional and technical baggage as AI (artificial intelligence). And while science-fiction authors probe AI's metaphysical boundaries, researchers are producing practical results. We may not have a robot for every task, but we do have cell phones that respond to our voice, data-mining tools that optimize vast industries, and thousands of other measurable ways AI-influenced computing enhances how the enterprise gets work done.

That said, AI itself remains elusive, and the measure of AI's position on the enterprise crackpot scale depends wholly on where you set the goals. Restricted to applying templates and well-defined theorems to sets of data with precise definitions, computers are, after all, becoming very adept at using statistics to make educated guesses about the world. And though speech-recognition software, for example, may not hear the actual message, whatever that means, it does know that a certain pattern of sounds and frequencies almost always corresponds to a particular word.

Greg Hager, a professor of computer science researching machine vision at Johns Hopkins University, says, "For a long time, people thought that the way you would solve those problems was to understand how people would solve those things and then write a program that would do what people do."

That approach has yet to produce much success, but as Hager points out, less sophisticated, more statistical algorithms that take educated guesses are becoming increasingly accurate. Some of the best algorithms for recognizing objects in images, for instance, look for salient features, waiting until enough key points are recognized. Such an algorithm could recognize a Ford sedan from multiple angles but wouldn't be smart enough to use that experience to recognize a Chevrolet.
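
The flavor of algorithm Hager describes can be sketched in a few lines: extract feature points, match them against a stored template, and declare a recognition only when enough key points line up. The points, thresholds, and matching rule below are toy assumptions for illustration, not any particular vision library's API.

```python
# Toy sketch of feature-based recognition: count how many of a template's
# key points find a close match in the scene, and accept the object only
# when enough of them do. Coordinates and thresholds are invented; real
# systems use engineered or learned descriptors rather than raw positions.

from math import dist

def match_count(template_points, scene_points, tolerance=5.0):
    """Count template key points that have a nearby counterpart in the scene."""
    matches = 0
    for tp in template_points:
        if any(dist(tp, sp) <= tolerance for sp in scene_points):
            matches += 1
    return matches

def recognized(template_points, scene_points, min_fraction=0.6):
    """Declare recognition once enough salient points are accounted for."""
    needed = int(len(template_points) * min_fraction)
    return match_count(template_points, scene_points) >= needed

# A "Ford sedan" template seen from one angle, and a scene containing
# most of those points plus clutter. The numbers are arbitrary.
ford_template = [(10, 12), (40, 15), (70, 14), (25, 40), (55, 42), (80, 45)]
scene = [(11, 13), (41, 14), (69, 16), (26, 41), (200, 90), (81, 44)]

print(recognized(ford_template, scene))   # True: 5 of 6 key points matched
```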

"It's a paper-thin notion of intelligence," Hager says, but one that's still useful in many basic cases. Expect the enterprise to benefit from similar AI-inspired computing paradigms and technologies in the very near future.

-- Peter Wayner

7. E-books

Remember the paperless office? If so, you may recall a close cousin: the e-book, which promised access to entire libraries of documents in easily readable formats -- an obvious boon to the enterprise knowledge worker on the go. Like many ideas that debuted midway through the dot-com boom, it failed spectacularly.

And yet a visit to Sony's Connect eBooks suggests that rumors of the e-book's demise have been exaggerated. For a cool $350, you can pick up the Sony Reader and start collecting from more than 11,000 titles.

But what does a shelf's worth of Michael Crichton in your pocket have to do with the enterprise? Not unlike the path to adoption taken by many devices permeating today's mobile enterprise, the e-book's "proof of concept" phase will play out on the consumer stage. And it may just be copyright protection and distribution -- rather than any paper vs. LCD debate -- that determines the technology's long-term prospects.

"Another issue, besides the prohibitive cost and cumbersome nature of e-documents, concerns the vast portion of the contracts that were signed and agreed upon before e-books came onto the scene,says litigation lawyer Esther Lim, a partner at Finnegan Henderson. "That raises questions not just in terms of what rights the user has, but what rights the publisher has vis-a-vis the copyright holder."

If these issues aren't resolved, the e-book market may fail to attract the kind of vendor investment necessary to overcome the technology's lingering cost and usability concerns.

So, until e-books have their day in court, the jury remains out on their viability for the enterprise.

-- Richard Gincel

8. Desktop Web applications

When asked whether a full-featured desktop app can be delivered via the Web, most people picture standard HTML forms, possibly with Java or JavaScript thrown in for aesthetics and minimal functionality, and laugh the idea off. But the full-scale apps being built for the browser using scripting languages and Adobe's Flash and Shockwave development tools will soon prove them wrong.

Flash apps started out as rudimentary games with lackluster input methods and a cartoonlike look and feel. More and more, however, they resemble native apps. Take Gliffy, for instance -- a very attractive, stable Flash app that drives like Microsoft Visio, providing full diagramming capabilities in the browser with nothing more than Flash 7 required on the client side.

Another worthwhile example is EyeOS, which looks like a Flash app but is built on PHP and JavaScript and runs off a standard Apache Web server. The array of options and eye candy in EyeOS is staggering for such a new project, clearly pushing the envelope of what such apps can do.

These projects, and others popping up all over the Net, represent the next step in Web app delivery, one that will break free of the HTML form and into interfaces that resemble fat apps. Vendors such as Scalent are already writing their UIs in Flash -- and are reaping the benefits of a simpler deployment, arguably greater cross-platform support than Java, and a more seamless, attractive user experience to boot.

As the options diversify and improve, it's a safe bet that Web-based desktop apps will reshape the enterprise soon.

-- Paul Venezia

9. Project Blackbox

A portable datacenter may seem like pie in the sky, but in fact, Sun Microsystems has already constructed it. Whether Project Blackbox, which Sun calls the first virtualized datacenter, catches on remains to be seen, but for some, the concept is compelling.

Take a 20-foot shipping container; provide it with integrated cooling, networking, and power distribution; add external hookups for hot and cold water, 208-volt three-phase AC power, and Ethernet networking; integrate sensors, alarms, and GPS; fill its eight 19-inch shock-tolerant racks with servers -- either 120 Sun Fire T2000 servers or 250 Sun Fire T1000 systems -- and you've got one or two thousand processor cores, 7TB of memory, and more than 2PB of storage. Connect them all as a grid, for simplicity.
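
The core count follows directly from the server choice, assuming fully configured eight-core UltraSPARC T1 chips in both models; the per-chip figure is our assumption for illustration, not a published Blackbox configuration.

```python
# Where "one or two thousand processor cores" comes from, assuming fully
# loaded 8-core UltraSPARC T1 chips in each server (an illustrative
# assumption, not Sun's published Blackbox spec).

cores_per_server = 8
t2000_config = 120 * cores_per_server    # 960 cores  (~"one thousand")
t1000_config = 250 * cores_per_server    # 2,000 cores ("two thousand")

print(f"120 Sun Fire T2000: {t2000_config} cores")
print(f"250 Sun Fire T1000: {t1000_config} cores")
```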

According to Sun, this configuration can support 10,000 simultaneous desktops without requiring an administrator, and it can be located almost anywhere: on a rooftop, in a parking garage, in a secure warehouse. It can be delivered rapidly, even to theaters of operation or catastrophe areas. What's more, Sun claims that a Project Blackbox datacenter is a tenth the price of a standard datacenter and that it can be turned on and configured in a day.

So if you find yourself unable to build or power or cool a datacenter fast enough to keep up with your enterprise's growth, or you're in need of a server farm on the go or at a hard-to-reach outpost such as an oil rig, you may find yourself in the market for this deliverable soon.

-- Martin Heller

10. Quantum computing and quantum cryptography

The manipulation of subatomic particles at the quantum level has raised eyebrows in computer science research departments lately -- so much so that several approaches to incorporating quantum mechanics into computing have been launched to varying degrees of success.

The most advanced field of research is quantum cryptography, a bit of a misnomer given that it doesn't rely on anything resembling traditional codes or ciphers. Instead of locking up data in a mathematical safe, the technique encodes messages in the clear by tweaking the quantum properties of photons -- a 1 may transform into a photon with "left" spin; a 0, into a photon with "right" spin.

The technique offers security because it is believed to be impossible to detect the spin of a photon without destroying or significantly altering it. So any eavesdropper would annihilate the message or change it enough for the recipient to notice. Two leaders in the field, IBM and Los Alamos National Laboratory, have built working devices and have demonstrated the transmission of photon streams through fiber optics and even the air.
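
The eavesdropping guarantee can be illustrated with a toy simulation in the spirit of BB84-style quantum key distribution: when an interceptor measures photons in randomly chosen bases, roughly a quarter of the bits the sender and receiver later compare will disagree. This is a classical simulation of the statistics, with the protocol heavily simplified, not a model of real quantum hardware.

```python
# Toy simulation of why an eavesdropper is detectable in quantum key
# distribution (BB84-flavored, heavily simplified). Measuring a photon in
# the wrong basis randomizes the bit, so an interceptor leaves a ~25%
# error rate in the positions where sender and receiver used the same basis.

import random

def transmit(n_photons, eavesdrop):
    errors, compared = 0, 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)
        alice_basis = random.randint(0, 1)
        photon = (bit, alice_basis)

        if eavesdrop:
            eve_basis = random.randint(0, 1)
            # Measuring in the wrong basis destroys the original state:
            # the re-sent photon carries a random bit in Eve's basis.
            if eve_basis != photon[1]:
                photon = (random.randint(0, 1), eve_basis)

        bob_basis = random.randint(0, 1)
        # Bob only keeps results where his basis matches Alice's.
        if bob_basis == alice_basis:
            measured = photon[0] if bob_basis == photon[1] else random.randint(0, 1)
            compared += 1
            errors += (measured != bit)
    return errors / compared

random.seed(1)
print(f"Error rate, no eavesdropper:   {transmit(20000, False):.2%}")  # ~0%
print(f"Error rate, with eavesdropper: {transmit(20000, True):.2%}")   # ~25%
```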

Another technology based on the principles of quantum mechanics, quantum computing, attempts to model computation with quantum states. The field has produced tantalizing theoretical results showing how such a computer could solve some of the most complicated problems -- such as factoring exceedingly large numbers -- far faster than any classical machine.

Quantum computing is much further from having an impact in the lab or the enterprise than quantum cryptography. No one has built a particularly useful quantum computer yet, although some researchers have built machines that work with one or two qubits. One group recently announced it is building machines that work with problems that take around 1,000 qubits to describe.

-- Peter Wayner

11. Semantic Web

Originally designed for document distribution, the Web has yet to realize its full potential for distributing data. XML has done its part. Yet every XML vocabulary requires its own schema -- and relating disparate schemas isn't easy. Until a viable means for surfacing and linking data is established and adopted, humans will remain the Web's core categorizing agents.

Enter the Semantic Web, an effort spearheaded by Tim Berners-Lee in 1999 to extend the Web and enable machines to take up this mantle. At the outset, the idea -- to transform the Web into something machines can readily analyze -- seemed hopelessly academic. Yet with significant public data sets surfacing in Semantic Web form, the once crazy notion now stands to revolutionize how enterprise IT accesses and disseminates data via the Web.

RDF (Resource Description Framework) -- the Semantic Web's standard format for data interchange -- extends the URI linking structure of the Web beyond naming the two ends of a link, allowing relationships among all manner of resources to be delineated. But the key to the Semantic Web -- and where most people's eyes glaze over -- is its use of ontologies. If specialized communities can successfully create ontologies for classifying data within their domains of expertise, the Semantic Web can knit together these ontologies, which are written using RDF Schemas, SKOS (Simple Knowledge Organization System), and OWL (Web Ontology Language), thereby facilitating machine-based discovery and distribution of data.
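
A small example makes the RDF idea concrete: resources are named with URIs, and each statement is a subject-predicate-object triple, so the link itself carries meaning. The sketch below uses the open-source rdflib package; the vocabulary and URIs are invented for illustration, and real deployments would reuse published ontologies.

```python
# Minimal RDF sketch using the rdflib package (pip install rdflib).
# The example vocabulary and URIs are invented for illustration; real
# deployments would reuse published ontologies (FOAF, Dublin Core, etc.).

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.com/ontology/")
g = Graph()
g.bind("ex", EX)

widget = URIRef("http://example.com/products/widget-42")

# Each triple names the relationship between two resources, not just a link.
g.add((widget, RDF.type, EX.Product))
g.add((widget, EX.manufacturedBy, URIRef("http://example.com/org/acme")))
g.add((widget, EX.listPrice, Literal("19.95")))

# Serialize as Turtle (rdflib 6+ returns a string).
print(g.serialize(format="turtle"))
```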

Buy-in is essential to the success of the Semantic Web. And if it continues to show promise, that buy-in seems likely.

-- Martin Heller

12. Total information awareness

When the DoD's Information Awareness Office rolled out its high-tech scheme to track down terrorists in 2002, the program had all the hallmarks of a government boondoggle, invoking imagined -- and sometimes unimaginable -- future technologies to solve an immediate problem.

First, there was the hyperbolic, Orwellian name, Total Information Awareness (TIA); then there was the project leader, convicted Iran-Contra felon Rear Admiral John Poindexter. And finally there was the bloated goal: To aggregate, store, and analyze public and private data on an unimaginably massive scale, applying a predictive model that would correlate past activities to predict future acts. Minority Report, anyone?

The project eventually got a PR makeover, emerging as "Terrorism Information Awareness." Even so, the idea was still technically far-fetched. To create a system that could scoop up and analyze citizens' or foreign nationals' credit card transactions, medical records, Web site activity, travel itineraries, e-mails, or anything with an electronic fingerprint, Poindexter called for a "total reinvention of technologies for storing and accessing information." That's the IT equivalent of a Hail Mary pass.

Ultimately, the technical hurdles became moot. Privacy advocates howled, public sentiment turned, and the Feds officially pulled the plug in 2003. Yet for all its sci-fi underpinnings, many of the technologies that constituted TIA aren't as nutty as they sound.

For instance, companies such as Teradata offer solutions that can migrate petabytes of data from disparate databases to a massive, integrated data repository, where customers can employ sophisticated data mining. Meanwhile, CallMiner and other speech analytics software enable companies to mine customer phone calls for business intelligence. And although today's predictive analysis tools may not be able to foretell a terrorist attack, they can, for example, analyze the failure rates of mechanical parts so that companies can adjust their inventories accordingly. Not too bad a technical legacy for such a mixed bag of seemingly crackpot notions.

-- Steve Fox
