Crackpot technologies that could shake up IT

Eight more technologies that straddle the divide between harebrained and brilliant -- each with a promise to transform the future of the enterprise

It doesn't take a genius to catch on to the fact that in IT, innovation is a mandate. Push the envelope of what's possible, or find yourself relegated to the wayside. But, to borrow a favorite aphorism from Spinal Tap's David St. Hubbins, there's a fine line between clever and crackpot when it comes to making good on technological breakthroughs in the enterprise.

It is in that spirit that we revisit last year's level-headed look at crackpot technologies that could transform the enterprise, putting the screws to a new rack of could-be enterprise contenders. But before you write off nanotech or direct brain interfaces as the next big enterprise thing, consider this:

[ For an in-depth look at last year's crackpot tech assessments, see: 12 crackpot tech ideas that could transform the enterprise ]

Of the dozen technologies we examined last year, several have made significant enterprise-minded strides since we first assessed their IT prospects. Desktop Web apps, for one, lent credence to the conjecture that fat productivity suites might just have a shorter future than previously thought; solid-state drives popped up everywhere, from ultramobile laptops to the datacenter; and Sun's datacenter-in-a-box proved compelling enough for Google to move to patent the concept first. Even quantum cryptography received a vote of confidence, with Switzerland tapping the technology to protect parliamentary election transmissions.

Sure, some of this year's out-there technologies may prove fruit for future high-profile tech flops lists, but without forward-thinking, there would be no worthwhile enterprise advancement. So raise your eyebrows or your suspicions as you deem fit, and join us in assessing the potential each of the following technologies has to earn IT's respect or derision.

• Nanotechnology
• Optical computing
• Pervasive computing
• Wireless power
• The $100 laptop
• Direct brain interfaces
• Enterprise supercomputing
• Virtual worlds

Nanotechnology

No technology has the potential to revolutionize enterprise computing like nanotechnology -- at least that's the impression given by the breadth and intensity of experiments in going small these days. Practical or not, nearly every corner of the enterprise stockyard is being injected with nanotech -- displays, computers, even light bulbs. In fact, today I took nanotech to the slopes, skiing on Sterling skis with a "nano-carbon" base derived from World Cup technology.

But is there enough substance beneath the science to move nanotechnology beyond crackpot and into the enterprise? The answer, of course, depends on where you look.

Nanotech's ties to quantum computing place it firmly in long-shot territory, but emerging nanotechnologies for storage, batteries, and even chip cooling are showing nearer-term promise, at least in the labs.

Arizona State University's Center for Applied Nanoionics (CANi) has advanced nanostorage by pursuing two leading nanotech approaches simultaneously: tapping special materials and switching from a charge-based to a resistance-based framework.

Published in the October 2007 issue of IEEE Transactions on Electron Devices under the title "Bipolar and Unipolar Resistive Switching in Cu-doped SiO2," the concept uses materials already common to chip design (silicon dioxide and copper) but does so in a new way. By doping the silicon dioxide with copper, the technology enables a leap in memory capacity and energy efficiency, according to Michael Kozicki, director of CANi.

"Because it is so low energy, we can pack a lot of memory and not drain battery power; and it’s not volatile -- you can switch everything off and retain information," Kozicki says. "What makes this significant is that we are using materials that are already in use in the semiconductor industry to create a component that’s never been thought of before."

As the appetite for computational power spreads across an ever-increasing array of devices, so will the need for low-power, abundant storage. Although not likely to emerge in commercial applications in the next year or two, nanostorage is a research area that shows significant promise.

Of course, powering these devices is another issue -- and another area where nanotechnology may come to the rescue.

In December, Stanford announced a breakthrough in lithium-ion (Li-ion) energy storage using silicon nanowires that could increase the storage capacity of Li-ion batteries nearly tenfold, according to Yi Cui, assistant professor of materials science and engineering and the leader of the research team.

Li-ion battery capacities are limited by how much lithium can be held in the battery's anode. Typically, anodes are carbon, but silicon can hold much more lithium. Silicon has a drawback, however: It swells as it absorbs lithium during charging and shrinks during discharge. This expand/contract cycle causes the silicon to break down over time, degrading the battery. Yi Cui and his team used silicon nanowires instead, and found that although the nanowires expand to four times their normal size during charging, they do not fracture.
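That "nearly tenfold" figure squares with a back-of-the-envelope comparison of the theoretical specific capacities commonly cited for the two anode materials (the numbers below are approximate literature values, not figures from the Stanford work):

    # Back-of-the-envelope anode comparison using commonly cited
    # theoretical specific capacities (approximate literature values).
    GRAPHITE_MAH_PER_G = 372   # LiC6, the conventional carbon anode
    SILICON_MAH_PER_G = 4200   # fully lithiated silicon

    print(SILICON_MAH_PER_G / GRAPHITE_MAH_PER_G)  # roughly 11x in theory

    # Practical gains run lower, since the anode is only one component
    # of the cell -- hence "nearly tenfold" at the battery level.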

This technology could emerge on the market relatively soon, especially if an established battery firm partners with the Stanford team.

With all this increased computing power comes an increase in heat, and a need to dissipate it. Researchers at the Birck Nanotechnology Center in Discovery Park at Purdue have approached this problem in a new way, growing carbon nanotubes on top of microprocessor chips to conduct heat away from the computational core. Described by the researchers as a "Velcro-like nanocarpet," the collection of tubes pulls heat away from the chip and into the heat sink for dissipation.
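Fourier's law hints at why nanotube "carpets" are attractive: for a fixed geometry and temperature drop, conducted heat scales linearly with thermal conductivity, and individual carbon nanotubes are commonly credited with conductivities several times that of copper. A rough sketch, with invented geometry and commonly cited conductivity figures:

    # Rough Fourier's-law intuition: Q = k * A * dT / L, so heat flow
    # scales linearly with thermal conductivity k for fixed geometry.
    # Conductivities are commonly cited approximations; real interface
    # performance also depends on contact quality, which is what the
    # "nanocarpet" geometry is meant to address.
    K_COPPER = 400.0      # W/(m*K), bulk copper
    K_NANOTUBE = 3000.0   # W/(m*K), often cited for individual tubes

    def heat_flow(k, area_m2, delta_t_k, thickness_m):
        return k * area_m2 * delta_t_k / thickness_m

    geometry = (1e-4, 40.0, 1e-4)  # 1 cm^2 patch, 40 K drop, 0.1 mm layer
    ratio = heat_flow(K_NANOTUBE, *geometry) / heat_flow(K_COPPER, *geometry)
    print(f"nanotubes move ~{ratio:.1f}x the heat of copper, all else equal")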

Performance, capacity, and efficiency have long been the markers of worthwhile ingenuity in the enterprise. And nanotech is aiming to deliver all three. Although these solutions are not imminent, their emergence is likely to change the face of the enterprise as they find practical means for integrating with emerging computing technologies.

-- Stephen Sven Hultquist


Optical computing

For years, chipmakers have bumped against the ceiling of Moore's Law. Current fabrication techniques can't keep CPU speeds climbing at the meteoric rates of decades past. Because of this, today's advances focus on multiple cores and power savings, rather than raw speed. But what if there were a new way to build chips -- one that would accelerate processing literally to the speed of light?

Using light to turbocharge data transfer is not new. SANs have benefited from high-speed fiber-optic links for years, and U.S. telecommunications providers have begun using similar technology to offer customers Internet bandwidth comparable to that of a dedicated LAN.

Building entire computers with optical circuitry would certainly provide significant advantages. First, particles of light, aka photons, travel much faster than electrons. Perhaps more important, they do not radiate energy the way electrons do, even at high frequencies. Thus, not only would purely optical chips outperform anything previously known, but the cooling problems that plague electronic microprocessors would virtually disappear.

It's no surprise, then, that scientists have pursued optical computing for decades. For many years, their efforts yielded only purpose-built devices suited to specialized tasks. Then, in 1993, researchers at the University of Colorado announced the first experimental general-purpose optical computer. Although the prototype proved optical computing was technically feasible, a practical design with real-world applications remains elusive.

For starters, the Colorado device could move only one photon at a time -- unlike traditional supercomputers, which process thousands of operations simultaneously. Worse, it relied on large bursts of photons to control the individual photons that made up its data, making the computer extremely energy-inefficient.

But the quest isn't over. In 2007, the Lukin Group at Harvard announced an optical circuit that needed only a single photon to operate. Still, 15 years after the University of Colorado experiment, we're no closer to a mainstream optical computer -- or even a working prototype that bests an ordinary PC. Photon-powered processing may yet be in your future, but you just might have to travel at the speed of light to get there.

-- Neil McAllister


Pervasive computing

How many times have you missed an important phone call because you were in a meeting? Or wasted 15 minutes searching the building for one of your co-workers? Or sent an important job to the printer, only to find that it's out of toner? Fear not; top technology minds are at work on these problems, and more.

Imagine a meeting room that records the identities of everyone who enters it and updates an availability database, allowing urgent calls to be routed accordingly. Every office could do the same, making it possible to track down anyone, anywhere in the building. Meanwhile, your networked printer could monitor its own toner levels and compare them against past usage patterns, then use its network connection to order replacement toner when the time is right. With automation like this, your office could be running like clockwork in no time.
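None of this requires exotic hardware; the printer scenario, for one, is mostly plumbing. Here's a minimal sketch in Python of such a self-monitoring agent -- the usage model and the ordering hook are hypothetical placeholders, not any real printer's API:

    # Sketch of a self-monitoring printer agent. The usage model and
    # ordering hook are hypothetical, not a real printer's interface.
    from datetime import date, timedelta

    REORDER_LEAD_DAYS = 14  # reorder when two weeks of toner remain

    def check_and_reorder(toner_pct, daily_usage_pct_history, place_order):
        # Estimate the burn rate from past usage, then project forward.
        avg_daily = sum(daily_usage_pct_history) / len(daily_usage_pct_history)
        if avg_daily <= 0:
            return  # no usage trend yet; nothing to project
        days_left = toner_pct / avg_daily
        if days_left <= REORDER_LEAD_DAYS:
            runs_out = date.today() + timedelta(days=days_left)
            place_order("toner-cartridge", needed_by=runs_out)

    # Usage: check_and_reorder(18.0, [1.2, 1.5, 1.1], my_order_fn)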

Welcome to pervasive computing, a future that promises a chip in every pot and a smart card in every garage. Instead of a desktop PC that acts as the hub for your digital needs, pervasive computing envisions a seamless web of embedded devices, all working behind the scenes to track and manage your day-to-day tasks.

In a fully pervasive computing world, even the concept of a digital device begins to blur. Computers won't be limited to metal boxes, glowing screens, and arcane UIs. Instead, they'll be hidden away in your clothes, your wallet, and your office walls.

It sounds like something out of the Jetsons, and indeed, pervasive computing is an idea almost as old as the personal computer itself. But it may be closer than you think. Historians credit Mark Weiser of Xerox PARC with inventing the concept in the 1980s, and since then, researchers at the likes of HP, IBM, and MIT's Media Lab have built on Weiser's work. Recent years have already seen such innovations as computerized car locks and RFID-enabled passports.

But just because we can build it doesn't mean we should. Ignore the privacy implications for now. In today's economy, what company would be willing to pay for it? Building a mesh of tiny microprocessors into everyday objects is a frivolous expense compared with the laundry list of tasks facing today's cash-strapped IT departments. After all, just how hard is it to order printer toner, really?

-- Neil McAllister


Wireless power

Ever since Nikola Tesla lit bulbs on his grounds without wires, the promise of wireless power has stirred hope and interest but yielded little success. Today, with the wide array of electronic devices that require power, the promise of powering the enterprise through the air seems as crackpot as ever.

Yet practical implementations of wireless power have emerged in the past few years, including the induction charging of electric toothbrushes -- and, less practically, the wireless extension cords ThinkGeek "introduced" in 2006, on April Fools' Day.

In fact, a few companies are actively pursuing wireless power, including Splashpower and Powercast.

Splashpower has developed charging systems that use magnetic induction. Placing an enabled device on the charging station charges it without any mechanical connection. Just as with that toothbrush, power in the charging platform creates an electromagnetic field that effectively transmits power to a module embedded in the device.

Powercast has taken a different approach, as reflected in the first commercial implementation of its technology: an LED-lit wireless Christmas tree created with Philips and sold last Christmas season in the high-end Frontgate catalog.

Rather than tapping induction, Powercast uses RF (radio frequency) transmissions with high-efficiency transmitters and receivers to broadcast power to the devices. Powercast is targeting a broad range of devices, including such widely varying items as hearing aids, biological weapons sensors, and wireless headsets.

Tesla's early patents on wireless transmission of power were awarded in 1900. His diagrams show inductive coils that can transmit large amounts of power over great distances. Carrying out his invention on an industrial scale, he said, would let cities draw electricity from places where cheaper power was attainable. Given the rising cost of energy, the ability to feed a datacenter with power from a distant, inexpensive source such as a hydroelectric dam would have considerable impact on the enterprise.

In June 2007, a team from MIT's Department of Physics, Department of Electrical Engineering and Computer Science, and Institute for Soldier Nanotechnologies experimentally demonstrated wireless power using magnetically coupled resonances. Using resonance, the team demonstrated a much more efficient transmission of power over distance than is typical. Of course, the team's devices are a bit on the bulky side, but the demonstration of the concept shows promise.
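A sense of why resonance matters comes from a figure of merit widely used for inductively coupled links: best-case link efficiency depends on k^2*Q1*Q2, the squared coupling coefficient times the quality factors of the two coils. High-Q resonators keep that product large even when coupling is weak. A minimal sketch, using the commonly cited closed form and invented coil values:

    # Best-case efficiency of a resonant inductive link, using the
    # widely cited closed form eta = x / (1 + sqrt(1 + x))^2, where
    # x = k^2 * Q1 * Q2. Coil values below are invented for illustration.
    from math import sqrt

    def max_link_efficiency(k, q1, q2):
        x = (k ** 2) * q1 * q2
        return x / (1.0 + sqrt(1.0 + x)) ** 2

    # Coupling k falls off quickly with distance, but high-Q resonators
    # (Q ~ 1000 here, an assumption) prop up the k^2*Q1*Q2 product.
    for k in (0.1, 0.01, 0.001):
        print(f"k={k}: efficiency ~{max_link_efficiency(k, 1000, 1000):.0%}")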

Although we have yet to see the full fruition of Tesla's brilliant vision, some early work is beginning to demonstrate that there may, in fact, be some hope for moving power without wires -- and, one hopes, without dire health consequences for the human beings exposed to it.

-- Stephen Sven Hultquist


The $100 laptop

The One Laptop Per Child (OLPC) program hasn't fared as well as its founders may have hoped. Not only did the group considerably overshoot its targeted $100 price tag; it has also been plagued by manufacturing problems and commercial competition.

That doesn't mean the dream of a sub-$100, low-power laptop is unachievable. In some ways, the OLPC's XO serves as an "alpha" model of where the PC market could head -- and not just in the developing world. In fact, the project's former CTO sees a future for low-cost, highly power-efficient machines in the commercial market.

And the more you look at the cost of outfitting enterprise end-users with computing resources beyond their needs, the less crackpot the idea of leaning on OLPC-like laptops in the enterprise seems -- especially in terms of energy efficiency.

These days, a solid, all-purpose laptop averages $1,440, yet such machines pack far more power than the average end-user requires -- as much as 80 to 90 percent more. Thus, while the user takes advantage of just 10 to 20 percent of a system's power, the machine continues to draw upwards of 280 watts of energy. Beyond cutting into the laptop's battery life, those wasted watts translate to wasted dollars spent powering the system -- and cooling the rooms where PCs congregate.

The OLPC XO, on the other hand, was designed with power-efficiency in mind. Initially targeted at users with spotty access to electricity, these babies can run on less than two watts of power, resulting in an estimated battery life of 21 hours.
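Taking those wattage figures at face value, the arithmetic is stark. Here's a quick sketch of annual energy cost per machine, assuming an illustrative rate of 10 cents per kilowatt-hour and an eight-hour day, 250 days a year:

    # Annual energy cost per machine, using the wattage figures quoted
    # above; the electricity rate and duty cycle are assumptions.
    RATE_PER_KWH = 0.10          # dollars, illustrative
    HOURS_PER_YEAR = 8 * 250     # eight-hour day, ~250 workdays

    def annual_cost_dollars(watts):
        return watts / 1000.0 * HOURS_PER_YEAR * RATE_PER_KWH

    print(annual_cost_dollars(280))  # conventional laptop: ~$56/year
    print(annual_cost_dollars(2))    # OLPC XO: ~$0.40/year

Multiply by a few thousand seats, and the efficiency case starts to write itself.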

The tradeoff for low energy consumption is a less-powerful CPU. Yet the trend toward thin clients in the enterprise is fast proving that the average end-user can get along just fine, productivity-wise, with an inexpensive commodity processor (the XO comes with a 433MHz chip) -- especially if the system isn't bogged down with a fat OS platform and applications. Thanks to the evolution of Web services such as Google Apps and Salesforce.com, lightweight desktop apps can be supplemented or replaced, as appropriate, with a browser and ubiquitous Internet access.

Meanwhile on the platform side, even Microsoft, king of the fat OS, must foresee a future of lightweight PCs, as it is working to trim the excess code from Windows to match sleeker Linux alternatives.

They may look like toys, but today's "$100 laptops" may very well serve as prototypes for a significant portion of tomorrow's enterprise computing paradigm, especially as the trends and cost constraints of delivering apps to end-users evolve and specialize. After all, designing technology in support of policy -- as was the case with the OLPC -- should catch on in the corporate world as well, especially as companies come to realize that centering purchasing policies on energy efficiency just makes sense.

-- Ted Samson


Direct brain interfaces

Ready to think away that backlog of IT tasks to a more manageable stack? Or to get a handle on the hot new IT skill without lifting a finger? If scientists are successful, such power could be within IT's grasp, because the computers of the future may plug directly into your brain.

Technological telepathy has been the stuff of science fiction for years. In the 1956 film Forbidden Planet, for example, alien machines could bring any thought to life, while characters in the more recent Matrix trilogy bypassed years of tedious education via instant brain uploads. Although such tricks are a ways off, experimental brain-computer interfaces exist today.

For now, the goal of most direct brain interface research is to develop assistive technologies for the physically disabled. Researchers at Brown University, for example, have created a brain implant that allows quadriplegic patients to move a mouse cursor around a computer screen using thought alone. And in a separate experiment conducted at Boston University, scientists have been able to recreate audible sounds by processing data gathered from the speech centers of the brain of a paralyzed man, with what they claim is 80 percent accuracy.
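To give a flavor of how thought-driven cursor control can work: many experimental systems map recorded neural firing rates to cursor velocity with a simple linear decoder fit by least squares. The sketch below illustrates that general idea on synthetic data; it is not the Brown team's actual algorithm:

    # Generic linear-decoder sketch: estimate 2-D cursor velocity as a
    # weighted sum of neural firing rates. Synthetic data throughout;
    # this is not the Brown group's actual method.
    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons, n_samples = 30, 500

    # Fake "calibration" session: firing rates recorded alongside the
    # cursor velocities the user was asked to imagine.
    rates = rng.poisson(10, size=(n_samples, n_neurons)).astype(float)
    hidden_tuning = rng.normal(size=(n_neurons, 2))
    velocities = rates @ hidden_tuning + rng.normal(scale=5, size=(n_samples, 2))

    # Fit the decoder by least squares, then decode a new sample.
    weights, *_ = np.linalg.lstsq(rates, velocities, rcond=None)
    vx, vy = (rng.poisson(10, size=(1, n_neurons)).astype(float) @ weights)[0]
    print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")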

Mind reading, it isn't -- not yet, anyway. But our understanding of the electrical workings of the brain has advanced so rapidly in recent years that we've scarcely had time to ponder the ethical questions raised by these new technologies, let alone which enterprise tasks we'd dream of gearing them toward.

Few would argue against using high tech to enhance the quality of life of the disabled; cochlear implants that interface directly with auditory nerves, for example, are now routinely used to treat total deafness. But what if similar technology could be used to enhance the hearing of normal, healthy people to superhuman levels? What if future implants could enhance cognition using microprocessors to create a kind of "human calculator"? Would it be moral to plug a super-calculating admin into a server to monitor financial transactions? These and similar questions have spawned an entirely new field of philosophy, which New York Times columnist William Safire has dubbed "neuroethics."

And that's to say nothing of the implications should we ever gain the ability to plug our brains directly into the Internet. Imagine waking up in the morning to a headful of spam or finding out that your "little black book" has been phished by the person sitting next to you on the subway.

Advancing direct brain interface technology to the point of feasibility in corporate settings is one thing. Governance and privacy concerns, within and outside the corporate context, are quite another. Don't expect to be thinking away your IT to-do list anytime soon.

-- Neil McAllister


Enterprise supercomputing

A modern, global enterprise is incredibly complex. Balancing materials availability forecasts with predicted sales trends and seasonal marketing strategies can seem like pure wizardry. But what if you had some help, in the form of a massive electronic brain that could handle the number-crunching for you?

Until recently, supercomputers were the exclusive domain of large universities and government research labs. Massive, arcane, and impossibly expensive, they required operational and maintenance skills far beyond the capabilities of your average enterprise IT department. But new developments in HPC (high-performance computing) technology are putting supercomputer-level performance within the enterprise's reach. The only question is, does the enterprise have use for it?

The HPC field has changed dramatically over the past decade. Today, distributed-processing software allows even desktop PCs to join compute clusters and crunch numbers in their idle moments. Networked parallel processing technology makes it possible to build supercomputer-class systems from mainstream, off-the-shelf hardware and open source software. And in the past few years, companies such as IBM and Sun Microsystems have begun offering time-shared HPC services at affordable rates.
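The underlying idea -- split a big job into independent chunks and farm them out to whatever compute is available -- is simple enough to sketch. Here's a minimal single-machine stand-in in Python, with a process pool playing the role of the cluster:

    # Minimal stand-in for cluster-style number crunching: split a job
    # into independent chunks and farm them out to worker processes.
    # A real cluster would scatter chunks over the network instead.
    from multiprocessing import Pool

    def crunch(chunk):
        # Placeholder for an expensive computation on one chunk.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
        with Pool() as pool:
            partials = pool.map(crunch, chunks)  # scatter
        print(sum(partials))                     # gather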

This is great news for the oil and gas, finance, and insurance industries, which have long relied on HPC for intensive calculations and complex mathematical modeling. But for more typical enterprises, supercomputing technology remains a tough sell. The promise is enticing, but the hurdles call into question how many businesses realistically need to perform calculations on the order of those required to predict global weather patterns or model the stock market.

And cost is not the only barrier to entry for HPC. Before any massively parallel supercomputing application can run, it needs a data set to process. As any IT manager can attest, enterprise data is too often scattered across multiple, disparate systems, each with its own interfaces and data formats. As the growing market for data integration and SOA (service-oriented architecture) technology attests, unifying this data is no easy task -- and until it is unified, relying on it for serious computational modeling is out of the question.

So, while raw processing power may be available and affordable like never before, don't expect HPC to become a line item on your budget anytime soon. For most enterprise IT departments, those dollars will be better spent on traditional expenditures such as middleware and data warehousing, leaving mass-market supercomputing relegated to the category of the possible, but impractical.

-- Neil McAllister


Virtual worlds

The likelihood of Second Life having a long-term impact on the enterprise may appear virtually nonexistent, but consider this: Education, collaboration, and networking -- three productivity mandates for today's enterprise -- are fast catching on in the virtual world.

Before laughing and glancing sideways at your well-worn copy of Snow Crash, know that even old-guard institutions such as Harvard University have a Second Life presence, with virtual campuses where learning, discussion, and content creation occur.

Training, for one, has real ROI potential in Second Life, as virtual worlds can expose participants to RL (real life) learning scenarios that would otherwise be too expensive or dangerous to explore. Take dealing with a pandemic flu: Medical students are already tapping virtual worlds to learn how best to respond. Nor is there any need to pay for a trip to a foreign country to learn language basics: Immersive virtual language study lets you travel to worlds where only the target language is spoken, with all signs and advertisements written in the language being learned.

Collaboration and networking are two other sweet spots for companies to make use of virtual worlds. Tech heavy hitters such as Dell, IBM, Microsoft, and Sun Microsystems are already tapping Second Life as a platform for development, conferences, and forums. IBM, which has established a Business Center in Second Life, boasts nearly 4,000 employees with Second Life avatars to date, with about 1,000 routinely conducting company business inside Second Life.

But what of the many technologies already serving companies' collaboration, networking, and training needs? How can virtual worlds find a long-term place in the mix?

"The 3D aspects and the ability to put a whole group of people in the same 'space' at a distance, where everyone can hear everyone else as you would in a real hall or space, gives SL an advantage over other social networking systems, chat systems, or conference calls," says Todd Cochrane, of the Wellington Institute of Technology in New Zealand. "People seem to be more engaged."

And that is the immeasurable edge virtual worlds may have over traditional modes of training and collaboration: user engagement -- an edge that may only grow as Generation Y, raised on virtual technologies such as Second Life, enters the workforce.


Of course, the anonymity people tend to prefer in the virtual world hinders collaboration's carryover into the real world. Moreover, plugging in to Second Life for business-grade collaboration has other drawbacks, such as quality of experience (SL frequently slows down or crashes for a variety of reasons), privacy (depending on the type of conversation, others can often "hear" you), and security. But as the technology matures, these issues will no doubt be addressed.

Either way, crackpot or not, tapping virtual worlds such as Second Life in a corporate setting has already drawn significant interest.

"Once more we have the very strong feeling that [Second Life] will have a huge impact on business, society and our personal lives, although none of us can quite predict what that impact will be," Irving Wladawsky-Berger, chairman emeritus of the IBM Academy of Technology and visiting professor of engineering systems at MIT, wrote in a blog more than a year ago. "It will be fascinating to see where this ride takes us in the future."

-- J. Peter Bruzzese

