Whatever happened to yesterday's hot technologies?

Ten new economy darlings that never quite lived up to their hype

Remember push technology? Or virtual reality for the Web? Or Microsoft Bob? Some ideas are probably better left consigned to history. And yet the roadside of the information superhighway is littered with ideas that sounded promising but never quite made it to revolution status before dropping off IT’s radar.

Want examples? Look no further. InfoWorld proudly presents a top 10 list of hits that might have been but never really were. But you never know; if the right people are listening, some of these dreams might yet become reality.

The death of the mainframe

Experts have been sounding the death knell for the venerable mainframe since before the release of Windows 3.1. More than a decade later, the days of users meekly approaching the mainframe priesthood to ask for processor time are gone, to be sure, but those hulking boxes in the datacenter are far from dead.

In the 1990s, numerous companies launched IT projects designed to move applications off the mainframe and onto less expensive, more modern hardware. Unfortunately, many of those projects are still on the calendar or have been scrapped altogether.

As it turns out, some mainframe applications are quite difficult to replicate as Web applications, and the data-migration costs are astronomical. Then there’s the cost of retraining, the cost of software licensing, and the very real business risk of downtime. Despite the best of intentions, mainframes keep running.

To its credit, IBM has done more with big iron than simply maintain the status quo. Its latest mainframe hardware line, ironically named after dinosaurs, even supports Linux and Java, in addition to traditional z/OS software.

So are those customers who didn’t quit on the mainframe laughing all the way to the bank? Maybe -- if depositing funds to pay their leasing fees counts. But until the last of those legacy apps goes dark, we’ll consider reports of the mainframe’s death to be greatly exaggerated.

-- Paul Venezia

Java everywhere

From its inception, Java was meant to conquer the world. Highly object-oriented, it was more elegant than earlier languages. You could write Java code once, and it would run anywhere. It was fast. It was reliable. It was secure. It was ... well, just about anything you could want from a development tool.

Given such outrageous hype to live up to, the extent to which Java has actually succeeded is truly incredible. And yet it’s hard to ignore its list of disappointments.

Applets were Java’s first dud. Macromedia snatched away the rich-media market, relegating client-side Java to the niche of cross-platform utilities and management tools. The language eventually found its audience on the server side. But by that time, years of ever-changing SDKs and elusive, often stillborn APIs had muddied its once-elegant design, confounding neophytes and making compatibility with earlier versions hopeless.

Java 2 brought us Enterprise Java, arguably the first mature version. And yet many shops still prefer plain servlets to the more complex EJB architecture, the supposed crown jewel of J2EE. Meanwhile, those who take the J2EE plunge largely sacrifice Java’s write-once, run-anywhere promise in favor of their app server’s proprietary extensions. Today, with even staunch open source developers signing on to competing technologies such as Microsoft’s C# (via Mono), Java’s window of opportunity for world domination may be closing fast.
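To see why plain servlets kept their appeal, consider how little code a working one requires. The sketch below is purely illustrative -- the class name is made up, and a real deployment still needs a servlet container and a deployment descriptor -- but it is a complete unit of server-side Java with nothing proprietary in it.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A minimal servlet: one class, one method, no app-server-specific extensions.
public class HelloServlet extends HttpServlet {
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        response.setContentType("text/html");
        response.getWriter().println("<h1>Hello from a plain servlet</h1>");
    }
}

An equivalent EJB component, by contrast, traditionally meant home and remote interfaces, a deployment descriptor, and container-managed plumbing before the first line of business logic ever ran.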

-- Neil McAllister

Mobile broadband

Four years ago, when the economy was in full nosedive, the U.S. mobile telcos claimed that wireless broadband delivered over 3G networks was just around the corner. As it turned out, the industry had little stomach for the huge costs of deploying 3G. Instead, using infrastructure upgrades they needed for voice traffic anyway, the carriers trotted out an alphabet soup of so-called 2.5G solutions. But the resulting speeds were closer to dial-up than true broadband -- even though Sprint and Qualcomm liked to call their technology “the first phase of 3G.”

Thankfully, the mobile broadband train is finally pulling into the station. Verizon has already deployed its EvDO (Evolution-Data Optimized) service -- with downstream data rates of 300Kbps to 500Kbps -- to six major U.S. metropolitan areas, and it plans to cover the nation by the end of the year. Sprint will be next, although it lags behind Verizon in EvDO deployment.

Meanwhile, Cingular has inherited UMTS (Universal Mobile Telecommunications System) technology from its AT&T Wireless acquisition; UMTS is slightly slower than EvDO but also available in six major cities. For faster-than-EvDO speeds, however, Cingular will probably be first, using HSDPA (High-Speed Downlink Packet Access), which also features lower latency.

The caveat for all these services: Users share cell-tower bandwidth, so the more popular 3G data becomes, the bigger the potential performance hit.
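To put that in rough numbers -- the figures here are illustrative assumptions, not carrier specifications -- per-user throughput is simply the sector’s shared capacity divided by the number of users actively moving data:

// Back-of-the-envelope sketch; every number is hypothetical, not a carrier spec.
public class SharedCell {
    public static void main(String[] args) {
        double sectorCapacityKbps = 2400;            // assumed shared data capacity of one sector
        for (int activeUsers : new int[] {1, 4, 12}) {
            System.out.printf("%2d active users -> roughly %.0f Kbps each%n",
                    activeUsers, sectorCapacityKbps / activeUsers);
        }
    }
}

At a dozen simultaneous users, each one’s share of that hypothetical sector falls well below the advertised 300Kbps-to-500Kbps range.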

-- Eric Knorr

Voice recognition

As far back as the 1960s, people have been using voice recognition to control computers -- well, in the world of science fiction, at least. What’s interesting, however, is that the voice-driven UI of the computers in Star Trek may not be that far off, technologically speaking.

Dictation software, such as Dragon NaturallySpeaking and IBM ViaVoice, has made dramatic improvements in the computer’s ability to translate spoken sounds into text. Meanwhile, XML-based standards such as VoiceXML and SALT (Speech Application Language Tags) make it easier than ever for companies to integrate speech recognition into their existing application infrastructures.

In March 2004, James Mastan, director of marketing at Microsoft’s speech server group, announced, “Our goal is to make speech-recognition technologies mainstream.” You don’t get much more mainstream than an endorsement from Microsoft. And yet, even though Speech Server 2004 is a year old, most of us probably still use a mouse more often than a microphone.

It may well be that truly universal speech-based UIs have taken a back seat to a different human priority: peace and quiet. Imagine the cacophony of an entire floor of office workers barking orders into their desktop PCs. Until we solve that problem, voice control will likely remain the province of niche markets such as telecom, health care, the disabled, and Federation starships.

-- N.M.

Microsoft Passport

Five years after its launch, Passport, Microsoft’s single sign-in scheme, is down but not out. Microsoft still uses it, for example.

Everyone remembers Passport. What you may not recall is that it was conceived as a model for federated identity -- the authentication pillar of Microsoft’s .Net Web services venture, eventually code-named HailStorm. Unfortunately, while Chairman and Chief Software Architect Bill Gates was hailing Passport, a storm was swiftly brewing.

Relying on a “network of trust,” Passport is designed to allow users to travel across Web sites without having to re-enter sensitive personal data, using a single log-in and password. But trust is a fragile thing. Developers and privacy watchers who gave Passport a full-cavity search at the gate were alarmed by what they found.

Two years after Passport embarked on its bumpy ride, Bugtoaster.com, a Windows testing site, reported an OS design flaw that allowed hackers to steal user names and passwords of Passport account holders. In August 2002, the Federal Trade Commission ordered Microsoft to shore up security “inadequacies,” asserting that the company had collected data without notifying users.

Resolute promises from Redmond ensued, but the damage had been done. Adoption rates for Passport headed south, and gold-plated partners such as Monster.com and eBay withdrew. Microsoft took down its online directory of participating sites last year.

-- Richard Gincel

Improving the Internet

We all know the Internet is far from perfect. IP’s addressing problems have been discussed again and again, yet widespread adoption of IPv6 in the United States doesn’t seem to be any closer. And the IP address shortage isn’t the only problem Internet engineers could be solving.

Security is one concern that’s on everyone’s mind. End-to-end encryption on an Internetwide scale has been proposed, but there’s been little movement in that direction. And while former national cybersecurity czar Richard Clarke spoke of the potential for a “digital Pearl Harbor” in 2001, his warnings have gone largely unheeded.

Going further back, the IETF first demonstrated the potential of IP Multicast technology for broadcasting in March 1992. More than 10 years later, even though we’ve seen explosive growth of consumer interest in multimedia broadcasting and streaming, no reliable method of large-scale multicasting over the Internet -- as opposed to private research networks -- exists. And for that matter, what about Internet2? Its stated purpose is to deploy “advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow’s Internet.” So when is “tomorrow,” exactly?
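For a sense of what is being missed, joining a multicast group is trivial with the standard Java API -- the sketch below, using an arbitrary example group address and port, works on a typical LAN. The hard part, and the still-unsolved one, is getting routers across the public Internet to forward that group’s traffic reliably.

import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

// Minimal multicast receiver: join a group, wait for one datagram, print it.
public class MulticastListener {
    public static void main(String[] args) throws Exception {
        InetAddress group = InetAddress.getByName("239.1.2.3"); // example administratively scoped address
        try (MulticastSocket socket = new MulticastSocket(5000)) {
            socket.joinGroup(group);                 // ask the local network to deliver this group's traffic
            byte[] buf = new byte[1500];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);                  // blocks until a datagram addressed to the group arrives
            System.out.println(new String(packet.getData(), 0, packet.getLength()));
            socket.leaveGroup(group);
        }
    }
}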

Of course, infrastructure upgrades are the big impediment to any major change in the Internet. The bigger the Net gets, the harder it is for new technologies to become ubiquitous enough to be practical. Still, don’t we have to start somewhere?

-- N.M.

The paperless office

As alluring as it may have sounded, the paperless office was one of the great hoaxes of the 20th century, in the same league as cold fusion and the Laetrile cancer cure. The promise was that technology would make paper-based records obsolete; every bit of data an office worker or executive required would be at his or her fingertips, through the magic of computers, telephony, and other forms of electronic communication.

In fact, the paperless office actually predates the PC revolution of the 1980s. The Memex, conceived in the early 1930s by Vannevar Bush, President Franklin D. Roosevelt’s science advisor, and famously detailed in his 1945 essay “As We May Think,” is considered the first known description of the “office of the future” outside of science fiction. Visionary as it was, the Memex was nonetheless impractical in its pre-Internet dependence on microfilm and the postal service for information exchange.

Sixty years later, the paperless office remains a pipe dream. If anything, the technologies unleashed in recent years have made it effortless to generate paper. The multifunction printer -- which scans, copies, collates, and e-mails -- is the hottest thing in office technology.

Until people find reading from an illuminated display to be as soothing as reading from warm, tactile paper, the paperless office will remain as elusive as a unicorn. Not to mention that paper has heft: Nothing thumps a boardroom table like a hard-copy report.

-- P.J. Connolly

The Semantic Web

If you’re the man who invented the Web, what do you do for an encore? If you’re Tim Berners-Lee, you try to do it one better.

Berners-Lee launched his next brainchild, dubbed the Semantic Web, with the aim of solving a basic problem of online information organization. The Web links islands of information into a single, vast network, accessible by people all over the globe. But it’s a disorganized linkage: comprehensible to humans, but very difficult for automated systems to understand.

The Semantic Web aims to solve that problem through the use of metadata that describes the content of pages, expressed in new languages designed specifically for machine consumption. The eventual goal is to make it much easier for search engines, automated agents, and other content-sorting tools to find the specific information you need.
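In practice, that metadata boils down to machine-readable statements -- subject, predicate, object “triples,” in RDF terms. Here is a toy sketch of the idea in plain Java; the class and the example statement are invented for illustration, and real systems use RDF toolkits and standard vocabularies rather than hand-rolled lists.

import java.util.ArrayList;
import java.util.List;

// Toy model of RDF-style metadata: each fact is a (subject, predicate, object) triple.
public class TripleStore {
    record Triple(String subject, String predicate, String object) {}

    public static void main(String[] args) {
        List<Triple> store = new ArrayList<>();
        // "The page at example.org/report was created by Jane Doe" -- illustrative data only.
        store.add(new Triple("http://example.org/report",
                             "http://purl.org/dc/elements/1.1/creator",
                             "Jane Doe"));
        // An automated agent can now query by predicate instead of guessing from prose.
        store.stream()
             .filter(t -> t.predicate().endsWith("creator"))
             .forEach(t -> System.out.println(t.subject() + " was created by " + t.object()));
    }
}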

It’s a great idea, and in the years since Berners-Lee published his first road map in September 1998, the Semantic Web has blossomed into a full-fledged W3C Activity and has been much discussed in the media. Other than research applications, however, actual working examples still seem a long way off.

But should we be surprised? In the end, the real paradigm shift of the Semantic Web may not be the technology but the idea of thinking of the Web as a library -- instead of just the “new TV.”

-- N.M.

Artificial intelligence

The term artificial intelligence was coined at Dartmouth College in 1956. During the next two or three decades, it blossomed into a huge industry, with countless research projects devoted to exploring its possibilities. Yet today it seems to have all but disappeared.

In actuality, AI was composed of three main branches: the KBS (knowledge-based system), RBS (rule-based system), and ANN (artificial neural network). To date, only the RBS has survived in commerce, where it is commonly found in the form of BRMSes (business rules management systems).
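At its core, the surviving branch is simple: rules of the form “if these facts hold, then assert this conclusion,” applied repeatedly until nothing new fires. The toy forward-chaining sketch below -- with facts and rules invented purely for illustration -- shows the mechanism that commercial BRMSes wrap in rule languages, repositories, and management tooling.

import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Toy forward-chaining rule engine: keep applying rules until no new facts are derived.
public class TinyRules {
    record Rule(Set<String> conditions, String conclusion) {}

    public static void main(String[] args) {
        List<Rule> rules = List.of(
            new Rule(Set.of("customer is returning", "order over 500"), "offer discount"),
            new Rule(Set.of("offer discount"), "notify sales rep"));
        Set<String> facts = new HashSet<>(Set.of("customer is returning", "order over 500"));

        boolean changed = true;
        while (changed) {
            changed = false;
            for (Rule rule : rules) {
                if (facts.containsAll(rule.conditions()) && facts.add(rule.conclusion())) {
                    changed = true;   // a rule fired and contributed a new fact
                }
            }
        }
        System.out.println("Derived facts: " + facts);
    }
}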

The biggies in this category have evolved into massive systems that can do almost anything a developer might want. The problem is that it takes years to learn all the tricks and traps; by then, there’s a new system to learn.

Customers and vendors spend far too much time comparing standard benchmarks, rather than taking the time to design tests that are truly representative of real-world problems. The trouble with a BRMS is that it demands 50 percent to 70 percent of a project’s schedule up front to think things out -- before anyone writes a line of code -- while most project managers prefer immediate results.

All told, there are very few real BRMS success stories when compared with the many attempts that never made it past Phase Two. Is this truly the best AI had to offer?

-- James Owen

B-to-B e-commerce

As the Web economy began to cool in 2000, any savvy entrepreneur who was still in the dot-com game latched onto b-to-b as the hand to play. The hot buzzword of the next-generation online business model was “disintermediation” -- nothing less than cutting out the middleman (and not much more). As it turned out, however, using the Web to wed partners and wheedle deals behind the scenes wasn’t a recipe for runaway success, any more than the consumer-facing e-commerce model that came before it.

For one thing, the integration technologies necessary for it to work -- such as Web services -- simply weren’t in place in 2000. For another, there wasn’t much incentive for the supply side to participate -- why pay a b-to-b vendor to enter a bidding war in which you’d be forced to reduce your prices and compete for customers you thought you’d already won? Finally, b-to-b’s basic premise failed to recognize that, unlike the world of consumer sales, business relationships are predicated on a lot more than just cost and availability.

But it may be too soon to label b-to-b as down for the count. The growing market for SOA (service-oriented architecture) and modular applications may prove to be an ideal opportunity for the enterprising infrastructure players -- provided that, this time, they temper their ambitions with a little realism.

-- N.M.
