In the land of IT, the one thing you can count on is a slick vendor presentation and a whole lot of hype. Eras shift, technologies change, but the sales pitch always sounds eerily familiar.
In virtually every decade there's at least one transformational technology that promises to revolutionize the enterprise, slash operational costs, reduce capital expenditures, align your IT initiatives with your core business practices, boost employee productivity, and leave your breath clean and minty fresh.
Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot.
Looking back with 20/20 hindsight, however, it's easy to see how we got sucked into believing claims that simply weren't true -- whether they were about the promise of artificial intelligence or the practicality of ERP.
Here then are six technologies -- five old, one new -- that have earned the dubious distinction of being the hype kings of their respective eras. Think about them the next time a sales team comes knocking on your door.
1. Artificial intelligence
Era: Late 1950s to present
The pitch: "Some day we will build a thinking machine. It will be a truly intelligent machine. One that can see and hear and speak. A machine that will be proud of us." -- Thinking Machines Corp., year unknown
Once upon a time, machines were going to do our thinking for us. And then, of course, they'd grow tired of doing humanity's bidding and exterminate us.
The good news? We haven't been offed by the machines (yet). The bad news is that artificial intelligence has yet to fully deliver on its promises. Take 1964, for example, when researchers at the newly created Stanford Artificial Intelligence Lab assured the Pentagon that a truly intelligent machine was only about a decade away. Guess what? It's still at least 10 years away.
Several times in the decades since, companies have tried to build an industry around AI computers -- largely supported by government research dollars -- only to run headfirst into an "AI winter" when the funding dried up.
"The longest failing technology has been AI and expert systems," says John Newton, CTO of Alfresco, an open source enterprise content management vendor. Newton began his tech career intent on studying AI, but ended up working with databases instead. "For 40 years, AI has not delivered."
Or maybe it has, but in subtler ways than the so-called thinking machines first envisioned when Stanford computer scientist John McCarthy coined the term "artificial intelligence" in 1956. Technologies that seem mundane to us now would have looked a lot like artificial intelligence 30 or 40 years ago, notes Doug Mow, senior VP of marketing at Virtusa, an IT systems consultancy.
The ability of your bank's financial software to detect potentially fraudulent activity on your accounts or alter your credit score when you miss a mortgage payment are just two of many common examples of AI at work, says Mow. Speech and handwriting recognition, business process management, data mining, and medical diagnostics -- they all owe a debt to AI.
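To make that concrete, here's a minimal sketch -- in Python, with invented thresholds and data -- of the kind of statistical outlier screening that underlies simple fraud detection. Real systems are vastly more sophisticated; this only illustrates the principle.

```python
# Minimal sketch of statistical fraud screening, a toy illustration of
# the "mundane AI" described above. The threshold and data are invented;
# real systems use far richer models and many more signals.
from statistics import mean, stdev

def flag_suspicious(history, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount is an outlier vs. account history."""
    if len(history) < 5:           # too little history to judge
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_amount != mu
    z = (new_amount - mu) / sigma
    return z > z_threshold         # unusually large spend -> flag for review

# Example: routine grocery-sized charges, then a sudden large purchase.
past = [42.0, 38.5, 55.0, 47.2, 51.3, 44.9]
print(flag_suspicious(past, 49.0))    # False: in line with history
print(flag_suspicious(past, 900.0))   # True: flagged for review
```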
Now AI is going through another resurgence. Backed by Google and NASA, AI research may provide the intelligence behind next-generation search engines and interplanetary travel systems. The next game-changing technologies could emerge not from MIT or Stanford but Ray Kurzweil's Singularity University, an institution dedicated to the convergence of machine and man. The hype continues to flow freely.
Who knows? We may yet earn the respect of our computers -- at least, enough for them to keep us around a little longer.
2. Computer-aided software engineering (CASE)
Era: Mid-1980s to mid-1990s
The pitch: "The key benefits of CASE are increases in software quality and development productivity. ... The result is better vision and understanding of the business problem and how the system works, and a clearer understanding of the system's design. With their disciplined, highly structured engineering approach and emphasis on rigid design rules, CASE tools verify consistency and completeness at early stages of the development process." -- Douglas Menendez, Internal Auditor, 1991
In the 1980s, computer-aided coding was thought to be the wave of the programming future. According to the sales pitch, software engineers would become largely obsolete, thanks to CASE tools that generated code automatically using intelligent algorithms.
One problem? Automated tools were able to perform basic tasks that saved programmers time, but they did not eliminate significant amounts of coding, let alone the programmers themselves, says Mark Shavlik, CEO of Shavlik Technologies, a provider of automated solutions for critical IT management. Shavlik says he worked on a CASE project for Boeing in the mid-1980s, until it was ultimately abandoned.
The other big problem: CASE did nothing to solve the "garbage in, garbage out" issue, says business blogger and former programmer Philip McLean.
"The idea of CASE was to produce better code faster by having a computer do it," says McLean. "Just feed your specifications into the front end, and it'll spit out flawless code. The vendors counted on customers who did not realize that the biggest problem in these projects is bad specifications, and they found a lot of those customers. So, people fed bad specs in one end and got bad code out of the other."
CASE was also supposed to eliminate an enterprise's applications backlog, so the apps would always be current with what the business needed, says Virtusa's Mow.
"That was a complete fallacy," he says. "Your business needs change so rapidly that there has to be a backlog at all times or your business becomes stagnant."
Where are CASE tools today?
"Today there are some very basic forms of automatic programming in the Microsoft development tools," says Shavlik. "They speed initial development of commonly used tasks, but they're very limited. I believe we found out that creating advanced computer programs is much more difficult than what people thought it was in the early 1980s. Maybe it was a false belief in the upside of AI at the time."
3. Thin clients
Era: Early 1990s to present
The pitch: "The financial case is clear: Thin-client computing can save 30 to 70 percent of your IT costs. Centralizing servers and server support staff leads directly to higher utilization levels. Simplified software deployment radically reduces rollout costs. Longer lifetimes of Windows-based terminals reduces capital expenditure. ... All the benefits of centralised servers and support staff are realized as are most of the benefits of powerful PCs on desks, including popular Windows applications." -- Newburn Consulting, March 2002
By distributing cheap data terminals throughout an enterprise and concentrating computing power at the network hub, thin-client systems were supposed to be easier and cheaper to maintain than a fleet of PCs -- all while being virtually free of malware and user-induced hiccups. In the mid- to late 1990s, they were promoted as a panacea for overtaxed IT departments looking to slash their total cost of ownership.
When you got up close, however, thin clients proved a bit chubbier than anyone realized, says Rob Enderle, principal analyst for the Enderle Group.
"The problem was that the servers that supported the platform weren't designed for the kind of massive I/O that the concept required, and network bandwidth wasn't up to the task, either," he says. "This made the conversion cost very high. For a solution whose primary benefit was economics, it was economically unattractive for most."
Worse, users resented giving up control over their machines, adds Mike Slavin, partner and managing director responsible for leading TPI's Innovation Center. "The technology underestimated the value users place upon having their own 'personal' computer, rather than a device analogous -- stretching to make a point here -- to the days of dumb terminals," he says.
As a result, thin clients had the biggest impact on factory floors, retail operations, call centers, and other environments where PCs did not already have a foothold.
These days thin clients are making a bit of a comeback, says Enderle, but not strictly for enterprises -- and not under the name "thin clients." "The technology is making its way into game systems, set-top boxes, connected TVs, and connected digital picture frames," he says. "Far from dead, it's undergoing a resurgence. But people are being careful not to call them 'thin clients' or even connect it to a PC experience."
4. Enterprise resource planning (ERP)
Era: Mid-1990s to present
The pitch: "Enterprise resource planning systems appear to be a dream come true. ... The latest generation of commercially available software packages promise seamless integration of all information flows -- financial and accounting, human resource, operations, supply chain, and customer information. This provides for a unified view of the business, encompassing all functions and departments by establishing a single enterprise-wide database in which all business transactions are entered, recorded, processed, monitored, and reported." -- Elizabeth and Michael Umble, Baylor University, January 2002
Ah, the promise of ERP. All your essential business processes wrapped up into a single package that everyone in the company -- from the boardroom to the mailroom -- can easily access. One enterprise app to rule them all.
Yeah, right.
"Where ERP fails," says Roger Hockenberry, EVP for IT services provider Criterion Systems, "is when you try to apply best-practices business process that are empirically created to an organically grown enterprise. Replacing process that was developed internally, over a long period of time is not an easy thing for any organization."
Your choice? Either change your people and processes to match the software (good luck with that) or customize the software to match your business (hope you brought your checkbook).
As a result, implementing an ERP project is "like teaching an elephant to do the hootchy-kootchy," in the words of CIO magazine senior editor Thomas Wailgum. Millions of dollars later, most massive ERP projects end up half finished or largely ignored by the people they were supposed to help.
A 2006 report by the Cutter Consortium found that application software packages -- predominantly ERP -- are fully implemented less than a third of the time. Forty percent of businesses reported that squeezing the promised benefits out of this software ranged from "quite difficult" to "extremely difficult." More than 90 percent of ERP projects take longer than expected to implement, and nearly 60 percent go over budget, according to the Panorama Consulting Group.
But many of ERP's failings lie with enterprises that weren't mature enough to handle the concept of integrated data flow, says Sue Metzger, management information instructor at Villanova University's School of Business. "The Y2K crisis of the late 1990s gave [ERP vendors] a great opportunity to come into enterprises and replace existing technology," she says. "Sure, some of them may have oversold and overcharged. Those were fat and happy times for many ERP providers. Organizations that had the fear of Y2K instilled in them felt ERP was the way to go, even if they weren't mature enough to handle it."
Though they never fully delivered on their promise, ERP systems are now common across large enterprises, as is the concept of data integration. "The challenge today is what to do with all that data," she says. "How do you make decisions based on that information?"
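As a minimal sketch of that "single enterprise-wide database" idea -- using Python's built-in sqlite3, with an invented schema and data -- one query can span what siloed systems would have kept apart:

```python
# Minimal sketch of the single enterprise-wide database that ERP promised.
# Uses sqlite3 from the standard library; schema and data are invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER,
                            amount REAL, status TEXT);
    INSERT INTO customers VALUES (1, 'Acme Corp');
    INSERT INTO orders VALUES (1, 1, 1200.0, 'shipped'),
                              (2, 1, 800.0, 'open');
""")

# One query joins data that separate departmental systems would silo:
for row in db.execute("""
        SELECT c.name, COUNT(o.id), SUM(o.amount)
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name"""):
    print(row)   # ('Acme Corp', 2, 2000.0)
```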
5. B-to-b marketplaces
Era: 1999 to 2002
The pitch: "By 2005, 35 percent of the Internet b-to-b trade volume will be conducted via a net market or a consortium of buyers or sellers. ... The value proposition of the Internet is on a grander scale for the b-to-b space; the sheer size of b-to-b trade, coupled with inefficient processes, makes the Internet migration of business strategies very attractive." -- Jupiter Research, June 2000
They were supposed to revolutionize the way organizations did business on the Net -- matching buyers and sellers in a dynamic, interactive bazaar that would dramatically cut procurement costs while boosting efficiency. According to Gartner, some $8.5 trillion in trades would occur annually via automated b-to-b exchanges by 2005.
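As a toy sketch of what those exchanges promised, here is a hypothetical matching loop in Python that pairs the highest bid with the lowest ask; the pricing rule and data are invented for illustration.

```python
# Toy sketch of the buyer/seller matching a b-to-b exchange promised.
# Participants, prices, and the midpoint pricing rule are all invented.
buy_bids  = [("PlantA", 105.0), ("PlantB", 98.0)]   # (buyer, max price)
sell_asks = [("MillX", 95.0), ("MillY", 110.0)]     # (seller, min price)

# Match the highest bid with the lowest ask while a trade is possible.
buy_bids.sort(key=lambda bid: -bid[1])
sell_asks.sort(key=lambda ask: ask[1])
while buy_bids and sell_asks and buy_bids[0][1] >= sell_asks[0][1]:
    buyer, bid = buy_bids.pop(0)
    seller, ask = sell_asks.pop(0)
    print(f"{buyer} buys from {seller} at {(bid + ask) / 2:.2f}")
# PlantA buys from MillX at 100.00
```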
Well, not quite. The b-to-b bubble burst shortly after it formed. Covisint -- a massive auto components marketplace jointly owned by the big carmakers -- stalled almost immediately. It was purchased by Compuware in 2004 and now functions as an "on-demand collaboration platform" for a wide range of industries. Dell opened its own exchange for parts suppliers in November 2000. Four months later it was shuttered. Others, such as GlobalNetXchange (GNX) and the WorldWide Retail Exchange (WWRE), survived by pooling their resources.