It's springtime at last for cognitive computing

Artificial intelligence suffered a long winter, but a new name -- cognitive computing -- and a flood of data, innovation, and compute power now have thousands of smart applications flourishing

I’ve been in the IT industry since the mid-1980s, and I’ve seen technology fads come and go. I’m as jaded as the next industry veteran.

For most of my career, artificial intelligence (AI) was generally derided as a research-driven fad that failed to achieve its early promise. The so-called AI winter refers to the successive waves of AI hype and disillusionment triggered by the underwhelming commercial adoption of supposedly intelligent systems. The net effect of those waves was to harden longtime industry observers in their cynicism and to keep AI from growing beyond a niche into a substantial piece of the IT mainstream.

Cut to the early 2010s and the emergence of cognitive computing, as exemplified by IBM Watson. If you take the long view on such technologies, you can’t help but notice that cognitive computing is essentially the latest generation of AI.

Cognitive computing is AI that feeds on big data. In this new AI era, automated systems derive intelligence by feeding big data into machine learning and other statistical algorithms, contextualizing it all in rich metadata, and using it to generate the analytics that drive decision engines and process automation in real time.
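To make that pipeline concrete, here is a minimal sketch of the pattern the paragraph describes: historical data trains a statistical model, which then scores incoming events to drive an automated decision in real time. It assumes a scikit-learn-style workflow; the synthetic dataset, the features, and the 0.8 escalation threshold are hypothetical stand-ins, not any particular vendor's implementation.

```python
# Minimal sketch: accumulated ("big") data feeds a statistical learning
# algorithm, and the fitted model then scores new events to drive an
# automated decision. Assumes scikit-learn; the data and the 0.8
# threshold are hypothetical.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for a large labeled event stream (e.g., clickstream or sensor data).
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "machine learning and other statistical algorithms" step.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The decision-engine step: score each incoming event and automate a response.
for event in X_test[:5]:
    score = model.predict_proba(event.reshape(1, -1))[0, 1]
    action = "escalate" if score > 0.8 else "proceed"
    print(f"score={score:.2f} -> {action}")
```

Real deployments differ enormously in scale and tooling, but the shape -- train on accumulated data, score live events, act on the score -- is the core of the pattern described above.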

It’s understandable that many industry veterans might be slow to get behind this latest wave -- and why many might anticipate yet another underperforming AI cycle in the making. That skepticism was rife in some circles of the big data analytics industry until only recently. But the backlash against cognitive computing is waning, for a good reason: The technology is being widely adopted and applied to a broad range of commercial applications.

You might say that this time around we’ve avoided another AI winter and are now moving into the brighter days of a “cognitive spring.” Evidence for this hopeful new season in AI’s commercial development is everywhere:

Data scientists are the prime tillers of the big data topsoil from which cognitive computing apps are springing. Extending this horticultural metaphor a bit further, machine learning is the plow that unlocks the intelligence lying fallow in the big data substrate.

In this emerging era, the fruits of cognitive computing are everywhere we look: in cloud-based intelligence that drives practical applications of social media, smartphones, online commerce, streaming media, computer vision, voice recognition, smart cars, sensor grids, global positioning, real-time surveillance, and much more.

Could a popular backlash against cognitive computing still be in the works? If so, what might cause it? Just as important, what’s the likelihood that any such backlash would slow this technology’s momentum as it pushes more deeply into the mainstream of consumer, business, industrial, and other big data applications?

On some level, the widespread unease with the algorithmic foundations of the modern economy might slow the adoption of cognitive computing, which is algorithmic down to its very marrow. But that zeitgeist trend isn’t likely to chill the cognitive spring so much as ramp up calls for “algorithmic accountability” in the public and private sectors.

Actually, you might regard growing calls for such accountability as another indicator of cognitive computing’s intensifying adoption.

In other words, it’s another undeniable sign that the cognitive spring is well under way.
