Chances are you've heard that Moore's Law may be running out of headroom as it bumps against the speed of light and other inflexible laws of physics. But you probably haven't heard of another obstacle to continued innovation in microprocessor and system development: the Von Neumann bottleneck. Simply put, the time, energy, and bandwidth needed to move data between memory and the CPU are overwhelming the ability of processors to perform their jobs. In the era of big data, this bottleneck is becoming more and more of a problem.
What brings this to mind is a talk I heard this week by John Kelly, the head of IBM Research. Kelly and his colleagues are thinking about a new paradigm (sorry for that word, but it fits) of computing. They call it cognitive computing. Part of that paradigm is a rethinking of system design. In the first chapter (you can read an excerpt here) of an unpublished book, Kelly and coauthor Steve Hamm explain it this way:
Data processing should be distributed throughout the computing system, rather than centralized in a CPU. The processing and the memory should be closely integrated -- so there will be less shuttling of data and instructions back and forth. And discrete processing tasks should be executed simultaneously, rather than linearly.
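To make the excerpt's third principle concrete, here is a toy sketch in Python (my own illustration, not anything from IBM): a "centralized" sum that walks all the data serially versus a "distributed" sum in which each worker handles only its own partition and only small partial results travel back, which is the kind of data shuttling Kelly wants to minimize.

```python
# Illustrative sketch only: contrasting a centralized, serial computation
# with one split into discrete tasks executed over local partitions.
# Real cognitive-computing hardware would do this at the circuit level.
from concurrent.futures import ThreadPoolExecutor

def centralized_sum(data):
    # Von Neumann style: one "CPU" streams every element through itself.
    total = 0
    for x in data:
        total += x
    return total

def distributed_sum(data, workers=4):
    # Distributed style: each worker sums only its local partition;
    # only the small partial results are shuttled back and combined.
    chunk = max(1, len(data) // workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, parts)
    return sum(partials)

data = list(range(1_000_000))
assert centralized_sum(data) == distributed_sum(data)
```

(Python's threads don't truly run bytecode simultaneously because of the interpreter lock; the point here is the shape of the computation, not the speedup.)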
As interesting as a fundamentally new computer architecture may be, there's much more to cognitive computing. In his talk at the Computer History Museum in Mountain View, Calif., Kelly said that cognitive systems "can learn, understand natural language, interact with humans, and have a perception of their surroundings."
No, that's not just blue sky. Watson, the computer program that humbled the competition in a highly publicized "Jeopardy" match two years ago, represents the first generation of cognitive computing. That version ran on 100 powerful, but off-the-shelf, IBM Power 750 servers.
Watson is already out of the lab and into the field; it's also gotten smaller. IBM has partnerships with Memorial Sloan Kettering Cancer Center and the Cleveland Clinic in cancer treatment and with WellPoint in health care management. Big Blue just announced that Watson will be available as a cloud service called the IBM Watson Engagement Advisor.
By nature, researchers are optimists, and Kelly's prediction that we'll see "alternative materials like graphene, carbon nanotubes and nanowires, and things like phase-change memory and the quantum (probably hybrid) devices emerging in 15 years" may well be too optimistic. He is, of course, charged with boosting IBM's brand through such blue-sky notions. Even so, fundamental changes in computing don't happen very often, so on whatever timeline it actually occurs, this is stimulating stuff to think about.
The debut of Watson as a service
Although IBM would be happy to sell Power systems to anyone who cares to buy them, a cloud-based Watson makes much more sense. But because it is so complex, using Watson in the cloud won't be anything like simply plugging into Amazon Web Services, says Michael Karasick, director of IBM's Almaden Research Center.
"Companies will first bring data to Watson," he tells me -- not data about specific problems, but information that will teach Watson about the company's business. For now, Watson, which can actually read, has been trained in eight or so domains, including call centers and financial services.