IBM's next revolution: Computers that see, smell, hear, and taste

Following in Watson's footsteps, computers will let users fondle virtual silk shirts and translate baby talk within the next five years

Computers will be capable of predicting colds, devising healthy food recipes, translating baby talk, and replicating the feel of textures over the next five years, according to IBM's Next 5 in 5, a forecast of how computers will mimic the senses by 2017.

Citing Watson -- IBM's supercomputer "Jeopardy" champion -- as the first step in developing computers capable of learning from interactions with big data and humans, IBM Chief Innovation Officer Bernard Meyerson writes, "One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain. New technologies make it possible for machines to mimic and augment the senses."

Those senses include touch, sight, hearing, taste, and smell, according to IBM. In terms of touch, IBM researchers noted it's already possible to re-create a sense of texture through vibration, such as with the rumble packs found in game controllers. "Those vibrations haven't been translated into a lexicon, or dictionary, of textures that match the physical experience," wrote Robyn Schwartz, associate director of IBM Research Retail Analytics; Dhandapani Shanmugam, solutions architect; and Siddique A. Mohammed, software architect of IBM Software Group Industry Solutions. The idea is to match variable-frequency patterns of vibration to physical objects so that when a shopper touches what the webpage says is a silk shirt, the screen emits vibrations that match what our skin mentally translates to the feel of silk.
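
IBM has not published such a lexicon, but the concept is straightforward to picture in code. The sketch below shows one way a texture-to-vibration dictionary might be structured; every texture name and vibration parameter here is invented for illustration.

```python
# Hypothetical sketch of a texture "lexicon": each entry maps a texture
# name to vibration parameters a haptic device might play back.
# All names and values are invented; IBM has not published an actual
# texture dictionary.
from dataclasses import dataclass

@dataclass(frozen=True)
class VibrationPattern:
    frequency_hz: float   # base vibration frequency
    amplitude: float      # normalized 0.0-1.0 intensity
    duration_ms: int      # how long each pulse lasts

# Smooth textures -> higher frequency, lower amplitude; rough -> the reverse.
TEXTURE_LEXICON = {
    "silk":   VibrationPattern(frequency_hz=250.0, amplitude=0.2, duration_ms=40),
    "denim":  VibrationPattern(frequency_hz=90.0,  amplitude=0.6, duration_ms=80),
    "burlap": VibrationPattern(frequency_hz=40.0,  amplitude=0.9, duration_ms=120),
}

def pattern_for(texture: str) -> VibrationPattern:
    """Look up the vibration pattern for a texture named on a product page."""
    try:
        return TEXTURE_LEXICON[texture]
    except KeyError:
        raise ValueError(f"No haptic entry for texture: {texture!r}")

print(pattern_for("silk"))
```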

The researchers wrote that digital image processing and digital image correlation will make it possible to capture texture qualities in a PIM (product information management) system, which retailers could use to match textures with product data such as sizes, ingredients, and dimensions.
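
The article doesn't describe how such a PIM record would be laid out. A minimal sketch, assuming a numeric texture signature extracted from product images is stored alongside ordinary attributes (all field names and values are assumptions, not IBM's actual schema):

```python
# Minimal sketch of a PIM record that keeps a texture signature next to
# ordinary product data. The "texture_signature" idea -- a short vector
# derived from close-up images via digital image correlation -- is an
# assumption for illustration.
from dataclasses import dataclass, field

@dataclass
class ProductRecord:
    sku: str
    name: str
    sizes: list[str]
    dimensions_cm: tuple[float, float]            # width, height
    texture_signature: list[float] = field(default_factory=list)

shirt = ProductRecord(
    sku="SHIRT-001",
    name="Silk shirt",
    sizes=["S", "M", "L"],
    dimensions_cm=(55.0, 75.0),
    texture_signature=[0.82, 0.11, 0.04],         # placeholder values
)
```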

Other applications for this touch technology include gaining a better understanding of our environment. "Take farming, for example. Farmers could use a mobile device to determine the health of their crop by comparing what they're growing to a dictionary of healthy options that they feel through a tablet," they wrote.
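
That "dictionary of healthy options" amounts to comparing a field reading against reference signatures. A hypothetical sketch using simple Euclidean distance, with all reference values and the threshold invented for illustration:

```python
# Compare a texture reading from a crop against reference signatures for
# healthy crops. Reference vectors and threshold are made up; a real
# system would calibrate these against field data.
import math

HEALTHY_REFERENCES = {
    "wheat": [0.80, 0.15, 0.05],
    "maize": [0.60, 0.30, 0.10],
}

def distance(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def looks_healthy(crop: str, reading: list[float], threshold: float = 0.1) -> bool:
    """True if the field reading sits close to the healthy reference signature."""
    return distance(HEALTHY_REFERENCES[crop], reading) <= threshold

print(looks_healthy("wheat", [0.78, 0.17, 0.06]))  # -> True
```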

There could be health applications as well, such as being able to send an image of an injury to a doctor, who could use the more data-rich image to render a diagnosis quickly.

IBM's John Smith, senior manager of Intelligent Information Management, outlined how computers will be able to learn from seeing to help us understand the 500 billion photos we take every year. For example, by showing a computer thousands of pictures of beaches, it will learn over time to detect the patterns that go into properly identifying seaside photos.
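
Smith doesn't detail the learning method, but the learn-from-examples idea can be shown with a toy classifier: extract crude features from labeled images, average them into one centroid per class, and assign new images to the nearest centroid. Real systems learn far richer features than the average-color feature used here; this is purely illustrative.

```python
# Toy learn-by-example classifier: train on labeled images, classify new
# ones by nearest class centroid. Images are lists of (R, G, B) pixels.

def features(pixels: list[tuple[int, int, int]]) -> list[float]:
    """Mean R, G, B over the image -- a deliberately crude feature."""
    n = len(pixels)
    return [sum(p[i] for p in pixels) / n for i in range(3)]

def train(examples: dict[str, list[list[tuple[int, int, int]]]]) -> dict[str, list[float]]:
    """Average the features of every labeled example into one centroid per class."""
    centroids = {}
    for label, images in examples.items():
        feats = [features(img) for img in images]
        centroids[label] = [sum(f[i] for f in feats) / len(feats) for i in range(3)]
    return centroids

def classify(pixels, centroids):
    f = features(pixels)
    return min(centroids, key=lambda lbl: sum((a - b) ** 2 for a, b in zip(f, centroids[lbl])))

# Beaches skew blue (sky, water) and sandy; forests skew green.
beach = [(80, 120, 200), (210, 190, 150)] * 50
forest = [(40, 120, 50), (30, 90, 40)] * 50
model = train({"beach": [beach], "forest": [forest]})
print(classify([(90, 130, 190)] * 100, model))   # -> "beach"
```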

This sort of technology has health implications, according to Smith. "Take dermatology. Patients often have visible symptoms of skin cancer by the time they see a doctor. By having many images of patients from scans over time, a computer then could look for patterns and identify situations where there may be something pre-cancerous, well before melanomas become visible," he wrote.
