Why touchscreens are taking over
Touchscreens have been around for decades and they won't be replaced anytime soon, says Gartner analyst Ken Dulaney. The real power behind touchscreens is the software with which users can interact, he said. "Pointing to something is human nature," he said. And Dulaney notes that one highly touted alternative -- speech recognition -- isn't perfect; if a word or two is missed, the entire meaning can change.
In the short term, Dulaney said, improving the accuracy of the interfaces and reducing onscreen fingerprints will be on the minds of developers. However, he imagines that transparent displays might become popular in the future. Users could simply hold their phones up and content could be overlaid, similar to how today's augmented reality applications use a phone's camera, he said.
At Ceatec 2010 in Japan, TDK showed off transparent screens and according to a May 2011 press release, the company has begun mass production of them. Called electroluminescent displays by TDK, the screens have a resolution of 320 by 240 pixels and are "mainly intended for use as the main display panel in mobile phones and other mobile devices."
Mind control for devices
Brain control interfaces abandon touch and gesture control and rely solely on the power of thought. Researchers at Riken, Japan's government-run research body, have developed a brain machine interface (BMI) that lets users control a wheelchair using thought. The thought patterns are picked up by electroencephalography (EEG) sensors mounted on a user's head. The data is then relayed to a laptop, which interprets it and sends the control signals to the wheelchair. The system needs about three hours of training per day for a week to achieve a 95-percent accuracy rate, Riken said.
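The pipeline described above can be sketched in miniature. This is a hypothetical illustration, not Riken's actual signal processing: it assumes the EEG data has already been reduced to simple feature vectors, and uses a nearest-centroid classifier, where the calibration phase (like Riken's week of training sessions) averages labeled readings into one centroid per wheelchair command.

```python
# Hypothetical sketch of an EEG-to-command classifier. Assumes raw EEG has
# already been preprocessed into numeric feature vectors; Riken's real
# system is far more sophisticated and its internals are not public.
from math import dist


class ThoughtClassifier:
    def __init__(self):
        self.centroids = {}  # command name -> mean feature vector

    def train(self, samples):
        """samples: list of (command, feature_vector) pairs collected
        during a calibration session."""
        by_cmd = {}
        for cmd, vec in samples:
            by_cmd.setdefault(cmd, []).append(vec)
        # Average each command's training vectors into a single centroid.
        for cmd, vecs in by_cmd.items():
            self.centroids[cmd] = [sum(col) / len(col) for col in zip(*vecs)]

    def classify(self, vec):
        """Map a live EEG reading to the nearest trained command."""
        return min(self.centroids, key=lambda c: dist(vec, self.centroids[c]))


clf = ThoughtClassifier()
clf.train([
    ("left", [1.0, 0.0]), ("left", [0.9, 0.1]),
    ("right", [0.0, 1.0]), ("right", [0.1, 0.9]),
])
print(clf.classify([0.95, 0.05]))  # a "left"-like reading
```

The nearest-centroid approach stands in for whatever classifier the real system uses; the point is only that repeated training narrows the mapping from brain signals to commands.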
Plans to use the technology in rehabilitation and therapy are already under way, said Andrzej Cichocki, head of the Laboratory for Advanced Brain Signal Processing at Riken.
Based on the same principles, one company showed off a BMI that lets users type simply by concentrating on the letters they want. At Cebit 2011, Guger Technologies presented IntendiX, a system that consists of a skullcap with electrodes, a pocket-sized brainwave amplifier and a Windows application that analyzes and decodes the brain waves. To enter a letter, the user stares at that letter on a virtual keyboard. The software flashes the columns and rows of the keyboard and tries to detect a response in the brain when the desired letter is flashed. The system looks for brainwaves that are triggered 300 milliseconds after a stimulus.
"The signal is called P300, it is just a usual signal," said company spokesman Markus Bruckner. "For example, when you drive behind a car and it steps on its brakes and the red light flashes you have the same response." It takes quite a bit of concentration and time to type out just a few letters, but for someone who has no other way of typing, it could bring new opportunities to communicate.
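The row-and-column flashing scheme can be sketched as follows. This is a hypothetical illustration, not IntendiX's actual algorithm: it assumes each flash has already been scored for P300-like response strength, and it simply picks the letter at the intersection of the best-scoring row and column of a 6x6 character grid.

```python
# Hypothetical P300 speller sketch. The real IntendiX internals are not
# public; this assumes each row/column flash has already been reduced to a
# numeric response score (strength of the brain signal ~300 ms post-flash).
GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ0123",
    "456789",
]


def detect_letter(flash_scores):
    """flash_scores maps a flashed line, e.g. ("row", 2) or ("col", 4),
    to the list of response scores gathered over repeated flashes.
    Returns the character at the best row/column intersection."""
    def best(kind):
        averages = {idx: sum(scores) / len(scores)
                    for (k, idx), scores in flash_scores.items() if k == kind}
        return max(averages, key=averages.get)

    return GRID[best("row")][best("col")]


# Simulated session: the user stares at "E" (row 0, column 4), so flashes
# of that row and column evoke stronger responses than the others.
scores = {("row", r): [0.1, 0.1] for r in range(6)}
scores.update({("col", c): [0.1, 0.1] for c in range(6)})
scores[("row", 0)] = [0.9, 0.8]
scores[("col", 4)] = [0.85, 0.9]
print(detect_letter(scores))  # prints "E"
```

Averaging over repeated flashes is what makes the scheme slow but robust, which matches the article's observation that typing a few letters takes real concentration and time.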
The company hopes to improve the response time and said it's down to one second in the lab.
Martyn Williams and Jay Alabaster in Tokyo contributed to this report.