Beyond touch: A look at next-gen user interfaces

Innovative interfaces offer 3D and even touch-free interaction with devices

Touchscreens could be extinct if researchers pioneering new human-computer interfaces have anything to say about it. From brain-controlled machines to gesture-driven devices, there's a range of technologies in development that may find their way into everyday electronic devices.

Several conferences this year have offered a glimpse of innovative interfaces and what the future may hold.

Screens that get slippery or sticky -- as feedback
Touchscreens are somewhat limited in the feedback they can give a user. The screen may vibrate when tapped, but that's about all it can do. At this year's CHI (Computer Human Interaction) conference in Vancouver, B.C., in May, a researcher from the University of British Columbia showed a way to completely change the feel of a screen, at times making it slippery and at other times making it sticky. The prototype screen has four actuators that vibrate it.

"This is actually the same technology used in many cell phones or other devices, but it runs at a higher frequency so you don't feel the vibration itself," said Vincent Levesque, a postdoctoral fellow. "It pushes your finger away from the piece of glass, a bit like an air hockey table."

Levesque's team set up a demonstration with basic file folders on screen. When a folder is selected, the screen becomes slippery; when it is dragged over another folder or the trash, the screen becomes sticky.
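
In rough outline, the demo's logic is simple: map drag-and-drop events to friction levels and let the actuators do the rest. The following Python sketch is purely illustrative; the event names and friction constants are assumptions, not the team's code.

    # Illustrative mapping of drag-and-drop events to friction levels.
    # The constants and event names are assumptions, not Levesque's code.
    SLIPPERY = 0.1   # actuators on: the finger glides, like an air-hockey puck
    STICKY = 0.9     # actuators off or damped: the glass grabs the finger
    NEUTRAL = 0.5

    def friction_for_event(event, target=None):
        """Pick a friction level from the current drag state."""
        if event == "folder_selected":
            return SLIPPERY                  # dragging feels effortless
        if event == "drag_over" and target in ("folder", "trash"):
            return STICKY                    # drop targets pull the finger in
        return NEUTRAL

    # A drag sequence over the demo's file folders:
    for event, target in [("folder_selected", None), ("drag_over", "trash")]:
        level = friction_for_event(event, target)
        print(f"{event}: drive actuators for friction level {level}")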

The prototype occupied a sizeable section of the table on which it sat. Wires protruded and circuit boards were visible, making it too bulky to integrate into any mobile devices. The system uses lasers to determine the position of the finger. As the team continues work on the project, it hopes to reduce the system's size and replace the lasers with a capacitive touchscreen.

At the CHI conference, university students and research groups dreamed up most of the projects on display and shared them with potential employers who could license the technology and invest in developing it.

Gestures without touch
Texas A&M University's Interface Ecology Lab favors gestures over touch, creating a gesture-controlled system called ZeroTouch. It looks like an empty picture frame, and the edges are lined with 256 infrared sensors pointing toward the center. The frame is connected to a computer and the computer to a digital projector. "I like to consider it an optical force field," said Jonathan Moeller, a research assistant in the lab.

When the spiderweb of light created by the sensors is broken, the computer interprets the size and depth of the break and displays it as a brushstroke. If just a pencil breaks the beam, the brushstroke will be thin. If an entire arm or head breaks the beam, the stroke will be thick. While painting on the digital canvas, users hold an iPhone on which they can select the color of the brush.
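
To get a feel for the idea, here is a minimal Python sketch of how a sensor frame like ZeroTouch might turn an occlusion into a brushstroke. The sensor count per edge, frame size, and function names are illustrative assumptions, not the lab's implementation.

    # Hypothetical sketch: convert broken IR beams into a stroke position
    # and width. A pencil blocks few beams (thin stroke); an arm blocks
    # many (thick stroke).
    SENSORS_PER_EDGE = 64            # 256 sensors total around the frame
    FRAME_SIZE_MM = 400.0            # assumed frame dimension

    def stroke_from_occlusion(blocked):
        """blocked: sorted indices of sensors whose beams are broken."""
        if not blocked:
            return None
        spacing = FRAME_SIZE_MM / SENSORS_PER_EDGE
        center = sum(blocked) / len(blocked) * spacing      # stroke position
        width = (blocked[-1] - blocked[0] + 1) * spacing    # occlusion size
        return center, width

    print(stroke_from_occlusion([30, 31]))             # thin: e.g. a pencil
    print(stroke_from_occlusion(list(range(10, 40))))  # thick: e.g. an arm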

Drawing in the air is just a proof of concept. When ZeroTouch is placed over a traditional computer screen, it becomes a touchscreen: instead of creating brushstrokes, the system moves a cursor. Moeller started working on the project in 2009. It grew out of research that used a projection screen and a camera, a setup he considered too bulky and wanted to shrink.

He considers two-dimensional interaction just the beginning. "You can stack layers [of ZeroTouch] together to get depth sensing," he said. The system could then sense objects in a 3D space, as well as allow users to hover over objects. Typically, hovering isn't available with touch systems because a finger would occlude what it's hovering over, he said.

3D interaction
If ZeroTouch becomes the new technology to create 3D objects, the Snowglobe project could provide a way to view and interact with them.

Snowglobe is a large acrylic ball with an image projected onto its inner walls from a hole in the bottom. Two Microsoft Kinect sensors are pointed outward at users; when someone approaches and moves around the ball, the object inside follows them. When users stretch out their hands, their gestures control the orientation and size of the object inside the globe. The image is cast by a 3D projector, so wearing 3D glasses adds another dimension to the experience.
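
A minimal sketch of that follow-the-viewer behavior, assuming the Kinect reports the user's position in the globe's coordinate frame; the function names and the hand-to-scale mapping are hypothetical, not the project's code.

    # Rotate the projected object toward the tracked user and scale it
    # from how far apart the user's hands are. Purely illustrative.
    import math

    def facing_angle(user_x, user_z):
        """Yaw (degrees) that keeps the projected object facing the user."""
        return math.degrees(math.atan2(user_x, user_z))

    def scale_from_hands(left_hand, right_hand):
        """Map hand separation (meters) to an object scale factor."""
        dx = right_hand[0] - left_hand[0]
        return max(0.25, min(4.0, abs(dx) / 0.5))  # clamp to a sane range

    # User walks a quarter of the way around the globe:
    print(facing_angle(1.0, 0.0))    # 90.0 -> object rotates to face them
    # Hands 0.8 m apart -> the object scales to 1.6x:
    print(scale_from_hands((-0.4, 1.2), (0.4, 1.2)))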

John Bolton, with the Human Media Lab at Queen's University, came up with the idea and had been working on it for two years when it was shown at CHI 2011. "If we nest an object inside, we can present all 360 degrees of that object if somebody walks around the display," he explained. "So as opposed to just sitting there with a mouse, you can walk around and you're presented with the correct view as your position changes." As is true for many of the projects at CHI, there are no immediate plans for commercialization.

Japanese mobile phone operator NTT DoCoMo has shown a project that lets users control a music player by moving their eyes, first at the Ceatec 2009 conference near Tokyo and the next year at Mobile World Congress in Barcelona. The prototype uses earbuds that measure the changes in electrical potential when a user's eyes move. Those impulses can then be translated into actions such as skipping to the next track or turning up the volume. Although it was a crowd-pleaser, NTT DoCoMo says it has no plans to commercialize the technology.
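
Conceptually, the earbud signals reduce to a classify-and-dispatch loop. The thresholds, gesture names, and command set in this Python sketch are assumptions; NTT DoCoMo has not published its algorithm.

    # Map a smoothed eye-movement potential (microvolts) to a gesture,
    # then dispatch a music-player command. Values are illustrative only.
    def classify_eye_gesture(voltage_uv):
        if voltage_uv > 150:
            return "look_right"
        if voltage_uv < -150:
            return "look_left"
        return None   # no deliberate eye movement detected

    ACTIONS = {"look_right": "next_track", "look_left": "previous_track"}

    for sample in [20, 180, -5, -200]:
        gesture = classify_eye_gesture(sample)
        if gesture:
            print(f"{gesture} -> {ACTIONS[gesture]}")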

Germany's Hasso Plattner Institute took a different approach to gesture interaction. Led by Patrick Baudisch, the Berlin-based group has developed what it calls imaginary interfaces that allow users to interact with mobile devices when they're not in front of them. Imagine hearing your phone ring in your pocket, but instead of taking it out, you hold up your palm and swipe your finger across it to ignore the call.

The prototype system won't be portable anytime soon. It uses depth-sensing cameras, mounted above the user or sometimes on the user's shoulder, to track where the fingers are and what they're touching.

Baudisch credited Apple with replacing styli with touchscreens, but he and his team wanted to take things one step further. "Why don't we leave this [stylus] out and retrieve no devices at all for these tiny interactions such as turning off an alarm or picking up a phone call or sending to voicebox?" he suggested during CHI 2011. "People will interact directly on the palm of their hand." The system could work because users can remember where about 70 to 80 percent of their 20 home-screen icons are located, he said.
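
That spatial-memory idea lends itself to a simple mapping from a touch point on the palm to a remembered icon grid. This Python sketch assumes a 4-by-5 grid of 20 icons and normalized camera coordinates; none of it is Baudisch's actual code.

    # Map a normalized palm-touch position to a remembered icon slot.
    ROWS, COLS = 5, 4   # 20 icons, as in the recall figure described above
    ICONS = [f"icon_{r * COLS + c}" for r in range(ROWS) for c in range(COLS)]

    def icon_at(u, v):
        """u, v in [0, 1): touch position reported by the depth camera."""
        col = min(int(u * COLS), COLS - 1)
        row = min(int(v * ROWS), ROWS - 1)
        return ICONS[row * COLS + col]

    print(icon_at(0.1, 0.05))   # top-left of the palm -> icon_0
    print(icon_at(0.9, 0.95))   # bottom-right of the palm -> icon_19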

Just as there is an acclimation period when switching from a mobile device with a keyboard to one with only a touchscreen, Baudisch imagined that there would be a similar adjustment to using a device users can't see.

Why touchscreens are taking over
Touchscreens have been around for decades and won't be replaced anytime soon, says Gartner analyst Ken Dulaney. The real power behind touchscreens is the software users interact with, he said: "Pointing to something is human nature." Dulaney also notes that one highly touted alternative, speech recognition, isn't perfect; if a word or two is missed, the meaning of an entire sentence can change.

In the short term, Dulaney said, improving the accuracy of the interfaces and reducing onscreen fingerprints will be on the minds of developers. However, he imagines that transparent displays might become popular in the future. Users could simply hold their phones up and content could be overlaid, similar to how today's augmented reality applications use a phone's camera, he said.

At Ceatec 2010 in Japan, TDK showed off transparent screens and according to a May 2011 press release, the company has begun mass production of them. Called electroluminescent displays by TDK, the screens have a resolution of 320 by 240 pixels and are "mainly intended for use as the main display panel in mobile phones and other mobile devices."

Mind control for devices
Brain control interfaces abandon touch and gesture control and rely solely on the power of thought. Researchers at Riken, Japan's government-run research body, have developed a brain-machine interface (BMI) that lets users steer a wheelchair by thought. The thought patterns are picked up by electroencephalography (EEG) sensors mounted on a user's head. The data is then relayed to a laptop, which interprets it and sends the control signals to the wheelchair. The system needs about three hours of training per day for a week to achieve a 95-percent accuracy rate, Riken said.
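
At a high level, such a system reduces to a classify-and-act loop with a safety fallback. This Python sketch assumes the EEG pipeline emits a class label with a confidence score; Riken's actual signal processing is far more involved.

    # Only act on a mental command when the classifier is confident;
    # otherwise stop. Labels, commands, and threshold are assumptions.
    COMMANDS = {"imagine_left": "turn_left",
                "imagine_right": "turn_right",
                "imagine_feet": "forward"}

    def wheelchair_command(label, confidence, threshold=0.9):
        if confidence >= threshold and label in COMMANDS:
            return COMMANDS[label]
        return "stop"   # fail safe when the signal is ambiguous

    print(wheelchair_command("imagine_left", 0.97))  # -> turn_left
    print(wheelchair_command("imagine_left", 0.60))  # -> stop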

Plans to use the technology in rehabilitation and therapy are already under way, said Andrzej Cichocki, head of the Laboratory for Advanced Brain Signal Processing at Riken.

Based on the same principles, one company showed off a BMI that lets users type just by concentrating on the letters they want. At Cebit 2011, Guger Technologies presented IntendiX, a system that consists of a skullcap with electrodes, a pocket-sized brainwave amplifier and a Windows application that analyzes and decodes the brain waves. To enter a letter, the user stares at that letter on a virtual keyboard. The software flashes the rows and columns of the keyboard and tries to detect a response in the brain when the desired letter flashes. The system looks for brainwaves that are triggered about 300 milliseconds after a stimulus.

"The signal is called P300, it is just a usual signal," said company spokesman Markus Bruckner. "For example, when you drive behind a car and it steps on its brakes and the red light flashes you have the same response." It takes quite a bit of concentration and time to type out just a few letters, but for someone who has no other way of typing, it could bring new opportunities to communicate.

The company hopes to improve the response time and said it's down to one second in the lab.

Martyn Williams and Jay Alabaster in Tokyo contributed to this report.

Nick Barber covers general technology news in both text and video for IDG News Service. Email him at Nick_Barber@idg.com and follow him on Twitter at @nickjb.
