While users go crazy for touchscreens on their smartphones and tablets, Minnesota startup Tarsier is working on what it calls the interface for the next 30 years: a touchscreen you don't have to touch.
Tarsier's MoveEye technology lets users reach out and manipulate icons, windows, or images on a screen as if they're floating in the air, according to co-founder Shafa Wala. The company is looking toward TVs as a natural place to implement MoveEye, because users normally look at TVs from several feet away. The company will demonstrate some of the MoveEye technology at the Demo Fall conference this week in Santa Clara, California.
Despite years of promises of interactive TV, the systems that have hit the market, such as Google TV and Microsoft's earlier WebTV, have never gained much traction.
"Today, we don't really have a way to interact with our TVs from a distance," Wala said. "The remote control, the mouse, and the keyboard are really not the ideal input devices for interacting with TVs."
In place of those familiar PC tools, MoveEye combines a special pair of glasses with a "media box" that runs the interface software. The glasses have a built-in stereoscopic pair of cameras pointed at the screen, sensors to detect the viewer's eye movements, and Wi-Fi to talk to the media box. Together, these give the illusion of viewing and touching the interface in thin air.
"You just point at what you want to interact with, or you just grab something that's projecting out of the screen," Wala said.
The first version of MoveEye will be two-dimensional, with the user manipulating objects on a plane suspended in space. Later, Tarsier will add 3D capability so users can feel as if they're reaching into an interface that has more than one layer.
Using gestures to control something on a screen is not new. Both Nintendo's Wii and Microsoft's Kinect let users play games and carry out other tasks with movements and gestures. But MoveEye can better read hand gestures as well as where those gestures are directed, Wala said. That precision is necessary for the full range of actions on a computer, such as grabbing a small object, he said.
MoveEye achieves this by viewing the screen from the user's perspective. While the Wii system follows a controller in the user's hand and Kinect faces the user and watches body movements, MoveEye captures the user's own viewpoint through the stereoscopic cameras in its glasses. Combined with the eye-tracking sensors, that tells the system precisely where each element on the screen appears from the viewer's vantage point, Wala said. Algorithms and software developed by Tarsier do the rest.
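Tarsier hasn't published its algorithms, but the basic geometry the description implies, that knowing where the user's eye and hand sit in space lets the system cast a pointing ray onto the screen, can be sketched roughly. Everything below (function name, coordinate frame, parameters) is a hypothetical illustration, not Tarsier's actual code:

```python
import numpy as np

def pointing_target(eye, fingertip, screen_origin, screen_normal):
    """Intersect the eye-to-fingertip ray with the screen plane.

    All arguments are 3-D points/vectors in a shared coordinate frame
    (hypothetical -- the article does not describe Tarsier's math).
    Returns the 3-D point on the screen the user is pointing at, or
    None if the ray is parallel to the screen or points away from it.
    """
    eye = np.asarray(eye, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    screen_origin = np.asarray(screen_origin, dtype=float)
    screen_normal = np.asarray(screen_normal, dtype=float)

    direction = fingertip - eye                  # the pointing ray
    denom = np.dot(screen_normal, direction)
    if abs(denom) < 1e-9:                        # ray parallel to screen plane
        return None
    t = np.dot(screen_normal, screen_origin - eye) / denom
    if t <= 0:                                   # screen is behind the user
        return None
    return eye + t * direction

# Example: screen plane at z = 0, viewer 2 m back, finger halfway to it
hit = pointing_target(eye=[0, 0, 2], fingertip=[0.1, 0.05, 1],
                      screen_origin=[0, 0, 0], screen_normal=[0, 0, 1])
# hit is [0.2, 0.1, 0.0]: the ray lands 20 cm right and 10 cm up
# from the screen's origin point
```

The hard part in practice would be what this sketch takes as given: reliably estimating the eye and fingertip positions from the glasses' cameras and sensors, which is presumably where Tarsier's proprietary work lies.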
Though games on big-screen TVs are one obvious use for MoveEye, Tarsier has higher ambitions as well. A 3D MoveEye interface could be used for military applications such as combat simulations and controlling bomb-defusing robots, Wala said. In medicine, it could allow doctors to virtually reach inside 3D MRI scans, he said.