Taking Touch beyond the Touch Screen
A prototype tablet can sense gestures and objects placed next to it.
A tablet computer developed collaboratively by researchers at Intel, Microsoft, and the University of Washington can be controlled not only by swiping and pinching at the screen, but by touching any surface on which it is placed.
Finding new ways to interact with computers has become an important area of research among computer scientists, especially now that touch-screen smart phones and tablets have grown so popular. The project that produced the new device, called Portico, could eventually result in smart phones or tablets that take touch beyond the physical confines of the device.
"The idea is to allow the interactive space to go beyond the display space or screen space," says Jacob Wobbrock, an assistant professor at the University of Washington’s Information School, in Seattle, who helped develop the system. This is achieved with two foldout cameras that sit above the display on either side, detecting and tracking motion around the screen. The system detects the height of objects and determines whether they are touching the surrounding surface by comparing the two views captured by the cameras. The approach make it possible to detect hand gestures as well as physical objects so that they can interact with the display, says Wobbrock.
In touch: The spacecraft on this tablet’s screen can be controlled by maneuvering the toy on the table next to it.
Credit: Intel