New System Detects Touch, Gestures on Any Surface
This Research in Action article was provided to LiveScience in partnership with the National Science Foundation.
This image shows how a touch-activated system developed by Purdue University researchers computes finger and hand positions.
The new system, developed in part with support from the National Science Foundation, projects an interactive display onto walls or any other plain surface. It also recognizes hand posture and gestures, and can identify individual users by their distinguishing traits.
It allows more than one person to interact with the "screen" at the same time and also recognizes two-handed touch, distinguishing between left and right hands.
"Imagine having giant iPads everywhere," says assistant professor of electrical and computer engineering Niklas Elmqvist, "on any wall in your house or office, every kitchen counter, without using expensive technology.
"You can use any surface, even a dumb physical surface like wood. You don't need to install expensive LED displays and touch-sensitive screens."
The researchers say the system is 98 percent accurate in determining hand posture, which is critical to recognizing gestures and carrying out commands.
The technology has many possible applications, said Karthik Ramani, Purdue's Donald W. Feddersen Professor of Mechanical Engineering.
"Basically, it might be used for any interior surface to interact virtually with a computer," he said. "You could use it for living environments, to turn appliances on, in a design studio to work on a concept or in a laboratory, where a student and instructor interact."
The system uses a Microsoft Kinect camera, which captures depth images that map the scene in three dimensions.
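A depth camera like the Kinect reports a per-pixel distance map, and a common way to turn that into touch sensing is to compare each frame against a reference depth map of the bare surface: pixels that sit just above the surface, within a narrow height band, are treated as touch candidates and clustered into fingertip blobs. The sketch below illustrates that general technique, not the Purdue team's actual algorithm; the threshold values, function names, and synthetic demo data are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_touches(depth_frame, background, d_min=4.0, d_max=20.0):
    """Return a boolean mask of touch-candidate pixels.

    A pixel counts as a candidate when it is closer to the camera than
    the reference surface by an amount in [d_min, d_max] millimeters:
    close enough to be touching, far enough to reject sensor noise.
    (Thresholds here are illustrative, not measured values.)
    """
    height = background - depth_frame  # positive where something rises above the surface
    return (height >= d_min) & (height <= d_max)

def touch_points(mask, min_area=30):
    """Group candidate pixels into connected blobs and return the
    (row, col) centroids of blobs large enough to be fingertips."""
    labels, n = ndimage.label(mask)
    return [ndimage.center_of_mass(labels == i)
            for i in range(1, n + 1)
            if (labels == i).sum() >= min_area]

if __name__ == "__main__":
    # Synthetic demo: a flat surface 1,000 mm from the camera with one
    # hypothetical "fingertip" hovering 10 mm above it.
    background = np.full((120, 160), 1000.0)
    frame = background.copy()
    frame[60:68, 80:88] = 990.0  # 10 mm closer to the camera
    print(touch_points(detect_touches(frame, background)))  # ~[(63.5, 83.5)]
```

In a real system the reference map would be averaged over many frames, and the detected blobs would be tracked over time and fed to the posture and gesture recognizer.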
Editor's Note: Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation. See the Research in Action archive.