
Communications of the ACM

ACM TechNews

Computing With a Wave of the Hand


MIT Media Lab researchers demonstrate a laboratory mockup of a thin-screen LCD display with built-in optical sensors.

Credit: Matthew Hirsch, Douglas Lanman, Ramesh Raskar, Henry Holtzman / MIT Media Lab

Massachusetts Institute of Technology Media Lab researchers have devised a way to turn liquid crystal displays (LCDs) into lens-less cameras through the use of embedded optical sensors. At ACM's SIGGRAPH Asia conference on Dec. 19, the researchers will demonstrate a gestural interface through which users can manipulate on-screen images. "The goal with this is to be able to incorporate the gestural display into a thin LCD device and to be able to do it without wearing gloves or anything like that," says Media Lab Ph.D. candidate Matthew Hirsch.

The system uses an array of liquid crystals backed by an array of optical sensors. The crystals display a black-and-white pattern of squares that directs light to the sensors behind the crystals. This pattern enables the system to computationally disentangle the images, capturing the same depth information as a pinhole array would, only much faster. Lab experiments showed that the researchers could manipulate on-screen objects with hand gestures, and seamlessly switch back and forth between gestural control and ordinary touch screen control.
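The depth capture described above rests on a standard triangulation idea: each opening in the mask pattern gives the sensors behind it a slightly different viewpoint, and an object's apparent shift (disparity) between two such views reveals its distance. The sketch below illustrates that geometry only; the variable names and dimensions are illustrative assumptions, not the BiDi screen's actual calibration or reconstruction algorithm.

```python
# Toy illustration of mask-based depth sensing (assumed geometry, not
# the BiDi screen's real pipeline). Two "pinhole" views of the same
# object are separated by a disparity; similar triangles give depth:
#   depth = baseline * separation / disparity

def depth_from_disparity(baseline, separation, disparity):
    """Triangulate object depth from the shift between two mask views.

    baseline   -- distance between two mask openings (mm)
    separation -- gap between the mask and the sensor plane (mm)
    disparity  -- observed shift of the object between the views (mm)
    """
    if disparity <= 0:
        raise ValueError("zero disparity: object at infinity")
    return baseline * separation / disparity

# A hand 200 mm from the screen, openings 10 mm apart, a 2 mm
# mask-to-sensor gap: the views shift by 10 * 2 / 200 = 0.1 mm,
# so inverting that shift recovers the 200 mm depth.
print(depth_from_disparity(10.0, 2.0, 0.1))  # -> 200.0
```

The same principle applied across the whole sensor array, with the tiled mask pattern computationally separating the overlapping views, is what yields the full 3D image of the hand described below.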

"[This system] is much better than just figuring out just where the fingertips are or a kind of motion-capture situation," says Paul Debevec with the University of Southern California's Institute for Creative Technologies. "It's really a full three-dimensional image of the person's hand that's in front of the display."

View videos of MIT Media Lab's BiDi (bi-directional) screen.

From MIT News


Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA


