Interactive Surfaces


One of our research areas involves the development of sensor suites for tracking user gestures on large interactive surfaces (e.g., large rear-projection walls, tabletops, windows, and floors). We have fielded several systems for such applications, including:



Acoustic "Tap Tracking"


In this project, we characterize and track the location of knocks, bangs, and taps on a surface using distributed acoustic sensors and time-of-arrival algorithms.
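The time-of-arrival idea can be sketched as follows. A tap's sound reaches each pickup at a slightly different time, and the time differences of arrival (TDOAs) constrain the tap to a point on the surface. The sensor layout, surface size, and wave speed below are illustrative assumptions, not the fielded system's parameters; a simple grid search stands in for whatever solver the real system uses.

```python
import math

# Hypothetical layout: four contact pickups at the corners of a 1 m x 1 m
# surface (positions in metres). The wave speed in the surface material is
# an assumed value; a real system must calibrate it for each material.
SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
WAVE_SPEED = 2000.0  # m/s, illustrative figure for a wooden panel

def tdoas_for(x, y):
    """Time differences of arrival, relative to sensor 0, for a tap at (x, y)."""
    times = [math.hypot(x - sx, y - sy) / WAVE_SPEED for sx, sy in SENSORS]
    return [t - times[0] for t in times[1:]]

def locate_tap(measured_tdoas, step=0.005):
    """Brute-force grid search minimising squared TDOA error."""
    best, best_err = None, float("inf")
    n = int(round(1.0 / step))
    for i in range(n + 1):
        for j in range(n + 1):
            x, y = i * step, j * step
            pred = tdoas_for(x, y)
            err = sum((p - m) ** 2 for p, m in zip(pred, measured_tdoas))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulate a tap at (0.30, 0.70) and recover its position from the TDOAs.
x, y = locate_tap(tdoas_for(0.30, 0.70))
```

In practice the grid search would be replaced by a closed-form or least-squares multilateration solver, and the measured waveforms would first be cross-correlated to extract the TDOAs.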


The "LaserWall"


A very simple and inexpensive scanning laser rangefinder has been developed for use as a precise gestural interface in front of a "smart" interactive surface. The device works by detecting the phase shift between the emitted laser and the reflection detected off a bare hand. A simple microprocessor locates peaks in the intensity data corresponding to different objects and outputs the angle and range of each one over a serial connection. Unlike computer vision systems, our 2D scanner's measurement is unambiguous, requires essentially no processing, and is unaffected by background light.
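The phase-to-range relation can be illustrated with a short calculation. For a beam whose intensity is modulated at frequency f, the round trip delays the returned modulation by a phase that maps directly to range; the 25 MHz modulation frequency below is an assumed example, not the actual system's value.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_shift_rad, mod_freq_hz):
    """Range implied by the phase shift of an intensity-modulated beam.

    The light travels out and back, so one full modulation period (2*pi of
    phase) corresponds to half a modulation wavelength of range. Ranges
    beyond c / (2 * f_mod) alias back into [0, c / (2 * f_mod)).
    """
    wavelength = C / mod_freq_hz  # modulation wavelength in air
    return (phase_shift_rad / (2 * math.pi)) * wavelength / 2

# With an (assumed) 25 MHz modulation, the unambiguous range is about 6 m,
# comfortably covering a wall-sized surface. A quarter-period phase shift:
r = range_from_phase(math.pi / 2, 25e6)
```

The choice of modulation frequency trades range resolution against unambiguous range, which is why wall-scale systems can use a single, fixed frequency.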


The "Gesture Wall"


The Gesture Wall is an interactive smart-wall installation built for the Brain Opera that uses electric field sensing to track hand and full-body gestures for interacting with projected video and parametric music.
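A toy model hints at how receiver signals can yield a position estimate. In transmit-mode electric field sensing, signal coupled through the user's body is picked up by several receive electrodes, and the coupling falls off with distance; a weighted centroid of receiver amplitudes then gives a coarse position. The corner layout, inverse-square falloff, and centroid estimator below are all illustrative assumptions, not the Gesture Wall's actual geometry or calibration.

```python
# Assumed: four receive electrodes at the corners of a unit frame.
RECEIVERS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

def simulate_amplitudes(x, y, eps=0.05):
    """Toy inverse-square coupling model (eps avoids division by zero)."""
    return [1.0 / ((x - rx) ** 2 + (y - ry) ** 2 + eps)
            for rx, ry in RECEIVERS]

def estimate_position(amplitudes):
    """Amplitude-weighted centroid of the receiver positions."""
    total = sum(amplitudes)
    px = sum(a * rx for a, (rx, _) in zip(amplitudes, RECEIVERS)) / total
    py = sum(a * ry for a, (_, ry) in zip(amplitudes, RECEIVERS)) / total
    return px, py

# A hand centred in the frame couples equally to all four electrodes,
# so the centroid lands at the centre.
px, py = estimate_position(simulate_amplitudes(0.5, 0.5))
```

The centroid is biased toward the frame's centre for off-centre positions; a fielded system would instead fit the measured amplitudes against a calibrated field model.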




The "Magic Carpet"


The Media Lab's first smart floor used a grid of piezoelectric wires beneath a thin carpet layer; upper-body motion was tracked with a Doppler radar system. The system is now a permanent installation at the MIT Museum.
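Localization on such a wire grid can be sketched simply: each piezoelectric wire spans a full row or column of the floor, so a footstep excites one row wire and one column wire, and their intersection gives the step's grid cell. The wire counts, signal values, and threshold below are illustrative assumptions.

```python
def locate_step(row_signals, col_signals, threshold=0.5):
    """Return the (row, col) grid cell of a footstep, or None if none fired.

    row_signals / col_signals are peak amplitudes from the row and column
    piezoelectric wires for one frame of data.
    """
    row_hits = [i for i, v in enumerate(row_signals) if v > threshold]
    col_hits = [j for j, v in enumerate(col_signals) if v > threshold]
    if not row_hits or not col_hits:
        return None
    # Take the strongest wire on each axis in case of crosstalk.
    r = max(row_hits, key=lambda i: row_signals[i])
    c = max(col_hits, key=lambda j: col_signals[j])
    return r, c

rows = [0.1, 0.9, 0.2, 0.1]  # row wire 1 fired
cols = [0.1, 0.1, 0.8, 0.2]  # column wire 2 fired
cell = locate_step(rows, cols)
```

Two simultaneous steps make row/column intersections ambiguous; the piezoelectric signal's amplitude and timing give extra cues for disambiguation.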




Joe Paradiso

Ari Adler
Nisha Checka
Jeff Hayashida
Kai-Yuh Hsiao
Che King Leo
Josh Lifton
Hong Ma
Josh Strickon
W. Wichakool
Chris Yang