Event

Andrea Colaço Dissertation Defense

Monday, March 10, 2014

Location

MIT Media Lab, E14-633

Description

The space of applications supported by our mobile devices has far outgrown their original purpose, creating richer interaction experiences for users. As computing capabilities grow and form factors evolve, the most pronounced limitation of mobile devices is display size. Touchscreen interfaces have several limitations: the act of touching the screen occludes the display, interface elements like keyboards consume precious display real estate, and even simple tasks like document navigation – which the user performs effortlessly with a mouse and keyboard – require repeated actions like pinch-and-zoom with touch input. This thesis is motivated by these inherent limitations of touch input on mobile devices, and by the technical constraints of a low power budget, minimal computation, and accurate performance that are necessary to bring alternative input to mobile devices.
This thesis explores the space around the device as a means of touchless gestural input to devices with small or no displays. Capturing gestural input in the surrounding volume requires sensing the human hand. To achieve this, Colaço presents Mime, a novel compact, low-power 3D sensor for short-range gestural control of small display devices. Mime provides fast and accurate 3D gesture information and is built from standard, low-cost opto-electronic components. The sensor relies on a novel 3D acquisition method grounded in parametric signal processing. Colaço also combines this low-power time-of-flight (TOF) sensing for 3D hand motion tracking with RGB image-based computer vision algorithms for finer, shape-based gestural control.
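For readers unfamiliar with time-of-flight ranging, the following minimal sketch illustrates only the basic distance computation underlying any TOF sensor; it is an illustrative example with assumed names and values, not Colaço's Mime sensor or its parametric signal processing.

```python
# Sketch of the generic time-of-flight ranging principle only;
# NOT the Mime sensor's implementation.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Convert the measured round-trip time of an emitted light
    pulse into a one-way distance to the reflecting hand."""
    return C * round_trip_time_s / 2.0

# Example: a pulse returning after ~2 nanoseconds implies a hand
# roughly 0.3 m from the sensor, a typical short-range gesture distance.
print(tof_distance(2e-9))  # ~0.2998 m
```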
Colaço demonstrates two main applications of 3D spatial input using close-range gestures, including on-the-go interaction and operation in cluttered environments and broad daylight conditions. Live Trace, implemented on Google Glass, is an app for manipulating visual content by gesture while a picture is being taken, while the scene and desired view are still fresh and alive. For smartphones, she presents a side-facing, back-to-the-desktop configuration. The Mime sensor on the phone allows the table surface next to the phone to be mapped to a conventional desktop of windows, with the phone's display acting as a small viewport onto this desktop. Moving the hand is like moving the mouse: as the user shifts to another part of the desktop, the phone's viewport moves with it. Instead of writing new applications for smart surfaces, existing applications can be readily controlled with the hands.
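A rough sketch of the viewport idea described above: hand position on the table pans a phone-sized window across a larger virtual desktop. All names, dimensions, and the linear mapping are hypothetical illustrations, not Mime's actual interface code.

```python
# Hypothetical mapping from a tracked hand position on the table to the
# top-left corner of the phone-sized viewport within a larger desktop.
def viewport_origin(hand_x_mm: float, hand_y_mm: float,
                    desk_w_mm: float = 400, desk_h_mm: float = 300,
                    desktop_w_px: int = 1920, desktop_h_px: int = 1080,
                    view_w_px: int = 360, view_h_px: int = 640):
    # Normalize the hand position over the usable desk area, clamped to [0, 1].
    nx = min(max(hand_x_mm / desk_w_mm, 0.0), 1.0)
    ny = min(max(hand_y_mm / desk_h_mm, 0.0), 1.0)
    # Scale so the viewport can pan across the whole desktop
    # without running off its edges.
    ox = nx * (desktop_w_px - view_w_px)
    oy = ny * (desktop_h_px - view_h_px)
    return int(ox), int(oy)

# Example: a hand at the center of the desk centers the viewport.
print(viewport_origin(200, 150))  # (780, 220)
```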

Host/Chair: Chris Schmandt

Participant(s)/Committee

Joseph A. Paradiso, Vivek K. Goyal
