Sensor(y) Landscapes: Technologies for an Extended Perceptual Presence
Committee:
Professor Joseph A. Paradiso
Alexander W. Dreyfoos (1954) Professor of Media Arts and Sciences
Professor Hiroshi Ishii
Jerome B. Wiesner Professor of Media Arts and Sciences
Glorianna Davenport
Visiting Scientist, MIT Media Lab
Founder, Living Observatory
My dissertation introduces a set of sensed and sounding places, spaces, and devices I built to investigate and cultivate a relationship between technology and perceptual presence. I envision sensor(y) landscapes: sites that meld distributed sensing and sensory perception, where latent sensory superpowers extend our perceptual abilities through information networks. Connected devices already give us efficient access to data across distance, time, and scale. For all their task-directed effectiveness, however, the predominant tools of ubiquitous computing are supplanting undirected, curiosity-driven exploration of the world. Drawing insight from a set of original projects, I argue for the development of alternative technologies that extend perceptual presence and heighten perceptual sensibilities.
First, a good sensory superpower is built on rich and compelling sense data in and of the world. As such, a foundational part of this thesis involved deploying sensor infrastructure in beautiful places. My projects center on a wetland restoration site, called Tidmarsh, where ecological data are densely and continuously collected and streamed. This collaboration seeks to make senseable the ecological processes that support a complex ecosystem.
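To make the streaming infrastructure concrete, the following minimal Python sketch shows how a sensor node's readings might be serialized and pushed continuously to a collector. The node IDs, field names, and UDP endpoint are illustrative assumptions, not the Tidmarsh deployment's actual protocol.

    # Minimal sketch of continuous sensor streaming (illustrative only).
    # Node IDs, fields, and the collector endpoint are hypothetical.
    import json
    import random
    import socket
    import time

    STREAM_ADDR = ("127.0.0.1", 9000)  # hypothetical collector endpoint

    def read_node(node_id: str) -> dict:
        """Simulate one reading from a wetland sensor node."""
        return {
            "node": node_id,
            "timestamp": time.time(),
            "temperature_c": round(random.uniform(5.0, 25.0), 2),
            "humidity_pct": round(random.uniform(40.0, 100.0), 1),
        }

    def stream(nodes, interval_s=1.0, count=3):
        """Periodically push each node's reading as a JSON datagram."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for _ in range(count):
            for node_id in nodes:
                sock.sendto(json.dumps(read_node(node_id)).encode(), STREAM_ADDR)
            time.sleep(interval_s)

    if __name__ == "__main__":
        stream(["marsh-01", "marsh-02"])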
Using sound and vibration as media and nature as a setting, I advance two approaches. The first, which I call transpresence, constructs environments suffused with sensing and built for being present in. My work in this space comprises sensor-driven virtual worlds, glass elevator sound installations, and vibrating forests that recount oral histories. Building on lessons from transpresence, my second approach uses auditory augmented reality to create situated perceptions of data. I developed a bone-conduction device, called HearThere, that produces a live spatial soundscape from distributed microphones and sensors and merges it with the user's natural hearing. HearThere combines its wearer's inferred listening behavior with classification output from an AI engine to adjust the mix and spatial rendering of different virtual audio sources. The device was developed based on findings from lab studies of spatial hearing and auditory attention, and evaluated in a human subjects study with a panel of domain experts.
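The following minimal Python sketch illustrates the kind of per-source weighting this description implies: each virtual source's gain blends an estimate of the wearer's attention (here, head orientation toward the source) with a classifier's confidence that the source is currently of interest. The attention model, field names, and blending formula are illustrative assumptions, not HearThere's actual algorithm.

    # Minimal sketch of attention- and classifier-weighted source mixing
    # (illustrative assumptions, not the device's actual algorithm).
    import math
    from dataclasses import dataclass

    @dataclass
    class VirtualSource:
        name: str
        azimuth_deg: float       # direction relative to the listener
        classifier_conf: float   # 0..1, e.g. "bird call present" confidence

    def attention_weight(head_azimuth_deg: float, source_azimuth_deg: float) -> float:
        """Crude attention model: sources near the facing direction get more weight."""
        delta = math.radians(head_azimuth_deg - source_azimuth_deg)
        return 0.5 * (1.0 + math.cos(delta))  # 1.0 facing the source, 0.0 directly behind

    def mix_gains(sources, head_azimuth_deg, attention_bias=0.6):
        """Blend attention and classifier confidence into a per-source gain."""
        return {
            s.name: attention_bias * attention_weight(head_azimuth_deg, s.azimuth_deg)
            + (1.0 - attention_bias) * s.classifier_conf
            for s in sources
        }

    if __name__ == "__main__":
        sources = [
            VirtualSource("hydrophone", azimuth_deg=-40.0, classifier_conf=0.2),
            VirtualSource("songbird-mic", azimuth_deg=10.0, classifier_conf=0.9),
        ]
        print(mix_gains(sources, head_azimuth_deg=0.0))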
Finally, I construct a framework for designing perceptual interfaces to distributed media and sensor data. Across my projects, I found that deriving meaning in this medium is a matter of possessing or developing perceptual sensibilities: intuitions for how the data permeating an environment can be teased out and contemplated. How do users make sense of these new dimensions of perception, and how can technologies be designed to facilitate perceptual sense-making?