Responsive Environments
Augmenting and mediating human experience, interaction, and perception with sensor networks.
We explore how sensor networks augment and mediate human experience, interaction and perception, while developing new sensing modalities and enabling technologies that create new forms of interactive experience and expression. Our current research encompasses the development and application of various types of sensor networks, energy harvesting and power management, and the technical foundation of ubiquitous computing. Our work is highlighted in diverse application areas, which have included automotive systems, smart highways, medical instrumentation, RFID, wearable computing, and interactive media.

Research Projects

  • Chain API

    Joseph A. Paradiso, Gershon Dublon, Brian Mayton and Spencer Russell

    RESTful services and the Web provide a framework and structure for content delivery that is scalable not only in size but, more importantly, in use cases. As we in Responsive Environments build systems to collect, process, and deliver sensor data, this project serves as a research platform that can be shared among a variety of projects both inside and outside the group. By leveraging hyperlinks between sensor data resources, clients can browse, explore, and discover relationships and interactions in ways that can grow over time.
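    The traversal this enables can be sketched as follows. This is an illustrative, in-memory stand-in for hypermedia crawling as a Chain API client might perform it; the resource shapes and link names below are hypothetical, not the actual Chain schema.

    ```python
    # Toy "API": each resource carries data plus hyperlinks to related resources,
    # so a client can discover the whole graph from a single entry point.
    # (In practice each lookup would be an HTTP GET returning linked JSON.)
    RESOURCES = {
        "/sites/1": {"name": "E14", "links": {"devices": ["/devices/a", "/devices/b"]}},
        "/devices/a": {"name": "node-a", "links": {"sensors": ["/sensors/a1"]}},
        "/devices/b": {"name": "node-b", "links": {"sensors": ["/sensors/b1"]}},
        "/sensors/a1": {"type": "temperature", "value": 21.5, "links": {}},
        "/sensors/b1": {"type": "humidity", "value": 0.61, "links": {}},
    }

    def crawl(start):
        """Follow hyperlinks breadth-first, discovering every reachable resource."""
        seen, frontier = {}, [start]
        while frontier:
            url = frontier.pop(0)
            if url in seen:
                continue
            resource = RESOURCES[url]  # stand-in for an HTTP GET
            seen[url] = resource
            for targets in resource["links"].values():
                frontier.extend(targets)
        return seen

    graph = crawl("/sites/1")
    print(sorted(graph))  # all five resources, found from one entry point
    ```

    Because clients only follow links they find at runtime, new devices and sensors added to the graph become discoverable without any client-side changes.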

  • Circuit Stickers

    Joseph A. Paradiso, Jie Qi, Nan-wei Gong and Leah Buechley

    Circuit Stickers is a toolkit for crafting electronics using flexible and sticky electronic pieces. These stickers are created by printing traces on flexible substrates and adding conductive adhesive. These lightweight, flexible, and sticky circuit boards allow us to begin sticking interactivity onto new spaces and interfaces such as clothing, instruments, buildings, and even our bodies.

  • Circuit Stickers Activity Book

    Leah Buechley and Jie Qi

    The Circuit Sticker Activity Book is a primer for using circuit stickers to create expressive electronics. Inside are explanations of the stickers, and circuits and templates for building functional electronics directly on the pages of the book. The book covers five topics, from simple LED circuits to crafting switches and sensors. As users complete the circuits, they are also prompted with craft and drawing activities to ensure an expressive and artistic approach to learning and building circuits. Once completed, the book serves as an encyclopedia of techniques to apply to future projects.

  • Data-Driven Elevator Music

    Joe Paradiso, Gershon Dublon, Brian Dean Mayton and Spencer Russell

    Our glass building lets us see across spaces, through the walls that enclose us and beyond. Yet invisibly, networks of sensors inside and out capture the often imperceptible dimensions of the built and natural environment. Our project uses multi-channel spatial sound to bring that data into the utilitarian experience of riding the glass elevator. In the past, we've mixed live sound from microphones throughout the building with sonification of sensor data, using a pressure sensor to provide fine-grained altitude for control. In its present form, the elevator displays data from the Living Observatory, a wetland restoration site 60 miles away. Each string pluck represents a new data point streaming in; its pitch corresponds to the temperature at the sensor, and its timbre reflects the humidity. Live and recorded sound reflect the real ambience of the remote wetland.
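    A mapping of this kind can be sketched as below. The specific ranges, MIDI scale, and "brightness" timbre proxy are assumptions for illustration, not the installation's actual mapping.

    ```python
    def scale(x, lo, hi, out_lo, out_hi):
        """Linearly map x from [lo, hi] into [out_lo, out_hi], clamped."""
        x = min(max(x, lo), hi)
        return out_lo + (x - lo) * (out_hi - out_lo) / (hi - lo)

    def pluck_params(temp_c, rel_humidity):
        """One incoming reading -> parameters for one string pluck."""
        midi = round(scale(temp_c, -10.0, 35.0, 48, 84))       # pitch tracks temperature
        brightness = scale(rel_humidity, 0.0, 1.0, 0.1, 1.0)   # timbre tracks humidity
        return midi, brightness

    print(pluck_params(20.0, 0.5))
    ```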

  • DoppelLab: Experiencing Multimodal Sensor Data

    Joe Paradiso, Gershon Dublon and Brian Dean Mayton

    Homes and offices are being filled with sensor networks to answer specific queries and solve pre-determined problems, but no comprehensive visualization tools exist for fusing these disparate data to examine relationships across spaces and sensing modalities. DoppelLab is a cross-reality virtual environment that represents the multimodal sensor data produced by a building and its inhabitants. Our system encompasses a set of tools for parsing, databasing, visualizing, and sonifying these data; by organizing data by the space from which they originate, DoppelLab provides a platform to make both broad and specific queries about the activities, systems, and relationships in a complex, sensor-rich environment.

  • Experiential Lighting: New User-Interfaces for Lighting Control

    Joseph A. Paradiso, Matthew Aldrich and Nan Zhao

    We are evaluating new methods of interacting with and controlling solid-state lighting, based on our findings of how participants experience and perceive architectural lighting in our new lighting laboratory (E14-548S). This work, aptly named "Experiential Lighting," reduces the complexity of modern lighting controls (intensity/color/space) to a simple mapping, aided by both human input and sensor measurement. We believe our approach extends beyond general lighting control and is applicable in situations where human rankings and preferences are critical requirements for control and actuation. We expect our foundational studies to guide future camera-based systems that will inevitably incorporate context in their operation (e.g., Google Glass).
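    The idea of collapsing many fixture parameters onto one perceptual axis can be sketched as follows. The endpoint scenes and linear interpolation are illustrative assumptions, not the study's actual model.

    ```python
    # Two hypothetical endpoint scenes spanning a perceptual axis.
    FOCUS = {"intensity": 1.0, "color_temp_k": 5000}    # bright, cool
    RELAX = {"intensity": 0.25, "color_temp_k": 2700}   # dim, warm

    def scene(t):
        """t in [0, 1]: 0 = relax, 1 = focus; interpolate every fixture parameter."""
        return {k: RELAX[k] + t * (FOCUS[k] - RELAX[k]) for k in RELAX}

    print(scene(0.5))  # {'intensity': 0.625, 'color_temp_k': 3850.0}
    ```

    A single slider (or a sensed context signal) then drives the one parameter t instead of exposing intensity and color controls separately.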

  • FingerSynth: Wearable Transducers for Exploring the Environment through Sound

    Joseph A. Paradiso and Gershon Dublon

    The FingerSynth is a wearable musical instrument made up of a bracelet and set of rings that enables its players to produce sound by touching nearly any surface in their environments. Each ring contains a small, independently controlled audio exciter transducer. The rings sound loudly when they touch a hard object, and are silent otherwise. When a wearer touches their own (or someone else's) head, the contacted person hears sound through bone conduction, inaudible to others. A microcontroller generates a separate audio signal for each ring, and can take user input through an accelerometer in the form of taps, flicks, and other gestures. The player controls the envelope and timbre of the sound by varying the physical pressure and the angle of their finger on the surface, or by touching differently resonant surfaces. The FingerSynth encourages players to experiment with the materials around them and with one another.

  • Hacking the Sketchbook

    Joseph A. Paradiso and Jie Qi

    In this project we investigate how the process of building a circuit can be made more organic, like sketching in a sketchbook. We integrate a rechargeable power supply into the spine of a traditional sketchbook, so that each page of the sketchbook has power connections. This enables users to begin creating functioning circuits directly onto the pages of the book and to annotate as they would in a regular notebook. The sequential nature of the sketchbook allows creators to document their process for circuit design. The book also serves as a single physical archive of various hardware designs. Finally, the portable and rechargeable nature of the book allows users to take their electronic prototypes off of the lab bench and share their creations with people outside of the lab environment.

  • ListenTree: Audio-Haptic Display in the Natural Environment

    V. Michael Bove, Joseph A. Paradiso, Gershon Dublon and Edwina Portocarrero

    ListenTree is an audio-haptic display embedded in the natural environment. Visitors to our installation notice a faint sound emerging from a tree. By resting their heads against the tree, they are able to hear sound through bone conduction. To create this effect, an audio exciter transducer is weatherproofed and attached to the tree's roots, transforming it into a living speaker, channeling audio through its branches, and providing vibrotactile feedback. In one deployment, we used ListenTree to display live sound from an outdoor ecological monitoring sensor network, bringing a faraway wetland into the urban landscape. Our intervention is motivated by a need for forms of display that fade into the background, inviting attention rather than requiring it. We consume most digital information through devices that alienate us from our surroundings; ListenTree points to a future where digital information might become enmeshed in material.

  • Living Observatory: Sensor Networks for Documenting and Experiencing Ecology

    Glorianna Davenport, Joe Paradiso, Gershon Dublon, Pragun Goyal and Brian Dean Mayton

    Living Observatory is an initiative for documenting and interpreting ecological change that will allow people, individually and collectively, to better understand relationships between ecological processes, human lifestyle choices, and climate change adaptation. As part of this initiative, we are developing sensor networks that document ecological processes and allow people to experience the data at different spatial and temporal scales. Low-power sensor nodes capture climate and other data at a high spatiotemporal resolution, while others stream audio. Sensors on trees measure transpiration and other cycles, while fiber-optic cables in streams capture high-resolution temperature data. At the same time, we are developing tools that allow people to explore this data, both remotely and onsite. The remote interface allows for immersive 3D exploration of the terrain, while visitors to the site will be able to access data from the network around them directly from wearable devices.

  • Mobile, Wearable Sensor Data Visualization

    Joseph A. Paradiso, Gershon Dublon, Donald Haddad, Brian Mayton and Spencer Russell

    As part of the Living Observatory ecological sensing initiative, we've been developing new approaches to mobile, wearable sensor data visualization. The Tidmarsh app for Google Glass visualizes real-time sensor network data based on the wearer's location and gaze. A user can approach a sensor node to see 2D plots of its real-time data stream, and look across an expanse to see 3D plots encompassing multiple devices. On the back-end, the app showcases our Chain API, crawling linked data resources to build a dynamic picture of the sensor network. Besides development of new visualizations, we are building in support for voice queries, and exploring ways to encourage distributed data collection by users.

  • Prosthetic Sensor Networks: Factoring Attention, Proprioception, and Sensory Coding

    Gershon Dublon

    Sensor networks permeate our built and natural environments, but our means of interfacing with the resultant data streams have not evolved much beyond HCI and information visualization. Researchers have long experimented with wearable sensors and actuators on the body as assistive devices. Under certain conditions, a user's neuroplasticity can transcend sensory substitution, enabling perceptual-level cognition of "extrasensory" stimuli delivered through existing sensory channels. Still, a huge gap remains between data and human sensory experience. We are exploring the space between sensor networks and human augmentation, in which distributed sensors become sensory prostheses. In contrast, conventional user interfaces remain largely unincorporated by the body; our relationship to them is never fully pre-attentive. Attention and proprioception are key, not only to moderate and direct stimuli, but also to enable users to move through the world naturally, attending to the sensory modalities relevant to their specific contexts.

  • Sambaza Watts

    Joe Paradiso, Ethan Zuckerman, Rahul Bhargava, Pragun Goyal, Alexis Hope and Nathan Matias

    We want to help people in nations where electric power is scarce to sell power to their neighbors. We’re designing a piece of prototype hardware that plugs into a diesel generator or other power source, distributes the power to multiple outlets, monitors how much power is used, and uses mobile payments to charge the customer for the power consumed.
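    The metering-and-billing logic described can be sketched as below. The sample interval, tariff, and rounding are made-up assumptions for illustration, not the prototype's actual design.

    ```python
    TARIFF_PER_KWH = 25.0  # hypothetical price per kWh, in local currency

    def energy_kwh(power_samples_w, interval_s):
        """Integrate instantaneous power (watts) sampled every interval_s seconds."""
        joules = sum(power_samples_w) * interval_s
        return joules / 3.6e6  # joules -> kilowatt-hours

    def bill(power_samples_w, interval_s):
        """Charge for the energy consumed at one outlet."""
        return round(energy_kwh(power_samples_w, interval_s) * TARIFF_PER_KWH, 2)

    # One outlet drawing a constant 60 W for one hour, sampled once a minute.
    print(bill([60.0] * 60, 60.0))  # 1.5
    ```

    Per-outlet totals like this would then feed the mobile-payment step, debiting each customer for what their outlet consumed.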

  • techNailogy

    Cindy Hsin-Liu Kao, Artem Dementyev and Chris Schmandt

    techNailogy is a nail-mounted gestural input surface. Using capacitive sensing on printed electrodes, the interface can distinguish on-nail finger swipe gestures with high accuracy. techNailogy works in real time: we miniaturized the system to fit on the fingernail, while wirelessly transmitting the sensor data to a mobile phone or PC. techNailogy allows for one-handed and always-available input, while being unobtrusive and discreet. Inspired by commercial nail stickers, the device blends into the user’s body, is customizable, fashionable, and even removable. We show example applications of using the device as a remote controller when hands are busy and using the system to increase the input space of mobile phones.
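    One simple baseline for distinguishing swipe direction on a row of electrodes is to track which electrode holds the peak reading over time and check whether that index moves up or down. This is an illustrative sketch under that assumption, not the techNailogy classifier.

    ```python
    def swipe_direction(frames):
        """frames: list of per-electrode capacitance samples, one list per time step.

        Returns "left", "right", or "none" based on how the peak electrode moves.
        """
        peaks = [max(range(len(f)), key=f.__getitem__) for f in frames]
        delta = peaks[-1] - peaks[0]
        if delta > 0:
            return "right"
        if delta < 0:
            return "left"
        return "none"

    # Finger moving across electrodes 0 -> 2 over three frames.
    print(swipe_direction([[9, 2, 1], [2, 9, 1], [1, 2, 9]]))  # right
    ```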

  • Ubiquitous Sonic Overlay

    Joseph A. Paradiso and Spencer Russell

    With our Ubiquitous Sonic Overlay, we are working to place virtual sounds in the user's environment, fixing them in space even as the user moves, toward a seamless auditory display indistinguishable from the user's actual surroundings. Between bone-conduction headphones, small and cheap orientation sensors, and ubiquitous GPS, a confluence of fundamental technologies is in place. However, existing head-tracking systems either limit the motion space to a small area (e.g., Oculus Rift) or sacrifice precision for scale by using technologies like GPS. We are seeking to bridge this gap to create large outdoor spaces of sonic objects.
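    The core geometry problem is recomputing each source's direction relative to the listener's tracked position and head yaw every frame, so the sound stays fixed in the world. A minimal sketch, assuming 2D coordinates and yaw measured counterclockwise from the x-axis (conventions chosen for illustration):

    ```python
    import math

    def source_azimuth(listener_xy, head_yaw_rad, source_xy):
        """Head-relative bearing of a world-fixed source, wrapped into (-pi, pi]."""
        dx = source_xy[0] - listener_xy[0]
        dy = source_xy[1] - listener_xy[1]
        world_bearing = math.atan2(dy, dx)  # direction to source in the world frame
        rel = world_bearing - head_yaw_rad  # subtract the head's orientation
        return math.atan2(math.sin(rel), math.cos(rel))  # wrap to (-pi, pi]

    # Source 10 m along +y from the listener; head facing along +x,
    # so the source sits 90 degrees to the listener's left.
    az = source_azimuth((0.0, 0.0), 0.0, (0.0, 10.0))
    print(round(math.degrees(az)))  # 90
    ```

    Feeding this angle (and the source distance) into a spatializer each frame is what keeps the sonic object anchored in place as the listener turns and walks.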