Responsive Environments
How sensor networks augment and mediate human experience, interaction, and perception.
We explore how sensor networks augment and mediate human experience, interaction, and perception, while developing new sensing modalities and enabling technologies that create new forms of interactive experience and expression. Our current research encompasses the development and application of various types of sensor networks, energy harvesting and power management, and the technical foundation of ubiquitous computing. Our work is highlighted in diverse application areas, which have included automotive systems, smart highways, medical instrumentation, RFID, wearable computing, and interactive media.

Research Projects

  • A Cuttable Multi-Touch Sensor

    Simon Olberding, Nan-Wei Gong, John Tiab, Jürgen Steimle, and Joseph A. Paradiso

    We propose cutting as a novel paradigm for ad hoc customization of printed electronic components. As a first example we created a printed, capacitive, multi-touch sensor, which can be cut by the end-user to modify its size and shape. This very direct manipulation allows the end-user to easily make real-world objects and surfaces touch-interactive, to augment physical prototypes, and to enhance paper craft. We contribute a set of technical principles for the design of printable circuitry that makes the sensor more robust against cuts, damage, and removed areas. This includes novel physical topologies and printed forward error correction. A technical evaluation compares different topologies and shows that the sensor remains functional when cut to a different shape.
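
    The paper's actual printed coding scheme is not detailed here, but the simplest form of forward error correction, replication with majority voting, illustrates the idea of surviving a severed trace. The function names below are purely illustrative.

```python
# Hedged illustration (not the paper's actual scheme): triple modular
# redundancy, the simplest forward error correction, applied to a signal
# routed over three redundant printed traces.
def encode_tmr(bit):
    # Route the same logical signal over three physical traces.
    return [bit, bit, bit]

def decode_tmr(bits):
    # Majority vote: the decoded value survives any single cut trace.
    return 1 if sum(bits) >= 2 else 0
```

    A cut that severs one of the three traces flips or floats one reading, but the majority vote still recovers the original bit.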

  • Circuit Stickers

    Joseph A. Paradiso, Jie Qi, Nan-Wei Gong and Leah Buechley

    Circuit Stickers is a toolkit for crafting electronics using flexible and sticky electronic pieces. These stickers are created by printing traces on flexible substrates and adding conductive adhesive. These lightweight, flexible, and sticky circuit boards allow us to begin sticking interactivity onto new spaces and interfaces such as clothing, instruments, buildings, and even our bodies.

  • Circuit Stickers Activity Book

    Leah Buechley and Jie Qi

    The Circuit Sticker Activity Book is a primer for using circuit stickers to create expressive electronics. Inside are explanations of the stickers, and circuits and templates for building functional electronics directly on the pages of the book. The book covers five topics, from simple LED circuits to crafting switches and sensors. As users complete the circuits, they are also prompted with craft and drawing activities to ensure an expressive and artistic approach to learning and building circuits. Once completed, the book serves as an encyclopedia of techniques to apply to future projects.

  • Customizable Sensate Surface for Music Control

    Joe Paradiso, Nan-Wei Gong, Pragun Goyal and Nan Zhao

    We developed a music control surface that integrates with any musical instrument via a versatile, customizable, and inexpensive user interface. This sensate surface allows capacitive sensor electrodes and connections between electronic components to be printed onto a large roll of flexible substrate unrestricted in length. The high-dynamic-range capacitive sensing electrodes infer not only touch, but also near-range, non-contact gestural nuance in a music performance. With this sensate surface, users can “cut” out their desired shapes, “paste” the number of inputs, and customize their controller interfaces, which can then send signals wirelessly to effects or software synthesizers. We seek to integrate the form factor of traditional music controllers seamlessly on top of one’s instrument while adding expressiveness to performance by sensing and incorporating movements and gestures to manipulate the musical output.

  • Data-Driven Elevator Music

    Joe Paradiso, Gershon Dublon, Nicholas Joliat, Brian Mayton and Ben Houge (MIT Artist in Residence)

    Our new building lets us see across spaces, extending our visual perception beyond the walls that enclose us. Yet, invisibly, networks of sensors, from HVAC and lighting systems to Twitter and RFID, control our environment and capture our social dynamics. This project proposes extending our senses into this world of information, imagining the building as glass in every sense. Sensor devices distributed throughout the Lab transmit privacy-protected audio streams and real-time measurements of motion, temperature, humidity, and light levels. The data are composed into an eight-channel audio installation in the glass elevator that turns these dynamic parameters into music, while microphone streams are spatialized to simulate their real locations in the building. A pressure sensor in the elevator provides us with fine-grained altitude to control the spatialization and sonification. As visitors move from floor to floor, they hear the activities taking place on each.
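
    The altitude-from-pressure step can be sketched with the standard international barometric formula; the function names, floor height, and threshold values below are illustrative assumptions, not details from the installation.

```python
# Minimal sketch of mapping a barometric pressure reading to an altitude
# and floor index for spatialization control. Uses the standard-atmosphere
# approximation; constants and names are illustrative.
def pressure_to_altitude_m(pressure_pa, sea_level_pa=101325.0):
    """International barometric formula, valid near sea level."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

def floor_from_altitude(altitude_m, floor_height_m=4.0, ground_altitude_m=0.0):
    # Quantize the altitude into a floor index to drive the sonification.
    return round((altitude_m - ground_altitude_m) / floor_height_m)
```

    In practice an absolute pressure reference drifts with weather, so a deployed system would calibrate `sea_level_pa` (an assumed parameter here) against a known floor.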

  • DoppelLab: Experiencing Multimodal Sensor Data

    Joe Paradiso, Gershon Dublon and Brian Dean Mayton

    Homes and offices are being filled with sensor networks to answer specific queries and solve pre-determined problems, but no comprehensive visualization tools exist for fusing these disparate data to examine relationships across spaces and sensing modalities. DoppelLab is a cross-reality virtual environment that represents the multimodal sensor data produced by a building and its inhabitants. Our system encompasses a set of tools for parsing, databasing, visualizing, and sonifying these data; by organizing data by the space from which they originate, DoppelLab provides a platform to make both broad and specific queries about the activities, systems, and relationships in a complex, sensor-rich environment.

  • Ergonomic Micro-Gesture Recognition and Interaction Evaluation

    Joseph A. Paradiso and David Way

    Using machine learning, computer vision, and small, wrist-worn time-of-flight cameras, we can recover hand pose and micro-gestures (small movements of the fingers and thumb). Ubiquitous wearables will clearly need a similar eyes-free user interface, but how should that interface be designed? We are examining interaction through user tests: what gesture-set designs work well for text entry or focus selection? How can we predict user experience and the usability of such systems? We hope to answer such questions through the EMGRIE system and experimental application design.

  • Experiential Lighting: New User-Interfaces for Lighting Control

    Joseph A. Paradiso, Matthew Aldrich and Nan Zhao

    We are evaluating new methods of interacting and controlling solid-state lighting based on our findings of how participants experience and perceive architectural lighting in our new lighting laboratory (E14-548S). This work, aptly named "Experiential Lighting," reduces the complexity of modern lighting controls (intensity/color/space) into a simple mapping, aided by both human input and sensor measurement. We believe our approach extends beyond general lighting control and is applicable in situations where human-based rankings and preference are critical requirements for control and actuation. We expect our foundational studies to guide future camera-based systems that will inevitably incorporate context in their operation (e.g., Google Glass).

  • Feedback Controlled Solid State Lighting

    Joe Paradiso, Matthew Aldrich and Nan Zhao

    At present, luminous efficacy and cost remain the greatest barriers to broad adoption of LED lighting. However, it is anticipated that within several years, these challenges will be overcome. While we may think our basic lighting needs have been met, this technology offers many more opportunities than just energy efficiency: this research attempts to alter our expectations for lighting and cast aside our assumptions about control and performance. We will introduce new, low-cost sensing modalities that are attuned to human factors such as user context, circadian rhythms, or productivity, and integrate these data with atypical environmental factors to move beyond traditional lux measurements. To research and study these themes, we are focusing on the development of superior color-rendering systems, new power topologies for LED control, and low-cost multimodal sensor networks to monitor the lighting network as well as the environment.

  • FingerSynth: Wearable Transducers for Exploring the Environment through Sound

    Joseph A. Paradiso and Gershon Dublon

    The FingerSynth is a wearable musical instrument made up of a bracelet and set of rings that enable its player to produce sound by touching nearly any surface in their environment. Each ring contains a small, independently controlled audio exciter transducer. The rings sound loudly when they touch a hard object, and are silent otherwise. When a wearer touches their own (or someone else's) head, the contacted person hears sound through bone conduction, inaudible to others. A microcontroller generates a separate audio signal for each ring, and can take user input through an accelerometer in the form of taps, flicks, and other gestures. The player controls the envelope and timbre of the sound by varying the physical pressure and the angle of their finger on the surface, or by touching differently resonant surfaces. The FingerSynth encourages players to experiment with the materials around them and with one another.

  • FreeD

    Joe Paradiso and Amit Zoran

    The FreeD is a hand-held, digitally controlled milling device that is guided and monitored by a computer while still preserving the craftsperson's freedom to sculpt and carve. The computer intervenes only when the milling bit approaches the planned model, either by slowing the spindle speed or by drawing back the shaft; the rest of the time it allows complete freedom, letting the user manipulate and shape the work in any creative way.

  • Gestures Everywhere

    Joseph A. Paradiso and Nicholas Gillian

    Gestures Everywhere is a multimodal framework for supporting ubiquitous computing. Our framework aggregates the real-time data from a wide range of heterogeneous sensors, and provides an abstraction layer through which other ubiquitous applications can request information about an environment or a specific individual. The Gestures Everywhere framework supports both low-level spatio-temporal properties, such as presence, count, orientation, location, and identity, and higher-level descriptors, including movement classification, social clustering, and gesture recognition.

  • Hackable, High-Bandwidth Sensory Augmentation

    Joe Paradiso and Gershon Dublon

    The tongue has extremely dense sensing resolution, as well as an extraordinary degree of neuroplasticity–the ability to adapt to and internalize new input. Research has shown that electro-tactile tongue displays paired with cameras can be used as vision prosthetics for the blind or visually impaired; users quickly learn to read and navigate through natural environments, and many describe the signals as an innate sense. However, existing displays are expensive and difficult to adapt. Tongueduino is an inexpensive, vinyl-cut tongue display designed to interface with many types of sensors besides cameras. Connected to a magnetometer, for example, the system provides a user with an internal sense of direction, like a migratory bird. Plugged into weighted piezo whiskers, a user can sense orientation, wind, and the lightest touch. Through tongueduino, we hope to bring electro-tactile sensory substitution beyond vision replacement, towards open-ended sensory augmentation.

  • Hacking the Sketchbook

    Joseph A. Paradiso and Jie Qi

    In this project we investigate how the process of building a circuit can be made more organic, like sketching in a sketchbook. We integrate a rechargeable power supply into the spine of a traditional sketchbook, so that each page of the sketchbook has power connections. This enables users to begin creating functioning circuits directly onto the pages of the book and to annotate as they would in a regular notebook. The sequential nature of the sketchbook allows creators to document their process for circuit design. The book also serves as a single physical archive of various hardware designs. Finally, the portable and rechargeable nature of the book allows users to take their electronic prototypes off of the lab bench and share their creations with people outside of the lab environment.

  • Human Factors and Lighting

    Joseph A. Paradiso, Matthew Aldrich, Nan Zhao, Eun Young Lim (Visiting Researcher, Samsung)

    In a series of psychometric experiments, we tested subjects' perception of lighting in a virtual environment to assess the possibility of describing and subsequently controlling lighting in a dimension other than brightness. Our findings suggest that human perception of lighting is also explained by variables other than brightness. These data are used to design a lighting control system that simultaneously maps the spatial and visual characteristics of the room into a more natural and intuitive form of control.

  • ListenTree: Audio-Haptic Display in the Natural Environment

    V. Michael Bove, Joseph A. Paradiso, Gershon Dublon and Edwina Portocarrero

    ListenTree is an audio-haptic display embedded in the natural environment. A visitor to our installation notices a faint sound appearing to emerge from a tree. By resting their head against the tree, they are able to hear sound through bone conduction. To create this effect, an audio exciter transducer is weatherproofed and attached to the tree's roots, transforming it into a living speaker that channels audio through its branches and provides vibrotactile feedback. In one deployment, we used the ListenTree to display live sound from an outdoor ecological monitoring sensor network, bringing a faraway wetland into the urban landscape. Our intervention is motivated by a need for forms of display that fade into the background, inviting attention rather than requiring it. We consume most digital information through devices that alienate us from our surroundings; ListenTree points to a future where digital information might become enmeshed in material.

  • Living Observatory: Sensor Networks for Documenting and Experiencing Ecology

    Glorianna Davenport, Joe Paradiso, Gershon Dublon, Pragun Goyal and Brian Dean Mayton

    Living Observatory is an initiative for documenting and interpreting ecological change that will allow people, individually and collectively, to better understand relationships between ecological processes, human lifestyle choices, and climate change adaptation. As part of this initiative, we are developing sensor networks that document ecological processes and allow people to experience the data at different spatial and temporal scales. Low-power sensor nodes capture climate and other data at a high spatiotemporal resolution, while others stream audio. Sensors on trees measure transpiration and other cycles, while fiber-optic cables in streams capture high resolution temperature data. At the same time, we are developing tools that allow people to explore this data, both remotely and onsite. The remote interface allows for immersive 3D exploration of the terrain, while visitors to the site will be able to access data from the network around them directly from wearable devices.

  • People Tracking System

    Joseph A. Paradiso, Nicholas Gillian and Sara Pfenninger

    Over the last decade, tracking people has attracted considerable research interest. Cameras enable efficient tracking, but the task is nontrivial, especially when the sensors have non-overlapping coverage. Our goal is to track people, that is, to both locate and identify them, throughout a multi-story building with a sparse network of non-overlapping cameras and RFID sensors.

  • PrintSense: A Versatile Sensing Technique to Support Flexible Surface Interaction

    Joseph A. Paradiso and Nan-Wei Gong

    Touch sensing has become established for a range of devices and systems both commercially and in academia. In particular, multi-touch scenarios based on flexible sensing substrates are popular for products and research projects. We leverage recent developments in single-layer, off-the-shelf, inkjet-printed conductors on flexible substrates as a practical way to prototype the necessary electrode patterns, and combine this with our custom-designed PrintSense hardware module, which supports a full range of capacitive sensing techniques. Beyond touch detection, the module senses pressure, flexing, and close-proximity gestures in many scenarios.
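
    A common signal-processing step behind capacitive touch and proximity detection on printed electrodes is comparing raw readings against a slowly adapting baseline. This is a generic sketch, not the PrintSense firmware; the class name and threshold values are assumptions.

```python
# Illustrative sketch: classifying raw capacitance counts into
# touch / proximity / idle with a drift-compensated baseline.
class CapChannel:
    def __init__(self, touch_delta=200, prox_delta=50, alpha=0.01):
        self.baseline = None
        self.touch_delta = touch_delta  # counts above baseline => touch
        self.prox_delta = prox_delta    # counts above baseline => hover
        self.alpha = alpha              # baseline tracking rate

    def update(self, raw):
        if self.baseline is None:
            self.baseline = float(raw)  # seed baseline on first sample
        delta = raw - self.baseline
        if delta > self.touch_delta:
            state = "touch"
        elif delta > self.prox_delta:
            state = "proximity"
        else:
            state = "idle"
            # Track slow environmental drift only when untouched, so a
            # sustained finger press is not absorbed into the baseline.
            self.baseline += self.alpha * delta
        return state
```

    Freezing the baseline during touch is the key design choice: without it, a resting finger would gradually read as "idle".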

  • Prosthetic Sensor Networks: Factoring Attention, Proprioception, and Sensory Coding

    Gershon Dublon

    Sensor networks permeate our built and natural environments, but our means of interfacing to the resulting data streams have not evolved much beyond HCI and information visualization. Researchers have long experimented with wearable sensors and actuators on the body as assistive devices. A user’s neuroplasticity can, under certain conditions, transcend sensory substitution to enable perceptual-level cognition of “extrasensory” stimuli delivered through existing sensory channels. Conventional user interfaces, in contrast, are never fully incorporated by the body, and our relationship to them is never fully pre-attentive; there remains a huge gap between data and human sensory experience. We are exploring the space between sensor networks and human augmentation, in which distributed sensors become sensory prostheses. Attention and proprioception are key, not only to moderate and direct stimuli, but also to enable users to move through the world naturally, attending to the sensory modalities relevant to their specific contexts.

  • Sambaza Watts

    Joe Paradiso, Ethan Zuckerman, Rahul Bhargava, Pragun Goyal, Alexis Hope and Nathan Matias

    We want to help people in nations where electric power is scarce to sell power to their neighbors. We’re designing a piece of prototype hardware that plugs into a diesel generator or other power source, distributes the power to multiple outlets, monitors how much power is used, and uses mobile payments to charge the customer for the power consumed.

  • Scalable and Versatile Surface for Ubiquitous Sensing

    Joe Paradiso, Nan-Wei Gong and Steve Hodges (Microsoft Research Cambridge)

    We demonstrate the design and implementation of a new versatile, scalable, and cost-effective sensate surface. The system is based on a new conductive inkjet technology, which allows capacitive sensor electrodes and different types of RF antennas to be cheaply printed onto a roll of flexible substrate that may be many meters long. By deploying this surface on (or under) a floor, it is possible to detect the presence and whereabouts of users through both passive and active capacitive coupling schemes. We have also incorporated GSM and NFC electromagnetic radiation sensing and piezoelectric pressure and vibration detection. We believe that this technology has the potential to change the way we think about covering large areas with sensors and associated electronic circuitry–not just floors, but potentially desktops, walls, and beyond.

  • SEAT-E: Solar Power for People, Big Data for Cities

    Kent Larson, Joseph A. Paradiso, Sandra Richter, Nan Zhao and Ines Gaisset

    SEAT-E provides free access to renewable energy to charge smart phones and small electronic devices in cities, bringing cities one step closer to fulfilling a key UN goal: sustainable energy access for all. The seats are off-grid and entirely autonomous. Fully integrated solar panels store energy in Li-ion batteries and can be accessed through weatherproof USB ports. The batteries also power lighting and sensing. Each seat has an ID and forms part of the SEAT-E network. The seats gather location-based data on air quality; cities typically measure air quality only at one or two locations, but levels vary significantly depending on traffic and other factors. As a result, policymakers and citizens are often uninformed. Public engagement with this sensor data has the potential to create a platform for real dialogue between cities and their citizens about the air we share.

  • Sensor Fusion for Gesture Analyses of Baseball Pitching

    Joseph A. Paradiso, Carolina Brum Medeiros and Michael Lapinski

    Current sports-medicine practices for understanding the motion of athletes while engaged in their sport of choice are limited to camera-based marker tracking systems that generally lack the fidelity and sampling rates necessary to make medically usable measurements; they also typically require a structured, stable "studio" environment, and need considerable time to set up and calibrate. The data from our wearable sensor system provides the ability to understand the forces and torques that an athlete's joints and body segments undergo during activity, and allows for precise biomechanical modeling of an athlete's motion. The application of sensor fusion techniques is essential for optimal extraction of kinetic and kinematic information, and the system provides an alternative measurement method that can be used in out-of-lab scenarios.
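
    The project's own fusion pipeline is not specified here, but a complementary filter, one widely used way to fuse gyroscope and accelerometer data into an orientation estimate, gives the flavor of the technique. Function names and the blending constant are illustrative assumptions.

```python
import math

# Minimal sketch of one common sensor-fusion step: blend the integrated
# gyroscope angle (accurate short-term, drifts long-term) with the
# accelerometer's gravity-derived tilt (noisy short-term, stable long-term).
def complementary_filter(angle_deg, gyro_dps, ax, az, dt, k=0.98):
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
    return k * (angle_deg + gyro_dps * dt) + (1 - k) * accel_angle
```

    Running this at the sensor rate yields a drift-corrected joint angle from which angular velocities, and ultimately torques, can be derived.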

  • Virtual Messenger

    Joe Paradiso and Nick Gillian

    The virtual messenger system acts as a portal to subtly communicate messages and pass information between the digital, virtual, and physical worlds, using the Media Lab’s Glass Infrastructure system. Users who opt into the system are tracked throughout the Media Lab by a multimodal sensor network. If a participating user approaches any of the Lab’s Glass Infrastructure displays they are met by their virtual personal assistant (VPA), who exists in DoppelLab’s virtual representation of the current physical space. Each VPA acts as a mediator to pass on any messages or important information from the digital world to the user in the physical world. Participating users can interact with their VPA through a small subset of hand gestures, allowing the user to read any pending messages or notices, or inform their virtual avatar not to bother them until later.

  • Wearable, Wireless Sensor System for Sports Medicine and Interactive Media

    Joe Paradiso, Michael Thomas Lapinski, Dr. Eric Berkson and MGH Sports Medicine

    This project is a system of compact, wearable, wireless sensor nodes, equipped with full six-degree-of-freedom inertial measurement units and node-to-node capacitive proximity sensing. A high-bandwidth, channel-shared RF protocol has been developed to acquire data from many (e.g., 25) of these sensors at 100 Hz full-state update rates, and software is being developed to fuse this data into a compact set of descriptive parameters in real time. A base station and central computer clock the network and process received data. We aim to capture and analyze the physical movements of multiple people in real time, using unobtrusive sensors worn on the body. Applications abound in biomotion analysis, sports medicine, health monitoring, interactive exercise, immersive gaming, and interactive dance ensemble performance.
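
    The description's numbers imply a tight per-node air-time budget for any slotted channel-sharing scheme. The arithmetic below uses only the figures given in the text (25 nodes, 100 Hz); the slotted (TDMA-style) framing is an assumption, since the actual protocol is not detailed.

```python
# Back-of-envelope slot budget for a slotted channel-shared protocol:
# if every node must report its full state once per update period,
# the period is divided among all nodes.
def slot_duration_us(nodes=25, update_hz=100):
    frame_us = 1_000_000 / update_hz  # one full-state frame per update
    return frame_us / nodes           # air time available per node

# 25 nodes at 100 Hz: 10 ms / 25 = 400 microseconds of air time per node.
```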