Experiential Lighting: New User Interfaces for Lighting Control

The vision of pervasive computing is now mainstream: connected devices permeate every aspect of our lives. Yet we remain tethered to arcane user interfaces. Unlike consumer devices, building appliances and utilities perpetuate these outdated interaction models, and lighting control is a prime example. Here, we show how a data-driven methodology, drawing on both people and sensors, enables an entirely new method of lighting control.

We are evaluating new methods of interacting with and controlling solid-state lighting, based on our findings of how participants experience and perceive architectural lighting in our new lighting laboratory (E14-548S). This work, aptly named "Experiential Lighting," reduces the complexity of modern lighting controls (intensity/color/space) to a simple mapping, aided by both human input and sensor measurement. We believe our approach extends beyond general lighting control and applies to any situation where human rankings and preferences are critical requirements for control and actuation. We expect our foundational studies to guide future camera-based systems that will inevitably incorporate context in their operation (e.g., Google Glass).

Principal Investigator: Joseph Paradiso

Research Group: Responsive Environments group at the MIT Media Lab

Research Assistants: Matt Aldrich, Nan Zhao

Collaborator: Susanne Seitinger, Philips Lighting

Project at a glance

Nan Zhao, Research Scientist (Past Member)
Matthew Aldrich, Research Assistant
Joseph A. Paradiso, Alexander W. Dreyfoos (1954) Professor