Center for Terrestrial Sensing


The mission of the Media Lab's new Center for Terrestrial Sensing is to explore unconventional ways to sense and visualize inaccessible natural environments: places where humans cannot physically go, such as underground, in undersea oil fields, and in the atmosphere. How people connect with, navigate, and interact with large amounts of geoscience information is an area with both world-changing potential and deep challenges. The Center for Terrestrial Sensing aims to connect People to the Planet.

Research Vision


The Center's themes will address the fundamental challenges and opportunities of global sensing and of user interfaces for volumetric data visualization. We will also explore remote collaboration to address connection and interaction challenges among people in different places and at different times.

We aim to design, develop, and build new visualization tools that reveal the unseen and relay information that lies beyond human perception. To this end, the Center explores novel 3D user-interface interaction models for instrumented global environments, focusing on three main research pillars: sensing, visualization, and remote collaboration.

We will tackle these challenges by addressing the following questions:

--How can we represent and interact with multi-dimensional (3D and higher) information and models?
--How can we seamlessly integrate interpersonal and shared 3D workspaces for geographically distributed collaboration?
--How can we create efficient, novel mobile devices and tools for field access to 3D data and simulations, and for collaboration?

Principal Investigator: 
Hiroshi Ishii
Principal Investigator: 
Joseph A. Paradiso
Principal Investigator: 
Pattie Maes
Principal Investigator: 
Ramesh Raskar
Principal Investigator: 
V. Michael Bove Jr.
Primary administrative contact: 
Aymar Guespereau, Assistant Director


Related Research Groups

Fluid Interfaces
Designs and develops interfaces that are a more natural extension of our minds, bodies, and behavior.

Tangible Media
Seamlessly couples the worlds of bits and atoms by giving dynamic physical form to digital information and computation.

Object-Based Media
Makes systems that explore how sensing, understanding, and new interface technologies can change everyday life.

Responsive Environments
Explores how sensor networks augment and mediate human experience, interaction, and perception.

Camera Culture
Focuses on creating tools to better capture and share visual information.


2D screens, even stereoscopic ones, limit our ability to interact with and collaborate on 3D data. We believe that an augmented reality solution, in which 3D data is seamlessly integrated into the real world, is promising. We are exploring a collaborative augmented reality system for visualizing and manipulating 3D data using a head-mounted, see-through display that supports communication and data manipulation through simple hand gestures.


The use of fluorescent probes and the recovery of their lifetimes allow for significant advances in many imaging systems, particularly medical ones. Here, we propose and experimentally demonstrate reconstructing the locations and lifetimes of fluorescent markers hidden behind a turbid layer. This opens the door to applications in non-invasive diagnosis, analysis, flowmetry, and inspection. The method is based on a time-resolved measurement that captures information about both the fluorescence lifetime and the spatial position of the probes. To reconstruct the scene, the method relies on a sparse optimization framework to invert the time-resolved measurements. This wide-angle technique does not rely on coherence and does not require the probes to be in the direct line of sight of the camera, making it potentially suitable for long-range imaging.

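The sparse inversion step mentioned above can be illustrated with a minimal sketch. This is a hypothetical toy model, not the actual reconstruction code: it assumes a linear forward operator A whose columns are time-shifted impulse responses, a sparse vector x encoding probe positions and amplitudes, and solves the standard lasso problem with iterative shrinkage-thresholding (ISTA), a common sparse-optimization solver.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=1000):
    """ISTA for the lasso problem: min_x 0.5*||A x - y||^2 + lam*||x||_1.

    A standard sparse-recovery solver of the kind such time-resolved
    inversions rely on (illustrative only)."""
    # Step size 1/L, where L = sigma_max(A)^2 is the gradient's
    # Lipschitz constant.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)   # gradient of the smooth data-fit term
        x = x - step * grad        # gradient descent step
        # Soft-thresholding enforces sparsity (proximal step for the L1 term)
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)
    return x

# Toy measurement: two "probes" hidden in an 80-dimensional scene,
# observed through 200 random linear time-resolved measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 80))
x_true = np.zeros(80)
x_true[5], x_true[40] = 1.0, -0.7
y = A @ x_true

x_hat = ista(A, y)
```

In the noiseless toy setting, `x_hat` recovers the two nonzero entries of `x_true` up to a small shrinkage bias from the L1 penalty; the real problem additionally models scattering through the turbid layer and the lifetime decay in the forward operator.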

Shape displays can be used to render both 3D physical content and user interface elements. We propose to use shape displays in three different ways to mediate interaction: facilitate, providing dynamic physical affordances through shape change; restrict, guiding users through dynamic physical constraints; and manipulate, actuating passive physical objects on the interface surface. We demonstrate this on a new, high-resolution shape display.


Living Observatory is an initiative for documenting and interpreting ecological change that will allow people, individually and collectively, to better understand relationships between ecological processes, human lifestyle choices, and climate change adaptation. As part of this initiative, we are developing sensor networks that document ecological processes and allow people to experience the data at different spatial and temporal scales. Low-power sensor nodes capture climate and other data at a high spatiotemporal resolution, while others stream audio. Sensors on trees measure transpiration and other cycles, while fiber-optic cables in streams capture high-resolution temperature data. At the same time, we are developing tools that allow people to explore this data, both remotely and onsite. The remote interface allows for immersive 3D exploration of the terrain, while visitors to the site will be able to access data from the network around them directly from wearable devices.


ShowMe is an immersive mobile collaboration system that allows remote users to communicate with peers using video, audio, and gestures. With this research, we explore the use of head-mounted displays and depth-sensing cameras to create a system that (1) enables remote users to be immersed in another person's view, and (2) offers a new way of sending and receiving an expert's guidance through 3D hand gestures. With our system, both users share the same physical environment and can perceive real-time input from each other.


The ability to record images with extreme temporal resolution enables a diverse range of applications, such as time-of-flight depth imaging and characterization of ultrafast processes. Here we present a demonstration of the potential of single-photon detector arrays for visualization and rapid characterization of events evolving on picosecond time scales. The single-photon sensitivity, temporal resolution, and full-field imaging capability enable the observation of light-in-flight in air, as well as the measurement of laser-induced plasma formation and dynamics in its natural environment. The extreme sensitivity and short acquisition times pave the way for real-time imaging of ultrafast processes or visualization and tracking of objects hidden from view.


TRANSFORM fuses technology and design to celebrate its transformation from still furniture to a dynamic machine driven by a stream of data and energy. TRANSFORM aims to inspire viewers with unexpected transformations and the aesthetics of the complex machine in motion. First exhibited at LEXUS DESIGN AMAZING MILAN (April 2014), the work comprises three dynamic shape displays that move over one thousand pins up and down in real time to transform the tabletop into a dynamic tangible display. The kinetic energy of the viewers, captured by a sensor, drives the wave motion represented by the dynamic pins. The motion design is inspired by dynamic interactions among wind, water, and sand in nature, Escher's representations of perpetual motion, and the attributes of sand castles built at the seashore. TRANSFORM tells of the conflict between nature and machine, and its reconciliation, through the ever-changing tabletop landscape.



ShowMe: A Remote Collaboration System that Supports Immersive Gestural Communication
[PDF, 6 pages]
Judith Amores, Xavier Benavides, Pattie Maes
Fluid Interfaces Group

Member Companies