Object-Based Media
Changing storytelling, communication, and everyday life through sensing, understanding, and new interface technologies.
We explore the future of electronic visual communication and expression, and how the distribution of computational intelligence throughout video and audio communication systems can make a richer connection between the people at the ends of the systems, whether a broadcast system or a peer-to-peer environment. We also develop hardware and software technologies to support the requirements of such a scenario, with particular focus on new input and output technologies, advanced interfaces for consumer electronics, and self-organization among smart devices.

Research Projects

  • 3D Telepresence Chair

    V. Michael Bove Jr. and Daniel Novy

    An autostereoscopic (no glasses) 3D display engine is combined with a "Pepper's Ghost" setup to create an office chair that appears to contain a remote meeting participant. The system geometry is also suitable for other applications such as tabletop or automotive heads-up displays.

  • Ambi-blinds

    V. Michael Bove and Ermal Dreshaj

    Ambi-blinds are solar-powered, sunlight-driven window blinds. A reinvention of a common household item, Ambi-blinds use the level of sunlight striking the window to automatically control the tilt of the blinds, effectively controlling how much sunlight enters a room at each time of day. Sleep studies indicate that regularly waking with sunlight promotes wellness and sleep quality by helping regulate our circadian rhythm throughout the day. By automatically regulating the user's exposure to sunlight, Ambi-blinds promote well-being in a non-invasive way, and close at night to allow for privacy.

  • Awakened Apparel

    V. Michael Bove, Laura Perovich and Philippa Mothersill

    This project investigates soft mechanisms, origami, and fashion. We created a modified Miura fold skirt that changes shape through pneumatic actuation. In the future, our skirt could dynamically adapt to the climatic, functional, and emotional needs of the user—for example, it might become shorter in warm weather.

  • BigBarChart

    V. Michael Bove and Laura Perovich

    BigBarChart is an immersive 3D bar chart that provides a new physical way for people to interact with data. It takes data beyond visualizations to map out a new area—data experiences—which are multisensory, embodied, and aesthetic interactions. BigBarChart is made up of a number of bars that extend up to 10 feet to create an immersive experience. Bars change height and color in response to interactions that are direct (a person entering the room), tangible (pushing down on a bar to get meta information), or digital (controlling bars and performing statistical analyses through a tablet). BigBarChart helps both scientists and the general public understand information from a new perspective. Early prototypes are available.

  • Bottles&Boxes: Packaging with Sensors

    Ermal Dreshaj and Daniel Novy

    We have added inexpensive, low-power, wireless sensors to product packages to detect user interactions with products. Thus, a bottle can register when and how often its contents are dispensed (and generate side effects like causing a music player to play music when the bottle is picked up, or generating an automatic refill order when near-emptiness is detected). A box can understand usage patterns of its contents. Consumers can vote for their favorites among several alternatives simply by handling them more often.
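    As a sketch of how such a package might expose its interactions to applications, the following models a sensor-equipped bottle as an event source. The event names, capacity, and refill threshold are illustrative assumptions, not the actual hardware protocol:

```python
import time

# Hypothetical event model: each sensor report is (timestamp, event_type).
# Event names and thresholds are invented for illustration.
class SmartBottle:
    def __init__(self, capacity_ml, refill_threshold_ml=50):
        self.remaining_ml = capacity_ml
        self.refill_threshold_ml = refill_threshold_ml
        self.events = []          # interaction log
        self.handlers = {}        # event_type -> callback

    def on(self, event_type, callback):
        self.handlers[event_type] = callback

    def report(self, event_type, amount_ml=0):
        self.events.append((time.time(), event_type))
        if event_type == "dispense":
            self.remaining_ml = max(0, self.remaining_ml - amount_ml)
            if self.remaining_ml <= self.refill_threshold_ml:
                self._fire("refill_needed")
        self._fire(event_type)

    def _fire(self, event_type):
        cb = self.handlers.get(event_type)
        if cb:
            cb(self)

# Side effects: play music on pickup, order a refill near emptiness.
bottle = SmartBottle(capacity_ml=100)
log = []
bottle.on("pickup", lambda b: log.append("play music"))
bottle.on("refill_needed", lambda b: log.append("order refill"))
bottle.report("pickup")
bottle.report("dispense", amount_ml=60)   # 40 ml left, below threshold
print(log)                                # ['play music', 'order refill']
```

    The same event log doubles as the usage-pattern record described above: counting "pickup" events across several packages implements the voting-by-handling idea.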

  • Calliope

    V. Michael Bove Jr., Edwina Portocarrero and Ye Wang

    Calliope is the follow-up to the NeverEnding Drawing Machine. A portable, paper-based platform for interactive story making, it allows physical editing of shared digital media at a distance. The system is composed of a network of creation stations that seamlessly blend analog and digital media. Calliope documents and displays the creative process with no need to interact directly with a computer. By using human-readable tags and allowing any object to be used as material for creation, it offers opportunities for cross-cultural and cross-generational collaboration among peers with expertise in different media.

  • Consumer Holo-Video

    V. Michael Bove Jr., Bianca Datta, Ermal Dreshaj and Sundeep Jolly

    The goal of this project, building upon work begun by Stephen Benton and the Spatial Imaging group, is to create an inexpensive desktop monitor for a PC or game console that displays holographic video images in real time, suitable for entertainment, engineering, or medical imaging. To date, we have demonstrated the fast rendering of holo-video images (including stereographic images, which, unlike ordinary stereograms, have focusing consistent with depth information) from OpenGL databases on off-the-shelf PC graphics cards; current research addresses new optoelectronic architectures to reduce the size and manufacturing cost of the display system.

  • Crystal Ball

    Amir Lazarovich, Dan Novy, Andy Lippman, Michael Bove

    A physical interface designed for simultaneous social interaction with visual material. We built a hemispherical, multi-person, interactive touch display that allows a small group of people in the same location, or in equivalently equipped remote ones, to jointly interact on the same surface. We created an application for this platform that presents a selection of visual media and offers recommendations for shared viewing.

  • Digital Synesthesia

    V. Michael Bove and Santiago Eloy Alfaro

    Digital Synesthesia seeks to evolve human-computer interaction into human-world interaction. It aims to let users experience the world by perceiving information outside their sensory capabilities. Modern technology can already detect information from the world beyond our natural sensory spectrum; however, there is still no real way for our brains and bodies to incorporate this new information into our sensory toolkit, so that we can understand our surrounding world in new and undiscovered ways. The long-term vision of this work is to give users the ability to turn senses on and off depending on the desired experience. This project is part of the Ultimate Media initiative and will be applied to the navigation and discovery of media content.

  • Direct Fringe Writing of Computer-Generated Holograms

    V. Michael Bove Jr., Sundeep Jolly and University of Arizona College of Optical Sciences

    Photorefractive polymer has many attractive properties for dynamic holographic displays; however, the current display systems based around its use involve generating holograms by optical interference methods that complicate the optical and computational architectures of the systems, and limit the kinds of holograms that can be displayed. We are developing a system to write computer-generated diffraction fringes directly from spatial light modulators to photorefractive polymers, resulting in displays with reduced footprint and cost, and potentially higher perceptual quality.

  • Dressed in Data

    V. Michael Bove and Laura Perovich

    This project steps beyond data visualizations to create data experiences. It aims to engage not only the analytic mind, but also the artistic and emotional self. In this project, chemicals found in people’s bodies and homes are turned into a series of fashions. Quantities, properties, and sources of chemicals are represented through various parameters of the fashion, such as fabric color, textures, and sizes. Wearing these outfits allows people to live the data–to experience tangibly the findings from their homes and bodies. This is the first project in a series of works that seek to create aesthetic data experiences that prompt researchers and laypeople to engage with information in new ways.

  • Drift Bottle

    V. Michael Bove, Lingyun Sun and Zhejiang University

    How can emotions be conveyed, expressed, and felt? Drift Bottle is a project exploring interfaces that allow users to "feel" others’ emotions to promote communication. We have developed a voice message-exchange web service, and on top of it we have designed and built several terminals with different interfaces that convey emotions via media such as light, smell, and motion. One solution conveys the emotions in voice messages via different colors of light. Our latest effort conveys emotions via smells, with the intention of arousing the same emotions in the receivers.

  • DUSK

    V. Michael Bove, Bianca Datta and Ermal Dreshaj

    DUSK was created as part of the MIT Media Lab Wellness Initiative (a Robert Wood Johnson Foundation grant) to create private, restful spaces for people at the workplace. DUSK promotes a vision of a new type of “nap pod,” where workers are encouraged to use the structure for regular breaks and meditation on a daily basis. The user is provided the much-needed privacy to take a phone call, focus, or rest inside the pod for short periods during the day. The inside can be silent or filled with binaural-beats audio, pitch black or illuminated by a sunlamp–whatever works for the user to get the rest and relaxation necessary to stay healthy and productive. DUSK uses a parametric press-fit design, making it scalable and customizable for per-user fabrication.

  • EmotiveModeler: An Emotive Form Design CAD Tool

    V. Michael Bove and Philippa Mothersill

    We associate different tastes with specific emotions and shapes. Flavorful Forms uses the tactile allegory emotive design framework to generate designs for spice bottles whose forms express the character of the flavor contained within. The framework is integrated into 3D modeling software: users input higher-level emotive descriptions of a flavor, which modify the design of the bottle so that its form expresses the flavor's character.

  • EmotiveModeler: Tactile Allegory Design Framework

    V. Michael Bove and Philippa Mothersill

    We have an unconscious understanding of the meaning of different physical objects through our extensive interactions with them. Designers can extend and adapt the existing symbolic meanings through the design of these objects, adding a layer of emotive expression by manipulating their forms. Tactile Allegory explores the physical design language encoded into objects and asks: how can objects be computationally designed to communicate specific information through their very forms? This research explores the underlying design "grammar" of the form of objects, particularly how objects can communicate information to us through their form. The framework is used to create a computational design tool that helps people design expressively shaped objects communicating the higher-level sentiments of their ideas through aesthetic form.

  • Everything Tells a Story

    V. Michael Bove Jr., David Cranor and Edwina Portocarrero

    Following upon work begun in the Graspables project, we are exploring what happens when a wide range of everyday consumer products can sense their use, interpret it in human terms (using pattern-recognition methods), and retain memories, such that users can construct a narrative with the aid of the recollections of the "diaries" of their sporting equipment, luggage, furniture, toys, and other items with which they interact.

  • Guided-Wave Light Modulator

    V. Michael Bove Jr., Bianca Datta and Sunny Jolly

    We are developing inexpensive, efficient, high-bandwidth light modulators based on lithium niobate guided-wave technology. These modulators are suitable for demanding, specialized applications such as holographic video displays, as well as other light modulation uses such as compact video projectors.

  • Holoshop

    Paula Dawson, Masa Takatsuka, Hiroshi Yoshikawa, Brian Rogers, V. Michael Bove Jr.

    This project aims to make it easy to create 3D drawings that have the highly nuanced qualities of handmade drawings. Typically, 2D drawing relies on the conjunction of the friction and pressure of the medium (pencil on paper) to enable a sensitive registration of the gesture. When drawing in 3D, however, there is not necessarily a “support.” Holoshop software uses forces and magnetism of open and closed fields to enable the user to locate fixed and semipermeable “supports” within the 3D environment. Holoshop is being developed for use in conjunction with a haptic device, the Phantom, enabling the user to navigate 3D space through both touch and vision. In addition, real-time modulation of lines by velocity and pressure enables responsive drawings that can be exported for holograms, 3D prints, and other 3D displays. This research is supported under the Australian Research Council's Discovery Projects funding scheme (DP1094613).

  • Infinity-by-Nine

    V. Michael Bove Jr. and Daniel Novy

    We are expanding the home-video viewing experience by generating imagery to extend the TV screen and give the impression that the scene wraps completely around the viewer. Optical flow, color analysis, and heuristics extrapolate beyond the screen edge, where projectors provide the viewer's peripheral vision with low-detail dynamic patterns that are perceptually consistent with the video imagery and increase the sense of immersive presence and participation. We perform this processing in real time using standard microprocessors and GPUs.
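    The extrapolation step can be sketched in miniature. The routine below is a deliberately simplified illustration, not the project's actual optical-flow pipeline: it extends a frame past its right edge with low-detail color that fades from the edge pixel toward the local edge mean:

```python
# Illustrative sketch: extend a frame past its right edge with low-detail
# color. Frame = rows of (R, G, B) tuples; "sample" edge pixels are averaged
# to get the target color for the far end of the extension.
def extend_right(frame, width, sample=3):
    out = []
    for row in frame:
        tail = row[-sample:]
        mean = tuple(sum(c[i] for c in tail) // len(tail) for i in range(3))
        edge = row[-1]
        ext = []
        for x in range(width):
            t = (x + 1) / width          # 0 near the screen, 1 far away
            ext.append(tuple(round(edge[i] * (1 - t) + mean[i] * t)
                             for i in range(3)))
        out.append(ext)
    return out

# A tiny 2x3 "frame": a red row and a green row.
frame = [[(200, 0, 0), (220, 10, 10), (240, 20, 20)],
         [(0, 200, 0), (10, 220, 10), (20, 240, 20)]]
ext = extend_right(frame, width=4)
print(len(ext), len(ext[0]))   # 2 rows, 4 extrapolated columns
```

    The real system replaces this static blend with optical-flow-driven motion of the extrapolated patterns, so off-screen content appears to move coherently with the on-screen video.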

  • ListenTree: Audio-Haptic Display in the Natural Environment

    V. Michael Bove, Joseph A. Paradiso, Gershon Dublon and Edwina Portocarrero

    ListenTree is an audio-haptic display embedded in the natural environment. Visitors to our installation notice a faint sound emerging from a tree. By resting their heads against the tree, they are able to hear sound through bone conduction. To create this effect, an audio exciter transducer is weatherproofed and attached to the tree's roots, transforming it into a living speaker, channeling audio through its branches, and providing vibrotactile feedback. In one deployment, we used ListenTree to display live sound from an outdoor ecological monitoring sensor network, bringing a faraway wetland into the urban landscape. Our intervention is motivated by a need for forms of display that fade into the background, inviting attention rather than requiring it. We consume most digital information through devices that alienate us from our surroundings; ListenTree points to a future where digital information might become enmeshed in material.

  • Narratarium

    V. Michael Bove Jr., Fransheska Colon, Catherine Havasi, Katherine (Kasia) Hayden, Daniel Novy, Jie Qi and Robert H. Speer

    Narratarium augments printed and oral stories and creative play by projecting immersive images and sounds. We are using natural language processing to listen to and understand stories being told, and analysis tools to recognize activity among sensor-equipped objects such as toys, then thematically augmenting the environment using video and sound. New work addresses the creation and representation of audiovisual content for immersive story experiences and the association of such content with viewer context.

  • Networked Playscapes: Dig Deep

    V. Michael Bove and Edwina Portocarrero

    Networked Playscapes re-imagine outdoor play by merging the flexibility and fantastical qualities of the digital world with the tangible, sensorial properties of physical play to create hybrid interactions for the urban environment. Dig Deep takes the classic sandbox found in children's playgrounds and merges it with the common fantasy of "digging your way to the other side of the world" to create a networked interaction in tune with child cosmogony.

  • Pillow-Talk

    V. Michael Bove Jr., Edwina Portocarrero and David Cranor

    Pillow-Talk is the first of a series of objects designed to aid creative endeavors through the unobtrusive acquisition of unconscious, self-generated content to permit reflexive self-knowledge. Composed of a seamless recording device embedded in a pillow, and a playback and visualization system in a jar, Pillow-Talk crystallizes that which we normally forget. This allows users to capture their dreams in a less mediated way, aiding recollection by priming the experience and minimizing distraction during recall and capture through embodied interaction.

  • ProtoTouch: Multitouch Interfaces to Everyday Objects

    V. Michael Bove Jr. and David Cranor

    An assortment of everyday objects is given the ability to understand multitouch gestures of the sort used in mobile-device user interfaces, enabling people to use such increasingly familiar gestures to control a variety of objects, and to "copy" and "paste" configurations and other information among them.

  • ShAir: A Platform for Mobile Content Sharing

    Yosuke Bando, Daniel Dubois, Konosuke Watanabe, Arata Miyamoto, Henry Holtzman, and V. Michael Bove

    ShAir is a platform for instantly and easily creating local content-shareable spaces without requiring an Internet connection or location information. ShAir-enabled devices can opportunistically communicate with other mobile devices and optional pervasive storage devices such as WiFi SD cards whenever they enter the radio range of each other. Digital content can hop through devices in the background without user intervention. Applications that can be built on top of the platform include ad-hoc photo/video/music sharing and distribution, opportunistic social networking and games, digital business card exchange during meetings and conferences, and local news article sharing on trains and buses.
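    The hop-through-devices behavior can be sketched as follows. This is a minimal illustration under assumed names, not the ShAir API: each device keeps a content store, and whenever two devices enter radio range they exchange whatever the other is missing:

```python
# Minimal sketch of opportunistic content hopping (names are invented for
# illustration): content spreads device-to-device with no server and no
# user intervention, purely through chance encounters.
class Device:
    def __init__(self, name):
        self.name = name
        self.store = {}   # content_id -> payload

    def publish(self, content_id, payload):
        self.store[content_id] = payload

    def encounter(self, other):
        """Called when two devices enter each other's radio range."""
        for cid, payload in other.store.items():
            self.store.setdefault(cid, payload)
        for cid, payload in list(self.store.items()):
            other.store.setdefault(cid, payload)

a, b, c = Device("a"), Device("b"), Device("c")
a.publish("photo1", b"...jpeg bytes...")
a.encounter(b)      # photo hops a -> b
b.encounter(c)      # then b -> c, although a and c never meet
print("photo1" in c.store)   # True
```

    A real implementation would add the pieces this sketch omits: radio-range discovery, transfer scheduling, storage limits, and eviction policy for the background store.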

  • ShakeOnIt

    V. Michael Bove Jr. and David Cranor

    We are exploring ways to encode information exchange into preexisting natural interaction patterns, both between people and between a single user and objects with which he or she interacts on a regular basis. Two devices are presented to provoke thoughts regarding these information interchange modalities: a pair of gloves that requires two users to complete a "secret handshake" in order to gain shared access to restricted information, and a doorknob that recognizes the grasp of a user and becomes operational only if the person attempting to use it is authorized to do so.

  • Slam Force Net

    V. Michael Bove Jr., Santiago Alfaro and Daniel Novy

    A basketball net incorporates segments of conductive fiber whose resistance changes with degree of stretch. By measuring this resistance over time, hardware associated with this net can calculate force and speed of a basketball traveling through the net. Applications include training, toys that indicate the force and speed on a display, “dunk competitions,” and augmented-reality effects on television broadcasts. This net is far less expensive and more robust than other approaches to measuring data about the ball (e.g., photosensors or ultrasonic sensors) and the only physical change required for the hoop or backboard is electrical connections to the net. Another application of the material is a flat net that can measure velocity of a ball hit or pitched into it (as in baseball or tennis); it can measure position as well (e.g., for determining whether a practice baseball pitch would have been a strike).
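    A minimal sketch of the speed calculation follows. The baseline resistance, threshold, sample rate, and net depth are invented calibration values for illustration, not measured data: the ball's passage shows up as a pulse of elevated resistance, and speed falls out of the pulse duration:

```python
# Illustrative model: the net's conductive segments rise in resistance as
# they stretch. Sampling resistance over time, ball speed is estimated from
# how quickly the stretch pulse passes: speed ~ net_depth / transit_time.
def ball_speed(samples, threshold_ohms, sample_rate_hz, net_depth_m=0.4):
    """samples: resistance readings in ohms; returns speed in m/s, or None."""
    above = [i for i, r in enumerate(samples) if r > threshold_ohms]
    if not above:
        return None   # no stretch pulse detected
    transit_s = (above[-1] - above[0] + 1) / sample_rate_hz
    return net_depth_m / transit_s

# A fast shot stretches the net briefly; baseline here is ~100 ohms.
readings = [100, 101, 180, 240, 210, 130, 100, 100]
print(ball_speed(readings, threshold_ohms=150, sample_rate_hz=100))
```

    Peak resistance over the same window would give the force estimate; in practice both would be calibrated against balls of known speed and mass.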

  • SurroundVision

    V. Michael Bove Jr. and Santiago Alfaro

    Adding augmented reality to the living-room TV, we are exploring the technical and creative implications of using a mobile phone or tablet (and possibly also dedicated devices like toys) as a controllable "second screen" for enhancing television viewing. Thus, a viewer could use the phone to look beyond the edges of the television to see the audience for a studio-based program, to pan around a sporting event, to take snapshots for a scavenger hunt, or to simulate binoculars to zoom in on a part of the scene. Recent developments include the creation of a mobile device app for Apple products and user studies involving several genres of broadcast television programming.

  • The "Bar of Soap": Grasp-Based Interfaces

    V. Michael Bove Jr. and Brandon Taylor

    We have built several handheld devices that combine grasp and orientation sensing with pattern recognition in order to provide highly intelligent user interfaces. The Bar of Soap is a handheld device that senses the pattern of touch and orientation when it is held, and reconfigures to become one of a variety of devices, such as phone, camera, remote control, PDA, or game machine. Pattern-recognition techniques allow the device to infer the user's intention based on grasp. Another example is a baseball that determines a user's pitching style as an input to a video game.
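    The grasp-to-mode inference can be sketched as a nearest-centroid classifier. The sensor count and the training centroids below are invented for illustration; the actual device uses its own sensor layout and pattern-recognition pipeline:

```python
import math

# Sketch of grasp-based mode inference: each grasp is a vector of touch
# readings in [0, 1]; the mode whose centroid is closest wins. Centroids
# here are hypothetical, standing in for learned training data.
CENTROIDS = {
    "phone":  [0.9, 0.8, 0.1, 0.1, 0.7, 0.9, 0.2, 0.1],
    "camera": [0.2, 0.1, 0.9, 0.8, 0.1, 0.2, 0.9, 0.8],
    "remote": [0.5, 0.5, 0.5, 0.5, 0.1, 0.1, 0.1, 0.1],
}

def infer_mode(reading):
    # Pick the mode with the smallest Euclidean distance to the reading.
    return min(CENTROIDS,
               key=lambda mode: math.dist(reading, CENTROIDS[mode]))

grasp = [0.85, 0.75, 0.15, 0.1, 0.6, 0.85, 0.25, 0.15]   # phone-like grip
print(infer_mode(grasp))   # phone
```

    Orientation sensing would simply extend the feature vector, and a richer classifier could replace the centroid rule without changing the interface.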

  • Ultra-High Tech Apparel

    V. Michael Bove, Philippa Mothersill, Laura Perovich, Christopher Bevans (CBAtelier), and Philipp Schmidt

    The classic lab coat has been a reliable fashion staple for scientists around the world. But Media Lab researchers are not only scientists—we are also designers, tinkerers, philosophers, and artists. We need a different coat! Enter the Media Lab coat. Our lab coat is uniquely designed for, and with, the Media Lab community. It features reflective materials, new bonding techniques, and integrated electronics. Each Labber has different needs. Some require access to Arduinos, others need moulding materials, yet others carry around motors or smart tablets. The lab coat is a framework for customization. The coat is just the start. Together with some of the innovative member companies of the MIT Media Lab, we are exploring protective eyewear, footwear, and everything in between.

  • Vision-Based Interfaces for Mobile Devices

    V. Michael Bove Jr. and Santiago Alfaro

    Mobile devices with cameras have enough processing power to do simple machine-vision tasks, and we are exploring how this capability can enable new user interfaces to applications. Examples include dialing someone by pointing the camera at the person's photograph, or using the camera as input for navigating virtual spaces larger than the device's screen.