Affective Computing
Advancing wellbeing using new ways to communicate, understand, and respond to emotion.
The Affective Computing group aims to bridge the gap between human emotions and computational technology. Current research addresses machine recognition and modeling of human emotional expression, including the invention of new software and hardware tools to help people gather, communicate, and express emotional information, together with tools to help people better manage and understand the ways emotion impacts health, social interaction, learning, memory, and behavior. Our projects are diverse: from inventing ways to help people who face communication and emotion regulation challenges; to enabling customers to give rich emotional feedback; to quantifying patterns of autonomic activity (core emotional physiology) during seizures, stress-related disorders, and sleep.

Research Projects

  • Affective Response to Haptic Signals

    Grace Leslie, Rosalind Picard, Simon Lui, Suranga Nanayakkara

    This study examines humans' affective responses to superimposed sinusoidal signals. These signals can be perceived either through sound, in the case of electronically synthesized musical notes, or through vibrotactile stimulation, in the case of vibrations produced by vibrotactile actuators. The study is concerned with the perception of superimposed vibrations, whereby two or more sinusoidal signals are perceived simultaneously, producing a perceptual impression that is substantially different from that of each signal alone, owing to the interactions between perceived sinusoidal vibrations that give rise to a unified percept of a sinusoidal chord. The theory of interval affect was derived from systematic analyses of Indian, Chinese, Greek, and Arabic music theory and tradition, and proposes a universal organization of affective response to intervals organized using a multidimensional system. We hypothesize that this interval affect system is multi-modal and will transfer to the vibrotactile domain.
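
    As a sketch of the stimuli involved, the Python snippet below (assuming only NumPy) superimposes two sinusoids into a single waveform that could drive either a speaker or a vibrotactile actuator; the frequencies and amplitudes are illustrative, not the study's actual parameters.

        import numpy as np

        def vibrotactile_chord(freqs_hz, duration_s=1.0, sample_rate=8000, amplitudes=None):
            """Superimpose sinusoids into one stimulus for a speaker or a vibrotactile actuator."""
            t = np.arange(int(duration_s * sample_rate)) / sample_rate
            amplitudes = amplitudes or [1.0] * len(freqs_hz)
            signal = sum(a * np.sin(2 * np.pi * f * t) for f, a in zip(freqs_hz, amplitudes))
            return signal / np.max(np.abs(signal))   # normalize to [-1, 1] before playback

        # Example: a 250 Hz fundamental plus a component a perfect fifth (3:2 ratio) above it.
        chord = vibrotactile_chord([250.0, 375.0])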

  • An EEG and Motion-Capture Based Expressive Music Interface for Affective Neurofeedback

    Grace Leslie, Rosalind Picard, and Simon Lui

    This project examines how the expression granted by new musical interfaces can be harnessed to create positive changes in health and wellbeing. We are conducting experiments to measure EEG dynamics and physical movements performed by participants who are using software designed to invite physical and musical expression of the basic emotions. The present demonstration of this system incorporates an expressive gesture sonification system using a Leap Motion device, paired with an ambient music engine controlled by EEG-based affective indices. Our intention is to better understand affective engagement, by creating both a new musical interface to invite it, and a method to measure and monitor it. We are exploring the use of this device and protocol in therapeutic settings in which mood recognition and regulation are a primary goal.

  • Auditory Desensitization Games

    Rosalind W. Picard, Matthew Goodwin and Rob Morris

    Persons on the autism spectrum often report hypersensitivity to sound. Efforts have been made to manage this condition, but there is wide room for improvement. One approach—exposure therapy—has promise, and a recent study showed that it helped several individuals diagnosed with autism overcome their sound sensitivities. In this project, we borrow principles from exposure therapy, and use fun, engaging games to help individuals gradually get used to sounds that they might ordinarily find frightening or painful.

  • Automated Tongue Analysis

    Javier Hernandez Rivera, Weixuan 'Vincent' Chen, Akane Sano, and Rosalind W. Picard

    A common practice in Traditional Chinese Medicine (TCM) is visual examination of the patient's tongue. This study will examine ways to make this process more objective and to test its efficacy for understanding stress- and health-related changes in people over time. We start by developing an app that makes it comfortable and easy for people to collect tongue data in daily life together with other stress- and health-related information. We will obtain assessments from expert practitioners of TCM, and also use state-of-the-art pattern analysis and machine learning to create algorithms that can help provide better insights for health and prevention of sickness.

  • Automatic Stress Recognition in Real-Life Settings

    Rosalind W. Picard, Robert Randall Morris and Javier Hernandez Rivera

    Technologies that automatically recognize stress are extremely important for preventing chronic psychological stress and the pathophysiological risks associated with it. The introduction of comfortable, wearable biosensors has created new opportunities to measure stress in real-life environments, but there is often great variability in how people experience stress and how they express it physiologically. In this project, we modify the loss function of Support Vector Machines to encode a person's tendency to feel more or less stressed, and give more importance to the training samples of the most similar subjects. These changes are validated in a case study in which skin conductance was monitored in nine call center employees during one week of their regular work. Employees in this type of setting usually handle high volumes of calls every day, and they frequently interact with angry and frustrated customers, which leads to high stress levels.
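
    A minimal sketch of the sample-weighting idea, assuming scikit-learn; the random features and the similarity weights below are placeholders, and the project's actual modification of the SVM objective is not reproduced here.

        import numpy as np
        from sklearn.svm import SVC

        # X: physiological features (e.g., skin conductance statistics), y: stress labels (0/1).
        X = np.random.randn(200, 8)
        y = np.random.randint(0, 2, size=200)

        # Hypothetical per-sample weights: larger for training subjects most similar to the test subject.
        similarity_weights = np.random.uniform(0.1, 1.0, size=200)

        clf = SVC(kernel="rbf", C=1.0)
        # sample_weight scales each sample's contribution to the hinge-loss term,
        # approximating a person-specific re-weighting of the objective.
        clf.fit(X, y, sample_weight=similarity_weights)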

  • Autonomic Nervous System Activity in Epilepsy

    Rosalind W. Picard and Ming-Zher Poh

    We are performing long-term measurements of autonomic nervous system (ANS) activity on patients with epilepsy. In certain cases, autonomic symptoms are known to precede seizures. Usually in our data, the autonomic changes start when the seizure appears in the EEG, and can be measured with a wristband (much easier to wear every day than an EEG). We found that the larger the signal we measure on the wrist, the longer the duration of cortical brain-wave suppression following the seizure. The duration of the latter is a strong candidate for a biomarker for SUDEP (Sudden Unexpected Death in Epilepsy), and we are working with scientists and doctors to better understand this. In addition, bilateral changes in ANS activity may provide valuable information regarding seizure focus localization and semiology.

  • BioGlass: Physiological Parameter Estimation Using a Head-Mounted Wearable Device

    Rosalind W. Picard, Javier Hernandez Rivera, James M. Rehg (Georgia Tech) and Yin Li (Georgia Tech)

    What if you could see what calms you down or increases your stress as you go through your day? What if you could see clearly what is causing these changes for your child or another loved one? People could become better at accurately interpreting and communicating their feelings, and better at understanding the needs of those they love. This work explores the possibility of using sensors embedded in Google Glass, a head-mounted-wearable device, to robustly measure physiological signals of the wearer.

  • BioInsights: Extracting Personal Data from Wearable Motion Sensors

    Rosalind W. Picard, Javier Hernandez Rivera and Daniel McDuff

    Wearable devices are increasingly in long-term close contact with the body, giving them the potential to capture sensitive and unexpected personal data. For instance, we have recently demonstrated that motion sensors embedded in a head-mounted wearable device like Google Glass can capture the heart rate and respiration rate from subtle motions of the head. We are examining additional signatures of information that can be read from motion sensors in wearable devices: for example, can a person's identity be validated from their subtle physiological motions, especially those related to their cardiorespiratory activity? How robust are these motion signatures for identifying a wearer, even when the wearer undergoes changes in posture, stress, and activity?

  • BioPhone: Physiology Monitoring from Peripheral Smartphone Motions

    Rosalind W. Picard, Javier Hernandez Rivera and Daniel McDuff

    The large-scale adoption of smartphones during recent years has created many opportunities to improve health monitoring and care delivery. This project explores whether motion sensors available in off-the-shelf smartphones can capture physiological parameters of a person during stationary postures, even while being carried in a bag or a pocket.

  • BioWatch: Estimation of Heart and Breathing Rates from Wrist Motions

    Rosalind W. Picard, Javier Hernandez Rivera and Daniel McDuff

    Most wrist-wearable smart watches and fitness bands include motion sensors; however, their use is limited to estimating physical activities such as tracking the number of steps when walking or jogging. This project explores how we can process subtle motion information from the wrist to measure cardiac and respiratory activity. In particular, we study the following research questions: How can we use the currently available motion sensors within wrist-worn devices to accurately estimate heart rate and breathing rate? How do the wrist-worn estimates compare to traditional sensors and to state-of-the-art wearable physiological sensors? Does combining measurements from motion and traditional methods improve performance? How well do the proposed methods perform in daily life situations to provide unobtrusive physiological assessments?
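
    One common way to approach the first question is to band-pass filter an accelerometer axis around the expected cardiac or respiratory band and take the strongest spectral peak. The sketch below, assuming NumPy and SciPy, illustrates that idea on synthetic data; it is not the BioWatch pipeline itself.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def dominant_rate_bpm(acc, fs, low_hz, high_hz):
            """Estimate a dominant rate (beats or breaths per minute) from one accelerometer axis."""
            b, a = butter(3, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, acc - np.mean(acc))
            spectrum = np.abs(np.fft.rfft(filtered))
            freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
            band = (freqs >= low_hz) & (freqs <= high_hz)
            return 60.0 * freqs[band][np.argmax(spectrum[band])]

        # Example with synthetic wrist acceleration sampled at 100 Hz:
        fs = 100
        t = np.arange(0, 60, 1 / fs)
        acc = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.005 * np.random.randn(len(t))  # ~72 bpm component
        heart_rate = dominant_rate_bpm(acc, fs, low_hz=0.7, high_hz=2.5)   # cardiac band
        breath_rate = dominant_rate_bpm(acc, fs, low_hz=0.1, high_hz=0.5)  # respiratory band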

  • Building the Just-Right-Challenge in Games and Toys

    Rosalind W. Picard and Elliott Hedman

    Working with the LEGO Group and Hasbro, we looked at the emotional experience of playing with games and LEGO bricks. We measured participants' skin conductance as they learned to play with these new toys. By marking the stressful moments we were able to see what moments in learning should be redesigned. Our findings suggest that framing is key: how can we help children recognize their achievements? We also saw how children are excited to take on new responsibilities but are then quickly discouraged when they aren't given the resources to succeed. Our hope for this work is that by using skin conductance sensors, we can help companies better understand the unique perspective of children and build experiences fit for them.

  • Cardiocam

    Ming-Zher Poh, Daniel McDuff and Rosalind W. Picard

    Cardiocam is a low-cost, non-contact technology for measurement of physiological signals such as heart rate and breathing rate using a basic digital imaging device such as a webcam. The ability to perform remote measurements of vital signs is promising for enhancing the delivery of primary healthcare.
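
    The sketch below, assuming OpenCV and NumPy, illustrates the basic principle of camera-based pulse measurement: average the green channel over a detected face region in each frame, then find the dominant frequency of that trace within the cardiac band. The published Cardiocam work includes additional steps (such as blind source separation) that are omitted here.

        import numpy as np
        import cv2

        cap = cv2.VideoCapture(0)                       # webcam
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        trace = []
        for _ in range(int(fps * 30)):                  # roughly 30 seconds of video
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_cascade.detectMultiScale(gray, 1.3, 5)
            if len(faces):
                x, y, w, h = faces[0]
                roi = frame[y:y + h, x:x + w]
                trace.append(roi[:, :, 1].mean())       # mean green-channel intensity of the face region
        cap.release()

        signal = np.array(trace) - np.mean(trace)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
        power = np.abs(np.fft.rfft(signal))
        band = (freqs >= 0.7) & (freqs <= 3.0)          # 42-180 beats per minute
        print("Estimated heart rate: %.0f bpm" % (60 * freqs[band][np.argmax(power[band])]))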

  • Digging into Brand Perception with Psychophysiology

    Rosalind W. Picard and Elliott Hedman

    What do customers really think about your company or brand? Using skin conductance sensors, we measure what excites and frustrates customers when discussing topics relevant to your brand. For example, with the National Campaign to Prevent Teenage Pregnancy, we saw that conversations about empowerment and abortion upset conservative families. However, talking about the importance of strong families excited and engaged them. Rather than relying on self-reports, physiological measurements allow us to pinpoint what words and concepts affect your customers. We hope work like this will help companies reflect on how their actions and messaging affect their customers' opinions in more detailed and accurate ways.

  • EDA Explorer

    Sara Taylor, Natasha Jaques, Victoria Xia, and Rosalind W. Picard

    Electrodermal Activity (EDA) is a physiological indicator of stress and strong emotion. While an increasing number of wearable devices can collect EDA, analyzing the data to obtain reliable estimates of stress and emotion remains a difficult problem. We have built a graphical tool that allows anyone to upload their EDA data and analyze it. Using a highly accurate machine learning algorithm, we can automatically detect noise within the data. We can also detect skin conductance responses, which are spikes in the signal indicating a "fight or flight" response. Users can visualize these results and download files containing features calculated on the data to be used in their own analysis. Those interested in machine learning can also view and label their data to train a machine learning classifier. We are currently adding active learning, so the site can intelligently select the fewest possible samples for the user to label.
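
    A minimal sketch, assuming SciPy, of one simple way to flag candidate skin conductance responses as local peaks with a minimum rise; EDA Explorer's actual detector and its machine-learning noise classifier are considerably more involved.

        import numpy as np
        from scipy.signal import find_peaks

        def detect_scrs(eda, fs, min_amplitude_us=0.05):
            """Return indices of candidate skin conductance responses (SCRs) in an EDA signal.

            eda: skin conductance in microsiemens; fs: sampling rate in Hz.
            """
            # Require peaks to rise at least min_amplitude_us above their surroundings
            # and to be separated by at least one second.
            peaks, _ = find_peaks(eda, prominence=min_amplitude_us, distance=int(fs))
            return peaks

        # Example on a synthetic trace sampled at 4 Hz (a common EDA sampling rate):
        fs = 4
        t = np.arange(0, 120, 1 / fs)
        eda = 2.0 + 0.3 * np.exp(-((t - 40) ** 2) / 10) + 0.01 * np.random.randn(len(t))
        print(detect_scrs(eda, fs))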

  • Emotion Prototyping: Redesigning the Customer Experience

    Rosalind W. Picard and Elliott Hedman

    You can test whether a website is usable by making wire frames, but how do you know if that site, product, or store is emotionally engaging? We build quick, iterative environments where emotions can be tested and improved. Emphasis is on setting up the right motivation (users always have to buy what they pick), pressures (can you buy the laptop in 10 minutes?), and environment (competitors’ products better be on the shelf too). Once we see where customers are stressed or miss the fun part, we change the space on a daily, iterative cycle. Within two to three weeks, we can tell how to structure a new offering for a great experience. Seldom do the emotions we hope to create happen on the first try; emotion prototyping delivers the experience we want. We hope to better understand the benefits of emotion prototyping, especially while using the skin conductance sensor.

  • Exploring Temporal Patterns of Smile

    Rosalind W. Picard and Mohammed Ehsanul Hoque

    A smile is a multi-purpose expression. We smile to express rapport, polite disagreement, delight, sarcasm, and often, even frustration. Is it possible to develop computational models to distinguish among smiling instances when delighted, frustrated, or just being polite? In our ongoing work, we demonstrate that it is useful to explore how the patterns of smile evolve through time, and that while a smile may occur in positive and in negative situations, its dynamics may help to disambiguate the underlying state.
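
    A minimal sketch of the kind of temporal features such models can use, assuming a per-frame smile-intensity track is already available; the specific features and thresholds below are illustrative only.

        import numpy as np

        def smile_dynamics_features(intensity, fps):
            """Summarize the temporal evolution of a smile-intensity track (values in [0, 1])."""
            intensity = np.asarray(intensity, dtype=float)
            peak = intensity.max()
            peak_idx = int(intensity.argmax())
            onset_speed = peak / max(peak_idx / fps, 1e-6)              # how quickly the smile builds
            offset_speed = peak / max((len(intensity) - peak_idx) / fps, 1e-6)
            duration_s = np.sum(intensity > 0.5 * peak) / fps           # time spent near the peak
            return np.array([peak, onset_speed, offset_speed, duration_s])

        # Feature vectors like these could then feed any standard classifier
        # trained on labeled delighted, frustrated, and polite smiles.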

  • Facial Expression Analysis Over the Web

    Rosalind W. Picard, Daniel Jonathan McDuff, and formerly: Affectiva and Forbes

    This work builds on our earlier work with FaceSense, created to help automate the understanding of facial expressions, both cognitive and affective. The FaceSense system has now been made available commercially by Media Lab spinoff Affectiva as Affdex. In this work we present the first project analyzing facial expressions at scale over the Internet. The interface analyzes the participants' smile intensity as they watch popular commercials. They can compare their responses to an aggregate from the larger population. The system also allows us to crowd-source data for training expression recognition systems and to gain better understanding of facial expressions under natural at-home viewing conditions instead of in traditional lab settings.

  • Fathom: Probabilistic Graphical Models to Help Mental Health Counselors

    Karthik Dinakar, Jackie Chen, Henry A. Lieberman, and Rosalind W. Picard

    We explore advanced machine learning and reflective user interfaces to scale the national Crisis Text Line. We are using state-of-the-art probabilistic graphical topic models and visualizations to help a mental health counselor extract patterns of mental health issues experienced by participants, and bring large-scale data science to understanding the distribution of mental health issues in the United States.
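
    As a small illustration of the topic-modeling component, the sketch below (assuming scikit-learn) fits a basic latent Dirichlet allocation model to a few hypothetical, de-identified texts; Fathom's probabilistic graphical models are richer than this.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Hypothetical, de-identified example texts standing in for real conversations.
        texts = [
            "feeling anxious about school and sleep",
            "argument with family, feeling alone",
            "stress at work and trouble sleeping",
        ]

        vectorizer = CountVectorizer(stop_words="english")
        counts = vectorizer.fit_transform(texts)

        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        doc_topics = lda.fit_transform(counts)          # per-document topic proportions

        # Top words per topic, e.g., for a counselor-facing visualization.
        terms = vectorizer.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = [terms[i] for i in topic.argsort()[-3:]]
            print("topic %d:" % k, ", ".join(top))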

  • FEEL: A Cloud System for Frequent Event and Biophysiological Signal Labeling

    Yadid Ayzenberg and Rosalind W. Picard

    The wide availability of low-cost, wearable, biophysiological sensors enables us to measure how the environment and our experiences impact our physiology. This creates a new challenge: in order to interpret the collected longitudinal data, we require the matching contextual information as well. Collecting weeks, months, and years of continuous biophysiological data makes it unfeasible to rely solely on our memory for providing the contextual information. Many view maintaining journals as burdensome, which may result in low compliance levels and unusable data. We present an architecture and implementation of a system for the acquisition, processing, and visualization of biophysiological signals and contextual information.

  • Gesture Guitar

    Rosalind W. Picard, Rob Morris and Tod Machover

    Emotions are often conveyed through gesture. Instruments that respond to gestures offer musicians new, exciting modes of musical expression. This project gives musicians wireless, gestural-based control over guitar effects parameters.

  • Got Sleep?

    Akane Sano, Rosalind W. Picard

    Got Sleep? is an Android application that helps people become aware of their sleep-related behavioral patterns and learn how to change their behaviors to improve their sleep. The application evaluates people's sleep habits before they start using the app, tracks day and night behaviors, and provides feedback about what kinds of behavior changes they should make and whether improvement has been achieved.

  • IDA: Inexpensive Networked Digital Stethoscope

    Yadid Ayzenberg

    Complex and expensive medical devices are mainly used in medical facilities by health professionals. IDA is an attempt to disrupt this paradigm and introduce a new type of device: easy to use, low cost, and open source. It is a digital stethoscope that can be connected to the Internet for streaming physiological data to remote clinicians. Designed to be fabricated anywhere in the world with minimal equipment, it can be operated by individuals without medical training.

  • Large-Scale Pulse Analysis

    Weixuan 'Vincent' Chen, Javier Hernandez Rivera, Akane Sano and Rosalind W. Picard

    This study aims to bring objective measurement to the multiple "pulse" and "pulse-like" measures made by practitioners of Traditional Chinese Medicine (TCM). The measures are traditionally made by manually palpating the patient's inner wrist in multiple places, and relating the sensed responses to various medical conditions. Our project brings several new kinds of objective measurement to this practice, compares their efficacy, and examines the connection of the measured data to various other measures of health and stress. Our approach includes the possibility of building a smartwatch application that can analyze stress and health information from the point of view of TCM.

  • Lensing: Cardiolinguistics for Atypical Angina

    Catherine Kreatsoulas (Harvard), Rosalind W. Picard, Karthik Dinakar, David Blei (Columbia) and Matthew Nock (Harvard)

    Conversations between two individuals are complex, whether between doctor and patient, mental health therapist and client, or two people romantically involved with each other. Each participant contributes to the conversation using her or his own "lens." This project involves advanced probabilistic graphical models to statistically extract and model these dual lenses across large datasets of real-world conversations, with applications that can improve crisis and psychotherapy counseling and patient-cardiologist consultations. We are working with top psychologists, cardiologists, and crisis counseling centers in the United States.

  • MACH: My Automated Conversation coacH

    M. Ehsan Hoque, Rosalind Picard

    MACH, My Automated Conversation coacH, is a system for people to practice social interactions in face-to-face scenarios. MACH consists of a 3D character that can "see," "hear," and make its own “decisions” in real time. The system was validated in the context of job interviews with 90 MIT undergraduate students. Students who interacted with MACH demonstrated significant performance improvement compared to the students in the control group. We are currently expanding this technology to open up new possibilities in behavioral health (e.g., treating people with Asperger syndrome, social phobia, PTSD) as well as designing new interaction paradigms in human-computer interaction and robotics.

  • Making Engaging Concerts

    Rosalind W. Picard and Elliott Hedman

    Working with the New World Symphony, we measured participants' skin conductance as they attended a classical concert for the first time. With the sensor technology, we noted times when the audience reacted or engaged with the music and other times when they became bored and drifted away. Our overall findings suggest that transitions, familiarity, and visual supplements can make concerts accessible and exciting for new concertgoers. We hope this work can help entertainment industries better connect with their customers and refine the presentation of their work so that it can best be received by a more diverse audience.

  • Mapping the Stress of Medical Visits

    Rosalind W. Picard and Elliott Hedman

    Receiving a shot or discussing health problems can be stressful, but it does not always have to be. We measure participants' skin conductance as they use medical devices or visit hospitals and note times when stress occurs. We then prototype possible solutions and record how the emotional experience changes. We hope work like this will help bring the medical community closer to their customers.

  • Measuring Arousal During Therapy for Children with Autism and ADHD

    Rosalind W. Picard and Elliott Hedman

    Physiological arousal is an important part of occupational therapy for children with autism and ADHD, but therapists do not have a way to objectively measure how therapy affects arousal. We hypothesize that when children participate in guided activities within an occupational therapy setting, informative changes in electrodermal activity (EDA) can be detected using iCalm. iCalm is a small, wireless sensor that measures EDA and motion, worn on the wrist or above the ankle. Statistical analysis describing how equipment affects EDA was inconclusive, suggesting that many factors play a role in how a child's EDA changes. Case studies provided examples of how occupational therapy affected children's EDA. This is the first study of the effects of occupational therapy's in situ activities using continuous physiologic measures. The results suggest that careful case study analyses of the relation between therapeutic activities and physiological arousal may inform clinical practice.

  • Mobile Health Interventions for Drug Addiction and PTSD

    Rich Fletcher and Rosalind W. Picard

    We are developing a mobile phone-based platform to assist people with chronic diseases, panic-anxiety disorders, or addictions. Making use of wearable, wireless biosensors, the mobile phone uses pattern analysis and machine learning algorithms to detect specific physiological states and perform automatic interventions in the form of text/images plus sound files and social networking elements. We are currently working with the Veterans Administration drug rehabilitation program involving veterans with PTSD.

  • Mobisensus: Predicting Your Stress/Mood from Mobile Sensor Data

    Akane Sano and Rosalind Picard

    Can we recognize stress, mood, and health conditions from wearable sensors and mobile-phone usage data? We analyze long-term, multi-modal physiological, behavioral, and social data (electrodermal activity, skin temperature, accelerometer, phone usage, social network patterns) collected in daily life with wearable sensors and mobile phones to extract biomarkers related to health conditions, interpret inter-individual differences, and develop systems to keep people healthy.

  • Modulating Peripheral and Cortical Arousal Using a Musical Motor Response Task

    Grace Leslie, Rosalind Picard, Simon Lui, Annabel Chen

    We are conducting EEG studies to identify the musical features and musical interaction patterns that universally impact measures of arousal. We hypothesize that we can induce states of high and low arousal using electrodermal activity (EDA) biofeedback, and that these states will produce correlated differences in concurrently recorded skin conductance and EEG data, establishing a connection between peripherally recorded physiological arousal and cortical arousal as revealed in EEG. We also hypothesize that manipulating musical features of a computer-generated musical stimulus track will produce changes in peripheral and cortical arousal. These musical stimuli and programmed interactions may be incorporated into music therapy technology designed to reduce arousal or increase learning capability by increasing attention. We aim to provide a framework for the neural basis of emotion-cognition integration in learning, which may shed light on education and on applications that improve learning through emotion regulation.

  • Multimodal Computational Behavior Analysis

    David Forsyth (UIUC), Gregory Abowd (GA Tech), Jim Rehg (GA Tech), Shri Narayanan (USC), Matthew Goodwin (NEU), Rosalind W. Picard, Javier Hernandez Rivera, Micah Eckhardt, Stan Scarloff (BU) and Takeo Kanade (CMU)

    This project will define and explore a new research area we call Computational Behavior Science: integrated technologies for multimodal computational sensing and modeling to capture, measure, analyze, and understand human behaviors. Our motivating goal is to revolutionize diagnosis and treatment of behavioral and developmental disorders. Our thesis is that emerging sensing and interpretation capabilities in vision, audition, and wearable computing technologies, when further developed and properly integrated, will transform this vision into reality. More specifically, we hope to: (1) enable widespread autism screening by allowing non-experts to easily collect high-quality behavioral data and perform initial assessment of risk status; (2) improve behavioral therapy through increased availability and improved quality, by making it easier to track the progress of an intervention and follow guidelines for maximizing learning progress; and (3) enable longitudinal analysis of a child's development based on quantitative behavioral data, using new tools for visualization.

  • Objective Assessment of Depression and Its Improvement

    Rosalind W. Picard, Szymon Fedor, Brigham and Women's Hospital and Massachusetts General Hospital

    Current methods to assess depression and then ultimately select appropriate treatment have many limitations. They are usually based on clinician-rated scales that were developed in the 1960s. Their main drawbacks are lack of objectivity, being symptom-based rather than preventative, and requiring accurate communication. This work explores new technology to assess depression, including its increase or decrease, in an automatic, more objective, pre-symptomatic, and cost-effective way, using wearable sensors and smartphones for 24/7 monitoring of different personal parameters such as physiological data, voice characteristics, sleep, and social interaction. We aim to enable early diagnosis of depression, prevention of depression, assessment of depression for people who cannot communicate, better assignment of a treatment, early detection of treatment remission and response, and anticipation of post-treatment relapse or recovery.

  • Panoply

    Rosalind W. Picard and Robert Morris

    Panoply is a crowdsourcing application for mental health and emotional wellbeing. The platform offers a novel approach to computer-based psychotherapy, one that is optimized for accessibility, engagement, and therapeutic efficacy. A three-week randomized-controlled trial with 166 participants compared Panoply to an active control task (online expressive writing). Panoply conferred greater or equal benefits for nearly every therapeutic outcome measure. Panoply also significantly outperformed the control task on all measures of engagement.

  • PongCam

    Rosalind Picard, Juliana Cherston, and Natasha Jaques

    PongCam is a wellbeing project that enables Media Lab ping pong players to save videos of their best ping pong shots on YouTube. The device constantly captures footage of the ping pong table, storing the most recent footage in a buffer. After a good shot, a player can hit a big red button and the last 30 seconds of footage will be uploaded to the PongCam highlights reel on YouTube. We observe how devices of this sort promote mental and physical wellbeing in the Lab.
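
    A minimal sketch of the buffering idea, assuming OpenCV: keep roughly the last 30 seconds of frames in a ring buffer and write them to disk when the button is pressed (here the spacebar stands in for the red button, and the upload-to-YouTube step is omitted).

        import collections
        import cv2

        FPS = 30
        BUFFER_SECONDS = 30

        cap = cv2.VideoCapture(0)
        frames = collections.deque(maxlen=FPS * BUFFER_SECONDS)   # ring buffer of the most recent frames

        def save_highlight(path="highlight.avi"):
            """Dump the buffered footage to disk; a separate step would upload it."""
            if not frames:
                return
            h, w = frames[0].shape[:2]
            writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"XVID"), FPS, (w, h))
            for frame in frames:
                writer.write(frame)
            writer.release()

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames.append(frame)
            cv2.imshow("PongCam", frame)
            key = cv2.waitKey(1) & 0xFF
            if key == ord(" "):          # spacebar stands in for the physical red button
                save_highlight()
            elif key == ord("q"):
                break
        cap.release()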

  • Predicting Students' Wellbeing from Physiology, Phone, Mobility, and Behavioral Data

    Natasha Jaques, Sara Taylor, Akane Sano, and Rosalind Picard

    The goal of this project is to apply machine learning methods to model the wellbeing of MIT undergraduate students. Extensive data is obtained from the SNAPSHOT study, which monitors students on a 24/7 basis, collecting their location, smartphone logs, sleep schedule, phone and SMS communications, academics, social networks, and even physiological markers like skin conductance, skin temperature, and acceleration. We extract features from this data and apply a variety of machine learning algorithms including Multiple Kernel Learning, Gaussian Mixture Models, and Transfer Learning, among others. Interesting findings include: when participants visit novel locations they tend to be happier; when they use their phones or stay indoors for long periods they tend to be unhappy; and when several dimensions of wellbeing (including stress, happiness, health, and energy) are learned together, classification accuracy improves.
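
    A minimal sketch of the general modeling setup, assuming scikit-learn: daily feature vectors, a binary wellbeing label, and cross-validated classification. The data below are random placeholders, and the study's actual algorithms (e.g., multiple kernel learning) are not reproduced here.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Hypothetical daily feature vectors (location novelty, screen time, sleep duration,
        # skin conductance statistics, ...) and next-day happiness labels.
        X = np.random.randn(300, 12)
        y = np.random.randint(0, 2, size=300)

        model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        scores = cross_val_score(model, X, y, cv=5)
        print("mean accuracy: %.2f" % scores.mean())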

  • Real-Time Assessment of Suicidal Thoughts and Behaviors

    Rosalind W. Picard, Szymon Fedor, Harvard and Massachusetts General Hospital

    Depression, often coupled with anxiety, is one of the key factors leading to suicidal behavior, which is among the leading causes of death worldwide. Despite the scope and seriousness of suicidal thoughts and behaviors, we know surprisingly little about what suicidal thoughts look like in nature (e.g., How frequent, intense, and persistent are they among those who have them? What cognitive, affective/physiological, behavioral, and social factors trigger their occurrence?). The reason for this lack of information is that, historically, researchers have used retrospective self-report to measure suicidal thoughts, and have lacked the tools to measure them as they naturally occur. In this work we explore the use of wearable devices and smartphones to identify behavioral, affective, and physiological predictors of suicidal thoughts and behaviors.

  • Reinventing the Retail Experience

    Elliott Hedman and Rosalind W. Picard

    With skin conductance sensors, we map out what frustrates and excites customers as they shop—from layout to wanting to touch the product. Our work has helped a variety of large retailers innovate on what it means to shop. Findings have focused on reducing the stress of choices and learning while surprising customers in new ways. With the sensor technology we can pinpoint moments when customers are overwhelmed and then build out new ways to make retail engaging again.

  • SenseGlass: Using Google Glass to Sense Daily Emotions

    Rosalind W. Picard and Javier Hernandez Rivera

    For over a century, scientists have studied human emotions in laboratory settings. However, these emotions have been largely contrived, elicited by movies or fake “lab” stimuli, which tend not to matter to the participants in the studies, at least not compared with events in their real lives. This work explores the utility of Google Glass, a head-mounted wearable device, to enable fundamental advances in the creation of affect-based user interfaces in natural settings.

  • SmileTracker

    Natasha Jaques, Weixuan 'Vincent' Chen and Rosalind Picard

    SmileTracker is a system designed to capture naturally occurring instances of positive emotion during the course of normal interaction with a computer. A facial expression recognition algorithm is applied to images captured with the user's webcam. When the user smiles, both a photo and a screenshot are recorded and saved to the user's profile for later review. Based on positive psychology research, we hypothesize that the act of reviewing content that led to smiles will improve positive affect, and consequently, overall wellbeing.
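
    A minimal sketch of the capture loop, assuming OpenCV and pyautogui are available; the cascade-based smile detector and the five-second cool-down are illustrative stand-ins for the system's actual expression-recognition algorithm.

        import time
        import cv2
        import pyautogui   # used only for the screenshot; any screen-capture library would do

        face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
                roi = gray[y:y + h, x:x + w]
                if len(smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)) > 0:
                    stamp = int(time.time())
                    cv2.imwrite("smile_%d.jpg" % stamp, frame)     # photo of the smiling user
                    pyautogui.screenshot("screen_%d.png" % stamp)  # what was on screen at the time
                    time.sleep(5)                                  # avoid saving bursts of near-duplicates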

  • SNAPSHOT Expose

    Miriam Zisook, Sara Taylor, Akane Sano and Rosalind Picard

    In this project, we apply what we have learned from the SNAPSHOT study to the problem of changing behavior. We explore the design of user-centered tools that can harness the experience of collecting and reflecting on personal data to promote healthy behaviors, including stress management and sleep regularity. We will draw on commonly used theories of behavior change as the inspiration for distinct conceptual designs for a behavior change application based on the SNAPSHOT study. This approach will enable us to compare the types of visualization strategies that are most meaningful and useful for acting on each theory.

  • SNAPSHOT Study

    Akane Sano, Amy Yu, Sara Taylor, Cesar Hidalgo and Rosalind Picard

    The SNAPSHOT study seeks to measure Sleep, Networks, Affect, Performance, Stress, and Health using Objective Techniques. It is an NIH-funded collaborative research project between the Affective Computing group, the Macro Connections group, and Harvard Medical School's Brigham and Women's Hospital. We have been running this study since fall 2013, collecting one month of data each semester from 50 socially connected MIT undergraduate students. We have collected data from about 170 participants, totaling over 5,000 days of data. We measure physiological, behavioral, environmental, and social data using mobile phones, wearable sensors, surveys, and lab studies. We investigate how daily behaviors and social connectivity influence sleep behaviors, health, and outcomes such as mood, stress, and academic performance. Using this multimodal data, we are developing models to predict onsets of sadness and stress. This study will provide insights into behavioral choices for wellbeing and performance.

  • StoryScape

    Rosalind W. Picard and Micah Eckhardt

    Stories, language, and art are at the heart of StoryScape. While StoryScape began as a tool to meet the challenging language learning needs of children diagnosed with autism, it has become much more. StoryScape was created to be the first truly open and customizable platform for creating animated, interactive storybooks that can interact with the physical world. Download the Android app: and make your own amazing stories at

  • The Challenge

    Natasha Jaques, Niaja Farve, Pattie Maes and Rosalind W. Picard

    Individuals who work in sedentary occupations are at increased risk of a number of serious health consequences. This project involves both a tool and an experiment aimed at decreasing sedentary activity and promoting social connections among members of the MIT Media Lab. Our system will ask participants to sign up for short physical challenges (ping pong, foosball, walking) and pair them with a partner to perform the activity. Participants' overall activity levels will be monitored with an activity tracker during the course of the study to assess the effectiveness of the system.

  • Tributary

    Yadid Ayzenberg, Rosalind Picard

    The proliferation of smartphones and wearable sensors is creating very large data sets that may contain useful information. However, the magnitude of generated data creates new challenges as well. Processing and analyzing these large data sets in an efficient manner requires computational tools. Many of the traditional analytics tools are not optimized for dealing with large datasets. Tributary is a parallel engine for searching and analyzing sensor data. The system utilizes large clusters of commodity machines to enable in-memory processing of sensor time-series signals, making it possible to search through billions of samples in seconds. Users can access a rich library of statistics and digital signal processing functions or write their own in a variety of languages.
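
    Tributary runs on clusters of machines; as a much smaller illustration of the same idea, the sketch below (Python standard library plus NumPy) splits an in-memory time series into chunks and searches them in parallel with a process pool.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def find_exceedances(args):
            """Return global indices where a chunk of the signal exceeds a threshold."""
            chunk, offset, threshold = args
            return (np.nonzero(chunk > threshold)[0] + offset).tolist()

        def parallel_search(signal, threshold, n_workers=4):
            chunks = np.array_split(signal, n_workers)
            offsets = np.cumsum([0] + [len(c) for c in chunks[:-1]])
            with ProcessPoolExecutor(max_workers=n_workers) as pool:
                results = pool.map(find_exceedances, [(c, o, threshold) for c, o in zip(chunks, offsets)])
            return [i for part in results for i in part]

        if __name__ == "__main__":
            signal = np.random.randn(10_000_000)        # stand-in for a long sensor time series
            print(len(parallel_search(signal, threshold=4.0)))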

  • Unlocking Sleep

    Rosalind W. Picard, Thariq Shihipar and Sara Taylor

    Despite a vast body of knowledge about the importance of sleep, our daily schedules are often planned around work and social events, not healthy sleep. While we're prompted throughout the day by devices and people to plan and think about our schedules in terms of things to do, sleep is rarely considered until we're tired and it's late. This project proposes a way that our everyday use of technology can help improve sleep habits. Smartphone unlock screens are an unobtrusive way of prompting user reflection throughout the day by posing "microquestions" as users unlock their phones. The questions are easily answered with a single swipe. Since we unlock our phones 50 to 200 times per day, microquestions can collect information with minimal intrusion into the user's daily life. Can these swipe-questions help users mentally plan their day around sleep, and trigger healthier sleep behaviors?

  • Valinor: Mathematical Models to Understand and Predict Self-Harm

    Rosalind W. Picard, Karthik Dinakar, Eric Horvitz (Microsoft Research) and Matthew Nock (Harvard)

    We are developing statistical tools for understanding, modeling, and predicting self-harm by using advanced probabilistic graphical models and fail-soft machine learning in collaboration with Harvard University and Microsoft Research.

  • Wavelet-Based Motion Artifact Removal for Electrodermal Activity

    Weixuan 'Vincent' Chen, Natasha Jaques, Sara Taylor, Akane Sano, Szymon Fedor and Rosalind W. Picard

    Electrodermal activity (EDA) recording is a powerful, widely used tool for monitoring psychological or physiological arousal. However, analysis of EDA is hampered by its sensitivity to motion artifacts. We propose a method for removing motion artifacts from EDA, measured as skin conductance (SC), using a stationary wavelet transform (SWT). We modeled the wavelet coefficients as a Gaussian mixture distribution corresponding to the underlying skin conductance level (SCL) and skin conductance responses (SCRs). The goodness-of-fit of the model was validated on ambulatory SC data. We evaluated the proposed method in comparison with three previous approaches. Our method achieved a greater reduction of artifacts while retaining motion-artifact-free data.
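
    A minimal sketch of the stationary wavelet transform step, assuming PyWavelets: decompose the skin conductance signal, attenuate detail coefficients that are implausibly large (a simple stand-in for the paper's Gaussian mixture modeling), and reconstruct.

        import numpy as np
        import pywt

        def swt_denoise(sc, wavelet="haar", level=4, k=3.0):
            """Attenuate large wavelet detail coefficients in a skin conductance (SC) signal."""
            n = len(sc)
            pad = (-n) % (2 ** level)                      # SWT needs a length divisible by 2**level
            padded = np.pad(sc, (0, pad), mode="edge")
            coeffs = pywt.swt(padded, wavelet, level=level)
            cleaned = []
            for cA, cD in coeffs:
                sigma = np.median(np.abs(cD)) / 0.6745     # robust noise-scale estimate
                cD = np.clip(cD, -k * sigma, k * sigma)    # shrink likely artifact coefficients
                cleaned.append((cA, cD))
            return pywt.iswt(cleaned, wavelet)[:n]

        # Example: a slow SC drift plus a sharp, motion-like spike (4 Hz sampling).
        t = np.linspace(0, 60, 4 * 60)
        sc = 2 + 0.2 * np.sin(0.1 * t)
        sc[100:104] += 1.5                                 # simulated motion artifact
        print(np.max(np.abs(swt_denoise(sc) - (2 + 0.2 * np.sin(0.1 * t)))))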