Affective Computing
Advancing wellbeing using new ways to communicate, understand, and respond to emotion.
The Affective Computing group aims to bridge the gap between human emotions and computational technology. Current research addresses machine recognition and modeling of human emotional expression, including the invention of new software and hardware tools that help people gather, communicate, and express emotional information, as well as tools that help people better manage and understand how emotion impacts health, social interaction, learning, memory, and behavior. Our projects are diverse: from inventing ways to help people who face communication and emotion-regulation challenges; to enabling customers to give rich emotional feedback; to quantifying patterns of autonomic activity (core emotional physiology) during seizures, stress-related disorders, and sleep.

Research Projects

  • Auditory Desensitization Games

    Rosalind W. Picard, Matthew Goodwin and Rob Morris

    Persons on the autism spectrum often report hypersensitivity to sound. Efforts have been made to manage this condition, but there is considerable room for improvement. One approach, exposure therapy, has promise, and a recent study showed that it helped several individuals diagnosed with autism overcome their sound sensitivities. In this project, we borrow principles from exposure therapy and use fun, engaging games to help individuals gradually get used to sounds they might ordinarily find frightening or painful.

  • AutoEmotive: Bringing Empathy to the Driving Experience

    Pattie Maes, Rosalind W. Picard, Judith Amores Fernandez, Xavier Benavides Palos, Javier Hernandez Rivera and Daniel Jonathan McDuff

    Regardless of the emotional state of drivers, current cars feel impassive and disconnected. We believe that by adding emotion-sensing technologies inside the car, we can dramatically improve the driving experience while increasing the safety of drivers. This work explores the landscape of possible applications when incorporating stress-sensing devices in the car.

  • Automatic Stress Recognition in Real-Life Settings

    Rosalind W. Picard, Robert Randall Morris and Javier Hernandez Rivera

    Technologies that automatically recognize stress are extremely important for preventing chronic psychological stress and the pathophysiological risks associated with it. The introduction of comfortable, wearable biosensors has created new opportunities to measure stress in real-life environments, but there is often great variability in how people experience stress and how they express it physiologically. In this project, we modify the loss function of Support Vector Machines to encode a person's tendency to feel more or less stressed, and to give more importance to the training samples of the most similar subjects. These changes are validated in a case study in which skin conductance was monitored in nine call center employees during one week of their regular work. Employees in this type of setting usually handle high volumes of calls every day, and they frequently interact with angry and frustrated customers, which can lead to high stress levels.
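
    As a rough illustration of the sample-weighting idea (not the project's exact loss-function modification), scikit-learn's SVC accepts per-sample weights, so training examples from subjects most similar to the target person can be emphasized; the data, subject count, and similarity scores below are synthetic placeholders.

```python
# Sketch: person-specific sample weighting for an SVM stress classifier.
# All data here is synthetic; features stand in for skin-conductance statistics.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Training data pooled from several "source" subjects: feature vectors and stress labels.
X_train = rng.normal(size=(300, 5))
y_train = rng.integers(0, 2, size=300)
subject_ids = rng.integers(0, 8, size=300)        # which subject each sample came from

# Hypothetical similarity of each source subject to the test person
# (e.g., derived from baseline physiology); higher means more similar.
similarity = rng.uniform(0.2, 1.0, size=8)

# Emphasize training samples from the most similar subjects.
sample_weight = similarity[subject_ids]

clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train, sample_weight=sample_weight)

X_test = rng.normal(size=(10, 5))
print(clf.predict(X_test))
```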

  • Autonomic Nervous System Activity in Epilepsy

    Rosalind W. Picard and Ming-Zher Poh

    We are performing long-term measurements of autonomic nervous system (ANS) activity on patients with epilepsy. In certain cases, autonomic symptoms are known to precede seizures. Usually in our data, the autonomic changes start when the seizure appears in the EEG, and they can be measured with a wristband (much easier to wear every day than an EEG). We found that the larger the signal we measure on the wrist, the longer the duration of cortical brain-wave suppression following the seizure. The duration of the latter is a strong candidate for a biomarker for SUDEP (Sudden Unexpected Death in Epilepsy), and we are working with scientists and doctors to better understand this. In addition, bilateral changes in ANS activity may provide valuable information regarding seizure focus localization and semiology.

  • BioGlass: Physiological Parameter Estimation Using a Head-mounted Wearable Device

    Rosalind W. Picard, Javier Hernandez Rivera, James M. Rehg (Georgia Tech) and Yin Li (Georgia Tech)

    What if you could see what calms you down or increases your stress as you go through your day? What if you could see clearly what is causing these changes for your child or another loved one? People could become better at accurately interpreting and communicating their feelings, and better at understanding the needs of those they love. This work explores the possibility of using sensors embedded in Google Glass, a head-mounted wearable device, to robustly measure physiological signals of the wearer.
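
    A minimal sketch of one way a head-mounted inertial sensor could reveal cardiac activity; the sampling rate, filter band, and simulated trace below are assumptions for illustration, not the project's actual pipeline.

```python
# Sketch: estimating heart rate from subtle head motion, the kind of signal a
# head-mounted inertial sensor could capture. The trace below is simulated.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 50.0                                          # assumed IMU sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)
# Simulated gyroscope trace: a 1.2 Hz (72 bpm) cardiac component plus broadband noise.
gyro = 0.05 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.normal(size=t.size)

# Band-pass around plausible heart rates (0.7-3.0 Hz, i.e., 42-180 bpm).
b, a = butter(3, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")
cardiac = filtfilt(b, a, gyro)

# Count beats as peaks at least 0.4 s apart.
peaks, _ = find_peaks(cardiac, distance=int(0.4 * fs))
bpm = 60.0 * len(peaks) / (t[-1] - t[0])
print(f"Estimated heart rate: {bpm:.0f} bpm")      # ~72 bpm for the simulated trace
```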

  • Building the Just-Right-Challenge in Games and Toys

    Rosalind W. Picard and Elliott Hedman

    Working with the LEGO Group and Hasbro, we looked at the emotional experience of playing with games and LEGO bricks. We measured participants’ skin conductance as they learned to play with these new toys. By marking the stressful moments, we were able to see which moments in learning should be redesigned. Our findings suggest that framing is key: how can we help children recognize their achievements? We also saw how children are excited to take on new responsibilities but are quickly discouraged when they aren’t given the resources to succeed. Our hope for this work is that by using skin conductance sensors, we can help companies better understand the unique perspective of children and build experiences that fit them.

  • Cardiocam

    Ming-Zher Poh, Daniel McDuff and Rosalind W. Picard

    Cardiocam is a low-cost, non-contact technology for measurement of physiological signals such as heart rate and breathing rate using a basic digital imaging device such as a webcam. The ability to perform remote measurements of vital signs is promising for enhancing the delivery of primary healthcare.
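
    A toy sketch of the underlying signal-processing idea: the mean green-channel brightness of the face region fluctuates slightly with each pulse, and the dominant frequency in a plausible heart-rate band gives the pulse estimate. The frame rate, simulated trace, and band limits below are illustrative assumptions, not the Cardiocam implementation.

```python
# Sketch: recovering pulse rate from the mean green-channel brightness of a face
# region in webcam video. The per-frame values are simulated here; a real
# pipeline would read frames (e.g., with OpenCV) and track the face region.
import numpy as np

fps = 30.0                                         # assumed camera frame rate
t = np.arange(0, 20, 1 / fps)
rng = np.random.default_rng(2)
# Simulated mean green value per frame: slow lighting drift plus a tiny 1.1 Hz pulse.
green = (120 + 2 * np.sin(2 * np.pi * 0.05 * t)
         + 0.3 * np.sin(2 * np.pi * 1.1 * t)
         + 0.1 * rng.normal(size=t.size))

# Remove the slow drift, then find the dominant frequency in a plausible pulse band.
trend = np.convolve(green, np.ones(int(fps)) / int(fps), mode="same")
pulse = (green - trend) * np.hanning(t.size)
freqs = np.fft.rfftfreq(pulse.size, d=1 / fps)
spectrum = np.abs(np.fft.rfft(pulse))
band = (freqs > 0.7) & (freqs < 3.0)
bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(f"Estimated pulse: {bpm:.0f} bpm")           # ~66 bpm for the simulated trace
```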

  • College Sleep

    Akane Sano, Cesar Hidalgo and Rosalind Picard

    Sleep is critical to a wide range of biological functions; inadequate sleep results in impaired cognitive performance and mood, and adverse health outcomes including obesity, diabetes, and cardiovascular disease. Recent studies have shown that healthy and unhealthy sleep behaviors can be transmitted through social interactions between individuals within social networks. We investigate how social connectivity and light exposure influence sleep patterns, health, and performance. Using multimodal data collected from closely connected MIT/Harvard undergraduates with wearable sensors and mobile phones, we are developing statistical and multiscale mathematical models of sleep dynamics within social networks, grounded in sleep and circadian physiology. These models will provide insights into the emergent dynamics of sleep behaviors within social networks and allow us to test the effects of candidate strategies for intervening in populations with unhealthy sleep behaviors.

  • Digging into Brand Perception with Psychophysiology

    Elliott Hedman

    What do customers really think about your company or brand? Using skin conductance sensors, we measure what excites and frustrates customers when they discuss topics relevant to your brand. For example, with the National Campaign to Prevent Teenage Pregnancy, we saw that conversations about empowerment and abortion upset conservative families, while talking about the importance of strong families excited and engaged them. Rather than relying on self-reports, physiological measurements allow us to pinpoint which words and concepts affect your customers. We hope work like this will help companies better reflect on how their actions and messaging affect their customers’ opinions in more detailed and accurate ways.

  • Emotion Prototyping: Redesigning the Customer Experience

    Rosalind W. Picard and Elliott Hedman

    You can test whether a website is usable by making wireframes, but how do you know if that site, product, or store is emotionally engaging? We build quick, iterative environments where emotions can be tested and improved. Emphasis is on setting up the right motivation (users always have to buy what they pick), pressures (can you buy the laptop in 10 minutes?), and environment (competitors’ products had better be on the shelf too). Once we see where customers are stressed or miss the fun part, we change the space on a daily, iterative cycle. Within two to three weeks, we can tell how to structure a new offering for a great experience. Seldom do the emotions we hope to create happen on the first try; emotion prototyping delivers the experience we want. We hope to better understand the benefits of emotion prototyping, especially when using the skin conductance sensor.

  • Exploring Temporal Patterns of Smile

    Rosalind W. Picard and Mohammed Ehsanul Hoque

    A smile is a multi-purpose expression. We smile to express rapport, polite disagreement, delight, sarcasm, and often even frustration. Is it possible to develop computational models that distinguish among smiling instances when delighted, frustrated, or just being polite? In our ongoing work, we demonstrate that it is useful to explore how smile patterns evolve over time, and that while a smile may occur in both positive and negative situations, its dynamics may help disambiguate the underlying state.
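
    A minimal sketch of the kind of temporal summary such a model might use, applied to a frame-by-frame smile-intensity signal; the feature names, threshold, and example signals are illustrative assumptions rather than the project's actual features.

```python
# Sketch: summarizing the temporal dynamics of a smile-intensity signal, e.g.,
# frame-by-frame smile probability from a face tracker. Feature names, the 0.5
# threshold, and the example signals are illustrative only.
import numpy as np

def smile_dynamics(intensity, fps=30.0):
    """Return simple temporal features of a smile-intensity time series in [0, 1]."""
    intensity = np.asarray(intensity, dtype=float)
    above = intensity > 0.5                        # frames where the smile is "on"
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1)
    start = onsets[0] if onsets.size else 0
    return {
        "peak_intensity": float(intensity.max()),
        "rise_time_s": (int(np.argmax(intensity)) - int(start)) / fps,  # how quickly it builds
        "smile_fraction": float(above.mean()),     # proportion of time spent smiling
        "num_onsets": int(onsets.size),
    }

# Example: a gradual, sustained smile versus a quick, short-lived one.
t = np.linspace(0, 4, 120)                         # 4 seconds at 30 fps
gradual = np.clip(np.sin(np.pi * t / 4), 0, 1)
brief = np.exp(-((t - 0.5) ** 2) / 0.05)
print(smile_dynamics(gradual))
print(smile_dynamics(brief))
```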

  • Facial Expression Analysis Over the Web

    Rosalind W. Picard, Daniel Jonathan McDuff, and formerly: Affectiva and Forbes

    This work builds on our earlier work with FaceSense, created to help automate the understanding of facial expressions, both cognitive and affective. The FaceSense system has now been made available commercially by Media Lab spinoff Affectiva as Affdex. In this work we present the first project analyzing facial expressions at scale over the Internet. The interface analyzes participants' smile intensity as they watch popular commercials, and participants can compare their responses to an aggregate from the larger population. The system also allows us to crowdsource data for training expression-recognition systems and to gain a better understanding of facial expressions under natural at-home viewing conditions instead of in traditional lab settings.

  • Fathom: Probabilistic Graphical Models to Help Mental Health Counselors

    Karthik Dinakar, Jackie Chen, Henry A. Lieberman, and Rosalind W. Picard

    We explore advanced machine learning and reflective user interfaces to scale the national Crisis Text Line. We are using state-of-the-art probabilistic graphical topic models and visualizations to help mental health counselors, to extract patterns of mental health issues experienced by participants, and to bring large-scale data science to understanding the distribution of mental health issues in the United States.
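
    As a small, hedged illustration of topic modeling on counseling-style text (not Fathom's actual models or data), latent Dirichlet allocation can surface recurring themes and give each conversation a topic mixture; the example texts and topic count below are invented.

```python
# Sketch: latent Dirichlet allocation over counseling-style text, in the spirit of
# the probabilistic topic models described above. The texts and topic count are
# invented placeholders, not Crisis Text Line data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

conversations = [
    "cant sleep anxious about exams and family pressure",
    "feeling alone no one to talk to at school",
    "fight with parents again feel trapped at home",
    "panic attacks before class heart racing",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(conversations)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)             # per-conversation topic mixtures

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
print(doc_topics.round(2))                         # what a counselor-facing view could summarize
```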

  • FEEL: A Cloud System for Frequent Event and Biophysiological Signal Labeling

    Yadid Ayzenberg and Rosalind W. Picard

    The wide availability of low-cost, wearable, biophysiological sensors enables us to measure how the environment and our experiences impact our physiology. This creates a new challenge: in order to interpret the collected longitudinal data, we require the matching contextual information as well. Collecting weeks, months, and years of continuous biophysiological data makes it unfeasible to rely solely on our memory for providing the contextual information. Many view maintaining journals as burdensome, which may result in low compliance levels and unusable data. We present an architecture and implementation of a system for the acquisition, processing, and visualization of biophysiological signals and contextual information.
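
    A sketch of the core idea, under simplified assumptions: annotate segments of a continuous biophysiological stream with contextual events (such as calendar entries) matched by timestamp. The data structures below are illustrative, not the FEEL system's actual schema.

```python
# Sketch: labeling sensor samples with overlapping contextual events by timestamp.
# Illustrative data structures only; not the FEEL system's schema.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    label: str
    start: datetime
    end: datetime

@dataclass
class Sample:
    time: datetime
    eda_microsiemens: float

def label_samples(samples, events):
    """Attach the label of any overlapping event to each sensor sample."""
    labeled = []
    for s in samples:
        context = next((e.label for e in events if e.start <= s.time <= e.end), None)
        labeled.append((s.time, s.eda_microsiemens, context))
    return labeled

day = datetime(2014, 3, 10, 9, 0)
events = [Event("team meeting", day, day + timedelta(hours=1)),
          Event("commute", day + timedelta(hours=8), day + timedelta(hours=9))]
samples = [Sample(day + timedelta(minutes=15 * i), 0.5 + 0.01 * i) for i in range(40)]
for row in label_samples(samples, events)[:5]:
    print(row)
```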

  • Gesture Guitar

    Rosalind W. Picard, Rob Morris and Tod Machover

    Emotions are often conveyed through gesture. Instruments that respond to gestures offer musicians new, exciting modes of musical expression. This project gives musicians wireless, gesture-based control over guitar effects parameters.

  • Got Sleep?

    Akane Sano, Rosalind W. Picard

    Got Sleep? is an Android application that helps people become aware of their sleep-related behavioral patterns and learn how they might change their behaviors to improve their sleep. The application evaluates people’s sleep habits before they start using the app, tracks day and night behaviors, and provides feedback about what kinds of behavior changes to make and whether improvements have been achieved.

  • IDA: Inexpensive Networked Digital Stethoscope

    Yadid Ayzenberg

    Complex and expensive medical devices are mainly used in medical facilities by health professionals. IDA is an attempt to disrupt this paradigm and introduce a new type of device: easy to use, low cost, and open source. It is a digital stethoscope that can be connected to the Internet for streaming physiological data to remote clinicians. Designed to be fabricated anywhere in the world with minimal equipment, it can be operated by individuals without medical training.

  • MACH: My Automated Conversation coacH

    M. Ehsan Hoque, Rosalind Picard

    MACH, My Automated Conversation coacH, is a system for people to practice social interactions in face-to-face scenarios. MACH consists of a 3D character that can “see,” “hear,” and make its own “decisions” in real time. The system was validated in the context of job interviews with 90 MIT undergraduate students. Students who interacted with MACH demonstrated significant performance improvement compared to the students in the control group. We are currently expanding this technology to open up new possibilities in behavioral health (e.g., treating people with Asperger syndrome, social phobia, PTSD) as well as designing new interaction paradigms in human-computer interaction and robotics.

  • Making Engaging Concerts

    Rosalind W. Picard and Elliott Hedman

    Working with the New World Symphony, we measured participants’ skin conductance as they attended a classical concert for the first time. With the sensor technology, we noted times when the audience reacted to or engaged with the music, and other times when they became bored and drifted away. Our overall findings suggest that transitions, familiarity, and visual supplements can make concerts accessible and exciting for new concertgoers. We hope this work can help entertainment industries better connect with their customers and refine the presentation of their work so that it can best be received by a more diverse audience.

  • Mapping the Stress of Medical Visits

    Rosalind W. Picard and Elliott Hedman

    Receiving a shot or discussing health problems can be stressful, but it does not always have to be. We measure participants' skin conductance as they use medical devices or visit hospitals, and we note the times when stress occurs. We then prototype possible solutions and record how the emotional experience changes. We hope work like this helps bring the medical community closer to their customers.

  • Measuring Arousal During Therapy for Children with Autism and ADHD

    Rosalind W. Picard and Elliott Hedman

    Physiological arousal is an important part of occupational therapy for children with autism and ADHD, but therapists do not have a way to objectively measure how therapy affects arousal. We hypothesize that when children participate in guided activities within an occupational therapy setting, informative changes in electrodermal activity (EDA) can be detected using iCalm. iCalm is a small, wireless sensor that measures EDA and motion, worn on the wrist or above the ankle. Statistical analysis describing how equipment affects EDA was inconclusive, suggesting that many factors play a role in how a child’s EDA changes. Case studies provided examples of how occupational therapy affected children’s EDA. This is the first study of the effects of occupational therapy’s in situ activities using continuous physiological measures. The results suggest that careful case-study analyses of the relation between therapeutic activities and physiological arousal may inform clinical practice.

  • Mobile Health Interventions for Drug Addiction and PTSD

    Rich Fletcher and Rosalind W. Picard

    We are developing a mobile phone-based platform to assist people with chronic diseases, panic-anxiety disorders, or addictions. Making use of wearable, wireless biosensors, the mobile phone uses pattern analysis and machine learning algorithms to detect specific physiological states and perform automatic interventions in the form of text/images plus sound files and social networking elements. We are currently working with the Veterans Administration drug rehabilitation program involving veterans with PTSD.
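
    A deliberately simplified sketch of the detect-then-intervene loop described above; the detector, thresholds, and message text are placeholders rather than the platform's actual algorithms.

```python
# Sketch of a detect-then-intervene loop: when a simple detector flags a target
# physiological state, the phone delivers a pre-authored supportive message.
# The detector, thresholds, and message text are placeholders.
def detect_state(eda_microsiemens, heart_rate_bpm):
    """Toy detector: flag a possible panic or craving episode from two features."""
    return eda_microsiemens > 8.0 and heart_rate_bpm > 110

def intervene(send_message):
    send_message("Take a slow breath. Tap to open your coping plan or call a supporter.")

# Simulated readings streamed from a wearable sensor.
readings = [(3.2, 72), (9.1, 118), (4.0, 80)]
for eda, heart_rate in readings:
    if detect_state(eda, heart_rate):
        intervene(print)
```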

  • Mobisensus: Predicting Your Stress/Mood from Mobile Sensor Data

    Akane Sano and Rosalind Picard

    Can we recognize stress, mood, and health conditions from wearable sensors and mobile-phone usage data? We analyze long-term, multimodal physiological, behavioral, and social data (electrodermal activity, skin temperature, accelerometer, phone usage, social network patterns) collected in daily life with wearable sensors and mobile phones to extract biomarkers related to health conditions, interpret inter-individual differences, and develop systems to keep people healthy.
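
    A hedged sketch of how such multimodal daily features might feed a simple predictive model; the features, labels, and data below are synthetic stand-ins, not the project's measurements or models.

```python
# Sketch: combining daily features from wearable and phone data to predict a
# self-reported stress label. All values are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_days = 120
features = np.column_stack([
    rng.normal(6.5, 1.0, n_days),                  # sleep duration (h)
    rng.normal(4.0, 2.0, n_days),                  # mean electrodermal activity (uS)
    rng.poisson(80, n_days),                       # phone screen-on events per day
    rng.normal(32.5, 0.5, n_days),                 # mean skin temperature (C)
])
# Synthetic daily label standing in for a self-reported "high stress" day.
stressed = (features[:, 0] < 6.0) | (features[:, 2] > 90)

model = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(model, features, stressed, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```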

  • Multimodal Computational Behavior Analysis

    David Forsyth (UIUC), Gregory Abowd (GA Tech), Jim Rehg (GA Tech), Shri Narayanan (USC), Matthew Goodwin (NEU), Rosalind W. Picard, Javier Hernandez Rivera, Micah Eckhardt, Stan Sclaroff (BU) and Takeo Kanade (CMU)

    This project will define and explore a new research area we call Computational Behavior Science: integrated technologies for multimodal computational sensing and modeling to capture, measure, analyze, and understand human behaviors. Our motivating goal is to revolutionize diagnosis and treatment of behavioral and developmental disorders. Our thesis is that emerging sensing and interpretation capabilities in vision, audition, and wearable computing technologies, when further developed and properly integrated, will transform this vision into reality. More specifically, we hope to: (1) enable widespread autism screening by allowing non-experts to easily collect high-quality behavioral data and perform initial assessment of risk status; (2) improve behavioral therapy through increased availability and improved quality, by making it easier to track the progress of an intervention and follow guidelines for maximizing learning progress; and (3) enable longitudinal analysis of a child's development based on quantitative behavioral data, using new tools for visualization.

  • Panoply

    Rosalind W. Picard and Robert Morris

    Panoply is a crowdsourcing application for mental health and emotional well-being. The platform offers a novel approach to computer-based psychotherapy, one that is optimized for accessibility, engagement and therapeutic efficacy. A three-week randomized controlled trial with 166 participants compared Panoply to an active control task (online expressive writing). Panoply conferred greater or equal benefits for nearly every therapeutic outcome measure. Panoply also significantly outperformed the control task on all measures of engagement.

  • Reinventing the Retail Experience

    Elliott Hedman and Rosalind W. Picard

    With skin conductance sensors, we map out what frustrates and excites customers as they shop—from layout to wanting to touch the product. Our work has helped a variety of large retailers innovate on what it means to shop. Findings have focused on reducing the stress of choices and learning while surprising customers in new ways. With the sensor technology we can pinpoint moments when customers are overwhelmed and then build out new ways to make retail engaging again.

  • SenseGlass: Using Google Glass to Sense Daily Emotions

    Rosalind W. Picard and Javier Hernandez Rivera

    For over a century, scientists have studied human emotions in laboratory settings. However, these emotions have been largely contrived, elicited by movies or fake “lab” stimuli, which tend not to matter to the participants in the studies, at least not compared with events in their real lives. This work explores the utility of Google Glass, a head-mounted wearable device, to enable fundamental advances in the creation of affect-based user interfaces in natural settings.

  • StoryScape

    Rosalind W. Picard and Micah Eckhardt

    Stories, language, and art are at the heart of StoryScape. While StoryScape began as a tool to meet the challenging language-learning needs of children diagnosed with autism, it has become much more. StoryScape was created to be the first truly open and customizable platform for creating animated, interactive storybooks that can interact with the physical world. Download the Android app from Google Play and make your own amazing stories on the project website.