Affective Computing
How new technologies can help people better communicate, understand, and respond to affective information.
The Affective Computing group aims to bridge the gap between human emotions and computational technology. Current research addresses machine recognition and modeling of human emotional expression, including the invention of new software and hardware tools to help people gather, communicate, and express emotional information, together with tools to help people better manage and understand the ways emotion impacts health, social interaction, learning, memory, and behavior. Our projects are diverse: from inventing ways to help people who face communication and emotion regulation challenges; to enabling customers to give rich emotional feedback; to quantifying patterns of autonomic activity (core emotional physiology) during seizures, stress-related disorders, and sleep.

Research Projects

  • Analysis and Visualization of Longitudinal Physiological Data of Children with ASD

    Javier Hernandez Rivera

    Individuals diagnosed with Autism Spectrum Disorder (ASD) who have written about their experiences almost always describe immense stress and anxiety. Traditional methods of measuring these responses consist of monitoring the Autonomic Nervous System (ANS) of participants who behave compliantly in artificial laboratory settings. To the best of our knowledge, the study here is the first to conduct long-term monitoring and analysis of ANS in daily school activity settings with minimally verbal individuals on the autism spectrum. ANS data obtained under natural conditions can provide early warning of stress-related and potentially life-threatening events.

  • Analysis of Autonomic Sleep Patterns

    Akane Sano, Rosalind W. Picard, Suzanne E. Goldman, Beth A. Malow (Vanderbilt), Rana el Kaliouby, and Robert Stickgold (Harvard)

    We are examining autonomic sleep patterns using a wrist-worn biosensor that enables comfortable measurement of skin conductance, skin temperature, and motion. The skin conductance reflects sympathetic arousal. We are looking at sleep patterns in healthy groups, in groups with autism, and in groups with sleep disorders. We are looking especially at sleep quality and at performance on learning and memory tasks.

  • Auditory Desensitization Games

    Rosalind W. Picard, Matthew Goodwin and Rob Morris
    Persons on the autism spectrum often report hypersensitivity to sound. Efforts have been made to manage this condition, but there is wide room for improvement. One approach, exposure therapy, has promise, and a recent study showed that it helped several individuals diagnosed with autism overcome their sound sensitivities. In this project, we borrow principles from exposure therapy and use fun, engaging games to help individuals gradually get used to sounds that they might ordinarily find frightening or painful.
  • Automatic Stress Recognition in Real-Life Settings

    Rosalind W. Picard, Robert Randall Morris and Javier Hernandez Rivera

    Technologies that automatically recognize stress are extremely important for preventing chronic psychological stress and the pathophysiological risks associated with it. The introduction of comfortable, wearable biosensors has created new opportunities to measure stress in real-life environments, but there is often great variability in how people experience stress and how they express it physiologically. In this project, we modify the loss function of Support Vector Machines to encode a person's tendency to feel more or less stressed, giving more importance to the training samples of the most similar subjects. These changes are validated in a case study in which skin conductance was monitored in nine call center employees during one week of their regular work. Employees in this type of setting usually handle high volumes of calls every day, and they frequently interact with angry and frustrated customers, which leads to high stress levels.
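The idea of weighting training samples by subject similarity can be sketched with a per-sample weighted hinge loss. The code below is a minimal illustration, not the project's actual implementation: it trains a linear SVM by subgradient descent where each sample carries a weight (in practice, a hypothetical similarity score between the training subject and the target person; here, uniform weights on toy data).

```python
import numpy as np

def weighted_linear_svm(X, y, weights, lam=0.01, lr=0.1, epochs=200):
    """Linear SVM trained by subgradient descent on a per-sample weighted
    hinge loss: lam*||w||^2 + (1/n) * sum_i weights[i] * max(0, 1 - y_i*(x_i.w + b)).
    y must be in {-1, +1}; weights can upweight samples from subjects whose
    physiology resembles the target person's (hypothetical similarity scores)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1  # samples currently violating the margin
        grad_w = 2 * lam * w - (weights[active, None] * y[active, None] * X[active]).sum(axis=0) / n
        grad_b = -(weights[active] * y[active]).sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: two Gaussian clusters standing in for "calm" vs. "stressed" skin-conductance features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
weights = np.ones(100)  # replace with subject-similarity weights in practice
w, b = weighted_linear_svm(X, y, weights)
acc = np.mean(np.sign(X @ w + b) == y)
```

Raising a sample's weight makes its margin violations cost more, so the decision boundary bends toward fitting the most similar subjects first.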

  • Cardiocam

    Ming-Zher Poh, Daniel McDuff and Rosalind W. Picard

    Cardiocam is a low-cost, non-contact technology for measurement of physiological signals such as heart rate and breathing rate using a basic digital imaging device such as a webcam. The ability to perform remote measurements of vital signs is promising for enhancing the delivery of primary health care.
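The core signal-processing step behind camera-based pulse measurement can be sketched in a few lines. This is a simplified single-channel version (the Cardiocam work itself uses independent component analysis across the RGB channels): average the green channel of each video frame, detrend, and pick the dominant frequency in the plausible cardiac band.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate pulse rate (BPM) from a per-frame mean green-channel trace,
    the simplest form of remote photoplethysmography: remove the mean,
    then find the dominant frequency in the 0.75-4.0 Hz band (45-240 BPM)."""
    signal = np.asarray(green_means) - np.mean(green_means)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.75) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic 30 fps trace: a 72 BPM (1.2 Hz) pulse buried in sensor noise.
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.default_rng(1).normal(0, 0.3, t.size)
bpm = estimate_heart_rate(trace, fps)
```

In practice, motion artifacts and lighting changes dominate the raw trace, which is why the real system separates source signals before frequency analysis.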

  • Customized Computer-Mediated Interventions

    Rosalind W. Picard and Rob Morris

    Individuals diagnosed with autism spectrum disorder (ASD) often have intense, focused interests. These interests, when harnessed properly, can help motivate an individual to persist in a task that might otherwise be too challenging or bothersome. For example, past research has shown that embedding focused interests into educational curricula can increase task adherence and task performance in individuals with ASD. However, providing this degree of customization is often time-consuming and costly and, in the case of computer-mediated interventions, high-level computer-programming skills are often required. We have recently designed new software to solve this problem. Specifically, we have built an algorithm that will: (1) retrieve user-specified images from the Google database; (2) strip them of their background; and (3) embed them seamlessly into Flash-based computer programs.

  • Emotion and Memory

    Daniel McDuff, Rana el Kaliouby and Rosalind Picard

    Have you ever wondered what makes an ad memorable? We have performed a comprehensive review of literature concerning advertising, memory, and emotion. A summary of results is available.

  • Evaluation Tool for Recognition of Social-Emotional Expressions from Facial-Head Movements

    Rosalind W. Picard
    To help people improve their reading of faces during natural conversations, we developed a video tool to evaluate this skill. We collected over 100 videos of conversations between pairs of both autistic and neurotypical people, each wearing a Self-Cam. The videos were manually segmented into chunks of 7-20 seconds according to expressive content, labeled, and sorted by difficulty—all tasks we plan to automate using technologies under development. Next, we built a rating interface including videos of self, peers, familiar adults, strangers, and unknown actors, allowing for performance comparisons across conditions of familiarity and expression. We obtained reliable identification (by coders) of categories of smiling, happy, interested, thinking, and unsure in the segmented videos. The tool was finally used to assess recognition of these five categories for eight neurotypical and five autistic people. Results show some autistic participants approaching the abilities of neurotypical participants, while several score just above chance.
  • Exploring Temporal Patterns of Smile

    Rosalind W. Picard and Mohammed Ehsanul Hoque

    A smile is a multi-purpose expression. We smile to express rapport, polite disagreement, delight, sarcasm, and often, even frustration. Is it possible to develop computational models to distinguish among smiling instances when delighted, frustrated or just being polite? In our ongoing work, we demonstrate that it is useful to explore how the patterns of smile evolve through time, and that while a smile may occur in positive and in negative situations, its dynamics may help to disambiguate the underlying state.

  • Externalization Toolkit

    Rosalind W. Picard, Matthew Goodwin and Jackie Chia-Hsun Lee
    We propose a set of customizable, easy-to-understand, and low-cost physiological toolkits in order to enable people to visualize and utilize autonomic arousal information. In particular, we aim for the toolkits to be usable in one of the most challenging usability conditions: helping individuals diagnosed with autism. This toolkit includes: wearable, wireless, heart-rate and skin-conductance sensors; pendant-like and hand-held physiological indicators hidden or embedded into certain toys or tools; and a customized software interface that allows caregivers and parents to establish a general understanding of an individual's arousal profile from daily life and to set up physiological alarms for events of interest. We are evaluating the ability of this externalization toolkit to help individuals on the autism spectrum to better communicate their internal states to trusted teachers and family members.
  • FaceSense: Affective-Cognitive State Inference from Facial Video

    Daniel McDuff, Rana el Kaliouby, Abdelrahman Nasser Mahmoud, Youssef Kashef, M. Ehsan Hoque, Matthew Goodwin and Rosalind W. Picard
    People express and communicate their mental states—such as emotions, thoughts, and desires—through facial expressions, vocal nuances, gestures, and other non-verbal channels. We have developed a computational model that enables real-time analysis, tagging, and inference of cognitive-affective mental states from facial video. This framework combines bottom-up, vision-based processing of the face (e.g., a head nod or smile) with top-down predictions of mental-state models (e.g., interest and confusion) to interpret the meaning underlying head and facial signals over time. Our system tags facial expressions, head gestures, and affective-cognitive states at multiple spatial and temporal granularities in real time and offline, in both natural human-human and human-computer interaction contexts. A version of this system is being made available commercially by Media Lab spin-off Affectiva, indexing emotion from faces. Applications range from measuring people's experiences to a training tool for people on the autism spectrum and people with nonverbal learning disabilities.
  • Facial Expression Analysis Over the Web

    Rosalind W. Picard, Rana el Kaliouby, Daniel Jonathan McDuff, Affectiva and Forbes

    This work builds on our earlier work with FaceSense, created to help automate the understanding of facial expressions, both cognitive and affective. The FaceSense system has now been made available commercially by Media Lab spin-off Affectiva as Affdex. In this work we present the first project analyzing facial expressions at scale over the Internet. The interface analyzes the participants' smile intensity as they watch popular commercials. They can compare their responses to an aggregate from the larger population. The system also allows us to crowdsource data for training expression-recognition systems and to gain a better understanding of facial expressions under natural at-home viewing conditions rather than in traditional lab settings.

  • FEEL: A Cloud System for Frequent Event and Biophysiological Signal Labeling

    Yadid Ayzenberg and Rosalind Picard

    The wide availability of low-cost, wearable, biophysiological sensors enables us to measure how the environment and our experiences impact our physiology. This creates a new challenge: in order to interpret the collected longitudinal data, we require the matching contextual information as well. Collecting weeks, months, and years of continuous biophysiological data makes it infeasible to rely solely on our memory for providing the contextual information. Many view maintaining journals as burdensome, which may result in low compliance levels and unusable data. If we are to learn the effects of the environment, our day-to-day actions, and our choices on our physiology, it would be invaluable to develop systems that label biophysiological sensor data with contextual information. We present an architecture and implementation of a system for the acquisition, processing, and visualization of biophysiological signals and contextual information.
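The essential data operation such a system performs is joining timestamped sensor samples against known event intervals. The sketch below is a minimal illustration under assumed data shapes (tuples of timestamps and values; events pulled from, say, a calendar or phone log), not the FEEL system's actual schema.

```python
from datetime import datetime, timedelta

def label_samples(samples, events):
    """Attach contextual labels to timestamped sensor samples.
    samples: list of (timestamp, value) pairs from a biosensor.
    events:  list of (start, end, label) intervals, e.g. calendar entries.
    Each sample gets the label of the first interval containing it,
    or None when no context is known."""
    labeled = []
    for ts, value in samples:
        label = next((name for start, end, name in events if start <= ts < end), None)
        labeled.append((ts, value, label))
    return labeled

# Hypothetical morning: EDA samples every 30 minutes, two calendar events.
start = datetime(2013, 5, 1, 9, 0)
samples = [(start + timedelta(minutes=m), 0.1 * m) for m in range(0, 120, 30)]
events = [(start, start + timedelta(hours=1), "staff meeting"),
          (start + timedelta(hours=1), start + timedelta(hours=2), "email")]
labeled = label_samples(samples, events)
```

With the labels attached, per-event summaries (mean EDA during "staff meeting" versus "email") become simple group-by aggregations.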

  • Frame It

    Rosalind W. Picard and Micah Eckhardt
    Frame It is an interactive, blended, tangible-digital puzzle game intended as a play-centered teaching and therapeutic tool. Current work is focused on the development of a social-signals puzzle game for children with autism that will help them recognize social-emotional cues from information surrounding the eyes. In addition, we are investigating if this play-centered therapy results in the children becoming less averse to direct eye contact with others. The study uses eye-tracking technology to measure gaze behavior while participants are exposed to images and videos of social settings and expressions. Results indicate that significant changes in expression recognition and social gaze are possible after repeated uses of the Frame It game platform.
  • Gesture Guitar

    Rosalind W. Picard, Rob Morris and Tod Machover
    Emotions are often conveyed through gesture. Instruments that respond to gestures offer musicians new, exciting modes of musical expression. This project gives musicians wireless, gestural-based control over guitar effects parameters.
  • IDA: Inexpensive Networked Digital Stethoscope

    Yadid Ayzenberg

    Complex and expensive medical devices are mainly used in medical facilities by health professionals. IDA is an attempt to disrupt this paradigm and introduce a new type of device: easy to use, low cost, and open source. It is a digital stethoscope that can be connected to the Internet for streaming the physiological data to remote clinicians. Designed to be fabricated anywhere in the world with minimal equipment, it can be operated by individuals without medical training.

  • Infant Monitoring and Communication

    Rana el Kaliouby, Rich Fletcher, Matthew Goodwin and Rosalind W. Picard
    We have been developing comfortable, safe, attractive physiological sensors that infants can wear around the clock to wirelessly communicate their internal physiological state changes. The sensors capture sympathetic nervous system arousal, temperature, physical activity, and other physiological indications that can be processed to signal changes in sleep, arousal, discomfort or distress, all of which are important for helping parents better understand the internal state of their child and what things stress or soothe their baby. The technology can also be used to collect physiological and circadian patterns of data in infants at risk for developmental disabilities.
  • Inside-Out: Reflecting on your Inner State

    Richard R. Fletcher, Rosalind W. Picard, Daniel Jonathan McDuff and Javier Hernandez Rivera

    We present a novel sensor system and interface that enables an individual to capture and reflect on their daily activities. The wearable system gathers both physiological responses and visual context through the use of a wearable biosensor and a cell-phone camera, respectively. Collected information is locally stored and securely transmitted to a novel digital mirror. Through interactive visualizations, this interface allows users to reflect not only on their outer appearance but also on their inner physiological responses to daily activities. Finally, we illustrate how combining a time record of physiological data with visual contextual information can improve and enhance the experience of reflection in many real-life scenarios, and serve as a useful tool for behavior science and therapy.

  • Long-Term Physio and Behavioral Data Analysis

    Akane Sano and Rosalind Picard

    Can we recognize stress, mood, and health conditions from wearable sensors or mobile phone usage data? We analyze long-term, multi-modal physiological and behavioral data (electrodermal activity, skin temperature, acceleration, and phone usage such as call/SMS frequency) during day and night with wearable sensors and mobile phones to extract biomarkers related to health conditions, interpret inter-individual differences, and develop systems to keep people healthy.

  • Machine Learning and Pattern Recognition with Multiple Modalities

    Hyungil Ahn and Rosalind W. Picard
    This project develops new theory and algorithms to enable computers to make rapid and accurate inferences from multiple modes of data, such as determining a person's affective state from multiple sensors—video, mouse behavior, chair pressure patterns, typed selections, or physiology. Recent efforts focus on understanding the level of a person's attention, useful for things such as determining when to interrupt. Our approach is Bayesian: formulating probabilistic models on the basis of domain knowledge and training data, and then performing inference according to the rules of probability theory. This type of sensor fusion work is especially challenging due to problems of sensor channel drop-out, different kinds of noise in different channels, dependence between channels, scarce and sometimes inaccurate labels, and patterns to detect that are inherently time-varying. We have constructed a variety of new algorithms for solving these problems and demonstrated their performance gains over other state-of-the-art methods.
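One way to see why the Bayesian framing handles channel dropout gracefully: an unobserved sensor simply contributes no likelihood term, which is equivalent to marginalizing it out. The sketch below shows this with a naive (conditionally independent channels) fusion rule; the state names and likelihood values are hypothetical, and the project's actual models are richer than this.

```python
import numpy as np

def fuse(priors, channel_likelihoods):
    """Naive-Bayes fusion of independent sensor channels.
    priors: P(state) over the candidate states.
    channel_likelihoods: one P(observation | state) vector per channel,
    with None marking a dropped-out channel. Skipping a missing channel
    is the Bayesian equivalent of marginalizing over its unseen reading."""
    posterior = np.array(priors, dtype=float)
    for lik in channel_likelihoods:
        if lik is None:  # sensor dropout: this channel contributes no evidence
            continue
        posterior *= np.array(lik, dtype=float)
    return posterior / posterior.sum()

# Two hypothetical states: attentive vs. distracted.
priors = [0.5, 0.5]
video = [0.8, 0.2]    # facial-video channel favours "attentive"
pressure = None       # chair-pressure sensor dropped out this frame
typing = [0.6, 0.4]   # typing behaviour weakly favours "attentive"
post = fuse(priors, [video, pressure, typing])
```

The surviving channels still combine coherently, with no imputation step needed; handling *dependent* channels and time-varying patterns is where the harder modeling work lies.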
  • Measuring Arousal During Therapy for Children with Autism and ADHD

    Rosalind W. Picard and Elliott Hedman

    Physiological arousal is an important part of occupational therapy for children with autism and ADHD, but therapists do not have a way to objectively measure how therapy affects arousal. We hypothesize that when children participate in guided activities within an occupational therapy setting, informative changes in electrodermal activity (EDA) can be detected using iCalm. iCalm is a small, wireless sensor that measures EDA and motion, worn on the wrist or above the ankle. Statistical analysis describing how equipment affects EDA was inconclusive, suggesting that many factors play a role in how a child’s EDA changes. Case studies provided examples of how occupational therapy affected children’s EDA. This is the first study of occupational therapy’s in situ activities using continuous physiological measures. The results suggest that careful case study analyses of the relation between therapeutic activities and physiological arousal may inform clinical practice.

  • Measuring Customer Experiences with Arousal

    Rosalind W. Picard and Elliott Hedman

    How can we better understand people’s emotional experiences with a product or service? Traditional interview methods require people to remember their emotional state, which is difficult. We use psychophysiological measurements such as heart rate and skin conductance to map people’s emotional changes across time. We then interview people about times when their emotions changed, in order to gain insight into the experiences that corresponded with the emotional changes. This method has been used to generate hundreds of insights with a variety of products including games, interfaces, therapeutic activities, and self-driving cars.

  • Mobile Health Interventions for Drug Addiction and PTSD

    Rich Fletcher and Rosalind Picard

    We are developing a mobile phone-based platform to assist people with chronic diseases, panic-anxiety disorders or addictions. Making use of wearable, wireless biosensors, the mobile phone uses pattern analysis and machine learning algorithms to detect specific physiological states and perform automatic interventions in the form of text/images plus sound files and social networking elements. We are currently working with the Veterans Administration drug rehabilitation program involving veterans with PTSD.

  • Multimodal Computational Behavior Analysis

    David Forsyth (UIUC), Gregory Abowd (GA Tech), Jim Rehg (GA Tech), Shri Narayanan (USC), Rana el Kaliouby, Matthew Goodwin, Rosalind W. Picard, Javier Hernandez Rivera, Stan Scarloff (BU) and Takeo Kanade (CMU)

    This project will define and explore a new research area we call Computational Behavior Science: integrated technologies for multimodal computational sensing and modeling to capture, measure, analyze, and understand human behaviors. Our motivating goal is to revolutionize diagnosis and treatment of behavioral and developmental disorders. Our thesis is that emerging sensing and interpretation capabilities in vision, audition, and wearable computing technologies, when further developed and properly integrated, will transform this vision into reality. More specifically, we hope to: (1) enable widespread autism screening by allowing non-experts to easily collect high-quality behavioral data and perform initial assessment of risk status; (2) improve behavioral therapy through increased availability and improved quality, by making it easier to track the progress of an intervention and follow guidelines for maximizing learning progress; and (3) enable longitudinal analysis of a child's development based on quantitative behavioral data, using new tools for visualization.

  • Panoply

    Rosalind W. Picard and Robert Morris

    In the next year, roughly 26 million Americans will suffer from depression. Many more will meet the clinical diagnosis for an anxiety disorder. While psychotherapies like cognitive-behavioral therapy are known to be effective for these conditions, the demand for these treatments exceeds the resources available. There are simply not enough clinicians available. Access is also limited by cost, stigma, and the logistics of scheduling and traveling to appointments. What if we could crowdsource this problem? Panoply is a crowd-based platform for mental health and emotional well-being. In lieu of clinician oversight, Panoply coordinates therapeutic support from anonymous online workers who are trained on demand. The system utilizes advances in collective intelligence and crowdsourcing to ensure that feedback is timely and vetted for quality.

  • Sensor-Enabled Measurement of Stereotypy and Arousal in Individuals with Autism

    Matthew Goodwin, Clark Freifeld and Sophia Yuditskaya
    A small number of studies support the notion of a functional relationship between movement stereotypy and arousal in individuals with ASD, such that changes in autonomic activity either precede or are a consequence of engaging in stereotypical motor movements. Unfortunately, it is difficult to generalize these findings because previous studies fail to report reliability statistics demonstrating accurate identification of movement stereotypy start and end times, and use autonomic monitors that are obtrusive and thus suitable only for short-term measurement in laboratory settings. The current investigation further explores the relationship between movement stereotypy and autonomic activity in persons with autism by combining state-of-the-art ambulatory heart rate monitors, which objectively assess arousal across settings, with wireless, wearable motion sensors and pattern-recognition software that can automatically and reliably detect stereotypical motor movements in individuals with autism in real time.
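Automatic detection of repetitive motor movements from wearable motion sensors typically starts by sliding a window over the accelerometer stream and computing features that a classifier can act on. The sketch below shows that first stage only, with hypothetical window sizes and a synthetic signal; the project's actual pipeline and feature set are not reproduced here.

```python
import numpy as np

def window_features(accel, window=60, step=30):
    """Slide a window over a 1-D accelerometer-magnitude stream and compute
    simple per-window features (mean, standard deviation, dominant non-DC
    frequency bin) of the kind typically fed to a stereotypy classifier."""
    feats = []
    for start in range(0, len(accel) - window + 1, step):
        seg = accel[start:start + window]
        power = np.abs(np.fft.rfft(seg - seg.mean()))
        feats.append((seg.mean(), seg.std(), int(np.argmax(power[1:]) + 1)))
    return feats

# Synthetic streams: rhythmic rocking (a sinusoid) vs. sitting still.
t = np.arange(300)
rocking = np.sin(2 * np.pi * t / 30)  # one cycle per 30 samples
still = np.zeros(300)
rock_feats = window_features(rocking)
still_feats = window_features(still)
```

Rhythmic movement shows up as high within-window variance concentrated at one frequency bin, while stillness yields near-zero variance, which is what makes even simple features separable for a downstream classifier.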
  • Smart Phone Frequent EDA Event Logger

    Yadid Ayzenberg and Rosalind Picard

    Have you ever wondered which emails, phone calls, or meetings cause you the most stress or anxiety? Well, now you can find out. A wristband sensor measures electrodermal activity (EDA), which responds to stress, anxiety, and arousal. Each time you read an email, place a call, or hold a meeting, your phone will measure your EDA levels by connecting to the sensor via Bluetooth. The goal is to design a tool that enables the user to attribute levels of stress and anxiety to particular events. FEEL allows the user to view all of the events and the levels of EDA associated with them: with FEEL, users can see which event caused a higher level of anxiety and stress, and can view which part of an event caused the greatest reaction. Users can also view EDA levels in real time.

  • Social + Sleep + Moods

    Akane Sano and Rosalind Picard

    Sleep is critical to a wide range of biological functions; inadequate sleep results in impaired cognitive performance and mood, and adverse health outcomes including obesity, diabetes, and cardiovascular disease. Recent studies have shown that healthy and unhealthy sleep behaviors can be transmitted by social interactions between individuals within social networks. We investigate how social connectivity and light exposure influence sleep patterns, health, and performance. Using multimodal data collected with wearable sensors and mobile phones from closely connected MIT undergraduates, we will develop statistical and multi-scale mathematical models of sleep dynamics within social networks, based on sleep and circadian physiology. These models will provide insights into the emergent dynamics of sleep behaviors within social networks, and allow us to test the effects of candidate strategies for intervening in populations with unhealthy sleep behaviors.

  • StoryScape

    Rosalind W. Picard and Micah Eckhardt

    StoryScape is a social illustrated primer. The StoryScape platform is being developed to allow for easy creation of highly interactive and customizable stories. In addition, the platform will allow a community of content creators to easily share, collaborate on, and remix each other's work. Experimental goals of StoryScape include its use with children diagnosed with autism who are minimally verbal or non-verbal. We seek to test our interaction paradigm and personalization feature to determine if multi-modal interactive and customizable stories influence language acquisition and expression.

  • The Frustration of Learning Monopoly

    Rosalind W. Picard and Elliott Hedman

    We are looking at the emotional experience created when children learn games. Why do we start games with the most boring part, reading directions? How can we create a product that does not create an abundance of work for parents? Key insights generated from field work, interviews, and measurement of electrodermal activity are: kids become bored listening to directions ("it's like going to school"); parents feel rushed reading directions as they sense their children's boredom; children and parents struggle for power in interpreting and enforcing rules; children learn games by mimicking their parents; and children enjoy the challenge of learning new games.