Personalized Machine Learning for Autism Therapy

MIT Media Lab

EngageME: Personalized machine learning and humanoid robots for measuring affect and engagement of children with autism

EngageME is a project aimed at building a new technology to enable automatic monitoring of affect and engagement of children with ASC (Autism Spectrum Conditions) in communication-centered activities.

This work has been published in Science Robotics, June 2018.

ASC affects 1 in 64 individuals in the United States. Children with ASC have persistent challenges in social communication and interaction, and restricted and repetitive patterns of behavior and interests, all of which pose serious challenges for their socio-emotional lives and the lives of their families. Different types of autism therapy help children with ASC improve their social skills. Recently, social robots have been used in interactive play during therapy, because many children with ASC find them enjoyable and engaging, perhaps due to their human-like yet predictable and non-threatening nature. However, to enable naturalistic interaction between a social robot and a child, the robot must be equipped with a form of socio-emotional intelligence that allows it to learn and recognize the child’s behavioral cues and respond in a more natural and engaging way.

EngageME investigates the use of humanoid robots (for example, NAO) in autism therapy for children with ASC. This technology builds upon state-of-the-art machine learning, bringing novel personalized and culture-tailored models for automated measurement of affect and engagement. What is specific to this project is that we devised a robot perception module that uses personalized machine learning to adapt its interpretations of each child’s observed affective states, such as valence (the pleasure-displeasure continuum) and arousal (alertness), and of their engagement in the task.
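To make the personalization idea concrete, below is a minimal sketch of one way such per-child adaptation can be realized, assuming a PyTorch-style setup: a shared, group-level network provides features for all children, while a small child-specific output head is fine-tuned on that child's own labelled data. All names, dimensions, and the fine-tuning recipe here are illustrative assumptions, not the published architecture.

```python
# Illustrative sketch only (hypothetical names and sizes, not the published model):
# a shared, group-level network plus a small per-child head that is fine-tuned
# on that child's own labelled therapy data.
import torch
import torch.nn as nn

class SharedBackbone(nn.Module):
    """Group-level layers shared across all children."""
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class ChildHead(nn.Module):
    """Child-specific layer mapping shared features to valence, arousal, engagement."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.out = nn.Linear(hidden, 3)

    def forward(self, h):
        return self.out(h)

backbone = SharedBackbone(in_dim=32)  # e.g., 32 behavioral features per time step (assumed)
heads = {child_id: ChildHead() for child_id in ["child_01", "child_02"]}

# Personalization step: freeze the shared layers and fit only this child's head.
for p in backbone.parameters():
    p.requires_grad = False

child = "child_01"
optimizer = torch.optim.Adam(heads[child].parameters(), lr=1e-3)
x = torch.randn(16, 32)   # one mini-batch of the child's features (toy data)
y = torch.rand(16, 3)     # expert ratings of valence, arousal, engagement in [0, 1]

optimizer.zero_grad()
pred = heads[child](backbone(x))
loss = nn.functional.mse_loss(pred, y)
loss.backward()
optimizer.step()
```

Keeping the shared layers fixed and adapting only a small head is a common way to personalize with limited per-child data; the project's models additionally account for cultural as well as individual differences, so this sketch is only a conceptual starting point.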

We realized this framework by personalizing “deep learning” models: a type of machine learning algorithm that mimics the learning activity of the human brain. We used state-of-the-art data processing tools to analyze multi-modal behavioral cues of each child (their facial expressions, head pose, tone of voice and vocalizations, and biosignals including body temperature, heart rate, and skin conductance) recorded during real-world therapy sessions. These cues were paired with ratings of affect and engagement provided by human experts who inspected audio-visual recordings of the therapy sessions, and the paired data were used to learn the personalized deep models for robot perception. We tested these models on new data from therapy sessions and achieved an agreement of ~60% between robot-perceived and human-coded levels of affect and engagement shown by these children.
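As an illustration of how agreement between robot-perceived and human-coded ratings can be quantified, the sketch below computes Lin's concordance correlation coefficient on synthetic data. The choice of metric and all numbers are assumptions for illustration; they are not the evaluation protocol behind the ~60% figure.

```python
# Illustrative only: Lin's concordance correlation coefficient (CCC) is one common
# way to score agreement between continuous affect ratings; the metric behind the
# ~60% figure is not specified here, so this is purely a worked example.
import numpy as np

def concordance_ccc(pred: np.ndarray, gold: np.ndarray) -> float:
    """CCC between model predictions and human-coded ratings (1.0 = perfect agreement)."""
    pred_mean, gold_mean = pred.mean(), gold.mean()
    covariance = np.mean((pred - pred_mean) * (gold - gold_mean))
    return 2 * covariance / (pred.var() + gold.var() + (pred_mean - gold_mean) ** 2)

# Toy example: imperfect robot perception of human-coded valence on held-out sessions.
rng = np.random.default_rng(0)
human = rng.uniform(-1, 1, size=200)               # expert ratings on a [-1, 1] scale
robot = 0.7 * human + 0.3 * rng.normal(size=200)   # noisy model predictions
print(f"CCC = {concordance_ccc(robot, human):.2f}")
```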

This is the first time that a fully data-driven approach (using machine learning) has been used in the context of autism therapy to design a robot perception module that can automatically adapt its interpretations of children’s affect and engagement by accounting for cultural and individual differences between children with autism.

EngageME is an EU Horizon 2020 funded project, under grant agreement no. 701236 (Marie Skłodowska-Curie Individual Fellowship).