In collaboration with the Lurie Center for Autism at Massachusetts General Hospital and MIT Lincoln Laboratory, this project aims to study, characterize, and support communication for minimally verbal individuals diagnosed with Autism Spectrum Disorder (ASD). Communication impairments profoundly affect quality of life, increasing health risks and imposing a significant burden on patients, caregivers, families, and healthcare institutions. Physiological signals carry rich information about bodily functions that can be used to assist people when they need it. Individuals with ASD often have a rich inner world of language, but the intelligibility and vocalization of their speech are affected. We are using machine learning to decode what an individual is trying to say from electromyography (EMG) signals picked up by AlterEgo. Additionally, we are interested in predicting a person's engagement and disengagement during different speech tasks using AttentivU, an electroencephalography (EEG)- and electrooculography (EOG)-enabled sensing device.
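The project's actual decoding models are not described here; as a minimal illustrative sketch only, EMG-to-word decoding can be framed as classifying short multichannel signal windows. Everything below is an assumption for illustration (synthetic data, a hypothetical 4-channel setup, per-channel RMS features, and a nearest-centroid classifier), not AlterEgo's real pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
WORDS = ["yes", "no", "more"]        # hypothetical small vocabulary
N_CH, WIN = 4, 200                   # assumed: 4 electrode channels, 200-sample windows

def synth_window(word_idx):
    # Synthetic stand-in for real EMG: each "word" gets a distinct
    # per-channel amplitude profile. Not actual AlterEgo data.
    amp = 1.0 + 0.8 * np.roll(np.arange(N_CH), word_idx)
    return rng.normal(0.0, 1.0, (N_CH, WIN)) * amp[:, None]

def rms_features(window):
    # Root-mean-square per channel: a simple, commonly used EMG feature.
    return np.sqrt((window ** 2).mean(axis=1))

# "Training": average RMS features over repetitions of each word
# to form one centroid per word (nearest-centroid classifier).
centroids = {w: np.mean([rms_features(synth_window(i)) for _ in range(20)], axis=0)
             for i, w in enumerate(WORDS)}

def decode(window):
    # Predict the word whose centroid is closest in feature space.
    feats = rms_features(window)
    return min(WORDS, key=lambda w: np.linalg.norm(feats - centroids[w]))

print(decode(synth_window(1)))  # classifies a fresh window of class 1 ("no")
```

In practice a real system would replace the synthetic windows with recorded, filtered EMG and the centroid classifier with a learned model, but the window-features-classifier structure is the same.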
The objective of the project is to eventually improve speech intelligibility using auditory, haptic, visual, and attention feedback mechanisms.