Automatic Facial Expression Analysis

Recognizing non-verbal cues, which make up a large share of our communication, is a key facet of building emotionally intelligent systems. Facial expressions and movements such as a smile or a nod serve a semantic function, communicate emotion, or act as conversational cues. We are developing an automatic tool that uses computer vision and machine-learning techniques to detect the facial movements and head gestures of people while they interact naturally with the computer. Past work on this project developed techniques to track the upper facial features (eyes and eyebrows) and to detect the facial actions associated with them (eyes squinting or widening, eyebrows raised). The ongoing project is expanding this scope to track the lower facial features and detect their corresponding facial actions. Further, we hope to integrate the facial expression analysis module with other sensors developed by the Affective Computing group to reliably detect and recognize different emotions.
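As an illustrative sketch only (the project's actual tracking and detection methods are not specified here), the snippet below shows one common way to detect an upper-face action such as an eyebrow raise: fit facial landmarks with dlib's 68-point shape predictor, measure the eyebrow-to-eye distance normalized by face size, and flag frames where it exceeds a threshold. The model file path and the RAISE_THRESHOLD value are assumptions, not part of the project.

```python
# Minimal sketch of upper-face action detection, assuming dlib's
# 68-point landmark model (not the project's actual pipeline).
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Assumed model file; download separately from the dlib project.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

RAISE_THRESHOLD = 0.26  # hypothetical value; tune per camera and subject


def eyebrow_raise_score(gray, face):
    """Return the mean eyebrow-to-eye gap, normalized by face height."""
    shape = predictor(gray, face)
    pts = np.array([(p.x, p.y) for p in shape.parts()])
    brows = pts[17:27]  # both eyebrows (landmarks 17-26)
    eyes = pts[36:48]   # both eyes (landmarks 36-47)
    # Eyes lie below the brows in image coordinates, so the gap is
    # positive and grows as the eyebrows raise.
    gap = eyes[:, 1].mean() - brows[:, 1].mean()
    return gap / (face.bottom() - face.top())


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        score = eyebrow_raise_score(gray, face)
        if score > RAISE_THRESHOLD:
            print("eyebrow raise detected, score =", round(score, 3))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```

A per-frame geometric score like this is the simplest baseline; a learned classifier over tracked feature trajectories, as the project description implies, would be more robust to head pose and individual differences.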