Publication

Modeling the Dynamics of Nonverbal Behavior on Interpersonal Trust for Human-Robot Interactions.

J. J. Lee, B. Knox, C. Breazeal.

Abstract

We describe research towards creating a computational model for recognizing interpersonal trust in social interactions. We found that four negative gestural cues (leaning-backward, face-touching, hand-touching, and crossing-arms) are together predictive of lower levels of trust, while three positive gestural cues (leaning-forward, arms-in-lap, and open-arms) are predictive of higher levels of trust. We train a probabilistic graphical model on natural social interaction data: a "Trust Hidden Markov Model" that incorporates the occurrence of these seven important gestures throughout the social interaction. This Trust HMM predicts with 69.44% accuracy whether an individual is willing to behave cooperatively or uncooperatively with a novel partner; in comparison, a gesture-ignorant model achieves 63.89% accuracy. We attempt to automate this recognition process by detecting the trust-related behaviors through 3D motion-capture technology and gesture-recognition algorithms. We aim eventually to create a hierarchical system, with low-level gesture recognition feeding high-level trust recognition, that is capable of predicting whether an individual finds another to be a trustworthy or untrustworthy partner through their nonverbal expressions.
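To make the modeling approach concrete, below is a minimal sketch of how sequences of the seven gesture cues could be scored with discrete-observation HMMs. This is not the authors' implementation: the two-model likelihood-comparison setup, the two-state structure, and every parameter value are illustrative assumptions; only the gesture labels come from the abstract.

```python
import numpy as np

# Observation alphabet: the seven gesture cues reported in the abstract.
GESTURES = [
    "leaning-backward", "face-touching", "hand-touching", "crossing-arms",  # negative cues
    "leaning-forward", "arms-in-lap", "open-arms",                          # positive cues
]

class DiscreteHMM:
    """Discrete-observation HMM scored with the scaled forward algorithm."""

    def __init__(self, pi, A, B):
        # pi: (n_states,) initial state distribution
        # A:  (n_states, n_states) transitions, A[i, j] = P(s_t=j | s_{t-1}=i)
        # B:  (n_states, n_symbols) emissions, B[i, k] = P(o_t=k | s_t=i)
        self.pi, self.A, self.B = (np.asarray(x, dtype=float) for x in (pi, A, B))

    def log_likelihood(self, obs):
        """log P(obs | model), normalizing alpha each step for stability."""
        alpha = self.pi * self.B[:, obs[0]]
        logp = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ self.A) * self.B[:, o]
            logp += np.log(alpha.sum())
            alpha /= alpha.sum()
        return logp

# Illustrative (made-up) parameters: two latent states per model. The
# "cooperative" model's emissions lean toward the positive cues, the
# "uncooperative" model's toward the negative cues.
pi = [0.5, 0.5]
A = [[0.8, 0.2],
     [0.2, 0.8]]
B_coop = [[0.05, 0.05, 0.05, 0.05, 0.30, 0.25, 0.25],
          [0.10, 0.10, 0.10, 0.10, 0.20, 0.20, 0.20]]
B_uncoop = [[0.30, 0.25, 0.20, 0.15, 0.04, 0.03, 0.03],
            [0.20, 0.20, 0.20, 0.20, 0.07, 0.07, 0.06]]

coop_model = DiscreteHMM(pi, A, B_coop)
uncoop_model = DiscreteHMM(pi, A, B_uncoop)

def predict_cooperation(gesture_sequence):
    """Classify a sequence of gesture indices by comparing model likelihoods."""
    if coop_model.log_likelihood(gesture_sequence) > uncoop_model.log_likelihood(gesture_sequence):
        return "cooperative"
    return "uncooperative"

# Example: mostly positive cues (indices 4-6) score as cooperative,
# mostly negative cues (indices 0-3) as uncooperative.
print(predict_cooperation([4, 5, 4, 6, 5]))  # -> cooperative
print(predict_cooperation([0, 3, 1, 0, 2]))  # -> uncooperative
```

In practice the parameters of such models would be estimated from labeled interaction data (e.g., via Baum-Welch) rather than set by hand, and the observation stream would come from the gesture-recognition layer described in the abstract.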
