Manipulating Mental States through Physical Action

J. Gray, C. Breazeal


We present our implementation of a self-as-simulator architecture for mental state manipulation through physical action. The robot models how a human's mental states are updated through their visual perception of the surrounding world. This modeling, combined with geometrically detailed, perspective-correct simulations of the immediate future, allows the robot to choose actions that influence the human's mental states through their visual perception. The system is demonstrated in a competitive game scenario, in which the robot attempts to manipulate an individual's mental states in order to win. We evaluate people's reactions to the system, focusing on participants' perception of a robot with mental state manipulation capabilities.
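The action-selection idea described above can be sketched at a very high level: forward-simulate each candidate action, predict how the human's beliefs would update given what they can see, and pick the action whose predicted belief effect best serves the robot's goal. The following is a minimal illustrative sketch, not the authors' implementation; all function names, the 1-D visibility stand-in for the geometric simulation, and the utility function are assumptions for illustration.

```python
# Hypothetical sketch of a self-as-simulator action-selection loop.
# A 1-D distance check stands in for the geometrically detailed,
# perspective-correct visibility simulation described in the abstract.

def visible(pos, observer_pos, fov=2.0):
    """Toy visibility test: the observer sees positions within `fov`."""
    return abs(pos - observer_pos) <= fov

def predict_belief(prior_belief, true_state, observer_pos):
    """Predict the human's belief after observation: believed object
    positions update only where the object is visible to the human;
    occluded objects keep their stale believed positions."""
    belief = dict(prior_belief)
    for obj, pos in true_state.items():
        if visible(pos, observer_pos):
            belief[obj] = pos
    return belief

def choose_action(actions, world, prior_belief, observer_pos, utility):
    """Forward-simulate each candidate action on a copy of the world,
    predict the human's resulting belief, and return the action that
    maximizes the robot's utility over (true state, predicted belief)."""
    def score(action):
        next_world = action(dict(world))
        predicted = predict_belief(prior_belief, next_world, observer_pos)
        return utility(next_world, predicted)
    return max(actions, key=score)
```

In a competitive game, the utility might reward a mismatch between the true state and the human's predicted belief, so the robot prefers actions (e.g., moving an object while it is occluded from the human) that leave the human with an outdated belief.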
