Publication

Dyadic Affect in Parent-child Multi-modal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis

H. Chen, S. M. Alghowinem, S. J. Jang, C. Breazeal and H. W. Park, "Dyadic Affect in Parent-child Multi-modal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis," in IEEE Transactions on Affective Computing, doi: 10.1109/TAFFC.2022.3178689.

Abstract

High-quality parent-child conversational interactions are crucial for children's social, emotional, and cognitive development, yet many children have limited exposure to such interactions at home. As increasingly accessible and scalable interventions in child development, interactive technologies such as social robots have great potential for facilitating parent-child interactions. However, such technology-based interventions remain underexplored, because these technologies' limited ability to understand the social-emotional dynamics of human dyadic interaction impedes the delivery of timely, adaptive interventions. To help address this roadblock, we present the "Dyadic Affect in Multimodal Interaction - Parent to Child" (DAMI-P2C) dataset, collected during a study of 34 parent-child pairs in which parents and children (3-7 years old) read storybooks together. In contrast to existing public datasets of social-emotional behavior in dyadic interactions, each instance for both participants in our dataset was annotated for affect by three labelers. The dataset also contains the audiovisual recordings as well as each dyad's sociodemographic profile, co-reading behaviors, affect labels, and body joints. We describe the dataset's main characteristics and provide a preliminary analysis of the interrelations among sociodemographic profiles, co-reading behaviors, and affect labels. The dataset offers useful insights for both the affective computing and social science communities.
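
To make the dataset's composition more concrete, the minimal sketch below shows one way a single dyad's record could be organized in code. All class names, field names, and values (e.g., DyadRecord, AffectAnnotation, the valence/arousal ranges) are hypothetical illustrations based on the data types listed in the abstract; they do not describe the actual release format of DAMI-P2C.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of one dyad's record, mirroring the abstract's description:
# audiovisual recordings, sociodemographic profile, co-reading behaviors,
# affect labels from three labelers, and body joints.

@dataclass
class AffectAnnotation:
    """One labeler's affect rating for a single participant instance (illustrative scales)."""
    labeler_id: str
    valence: float   # assumed range: -1.0 (negative) .. 1.0 (positive)
    arousal: float   # assumed range: 0.0 (calm) .. 1.0 (excited)

@dataclass
class DyadRecord:
    """All data collected for one parent-child pair (field names are assumptions)."""
    dyad_id: str
    child_age_years: int                      # children in the study were 3-7 years old
    sociodemographics: Dict[str, str]         # e.g., parent education level
    audio_path: str                           # path to the session's audio recording
    video_path: str                           # path to the session's video recording
    body_joints: List[List[float]]            # per-frame joint coordinates
    coreading_behaviors: Dict[str, float]     # coded co-reading behavior measures
    # Each instance is rated by three labelers, for parent and child separately.
    parent_affect: List[AffectAnnotation] = field(default_factory=list)
    child_affect: List[AffectAnnotation] = field(default_factory=list)

def mean_valence(annotations: List[AffectAnnotation]) -> float:
    """Average the labelers' valence ratings for one participant."""
    return sum(a.valence for a in annotations) / len(annotations)

if __name__ == "__main__":
    # Toy example with made-up values, only to show how such a record might be used.
    dyad = DyadRecord(
        dyad_id="dyad_001",
        child_age_years=5,
        sociodemographics={"parent_education": "bachelor"},
        audio_path="recordings/dyad_001.wav",
        video_path="recordings/dyad_001.mp4",
        body_joints=[[0.12, 0.45, 0.30]],
        coreading_behaviors={"questions_asked": 7.0},
        child_affect=[
            AffectAnnotation("L1", valence=0.6, arousal=0.7),
            AffectAnnotation("L2", valence=0.5, arousal=0.6),
            AffectAnnotation("L3", valence=0.7, arousal=0.8),
        ],
    )
    print(f"Mean child valence: {mean_valence(dyad.child_affect):.2f}")
```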
