Dyadic Affect in Multimodal Interaction - Parent to Child (DAMI-P2C) Dataset

Audio-visual dataset containing around 21.6 hours of parent-child interaction during story reading.

The DAMI-P2C dataset consists of audio-visual recordings of two sessions from each of 34 parent-child dyads participating in a story-reading activity. The 34 families, with children between the ages of 3 and 7 years, were recorded for two 45-minute in-lab sessions (Chen et al., 2020).

Each dyad, always a pair of one parent and one child, selected books from a corpus of 30 storybooks that were digitized on a touchscreen tablet.

As seen in the figure below, we used seven cameras and one boundary microphone to record the sessions. One of the cameras was attached to the parent’s forehead for a first-person view, and another was attached to the tablet (and, in some sessions, to the child’s forehead).

Below: audio-visual recordings of parent-child story-reading interactions.

Fig. (a) displays the story-reading station setup, including the different camera angles, a high-quality microphone, the story-reading table, and the tablet with storybooks.

Figs. (b-g) display an example scene of a parent-child story-time interaction from the different camera views.

Shared Annotations

The interactions have been transcribed and annotated for a variety of verbal and non-verbal features:

  • Surveys: pre- and post-session.
  • Transcription: the parent-child dialogue was manually transcribed.
  • Dialogue annotation: the dialogue was annotated for question, chatting, and reading segments.
  • Parent perception ratings: collected every 1-5 minutes of the interaction.
  • Engagement: both joint and coordinated engagement of the child, rated in 5-second segments.
  • Affect: both arousal and valence, rated in 5-second segments (see the loading sketch below).
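Since the engagement and affect labels above are rated in 5-second segments, below is a minimal Python sketch of how such segment-level labels might be loaded. The CSV layout and column names (dyad_id, start_s, end_s, valence, arousal) are illustrative assumptions, not documented fields of the release; consult the dataset's own instructions for the actual format.

    # Minimal sketch: load 5-second affect segments.
    # ASSUMED layout: CSV with columns dyad_id, start_s, end_s, valence, arousal
    # (hypothetical names -- check the dataset documentation).
    import csv

    def load_segments(path):
        """Return one dict per annotated 5-second segment."""
        with open(path, newline="") as f:
            return [
                {
                    "dyad_id": row["dyad_id"],
                    "start_s": float(row["start_s"]),   # segment onset (seconds)
                    "end_s": float(row["end_s"]),       # segment offset (seconds)
                    "valence": float(row["valence"]),   # rated valence
                    "arousal": float(row["arousal"]),   # rated arousal
                }
                for row in csv.DictReader(f)
            ]

    segments = load_segments("affect_annotations.csv")  # hypothetical file name
    print(len(segments), "five-second segments loaded")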

Details and instructions on the dataset's content and usage can be found here.

Request for Access

The dataset is available for academic, non-profit, or research purposes only. To request access, please send an email to dami-p2c-admins@media.mit.edu with the following information:

  1. Full Name 
  2. Job Title
  3. Academic Affiliation
  4. Research Group Website
  5. Signed Data Use Agreement document (signatures of the lead PI and all students using the dataset must be included)
  6. CITI certificate

FAQs

  1. The Data Use Agreement has an expiration date. Is it possible to request a renewal?

     Yes!

  2. The Data Use Agreement says that we need to obtain MIT's written consent before we can "present, submit for publication, publicly post or publish any information contained in or derived from the DATA." What is required for MIT's consent and agreement?

     The only requirement is demonstrating that the following papers are properly cited:

    H. Chen, S. M. Alghowinem, S. J. Jang, C. Breazeal and H. W. Park, "Dyadic Affect in Parent-child Multi-modal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis," in IEEE Transactions on Affective Computing, doi: 10.1109/TAFFC.2022.3178689.

    H. Chen et al., "Dyadic Speech-based Affect Recognition using DAMI-P2C Parent-child Multimodal Interaction Dataset," arXiv preprint arXiv:2008.09207, 2020.