Long-term Personalized Multimodal Interactions Between a Human Group and a Social Robot



Research efforts in social human-robot interaction (HRI) have mainly targeted a single person interacting with a single robot in domains such as education and healthcare. The increasing availability of social robots in people's everyday lives creates a new urgency to expand the HRI research focus from single-person contexts to multi-person interactions, such as with a small group of 2-6 people. A growing number of HRI studies have started to examine how to design a robot's interaction role (e.g., robot moderator) and social behaviors (e.g., backchanneling) in multi-person HRI (MHRI), as well as how a robot can influence the processes and dynamics of human groups (e.g., group conflict and group participation). Nevertheless, the MHRI paradigm remains largely under-explored, particularly its conceptual frameworks, design principles, and technical tools. Current single-person HRI (SHRI) theories, approaches, and technical tools cannot readily scale to human groups, nor can they sufficiently capture the fundamental increase in complexity that MHRI introduces.

Motivated by this need, our research takes a multidisciplinary approach to develop both design frameworks and computational tools for fully autonomous personalized robot companions that can engage in social interactions with two people in the long term. To achieve this research goal, we are working on multiple projects targeting different aspects of this novel personalized MHRI paradigm.

In the DAMI-P2C project, we collected a multimodal dataset of 34 parent-child dyads reading and conversing together, with the goal of examining and modeling the interpersonal dynamics of dyadic interactions. For example, we analyzed both parents' and children's individual and dyadic nonverbal behaviors in relation to four relationship characteristics, i.e., child temperament, parenting style, parenting stress, and home literacy environment, and showed the importance of accounting for both individual- and dyad-scale nonverbal behaviors when predicting dyadic relationship characteristics.

In the Triadic-Pilot project, we designed, developed, and implemented a novel parent-child-robot interaction paradigm in the context of shared reading. We then conducted a pilot triadic robot interaction study with 12 parent-child pairs. The pilot study investigates the effects of triadic reading on the human dyad's socio-affective connections and reading behaviors, and compares the effects of different robot behavior strategies. In addition, we are currently conducting qualitative and quantitative analyses on how to take a human-centered approach to designing next-generation robot companions for parent-child story time.

In the TAMI project, we proposed a novel, context-generic design framework, TAMI-HHR, for triadic adaptive multimodal interactions between a human dyad and a social robot. The presented framework builds upon existing work within the HRI field, aiming to unify and extend key concepts in MHRI. It makes the following unique contributions. First, it proposes the first generalizable MHRI design framework that integrates both robot behavior design and adaptation components while taking both group-level and individual-level design factors and considerations into account. Second, it provides step-by-step design guidelines for each component in TAMI-HHR as well as three novel and distinct MHRI case studies, scaffolding researchers and designers in developing their contextualized MHRI studies from scratch. Lastly, it presents an overview of state-of-the-art MHRI research, key challenges, and future directions for MHRI.

In the TAMI-MODEL project, we are designing both robot affective sensing models for the multi-person context and robot behavior personalization models for triadic dyad-robot interactions. In addition, we are designing new evaluation methods to analyze the robot's long-term personalization effectiveness.

The DAMI-P2C project:
ICMI (2020): “Dyadic Speech-based Affect Recognition using DAMI-P2C Parent-child Multimodal Interaction Dataset”
TAC (submitted): “Dyadic Affect in Parent-child Multi-modal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis”
FG (submitted): “Body Gesture and Head Movement Analyses in Dyadic Parent-Child Interaction as Indicators of Relationship”
The TAMI-HHR project:
UMUAI (in preparation): “TAMI-HHR: A Design Framework for Triadic Adaptive Multimodal Interactions between a Human Dyad and a Social Robot”
The Triadic-Pilot project:
Soo Jung Jang’s M.Eng Thesis: Designing Parent-Child-Robot Triadic Storybook Reading Interaction