Impact of Education on Child-Robot Relationships

Daniella DiPaola

Technologies designed with personalities and social interfaces are entering our homes in the form of social robots such as Jibo and Cozmo. By emulating interpersonal interactions, social robots have great potential to help us learn, be more creative, and reduce stress. They also introduce potential harms, such as emotional manipulation for money or power. A better understanding of the nature of long-term social relationships between children and social robots can help us avoid these harms.

In artificial intelligence, transparency is one of the key tenets of ethical and responsible design. The hypothesis is that users who know how a system works may be better able to use and trust it. Educators are beginning to create materials that give students both a conceptual understanding of a system (e.g., how do sensors work?) and an applied understanding of it (e.g., program a sensor to detect when the lights are off). Children as early as Pre-K are capable of learning about and building features for social robots. However, we still do not know how social relationships between children and robots change when the inner workings of these systems become more transparent.
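As a rough illustration of the kind of applied exercise mentioned above, the sketch below (in Python) polls a light sensor and has the robot react when the room goes dark. The names read_light_sensor, say, and the brightness threshold are all hypothetical placeholders for whatever API a particular robot or kit exposes; this is a conceptual sketch, not code from the curriculum.

import time

LIGHT_THRESHOLD = 0.2  # assumed normalized brightness below which the room counts as dark


def read_light_sensor() -> float:
    # Placeholder: replace with the robot kit's real light-sensor reading (0.0 = dark, 1.0 = bright).
    return 1.0


def say(message: str) -> None:
    # Placeholder: a real robot would speak this aloud.
    print("Robot says:", message)


def watch_for_lights_off(poll_seconds: float = 1.0) -> None:
    # Check the sensor once per poll interval and respond only when the room first goes dark.
    was_dark = False
    while True:
        is_dark = read_light_sensor() < LIGHT_THRESHOLD
        if is_dark and not was_dark:
            say("Looks like the lights just went off. Good night!")
        was_dark = is_dark
        time.sleep(poll_seconds)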

First, I discuss the design of two curricula that take different approaches to educating youth (grades 4 and 5) about social robots. The Knowledge and Societal Impact curriculum teaches students about the technical and ethical topics surrounding social robots. The Programming curriculum allows students to program their own conversational skills on Jibo. These curricula represent two pedagogical approaches in the field of AI education: one focused on embedding ethics, the other on students as self-driven makers.
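To make concrete what a simple conversational skill might look like, here is a minimal Python sketch of the rule-based pattern such skills often follow: map a spoken phrase to an intent, then reply. The intents, keywords, and responses are invented for illustration; this is not the Jibo SDK or the curriculum's actual code.

# Conceptual sketch of a rule-based conversational skill.
INTENT_RESPONSES = {
    "greeting": "Hi there! I'm happy to see you.",
    "joke": "Why did the robot go back to school? To sharpen its skills!",
    "goodbye": "See you later!",
    "fallback": "Hmm, I didn't catch that. Can you say it another way?",
}

INTENT_KEYWORDS = {
    "greeting": ["hello", "hi", "hey"],
    "joke": ["joke", "funny"],
    "goodbye": ["bye", "goodbye"],
}


def classify_intent(utterance: str) -> str:
    # Naive keyword matching: return the first intent whose keyword appears in the utterance.
    lowered = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return intent
    return "fallback"


def respond(utterance: str) -> str:
    # Look up the reply for whichever intent was detected.
    return INTENT_RESPONSES[classify_intent(utterance)]


if __name__ == "__main__":
    print(respond("Tell me a joke"))  # prints the joke response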

Next, I evaluated the impact of these curricula on fourth- and fifth-grade students who simultaneously lived with a social robot in their home for two months. Students were assigned to one of four conditions: no education, Knowledge and Societal Impact only, Programming only, and both Knowledge and Societal Impact and Programming. I found that students were able to understand and engage with the curricula, and that the curricula helped them form a clearer model of what their robot was capable of. However, I found no difference in perceived emotional relationship or usage among the groups: students in all groups found the robot equally likable, anthropomorphic, intelligent, safe, and animated. Students who engaged in the Knowledge and Societal Impact curriculum, however, found their robot to be significantly less trustworthy than those in the other groups.

Overall, results from this study indicate that children will continue to treat robots as social partners regardless of the information they hold about them. However, teaching students about the societal impact of robots does seem to make them less trusting of their own robots. These results are timely and relevant given the current public discourse around social robots. Many have advocated for education as a way to prevent deception or misuse of social robots, and the findings from this study suggest that new approaches to education are needed.