
PopBots: A Hands-on STEAM platform for the AI generation

Randi Williams

My first programming experience was in 10th grade, when I took my school's introduction to computer science class. I remember working on an assignment where we had to make a ball bounce off the edges of the screen. When I finally got it working, I stared at the ball bouncing around the screen getting increasingly excited. There were so many things I could do to improve the project: make it bounce in unexpected directions, change colors, or even multiply. That is the power of learning to program—you learn to use technology to solve problems that are meaningful to you.

Almost a decade later, computer science is still my favorite problem-solving tool, and I find technology especially powerful for addressing issues of access and empowerment in underserved communities. This is why children's computational thinking platforms are incredibly important: they teach children the skills they need to change the world.

But existing platforms focus on programming, and I think we must go further. Children are not just growing up with computers and video games anymore; they are growing up in the era of Artificial Intelligence (AI). Today's researchers and engineers are therefore uniquely positioned to empower children with this exciting new technology. My dream is to be at the forefront of this change, creating technology that promotes creativity, expression, and equitable access to AI education. My two main research goals are: first, to empower children to understand and create with AI; and second, to understand what children think about intelligent devices and how their understanding evolves as they learn how these devices work.

In doing so, I also get the chance to tackle tough ethical questions about children's relationships with robots. The first questions I get when I tell people that I build robots to help children learn are, "Are you trying to take teachers' jobs?" and "Why are you replacing parents with robots?" When parents see their children spending hours talking to a smart speaker and calling it their best friend, does this mean the children are choosing robots over people? My group, the Personal Robots Group, has investigated this question and found that young children, whose socioemotional skills are still developing, may talk to AI devices like people, but inside they see them as something else, somewhere between a pet and a friend (link, link). I've been investigating this relationship since the start of my grad school career and continue to ask how this new technology might change the way children relate to their world.

What are PopBots?

PopBots are programmable, intelligent social robots that play with children to help them learn about AI. The full system includes a mobile-phone-based social robot learning companion, LEGO blocks and LEGO/Arduino peripherals, and a tablet or computer for a programming interface. Around this system we developed AI activities that let children implement their own algorithms.

What really makes PopBots unique is that they were designed to make robotics education creative, hands-on, and low-cost. All of the activities are centered around games and art, and the playful robot character makes the experience particularly engaging for young children.

For example, one activity helps children learn about supervised machine learning by teaching a robot to classify foods. It starts with a question: How would you teach a robot (or computer) to eat healthy food? Children usually first suggest that we tell the robot about all of the different foods and whether each one is healthy or not, like making a list. But children quickly realize that teaching a robot about every single food would take a long time and be a lot of work. It is much better if the robot can learn a pattern for separating healthy and unhealthy foods. This is how we introduce the idea of supervised machine learning: we can provide labeled examples and then use those examples to make better guesses. So children start looking for patterns: fruits and vegetables are healthy, and things with a lot of sugar are unhealthy. They teach these patterns to the robot by dragging examples to different parts of the screen to label them. Along the way, they can ask the robot to guess whether another unlabeled food is healthy or unhealthy. Soon, with just a few examples, the robot can guess quite well whether any food it encounters is healthy or unhealthy.
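For readers curious about what is happening under the hood, here is a minimal sketch of the supervised-learning idea behind the activity. This is not the actual PopBots implementation; the food features (sugar and fiber on a 0-10 scale) and their values are invented for illustration, and the learner is a simple nearest-neighbor guesser.

```python
# A toy version of the food activity: the "robot" learns from a few
# labeled examples, then guesses the label of a new food by copying
# the label of the most similar known example (1-nearest-neighbor).

def distance(a, b):
    """Squared distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def teach_and_guess(examples, new_food):
    """Return the label of the labeled example closest to new_food."""
    features, label = min(examples, key=lambda ex: distance(ex[0], new_food))
    return label

# Labeled examples, like the ones a child "drags" onto the screen.
# Each food is (sugar, fiber) on a made-up 0-10 scale.
examples = [
    ((1, 8), "healthy"),    # broccoli: low sugar, high fiber
    ((2, 6), "healthy"),    # apple
    ((9, 1), "unhealthy"),  # candy: high sugar, low fiber
    ((8, 2), "unhealthy"),  # soda
]

print(teach_and_guess(examples, (2, 7)))  # carrot-like food -> "healthy"
print(teach_and_guess(examples, (9, 0)))  # lollipop-like food -> "unhealthy"
```

The robot never sees a carrot or a lollipop during teaching; it generalizes from the pattern in the handful of examples it was given, which is exactly the insight the activity aims at.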

Children are usually pretty impressed that a robot can learn about five foods and figure out 30 other foods, but what really hits home is when we talk about how to confuse the robot. What if we taught the robot that unhealthy foods were really healthy and vice versa? What if we only taught it that red foods were healthy and yellow ones were unhealthy and nothing about blue foods? The robot wouldn’t know any better and it would make a lot of mistakes. Even very young children connect with the idea that AI depends a lot on what people tell it. So if an AI makes a lot of mistakes, like if Alexa has a hard time answering questions, then it is probably because people have not taught it to do something very well. These are ideas that everyone needs to understand about AI, so it’s exciting to see children grappling with them early through the metaphor of a robot sorting food.
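The "confuse the robot" discussion can also be sketched in code. This is an illustrative toy, not the PopBots system: foods are reduced to a single made-up color value (red=0, yellow=5, blue=10), and the learner copies the label of the closest training color.

```python
# A toy demonstration of how biased or flipped training data leads a
# simple learner astray. The learner guesses by copying the label of
# the nearest training color.

def nearest_label(examples, color):
    """Return the label of the training example closest to color."""
    return min(examples, key=lambda ex: abs(ex[0] - color))[1]

# Biased teaching: only red foods labeled healthy, yellow unhealthy,
# and nothing at all about blue foods.
biased = [(0, "healthy"), (5, "unhealthy")]

# A blue food (10) is closest to yellow, so the robot calls it
# "unhealthy", not because of anything about the food, but because
# of what it was (and wasn't) taught.
print(nearest_label(biased, 10))  # -> "unhealthy"

# Flipped teaching: swap every label, and the robot confidently
# gives exactly the wrong answers.
flipped = [(0, "unhealthy"), (5, "healthy")]
print(nearest_label(flipped, 0))  # red food -> "unhealthy"
```

In both cases the algorithm works exactly as designed; the mistakes come entirely from the training data, which is the point children take away from the activity.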

Broadening the reach of AI education

We envision that this platform will help change the conversation about who can do AI by making AI education more widely available. I want AI education to be more art-focused than typical CS education. Most robotics kits focus on solving mazes and obvious constructions like cars, leaving out children who may be more interested in making artifacts and designing characters. I'm also designing this platform to push back against the thousands of computer- and tablet-based educational apps that have flooded the market in recent years. Although screen-only AI activities would be easier to build and deploy, hands-on learning is an important part of children's experiences; working with a physical artifact allows children to construct their knowledge together. In fact, in my studies, working together on the activities helped children understand AI concepts better.

Opening the AI “black box”

The punchline is that young children (ages 4-6) are fully equipped to learn about AI algorithms from systems like PopBots. Developmental factors like age and perspective-taking skills sometimes made a difference in what children understood, but the biggest differences appeared in how different children used the toolkit. Children who explored the activities more thoroughly did better on tricky questions that other children usually got wrong. This suggests that a future version of the system should be more personalized, helping each child explore new ideas as they go along.

In terms of children’s perceptions of AI, we saw that children developed an understanding of robots as “learning beings.” After teaching the robot themselves, they saw the robot as an object of dual nature: something that was alive, yet a machine; and something that had a mind, but no independent motivations.

Children need a lot more exposure to engineering and technology in the early classroom. For most of the children, this was their first formal technology education, although every child had prior experience with robots and interactive virtual agents like Siri, Alexa, and Google. Only a handful of the children I worked with knew what an engineer was, and even those who had some idea did not identify themselves as engineers. Yet preschool and kindergarten children are arguably the best engineers in any school: their classrooms are full of little projects and construction sets. Compare this with science: almost all of the children wanted to be scientists, because preschool curricula expose children to science and positively reinforce the field. The same needs to be done for engineering.

What PopBots does differently

The coolest thing about this project is how it builds on what's been done before. PopBots leverage the social robot technology found in Tega. Tega is a powerful platform that the Personal Robots Group has used to teach children literacy skills, second languages, socioemotional skills, and even programming. We have seen over and over how a social robot learning companion helps children learn through interactive play.

PopBots also build on existing computational thinking platforms like Scratch and ScratchJr., which made it possible for children all over the world to embrace programming creatively. We also learned a lot from AI curricula designed for students in high school and above. Over the past two years we have been excited to see AI education gain traction, but very little exists for younger children. PopBots uniquely empower children to learn about AI before they can even read or have learned any complex math. They use a social robot as a window into the machine, allowing children to explore AI through social interaction.

We believe this unique approach is important because it comes at computer science topics from a completely different angle than programming. Even people with no programming or technical background can benefit from the metaphors we present for understanding the AI algorithms that are all around them.

This spring, we are training local teachers to deliver this curriculum in their classrooms. One of the most difficult problems in tech education, besides funding, is equipping teachers to deliver the material. We are very excited to take this next step.

If you are interested in ways to explore AI with your child today then check out this list of resources compiled by Blakeley Hoffman.

I would like to acknowledge my advisor, Cynthia Breazeal, my labmates in the Personal Robots Group, and my professors and role models who contributed to this work including Marina Bers and Paul Harris. Without their knowledge, creativity, support, advice, and encouragement PopBots would not exist. Thank you!
