Personal Robots
Building socially engaging robots and interactive technologies to help people live healthier lives, connect with others, and learn better.
Robots are an intriguing technology that can straddle both the physical and social world of people. Inspired by animal and human behavior, our goal is to build capable robotic creatures with a "living" presence, and to gain a better understanding of how humans will interact with this new kind of technology. People will physically interact with them, communicate with them, understand them, and teach them, all in familiar human terms. Ultimately, such robots will possess the social savvy, physical adeptness, and everyday common sense to partake in people's daily lives in useful and rewarding ways.

Research Projects

  • AIDA: Affective Intelligent Driving Agent

    Cynthia Breazeal and Kenton Williams

    Drivers spend a significant amount of time multi-tasking while they are behind the wheel. These dangerous behaviors, particularly texting while driving, can lead to distractions and ultimately to accidents. Many in-car interfaces designed to address this issue still neither take a proactive role to assist the driver nor leverage aspects of the driver's daily life to make the driving experience more seamless. In collaboration with Volkswagen/Audi and the SENSEable City Lab, we are developing AIDA (Affective Intelligent Driving Agent), a robotic driver-vehicle interface that acts as a sociable partner. AIDA exhibits facial expressions and strong non-verbal cues to engage the driver in social interaction. AIDA also uses the driver's mobile device as its face, which promotes safety, offers proactive driver support, and fosters deeper personalization for the driver.
  • Animal-Robot Interaction

    Brad Knox, Patrick McCabe, and Cynthia Breazeal

    Like people, dogs and cats live among technologies that affect their lives. Yet little of this technology has been designed with pets in mind. We are developing systems that interact intelligently with animals to entertain, exercise, and empower them. Currently, we are developing a laser-chasing game in which dogs or cats are tracked by a ceiling-mounted webcam, and a computer-controlled laser moves with knowledge of the pet's position and movement. Machine learning will be applied to optimize the laser's movement strategy for each pet. We envision enabling owners to initiate and view the interaction remotely through a web interface, providing stimulation and exercise to pets when the owners are at work or otherwise cannot be present.
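
    A minimal sketch of the intended control loop, with invented names and constants (the vision step is stubbed out; the real system would drive the laser's mirrors from real webcam tracking):

    ```python
    # Hypothetical sketch of the laser-chasing loop: a ceiling camera tracks
    # the pet, and the dot is kept a tempting distance ahead of it. All names
    # and constants are illustrative, not the project's actual code.
    import math
    import random

    LEAD_DISTANCE = 0.5   # metres: keep the dot just out of reach
    JITTER = 0.1          # metres: small random motion so the dot looks alive

    def track_pet(frame):
        """Vision stub: return the pet's (x, y) floor position.
        A real system would segment the pet in the overhead webcam image."""
        return frame["pet_x"], frame["pet_y"]

    def next_laser_target(pet_xy, pet_velocity):
        """Place the dot ahead of the pet's heading, plus a little jitter."""
        px, py = pet_xy
        vx, vy = pet_velocity
        speed = math.hypot(vx, vy) or 1.0
        x = px + LEAD_DISTANCE * vx / speed + random.uniform(-JITTER, JITTER)
        y = py + LEAD_DISTANCE * vy / speed + random.uniform(-JITTER, JITTER)
        return x, y

    # One tick of the game loop with a fake camera frame:
    frame = {"pet_x": 2.0, "pet_y": 1.5}
    print("steer laser to", next_laser_target(track_pet(frame), (0.3, 0.0)))
    ```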

  • Cloud-HRI

    Cynthia Breazeal, Nicholas DePalma, Adam Setapen and Sonia Chernova

    Imagine opening your eyes and being awake for only half an hour at a time. This is the life that robots traditionally live, due to factors such as battery life and wear on prototype joints. Roboticists have typically muddled through this challenge by crafting handmade perception and planning models of the world, or by using machine learning with synthetic and real-world data. Cloud-based robotics instead aims to marry large distributed systems with machine learning techniques to build robots that interpret the world in a richer way. This movement aims to build large-scale machine learning algorithms that use experiences from large groups of people, whether sourced from a large number of tabletop robots or a large number of experiences with virtual agents. Large-scale robotics aims to change embodied AI just as large-scale data and computation changed non-embodied AI.
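
    The core idea can be sketched in a few lines (a hypothetical illustration, not the actual system): many robots report experiences to one shared pool, and every robot benefits from the aggregate:

    ```python
    # Illustrative sketch of the cloud-robotics idea: many embodied clients
    # contribute experience tuples to one shared model. Names are invented.
    from collections import defaultdict

    class SharedExperiencePool:
        """Stands in for a cloud service aggregating data across robots."""
        def __init__(self):
            self.scores = defaultdict(lambda: defaultdict(int))

        def report(self, robot_id, situation, action, success):
            self.scores[situation][action] += 1 if success else -1

        def best_action(self, situation):
            actions = self.scores[situation]
            return max(actions, key=actions.get) if actions else None

    pool = SharedExperiencePool()
    # Two different robots contribute experience with the same situation:
    pool.report("tabletop-01", "child_distracted", "wave", success=True)
    pool.report("tabletop-07", "child_distracted", "speak", success=False)
    print(pool.best_action("child_distracted"))  # -> "wave"
    ```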

  • Collaborative Robot Storyteller

    Cynthia Breazeal, Hae Won Park, Jacqueline M. Kory, Mirko Gelsomini, Goren Gordon (Tel Aviv), Stephanie Gottwald (Tufts), and Susan Engel (Williams College)

    Can robots collaboratively exchange stories with children and improve their language and storytelling skills? With our latest Tega robot platform, we aim to develop a deep personalization algorithm based on long-term interaction with an individual user. Through robot interaction, we collect a corpus of each child's linguistic, narrative, and concept-skill information, and we develop the robot's AI to generate stories and behaviors personalized to each child's developmental level and engagement factors, including affective state.
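
    As a hedged illustration of level-matched story selection (titles and difficulty scores are invented; the actual system is a deep personalization model, not this heuristic):

    ```python
    # Hypothetical sketch: prefer stories slightly above the child's
    # estimated skill, in the spirit of keeping the child gently stretched.
    def pick_story(stories, child_skill, stretch=0.1):
        """Return the story whose difficulty best matches skill + stretch."""
        target = child_skill + stretch
        return min(stories, key=lambda s: abs(s["difficulty"] - target))

    stories = [
        {"title": "The Red Boat", "difficulty": 0.3},
        {"title": "A Trip to the Moon", "difficulty": 0.55},
        {"title": "The Clockmaker", "difficulty": 0.8},
    ]
    # A child estimated at 0.5 gets the 0.55-level story:
    print(pick_story(stories, child_skill=0.5)["title"])
    ```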

  • DragonBot: Android Phone Robots for Long-Term HRI

    Adam Setapen, Natalie Freed, and Cynthia Breazeal

    DragonBot is a new platform built to support long-term interactions between children and robots. The robot runs entirely on an Android cell phone, which displays an animated virtual face. Additionally, the phone provides sensory input (camera and microphone) and fully controls the actuation of the robot (motors and speakers). Most importantly, the phone always has an Internet connection, so a robot can harness cloud-computing paradigms to learn from the collective interactions of multiple robots. To support long-term interactions, DragonBot is a "blended-reality" character: if you remove the phone from the robot, a virtual avatar appears on the screen and the user can still interact with the virtual character on the go. Costing less than $1,000, DragonBot was specifically designed to be a low-cost platform that can support longitudinal human-robot interactions "in the wild."
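
    A toy sketch of the "blended reality" idea, with invented class and method names: the same character state renders through the robot body when the phone is docked, or through a full-screen avatar when it is removed:

    ```python
    # Hypothetical sketch; not DragonBot's actual software.
    class BlendedCharacter:
        def __init__(self, name):
            self.name = name
            self.expression = "curious"

        def render(self, phone_docked):
            """One character state, two embodiments."""
            if phone_docked:
                return f"{self.name}: animate face + drive motors ({self.expression})"
            return f"{self.name}: full-screen virtual avatar ({self.expression})"

    dragon = BlendedCharacter("dragonbot-01")
    print(dragon.render(phone_docked=True))   # embodied robot mode
    print(dragon.render(phone_docked=False))  # on-the-go virtual mode
    ```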

  • Global Literacy Tablets

    Cynthia Breazeal, David Nunez, Tinsley Galyean, Maryanne Wolf (Tufts), and Robin Morris (GSU)

    We are developing a system of early literacy apps, games, toys, and robots that will triage how children are learning, diagnose literacy deficits, and deploy dosages of content to encourage app play using a mentoring algorithm that recommends an appropriate activity given a child's progress. Currently, over 200 Android-based tablets have been sent to children around the world; these devices are instrumented to provide a very detailed picture of how kids are using these technologies. We are using this big data to discover usage and learning models that will inform future educational development.
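
    A toy sketch of what such a mentoring policy could look like (skill names, mastery estimates, and app titles here are invented for illustration):

    ```python
    # Hypothetical sketch of a mentoring policy: given per-skill mastery
    # estimates derived from tablet logs, recommend an activity that
    # targets the child's weakest skill.
    def recommend_activity(mastery, catalog):
        """mastery: skill -> estimate in [0, 1]; catalog: skill -> apps."""
        weakest = min(mastery, key=mastery.get)
        return catalog[weakest][0], weakest

    mastery = {"letter_sounds": 0.35, "sight_words": 0.7, "blending": 0.5}
    catalog = {
        "letter_sounds": ["PhonicsPop"],
        "sight_words": ["WordFlash"],
        "blending": ["SoundSlide"],
    }
    app, skill = recommend_activity(mastery, catalog)
    print(f"suggest {app} to practice {skill}")
    ```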

  • Huggable: A Social Robot for Pediatric Care

    Boston Children's Hospital, Northeastern University, Cynthia Breazeal, Sooyeon Jeong, Fardad Faridi, and Jetta Company

    Children and their parents may undergo challenging experiences when admitted for inpatient care at pediatric hospitals. While most hospitals make efforts to provide socio-emotional support for patients and their families during care, gaps still exist between human resource supply and demand. The Huggable project aims to close this gap by creating a social robot able to mitigate stress, anxiety, and pain in pediatric patients by engaging them in playful interactions. In collaboration with Boston Children's Hospital and Northeastern University, we are currently running an experimental study to compare the effects of the Huggable robot with those of a virtual character on a screen and a plush teddy bear. Our preliminary results show that children are more eager to emotionally connect with and be physically activated by a robot than by a virtual character, illustrating the potential of social robots to provide socio-emotional support during inpatient pediatric care.

  • Interactive Journaling

    Cynthia Breazeal, Sooyeon Jeong, and LG Electronics

    We are creating a mobile application that adapts traditional expressive-writing therapy to the framework of mobile phone technologies. Instead of writing on paper or typing on a keyboard, users will verbally express themselves and their daily experiences to a virtual agent on their smartphone. The virtual agent will prompt the journaling activity with positive psychology interventions based on the user's affective state, and will continuously learn the user's preferences during interactions. We hypothesize that users will gain greater psychological well-being through these interactive journaling activities.
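
    A simplified, hypothetical sketch of the affect-to-intervention mapping with preference learning (prompts, affect labels, and weight updates are invented; the agent's real models are richer):

    ```python
    # Toy sketch: map detected affect to a positive-psychology prompt, and
    # nudge weights toward prompts the user engages with.
    import random

    PROMPTS = {
        "stressed": ["What is one thing you can let go of today?"],
        "neutral":  ["What are three things you are grateful for?"],
        "happy":    ["What made today go well? How can you repeat it?"],
    }
    weights = {affect: [1.0] * len(ps) for affect, ps in PROMPTS.items()}

    def choose_prompt(affect):
        return random.choices(PROMPTS[affect], weights=weights[affect])[0]

    def feedback(affect, prompt, engaged):
        """Reinforce prompts the user responds to; decay the rest."""
        i = PROMPTS[affect].index(prompt)
        weights[affect][i] *= 1.2 if engaged else 0.8

    p = choose_prompt("stressed")
    feedback("stressed", p, engaged=True)
    print(p)
    ```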

  • Mind-Theoretic Planning for Robots

    Cynthia Breazeal and Sigurdur Orn Adalgeirsson

    Mind-Theoretic Planning (MTP) is a technique for robots to plan in social domains. This system takes into account probability distributions over the initial beliefs and goals of people in the environment that are relevant to the task, and creates a prediction of how they will rationally act on their beliefs to achieve their goals. The MTP system then proceeds to create an action plan for the robot that simultaneously takes advantage of the effects of anticipated actions of others and also avoids interfering with them.
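
    The core computation can be illustrated with a toy example (all names, goals, and utilities invented; the actual MTP system plans over far richer belief and action spaces):

    ```python
    # Hypothetical sketch: average over hypotheses about a person's goal,
    # predict their rational action under each, then score robot actions
    # for helping with (and not blocking) the anticipated action.
    def predict_person_action(goal):
        """Assume the person acts rationally toward their goal."""
        return f"reach_for_{goal}"

    def utility(robot_action, person_action):
        # Helping with the anticipated action is good; blocking it is bad.
        if robot_action == "hand_over_" + person_action.split("_")[-1]:
            return 1.0
        if robot_action == "occupy_workspace":
            return -1.0
        return 0.0

    def plan(robot_actions, goal_distribution):
        """Pick the robot action with highest expected utility over goals."""
        def expected(a):
            return sum(p * utility(a, predict_person_action(g))
                       for g, p in goal_distribution.items())
        return max(robot_actions, key=expected)

    goals = {"cup": 0.7, "pen": 0.3}   # belief over the person's goal
    actions = ["hand_over_cup", "hand_over_pen", "occupy_workspace"]
    print(plan(actions, goals))  # -> "hand_over_cup"
    ```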

  • Robot Learning from Human-Generated Rewards

    Brad Knox, Robert Radway, Tom Walsh, and Cynthia Breazeal

    To serve us well, robots and other agents must understand our needs and how to fulfill them. To that end, our research develops robots that empower humans by interactively learning from them. Interactive learning methods enable technically unskilled end-users to designate correct behavior and communicate their task knowledge to improve a robot's task performance. This research on interactive learning focuses on algorithms that facilitate teaching by signals of approval and disapproval from a live human trainer. We operationalize these feedback signals as numeric rewards within the machine-learning framework of reinforcement learning. In comparison to the complementary form of teaching by demonstration, this feedback-based teaching may require less task expertise and place less cognitive load on the trainer. Envisioned applications include human-robot collaboration and assistive robotic devices for handicapped users, such as myoelectrically controlled prosthetics.
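
    A minimal sketch in the spirit of this approach (not the group's actual algorithm): treat approval and disapproval as numeric rewards and update action-value estimates online:

    ```python
    # Toy sketch of learning from human approval signals: button presses
    # become +1/-1 rewards that shape action-value estimates.
    from collections import defaultdict
    import random

    ALPHA = 0.3      # learning rate
    EPSILON = 0.1    # exploration rate
    q = defaultdict(float)   # (state, action) -> estimated human reward

    def act(state, actions):
        """Mostly exploit the trainer's feedback; occasionally explore."""
        if random.random() < EPSILON:
            return random.choice(actions)
        return max(actions, key=lambda a: q[(state, a)])

    def human_feedback(state, action, reward):
        """reward: +1 for approval, -1 for disapproval."""
        q[(state, action)] += ALPHA * (reward - q[(state, action)])

    # The trainer approves waving when the child looks over:
    human_feedback("child_looking", "wave", +1)
    human_feedback("child_looking", "ignore", -1)
    print(act("child_looking", ["wave", "ignore"]))  # usually -> "wave"
    ```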

  • Robot Mindset and Curiosity

    Cynthia Breazeal, Hae Won Park and Goren Gordon (Tel Aviv)

    A growth mindset and curiosity have significant impact on children's academic and social achievements. We are developing and evaluating a novel expressive cognitive-affective architecture that synergistically integrates models of curiosity, understanding of mindsets, and expressive social behaviors to advance the state of the art of robot companions. In doing so, we aim to contribute major advancements in the design of AI algorithms for artificial curiosity, artificial mindset, and their verbal and non-verbal expressiveness in a social robot companion for children. In our longitudinal study, we aim to evaluate the robot companion's ability to sustain engagement and promote children's curiosity and growth mindset for improved learning outcomes in an educational play context.
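
    One common way to operationalize artificial curiosity, shown here as a toy example rather than the project's architecture, is to reward prediction error so the robot seeks experiences it cannot yet predict:

    ```python
    # Hypothetical sketch of a prediction-error curiosity signal.
    class CuriousLearner:
        def __init__(self, rate=0.5):
            self.predictions = {}   # event -> predicted outcome
            self.rate = rate

        def curiosity_reward(self, event, outcome):
            predicted = self.predictions.get(event, 0.0)
            error = abs(outcome - predicted)
            # Move the prediction toward what actually happened.
            self.predictions[event] = predicted + self.rate * (outcome - predicted)
            return error   # high surprise -> high intrinsic reward

    learner = CuriousLearner()
    print(learner.curiosity_reward("new_toy", 1.0))   # 1.0: very surprising
    print(learner.curiosity_reward("new_toy", 1.0))   # 0.5: familiar now
    ```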

  • Robotic Language Learning Companions

    Cynthia Breazeal, Jacqueline Kory Westlund, Sooyeon Jeong, Paul Harris, Dave DeSteno, and Leah Dickens

    Young children learn language not through listening alone, but through active communication with a social actor. Cultural immersion and context are also key to long-term language development. We are developing robotic conversational partners and hybrid physical/digital environments for language learning. For example, the robot Sophie helped young children learn French through a food-sharing game. The game was situated on a digital tablet embedded in a café table. Sophie modeled how to order food, and as the child practiced the new vocabulary, the food was delivered as digital assets on the table's surface. A teacher or parent can observe and shape the interaction remotely via a digital tablet interface, adjusting the robot's conversation and behavior to support the learner. More recently, we have been examining how social nonverbal behaviors impact children's perceptions of the robot as an informant and social companion.

  • Robotic Learning Companions

    Cynthia Breazeal, Jacqueline Kory Westlund, and Samuel Spaulding

    The language and literacy skills of young children entering school are highly predictive of their long-term academic success. Children from low-income families are particularly at risk. Parents often work multiple jobs, giving them less time to talk to and read with their children. Parents might be illiterate or not speak the language taught in local schools, and they may not have been read to as children, providing less experience of good co-reading practice to draw upon. We are currently developing a robotic reading companion for young children, trained by interactive demonstrations from parents and/or educational experts. We intend for this robot to complement parental interaction and emulate some of their best practices in co-reading, building language and literacy through asking comprehension questions, prompting exploration, and simply being emotionally involved in the child's reading experience.

  • SHARE: Understanding and Manipulating Attention Using Social Robots

    Cynthia Breazeal and Nick DePalma

    SHARE is a robotic cognitive architecture focused on manipulating and understanding the phenomenon of shared attention during interaction. SHARE incorporates new findings and research in the understanding of nonverbal referential gesture, visual attention system research, and interaction science. SHARE's research incorporates new measurement devices, advanced artificial neural circuits, and a robot that makes its own decisions.

  • Socially Assistive Robotics: An NSF Expedition in Computing

    Tufts University, Cynthia Breazeal, Edith Ackermann, Goren Gordon, Michal Gordon, Sooyeon Jeong, Jacqueline Kory, Jin Joo Lee, Luke Plummer, Samuel Spaulding, Kasia Hayden (Stanford University), University of Southern California, Willow Garage, and Yale University

    Our mission is to develop the computational techniques that will enable the design, implementation, and evaluation of "relational" robots, in order to encourage social, emotional, and cognitive growth in children, including those with social or cognitive deficits. Funding for the project comes from the NSF Expeditions in Computing program. This expedition has the potential to substantially impact the effectiveness of education and healthcare, and to enhance the lives of children and other groups that require specialized support and intervention. In particular, the MIT effort is focusing on developing second-language learning companions for pre-school aged children, ultimately for ESL (English as a Second Language).

  • Tega: A New Robot Platform for Long-Term Interaction

    Cooper Perkins Inc., Fardad Faridi, Cynthia Breazeal, Jin Joo Lee, Luke Plummer, IFRobots, and Stacey Dyer

    Tega is a new robot platform for long-term interactions with children. The robot leverages smart phones to graphically display facial expressions. Smart phones are also used for computational needs, including behavioral control, sensor processing, and motor control to drive its five degrees of freedom. To withstand long-term continual use, we have designed an efficient battery-powered system that can potentially run for up to six hours before needing to be charged. We also designed for more robust and reliable actuator movements so that the robot can express consistent and expressive behaviors over long periods of time. Through its small size and furry exterior, the robot is aesthetically designed for children. We aim to field test the robot's ability to work reliably in out-of-lab environments and engage young children in educational activities.

  • TinkRBook: Reinventing the Reading Primer

    Cynthia Breazeal, Angela Chang, and David Nunez

    TinkRBook is a storytelling system that introduces a new concept of reading, called textual tinkerability. Textual tinkerability uses storytelling gestures to expose the text-concept relationships within a scene. Tinkerability prompts readers to become more physically active and expressive as they explore concepts in reading together. TinkRBooks are interactive storybooks that prompt interactivity in a subtle way, enhancing communication between parents and children during shared picture-book reading. TinkRBooks encourage positive reading behaviors in emergent literacy: parents act out the story to control the words onscreen, demonstrating print referencing and dialogic questioning techniques. Young children actively explore the abstract relationship between printed words and their meanings, even before this relationship is properly understood. By making story elements alterable within a narrative, readers can learn to read by playing with how word choices impact the storytelling experience. Recently, this research has been applied in developing countries.
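
    A toy illustration of textual tinkerability (vocabulary and asset names invented for the example): swapping a word in the sentence swaps the matching scene element, making the word-meaning relationship visible:

    ```python
    # Hypothetical sketch: each tinkerable word binds to a scene asset, so
    # changing the word changes what the child sees.
    SCENE_ASSETS = {"duck": "duck.png", "boat": "boat.png"}
    COLORS = {"yellow": "#ffd700", "blue": "#1e90ff"}

    def render(subject, color):
        """Return the sentence plus the scene it should display."""
        sentence = f"The {color} {subject} floats on the water."
        return sentence, SCENE_ASSETS[subject], COLORS[color]

    print(render("duck", "yellow"))   # tap "duck" and swap it for "boat":
    print(render("boat", "blue"))     # sentence and scene change together
    ```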