Project Us

“Us” aims to help people develop their sense of empathy. It uses machine learning to analyze interlocutors’ signals (e.g., tone of voice, electrodermal activity and, soon, visual cues) during conversations, and feeds the emotional content back to them in real time. We have versions of the system for both in-person and online conversations.

This technologically enhanced feedback loop aims to enable each partner to “feel” the other’s emotions and corroborate them with context data, in a way that could gradually train their “empathy muscle”. Results from our experimental studies show that users experience an increased level of attention, as well as awareness of self and others.

Would you like to try it out? Register for an online demo here. Our team will contact you with access instructions.

This is a prototype of a university-based research project. We are continuing to develop the system in terms of accuracy, security, end-to-end user experience and applications. We would greatly appreciate your feedback after using the demo.

Vision

Support better conversations and relationships through empathy building.

Inspiration

Empathy—our ability to feel someone else’s emotional state while remaining aware of its origin in the other person—stands at the core of our existence as humans. It has contributed dramatically to our evolution as a species, and it remains a key driver of the way we experience life. Being empathetic can make us more effective at work and less stressed; it can improve our relationship satisfaction and give us a deeper sense of connection and attachment. Still, we sometimes find it difficult to empathise with others, and for reasons that are still poorly understood, some people face more challenges than others. Technologically enabled solutions, ranging from virtual reality (VR) to tangible avatars, have shown promise in this direction. Yet existing techniques tend to be difficult and expensive to deliver (e.g., requiring VR headsets), and they are often disconnected from daily life.

State of the Project

Us consists of two modules that can be used either separately or jointly. Our results indicate that users experience an increased level of attention and awareness of self and others with each module used on its own.

1. Virtual interface (Us.virtual) – can run during any virtual interaction (e.g., Zoom), extract the emotional valence from the conversation (speech, tone and, soon, visual cues) and discreetly feed it back through an on-screen display; a minimal illustrative sketch appears after the module descriptions. This tool has been tested in a user study with 20 participants (publication under review).

2. In-person interface (Us.eda) – a set of bracelets that measure each interlocutor’s EDA (electrodermal activity), which is correlated with their level of excitement, and give a subtle vibrating nudge when it exceeds a preset threshold (see the sketch below). In the future, the wearable solution could also be offered as an app for users’ existing Fitbit devices. This tool has been evaluated and we have published the results of a user study with 18 participants (paper).
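
To make the two modules more concrete, here is a minimal sketch of the kind of loop Us.virtual runs. It is illustrative only: the window sizes are assumptions, and capture_audio_chunk, estimate_valence and render_overlay are placeholders for the project’s actual audio capture, ML model and on-screen display.

import time
from collections import deque

AUDIO_CHUNK_SECONDS = 2.0   # analysis window (assumed value)
SMOOTHING_WINDOW = 5        # chunks averaged before display (assumed value)

def capture_audio_chunk():
    """Placeholder: grab the last few seconds of call audio."""
    return b""

def estimate_valence(audio_chunk):
    """Placeholder: the ML model mapping speech and tone to valence in [-1, 1]."""
    return 0.0

def render_overlay(valence):
    """Discreet on-screen display; here just a text gauge on the console."""
    position = int((valence + 1) / 2 * 20)         # map [-1, 1] onto 0..20
    gauge = "-" * position + "|" + "-" * (20 - position)
    print("\rnegative [" + gauge + "] positive", end="", flush=True)

def feedback_loop():
    recent = deque(maxlen=SMOOTHING_WINDOW)        # rolling window of valence scores
    while True:
        recent.append(estimate_valence(capture_audio_chunk()))
        render_overlay(sum(recent) / len(recent))  # smooth before feeding back
        time.sleep(AUDIO_CHUNK_SECONDS)

The Us.eda nudge can likewise be pictured as a simple threshold check over the EDA stream. The threshold, sampling rate and cooldown below are assumed values, and read_eda and vibrate stand in for the bracelet’s sensor and vibration motor; a cooldown between nudges is one simple way to keep the feedback subtle rather than distracting.

import random
import time

EDA_THRESHOLD_MICROSIEMENS = 8.0   # preset arousal threshold (assumed value)
COOLDOWN_SECONDS = 30.0            # minimum time between nudges (assumed value)
SAMPLE_PERIOD_SECONDS = 0.25       # 4 Hz sampling (assumed value)

def read_eda():
    """Placeholder: read one EDA sample (in microsiemens) from the bracelet."""
    return random.uniform(2.0, 12.0)

def vibrate():
    """Placeholder: trigger the bracelet's vibration motor."""
    print("nudge: vibration triggered")

def monitor():
    last_nudge = float("-inf")
    while True:
        if read_eda() > EDA_THRESHOLD_MICROSIEMENS:
            now = time.monotonic()
            if now - last_nudge > COOLDOWN_SECONDS:
                vibrate()
                last_nudge = now
        time.sleep(SAMPLE_PERIOD_SECONDS)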

Ideas were generated and advanced by a task force including Camilo Rojas (postdoctoral researcher, Fluid Interfaces); Gaurav Patekar (research assistant, Fluid Interfaces); Malena Corral (independent researcher); Eugenio Zuccarelli (MIT Sloan); Alex Chin (Wellesley undergraduate); Korina Hernandez (Wellesley undergraduate); Suze Barlow (Wellesley undergraduate); Kat Swint (Wellesley undergraduate); and Niels Poulsen (visiting student, Fluid Interfaces), along with Professor Pattie Maes (Fluid Interfaces).

Frequently Asked Questions

  1. What is your design approach?

    Project Us follows a participatory design approach, working together with end-users (e.g., psychologists, couples, leadership coaches) to understand their needs and identify the best avenues for intervention. We also inform our work through collaboration with clinical psychologists who have extensive expertise in therapy and biofeedback, as well as with neuroscientists and specialists in affective computing who help us navigate the complex technological challenges.

  2. What does emotional arousal mean?

    Human emotions can be conceptualised using a two-dimensional model defined by emotional arousal and valence. Arousal describes the intensity of an emotional state, while valence describes the extent to which it is positive or negative; anger and excitement, for example, are both high-arousal emotions but differ in valence (a small illustrative example appears after this FAQ). More information can be found here and here.

  3. How can I contribute?

    We deeply value feedback from people who are interested in exploring our devices. In the future, we will recruit participants for research studies related to our projects. If you provide your contact information here, we will reach out when we are recruiting for a study that might be of interest to you.
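
For readers curious about the arousal/valence model from question 2, here is a small illustrative example. The coordinates are rough, commonly cited placements of a few emotions on a -1 to 1 scale, not values used by Project Us.

EMOTION_COORDINATES = {
    #          (valence, arousal)
    "excited": ( 0.7,  0.7),   # positive and intense
    "calm":    ( 0.6, -0.6),   # positive, low intensity
    "angry":   (-0.7,  0.8),   # negative and intense
    "bored":   (-0.4, -0.7),   # negative, low intensity
}

for emotion, (valence, arousal) in EMOTION_COORDINATES.items():
    tone = "positive" if valence > 0 else "negative"
    intensity = "high" if arousal > 0 else "low"
    print(emotion + ":", tone, "valence,", intensity, "arousal")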