Can AI Learn to Understand Emotions?


Growing up in Egypt in the 1980s, Rana el Kaliouby was fascinated by hidden languages—the rapid-fire blinks of 1s and 0s computers use to transform electricity into commands and the infinitely more complicated nonverbal cues that teenagers use to transmit volumes of hormone-laden information to each other.

Culture and social stigma discouraged girls like el Kaliouby in the Middle East from hacking either code, but she wasn’t deterred. When her father brought home an Atari video game console and challenged the three el Kaliouby sisters to figure out how it worked, Rana gleefully did. When she wasn’t allowed to date, el Kaliouby studied her peers the same way that she did the Atari.

“I was always the first one to say ‘Oh, he has a crush on her’ because of all of the gestures and the eye contact,” she says.

Following in the footsteps of her parents, both computer scientists, el Kaliouby knew that her knowledge of programming languages would be a foundational skill for her career. But it wasn’t until graduate school that she discovered that her interest in decoding human behavior would be equally important. In 1998, while looking for a topic for her master’s thesis at the American University in Cairo, el Kaliouby stumbled upon a book by MIT researcher Rosalind Picard. It argued that, since emotions play a large role in human decision-making, machines would require emotional intelligence if they were to truly understand human needs. El Kaliouby was captivated by the idea that feelings could be measured, analyzed, and used to design systems that can genuinely connect with people. The book, called Affective Computing, would change her career. So would its author.

Today, el Kaliouby is the CEO of Affectiva, a company that’s building the type of emotionally intelligent AI systems Picard envisioned two decades ago. Affectiva’s software measures a user’s emotional response through algorithms that identify key facial landmarks and analyze the pixels in the regions around those landmarks to classify facial expressions. Combinations of those facial expressions are then mapped to any of seven different emotions as well as some complex cognitive states, such as drowsiness and distraction. Separate algorithms also analyze voice patterns and inflections.
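The paragraph above describes a two-stage pipeline: one set of algorithms scores individual facial expressions from landmark regions, and a second step maps combinations of those scores to an emotion or cognitive state. The minimal Python sketch below illustrates only that second mapping step; the ExpressionScores fields, labels, and thresholds are invented for illustration and are not Affectiva’s actual categories or values.

```python
from dataclasses import dataclass


@dataclass
class ExpressionScores:
    """Per-frame intensities (0.0 to 1.0) for a few facial expressions,
    as an upstream landmark-based classifier might emit them.
    (Hypothetical fields, not Affectiva's real output.)"""
    smile: float
    brow_furrow: float
    eye_closure: float
    lip_corner_depressor: float


def map_to_state(scores: ExpressionScores) -> str:
    """Map a combination of expression intensities to a coarse label.
    All thresholds here are made up for illustration only."""
    if scores.eye_closure > 0.7:
        return "drowsiness"  # a complex cognitive state, not a basic emotion
    if scores.smile > 0.5 and scores.brow_furrow < 0.2:
        return "joy"
    if scores.brow_furrow > 0.5 and scores.lip_corner_depressor > 0.4:
        return "sadness"
    if scores.brow_furrow > 0.5:
        return "anger"
    return "neutral"


# Example: a frame with a strong smile and relaxed brows reads as joy.
frame = ExpressionScores(smile=0.8, brow_furrow=0.1,
                         eye_closure=0.05, lip_corner_depressor=0.0)
print(map_to_state(frame))  # -> joy
```

In a production system the hand-written thresholds would be replaced by trained classifiers, but the shape of the computation matches the description: expression scores in, an emotion or cognitive-state label out.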
