Affectiva: In-cabin AI

Tuesday
March 30, 2021
10:00am — 11:30am ET

Workshop: In-cabin AI

Register and Add to Calendar via Zoom
(Registration Form below for those unable to use Zoom)

Co-hosted by Affectiva

Affectiva Automotive AI is the leading in-cabin sensing solution designed to understand what is happening with people in a vehicle. It uses cameras in the car to measure, in real time, the state of the driver, the occupants, and the vehicle interior (i.e., the cabin). This insight helps car manufacturers, fleet management companies, and rideshare providers improve road safety by detecting dangerous driver behavior such as drowsiness, distraction, and anger. It can also be used to create more comfortable and enjoyable transportation experiences by understanding how passengers react to their environment, including content they consume in the back of the car. In addition to understanding driver and occupant emotional and cognitive states, Affectiva Automotive AI can detect contextual cabin information such as the number of passengers, where they are sitting, and whether an object is present.

Affectiva’s technology is also used by 28 percent of Fortune Global 500 companies to test consumer engagement with ads, videos, and TV programming. Beyond media analytics and automotive, Emotion AI also has the potential to transform industries including healthcare and mental health, robotics, conversational interfaces, education, and more.

About Affectiva

Affectiva is humanizing technology, building a world where technology understands people the way we understand one another, and forever changing how humans interact with technology and with one another in a digital world. An MIT Media Lab spin-off, Affectiva created and defined the technology category of artificial emotional intelligence, or Emotion AI.

Built on deep learning, computer vision, speech science, and massive amounts of real-world data, Affectiva’s AI can understand human emotions, cognitive states, activities, and the objects people use by analyzing facial and vocal expressions. To date, Affectiva has analyzed more than 10 million faces from 90 countries, making its emotion database one of the largest data repositories of its kind.