banking and finance
internet of things
neural interfacing and control
point of care
sports and fitness
natural language processing
Advance human wellbeing by developing new ways to communicate, understand, and respond to emotion
Creating technology for social change
An exhibition at the Barbican in London features work from OpenAg, the Mediated Matter group, Joy Buolamwini, and more.
Advancing justice in Earth's complex systems using designs enabled by space
Exploring new forms of social justice through art
The Humanizing AI in Law (HAL) project aims to build the technical and legal foundations necessary to establish a due-pro...
Pretrial risk assessments draw from “deeply flawed data” generated from arrest and conviction records, say researchers.
By Chelsea Barabas, Karthik Dinakar and Colin Doyle
www.ajlunited.org
An unseen force is rising, helping to determine who is hired, granted a loan, or even how long someone spends in prison. ...
Joy Buolamwini has been selected as one of Fortune Magazine’s “40 Under 40.”
Joy Buolamwini talks to the AP about her work on algorithmic bias.
More than three dozen AI researchers have signed an open letter asking Amazon to stop selling Rekognition to law enforcement agencies.
Engaging people in creative learning experiences
Now activists are working to bring women, and feminism, back to Silicon Valley.
Tech companies working on artificial intelligence find that a diverse staff can help avoid biased algorithms that cause public embarrassment.
Jaleesa Trapp led a discussion at her alma mater about antiracist approaches to engaging young people from marginalized groups in STEM.
Joy Buolamwini’s essay is featured in the Optimists issue of Time, guest-edited by Ava DuVernay.
Algorithmic auditing has emerged as a key strategy to expose systematic biases embedded in software platforms, yet scholarship on the imp...
"We must provide a mechanism for civil society to be informed and engaged in the way in which algorithms are used."
Amazon’s system had more difficulty identifying the gender of female and darker-skinned faces than similar services from IBM and Microsoft.
Thoughts from the Media Lab's Space Enabled group and Space Exploration Initiative
The Gender Shades project pilots an intersectional approach to inclusive product testing for AI.
Algorithmic Bias Persists
Gender Shades is...