Joy Buolamwini
Joy Buolamwini was honored for her work exposing race and gender bias in commercial artificial intelligence (AI).
An exhibition at the Barbican in London features work from OpenAg, the Mediated Matter group, Joy Buolamwini, and more.
Cristina Quinn shares what she learned during a visit to the AI + Ethics summer camp run by Blakeley Payne, a student in the Personal Robots group.
Joy Buolamwini has been selected as one of Fortune Magazine’s “40 Under 40.”
Joy Buolamwini talks to the AP about her work on algorithmic bias.
More than three dozen AI researchers have signed an open letter asking Amazon to stop selling Rekognition to law enforcement agencies.
Now activists are working to bring women, and feminism, back to Silicon Valley.
Tech companies working on artificial intelligence find that a diverse staff can help avoid biased algorithms that cause public embarrassment.
Joy Buolamwini’s essay is featured in the Optimists issue of Time, guest-edited by Ava DuVernay.
Amazon’s system had more difficulty identifying the gender of female and darker-skinned faces than similar services from IBM and Microsoft.
The technology could revolutionize policing, medicine, even agriculture—but its applications can easily be weaponized.
Joy Buolamwini identifies bias in algorithms and develops practices for accountability during design.
The 2019 Forbes 30 Under 30 list includes at least three members of the Media Lab community.
Joy Buolamwini showed that facial-recognition systems consistently assign the wrong gender to famous women of color.
"A.I. systems are shaped by the priorities and prejudices…of the people who design them, a phenomenon that I refer to as 'the coded gaze.'"
"We have to continue to check our systems, because they can fail in unexpected ways."
A case study implemented by Inioluwa Raji under the guidance of Joy Buolamwini.
All people are created equal, but in the eyes of the algorithm, not all faces are just yet. A new study from MIT and Microsoft res…
A new review of face recognition software found that, when identifying gender, the software is most accurate for men with light skin…
New research out of MIT’s Media Lab is underscoring what other experts have reported or at least suspected before: facial recognition techn…
Examination of facial-analysis software shows error rate of 0.8 percent for light-skinned men, 34.7 percent for dark-skinned women.
Real-world biases and artificial intelligence