Joy Buolamwini identifies bias in algorithms and develops practices for accountability during design.
The 2019 Forbes 30 Under 30 list includes at least three members of the Media Lab community.
Joy Buolamwini showed that facial-recognition systems consistently misgendered famous women of color.
"A.I. systems are shaped by the priorities and prejudices…of the people who design them, a phenomenon that I refer to as 'the coded gaze.'"
"We have to continue to check our systems, because they can fail in unexpected ways."
A case study implemented by Inioluwa Raji under the guidance of Joy Buolamwini
All people are created equal, but in the eyes of the algorithm, not all faces are just yet. A new study from MIT and Microsoft r...
A new review of face recognition software found that, when identifying gender, the software is most accurate for men with light skin...
New research out of MIT’s Media Lab is underscoring what other experts have reported or at least suspected before: facial recognition tec...
Examination of facial-analysis software shows an error rate of 0.8 percent for light-skinned men and 34.7 percent for dark-skinned women.
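Disparities like the one above come from evaluating a classifier's accuracy separately for each demographic group rather than in aggregate. A minimal sketch of such a disaggregated audit, using entirely hypothetical data and a made-up `error_rates_by_group` helper (not the study's actual methodology):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute per-group error rates from (group, predicted, actual) triples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical predictions from a gender-classification system.
records = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),   # misclassification
    ("darker-skinned female", "female", "female"),
]
print(error_rates_by_group(records))
# → {'lighter-skinned male': 0.0, 'darker-skinned female': 0.5}
```

An aggregate accuracy figure would hide the gap; reporting the rate per group is what surfaces it.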
Real-world biases and artificial intelligence