via TED Radio Hour
Jan. 26, 2018
Joy Buolamwini identifies bias in algorithms and develops practices for accountability during design.
"We have to continue to check our systems, because they can fail in unexpected ways."
"A.I. systems are shaped by the priorities and prejudices of the people who design them, a phenomenon that I refer to as 'the coded gaze.'"
Joy Buolamwini demonstrated that facial-recognition systems consistently misidentified the gender of famous women of color.