via Marketplace
Feb. 13, 2018
Joy Buolamwini identifies bias in algorithms and develops practices for accountability during design.
"We have to continue to check our systems, because they can fail in unexpected ways."
A.I. systems are shaped by the priorities and prejudices…of the people who design them, a phenomenon that I refer to as "the coded gaze."
Joy Buolamwini showed that facial-recognition systems consistently misidentified the gender of famous women of color.