The Abuse and Misogynoir Playbook

Copyright

Katlyn Turner, Danielle Wood, Catherine D'Ignazio with design by Melissa Teng

By Dr. Katlyn Turner, Prof. Danielle Wood, and Prof. Catherine D'Ignazio. This article originally appeared in the January issue of the Quarterly State of AI Ethics Report published by the Montreal AI Ethics Institute.

Abstract

Disbelieving, devaluing, and discrediting the contributions of Black women has been the historical norm. Let’s write a new playbook for AI Ethics.

“...come celebrate

with me that everyday

something has tried to kill me

and has failed.”

- Lucille Clifton

In the past decade, Black women have been producing leading scholarship that challenges the dominant narratives of the AI and Tech industry: namely that technology is ahistorical, “evolved”, “neutral” and “rational” beyond the human quibbles of issues like gender, class, and race. Safiya Noble demonstrates how search algorithms routinely work to dehumanize Black women and girls (Noble 2018). Ruha Benjamin challenges what she calls the “imagined objectivity” of software and explains how Big Tech has collaborated with unjust systems to produce “the New Jim Code”, software products that work to reproduce racial inequality (Benjamin 2019). Joy Buolamwini and Timnit Gebru definitively expose racial and gender bias in facial analysis libraries and training datasets (Buolamwini & Gebru 2018). Meredith Broussard challenges the “technochauvinism” embedded in AI and machine learning products (Broussard 2018). Rediet Abebe calls for us to confront the limitations of the concept of fairness and center our analysis on power (Kasy & Abebe 2020). Simone Browne teaches us that today’s cutting-edge technologies are part of a long history of surveillance of Black bodies in public spaces (Browne 2015).

These scholars, along with many others, are sounding the alarm that tech is neither neutral nor ahistorical. Rather, how “evolved” it is reflects how quickly it can reproduce and entrench our historical biases. How “rational” it is indicates our collective desire to forget and erase the ugliness of racism, sexism, classism, and xenophobia, and to assign it instead to an opaque algorithm and its output. And these instincts are far from ahistorical; rather, they are part of a centuries-old playbook employed swiftly and authoritatively over the years to silence, erase, and revise contributions and contributors that question the status quo of innovation, policy, and social theory.

The Abuse and Misogynoir Playbook, as we name it here, has been used successfully by individuals and institutions to silence, shame, and erase Black women and their contributions for centuries. Misogynoir is a term introduced by Dr. Moya Bailey in 2010 (Bailey and Trudy, 2018; Bailey 2021) that describes the unique racialized and gendered oppression that Black women systemically face. We see the Playbook in operation in the recent well-publicized and swift firing of Google’s Ethical AI Co-Lead, Dr. Timnit Gebru. We see the Playbook in operation in the case of poet Phillis Wheatley in the 1700s. The Playbook’s tactics, described in the accompanying diagram, are disbelief, dismissal, gaslighting, discrediting, revisionism, and erasure of Black women and their contributions.
