By Catherine D’Ignazio
Julia Angwin is an award-winning investigative journalist and the best-selling author of Dragnet Nation. She has worked at the Wall Street Journal, where she oversaw the groundbreaking series “What They Know” about the erosion of privacy in the age of Big Data. Angwin is a co-founder and editor-in-chief of The Markup, a nonprofit newsroom that investigates the impact of technology on society. Before that, she worked as a senior investigative reporter at ProPublica, where she was a Pulitzer Prize finalist for her series on algorithmic bias, including “Machine Bias,” a story about the ways that risk assessment tools perpetuate discriminatory sentencing in the American criminal justice system.
Lauren Klein and I reference “Machine Bias” in the “Collect, Analyze, Imagine, Teach” chapter of Data Feminism, which discusses challenging power in data science. “Machine Bias” serves as an undeniable, real-world example of the harmful outcomes that can result when demographic data are used without accountability to the people they describe. As we say in the book, “Machine Bias”—and the many examples like it—proves that “the data are never ‘raw.’ Data are always the product of unequal social relations—relations affected by centuries of history.”