Discriminating Data

Correlation, Neighborhoods, and the New Politics of Recognition

Illustrated by Alex Barnett
$20.99 US
The MIT Press
On sale Nov 02, 2021 | 9780262367257
Sales rights: World

How big data and machine learning encode discrimination and create agitated clusters of comforting rage.

In Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal—not an error—within big data and machine learning. These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data’s predictive potential, stems from twentieth-century eugenic attempts to “breed” a better future. Recommender systems foster angry clusters of sameness through homophily. Users are “trained” to become authentically predictable via a politics and technology of recognition. Machine learning and data analytics thus seek to disrupt the future by making disruption impossible.
 
Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. residents' attitudes toward living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are validated only if they mirror this data.
 
How can we release ourselves from the vise-like grip of discriminatory data? Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.
 
Table of Contents

Preface ix
Introduction: How to Destroy the World, One Solution at a Time 1
Red Pill Toxicity, or Liberation Envy 29

1 Correlating Eugenics 35
The Transgressive Hypothesis 75
2 Homophily, or the Swarming of the Segregated Neighborhood 81
3 Algorithmic Authenticity 139
Correlating Ideology, or What Lies at the Surface 173
4 Recognizing Recognition 185
The Space Between Us 231
Coda: Living in Difference 239

Acknowledgments 255
Notes 259
References for Mathematical Illustrations 317
Index 319
