If the New Jim Code seeks to penetrate all areas of life, extracting data, producing hierarchies, and predicting futures, thin description exercises a much-needed discretion, pushing back against the all-knowing, extractive, monopolizing practices of coded inequity.
Thinness is not an analytic failure, but an acceptance of fragility … a methodological counterpoint to the hubris that animates so much tech development. What we know today about coded inequity may require a complete rethinking, as social and technical systems change over time. Let’s not forget: racism is a mercurial practice, shape-shifting, adept at disguising itself in progressive-like rhetoric. If our thinking becomes too weighed down by our own assuredness, we are likely to miss the avant-garde stylings of NextGen Racism as it struts by.
Beyond Biased Bots

How do we move beyond the idea of biased bots, so we can begin to understand a wide range of coded inequities? Here I propose four dimensions of the New Jim Code: engineered inequity, default discrimination, coded exposure, and technological benevolence; I elaborate on each in the following chapters.
First, I take a closer look at how engineered inequity explicitly works to amplify social hierarchies based on race, class, and gender, and at how the debate regarding “racist robots” is framed in popular discourse. I conclude that robots can be racist, given their design in a society structured by interlocking forms of domination.103
Next, I ask what happens when tech developers do not attend to the social and historical context of their work, and explore how default discrimination grows out of design processes that ignore social cleavages. I also consider how what is often depicted as a glitch might serve as a powerful opportunity to examine the overall system, a technological canary in the coal mine.
I then turn to the multiple forms of coded exposure that technologies enable, from Polaroid cameras to computer software. Here I think through the various forms of visibility and how, for racialized groups, the problem of being watched (but not seen) relates to newfangled forms of surveillance.
Technological benevolence animates tech products and services that offer fixes for social bias. Here I take a look at technologies that explicitly work to address different forms of discrimination, but that may still end up reproducing, or even deepening, discriminatory processes because of the narrow way in which “fairness” is defined and operationalized.
Finally, I highlight how practitioners, scholars, activists, artists, and students are working to resist and challenge the New Jim Code – and how you, the reader, can contribute to an approach to technology that moves beyond accessing new products to advocating for justice-oriented design practices.
Taken as a whole, the conceptual toolkit we build around race critical code studies will be useful, I hope, for analyzing a wide range of phenomena – from the explicit codification of racial difference in particular devices to the implicit assumption that technology is race-neutral – through which Whiteness becomes the default setting for tech development. This field guide critically interrogates the progressive narratives that surround technology and encourages us to examine how racism is often maintained or perpetuated through technical fixes to social problems. And finally, the next chapters examine the different facets of coded inequity with an eye toward designing them differently. Are you ready?