Database design, in that way, is “an exercise in worldbuilding,” a normative process in which programmers are in a position to project their world views – a process that all too often reproduces the technology of race.2 Computer systems are a part of the larger matrix of systemic racism. Just as legal codes are granted an allure of objectivity – “justice is (color)blind” goes the fiction – there is enormous mystique around computer codes, which hides the human biases involved in technical design.
The Google Maps glitch is better understood as a form of displacement or digital gentrification mirroring the widespread dislocation underway in urban areas across the United States. In this case, the cultural norms and practices of programmers – who are drawn from a narrow demographic in terms of race, gender, and class – are coded into technical systems that, literally, tell people where to go. These seemingly innocent directions, in turn, reflect and reproduce racialized commands that instruct people where they belong in the larger social order.3
Ironically, this problem of misrecognition reflects a solution to a difficult coding challenge. A computer’s ability to parse Roman
numerals, interpreting an “X” as “ten,” was a hard-won design achievement.4 That is, from a strictly technical standpoint, “Malcolm Ten Boulevard” would garner cheers. This illustrates how innovations reflect the priorities and concerns of those who frame the problems to be solved, and how such solutions may reinforce forms of social dismissal, regardless of the intentions of individual programmers.
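The mechanism is easy to picture. What follows is a minimal, hypothetical sketch in Python – not Google Maps’ actual code – of how a numeral-expansion rule that succeeds at its narrow technical task can, in the same stroke, misread a proper name:

import re

# Hypothetical illustration, not Google Maps' actual implementation:
# a naive rule that expands standalone Roman-numeral tokens into number
# words (e.g. for display or text-to-speech), with no awareness of names.
ROMAN_WORDS = {"I": "One", "II": "Two", "III": "Three", "IV": "Four",
               "V": "Five", "IX": "Nine", "X": "Ten"}

def naive_expand_numerals(text: str) -> str:
    """Replace standalone Roman-numeral tokens with their number words."""
    pattern = r"\b(?:III|II|IX|IV|I|V|X)\b"
    return re.sub(pattern, lambda m: ROMAN_WORDS[m.group(0)], text)

# The rule works exactly as designed, and renames the street in the process.
print(naive_expand_numerals("Malcolm X Boulevard"))  # Malcolm Ten Boulevard

The point of the sketch is not that any single line of code is malicious; it is that the problem was framed as “recognize numerals,” not “recognize people,” and the output follows the framing.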
While most observers are willing to concede that technology can be faulty, acknowledging the periodic breakdowns and “glitches” that arise, we must be willing to dig deeper.5 A narrow investment in technical innovation necessarily displaces a broader set of social interests. This is more than a glitch. It is a form of exclusion and subordination built into the ways in which priorities are established and solutions defined in the tech industry. As Andrew Russell and Lee Vinsel contend, “[t]o take the place of progress, ‘innovation,’ a smaller, and morally neutral, concept arose. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement.”6
For this reason, it is important to question “innovation” as a straightforward social good and to look again at what is hidden by an idealistic vision of technology. How is technology already raced?
This chapter probes the relationship between glitch and design, which we might be tempted to associate with competing conceptions of racism. If we think of racism as something of the past or requiring a particular visibility to exist, we can miss how the New Jim Code operates and what seeming glitches reveal about the structure of racism. Glitches are generally considered a fleeting interruption of an otherwise benign system, not an enduring and constitutive feature of social life. But what if we understand glitches instead to be a slippery place (with reference to the possible Yiddish origin of the word) between fleeting and durable, micro-interactions and macro-structures, individual hate and institutional indifference? Perhaps in that case glitches are not spurious, but rather a kind of signal of how the system operates. Not an aberration but a form of evidence, illuminating underlying flaws in a corrupted system.
Default Discrimination

At a recent workshop sponsored by a grassroots organization called Stop LAPD Spying, the facilitator explained that community members with whom she works might not know what algorithms are, but they know what it feels like to be watched. Feelings and stories of being surveilled are a form of “evidence,” she insisted, and community testimony is data.7 As part of producing those data, the organizers interviewed people about their experiences with surveillance and their views on predictive policing. They were asked, for example: “What do you think the predictions are based on?” One person, referring to the neighborhood I grew up in, responded:
Because they over-patrol certain areas – if you’re only looking on Crenshaw and you only pulling Black people over then it’s only gonna make it look like, you know, whoever you pulled over or whoever you searched or whoever you criminalized that’s gonna be where you found something.