A number of other examples illustrate algorithmic discrimination as an ongoing problem. When a graduate student searched for “unprofessional hairstyles for work,” she was shown photos of Black women; when she changed the search to “professional hairstyles for work,” she was presented with photos of White women.40 Men are shown ads for high-income jobs much more frequently than women are, and tutoring for what is known in the United States as the Scholastic Aptitude Test (SAT) is priced higher for customers in neighborhoods with a higher density of Asian residents: “From retail to real estate, from employment to criminal justice, the use of data mining, scoring and predictive software … is proliferating … [And] when software makes decisions based on data, like a person’s zip code, it can reflect, or even amplify, the results of historical or institutional discrimination.”41
A team of Princeton researchers studying associations made with Black-sounding names and White-sounding names confirmed findings from employment audit studies42 to the effect that respondents make negative associations with Black names and positive associations with White ones. Caliskan and colleagues show that widely used language-processing algorithms trained on human writing from the Internet reproduce human biases along racist and sexist lines.43 They call into question the assumption that computation is pure and unbiased, warning that “if we build an intelligent system that learns enough about the properties of language to be able to understand and produce it, in the process it will also acquire historic cultural associations, some of which can be objectionable. Already, popular online translation systems incorporate some of the biases we study … Further concerns may arise as AI is given agency in our society.”44 And, as we shall see in the following chapters, the practice of codifying existing social prejudices into a technical system is even harder to detect when the stated purpose of a particular technology is to override human prejudice.
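To make the mechanism concrete, the sketch below mimics the kind of word-embedding association test that Caliskan and colleagues describe: it scores how much closer a given name sits to “pleasant” than to “unpleasant” words in vector space. The vectors, names, and attribute words here are toy values chosen purely for illustration, not the study’s actual embeddings or code.

```python
import numpy as np

# Illustrative sketch of a word-embedding association measure.
# The 3-dimensional "embeddings" below are invented stand-ins for the
# vectors a model would learn from large Internet text corpora.

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def association(word_vec, pleasant, unpleasant):
    """Mean similarity to pleasant words minus mean similarity to unpleasant words."""
    return (np.mean([cosine(word_vec, p) for p in pleasant])
            - np.mean([cosine(word_vec, u) for u in unpleasant]))

# Hypothetical embeddings for two names and four attribute words.
embeddings = {
    "Emily":   np.array([0.9, 0.1, 0.2]),
    "Lakisha": np.array([0.1, 0.9, 0.3]),
    "joy":     np.array([0.8, 0.2, 0.1]),
    "love":    np.array([0.9, 0.2, 0.2]),
    "agony":   np.array([0.1, 0.8, 0.4]),
    "failure": np.array([0.2, 0.9, 0.3]),
}

pleasant = [embeddings["joy"], embeddings["love"]]
unpleasant = [embeddings["agony"], embeddings["failure"]]

for name in ("Emily", "Lakisha"):
    score = association(embeddings[name], pleasant, unpleasant)
    print(f"{name}: association score = {score:+.3f}")

# A positive score means the name sits closer to the pleasant words in the
# embedding space; a systematic gap between groups of names is the kind of
# learned bias the study documents.
```

Because the embeddings are learned from whatever text the model is trained on, any consistent gap in these scores simply echoes the associations already present in that writing, which is the point the authors press.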