Problem solving is at the heart of tech. An algorithm, after all, is a set of instructions, rules, and calculations designed to solve problems. Data for Black Lives co-founder Yeshimabeit Milner reminds us that “[t]he decision to make every Black life count as three-fifths of a person was embedded in the electoral college, an algorithm that continues to be the basis of our current democracy.”19 Thus, even just deciding what problem needs solving requires a host of judgments; and yet we are expected to pay no attention to the man behind the screen.
As danah boyd and M. C. Elish of the Data & Society Research Institute posit, “[t]he datasets and models used in these systems are not objective representations of reality. They are the culmination of particular tools, people, and power structures that foreground one way of seeing or judging over another.”21 By pulling back the curtain and drawing attention to forms of coded inequity, not only do we become more aware of the social dimensions of technology, but we can also work together against the emergence of a digital caste system that relies on our naivety about the neutrality of technology. This problem extends beyond obvious forms of criminalization and surveillance. It includes an elaborate social and technical apparatus that governs all areas of life.