On December 4, 2020, Brian Jordan Jefferson, Associate Professor of Geography and Geographic Information Science at the University of Illinois, gave a lecture as part of the West Hollywood Aesthetics and Politics (WHAP) series. The Fall 2020 Lecture Series, “Black Out: On the Surveillance of Blackness,” is presented by the CalArts Aesthetics and Politics MA Program and the West Hollywood Public Library. A recording of the lecture, “Information Capitalism Meets Racial Capitalism: Surveillance and Racial Criminalization in the Digital Age,” with a response from Shakeer Rahman, is available on the CalArts Aesthetics & Politics YouTube page.
Terms of Service: To Protect and To Serve the People
Brian Jefferson’s book, Digitize and Punish, examines the dangers that lie at the intersection of American racialized policing and idealistic applications of data science, as facilitated through Smart Cities and the Internet of Things. In investigating the manipulation of the police’s alleged objective to ‘protect’ the people, the critique of systemic failure reorients similar questions toward the unclear delineation of power given to the creators of these technologies that serve to surveil, record, track, and punish. Who are the citizens who create carceral technology, and who are they to do it? In A Theory of the Drone, Grégoire Chamayou deconstructs the contract the citizen has with society: that protection is paid for with obedience. In summation, he explains that,
“political authority is structured by what Hobbes calls ‘the mutual relation between protection and obedience.’ The sovereign protects me, and it is because he protects me that he has the right to force me to obey him. Schmitt condensed this into the formula ‘Protego ergo obligo’: I protect, therefore I oblige.” [1]
Though his work investigates what this pact means for citizen soldiers in times of war, the paradox illuminates the non-protection—the predatory relationship—of police toward communities of color, as well as a citizen’s inability to reject this deal. For marginalized groups, the stakes of disobedience are too high, and no performance of obedience is ever sufficient. What does it mean to be born, involuntarily, into a society that does not offer each citizen this same contract?
It has become increasingly clear that state protections are not offered to all citizens. In the abstract, a democratic republic supports a relationship between the citizen, the elected leader, and the directives implemented by that leader. Devices such as policymaking, the court system, and public spending are the features of government in which we participate. In practice, this arrangement is offered only to those who remain un-targeted by carceral tools. Right now, incarcerated people cannot vote in thirty-seven states; California, New York, and Connecticut do not allow those on parole to vote; felons lose voting rights indefinitely in eleven states (some of which require a governor’s pardon); and in Iowa, Kentucky, and Virginia, felons permanently lose their right to vote.[2][3] These policies enable a white supremacist version of democracy, in which the rights of targeted groups to be represented and heard become severed.
Our shameful legacy of police brutality demonstrates that the demand for obedience is a pretext, not the actual objective. Too many Black men and women have been killed while complying during their arrests. The operative goal of enforcing behavior across society becomes control. As abolitionist calls to action have moved into the mainstream, demands to defund the police, to drastically reallocate funding, to completely reconfigure our current system of emergency services, et cetera, are no longer considered radical positions. Many criticize the evident flaws in police training, particularly the hours required to graduate from a police academy. Though regulations vary by state, it is noted with increasing frequency that police often undergo fewer hours of training than most barbers and cosmetologists. Two things are clear: we do not currently function under the theoretical contract wherein all members of society are protected, and those who enforce the letter of the law are considered under-qualified by the citizens they serve. With the credentials of police officers questioned by citizens and lawmakers alike, the question arises: are carceral tools being designed for an idealized police force, or for the one currently critiqued en masse by its citizens for lacking sufficient qualifications?
Carceral Technology
Brian Jordan Jefferson’s work addresses the political correlations that emerge when alleged disobedience is tied to recorded data. When police officers are always present to surveil any infraction in Black neighborhoods, they develop confirmation bias. This bias is a product of their lived observations, fanned further by internal discourse, and the perceived neutrality of data visualization seemingly validates their tendency toward a violent course of action. Data collection that starts biased will produce biased results, and ‘indisputable’ facts and figures lend credibility to the interpretation of such quantitative data. “The police intuitions and policies that are fundamental to racialized policing are validated anew.”[4] Confirmation bias is no longer a collection of anecdotal personal observations; it becomes recorded history.
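The mechanics of this loop are simple enough to sketch. The following toy simulation is my own illustration, not drawn from Jefferson’s work or any real deployment: it assumes two neighborhoods with identical underlying rates of infraction, patrols dispatched wherever recorded incidents are highest, and incidents entering the record only where a patrol is present to observe them.

```python
import random

# A minimal, hypothetical sketch of a predictive-policing feedback loop.
# Assumption (not from the essay): neighborhoods "A" and "B" have the SAME
# underlying infraction rate; only patrolled infractions get recorded.

random.seed(42)

TRUE_RATE = 0.3              # identical underlying infraction rate everywhere
recorded = {"A": 2, "B": 1}  # one early patrol decision tips A's count higher

for day in range(1000):
    # Dispatch today's patrol to the neighborhood with more recorded incidents.
    patrolled = max(recorded, key=recorded.get)
    # Infractions occur at the same rate in both neighborhoods, but an
    # infraction enters the dataset only if an officer was there to see it.
    if random.random() < TRUE_RATE:
        recorded[patrolled] += 1

# A's recorded count dwarfs B's, though the true rates were always equal.
print(recorded)
```

However artificial, the sketch shows how a single early deployment decision can manufacture the statistical ‘evidence’ that later justifies it: the recorded gap between the neighborhoods grows without bound, and the data appears to confirm the original intuition.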
The distance between America’s most marginalized groups and those most privileged in data literacy could not be more apparent; on-the-ground law enforcement and the engineers of carceral technology are likely speaking past one another, and the gravity of the consequences goes unnoticed. For whom is the latest tech created, if not the ideal user, the best-behaved and most informed police? The cultural awareness that precincts need assistance, require better training, and must change strategy means that carceral tools are created to supplement individual deficits in decision-making. If the data scientist builds tools for unexamined, data-illiterate law enforcement, then the design of those tools takes the lead on any change in police action. If law enforcement is ‘unqualified’ for public service, it becomes necessary to direct the same interrogation toward the creators of these new technologies and their credentials to impact society. In a recent lecture, Safiya Noble gave attendees a rather eye-opening look at the indoctrination of young minds entering the computer science classroom.
“We don’t solve the epistemological approaches that are in those fields. … many of our first- and second-year computer science students have AP tested out of English, … are rolling on some high-school level humanities sometimes when they’re graduating from college. This is unacceptable. … I tell my students all the time, you have no business designing technology for society, and you know nothing about society.”[5]
Because raw data offers the appearance of neutrality, it is often misused as an object of ethical good. It is positioned as uniquely unquestionable: though the process of data collection is subject to human error, the numbers produced are considered unable to ‘lie.’ A perfect storm of these unquestionable numerical values, officers’ individual interpretations of data, and the remove of tech workers and designers creates a system of non-blame. When no single party can be held clearly accountable, the threat of error carries less gravity among those in charge. Increasingly sophisticated systems of surveillance and predictive policing continue to facilitate brutality while critics struggle to penetrate pre-existing political talking points—namely, the too-familiar gaslighting tactics that sow doubt about, and denial of, the very existence of racialized police violence. Misdirection is already far too easily accomplished, even in the most cut-and-dried circumstances.
“A robot commits a war crime. Who is responsible? … [The state] would no doubt acknowledge some responsibility, but … the state might place the blame on the manufacturer, who in turn might seek to blame the programmers.”[6]
Like the tiresome joke “it’s not a bug, it’s a feature,” obscuring the source of operator error comes with its own appeal. Carceral technology relies on selling tools believed to be so indisputable that any resulting punishment must be deserved. On blame, Chamayou offers some worrying insight.
“When the lethal decision is purely automatic, the only human agent directly identifiable as the efficient cause of death would turn out to be the victim himself, who, as a result of making inappropriate physical movements, was unfortunate enough to set off the automatic mechanism that results in his own elimination.”[7]
Here, Chamayou is speaking specifically about drones and automatons that kill within war zones, but it is crucial to understand how this logic translates to the broken-windows level in order to prevent digital brutality, rather than merely creating conditions to critique and attempt to fix after the fact. No network of technologies could ever function perfectly and equitably upon launch. These creations rely on instances of failure to facilitate fine-tuning and improvement. It is troubling that the marginalized peoples who become the targets of such failures are given no opportunity to consent to technological ‘betterment.’ Their participation is decided for them.
If the police cannot act as reliably trained experts in citizen protection, then the producers of applications and digital tools hold undue influence over their directives, capabilities, and decisions. The demand to competitively ‘innovate’ carceral technologies financially incentivizes the effectiveness of such developments: which company can develop the next great application to facilitate the most efficient arrests? Industry leaders with any involvement in carceral tech become the designers of racialized control. The question of who is meant to be responsible becomes complicated, and that of who actually is responsible can go uninvestigated indefinitely. Automation means that machines can function unhindered while the humans argue. When the growing pains of carceral technology are inflicted on the whole of society, members of Black and brown communities are the ones who pay the price.
Endnotes
[1] Chamayou, Grégoire. A Theory of the Drone. Translated by Janet Lloyd, The New Press, 2015, pp. 178–79.
[2] “Felon Voting Rights.” National Conference of State Legislatures, 1 Oct. 2020.
[3] “Felony Disenfranchisement Laws (Map).” American Civil Liberties Union.
[4] Jefferson, Brian Jordan. “Predictable Policing: Predictive Crime Mapping and Geographies of Policing and Race.” Annals of the American Association of Geographers, vol. 108, no. 1, 2017, pp. 1–16, doi:10.1080/24694452.2017.1293500.
[5] “Anti-Blackness and Technology.” UC Santa Barbara Center for Black Research and the Multicultural Center, 18 Nov. 2020.
[6] Chamayou, p. 210.
[7] Chamayou, p. 211.