Blackening The Tech-Savvy Leviathan: Hobbes, Race, and Autonomous Security Technologies.
David Xu
I. Introduction
In 2016, the "Dallas sniper" Micah Xavier Johnson, who killed five police officers at a Black Lives Matter protest, was himself killed by a SWAT drone strike, the first use of a drone by police to kill a suspect on American soil. While Johnson's actions were widely condemned, including by BLM activists, and I do not seek to redeem them, this paper investigates the new questions of sovereignty, race, and autonomous technology raised by the American sovereign's use of a drone to kill a Black citizen. I begin by reading the Hobbesian social contract through Charles Mills' The Racial Contract, then explore how Hobbesian theorists have interpreted autonomous technologies on behalf of the American sovereign, and finally consider how a racialized Hobbesian would approach the ethical question of those technologies.
II. Blackening Hobbes
Mills' impact on philosophy lies in his reading of race as the central foundation of Western philosophy. Against the color-ignorant assumption that relegated racial concerns to footnotes, Mills inverts the presumption: "the racist 'exception' has really been the rule; what has been taken as the 'rule,' the ideal norm, has really been the exception." Instead of reading racism as an unfortunate byproduct, Mills recognizes "a partitioned social ontology…divided between persons and racial subpersons" as a precondition of contractarianism. For Hobbes, the social contract was the means of escaping the state of nature, an anarchic war "of every man against every man" in which life was "nasty, brutish, and short." In response, Hobbes proposes the cession of rights to an "absolute sovereignty" that serves as the sole arbiter of ethics and rights, resolving anarchic disorder. To demonstrate its racialization, Mills quotes Hobbes's "only real-life example" of the state of nature: "the savage people in many places of America." This reveals the foundational white supremacy of Hobbes's original theory, in which absolute immorality and violence are located only in indigeneity. Beyond this violent origin, Mills criticizes the Hobbesian racial contract for "establishing a moral hierarchy…to secure and legitimate the privileging of those individuals designated as white/persons and the exploitation of those designated as nonwhite," an indictment underscored by Hobbes's ironic and racialized choice to locate the state of nature among "the very nonwhite people upon whose land his fellow Europeans were then encroaching." However, the social contract's racialized birth does not damn it to whiteness; Mills undertakes a recuperative project in which "contractarian liberalism can be radicalized, historicized, and racialized" by moving from ideal theory's ivory tower toward understanding "modernity's constitutive racialization" in order to "redress its asymmetrical instantiations."
For Hobbes, this requires centering the very racism that underwrote the creation of his social contract in order to correct its racialized biases and ensure its ethical and real-world robustness.
III. Blackening Autonomous Technologies
For Hobbesians, the use of autonomous technologies (e.g., autonomous weapons systems (AWSs), drones, and predictive algorithms) by the military and police resolves various tensions within Hobbes's social contract. Creating the sovereign eliminates anarchy to ensure the commonwealth's stability, and autonomous technologies assist this mission in two ways. However, this paper will demonstrate how a failure to center race in autonomous deployment leads those solutions to fail as well.
First, AWSs resolve a paradox that goes unnoticed in peacetime. The Hobbesian contract relies on a "mutual Relation between Protection and Obedience," in which the sovereign guarantees the subject's protection in exchange for obedience. However, during wartime, "every man is bound…to protect in Warre, the authority by which he is himself protected in time of peace." The contractual exchange thus erodes in wartime, when the subject's protection is no longer ensured because of their military duty to protect the sovereign. To remove this risk to the subject's protection, autonomous weapons can "feasibly replace combatants, thereby eliminating the need for human deployments…reduce the risk to American lives without diminishing U.S. combat capabilities." However, the assumption of protection in exchange for obedience is sutured by whiteness. In the name of preserving peace, Hobbes further writes "that a Subject may be put to death, by the command of the Soveraign Power," because of the sovereign's right of punishment. Although a citizen enjoys protection during peacetime, their citizen status is in jeopardy because of the porous border between civil and martial law, where the rights of citizens go unrecognized once they are deemed enemies of the state, whether through war or criminal status.
The deployment of AWSs in accordance with Hobbes's social contract is complicated by the systemically biased American judicial system, in which Black people are "incarcerated in state prisons at nearly five times the rate of white Americans." The precarious citizen status of racial minorities demonstrates Mills' point that existing power structures are hierarchized via the racial contract and codified in legal structures. For autonomous weapons, the deployment of the piloted drone against a Black citizen further blurs the line between civil and martial law as the American police force militarizes. Along this trajectory, American police departments such as San Francisco's and Boston's have been adopting military robotic weaponry. Although not yet autonomous, the shift from remotely piloted, to semi-autonomous, to fully autonomous is on the horizon for military and police technologies. This matters for racialized populations because autonomous weapon technologies will be constantly deciding "who is a threat…who is suspicious…who is an enemy." Thus, the American police's present threat construction of the Black population becomes embedded in the future, as the "production of suspects/targets/threats clearly parallels how in US cities Black youth congregating on street corners has long been read as the 'signature' or 'personality' of threat, thereby licensing police intervention." In the context of warfare, as the American empire has historically engaged in racialized and violent military intervention in the Middle East, present Orientalist threat construction is reproduced in an autonomous-warfare future in which "Pakistani boys doing 'jumping jacks' are easily construed as a 'terrorist training camp'" by drone program officials. Thus, the question of future autonomous technologies, when deployed by military and police forces, is inextricable from questions of present racialized threat construction, biases, and hierarchies.
This complicates the autonomous solution to the Hobbesian problem: the precarious citizen status of racialized populations is entrenched in the racial contract, and military capabilities are diminished rather than enhanced by ineffective, racially biased autonomous weapons.
Second, human administration is "biased, weak willed, exhaustible, unable to fully work," a problem for Hobbesians who require "a political order that predicts, shapes, and reshape our collective behaviour in advance," one that "ensures that peace prevails, and every human action is under scrutiny of the political order." One solution is predictive policing, where data analysis can predict "where a crime may occur…who will be involved…for crime control and forecasting." For Hobbesians, AWSs make "perfect administrators and enforcers of law, unbiased and tireless…the perfection of the rule of law," a seemingly perfect solution to human error.
However, researchers have found that predictive policing recreates the racialized biases of police because the "data reflects the practices, policies, biases…of a given department." In ProPublica's study of the COMPAS risk-assessment algorithm, researchers found that it would "falsely flag black defendants as future criminals, wrongly labeling them [as greater crime risks] at almost twice the rate as white defendants," while "[w]hite defendants were mislabeled as low risk." AWSs and predictive policing are two sides of the same coin: "developing fully autonomous weapons…based on data-inputs and pre-programmed algorithms…[would] exacerbate [long-standing inherent biases] and lead to deadly consequences." Thus, the abstract solution of deploying predictive policing and AWSs to infallibly enforce the law only entrenches racial biases and exacerbates the original problem of human error in the racial contract.
These failures to integrate autonomous technologies within a social contract reflect Mills' critique: the construction of a Hobbesian sovereign entrenches existing racial hierarchies. However, just as Mills believes that centering race can redeem social contract theory, centering concerns about racial bias can ensure ethical autonomous technologies in which "input data are valid and are analyzed properly to avoid discrimination," creating an opportunity to "build safer communities, rather than cracking down harder on areas that are already struggling." This has been demonstrated with predictive policing, where "using the algorithm in context…that's sensitive to issues of racial justice…fewer people would go to jail, and the rate of racially-disparate false positives would almost disappear." This move, from an idealized view of autonomous weaponry and policing that recreates the legacy of racial bias toward centering how race materially shapes the data that informs autonomous analytics, reflects Mills' reading of the gap between "the ideal of the social contract and the reality of the Racial Contract."
IV. Conclusion
In this paper, I have located the Hobbesian social contract within Mills' racial contract, investigated color-evasive Hobbesian approaches to autonomous technologies and the failures that follow from ignoring race, and finally shown how Mills' technique of centering race can redeem historically racist theories and technologies. These autonomous technologies are neither good nor bad in a vacuum; they can be ethical and effective only through a racialized lens, for "naming this reality brings it into the necessary theoretical focus for these issues to be addressed."
V. Bibliography
American Civil Liberties Union. "Race and Criminal Justice." https://www.aclu.org/issues/racial-justice/race-and-criminal-justice
Bargu, Banu. "Sovereignty as Erasure: Rethinking Enforced Disappearances." Qui Parle: Critical Humanities and Social Sciences 23, no. 1 (2014): 35–75.
Mason, Ari. "Black Lives Matter Activists, Civil Rights Leaders Condemn Dallas Ambush." NBC New York, 2016. http://www.nbcnewyork.com/news/national-international/Dallas-Police-Shooting-Sniper-Black-Lives-Matter-NAACP-385997131.html
Mills, Charles W. The Racial Contract. Cornell University Press, 1997. http://www.jstor.org/stable/10.7591/j.ctt5hh1wj
Har, Janie, and Claudia Lauer. "Police can now legally use killer robots and they've already deployed them a few times." Fortune, December 5, 2022. https://fortune.com/2022/12/05/how-much-are-police-using-killer-robots-san-francisco/
Hobbes, Thomas. Leviathan. 1651. Reprint, Penguin Books.
Isaac, William, and Andi Dixon. "Why big-data analysis of police activity is inherently biased." Phys.org, May 10, 2017. https://phys.org/news/2017-05-big-data-analysis-police-inherently-biased.html
MacIntosh, Duncan. "Autonomous Weapons and the Nature of Law and Morality: How Rule-of-Law-Values Require Automation of the Rule of Law." Temple International and Comparative Law Journal, 2016. https://sites.temple.edu/ticlj/files/2017/02/30.1.MacIntosh-TICLJ.pdf
Pedron, Stephanie Mae, and Jose de Arimateia da Cruz. "The Future of Wars: Artificial Intelligence (AI) and Lethal Autonomous Weapon Systems (LAWS)." International Journal of Security Studies 2, no. 1 (2020). https://digitalcommons.northgeorgia.edu/cgi/viewcontent.cgi?article=1020&context=ijoss
Marwah, Inder S. "Charles W. Mills, The Racial Contract." In The Oxford Handbook of Classics in Contemporary Political Theory, edited by Jacob T. Levy. Oxford Academic, 2015. https://doi.org/10.1093/oxfordhb/9780198717133.013.62
O'Donnell, Renata. "Challenging Racist Predictive Policing Algorithms Under the Equal Protection Clause." New York University Law Review 94 (June 2019). https://www.nyulawreview.org/wp-content/uploads/2019/06/NYULawReview-94-3-ODonnell.pdf
Ramsay-Jones, Hayley. "Racism and Fully Autonomous Weapons." Campaign to Stop Killer Robots, October 17, 2019. https://www.ohchr.org/sites/default/files/Documents/Issues/Racism/SR/Call/campaigntostopkillerrobots.pdf
Richardson, Rashida, Jason M. Schultz, and Kate Crawford. "Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice." New York University Law Review 94 (May 2019). https://www.nyulawreview.org/wp-content/uploads/2019/04/NYULawReview-94-Richardson-Schultz-Crawford.pdf
Schwartzapfel, Beth. "Can Racist Algorithms Be Fixed?" The Marshall Project, July 1, 2019. https://www.themarshallproject.org/2019/07/01/can-racist-algorithms-be-fixed
Wall, Tyler. "Ordinary Emergency: Drones, Police, and Geographies of Legal Terror." Antipode 48, no. 4 (2016): 1124–1129.