Carceral AI isn't new.

While advances in big data and machine learning are a relatively recent development, the broader ideologies and practices that underpin carceral AI are rooted in long histories of control and colonization of human populations. The emergence of surveillance technologies can be traced to anti-Black surveillance tactics widely used during the transatlantic slave trade, such as 18th century “lantern laws,” which required Black and Indigenous people to carry candle lanterns after dark if not accompanied by a white person.


This practice of counting, surveilling, subdividing, and controlling populations through data related to race, caste, and biometric markers is grounded in eugenics – a set of practices that aim to increase the occurrence of ‘desirable’ human traits while eliminating ‘undesirable’ ones. This philosophy continues to underpin many penal policies, which are built on the belief that certain parts of the human population have a proclivity for breaking laws and that these “risky” populations should be identified and isolated from the rest of society.


Caste and race have also been used in colonial contexts such as India, Bengal, Tanzania, Mauritius, and Kenya, to name only a few, to create boundaries and control populations, with scientific racism invoked to justify surveillance and domination. In 1920s Kenya, for instance, the British colonial government required Black African males to wear an identification document known as a kipande, which allowed the colonizers to segregate and restrict the movement of Black Africans. Other examples include the use of P.C. Mahalanobis’ anthropometric caste distances in colonial Bengal and the mobility pass for new immigrants in colonial Mauritius.


Colonization has long been a testing ground for technological innovation, and carceral technologies continue to be tested in occupied areas and at borders. Historically, technologies of enumeration like Kenya’s kipande system have morphed into digital identification and movement-monitoring projects, while the systematic collection of data for immigration enforcement at the US-Mexico border has been part of the US immigration system since the processing of newly arrived people at Ellis Island in the early 1900s, intensifying after 9/11. However, the testing of such technologies remains a present-day phenomenon. For example, Israel – one of the world’s largest exporters of military equipment – tests new AI-driven surveillance and weapons technologies in occupied Palestinian territories and later exports them globally to police departments and militaries in over 100 countries, including the United States.

 

In other words, the technologies that make up carceral AI domestically are built on a legacy of experimentation on oppressed groups abroad.

 

In the US, police have often been early adopters of technology, using radio systems in the 1920s, measuring their work quantitatively since at least the 1960s, and commonly relying on tools such as Compstat to quantify and manage police activity in larger cities since the 1990s. Nevertheless, the apparent novelty of contemporary data-driven policing programs has played a role in helping police secure more resources and political cover, frequently in response to moments of social upheaval. Police adopt AI technologies under policing “reform” initiatives that can then be further reformed and rebranded (a 2.0 following 1.0) as needed. For example, after community organizing shut down the PredPol and LASER programs described in part 1.1 above, the LAPD repackaged them into a vaguer new initiative titled “Data-Informed Community-Focused Policing,” and the company PredPol changed its name to Geolitica, before its components were ultimately purchased by a larger surveillance technology firm, SoundThinking (which itself is a rebranding of ShotSpotter).

 

Carceral AI is just one part of a complex of technologies fueled by longstanding geopolitical agendas of control, conquest, and exclusion. Militarism, exclusionary bordering practices, and political agendas built on securitization have given rise to a global industrial complex of carceral technologies, driven by “innovation,” growth, and ultimately, profit.


Suggested readings:

  • Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. John Wiley & Sons.

  • Browne, S. (2015). Dark matters: On the surveillance of Blackness. Duke University Press.

  • Loewenstein, A. (2023). The Palestine laboratory: How Israel exports the technology of occupation around the world. Verso Books.

  • Molnar, P. (2024). The walls have eyes: Surviving migration in the age of artificial intelligence. The New Press.

  • Aizeki, M., Mahmoudi, M., & Schupfer, C. (Eds.). (2024). Resisting borders and technologies of violence. Haymarket Books.
