The scale, scope, and opacity of carceral AI are unprecedented.
While historical perspective shows that carceral AI repackages long-standing tactics for limiting human autonomy, it is important to emphasize which aspects of carceral AI are new. The novelty rests largely in its unprecedented scale, scope, and opacity.
The ability to store, combine, and query data at scale and from previously disparate sources gives new uses to old data collection efforts. Private companies also play an increasingly central role, collecting data and selling it to the government either in raw form or in packaged tools and platforms. There has been a rise in companies specifically marketing products to the criminal legal system, such as for gunshot detection, predictive policing, video analytics and facial recognition, and DNA profiling. These companies often work across global and local scales, collecting data and testing products in carceral and/or global militarized contexts, and then marketing them to the general population. The global border industrial complex alone is estimated to be worth between $70 billion and $149 billion.
New computational power to process data has further encouraged the expansion of data collection. The systematic and widespread use of surveillance, or "mass surveillance," has broadened the scope of how these technologies are used, placing a heavier emphasis on prediction and "proactive" decisions instead of reaction and explanation.
Established techniques are also applied more readily to new arenas where they may be more prone to error or carry unknown risks. DNA software, for example, has extended DNA analysis to more complex mixtures containing more people's DNA and to samples with far smaller amounts of DNA, a departure from the samples on which the technology was initially developed that makes it difficult or infeasible for a human to verify the results.
Finally, the opacity of carceral AI tools makes it more difficult to interpret their outputs, identify potential sources of error and bias, and contest the decision-making processes in their development and use. The quantification of scores and packaging of decision processes into software provides an air of objectivity and abstracts the discretionary choices that go into the process. This predilection for mathematical and data-driven decision-making makes it more difficult to challenge decisions, especially where the software is proprietary and therefore hidden from cross-examination.
Suggested readings:
- Brayne, S. (2020). Predict and surveil: Data, discretion, and the future of policing. Oxford University Press.
- Aizeki, M., Bingham, L., & Narváez, S. (2023). The everywhere border: Digital migration control infrastructure in the Americas.
- Molnar, P. (2024). The walls have eyes: Surviving migration in the age of artificial intelligence. The New Press.
- Zuboff, S. (2018). The age of surveillance capitalism. Public Affairs.