Predictive Policing: World Building and Made Up People
Produced by: Daniel Evans and Jude Marcella
This project is concerned with the politics and ethics of policing, specifically predictive policing methods. Our lives are tangibly and deeply shaped by the data that is collected about us and used without our knowledge by police services. Through our research we aim to interrogate the ways police forces use machine learning to process their datasets, or aspire to do so. In the UK and US in particular there has been a growing development and use of predictive mapping programs, or PMPs, which generate heatmaps of areas where crimes are deemed likely to occur, using algorithms trained on historical police data. These algorithmic systems remain almost completely opaque to the public, particularly in the UK, although we can conduct our research around the visible edges of this information black hole by examining the evidence that orbits the software.
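At its most basic, the heatmap logic can be sketched as counting historical incidents per grid cell and flagging the highest-scoring cells for patrol. The sketch below is our own illustrative toy, not PredPol's actual (undisclosed) algorithm; the grid, the data, and the function name are all assumptions for demonstration.

```python
from collections import Counter

def predict_hotspots(incidents, top_n=2):
    """Toy PMP sketch: score each grid cell by its historical
    incident count and return the top-scoring cells.
    `incidents` is a list of (x, y) grid coordinates."""
    counts = Counter(incidents)
    ranked = sorted(counts, key=counts.get, reverse=True)
    return ranked[:top_n]

# Hypothetical historical police data: past incident locations on a city grid
history = [(0, 0), (0, 0), (0, 0), (1, 2), (1, 2), (3, 3)]
print(predict_hotspots(history))  # the cells with the most recorded incidents
```

Even in this minimal form, the key property is visible: the model can only ever point back at the places already most heavily represented in the data.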
One publicised example of such a predictive program comes from the US analytics company PredPol. The company reveals nothing about its computational processes, nor which police departments use its services, but its public website (https://www.predpol.com/) features 'customer testimonials' from US police chiefs which suggest a great deal about the assumptions and attitudes that underlie its methods. Although PredPol has been used by a small number of UK police services, many of these are understood to have since moved to internally developed, more secretive systems.
These policing services claim not to target specific groups or individuals, yet the data they rely on is inextricably tied to pre-existing biases in policing datasets; PredPol has indeed been investigated for its tendency to produce pronounced racially biased outcomes. Through the phenomenon of pattern discrimination, PMPs "over-recognise" the biases implicit in policing data, sorting whole communities and neighbourhoods into the category of criminal. The predictions then prompt direct police action, which in turn becomes police data, closing the loop of the algorithmic cycle.
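The closed loop described above can be made concrete with a toy simulation, which is our own illustration and not any deployed system: patrols are sent to the area the model flags, patrols record additional incidents wherever they are present, and those records feed the next round of predictions, so an initial disparity amplifies itself.

```python
def simulate_feedback_loop(initial_counts, rounds=5):
    """Toy model of the PMP feedback loop (illustrative assumption).
    Each round, the highest-count area receives the patrol and
    therefore records one extra observed incident, which is fed
    back into the dataset for the next round's prediction."""
    counts = dict(initial_counts)
    for _ in range(rounds):
        target = max(counts, key=counts.get)  # the heatmap picks the top area
        counts[target] += 1                   # patrolling generates new data there
    return counts

# Two areas with identical underlying behaviour, one with a slight
# head start in recorded incidents: the loop runs away with the gap.
print(simulate_feedback_loop({"area_A": 11, "area_B": 10}))
# area_A absorbs every patrol; area_B's recorded count never grows
```

The point of the sketch is structural rather than statistical: once recorded incidents both drive and result from deployment, the dataset stops measuring the world and starts measuring the police.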
We conclude from this that we must reveal the 'magical reality' of the PMP by taking poetic and theoretical readings of the subject. The PMP acts as a politicomaniacal world builder: constructing a thin veneer of fictional reality on top of the real world. The PMP is creating the data on which it will then feed. This culminates in the image of a zoetrope, spinning and displaying particles dispersing and re-forming into rectangles that represent buildings of bureaucracy, domicile, or imprisonment in the contemporary metropolis. The zoetrope is a closed loop, and we have decided to 3D model it, combining its archaic character as a historical item with new computational technologies. This digital zoetrope reflects our research project's concern with how old forms of policing are sustained and reinforced in a loop by contemporary algorithmic technologies.
Moses, L. B. and Chan, J. (2018) Algorithmic Prediction in Policing: Assumptions, Evaluation and Accountability, Policing and Society, 28:7, 806-822
Butler, J. (2006) Precarious Life: The Powers of Mourning and Violence, Verso
Foucault, M. (1977) Discipline and Punish, Vintage Books
Couchman, H. (2018) Policing by Machine, Liberty Report
Babuta, A. and Oswald, M. (2019) Data Analytics and Algorithmic Bias in Policing, RUSI Report
Mohaghegh, J. B. (2019) Omnicide: Mania, Fatality and the Future-in-Delirium, Urbanomic
Ferguson, A. G. (2015) Big Data and Predictive Reasonable Suspicion, University of Pennsylvania Law Review
Ballard, J. G. (1973) Crash, Jonathan Cape
Smith, M. (2018) Can We Predict When and Where a Crime Will Take Place?, BBC News https://www.bbc.co.uk/news/business-46017239
PredPol Website https://www.predpol.com/
Smith, J. (2016) Crime Prediction Tool PredPol Only Amplifies Racially Biased Policing, Mic https://www.mic.com/articles/156286/crime-prediction-tool-pred-pol-only-amplifies-racially-biased-policing-study-shows
Apprich, C., Chun, W., Cramer, F. and Steyerl, H. (2019) Pattern Discrimination, University of Minnesota Press