In the short story and movie “Minority Report,” police of the future are authorized to arrest, and prosecutors to convict, people for crimes that predictive technology says they will commit. Although arresting, trying, and convicting individuals for what they have not yet done seems far-fetched, the basic structural and technological prerequisites for such a world are already in place. What does this mean?
Over the past few years, crime prevention through big data has undergone a transition from merely interfacing with data to using it predictively. Until now, law enforcement could check individual samples against DNA and fingerprint databases, arrest records, that kind of thing. But now there’s an emerging norm of using “predictive analytics algorithms to identify broader trends.” And yes, this is a step closer to a “Minority Report” world, where police predict where crime is likely to occur based on what we might call “moving” or trending data rather than snapshots of particular markers and records like fingerprints and DNA.
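To make the “snapshot” versus “trend” distinction concrete, here is a small Python sketch with invented data: the old model is a one-off yes/no lookup against stored records, while the predictive model aggregates incidents over time and looks for movement. The database contents and incident counts are hypothetical illustrations, not real figures.

```python
# Invented examples of the two modes of data use described above.

# "Snapshot" policing: a one-off yes/no match against stored records.
known_prints = {"print_0042", "print_0108"}  # hypothetical fingerprint database

def snapshot_match(sample: str) -> bool:
    """Old model: does this sample match a record we already hold?"""
    return sample in known_prints

# Predictive policing: aggregate incidents over time and look for a trend.
def is_trending_up(weekly_counts: list[int]) -> bool:
    """Compare the last four weeks of incidents against the longer-run average."""
    recent = sum(weekly_counts[-4:]) / 4
    baseline = sum(weekly_counts) / len(weekly_counts)
    return recent > baseline

print(snapshot_match("print_0042"))                 # True: a static record check
print(is_trending_up([2, 1, 2, 2, 1, 3, 4, 5, 6]))  # True: incidents are rising
```

The first function answers a question about one person at one moment; the second answers a question about where things are heading, which is the shift the new tools represent.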
Areas where predictive software is used have seen remarkable results: a “33% reduction in burglaries, 21% reduction in violent crimes and a 12% reduction in property crime.” But there’s even more at stake here. Because such policing is preventive rather than reactive, less crime also means less confrontation between police and suspects, including innocent suspects erroneously targeted in the old model, where cops wait to see suspicious behavior and then act on it in the midst of uncertainty. Of course, there are other implications: for detective work, for tracking serial offenders, and for assessing future resource needs. Big data won’t necessarily improve relations between residents and police outright, but smarter policing may translate into less confrontational policing and an improved public perception of police effectiveness.
And so many police forces now see the biggest challenge as teaching officers how to use the technology the right way. A recent British report on big data’s use in policing, published by the Royal United Services Institute for Defence and Security Studies (RUSI), found that British forces already have access to huge amounts of data but lack the capability to use it. This is unfortunate because, at least as researchers and developers see it, the underlying method is straightforward. In the words of Alexander Babuta, who led the RUSI research: “The software itself is actually quite simple – using crime type, crime location and date and time – and then based on past crime data it generates a hotspot map identifying areas where crime is most likely to happen.”
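To see how simple Babuta’s description really is, here is a minimal sketch of that hotspot logic in Python, using made-up incident records and an assumed grid resolution rather than any vendor’s actual model: bin past crimes into coarse map cells and rank the cells by historical frequency.

```python
import math
from collections import Counter

GRID_SIZE = 0.01  # assumed cell size in degrees (roughly 1 km); purely illustrative

# Hypothetical past-crime records: (crime_type, latitude, longitude, date_time)
incidents = [
    ("burglary", 51.5074, -0.1278, "2024-03-01T22:15"),
    ("burglary", 51.5080, -0.1270, "2024-03-03T23:40"),
    ("burglary", 51.5071, -0.1281, "2024-03-05T21:55"),
    ("vehicle",  51.5155, -0.0922, "2024-03-02T08:05"),
]

def cell(lat: float, lon: float) -> tuple[int, int]:
    """Snap a coordinate onto the grid cell that contains it."""
    return (math.floor(lat / GRID_SIZE), math.floor(lon / GRID_SIZE))

# Count past incidents per cell; the densest cells become the "hotspots"
# where, on this logic, future crime is predicted to be most likely.
counts = Counter(cell(lat, lon) for _, lat, lon, _ in incidents)

for (row, col), n in counts.most_common(3):
    print(f"cell ({row}, {col}): {n} past incidents")
```

Real systems layer on weighting for recency and crime type, but the core idea, counting where crime happened before, is as simple as Babuta suggests.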
So we might be led to think that the only challenge left is training police forces to use big data predictively and effectively, and that doing so will decrease crime without resorting to aggressive policing or the regressive, socially corrosive “broken windows” policing of Giuliani-era New York, where policymakers attempted to harness individual acts of police intimidation in the service of an overall social perception of a crime-free city. Data accuracy is also critical: in our work with client Accurate Append, we find that demographic, email, and other contact data are often missing or incomplete in data files across industries.
The jury is not unanimous on the use of algorithmic big data as a crime prevention tool. To begin with, predictive policing can be perceived as just as oppressive as reactive policing. Predicting that certain areas are prone to future crime almost certainly means putting up video cameras, possibly with controversial facial recognition technology, in these “risky” areas. And the construct of the “high-risk area,” as produced by data interpretation, risks being just as laden with racist and other assumptions as policing itself often is.
After all, we know that big data is not immune to the racism and other stereotyping of its human keepers. And what if, in an effort to reshape the landscape of city policing for political ends, politicians and appointees manipulate or misinterpret long-term trends, or short-term spikes in crime, to justify the continued over-policing of oppressed communities? This is an emerging concern among civil liberties advocates in both the UK and the U.S.
Another concern, expressed by the editorial staff at the British paper The Guardian, is that in addition to predicting trends in particular areas, police are also using this interpretive technology “on individuals in order to predict their likelihood of reoffending” — which gets us even closer to “Minority Report” status. At the very least, “it is easy to see that the use of such software can perpetuate and entrench patterns of unjust discrimination.” Or worse, many fear. And, to make perhaps an obvious but necessary point, “the idea that algorithms could substitute for probation officers or the traditional human intelligence of police officers is absurd and wrong. Of course such human judgments are fallible and sometimes biased. But training an algorithm on the results of previous mistakes merely means they can be made without human intervention in the future . . . Machines can make human misjudgments very much worse.”
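The Guardian’s point about entrenching past mistakes can be shown with a toy calculation. In this hypothetical sketch (the neighborhoods, numbers, and naive “model” are all invented for illustration), a risk score trained on past arrest records rates a heavily patrolled neighborhood as far riskier, not because more crime occurs there but because more policing did:

```python
# Hypothetical arrest history: (neighborhood, was_arrested).
# Neighborhood "A" was patrolled heavily, so it dominates the records;
# "B" saw little patrolling, so few arrests were ever logged there.
history = (
    [("A", True)] * 80 + [("A", False)] * 20 +
    [("B", True)] * 5 + [("B", False)] * 95
)

def predicted_risk(neighborhood: str) -> float:
    """Naive 'model': the arrest rate observed in past data for this area."""
    outcomes = [arrested for area, arrested in history if area == neighborhood]
    return sum(outcomes) / len(outcomes)

print(predicted_risk("A"))  # 0.80: rated sixteen times "riskier" than B
print(predicted_risk("B"))  # 0.05: under-policed, so under-represented
```

Dispatching more patrols to A on the strength of that score produces more arrests in A, which inflates the score further on the next training pass: the misjudgment is now repeated, in the Guardian’s phrase, “without human intervention.”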
Perhaps the most interesting take on the social dangers of big data use in policing comes from Janet Chan, professor of law at the University of New South Wales, in an academic paper for Criminology and Criminal Justice. Chan writes that “data visualization, by removing the visibility of real people or events and aestheticizing the representations in the race for attention, can achieve further distancing and deadening of conscience in situations where graphic photographic images might at least garner initial emotional impact.” In other words, seeing only data, instead of the faces of victims and offenders, the social and neighborhood contexts of property crimes, or the living dynamics of domestic violence cases, risks making law enforcement, and perhaps policymakers, less engaged with and empathetic toward the public. Chan cites other scholars’ work to suggest that visual representation and face-to-face encounters, however imperfect, are necessary forms of social engagement.