Big Data: The Future of Policing
By Deniz Majagah

Mr. Marks was arrested and taken away for a crime he had yet to commit. This is pre-crime policing. Futuristic and foreboding.

Imagine a world where crime can be predicted. A world where you knew who was going to break the law. A world where you knew when a robbery was going to be committed. A world where you knew when someone was going to be murdered. This is the dystopian future of Minority Report, a short story penned by Philip K. Dick and later turned into a movie.

“Mr. Marks, by mandate of the District of Columbia Precrime Division, I'm placing you under arrest for the future murder of Sarah Marks and Donald Dubin that was to take place today, April 22 at 0800 hours and four minutes.” – Precrime Chief John Anderton, Minority Report

Rewind to today, to reality. Predictive policing is no longer science fiction. But it's not necessarily something new, either.

The first uses of predictive policing resulted in people being falsely accused, arrested and imprisoned because of the crude, flawed theory put forth by the father of modern criminology, Cesare Lombroso. Lombroso postulated that criminality was inherited, and that certain people were predisposed to commit crime. These “born criminals” could be identified by a set of atavistic physical features such as facial asymmetry, big ears or the “angular or sugar-loaf form of the skull, common to criminals and apes.”

Pin mapping for hotspot policing and psychological profiling to identify serial killers and other criminals are well-established, though somewhat basic, forms of predictive policing. Observational information coupled with an officer’s experience also provides a certain level of predictive policing. As an overly simplified example, it’s not difficult to guess what is going to happen when you see someone in a ski mask walking into a bank when everyone else is wearing short sleeves and shorts.

The difference between the predictive policing of the past and the predictive policing of the present and the future lies with data. Small data versus big data.

Pin mapping, psychological profiling and observational information coupled with experience are all examples of small data. Small data is easy to access and analyze, and small enough that we can comprehend and understand it. Big data is very much everything that small data is not. It is made up of large data sets, often from different, disparate sources, making it complex and difficult to use and make sense of. When used correctly, though, and with the right tools, big data reveals hidden patterns and useful information. It helps us see connections and make correlations that we wouldn't have been able to find otherwise. In the world of law enforcement, big data is a tool that can help us predict crime and criminality.
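
To make that a little more concrete, here is a rough sketch, with invented records, of what connecting disparate sources can look like: burglary reports joined against streetlight outage logs to surface a pattern neither data set shows on its own. Every field and number below is made up for illustration; real systems join far larger and messier sources.

```python
# Rough sketch with invented data: joining two unrelated-looking data sets
# (burglary reports and streetlight outage logs) to surface a pattern that
# neither source shows on its own. All records here are fabricated.

burglaries = [
    {"block": "400 Main St", "date": "2023-03-01"},
    {"block": "400 Main St", "date": "2023-03-04"},
    {"block": "Oak & 5th",   "date": "2023-03-02"},
]
streetlight_outages = [
    {"block": "400 Main St", "date": "2023-02-27", "repaired": "2023-03-06"},
]

def outage_active(block, date, outages):
    """True if the block had an unrepaired streetlight outage on that date."""
    return any(o["block"] == block and o["date"] <= date <= o["repaired"]
               for o in outages)

with_outage = sum(outage_active(b["block"], b["date"], streetlight_outages)
                  for b in burglaries)
print(f"{with_outage} of {len(burglaries)} burglaries occurred on blocks "
      "with an active streetlight outage")
```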

If big data is the heart of smart, predictive policing, algorithms are the brains. At its simplest, an algorithm is a set of rules used to solve a problem. You feed the algorithm big data, it runs that information against the rules that have been set up, and it gives you results.
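
As a toy illustration of "rules applied to data," consider the sketch below. The rules, weights and fields are entirely hypothetical; no real predictive-policing product works this simply.

```python
# A toy illustration of "an algorithm is a set of rules": each rule looks at a
# record and contributes to a result. The rules, weights, and fields here are
# hypothetical -- real predictive-policing products are far more complex.

rules = [
    ("repeat location",   lambda r: 2 if r["prior_calls_at_address"] >= 3 else 0),
    ("late-night window", lambda r: 1 if r["hour"] >= 22 or r["hour"] <= 4 else 0),
    ("recent similar crime nearby", lambda r: 2 if r["similar_within_500m_7d"] else 0),
]

def score(record):
    """Run every rule against one record and sum the contributions."""
    return sum(weight(record) for _, weight in rules)

record = {"prior_calls_at_address": 4, "hour": 23, "similar_within_500m_7d": True}
print(score(record))  # -> 5: the "result" the rules produce for this input
```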

It's easy enough to understand, sans the technical aspects. Tons of information from different sources is pumped into a computer, and useful, actionable information comes back. You now have a crime forecast.

The computer is telling you when and where to expect certain crimes to occur. Great. Extra patrols are sent to those areas, and they have the expected effect: no burglaries and only two assaults during the shift.

The computer has also provided you with a heat list of individuals: people it predicts are predisposed to commit crime. The list is built from the rules you gave the algorithms, from the data sources you provided.
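
As a hedged illustration of how such a list comes together, here is a toy sketch with fabricated names, fields and weights. The only point it makes is that whoever chooses the rules and the data sources also chooses who lands on the list.

```python
# Hypothetical sketch of a "heat list": rank individuals by a score built from
# whatever fields the data sources happen to contain. Names, fields and weights
# are invented; the list is only as good as the rules and data behind it.

people = [
    {"name": "A", "prior_arrests": 2, "victim_of_shooting": True,  "gang_db_entry": False},
    {"name": "B", "prior_arrests": 0, "victim_of_shooting": False, "gang_db_entry": True},
    {"name": "C", "prior_arrests": 5, "victim_of_shooting": False, "gang_db_entry": False},
]

def heat_score(p):
    # These weights are arbitrary choices -- change them and the list changes.
    return 3 * p["prior_arrests"] + 4 * p["victim_of_shooting"] + 2 * p["gang_db_entry"]

heat_list = sorted(people, key=heat_score, reverse=True)
for p in heat_list:
    print(p["name"], heat_score(p))
```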

But what happens when these tools are misused? What happens when, during those same patrols, individuals on that list are stopped, questioned, frisked or even arrested for no other reason than being on a heat list and being spotted in or near a "crime forecast" location? What happens if the data fed into the algorithms is skewed somehow? Do we have information that is objective and neutral, or are biases that were present in the original, raw data now amplified by the algorithms?
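
A small simulation, using entirely invented numbers, shows how that amplification can happen: two areas with identical true offense rates, where one simply starts out with more recorded incidents because it was historically patrolled more. The "forecast" keeps sending patrols back to it, and the records those patrols generate keep feeding the next forecast.

```python
# Toy feedback-loop sketch (all numbers invented): two areas with the SAME true
# offense rate, but Area A starts with more recorded incidents because it was
# historically patrolled more heavily. Each week the "forecast" sends the
# discretionary patrols to whichever area has the higher recorded count, and
# new records come disproportionately from wherever the patrols are.

true_offenses_per_week = {"A": 10, "B": 10}    # identical underlying crime
last_week_recorded     = {"A": 6,  "B": 3}     # skewed starting data

for week in range(1, 6):
    hot = max(last_week_recorded, key=last_week_recorded.get)  # forecasted hotspot
    patrols = {a: (8 if a == hot else 2) for a in true_offenses_per_week}
    # the share of true offenses that actually gets recorded rises with patrols
    recorded = {a: round(true_offenses_per_week[a] * min(1.0, 0.08 * patrols[a]), 1)
                for a in true_offenses_per_week}
    print(f"week {week}: patrols {patrols} -> recorded {recorded}")
    last_week_recorded = recorded
```

Even though both areas have the same true offense rate, the recorded gap never closes; the skew in the historical data is carried forward and presented back as an objective forecast.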

As with most things "police," the Los Angeles Police Department is at the forefront of predictive policing. The department is also now being scrutinized for how it implements and uses predictive policing. Inspector General Mark Smith submitted a report ordered by the Board of Police Commissioners in which he found that "Officers used inconsistent criteria in targeting and tracking people they considered to be most likely to commit violent crimes."

Predictive policing, like anything else, has its pitfalls when not used correctly. However, this type of intelligence-led policing and the technology behind it cannot and should not be abandoned because of some mistakes made by the LAPD or other agencies. These experiences and issues should be used to steer other agencies in the right direction.

Leveraging big data for intelligence-led, predictive policing is in the future for all law enforcement agencies big and small. It will enhance public safety as well as officer safety when used as the tool that it’s meant to be. It will also help to improve and strengthen community relations by augmenting police transparency and objectivity. But it needs to be implemented correctly, using unbiased data, with proper oversight that includes community stakeholders and strong controls that prevent misuse.
