This Predictive Policing Company Compares Its Software to ‘Broken Windows’ Policing
Crime-predicting software PredPol perpetuates discrimination, just like the discredited “broken windows” policing strategy, say digital rights advocates.
Image: Armando L. Sanchez/Chicago Tribune/TNS via Getty Images
Police training documents from PredPol, a company that sells predictive policing software, show that the company considers its software to be comparable to a “broken windows” policing strategy that has led to overpolicing of minority communities and is widely believed to be ineffective.
The documents, obtained through a freedom of information request by digital rights group Lucy Parsons Labs from the Elgin, Illinois Police Department, include police training materials produced by PredPol and contracts between the city—a suburb of Chicago—and the company. PredPol supplies technology to dozens of police and intelligence agencies across North America.
Jake Ader, a contributor to Lucy Parsons Labs, filed the freedom of information request with the City of Elgin in April 2018. After the city denied Ader’s request by saying that disclosure of the information could endanger cops’ lives, he appealed the decision and in May the city finally released the training manual that PredPol provided to Elgin police.
In the manual, the firm compares its software to the “broken windows” policing strategy, a widely criticized approach to policing that involves cracking down on minor crimes with the goal of deterring more serious criminal activity.
PredPol also explains how its software can predict which crimes will happen in areas as small as 500 by 500 feet, based on historical crime data. This data is fed into an algorithm that spits out predictions of where similar crimes will occur next. But PredPol software doesn’t predict instances of white-collar crime, such as mortgage fraud, leading police to focus only on neighborhoods where arrests for street crimes, including assault and robbery, have occurred in the past.
I left a message with the Elgin police media relations department asking if the force still used PredPol software, but did not receive a reply. The contract documents obtained by Ader show that Elgin police signed an initial three-year contract with PredPol for $55,000 in 2012 and renewed their agreement through 2017.
Ader says that even though evidence shows that predictive policing doesn’t work and that it perpetuates discrimination against marginalized communities, the use of predictive policing has expanded across the US in recent years, including in New York City, Los Angeles, and smaller cities.
Predictive policing “is happening under the radar,” Ader told Motherboard by email. “The City of Chicago has its own secretive [predictive policing] algorithm called the Strategic Subject Lists (SSL). We know that 56 percent of black men in the city [between] the ages of 20 and 29 have an SSL score,” a number used by police to determine an individual’s supposed risk of committing a crime in the future.
Ader says predictive policing is inherently biased because the data used to make crime predictions is based on years of biased policing strategies that “over-criminalize” certain neighborhoods.
Ader says this leads to “absurd” situations where police may end up patrolling only certain city blocks “on the off chance they catch a burglar.”
Even as more police forces adopt predictive software, police in several cities have decided to reject the technology. After evaluating PredPol’s software, police in Oakland, California, declined to adopt it, partially out of concern that it could lead to racial profiling. Police in the cities of Richmond and Milpitas in California also cut ties with PredPol.
“A major reason that police have rejected this software is because it doesn’t work,” says Ader. But he says that pressure from privacy rights groups working to raise awareness about predictive policing has been the most effective tactic to shed light on the use of the technology.
There has been a major push in Oakland to bring more transparency to the use of predictive software, Ader says, noting that the Oakland Privacy Commissioner and the Human Rights Data Analysis Group are actively pushing to publicize information about predictive policing.
Ader says these efforts are crucial in informing the public about the use of the technology, because in some cities predictive policing has been put into practice without public consultation.
“New Orleans’ [use of predictive policing] was hidden from the city council. When government isn’t even aware of these programs, how are communities supposed to be protected?”