
The Netherlands Is Becoming a Predictive Policing Hot Spot

'Predictive policing projects like these are explicitly biased and prejudiced and rely on data that is explicitly biased and prejudiced, but nobody does anything about it.'
Image: Getty Images

“Number of one parent households.” “Number of social benefits recipients.” “Number of non-Western immigrants.”

These are just some of the demographic variables that have been used by CAS (Crime Anticipation System) to predict 125x125 meter crime “hot spots” across the Netherlands. The system, which relies entirely on automated data analysis, was piloted in Amsterdam in 2014 and rolled out nationally in 2017. Critics at the time warned that it was a potential slippery slope, one that could make law enforcement increasingly willing to embrace algorithmic models that perpetuate discriminatory practices like ethnic profiling.
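CAS’s actual model has not been made public, and the sketch below is not its code. It is a minimal, purely illustrative Python example of how a grid-based scorer built on variables like these could work; the feature names, weights, and cutoff are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class GridCell:
    cell_id: str                 # identifies one 125x125 meter square
    one_parent_households: int
    benefits_recipients: int
    non_western_immigrants: int
    past_incidents: int          # historical crime count for the cell

# Invented weights standing in for whatever a real model might have learned.
WEIGHTS = {
    "one_parent_households": 0.4,
    "benefits_recipients": 0.3,
    "non_western_immigrants": 0.5,
    "past_incidents": 1.0,
}

def risk_score(cell: GridCell) -> float:
    """Weighted sum of the cell's demographic and crime features."""
    return sum(weight * getattr(cell, name) for name, weight in WEIGHTS.items())

def hot_spots(cells: list[GridCell], top_n: int = 10) -> list[str]:
    """Return the IDs of the highest-scoring cells: the predicted 'hot spots'."""
    ranked = sorted(cells, key=risk_score, reverse=True)
    return [c.cell_id for c in ranked[:top_n]]
```

Even in a toy version like this, the demographic inputs do much of the work of deciding which neighborhoods get flagged, which is precisely the critics’ objection.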


Fast forward to 2020 and it seems like they were right to be concerned.

A report released late last month by Amnesty International revealed that Dutch law enforcement has been engaged in a number of predictive policing pilots and referred to the Netherlands as “one of the countries at the forefront of predictive policing in practice.” The report also calls on law enforcement to halt all predictive policing projects until legislative safeguards are put in place, and accuses the pilots of violating multiple human rights, including the right to privacy, the presumption of innocence, and the right to non-discrimination.

Blatant Discrimination

Amnesty’s report focuses in particular on a predictive policing pilot in Roermond (a municipality in the southeast of the Netherlands) called the Sensing Project, which uses cameras to capture the license plate, brand, model, and year of manufacture of passing cars. According to the report, which relies primarily on documents obtained through WOB requests (the Dutch equivalent of freedom-of-information requests), this data is fed into an algorithm that calculates “hits” on passing vehicles in real time. These hits are then automatically sent to police officers, who can decide whether or not they want to investigate.

The project is not only intrusive, the report claims, but discriminatory by design, since its aim is to fight “mobile banditry” (crimes like theft, pickpocketing, and drug trafficking), a term which explicitly excludes people of Dutch nationality and assumes that the offender is either of Eastern European origin or Romani, a minority ethnic group. Considering this, it is unsurprising that some of the variables that generate “extra points” when calculating the risk scores of passing vehicles include whether the car has an Eastern European license plate and the route it is taking (reconstructed via automatic license plate recognition cameras), leading to what the report describes as “the automation of ethnic profiling.”
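The report’s description suggests a rule-based score of roughly the following shape. This is a hypothetical sketch, not the project’s code; the country list, point values, and threshold below are all invented.

```python
from dataclasses import dataclass

# Illustrative subset; the project's actual country list is not public.
EASTERN_EUROPEAN_PLATES = {"PL", "RO", "BG", "LT"}

@dataclass
class VehicleObservation:
    plate_country: str           # country code read from the license plate
    brand: str
    model: str
    year: int
    route_matches_profile: bool  # route reconstructed from the camera network

def risk_points(obs: VehicleObservation) -> int:
    """Accumulate 'extra points' for each profile variable the vehicle matches."""
    points = 0
    if obs.plate_country in EASTERN_EUROPEAN_PLATES:
        points += 2              # invented weight for an Eastern European plate
    if obs.route_matches_profile:
        points += 1              # invented weight for a flagged route
    return points

def is_hit(obs: VehicleObservation, threshold: int = 2) -> bool:
    """A 'hit' is sent to officers, who decide whether to stop the car."""
    return risk_points(obs) >= threshold
```

In a scheme of this shape, the nationality-linked inputs are not an incidental side effect of the data; they are written directly into the scoring rules.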


Much to the dismay of human rights and privacy activists, programs like the Sensing Project have seen relatively little pushback.

“Article one of the Dutch constitution is the prohibition of discrimination,” Gerbrig Klos, one of the authors of the Amnesty report, told Motherboard. “If there’s one constitutional article that every person in the Netherlands knows, it’s that one. It’s at the front of every police station. Members of law enforcement and the government all say ‘ethnic profiling is prohibited’ and yet, at the same time, we find time and time again that predictive policing projects like these are explicitly biased and prejudiced and rely on data that is explicitly biased and prejudiced, but nobody does anything about it.”

“The big question I keep asking myself,” she continued, “is why have they allowed this?”

Strengthening Inequality

Predictive policing in the Netherlands has also been weaponized to target people with low incomes.

Last year it was revealed that Dutch authorities were using automated algorithms in a system known as SyRI (System Risk Indication) to predict fraud. The system was criticized by the UN special rapporteur on extreme poverty and human rights for using parameters that explicitly targeted people from low-income backgrounds and ethnic minorities. It was later found to be in violation of existing European human rights law and was finally discontinued earlier this year.


Marc Schuilenburg is a professor of law and criminology at the Vrije Universiteit Amsterdam and author of the upcoming book Hysteria: Crime, Media, and Politics. He argues that predictive policing not only reflects existing inequalities, but also strengthens them.

“Predictive policing in the Netherlands is always focused on petty crimes committed by the underclass. It’s never focused on big fraud committed by the upper class, which we as criminologists know is hugely prevalent,” Schuilenburg told Motherboard. “When you analyze predictive policing, you can’t analyze it as something that stands on its own. It’s always linked to what I term the ‘surveillance continuum’ of other methods of policing which explicitly target ethnic minorities and people from the underclass, such as hotspot policing.”

Predictive policing systems also create feedback loops, in which algorithms built on already biased data produce even more biased data that is then fed back in. In the case of the Sensing Project, for example, Amnesty’s report points out that investigations of cars flagged as “hits,” even when the hits are false positives, are entered into operational databases as relating to “mobile banditry.” This essentially means that the algorithm’s predetermined bias against Eastern European and Romani drivers is perpetually reconfirmed.
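As a hypothetical illustration of that loop (not the actual police database or its code), consider what happens when every investigated hit is logged under the same label, regardless of what the stop finds:

```python
# Stand-in for the operational database the report describes.
records: list[dict] = []

def log_investigation(plate_country: str, confirmed_offence: bool) -> None:
    """Log an investigated hit. The label is fixed, whatever the outcome."""
    records.append({"plate_country": plate_country,
                    "label": "mobile banditry",
                    "confirmed": confirmed_offence})

def share_of_records(plate_country: str) -> float:
    """Share of 'mobile banditry' records tied to one plate country.
    Because only profiled cars get flagged and investigated, this share grows
    for that group even when every stop turns up nothing."""
    if not records:
        return 0.0
    return sum(r["plate_country"] == plate_country for r in records) / len(records)

# Ten stops of Eastern European plates, none of which found anything, still
# produce ten new "mobile banditry" records pointing at that group.
for _ in range(10):
    log_investigation("PL", confirmed_offence=False)
print(share_of_records("PL"))   # 1.0
```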

No End in Sight

With new predictive policing and surveillance pilots appearing all across the Netherlands and little regulatory pushback, it doesn’t look like these practices will end anytime soon. As cities like Rotterdam turn themselves into self-described “smart cities,” where lampposts are fitted with sensors to detect burglaries, there will only be more data to dump into systems like CAS.


"I mostly miss discussion about these kinds of programs,” says Lotte Houwing, a policy analyst at the activist and privacy research organization Bits of Freedom. “There is this tendency to introduce these kinds of programs under the guise of a pilot. This is used as an excuse for lacking a solid legal basis. But then surveillance that is introduced (in whatever way) tends to stick around."

But for researchers like Schuilenburg, the accompanying move away from traditional notions of justice and criminality is just as dangerous as the predictive law enforcement technology itself.

“The history of criminal law is based on a person’s action: you almost always have to act,” he says. “But with this shift towards predictive technology and authorities trying to intervene on the basis of predictions, we have this shift from post-crime to pre-crime where the focus is to get inside your head and anticipate what you’re going to do. In that shift it’s your mind and your thoughts that become the object of suspicion.”