The UK government is quietly developing a “murder prediction” system designed to identify potential killers before they commit a crime. Yes, it is, very literally, the plot of Minority Report.
At least on the surface. There are, of course, some differences, like how the UK’s version is not powered by three people with the powers of precognition who are perpetually floating in water tanks. The real version runs on algorithms being fed your personal data—typical dystopian stuff, just not quite the sci-fi kind.
The program was originally slated to be called the “homicide prediction project,” but I guess that was deemed too terrifying and on the nose. It was swiftly rebranded to the more euphemistic, bureaucratically toothless “Sharing Data to Improve Risk Assessment.”
The Ministry of Justice says it’s just research, nothing concrete yet. A UK-based investigative journalism outfit called Statewatch dug up some FOI documents revealing the program’s ambition: feeding thousands of people’s data into algorithms to sniff out potential murderers. This includes deeply personal information like mental health status, addictions, self-harm history, and contact with police—even if you were the one calling for help.
All you wanted to do was call the cops to scare off a bad guy, and now you’re part of a murder prediction model. Who knows exactly what role you’ll play in its predictions?
The MoJ insists it’s all very ethical and only includes people with prior convictions, but anyone who has observed how the world actually works knows it will not stay limited to people with prior convictions. It will be used to flag anybody the algorithm’s curators deem unfit for their corrupted vision of a perfect society.
This is especially true of supposedly unbiased, rigidly objective, algorithmic technology. Tech folks trying to sell a product love to position technologies like this as concerned only with cold, hard data, unburdened by prejudice. But that’s not how any of this works. It has been found time and again that people in the tech world constantly insert their own biases into everything they build.
Crime-predicting technologies have an inherent flaw: the biases of the people building them shine through. But police departments all over the world, looking to put as little effort as humanly possible into investigating crime, will continue investing in crime prediction technologies that almost exclusively target ethnic minorities and the poor.
Even if thousands of experts in the field come together to collectively condemn such technologies.