British immigrant rights advocates won a landmark victory Tuesday, forcing the U.K. government to stop using a discriminatory algorithm in the country’s first-ever legal challenge of its kind.
The U.K.’s Home Office, which oversees immigration and law enforcement, among other responsibilities, announced that it would suspend use of the automated decision-making system, referred to as the Streaming Tool, which the agency used to determine whether visa applicants posed a high, medium, or low risk to the nation. The decision came as a result of a legal complaint filed with the British High Court by the nonprofit Joint Council for the Welfare of Immigrants (JCWI) and Foxglove Legal, a technology justice advocacy group.
The advocates argued that the Home Office’s visa algorithm drew conclusions about visa applicants based on their nationality and data from immigration raids that disproportionately targeted people from certain countries. As a result, it was prone to fast-tracking applicants from predominantly white countries and sorting people from other nations into a longer and more onerous review process. The Home Office’s use of the algorithm was first reported by the Financial Times.
Chai Patel, legal policy director for JCWI, described the algorithm as “a pretty obvious feedback loop of racism.”
“It’s deeply dodgy to use nationality in the first place and we wouldn’t have been very happy with that anyway, but then add to that the sort of unsophisticated way in which they were processing bad data to begin with and then putting it into a tool that just reinforces bias,” he told Motherboard. “It’s obviously a self-fulfilling prophecy.”
The retreat on the visa algorithm comes close on the heels of a blistering 275-page report on the culture of racism within the Home Office and the Windrush scandal, which saw hundreds of legal U.K. residents improperly detained and deported.
In its letter to JCWI and Foxglove announcing that it would cease to use the visa algorithm, the Home Office said it did not accept the allegations against it, but that it would redesign the system and “intends carefully to consider and assess the points [JCWI and Foxglove] raised in [their] Claim, including issues around unconscious bias and the use of nationality, generally.”
The timing of the Home Office’s decision to pull the algorithmic system—coming before it filed its formal response to the challenge with the court—is significant. It allowed the government to avoid disclosing more details about how the algorithm worked and sidestepped what could potentially have been a precedent-setting decision by the High Court that would affect other automated decision systems currently in use.
“This algorithm, while it’s the first to be challenged, I think is the first of many,” Martha Dark, Foxglove’s director and co-founder, told Motherboard. “Particularly where government services are concerned, we need more of a debate about how these algorithms are working. I think one thing that’s been very clear from this case is that it absolutely can’t be a black box.”