Lawmakers have introduced a bill that would place a moratorium on the use of facial recognition technology by government and law enforcement, an effort that represents Congress' latest attempt to roll back the controversial technology.
The Facial Recognition and Biometric Technology Moratorium Act—introduced by Senators Ed Markey (D-MA) and Jeff Merkley (D-OR) and Representatives Pramila Jayapal (D-WA), Ayanna Pressley (D-MA), and Rashida Tlaib (D-MI)—proposes a total ban on the use of facial recognition technology that can only be lifted by an act of Congress. It also prohibits the use of other biometric systems, such as voice and gait recognition, by federal and local entities. The bill is co-sponsored by Senators Bernie Sanders (I-VT), Elizabeth Warren (D-MA), and Ron Wyden (D-OR).
The legislation would also give individuals and state attorneys general the right to take legal action against the federal government if their biometric information is gathered or used by these technologies.
“We are seeing continued use of facial recognition platforms and technologies by our government and law enforcement, resulting in reports of discriminatory outcomes that have put innocent people behind bars," Sen. Merkley said in a statement to Motherboard. "It’s clear that we can’t rely on private companies to implement their own moratoria on technology that isn’t ready for prime time.”
Facial recognition analyzes images of human faces to identify or track people, and has been repeatedly proven to be inaccurate and racist. Researchers have noted how this endangers people from marginalized communities, especially when the technology is used by law enforcement.
“People should be able to seek medical treatment, attend religious services, and visit friends and family without worrying that government agencies are keeping tabs on their every movement,” Carol Rose, executive director of the ACLU of Massachusetts, said in a press statement announcing the new bill.
Facial recognition already has a track record of enabling discrimination. In 2020, Robert Williams, a Black man from Detroit, was wrongfully arrested for shoplifting and detained for thirty hours after being misidentified by facial recognition software used by the Detroit Police Department. The software matched Williams to an image from grainy surveillance footage, and he was then picked from a photo lineup by a security guard who wasn't actually present for the incident. The previous year, Detroit police misidentified another Black man, Michael Oliver, using the technology.
Oliver, Williams, and the ACLU have since filed lawsuits against the Detroit Police Department for their use of the technology.
“There's still no enforceable obligation into how they should disclose the standards for the system to be used, or the standards that investigators must meet in order to use it, which are all things that are clear deficiencies,” Kate Ruane, senior legislative counsel at the ACLU, told Motherboard.
A prominent 2018 study by the MIT Media Lab found that facial recognition products from IBM, Microsoft, and Face++ were more accurate in identifying male subjects than female subjects. Face++ showed a 20.1 percent difference in accuracy between the genders.
When the researchers evaluated how the products classified subjects based on skin tone, using the dermatologist-approved Fitzpatrick skin type classification, all three performed worst on darker-skinned females. The study notes that “93.6 percent of faces misgendered by Microsoft were those of darker subjects.”
“This is also just technology that's being used in a system to augment biases and speed up procedures that are racist,” Caitlin Seeley George, director of campaigns and operations at digital rights group Fight For The Future, told Motherboard. “We know that there's no way for it to be used safely, that it is only going to be vastly used in cases that do more harm than good.”
This legislation follows years of efforts by grassroots organizations that have organized locally against the use of facial recognition technology. Twenty cities across the country have officially banned its use, and Vermont became the first and only state to do so in October 2020.
“It's in their [technology companies'] interest to push for weaker laws if they want authorization to be able to sell access to this technology to law enforcement,” said the ACLU's Ruane. “Standard setting is something to be done far down the road, if we ever do it at all.”