Oakland Becomes Third U.S. City to Ban Facial Recognition

Oakland just joined San Francisco, CA and Somerville, MA in banning the use of facial recognition.

Oakland, California just became the third U.S. city to ban the use of facial recognition in public spaces.

An ordinance passed Tuesday night prohibits the city of Oakland from “acquiring, obtaining, retaining, requesting, or accessing” facial recognition technology, which it defines as “an automated or semi-automated process that assists in identifying or verifying an individual based on an individual's face.”


The ordinance amends a 2018 law which requires any city staff member to get approval from the chair of Oakland's Privacy Advisory Commission before “seeking or soliciting funds” for surveillance technology. State and federal funding for surveillance technology must also be approved by the chair, per the ordinance.

San Francisco banned the use of facial recognition by police and city government agencies a month ago, making it the first U.S. city to do so. Somerville, Mass., passed a similar ban last month. The passage of a comparable ordinance in Oakland shows there's momentum in major U.S. cities behind the idea that facial recognition shouldn't just be regulated, but banned entirely.

According to a public memo by Rebecca Kaplan, Oakland City Council President, the ban was instituted on the basis that facial recognition is often inaccurate, lacks established ethical standards, is invasive in nature, and has a high potential for government abuse.

"Face recognition technology runs the risk of making Oakland residents less safe as the misidentification of individuals could lead to the misuse of force, false incarceration, and minority-based persecution," Kaplan said.

In a report to Oakland's Public Safety Committee, Chief of Police Anne Kirkpatrick said that the Oakland Police Department doesn't currently have any technology that could be described as "facial recognition" and doesn't have any plans to acquire it. However, Kirkpatrick argued that facial recognition could help law enforcement, and advised against a total ban.


“Staff does believe that Oakland’s current surveillance technology provides adequate thresholds for reviewing any possible future requests to test or purchase [facial recognition technology],” Kirkpatrick said.

Kirkpatrick said that the nearby San Mateo Sheriff's Office accesses a shared "in-house facial recognition system" through the Northern California Regional Intelligence Center (NCRIC), a regional law enforcement agency. She recommended that the Oakland Police Department draw on information from this intelligence center instead. The NCRIC also uses Palantir.

Under Kirkpatrick's suggestion, then, there wouldn't be a total ban on facial recognition, just a restriction on where Oakland police can use it.

Tracey Rosenberg—a spokesperson for Oakland Privacy, a grassroots coalition of citizens dedicated to increasing "public transparency and oversight" over the use of surveillance—said in a phone call that the Oakland Police Department’s proposed compromise didn’t go far enough.

“The problem with that is it essentially becomes an outsourcing of a problematic technology,” Rosenberg said. “By investing in it, by using it, by normalizing it, we set up a situation where Oakland may not be engaging in real-time facial recognition, but they’re sort of funding the landscape of the development of the technology and its ubiquitous use in law enforcement.”

Last year, Oakland Privacy campaigned in favor of a Surveillance Equipment Transparency Ordinance, which now requires any city surveillance program to have public use and privacy policies, data sharing disclosures, and yearly use reports. Similar ordinances have passed in nine other cities and in Santa Clara County.

“When you have a police force that has a history of racial profiling, and then you give them software that we know is racially biased and unable to clearly distinguish the face of darker skinned people, and will potentially be informing police in real time that darker skinned people are criminals when in fact they are not by false matches,” Rosenberg said, “there’s potential there for horror and tragedy.”