RIO DE JANEIRO — The 8 million daily passengers riding the São Paulo metro and train system in Brazil could soon get a dose of invasive surveillance.
A bill passed in the São Paulo state assembly would authorize the installation of facial recognition cameras in train cars and stations. If approved by Governor João Doria, the measure would pave the way for a massive expansion in the use of facial recognition surveillance in Brazil, where civil society groups have only begun to grapple with the new technology. They warn that, given the near absence of public debate on the topic in Brazil, mass installation of the nascent technology could lead to a new wave of racist policing.
At least 20 of Brazil's 26 states have begun piloting or implementing facial recognition cameras since 2019, but none on the scale now proposed in São Paulo, the most populous city in the Western Hemisphere.
"This is not a silver bullet," said Bruno Bioni, a lawyer and founder of Data Privacy Brasil, a think tank in São Paulo.
The legislation carries the risk of triggering a “waterfall effect,” said Bioni. “And today, we do not have clear and robust evidence that facial recognition technology will not create or reinforce discriminatory practices, that it won’t lead to abusive uses.”
The issue, aside from the risks inherent in the massive data collection involved, said Bioni, begins with false positives. Facial recognition cameras scan and analyze crowds, searching for faces that correspond to images in an existing database. For security purposes, that usually means a mugshot bank maintained by local police. When a match is found, the system alerts the police, who then make an arrest.
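The matching pipeline described above can be sketched in miniature. The snippet below is purely illustrative, with made-up names, vectors, and a toy similarity function rather than anything drawn from the systems deployed in Brazil; it shows how a threshold-based matcher can flag an uninvolved person as a database hit.

```python
# Illustrative sketch of threshold-based face matching (all data hypothetical).
# Real systems compare learned face embeddings; plain vectors stand in here.
import math

def cosine_similarity(a, b):
    """Similarity between two vectors, 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical mugshot database: name -> face embedding.
mugshots = {
    "suspect_A": [0.9, 0.1, 0.3],
    "suspect_B": [0.2, 0.8, 0.5],
}

def match_face(face, threshold):
    """Return the best database match at or above the threshold, else None."""
    best_name, best_score = None, 0.0
    for name, ref in mugshots.items():
        score = cosine_similarity(face, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

bystander = [0.7, 0.3, 0.4]  # an uninvolved person who merely resembles suspect_A
print(match_face(bystander, threshold=0.80))  # permissive threshold: false positive
print(match_face(bystander, threshold=0.99))  # strict threshold: no match
```

Where operators set that threshold determines the trade-off: a permissive setting catches more genuine matches but also sweeps in lookalikes, which is the mechanism behind the mistaken apprehensions the article describes.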
But when a camera wrongly matches a face in the crowd to a database image, police apprehend the wrong person.
Such was the case in July 2019. Just two days into Rio de Janeiro's own rollout, a camera installed in the neighborhood of Copacabana mistakenly matched a woman’s face with that of Maria Lêda Félix da Silva, a fugitive convicted of homicide. The apprehended woman was released hours later when family members provided proof of ID.
Other mistaken apprehensions, like when police swooped into a bakery in Salvador, Bahia, and put their guns to the head of a 25-year-old man with special needs, weren't as peaceful.
"Even before the cameras, we knew that the police already have this racist approach," said Bruno Sousa, a researcher at O Panoptico, a monitoring project at the Candido Mendes University Center for Security and Citizenship Studies (CESeC) in Rio de Janeiro.
Cases like the ones above only become public when they make the news, said Sousa. Otherwise, Brazilian police do not readily provide information on the circumstances in which arrests are made. Groups like O Panoptico have taken to directly questioning police departments and filing freedom of information requests to analyze the data.
Initial research on the tech’s implementation isn’t promising.
“We are already seeing an absurdly high error rate,” said Sousa. Cameras installed around Rio’s Maracanã stadium ahead of the 2019 Copa America soccer tournament led to 11 apprehensions, according to a preliminary report by O Panoptico. Only four were true matches. “That’s a 63 percent rate of false alarms, in which people could have been arrested,” said Sousa. “And this is a system that shouldn’t even err one percent of the time.”
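The quoted figure follows directly from the report's numbers: 11 apprehensions minus 4 true matches leaves 7 false alarms, and 7 out of 11 is roughly 63.6 percent. As a quick check:

```python
# Arithmetic behind O Panoptico's Maracanã figure.
apprehensions = 11
true_matches = 4
false_alarms = apprehensions - true_matches  # 7
rate = false_alarms / apprehensions
print(f"{rate:.1%}")  # prints 63.6%
```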
Part of the issue is that the technology is young. “This is something new, something that’s undergoing so many transformations,” said Nina da Hora, a computer scientist at Rio de Janeiro’s Pontifical Catholic University. Chief among the new tech’s issues, da Hora pointed out, is its inaccuracy in identifying Black faces — algorithmic racism.
While facial recognition cameras commit minimal errors in recognizing the faces of white men, they misidentify Black women up to nearly 35 percent of the time, according to a 2018 study by Joy Buolamwini at the Massachusetts Institute of Technology (MIT). In Brazil, where more than half the population is Black or brown, mass installation could lead to a surge in mistaken apprehensions and unnecessary run-ins with the nation’s infamously deadly police.
Second, facial recognition tech is only as good as its underlying database. In Brazil, where some two thirds of the prison population is Black, police image banks used for facial recognition matching are likely to be disproportionately Black. “You’re depending on a non-diverse data set,” said da Hora. “And no one knows its provenance or whether it has been properly maintained.”
The mere act of leaving database management in the hands of state police can be problematic. After police released the woman mistakenly apprehended in Copacabana, for example, they found that the convict they mistook her for, Silva, wasn’t on the run at all. She was in jail — after having been arrested in 2015. Their database was out of date.
So far, São Paulo legislators have provided no information on which company will supply the cameras, who will manage the data, or even what database metro cameras will use, said Estela Waksberg Guerrini, a lawyer at the state’s Public Defender’s Office. “But if it is constructed with data from the police — and historically we know that the police arrest more Black people than white people, and not because they commit more crimes but because they are more actively pursued by police because of our racist history — then this database with more images of Black people will be used to feed the camera system and, naturally, more Black people than white people will be identified.”
Outside of São Paulo, that may already be the case. A CESeC analysis found that out of 151 arrests made using facial recognition technology throughout Brazil in the year 2019, 90.5 percent were of Black Brazilians.
Despite overwhelming evidence of the technology’s issues, Bioni said the overall conversation in Brazil was still immature. Raising the point that a number of cities in the US and Europe have already opted to place moratoriums on the installation of facial recognition cameras, he added, “this debate is yet to take place in Brazil. We need to have that conversation. This is a question of the technology’s maturity, whether it is sufficiently mature to be adopted or not. Today, all signs point to no.”