The New York City Police Department has built a sprawling facial recognition network that may include more than 15,000 surveillance cameras in Manhattan, Brooklyn, and the Bronx, according to a massive crowdsourced investigation by Amnesty International.
Thousands of volunteers examined Google Maps street view images of the three boroughs and logged the locations of both public and privately owned surveillance cameras. They found 3,590 in Manhattan, 8,220 in Brooklyn, and 3,470 in the Bronx. The highest concentrations of cameras appeared to be in predominantly Black and brown neighborhoods like Brooklyn’s East New York, which was the city’s most surveilled neighborhood with 577 cameras. The project is still collecting data for Queens and Staten Island.
The NYPD conducted more than 22,000 facial recognition searches from October 2016 through October 2019, according to disclosures it made in a Freedom of Information lawsuit brought by the Surveillance Technology Oversight Project. Those searches were made possible by the camera network Amnesty documented, which shows there are few patches of sidewalk in the city on which New Yorkers can’t be watched, identified, and tracked.
“This sprawling network of cameras can be used by police for invasive facial recognition and risk turning New York into an Orwellian surveillance city,” Matt Mahmoudi, an artificial intelligence and human rights researcher at Amnesty, wrote in the group’s report. “You are never anonymous. Whether you’re attending a protest, walking to a particular neighbourhood, or even just grocery shopping—your face can be tracked by facial recognition technology using imagery from thousands of camera points across New York.”
Amnesty also modeled the field of vision for the NYPD’s Argus wide-area aerial surveillance cameras and estimated that they can cover 200 meters, or about two city blocks.
The human rights organization embarked on the volunteer survey of surveillance cameras in New York in large part because the NYPD has stonewalled previous efforts to get information about the extent of its surveillance programs.
“There has been a glaring lack of information around the NYPD’s use of facial recognition software—making it impossible for New Yorkers to know if and when their face is being tracked across the city,” Mahmoudi wrote. “The NYPD’s issues with systemic racism and discrimination are well-documented—so, too, is the technology’s bias against women and people of colour. Using FRT with images from thousands of cameras across the city risks amplifying racist policing, harassment of protesters, and could even lead to wrongful arrests.”