On Thursday, a coalition of 48 civil rights and advocacy groups organized by Athena asked the Federal Trade Commission to exercise its rulemaking authority by banning corporate facial surveillance technology, banning continuous corporate surveillance of public spaces, and protecting the public from data abuse.
“The harms caused by this widespread, unregulated corporate surveillance pose a direct threat to the public at large, especially for Black and brown people most often criminalized using surveillance,” the coalition wrote in an open letter. “Given these dangers, we’re calling on the Federal Trade Commission (FTC) to use its rulemaking authority to ban corporate use of facial surveillance technology, ban continuous surveillance in places of public accommodation, and stop industry-wide data abuse.”
While a number of firms offer networked surveillance devices to try to make homes "smart," the coalition uses Amazon as a case study in how dangerous corporate surveillance can become (and the sorts of abuses that can emerge) when in the hands of a dominant and anti-competitive firm. From Amazon's Ring—which has rolled out networked surveillance doorbells and car cameras that continuously surveil public and private spaces—to Alexa, Echo, and Sidewalk, the company has launched numerous products and services to convince consumers to generate as much data as possible for the company to eventually capitalize on.
“Pervasive surveillance entrenches Amazon’s monopoly. The corporation’s unprecedented data collection feeds development of new and existing artificial intelligence products, further entrenching and enhancing its monopoly power,” the coalition letter argues.
Amazon did not respond to a request for comment.
From this nexus of monopolistic and unchallenged power, the coalition draws a long list of abuses committed by Amazon that have harmed consumers, communities, and uninvolved bystanders. Ring's surveillance devices have been hacked multiple times, have leaked owners’ Wi-Fi passwords, and have shared users' locations through the Neighbors app. Vulnerabilities in Alexa risked revealing personally identifiable information, and all this takes place against a backdrop of opaque security protocols that forces consumers to opt out of surveillance conducted without their consent.
On Ring's Neighbors app, racial profiling has been gamified to encourage and escalate surveillance of "suspicious" people. The company collects personal information on children—a potential violation of the Children's Online Privacy Protection Act—and has seen adoption of its various surveillance devices increase in schools, libraries, and communities across the country. Paired with Amazon's development of deeply biased facial surveillance technology and its partnerships with the police and fire departments of over 2,000 cities, the group argues the potential for abuse outstrips a threshold anyone should be comfortable with.
"This type of surveillance is illegal under the FTC Act in Section 5 and in particular the section that talks about unfair and deceptive practices," said Jane Chung, the Big Tech Accountability Advocate at Public Citizen, in an interview. "There's a list of three things that have to be true in order for a practice to be unfair and deceptive according to the FTC. Number 1: it has to cause substantial injury. Number 2: the injury can't be avoidable. And number 3: the injury isn't outweighed by benefits."
Chung pointed to biometric surveillance as a painfully clear example of an unfair and deceptive practice that the FTC has the authority to stop. Facial surveillance technology is deeply biased and has already resulted in over-policing and wrongful arrests. Biometric surveillance not only takes place without the consent of bystanders, but there aren’t countervailing benefits given how quickly it lends itself to more surveillance, more policing, and more abuse.
“Rulemaking is needed to stop widespread systematic surveillance, discrimination, lax security, tracking of individuals, and the sharing of data. While Amazon’s smart home ecosystem, facial surveillance technology, and e-learning devices provide a good case study, these rules must extend beyond this one technology corporation to include any entity collecting, using, selling, and/or sharing personal data,” the letter continues.
On Wednesday, FTC Chair Lina Khan spoke before the House Subcommittee on Consumer Protection and Commerce, commenting on 16 bills and how they might affect the FTC's ability to enforce fair business practices. Khan's testimony, which touched on fraud "supercharged" by monopolistic platforms with business models that have allowed and perversely incentivized such practices, is of interest because it rhymes with the coalition's own warnings about corporate surveillance abuses supercharged by monopolistic behavior.
“Significant market consolidation deprives consumers, workers, and independent businesses of choice, further enabling dominant firms to engage in unfair practices,” Khan said. “As the wave of privacy abuses in recent years has shown, market dominance often allows companies to renege on commitments, evade the law, and repeatedly violate Commission orders.”
Already, the FTC has moved against abusive data practices. App developer Everalbum was recently ordered to delete data it kept from customers who deactivated their accounts, as well as facial surveillance models and algorithms developed using that data. Furthermore, the company must obtain "express consent" going forward from consumers to use facial recognition technology on their photos and videos.
"This is a means to activate and engage a broader group of organizations outside of the traditional anti-monopoly space," said Vasudha Desikan, political director at the Action Center on Race and the Economy, in an interview. "This is for a lot of organizers who have been calling Amazon out for selling white nationalist and Islamophobic products on their platform. For activists who have been fighting Ring contracts with the police. Getting them involved is deeply helpful because they're the ones that came to the field and moved the public narrative. We have to make corporate power an issue at the ballot box."