Amazon Says The Face Recognition Tech It Sells to Cops Can Now Detect ‘Fear’

Activists fear more family separations and round-ups as Amazon expands its face surveillance offerings.
Janus Rose
New York, US
Demonstrators protest Amazon contracts with ICE. Photo: Kevin Hagen / Getty Images

Amazon has faced public outrage for providing cloud services to the U.S. government, including law enforcement agencies that conduct mass raids and separate families at the southern border. Now, Amazon Web Services (AWS) has rolled out more terrifying features for its cloud-based facial recognition system—including, it claims, the ability to detect fear.

“Amazon Rekognition provides a comprehensive set of face detection, analysis, and recognition features for image and video analysis,” a blog post announcing the new features reads. “Face analysis generates metadata about detected faces in the form of gender, age range, emotions,” and other attributes such as whether the subject is smiling.


Emotion recognition is a facial analysis technique that has been marketed by private companies like Affectiva, Kairos, and Amazon. It works by training a machine learning system to look for certain features on a detected face that indicate emotional content. For example, a raised brow could indicate concern or bewilderment, while a downturned mouth could signal disgust.
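As a toy illustration of the idea (not any vendor's actual model), the mapping from facial measurements to emotion labels can be sketched as a rule-based classifier. Real systems learn such mappings from large labeled datasets; the feature names and thresholds below are invented purely to show the concept:

```python
# Toy sketch: mapping hand-crafted facial measurements to emotion labels.
# Real systems learn these mappings from labeled training data; the
# features and thresholds here are invented for illustration only.

def classify_emotion(brow_raise: float, mouth_curve: float) -> str:
    """Map two hypothetical measurements to a coarse emotion label.

    brow_raise: 0.0 (relaxed) to 1.0 (fully raised)
    mouth_curve: -1.0 (downturned) to 1.0 (smiling)
    """
    if brow_raise > 0.7 and mouth_curve < -0.3:
        return "FEAR"
    if brow_raise > 0.7:
        return "SURPRISED"
    if mouth_curve < -0.3:
        return "DISGUSTED"
    if mouth_curve > 0.3:
        return "HAPPY"
    return "CALM"

print(classify_emotion(0.9, -0.5))  # raised brow + downturned mouth -> FEAR
print(classify_emotion(0.1, 0.6))   # relaxed brow + smile -> HAPPY
```

The brittleness of rules like these is exactly what the research critique below points at: the same raised brow can accompany fear, surprise, or nothing at all.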

The AWS post says that Amazon has updated the range of detectable emotions for Rekognition’s face analysis to include “fear,” adding to a list of seven other emotional states: “Happy”, “Sad”, “Angry”, “Surprised”, “Disgusted”, “Calm”, and “Confused.”
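In practice, Rekognition's `DetectFaces` API returns these emotion labels with confidence scores inside its `FaceDetails` response. The sketch below, using Python, parses a hand-written sample response rather than making a live API call; the confidence numbers are invented for illustration:

```python
# Sketch: extracting the top-scoring emotion from a Rekognition-style
# DetectFaces response. A live call via the boto3 SDK would look like:
#   boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])
# The sample response below is hand-written; scores are invented.

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "FEAR", "Confidence": 81.2},
                {"Type": "SURPRISED", "Confidence": 11.5},
                {"Type": "CALM", "Confidence": 4.1},
            ],
            "Smile": {"Value": False, "Confidence": 97.0},
        }
    ]
}

for face in sample_response["FaceDetails"]:
    # Each face carries a ranked list of emotions; pick the highest score.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"{top['Type']} ({top['Confidence']:.1f}%)")
```

Note that the API always reports *some* confidence distribution over its emotion labels, whether or not the underlying expression actually reflects what the person feels.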

Despite Amazon's bold claims, the efficacy of emotion recognition is in dispute. A recent study reviewing over 1,000 academic papers on emotion recognition found that the technique is deeply flawed—there just isn't a strong enough correlation between facial expressions and actual human emotions, and common methods for training algorithms to spot emotions present a host of other problems.

Nevertheless, activists say these technologies are especially harmful in the hands of government agencies like Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP).

Last year, Amazon pitched its Rekognition system to ICE, triggering widespread backlash from human rights advocates and its own employees. In July, researchers discovered that ICE used a different facial recognition system to search through driver’s license databases in more than a dozen U.S. states.

“Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities,” said Audrey Sasson, the Executive Director of Jews For Racial and Economic Justice, in an email to Motherboard. “[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps.”

The harmful nature of facial recognition and analysis in the hands of law enforcement has caused some cities to rethink whether the technology can be deployed ethically. San Francisco, CA and Somerville, MA have banned municipal use of face recognition, and a similar measure is being considered in Cambridge, MA this fall.

The AWS update comes just a few days after a series of coordinated ICE raids in Mississippi in which 680 workers were arrested. Activists have responded with intensifying protests against Amazon and data surveillance firm Palantir, demanding an end to the companies' sale of technology to U.S. government agencies like ICE and CBP.

On Sunday, just before the AWS update rolled out, over 1,000 “Jews Against ICE” protesters occupied an Amazon store in New York City to condemn the company’s role in facilitating family separations and mass-deportations. About 40 people were arrested as protesters blocked entrances and led a service for Tisha B’Av, the Jewish day of mourning.

Sasson, who was at the Sunday action along with members of Jews For Racial and Economic Justice, points out that tech companies have played a significant role in historical atrocities—and today’s Silicon Valley giants now risk repeating history.
“Just as IBM collaborated with the Nazis, Amazon and Palantir are collaborating with ICE today. They’ve chosen which side of history they want to be on,” Sasson told Motherboard. “Working with groups like Mijente, we will hold Amazon accountable for its role in building ICE’s deportation machines, separating families, locking children in cages, and harming immigrant communities.”

Update: This story has been updated with additional context about the efficacy of emotion recognition.