Motherboard

Amazon Says The Face Recognition Tech It Sells to Cops Can Now Detect ‘Fear’

Activists fear more family separations and round-ups as Amazon expands its face surveillance offerings.

by Janus Rose
Aug 13 2019, 9:47pm

Kevin Hagen / Getty Images

Amazon has faced public outrage for providing cloud services to the U.S. government, including law enforcement agencies that conduct mass raids and separate families at the southern border. Now, Amazon Web Services (AWS) has rolled out more terrifying features for its cloud-based facial recognition system—including, it claims, the ability to detect fear.

“Amazon Rekognition provides a comprehensive set of face detection, analysis, and recognition features for image and video analysis,” a blog post announcing the new features reads. “Face analysis generates metadata about detected faces in the form of gender, age range, emotions,” and other attributes such as whether the subject is smiling.

Emotion recognition is a facial analysis technique that has been marketed by private companies like Affectiva, Kairos, and Amazon. It works by training a machine learning system to look for certain features on a detected face which indicate emotional content. For example, a raised brow could indicate concern or bewilderment, while a downturned mouth could show feelings of repulsion.

The AWS post says that Amazon has updated the range of detectable emotions for Rekognition’s face analysis to include “fear,” adding to a list of seven other emotional states: “Happy”, “Sad”, “Angry”, “Surprised”, “Disgusted”, “Calm”, and “Confused.”
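According to AWS's documentation, this emotion metadata is returned by Rekognition's DetectFaces API when called with all facial attributes enabled. As a rough sketch of what developers see on their end (the live boto3 call is shown only in comments, since it requires AWS credentials; the parsing helper and sample response below are illustrative, not Amazon's code):

```python
# Sketch of reading emotion metadata from a Rekognition DetectFaces
# response. The live call (commented out) needs AWS credentials:
#
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_faces(
#       Image={"Bytes": open("photo.jpg", "rb").read()},
#       Attributes=["ALL"],  # include emotions, age range, etc.
#   )
#   face_detail = response["FaceDetails"][0]

def top_emotion(face_detail):
    """Return the (type, confidence) pair Rekognition scored highest."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Illustrative face detail shaped like the documented API output.
sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 12.3},
        {"Type": "FEAR", "Confidence": 81.7},
        {"Type": "SURPRISED", "Confidence": 6.0},
    ]
}

print(top_emotion(sample_face))  # -> ('FEAR', 81.7)
```

Note that the API reports a confidence score per emotion, not a single label—developers decide what threshold counts as a "fearful" face.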

Despite Amazon's bold claims, the efficacy of emotion recognition is in dispute. A recent study reviewing over 1,000 academic papers on emotion recognition found that the technique is deeply flawed—there just isn't a strong enough correlation between facial expressions and actual human emotions, and common methods for training algorithms to spot emotions present a host of other problems.

Nevertheless, activists say these technologies are especially harmful in the hands of government agencies like Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP).

Last year, Amazon pitched its Rekognition system to ICE, triggering widespread backlash from human rights advocates and its own employees. In July, researchers discovered that ICE used a different facial recognition system to search through driver’s license databases in more than a dozen U.S. states.

“Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities,” said Audrey Sasson, the Executive Director of Jews For Racial and Economic Justice, in an email to Motherboard. “[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps.”

The harmful nature of facial recognition and analysis in the hands of law enforcement has caused some cities to rethink whether the technology can be deployed ethically. San Francisco, CA and Somerville, MA have banned municipal use of face recognition, and a similar measure is being considered in Cambridge, MA this fall.