Facial recognition is notorious for misidentifying women more than men, Asian and Black faces more often than white faces, and struggling to recognize anyone non-binary or transgender. State-of-the-art facial recognition systems have already led to false arrests and been banned in several cities.
So naturally, the U.S. Army now wants to take the technology a step further and help military and law enforcement agencies use facial recognition in the dark.
In a recently published paper, first reported by IEEE Spectrum, researchers announced the creation of the largest ever dataset of faces in both the visible and thermal spectrums, designed to train facial recognition models to work in low-light and nighttime settings. The paper was authored by scientists from the Army Research Laboratory in Maryland, defense contractor Booz-Allen Hamilton, West Virginia University, and several other colleges.
The paper’s corresponding author did not immediately respond to a request for comment.
The dataset is large, consisting of 549,712 images collected from 395 subjects, all of whom consented to the collection, according to the paper. And unlike some larger, widely used training databases, this one includes pictures of the subjects taken in varying degrees of light and by thermal cameras in darkness.
Beyond referencing possible applications for the military, law enforcement, and the healthcare industry, the paper's authors do not address how facial recognition should be deployed in the dark or the ethical implications of using even less accurate models.
The authors present the paper as an early step in developing more advanced, accurate thermal facial recognition algorithms. Viewed within the context of facial recognition's history, that's worrying, Clare Garvie, a senior associate at Georgetown Law's Center on Privacy and Technology, told Motherboard, because visible-spectrum facial recognition was itself never proven to be a valid forensic science before it was widely adopted in high-stakes scenarios.
“That’s what we skipped over with face recognition as it’s used today,” she said. “We never established the validity or viability of this as a forensic technique, and yet we’ve been using it for the last 20 years. I fear we’re going to do that with thermal … and what that will lead to is a risk of misidentification.”