
Researchers Create 'Master Faces' to Bypass Facial Recognition

According to the paper, their findings imply that facial recognition systems are “extremely vulnerable.”
Image: Getty Images

Researchers have demonstrated a method to create "master faces": computer-generated faces that act like master keys for facial recognition systems and can each impersonate several identities with what the researchers claim is a high probability of success.

In their paper, researchers at the Blavatnik School of Computer Science and the School of Electrical Engineering in Tel Aviv detail how they created nine "master key" faces that together impersonate almost half the faces in a standard dataset, as judged by three leading face recognition systems. The researchers say their results show these master faces can successfully impersonate over 40 percent of the population in these systems without any additional information or data on the people being impersonated.


The paper cites previous research that demonstrated a similar method for creating master fingerprints. According to the paper, their findings imply that facial recognition systems are “extremely vulnerable.”

The "master key" faces tended to be older, and didn't have glasses or facial hair. 


The researchers tested their methods against three deep face recognition systems: Dlib, FaceNet, and SphereFace. Lead author Ron Shmelkin told Motherboard that they used these systems because they are capable of recognizing “high-level semantic features” of the faces that are more sophisticated than just skin color or lighting effects.

The researchers used a StyleGAN to generate candidate faces, then paired an evolutionary algorithm with a neural network to optimize them. The evolutionary strategy produces successive iterations, or generations, of candidates with varying success rates. The neural network is trained on those candidates to recognize the most promising ones, which teaches it to predict a candidate’s success and, in turn, steer the algorithm toward generating candidates with a higher probability of passing.
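The loop described above can be illustrated with a toy sketch. This is not the paper's actual pipeline: the real system evolves StyleGAN latent vectors scored against Dlib, FaceNet, and SphereFace, and uses a trained predictor network. Here a candidate is just a plain vector, "matching" is a cosine-similarity threshold, and all dimensions and thresholds are illustrative assumptions.

```python
import random
import math

# Toy stand-in for the paper's pipeline: a candidate "face" is a plain
# vector, and its fitness is how well it matches a gallery of identity
# vectors. All constants below are illustrative, not from the paper.
DIM = 16          # latent dimensionality (assumed, toy value)
THRESHOLD = 0.75  # cosine similarity needed to count as a match (assumed)

random.seed(0)

def random_vec(dim=DIM):
    return [random.gauss(0, 1) for _ in range(dim)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def coverage(candidate, gallery):
    """Fraction of gallery identities this candidate 'impersonates'."""
    hits = sum(1 for g in gallery if cosine(candidate, g) >= THRESHOLD)
    return hits / len(gallery)

def mean_similarity(candidate, gallery):
    """Smooth fitness signal used for selection (coverage is too sparse)."""
    return sum(cosine(candidate, g) for g in gallery) / len(gallery)

def evolve(gallery, pop_size=30, generations=40, sigma=0.3):
    """Simple (mu + lambda)-style evolutionary search: keep the fittest
    candidates each generation and fill the population with mutated copies."""
    population = [random_vec() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population,
                        key=lambda c: mean_similarity(c, gallery),
                        reverse=True)
        parents = scored[: pop_size // 5]  # keep the top 20%
        children = [
            [x + random.gauss(0, sigma) for x in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        population = parents + children
    return max(population, key=lambda c: coverage(c, gallery))

# A gallery of 200 toy "identities", clustered so that one well-placed
# candidate can cover many of them at once.
center = random_vec()
gallery = [[c + random.gauss(0, 0.4) for c in center] for _ in range(200)]
best = evolve(gallery)
print(f"coverage of best candidate: {coverage(best, gallery):.2f}")
```

The design choice mirrored here is that the paper's approach needs a smooth optimization signal: a raw pass/fail match rate is too sparse to guide search, which is why the researchers train a network to predict candidate success rather than relying on threshold hits alone.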

“We are interested in further exploring the possibility of using the master faces generated by our method in order to help protect existing facial recognition systems from such attacks,” Shmelkin told Motherboard. 

The researchers even predict that their master faces could be animated using deepfake technology to bypass liveness detection, which is used to determine whether a biometric sample is real or fake. 

The paper also notes that white males over the age of 60 in the University of Massachusetts’ Labeled Faces in the Wild (LFW) dataset tended to be less varied than younger groups, so much of that group could be covered by a single older master face. Additionally, only two of the nine master faces created were female, which the paper notes matches the “much lower frequency” of female faces in the LFW dataset (22 percent).

The success of these attacks shows how facial recognition software can be flawed and biased. Its continued use by law enforcement has resulted in multiple wrongful arrests, even as it has proven easy to fool.