Fingerprint readers and iris scanners are just a few of the biometric security mechanisms that manufacturers have been putting in smartphones, tablets, and laptops lately. But while slick and futuristic, these new and unique methods for securing mobile devices inevitably have new and unique vulnerabilities.
Take face authentication, for example. To ensure a stranger can't access someone's phone just by holding a picture of the owner's face in front of its camera, devices that offer face-unlock features have recently implemented ways of detecting motion and "liveness" in a face—essentially, looking for facial movement patterns like blinking in order to tell a "live" face from a flat picture or video.
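The blink-based liveness checks described above can be sketched with a common computer-vision heuristic: the eye aspect ratio (EAR), which compares the vertical lid-to-lid distances of an eye's landmarks to its horizontal width. The function names, landmark ordering, and thresholds below are illustrative assumptions; commercial face-unlock systems are proprietary and may work differently.

```python
# Illustrative blink-detection sketch using the eye aspect ratio (EAR).
# Thresholds and landmark layout are assumptions for illustration only.
import math

def ear(eye):
    """Eye aspect ratio from six (x, y) landmarks:
    p1/p4 are the horizontal corners, p2/p3 the upper lid, p5/p6 the lower."""
    p1, p2, p3, p4, p5, p6 = eye
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Open eyes give an EAR around 0.3; the ratio collapses toward 0
    # when the lids close.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def detect_blink(ear_sequence, threshold=0.2, min_closed_frames=2):
    """Flag a blink when the EAR dips below threshold for a few
    consecutive frames, then treat the face as 'live'."""
    closed = 0
    for value in ear_sequence:
        if value < threshold:
            closed += 1
            if closed >= min_closed_frames:
                return True
        else:
            closed = 0
    return False
```

A flat photo held in front of the camera never produces the dip in EAR, which is exactly the check the researchers' animated VR model was built to defeat.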
But in a paper presented earlier this month at the USENIX Security Symposium in Austin, TX, a group of researchers was able to circumvent that safeguard using a virtual reality model of a person's head recreated from a handful of photos taken from social media.
The researchers show it's possible to defeat modern face authentication systems by creating a virtual model derived from high-resolution photos of the device's owner. Essentially, they convinced the device it was looking at a live face by pointing its camera at a VR display rendering the 3D head model; because the model's movements track the VR device's accelerometer and gyroscope readings, it moves as realistically as a real head would. The researchers could then further manipulate the 3D head model within the VR environment to make realistic facial movements like smiling or raising an eyebrow, which face authentication systems often prompt a user to do.
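The core idea of the sensor-driven animation can be illustrated with a few lines of linear algebra: as the VR device reports a rotation, the renderer applies the same rotation to the head model's vertices, so the spoofed face exhibits the motion parallax of a real head. This is a minimal single-axis sketch under assumed names; the actual rendering pipeline in the paper is far more involved.

```python
# Minimal sketch: apply a gyroscope-reported rotation to a 3D head model
# so its on-screen motion matches the device's physical motion.
# The single-axis (yaw) simplification is an assumption for illustration.
import math

def yaw_matrix(theta):
    """3x3 rotation matrix about the vertical (y) axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def rotate(vertices, m):
    """Apply a 3x3 rotation matrix to a list of (x, y, z) vertices."""
    return [tuple(sum(m[r][k] * v[k] for k in range(3)) for r in range(3))
            for v in vertices]

# Each frame, re-render the head under the rotation the sensors report:
head = [(0.0, 0.0, 1.0), (0.1, 0.2, 0.9)]  # toy stand-ins for mesh vertices
turned = rotate(head, yaw_matrix(math.pi / 2))
```

The same principle extends to full three-axis orientation and to the scripted expressions (smiles, raised eyebrows) layered on top of the tracked motion.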
All five of the face authentication systems tested were successfully spoofed with 3D models built from high-resolution photos. Lower-resolution photos pulled from social media also spoofed all but one of the systems, though with somewhat lower success rates than the high-resolution versions.
"We argue that such VR-based spoofing attacks constitute a fundamentally new class of attacks that point to serious weaknesses in camera-based authentication systems: Unless they incorporate other sources of verifiable data, systems relying on color image data and camera motion are prone to attacks via virtual realism," the researchers write, suggesting that a robust face authentication system would need to incorporate some kind of non-public imagery of the user, like a skin heat map.
"Given the widespread nature of high-resolution personal online photos, today's adversaries have a goldmine of information at their disposal for synthetically creating fake face data."