Facial Recognition Caught a Fugitive, But It's Still Got a Ways to Go

Even as facial recognition proves itself, bigger databases will likely lead to more mismatches.

Aug 14 2014, 6:00pm

Using facial recognition software, the FBI has caught a fugitive who was on the run for 14 years.

Neil Stammer, wanted for child abuse and kidnapping, fled US authorities in 1999. The case went cold, but recently one agent took an interest in it. By chance, another agent with the Diplomatic Security Service (DSS) was trying some new facial recognition software.

The purpose of the kit was to uncover passport fraud, and when Stammer's wanted poster photo was put through the scanner to test the software, his face popped up on a Nepalese visa application under a different name. Stammer was then tracked down and arrested in Nepal, according to the FBI.

This certainly shows the power of facial recognition tech. But it's important to remember the specific context in which it was used. The passport photo, and the visa photo it was matched to, created an ideal situation for law enforcement.

"It's a portrait style image that is full face, looking at the camera, and probably evenly lit. That's the best environment in which to do it," Patrick Usher, technical director of facial biometrics company Aurora, told me. It probably also helped that Stammer didn't radically alter his appearance after making his getaway.

Usher laid out some of the biggest problems that researchers have had to overcome when trying to make their software more accurate. There's the angle of the face in the photo to consider; the quality of the camera taking the picture; inadvertent pixelation; blur because of a moving target; and different lighting.

"If you've got a database of mugshots which are all uniformly lit, and the person is looking straight into the camera, and you're trying to compare that with a photo of someone whose head is at an angle or whose face is obscured or too small in the frame or unevenly lit that was taken in bright sunlight, then you've got less to compare," Usher said.

Some of these obstacles could soon be overcome, it seems. At the Chinese University of Hong Kong (CUHK), researchers claim they've created a system that is 99.15 percent accurate at recognising faces, irrespective of changes in lighting and camera angles.

"The key challenge of face recognition is to develop effective feature representations for reducing intra-personal variations while enlarging inter-personal differences," Professor Wang told Asian Scientist. In other words, software that scores many photos of the same person as similar, while keeping different people clearly distinct.
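The idea Wang describes can be sketched in a few lines. A minimal illustration, assuming a system has already converted each photo into a numeric "embedding" vector (the vectors below are made up and only four-dimensional; real systems use hundreds of dimensions learned from data):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two face embedding vectors:
    # close to 1.0 means the faces look alike to the system.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings for illustration only.
person_x_photo1 = np.array([0.90, 0.10, 0.30, 0.50])  # person X, evenly lit
person_x_photo2 = np.array([0.85, 0.15, 0.35, 0.45])  # person X, different lighting
person_y_photo  = np.array([0.10, 0.90, 0.60, 0.20])  # a different person

THRESHOLD = 0.9  # illustrative decision threshold

# Same person across varying conditions should score above the threshold...
print(cosine_similarity(person_x_photo1, person_x_photo2) > THRESHOLD)
# ...while different people should score below it.
print(cosine_similarity(person_x_photo1, person_y_photo) > THRESHOLD)
```

A good feature representation, in Wang's terms, is one that keeps the first score high (small intra-personal variation) and the second low (large inter-personal difference), whatever the lighting or camera angle.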

Facial recognition technology is undoubtedly getting better, and some of those major problems may be getting solved. But even as the tech proves its worth, there's another issue that's likely to just get worse: scalability. As the databases that are being used to compare the images increase in size, the likelihood of getting a mismatch inevitably increases.

"The bigger the database, the more false positives you're going to get, because somewhere out there is someone who looks like you," said Usher. A false positive is essentially a mismatch: the system decides that two images show the same person when they don't.

False positives are "one of the biggest problems with implementing surveillance applications," Usher pointed out. That problem is set to grow: the FBI plans to add millions of faces to its own biometric database.

For all the successful matches and improved accuracy rates, facial recognition isn't at the level of something like DNA testing, where each sample can be treated as effectively unique. The bigger the database, the easier it may be to hide.