Facebook is experimenting with AI to fix embarrassing photos of people blinking—the latest advance in the ceaseless crusade to make you look hotter online.
The novel technique uses intelligent “in-painting” to convincingly replicate eyeballs. It works by training a Generative Adversarial Network (GAN) to recognize open eyes from a set of reference photos: essentially, showing the network what a person’s face should look like, so that retouching doesn’t compromise their unique facial structure.
By training the machine learning system on photos of people with their eyes open, the GAN learned characteristics like eye color and shape, while also considering factors such as a person’s pose or the photo’s lighting.
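The researchers’ full system operates on face images and exemplar photos, but the adversarial idea underneath it can be shown in miniature. The sketch below is purely illustrative and not from the paper: the data distribution, the linear “generator,” the logistic “discriminator,” and the learning rate are all invented for this toy. The generator learns to mimic scalar “real” data by fooling a discriminator that is simultaneously learning to tell real from fake:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy "real" data: scalars from N(4, 1.5) stand in for real photos.
def real_batch(n):
    return [random.gauss(4.0, 1.5) for _ in range(n)]

# Discriminator: logistic regression D(x) = sigmoid(w*x + b).
w, b = 0.1, 0.0
# Generator: linear map G(z) = a*z + c of noise z ~ N(0, 1).
a, c = 1.0, 0.0

lr, batch = 0.05, 32
for step in range(2000):
    # --- Discriminator step: push D(real) toward 1, D(fake) toward 0 ---
    xr = real_batch(batch)
    zs = [random.gauss(0.0, 1.0) for _ in range(batch)]
    xf = [a * z + c for z in zs]
    dr = [sigmoid(w * x + b) for x in xr]
    df = [sigmoid(w * x + b) for x in xf]
    grad_w = sum(-(1 - p) * x for p, x in zip(dr, xr)) / batch \
           + sum(p * x for p, x in zip(df, xf)) / batch
    grad_b = sum(-(1 - p) for p in dr) / batch + sum(df) / batch
    w -= lr * grad_w
    b -= lr * grad_b

    # --- Generator step: push D(fake) toward 1 (non-saturating loss) ---
    zs = [random.gauss(0.0, 1.0) for _ in range(batch)]
    xf = [a * z + c for z in zs]
    df = [sigmoid(w * x + b) for x in xf]
    grad_a = sum(-(1 - p) * w * z for p, z in zip(df, zs)) / batch
    grad_c = sum(-(1 - p) * w for p in df) / batch
    a -= lr * grad_a
    c -= lr * grad_c

# Draw fresh samples: their mean should drift toward the real mean (4.0).
samples = [a * random.gauss(0.0, 1.0) + c for _ in range(1000)]
print(sum(samples) / len(samples))
```

The same adversarial loop, scaled up to convolutional networks over pixels and conditioned on a reference photo of the person, is what lets the system in-paint eyes that match the rest of the face.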
A pair of Facebook researchers recently published their findings on the company’s website. For the experiment, they trained the GAN on two million 2D-aligned images. This dataset contained images of roughly 200,000 individuals, with at least one photo of each person with their eyes open. For external replication purposes, a separate dataset of roughly 100,000 images covering 17,000 celebrities was scraped from the web.
The results are pretty damn good. When asked to pick out the “real” photo, study participants either chose the AI-generated one or couldn’t tell which one was in-painted 54 percent of the time, according to the study.
However, the technique has its limitations. Glasses and certain hairstyles caused the network to trip up, the study said. It’s also sensitive to input quality: photos with little visual noise, showing people in non-extreme poses, tended to produce sharper eyes. And without a reference photo of the person with their eyes open, the GAN can’t perform personalized in-painting.
AI photo editing is understandably appealing because 1) people are lazy, and 2) people want to look good all the time. Adobe’s Sensei AI, for example, compares images to “professionally edited shots,” making it a breeze for newbies to get expert-looking photos. Adobe also has an “Open Closed Eyes” option, but it’s slightly more hands-on than Facebook’s feature.
In any case, I’m curious to see if and how Facebook rolls this out. Will consent be required before Facebook’s AI can retouch your image? Will we even know that it’s happened? Since law enforcement has mined Facebook photos before, will courts ever argue that Facebook images aren’t valid evidence because AI can manipulate them?
Until then, you can always turn off tagging and make your photos private.
This article originally appeared on VICE US.