If you, like me, suddenly find your Instagram flooded with videos of your friends digitally cosplaying Britney Spears, Virat Kohli, Hulk or Titanic’s Rose, you may be equal parts shocked, amused and confused as to why their cheesy selfies are splattered all over the shitposts that breed on the internet. But a tiny line inscribed on top of their eerily realistic face swap with celebrities is enough to explain everything: made with REFACE APP.
The Reface app, formerly known as Doublicat, uses deep-tech artificial intelligence (AI) to create hyper-realistic face swaps. Essentially, it allows you to impose any selfie, no matter how unflattering, onto bite-sized source videos from the app's bank. This preloaded bank allows you to reimagine yourself within minutes, whether you want to strike an iconic Iron Man pose, wave your hands in the air like you're Steve Carell from The Office or sing with a stance that mirrors Freddie Mercury's. In other words, this viral app is helping you create deepfakes with an ease like never before.
In case you didn't already know, deepfake AI, the technology that replaces a person in an existing image or video with someone else's likeness using a machine-learning technique called a GAN (generative adversarial network), is categorised by criminology experts as the biggest technological threat humanity will face moving forward, even bigger than killer robots. That is because deepfakes are perceived to be harder to defeat than other AI-enabled future crimes, and could cause serious harm or generate serious profit. In their short span of existence, deepfakes have already been abused for everything from manipulating election campaigns to creating non-consensual pornographic imagery, seeding a minefield of misinformation on the internet that is often difficult to detect and blurring the lines of consent.
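For the curious, the "adversarial" part of a GAN is easier to see in code than in prose. The sketch below is a deliberately tiny, illustrative toy: the "data" is just one-dimensional numbers instead of face images, and the generator and discriminator are simple linear models with hand-derived gradients, nothing like the deep networks Reface or any real deepfake system actually uses. It shows only the core idea: a generator learns to produce fakes, a discriminator learns to catch them, and each pushes the other to improve.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real data": samples from a normal distribution centred at 4.
def real_samples(n):
    return rng.normal(4.0, 1.0, n)

# Generator: turns random noise z into a fake sample a*z + b.
def generate(z, a, b):
    return a * z + b

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Discriminator: probability (0..1) that x came from the real data.
def discriminate(x, w, c):
    return sigmoid(w * x + c)

a, b = 1.0, 0.0      # generator parameters (starts producing N(0, 1))
w, c = 0.1, 0.0      # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    # 1. Train the discriminator to tell real from fake.
    x_real = real_samples(batch)
    x_fake = generate(rng.standard_normal(batch), a, b)
    d_real = discriminate(x_real, w, c)
    d_fake = discriminate(x_fake, w, c)
    # Gradients of the binary cross-entropy loss, derived by hand.
    grad_w = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # 2. Train the generator to fool the discriminator
    #    (non-saturating loss: minimise -log D(G(z))).
    z = rng.standard_normal(batch)
    d_fake = discriminate(generate(z, a, b), w, c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

# After the back-and-forth, the generator's fakes should have drifted
# toward the real distribution's mean of 4.
fakes = generate(rng.standard_normal(1000), a, b)
print(f"fake mean after training: {fakes.mean():.2f} (real mean is 4.0)")
```

Swap the 1-D numbers for millions of face pixels and the linear models for deep convolutional networks, and you have the basic recipe behind face-swap tech.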
Given all this, the viral Reface app, which has been downloaded over 25 million times in more than 100 countries across the globe, could make the impending deepfake crisis even tougher to beat.
Faraz Arif Ansari, an Indian filmmaker, is one such individual who has faced harassment and bullying made possible by the Reface app. Last week, Ansari opened his Instagram to find photos of his face imposed onto objectionable images accompanied with homophobic slurs.
“As a queer Muslim, I have faced a lot of trolling, bullying and harassment growing up, and the way people were trolling me with photos of my face imposed on scantily clad lingerie models to call me things like ‘hoe’, or with jock straps across it, reminded me of that,” Ansari tells VICE. This, he mentions, was done without his consent, with trolls using an image of Ansari from an old interview they found on Google.
Interestingly, this vicious trolling happened just a few days after Ansari had downloaded the app himself, fascinated by its ability to turn a person’s face into humorous GIFs and memes.
Reface is a Ukrainian start-up with the business incorporated in the U.S. The app offers a free three-day trial, after which a subscription of about $16 unlocks a wider range of videos to choose from.
And though it's not the first app to swap out faces at the touch of a button, its ease of use, accessibility and creepily slick results put deepfake tech in everyone's hands. While many have embraced the app for a few quick chuckles, many others are concerned about the damage such tech can do.
“The app normalises deepfakes, and not everyone understands the concerns arising from them because not everyone has the digital know-how to differentiate what is real and what isn’t,” Apurva Singh, a privacy expert and volunteer legal counsel at Software Freedom Law Center, India, told VICE. Singh points out that the main problem with deepfakes is that not everyone knows how to identify them, since many assume the pixelation that often shows up in manipulated videos is a result of low internet connectivity.
Singh agrees that by turning deepfakes into humorous experiments, the app risks normalising a dangerous culture that is notorious for peddling fake news and online bullying. And though the app doesn't let users upload their own videos or do full-body swaps, the technology for both already exists and could be exploited by other apps looking to replicate Reface's commercial success. It's also worth remembering that the very roots of deepfakes lie in exactly this kind of abuse: fictional footage of women, diligently cranked out through a machine-learning meat grinder and posted to public porn forums.
Reface has previously been pulled up over privacy concerns in its terms of service, including a clause that gives the company the right to “create derivative works from, exhibit, broadcast, publicly perform, and publicly display” your content, and another that declares, “we may generate revenues, increase goodwill or otherwise increase our value from your use of the Application, including, but not limited to, through the sale of advertising, sponsorships, promotions, usage data.”
However, its makers have maintained that the app does not store your image for more than 24 hours, and does not derive biometric data from your face.
“We transform selfies the user uploads into face embeddings—an anonymized set of numbers, which describe specific features of the face and are enough for a face swap,” Reface told VICE in a statement. When we raised our concerns about the app normalising deepfakes, their response felt generic and evasive. “The tech will also open up opportunities for richly personalized content communities to grow up around stars and influencers — extending how their fans can interact with them,” is all they told us.
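What does "an anonymized set of numbers" mean in practice? The toy sketch below illustrates the general idea of an embedding. To be clear, this is an assumption-laden stand-in, not Reface's actual pipeline: real face-swap systems use a trained deep-network encoder, whereas here a fixed random projection plays that role. The point it demonstrates is that an embedding is just a short vector of floats, enough to compare two faces, but carrying far less information than the original pixels.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a learned encoder: a fixed random projection from a
# flattened 64x64 grayscale "face" down to 128 numbers.
# (Hypothetical; a real encoder is a trained deep network.)
PROJECTION = rng.standard_normal((128, 64 * 64)) / np.sqrt(64 * 64)

def embed(face):
    """Reduce a face image to a unit-length 128-number embedding."""
    v = PROJECTION @ face.ravel()
    return v / np.linalg.norm(v)

def cosine(u, v):
    """Similarity between two embeddings: 1.0 means identical."""
    return float(u @ v)

# Two random "faces" (placeholders for real photos).
face_a = rng.random((64, 64))
face_b = rng.random((64, 64))

e_a, e_b = embed(face_a), embed(face_b)

# 4,096 pixels in, 128 floats out: the embedding can be compared
# against other faces, but the pixels cannot be recovered from it.
print(len(e_a), cosine(e_a, e_a), cosine(e_a, e_b))
```

The same compression is why a company can claim an embedding is "anonymized": on its own, the vector is not a picture of you. The privacy question is what else it can be matched against.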
However, as Singh points out, the lack of laws regarding facial recognition in countries like India creates legal obstacles that may make redressal too little, too late. “By the time you file a police complaint and they investigate it and take down any image or video using your facial features, the damage this content can cause is already done,” she explains to VICE.
Let’s face it: apps like Reface, which lure you in with their simple technology and capture attention by integrating your likeness into trendy video formats, may not set out with the intention to cause harm. But as we evolve into an increasingly digital existence, with everything from our workouts to our self-isolation outfits splashed across various social media for all the world to see, it’s important to remember that the lines between what is real and what can be made to seem real are just one app download away.
Follow Shamani on Instagram.