Photo by MART PRODUCTION from Pexels / Edited by Vice

A Popular Face Swap App Is Advertising Deepfakes on Porn Sites

A loophole in the Apple and Google app store policies allows a deepfake app to invite customers to make non-consensual porn, without getting banned.

On the Google Play and Apple App stores, FaceMagic is pitched as a fun and harmless face-swapping app. “Do you want to see your best friend, colleague, or boss dancing? Or do you want to morph your face into celebrities?” its description asks, alongside a video that shows how users can face swap anyone into a video of Robert Downey Jr.’s Iron Man or basketball player Steph Curry.

But on porn tube sites, ads for FaceMagic have a more nefarious pitch. “Make AI face porn in a sec,” the ad copy says. The ads show a porn video of a popular adult performer to demonstrate FaceMagic’s ability to seamlessly turn it into a deepfake, replacing the model’s face with another woman’s. “Make deepfake porn in a sec. Create now!”

FaceMagic’s duplicitous nature—a goofy app for your friends on the strictly controlled app stores and a non-consensual porn creation app on adult sites—confirms what experts have warned is dangerous about face-swapping apps since deepfaking tools were first normalized and repackaged for broad audiences: Even if the apps don’t say they’re designed to create non-consensual porn, they’re still easily capable of producing it. FaceMagic exploits a loophole in Apple’s and Google’s app store policies: its app store pages and official site break none of their rules, while its ads on porn sites tout its ability to create content that would.

“It exploded on the adult pornography scene, more so than anywhere else”

Adam Dodge, founder of online harassment victim service EndTAB, said that advertising sexually explicit deepfake capabilities on widely popular porn sites “absolutely” escalates the potential for harm. “What has me worried is these ads are chipping away at the remaining barriers we rely on to keep non-consensual deepfake porn from becoming mainstream,” he said. “Targeted marketing like this simultaneously removes the need for users to spend time searching for an app or develop the expertise to create a deepfake.”

FaceMagic isn’t some secret underground software: it has more than a million downloads in the Google Play and Apple app stores, and tens of thousands of reviews. The app descriptions make no mention of sex or porn, but on Google Play, it’s listed with a “sexual content” rating. One reviewer notes that the developers have “a loophole for doing porn and advantage of being the few that are capable of doing it and make hell of a profit out of it. Smart.”

Motherboard found FaceMagic’s deepfake porn ads on at least four porn sites, including Spankbang, ImageFap, and Iporntoo. The ads we saw used porn videos that are freely available on porn sites.

Motherboard tested FaceMagic and confirmed that, for a price, it can produce a deepfake porn video within seconds. Users have to pay a $10 monthly fee, which gets them credits that can be redeemed to produce deepfake videos. After paying, all a user has to do is upload an image of the person they want to put into a porn video and the porn video itself. When uploading media, FaceMagic includes a line at the bottom of a page about how to get a high-quality result: “Do not use face-swap creations for illegal purposes.” Making or disseminating malicious deepfakes is illegal in a small number of states.

FaceMagic doesn’t just easily create deepfake videos, it does so in a way that was considered groundbreaking until recently. When deepfakes first emerged in 2017, they required hundreds of images of the face that was being put into a video, often in the form of still images taken from a short video. In 2019, Samsung developed a method for creating a deepfake using only one source image of a face. FaceMagic, similarly, only needs one source image to instantly produce a deepfake.

“The technology is absolutely getting better. It keeps getting refined, it keeps getting better,” Hany Farid, an image forensics expert and professor at the University of California, Berkeley, told Motherboard. “They're higher resolution, fewer glitches. And it exploded on the adult pornography scene, more so than anywhere else.” Overwhelmingly—despite the popularity of late-night talk show comedy face swaps—the tech is used for non-consensual, fake pornography. Until now, however, we hadn’t seen it advertised so blatantly on porn sites in a way that would perpetuate this type of harm.

There’s no perfect way to prevent people from using deepfake technology to create non-consensual pornography, but there are guardrails that deepfake apps can put in place, and that Apple and Google could theoretically force apps to use to prevent them from doing what FaceMagic is suggesting users do in its porn ads. For example, when Motherboard tried to create deepfake porn using Reface, another popular deepfake app on Google Play and the Apple App Store, Reface wouldn’t process the video and presented us with an error message.

“Possible inappropriate content detected,” Reface’s message said. “Please be aware that we restrict inappropriate content to protect our community.”

As Motherboard has previously reported, automated tools for detecting banned pornographic content are far from perfect and can be circumvented by users, but Reface’s method completely prevented us from producing a pornographic deepfake within its app. FaceMagic does not appear to have any guardrails in place, and the app stores do not appear to require them.

When reached for comment, Apple said it does not have specific rules about deepfake apps, but that section 1.1 of the App Review Guidelines prohibits “Defamatory, discriminatory, or mean-spirited content … particularly if the app is likely to humiliate, intimidate, or harm a targeted individual or group.” However, FaceMagic is still available on the app store and Apple declined to say whether it took any action against the app.

Google’s Play store similarly doesn’t allow sexual content, and also has tons of deepfake apps available. Sexually explicit advertising that directs people to the Play store listing is forbidden. When Motherboard showed Google that a sexually explicit ad was directing people who clicked on it to FaceMagic’s Google Play page, a Google spokesperson said “Our developer policies prohibit apps from using sexually explicit ads to direct users to their store listing. When violations are found, we take appropriate action.” FaceMagic is still available on the Google Play store and Google did not tell Motherboard if it took any action against it. (In the course of reporting this story, FaceMagic’s developer changed its name on the Play store from DeepArt Limited to Insight Technology LTD.)

FaceMagic did not reply to multiple requests for comment, but responded to one of our emails to explain the app’s billing cycle (Motherboard did not ask about FaceMagic’s billing cycle).

Motherboard found that the same developer behind FaceMagic also seemingly operates a website that offers the same deepfaking service, also for a $10 monthly fee. The site has a different name and features no clear connection between itself and FaceMagic, but credit card charges for both services went to FaceMagic’s developer, DeepArt Limited. Motherboard also found ads for that website (which we are not naming, to limit its reach) that were identical to the FaceMagic porn ads, except that they ended with a different logo and sent us to that site when we clicked on them. This website also appeared as an ad in Google search results when searching for “deepfakes” and related terms.

The site also doesn’t explicitly promote using the service to create deepfake porn, but its front page features a demo that allows visitors to face-swap a famous model’s face with the face of a famous porn performer. 

The website also has a blog with dozens of bizarre posts, some in broken English, about deepfakes. One post is an overview of the ethical and legal issues with deepfake pornography, while another is just a copy-and-pasted review of the website’s service posted to one of the largest deepfake porn communities on the internet, including links to images of non-consensual porn. One post warns about the negative aspects of deepfake technology, but then devolves into incoherent, seemingly automatically generated copy.

The site also links to a Discord community where the channel’s apparent admin has posted deepfake porn, but which is mostly used by people desperately trying to cancel their subscriptions. Users log in and personally ask the admin to cancel their subscriptions, or instruct each other to ask their credit card companies to block charges from DeepArt Limited. Sometimes the admin chimes in hours or days later to tell users their subscriptions have been canceled as requested. Motherboard can confirm that there’s no easy way to cancel a subscription to the website’s service.

After Motherboard reached out to Discord for comment, the platform removed the server for violating its community guidelines, which prohibit the sharing of “sexually explicit content of other people without their consent.” Discord has removed deepfake-dedicated servers in the past. “Discord’s Community Guidelines expressly prohibit the promotion or sharing of non-consensual intimate materials, and we continuously invest in proactive measures to keep this content off our service,” a spokesperson told Motherboard in a statement. “Our Safety Team takes immediate action when we become aware of this content, including banning users, shutting down servers, and when appropriate, engaging with the proper authorities.”

Because of the way targeted advertising works online, serving ads to someone who exhibited interest in deepfake porn or clicked on a deepfake porn ad can create a feedback loop of getting more ads for deepfake porn. 

“As soon as content targeting covers the user potentially interested in the topic, we immediately receive ads for more face-swap apps,” internet law scholar Karolina Mania told Motherboard. “The availability of these applications and the context in which they are advertised evoke a clear message. This message ignores the abuse of the law and calls for actions that violate someone's image.” 

Motherboard contacted each of the porn sites where we spotted the FaceMagic ads. After Motherboard found deepfakes spreading on porn tube sites in recent years, many sites added rules against deepfakes to their terms of use. 

“Deepfake porn is decidedly a violence against women issue, so what we're really seeing is a violence against women and girls ad campaign”

A spokesperson for SpankBang said deepfakes are strictly forbidden on the platform. “Moreover, almost all ad campaigns on SpankBang are managed by third-party advertising networks. This includes the ads that you have shared here. Both of these apps use external networks to display their campaigns on SpankBang,” they said. “Our Trust and Safety team has been alerted of this violation already. They have reached out to the networks to track down and disable these campaigns.”

Another site’s spokesperson said that because it hosts animated content only, and doesn’t host any real-life videos, there are no deepfakes allowed by default. “Our ads are run through various ad networks, it is impossible for us to screen the ads as there are 1000s of advertisers. They can target different countries, time, devices etc,” the spokesperson said. “We expect our ad networks to screen the ads and prevent bad actors from advertising.” Ad networks are middlemen between publishers and advertisers, and website administrators often don’t specifically select which ads run on their sites.

We reached out to two ad networks that served the deepfake porn ads, Clickadilla and Adspyglass, and did not receive responses. 

“Advertising the service in this light should be a disqualification for every party thinking of working with”

The ads themselves are examples of non-consensual deepfakes, using clips from content owned by adult studios without the permission of the people featured or the content owners. Non-consensual deepfakes aren’t just harmful to the people whose faces appear in them, but also to the porn performers whose work and images of their bodies are being stolen and used without consent. Two of the performers Motherboard contacted for comment said they’d never given the app developers permission to use their images.

“I personally had no idea they were using this scene in an advert,” Ella Hughes, whose scene with the studio Blacked is featured in one of the ads, told Motherboard. “We won’t get paid anything extra for this deepfake creator, [and] I don’t agree with the whole concept of face swapping on porn videos at the risk of it being used for revenge porn,” she said. Hughes noted that the rights to the images, however, belong to Vixen Group, which owns Blacked. Vixen did not respond to a request for comment. 

It’s not just faking porn, either, but faking credibility across the tech industry. On its website, the app developers list several “industry solution” partners, projects they’re collaborating with or were hired to complete. When reached for comment, one of these, Dutch creative entrepreneur Tijmen Mulder, told Motherboard that they’d never worked with this company before. 

“We’ve had contact about a project that never came into fruition. We’re shocked that they use our name on their website and use sensitive information about a potential project as a quote on the website,” Mulder said. “Also, we think AI fake porn is a despicable practice and is denigrating to the person depicted. Advertising the service in this light should be a disqualification for every party thinking of working with”

“An anonymous growing army of non-consensual deepfake porn creators who don't think they're doing anything wrong keeps me up at night. Deepfake porn is decidedly a violence against women issue, so what we're really seeing is a violence against women and girls ad campaign,” Dodge said. “I can only imagine how disempowering and insulting that must be for women everywhere. It makes my blood boil and more men need to give a damn about this and speak out because deepfake porn isn't a women's issue, it affects us all. Not to mention these ads run the risk of re-traumatizing victims of sexual violence.”

Farid said he’s far from optimistic, at this point, that legal repercussions and state legislation against non-consensual deepfakes will ever change how the tech is used. His favorite response to the threat comes from actress Scarlett Johansson, who said of the tech in 2018, “I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself.” He’s more concerned with the people—academics, specifically—who make the technology that enables deepfakes without considering the repercussions of what they’re doing.

“I think there should be pressure on the Apples and Googles, and I think there should be regulatory pressure,” Farid said. “And I think there should be pressure on academics like me to say, ‘Look, maybe we shouldn't just ask how we do something, maybe we should ask if we should do something.’ ... I don't think you can say that technology is inherently benign. It's being weaponized in exactly the way you think it's going to be weaponized: against women and vulnerable populations.”