The AI image-generating craze has entered its next phase of absurdity: creating fake profile pics that make you look good on dating apps and social media.
For $19, a service called PhotoAI will use 12-20 of your mediocre, poorly-lit selfies to generate a batch of fake photos specially tailored to the style or platform of your choosing. The results speak to an AI trend that seems to regularly jump the shark: A “LinkedIn” package will generate photos of you wearing a suit or business attire, while the “Tinder” setting promises to make you “the best you’ve ever looked”—which apparently means making you into an algorithmically beefed-up dudebro with sunglasses.
There are also options that generate artsy Polaroids, photoshop you into memes, or make hyper-stylized portraits that copy the aesthetics of popular artists. After submitting your photos, the site promises to return results in 12 hours. The AI model used to generate your photos is also deleted after seven days, according to the site’s privacy policy.
Motherboard tested PhotoAI’s service by uploading 12 photographs of Motherboard executive editor Emanuel Maiberg. We picked the “Tinder” package and, less than four hours later, received a link to a gallery of 78 images, some of which would work well for both Tinder and LinkedIn profiles. Like many AI-generated images, some are impressive at first sight, but don’t hold up to the kind of scrutiny any reasonable person would give a Tinder profile picture. Others are alternately hilarious and horrifying.
In these images, Emanuel really does look like he’s got some great productivity hacks to post to his LinkedIn profile, but if you look closely you’ll see his mouth isn’t quite right (every image has some small flaws like this):
In these images, seemingly for the Tinder package, PhotoAI gave Emanuel a leather jacket, sunglasses, and maybe hints of a smoldering facial expression. Again, none of them is entirely believable if you look for longer than a second.
According to creator Sebastien Lhomme, PhotoAI works by generating a “fine tune” model from the photos a user submits. The results are then filtered through a second, smaller model that applies the chosen style, and finally fed into Stable Diffusion, which is publicly available. In other words, you’re not paying for fancy proprietary AI technology, but for a service that simply feeds your photos into a pre-existing AI image generator. Similar services have cropped up in recent months that, for a fee, will use AI to generate text prompts… which can then, of course, be used to generate photos with AI.
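Lhomme’s description amounts to a simple three-stage pipeline. A minimal sketch of that flow, with every function and name invented for illustration (this is not PhotoAI’s actual code), might look like this:

```python
# Hypothetical sketch of the pipeline Lhomme describes:
# user selfies -> fine-tuned subject model -> style model -> Stable Diffusion.
# All names here are invented; PhotoAI's real implementation is not public.

def fine_tune_subject_model(user_photos):
    """Stage 1: learn a likeness of the user from their ~12-20 selfies."""
    return {"subject": "user_likeness", "trained_on": len(user_photos)}

def apply_style(subject_model, package):
    """Stage 2: a small second model maps the chosen package to style settings."""
    styles = {"Tinder": "leather jacket, sunglasses",
              "LinkedIn": "business suit, office background"}
    return {**subject_model, "style": styles.get(package, "neutral portrait")}

def run_stable_diffusion(conditioned, n=3):
    """Stage 3: hand the conditioned inputs to an off-the-shelf image generator."""
    return [f"image_{i}: {conditioned['style']}" for i in range(n)]

photos = ["selfie.jpg"] * 12
gallery = run_stable_diffusion(apply_style(fine_tune_subject_model(photos), "Tinder"))
```

The point of the composition is the one the article makes: the last stage is a publicly available generator, and the paid service is essentially the plumbing in front of it.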
Absurdity aside, the rise of AI image generators has caused controversy in recent months. Some working artists have complained that tools like DALL-E and Midjourney have effectively duplicated their styles after using their artwork as training material without permission. So while generating AI self-portraits might be silly and fun, it’s still not clear where and when they are okay to use—legally or morally speaking.
It’s also not clear whether AI-generated selfies violate the terms of service on dating apps, which have rules against impersonation and misrepresentation. But at the moment, there doesn’t seem to be anything stopping someone from trying to catfish people on Tinder using flattering photos generated by an AI.
Lhomme claims that he won’t be held responsible for how people use the photos his service generates. He points out that even without the help of AI, people have been using Photoshop or hiring freelancers to modify their photos for a long time—although perhaps not with such immediate results.
“The tech is so new, the use cases it will solve over the next few months/years are inevitably going to lead to interesting questions about legality and morality,” Lhomme told Motherboard. “Things will be blurry for a while, and I think collaboration will need to happen between all the parties involved to decide what are the best rules and responsibilities for everyone to ensure the ethical usage of such technologies.”
PhotoAI’s terms of service state that users are “not allowed to upload photos of other people,” or “upload naked or pornographic photos.” In practice, however, there aren’t any bulletproof ways of enforcing these kinds of rules. For example, DALL-E creator OpenAI has built filters that automatically reject certain types of images. But dedicated users of Stable Diffusion have found ways to generate freaky porn and other NSFW art by simply hosting the AI model on their own servers, sans filters.
The large potential for lawsuits and copyright violations has made some platforms wary of AI-generated images. While some stock photo sites have explicitly banned the sale of photos created by tools like DALL-E, others have embraced them—Shutterstock recently announced it would partner with OpenAI to offer generated images on the site.
Nevertheless, Lhomme suggests that AI-generated photos will eventually become ubiquitous. After all, he notes, selfies are already digitally manipulated by the image-tuning software that runs on smartphone cameras—and most people are none the wiser.
“Of course, the tech isn’t quite there yet, but it will be soon,” said Lhomme. “And once the AI-generated photos are indistinguishable from ‘real’ photos, the question of whether they are real or not will become irrelevant.”