
Think You Can Tell the Difference Between Human and AI? Take This Test

You probably can’t tell the difference between AI-generated and human-made art
People who don't exist
Image: Nvidia

What makes a painting uniquely human? Is music generated by artificial intelligence really that different from what a human makes? Can you tell when a machine learning program has created the image you’re looking at? 

If you think you can tell the difference between human- and AI-made work, take this online quiz. The survey is part of a report about AI-generated content. According to its creator, a Polish journalist and writer named Kazimierz Rajnerowicz, people haven’t been faring well.


“I'm crazy about chatbots, deepfakes, and AI-created art and music,” Rajnerowicz told Motherboard in an email. He created the survey as part of a project for the blog of Tidio, a company that makes customer service chatbots. “This project has gone a little off the rails.”

“Can't people really tell the difference between AI-created images and real photos and images?” Rajnerowicz said on Reddit. “I already got more than 400 responses...but I am surprised that the results are so poor. Do people really have trouble distinguishing between a DeepDreamGenerator photo and a painting? When I prepared the examples they seemed obvious to me. There is a clear hint in almost every one of them, but so far the best score is 13/21.”

The survey takes the respondent through examples of paintings, music, text, memes, and photographs. Some were made by humans; others were generated by AI programs like the generative adversarial networks (GANs) that produce pictures of people who don’t exist. The survey asks you to figure out which are human-made and which are AI-made, then justify your choice.


When you’re looking at an AI-generated image within the context of an article explaining how it was created, it’s easier to spot the tell-tale signs of the computer’s hand. But the survey’s images are decontextualized. People have to make judgment calls based on the images alone, and that can be more difficult than you think.

It’s kind of shocking (and fun) to find out how easily a robot can trick us into thinking a human created a piece of art or a bit of music. But the survey also probably isn't rigorous enough to teach us anything definitive about the dangers of AI-created media. This probably won’t be peer reviewed, no journal articles will be published about it, and it’s commercial work done for a chatbot site. 

When Rajnerowicz spoke with Motherboard, the pool of responses had grown from 400 to 1,200, but the scores hadn’t improved.

“Maybe it's a matter of me hanging out with IT guys, but I thought people were much better at judging whether something was artificial or not,” he said. “It seems to me that there is a big visual literacy problem emerging. Some people think that technology isn't very advanced yet and that it's easy to tell what's fake and what's real, and it turns out that the general public doesn't catch these subtle differences at all. Once people who are AI-literate realize this, it will be really easy to abuse this power.”

If you know what to look for, there are tells in art generated by AI. DeepDream-generated images have weird artifacting. GAN-created images have strange distortions, typically around a person’s neck and ears. AI-created music can have a strange atonality. Even knowing this, the survey is tough. At Motherboard, our best and brightest could do no better than 13/21, the same high score the survey’s creator had seen.