AI Art Is Powered by ISIS Executions and Non-Consensual Porn

To create powerful AI art generators, engineers are training programs to soak up as much data as possible without questioning where it comes from.
Hacking. Disinformation. Surveillance. CYBER is Motherboard's podcast and reporting on the dark underbelly of the internet.

AI art has gotten wildly popular over the past year. Programs like Midjourney and DALL-E are generating incredible images and incredible controversy. But these programs don't exist in a vacuum. AIs require billions of images to learn how and what to draw. Where are they getting those pictures? They're hoovering them up from the internet, a place full of child porn, ISIS execution videos, and non-consensual adult images. With AI, it's all garbage in, garbage out. So who controls this data, and is there anything we can do about it?

On this episode of Cyber, Motherboard writer Chloe Xiang walks us through the ins and outs of AI image generators trained on ISIS execution images.

Stories discussed in this episode:

ISIS Executions and Non-Consensual Porn Are Powering AI Art

Amazon Driver Fired for Posting Photo of Customer’s Dildo to Reddit

'The Silence of My Critics Speaks for Itself:' Hans Niemann Says He Is Being Unfairly Attacked in Chess Scandal

We’re recording CYBER live on Twitch during the week. Follow us there to get alerts when we go live. We take questions from the audience, and yours might just end up on the show.

Subscribe to CYBER on Apple Podcasts or wherever you listen to your podcasts.

Sign up for Motherboard’s daily newsletter for a regular dose of our original reporting, plus behind-the-scenes content about our biggest stories.