Artificial Intelligence Turned Bob Ross into a Terrifying Psychedelic Nightmare

Will AI one day do digital drugs and have deep dreams of 'The Joy of Painting' like we do?
Images courtesy of the artist, unless otherwise noted.

These days, the late Bob Ross seems closer to us than ever. The Joy of Painting episodes are streaming on YouTube, Netflix, and even Twitch, and he has unknowingly launched an ASMR role-playing fantasy subculture. Another kind of Ross revival comes in the form of a deep dream video from designer, artist, and roboticist Alexander Reben, Deeply Artificial Trees. This time, the results aren't just visual, though the animal-based neural network aesthetics are still here and more phantasmagorical than ever. Reben adds to the overall trippiness by processing Bob Ross's voice as if an AI were resurrecting him in a cyberdelic drug trip.

Reben tells Creators that he sees something strangely human and unexpected occurring in his deep learning experiments. After training models to reproduce the "soul" of a person through their voice, he realized that something very nearly magical was happening: a brief glimpse into the inevitable merger of humanity and technology.

"This artwork represents what it would be like for an AI to watch Bob Ross on LSD (once someone invents digital drugs)," says Reben. "It shows some of the unreasonable effectiveness and strange inner workings of deep learning systems. The unique characteristics of the human voice is learned and generated as well as hallucinations of a system trying to find images which are not there."

Reben created Deeply Artificial Trees using a few different deep learning techniques, which he says gave him insight into what these algorithms are "thinking." He processed each video frame through either a VGG model in Keras or Google's Deep Dream models in TensorFlow. The audio, meanwhile, was synthesized from a WaveNet model of Ross's voice and combined with portions of reversed audio.
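For readers curious what that frame-by-frame pass looks like in practice, here is a minimal Python sketch of deep-dream-style gradient ascent through a pretrained VGG16 in tf.keras. It is an illustration of the general technique, not Reben's actual pipeline; the layer choice, step count, and step size are assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import vgg16

# Pretrained VGG16 feature extractor; include_top=False accepts arbitrary frame sizes.
base = vgg16.VGG16(weights="imagenet", include_top=False)
# Amplify the activations of one mid-level convolutional layer (assumed choice).
feature_model = tf.keras.Model(inputs=base.input,
                               outputs=base.get_layer("block4_conv3").output)

def dream_frame(frame, steps=20, step_size=1.5):
    """Run gradient ascent so the frame increasingly excites the chosen layer."""
    img = tf.convert_to_tensor(frame[np.newaxis, ...], dtype=tf.float32)
    for _ in range(steps):
        with tf.GradientTape() as tape:
            tape.watch(img)
            activations = feature_model(vgg16.preprocess_input(img))
            loss = tf.reduce_mean(activations)        # how strongly the layer "sees" its patterns
        grads = tape.gradient(loss, img)
        grads /= tf.math.reduce_std(grads) + 1e-8     # normalize gradient scale
        img = tf.clip_by_value(img + grads * step_size, 0.0, 255.0)
    return tf.cast(img[0], tf.uint8).numpy()

# Usage: decode the episode into RGB frames (H x W x 3, values 0-255),
# run each through dream_frame, then re-encode the results into video.
```

Pushing pixels in the direction that makes a convolutional layer fire harder is what produces the dog-faced, eye-studded hallucinations the article describes: the network amplifies whatever it half-recognizes in Ross's clouds and trees.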

As Reben points out, Deeply Artificial Trees builds on his recent video Deep Tongues, which features a deep learning-generated voice averaged from 100 speakers that Reben embedded in a sculpture with an anatomically correct mouth. The video also grew out of his work on another related piece, Artificially intelligent impressions, in which he tasked a machine learning system with figuring out what makes a person's voice unique.

"After seeing all the responses online, it is interesting to hear that people feel like watching the video is somehow changing the way in which they are thinking," says Reben. "It in itself is acting a bit like a digital drug: people's brains seem to have a hard time trying to make sense of what they see."

Reben is certainly correct in that conclusion. While these systems are "thinking" in a loose sense, AI is still a long way from being sentient, let alone from doing cyberdelic digital drugs. So, at least for human viewers, watching Deeply Artificial Trees is very much like a psychedelic trip: absolutely fascinating, often beautiful, and intermittently nightmarish.

Alexander Reben's Deeply Artificial Trees is screening during the opening weekend of an exhibition at Nicodim Gallery in Los Angeles.
