Just when someone might assume the Google Deep Dream neural network is a bit of a novelty, programmer and artist Johan Nordberg ups the ante by giving Deep Dream visuals motion, sending the viewer into a psychedelic, fractalised infinity. Now, the team behind the iOS and Android app Pikazo are bringing neural network imaging to mobile devices, allowing people to apply iconic artists’ styles, as well as customised ones, to images.
Pikazo users simply upload an image, apply the style, then wait as the neural network works its artistic magic. The results vary from unexpectedly experimental (see the work of Weta Digital’s Julian Butler) to more garden-variety digital artwork. It takes several minutes for the AI to finish, but Pikazo CEO Noah Rosenberg tells The Creators Project that they are working to get the imaging process under 30 seconds, a feat for a neural network.
“What’s interesting is, neither group—the trained professional image makers or the absolute amateurs—are really any better or worse at this,” Rosenberg says. “They’re all making incredible images. In some cases it comes down to not taste or training but who has the best Pinterest skills for finding styles.”
Pikazo works by studying the subject image for content (eyes, nose, a tree). Next, it studies the style image for details: individual brush strokes if it’s a painting, individual tile pieces if it’s a mosaic, and so on. From these details it builds a vocabulary.
“That’s what’s called ‘training the network,’ so in that way it ‘retells the story’ of the subject image using the vocabulary of the style,” Rosenberg says. “I think of it like the pins in a Plinko machine. The subject image pixels go through this maze and bounce around and hopefully fall into buckets that match the style image details.”
Rosenberg says that Pikazo’s neural network searches for the best-fit places where a certain brushstroke might go. It carries out this function hundreds of times, and it looks not simply at each pixel but at the relationships between them.
“The end result is that the Pikazo image ‘forms’ the way you’ve probably seen ice crystals form,” Rosenberg explains. “First from a seed, and then over more iterations blooms and grows into place as the system checks for better and better ‘fit.’”
That, of course, is the non-software explanation of what Pikazo does. Rosenberg says that Deep Dream is a tiny part of a “huge stack of convolutional neural network software.” Pikazo uses the neural network stack upon which Deep Dream is built—a project called Torch.
“But the tip of the iceberg that is specific to Deep Dream are a series of trained layers called ‘inception’ that we are not using,” says Rosenberg. “Deep Dream images have eyeballs and dog faces in them because they trained their image layers (the Plinko) using thousands of images of dogs and eyeballs. Ours are trained to look for artwork.”
While it’s definitely interesting to see iconic artists’ styles grafted onto user portraits, landscapes and other images, Pikazo’s neural network is certainly ripe for experimentation with the custom function.
For instance, mixing a selfie taken on a beach with the Inherent Vice film poster created pretty psychedelic results. Another selfie run through a Japanese edition of William Gibson’s Burning Chrome—a book cover that was paper-collaged to look like digital glitch—produced a nice cyberpunk blend of analog and neural network art. Blending photographs with other photographs also yields some visually striking results. That type of neural network experimentation is where Pikazo’s real power might lie.
Click here to see a gallery of users’ work at Pikazo Salon.