San Francisco-based artist and designer Jono Brandel has long been fascinated by synesthesia. He admits he's been chasing the concept for years, first as a VJ and now as a creative coder on Google's Data Arts Team, where he's worked on projects like This Exquisite Forest and ROME: Three Dreams of Black. His latest personal project may be his best synesthetic approximation yet: Neuronal Synchrony, a web application that generates animation in real time.
If the name makes the project sound heady, that's because it is. Named after the physical phenomenon in which "neurons across widely separated areas of the brain spontaneously and instantaneously communicate with one another," neuronal synchrony is believed by some theorists to be the very basis of human consciousness. In the case of Brandel's sound visualizer, the browser-based animations are synced to music and controlled via a Monome 128, a MIDI controller typically used for audio sequencing.
We spoke with Brandel over email to find out a bit more about the tool, the inspiration behind it, and how he's been using it for live performance and installation work.
The Creators Project: This is the latest in a number of audiovisual experiments you've produced over the years. How did you become interested in this kind of work?
When I was living in Los Angeles I was part of a VJ collective, Collabo. We got our start learning from Dutch duo Machine. They taught us how to choose and create small video clips, and, using an analog video mixer, showed us how to play those clips live alongside the musician. Since then, I have made a number of music visualizers that are bound to audio input. In other words, the visuals are automatic, similar to the popular iTunes visualizer Magnetosphere by The Barbarian Group. You don’t have to do anything.
Can you tell us a bit about how you've been using Neuronal Synchrony for live performance?
I’ve been performing with it in a number of ways. The most documented use was to record videos of animations to accompany tracks — for instance, the video for Lusine (above) is a single take of me playing the software. I’ve also used it to accompany live music at shows. Lastly, I set it up as an installation at GAFFTA’s most recent event. They had a number of DJs playing through the night and a spot where you could come up and play Neuronal Synchrony. The input from the Monome would show up above the DJs on a large projection. This was by far my favorite because it democratized the performance.
Why did you decide to use a Monome as the user interface for this project?
Agency is an important concept to me. For this project, it seemed important to have a human input rather than automatically playing animations. This isn’t an iTunes visualization. There are unexpected moments and it can’t do anything without human input. Because I was commissioned by GAFFTA to make visuals for music events, the Monome came as a natural extension of the musician's tools. It is typically used to create sound, rather than image.
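To illustrate the idea of a grid controller driving visuals rather than sound, here is a minimal sketch. The layout and names (`animationIndex`, one animation slot per button) are my own assumptions for illustration, not Neuronal Synchrony's actual mapping:

```typescript
// Hypothetical mapping from a Monome 128 button press to an
// animation slot. The Monome 128 is a 16x8 grid; assigning one
// animation per button is an illustrative guess, not Brandel's scheme.
const COLS = 16;
const ROWS = 8;

function animationIndex(x: number, y: number): number {
  if (x < 0 || x >= COLS || y < 0 || y >= ROWS) {
    throw new RangeError(`button (${x}, ${y}) is outside the grid`);
  }
  // Row-major order: 128 buttons map to indices 0..127.
  return y * COLS + x;
}
```

Under this scheme, nothing plays until a button is pressed — the visuals depend entirely on human input.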
What's the benefit of running these real-time generative graphics in the browser? Or is that just a neat trick you wanted to explore?
It definitely informs the aesthetic. What’s really nice about real-time graphics is that I can make a lot more decisions and calculations at runtime. This allows for varying size, color, orientation, speed, any number of parameters to be changed on the fly. This type of control isn’t really possible with canned animations. That they’re in the browser adds versatility. I can have a number of browser windows open and change configurations easily. It also makes installation smoother because most computers have modern browsers.
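As a sketch of what "decisions at runtime" can mean, the parameters of a generative frame can be recomputed on every tick instead of being baked into a pre-rendered clip. The names here (`AnimationParams`, `paramsAtTime`) and the specific formulas are my own illustrative assumptions, not Brandel's code:

```typescript
// Hypothetical per-frame parameters for a generative animation.
interface AnimationParams {
  size: number;     // pixels
  hue: number;      // degrees, 0-360
  rotation: number; // radians
}

// Recompute parameters from the current time; changing `speed`
// mid-performance instantly changes the animation — something a
// canned video clip cannot do.
function paramsAtTime(t: number, speed: number): AnimationParams {
  return {
    size: 50 + 25 * Math.sin(t * speed), // pulse the size
    hue: (t * 40) % 360,                 // cycle through colors
    rotation: t * speed,                 // spin continuously
  };
}
```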
What informed your aesthetic choices for this work — the shapes, colors, and forms that are generated and how they correspond to sounds?
I’ve been inspired by a lot of artists before me who were interested in synesthesia. There are so many it’s difficult to list… from the stills of Kandinsky to the first moving images of Viking Eggeling to the more iconic Oskar Fischinger. So, the colors and movements are informed by these artists' works. The shapes are created by the mathematical functions sine and cosine. I chose this as the baseline criterion for each animation, because digital audio is described by sound waves.
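A minimal sketch of how sine and cosine can trace out a shape — the function name and parameters here are my own for illustration, not drawn from Neuronal Synchrony:

```typescript
// Sweep an angle through a full circle, sampling cosine for x and
// sine for y. Few samples give a polygon (triangle, square); many
// samples approach a circle. `sides` and `radius` are hypothetical
// parameters, not Brandel's.
function shapePoints(sides: number, radius: number): [number, number][] {
  const points: [number, number][] = [];
  for (let i = 0; i < sides; i++) {
    const theta = (i / sides) * 2 * Math.PI;
    points.push([radius * Math.cos(theta), radius * Math.sin(theta)]);
  }
  return points;
}

const triangle = shapePoints(3, 100); // three points on a circle of radius 100
```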