One of the great treats at this year's Moogfest was the array of installations at the festival's Broadway Outdoor area. Of note were the Tachyons+ video synthesizer; collctv's BamBam, a light-based synthesizer; and, of course, ChipCodes, two custom-built chiptune interfaces that let users manipulate three vintage television sets. Another inspired creation was the Sand Noise Device, an augmented-reality generative synthesizer designed by four students from Cal State East Bay's Multimedia Department.
Created by Devin Dominguez, James Saxon, Jay Van Dyke, and Matt Roads, the Sand Noise Device (SND) is described as a “generative musical system”: users move tangible, self-illuminating objects around a sandbox while the system projects virtual objects onto the sand. An array of sensors detects the objects' positions and generates sound based on their movements. As Van Dyke told me, SND's modular system runs on software custom built with open-source tools like Pure Data and openFrameworks. The resulting music sounds like a combination of Eno-esque and Indian-influenced droning with evolving electronic textures.
The device's creators took some time post-Moogfest to talk about the genesis of SND, and what they hope to do in the future in the realm of virtual, interactive installations:
The Creators Project: How was the Sand Noise Device initially conceived?
Devin Dominguez: We are all graduate students in the Multimedia Department at Cal State East Bay, where working on projects as a group is one of the program's main focuses. After we decided we wanted to work together on our year-long thesis, we all went down to a local bar in Hayward called the Bistro to brainstorm and figure out what everyone wanted out of the project. Three of the four of us are musicians, so we decided it needed to be musical.
James Saxon: Being the non-musician in the group, I pushed for accessibility for those less musically inclined. We decided we wanted something fun that would give people an interactive musical experience.
Jay Van Dyke: We also decided we wanted it to be a physical installation and not just an app or a piece of software. James is great with his hands and is the go-to guy if you need something custom built, so that seemed like a logical direction.
Matt Roads: Since we had just finished an entire year of creating interactive projects at CSUEB, we figured we’d combine a musical experience that was easily accessible and interactive.
How exactly does it work? And, can you describe the original hardware and software that allows it to create generative music?
Roads: It’s a generative musical system in which virtual objects create sound based on their movement and location within a table filled with sand. An array of sensors detects the location of lit tangible objects and the topography of the sand. These parameters affect the movement of the virtual objects according to an approximation of real-world physics. The virtual objects are projected onto the sand and their parameters are sonified.
Van Dyke: The hardware is an array of sensors, a computer, and a projector; our software is custom built using open-source tools like openFrameworks and Pure Data.
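To make the loop Roads describes more concrete (sensed token positions steering virtual objects through approximated physics), here is a minimal sketch. Everything in it is an assumption for illustration: the class, the soft inverse-square attraction, and all parameter values are mine, not drawn from the actual SND codebase.

```python
import math

class VirtualObject:
    """A projected object whose motion is driven by sensed tokens."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = self.vy = 0.0

    def update(self, tokens, dt=0.05, drag=0.98):
        # Approximate real-world physics: every lit token in the sand
        # attracts the virtual object, and drag keeps velocities bounded.
        for tx, ty in tokens:
            dx, dy = tx - self.x, ty - self.y
            dist = math.hypot(dx, dy) or 1e-6
            force = 1.0 / (dist * dist + 1.0)   # softened inverse-square pull
            self.vx += force * (dx / dist) * dt
            self.vy += force * (dy / dist) * dt
        self.vx *= drag
        self.vy *= drag
        self.x += self.vx * dt
        self.y += self.vy * dt

# Simulated frames: a single glowing token at (0.8, 0.5)
# gradually pulls the virtual object toward it.
obj = VirtualObject(0.5, 0.5)
for _ in range(200):
    obj.update(tokens=[(0.8, 0.5)])
```

In the real installation the sensed sand topography would also feed into `update`, and each frame's motion parameters would be handed to the Pure Data patch for sonification.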
What's really inspiring about generative electronic music for you guys?
Dominguez: I'm interested in the immediate and visceral parts of music much more than any sort of formal or structural elements. Composing generative music allows me to really focus on how the various sound elements interact in a somewhat static setting. Dealing with musical indeterminacy really forces me to understand the particular function of each individual sound I am working with. I essentially get to focus on making sure a small moment of music works well with all sorts of sonic variations.
Roads: Generative music is amazing to me in that a few small rules can make a piece of music interesting for long periods of time, especially if change is easily discernible by the listener.
Van Dyke: It’s interesting to me that it can be different every time but generated in the same way.
Did you take your cues from any other projects past or present?
Saxon: Yeah, we were influenced by projects like the Reactable, AntiClockwise’s Magic Sandbox, and video games like Rez and Electroplankton.
Dominguez: Some of our musical influences include composers like Brian Eno, La Monte Young, Morton Subotnick and Charlemagne Palestine. We also derived inspiration from North Indian classical music.
How exactly does the device create an “augmented reality”?
Roads: We tried to make the intersection between the projected visuals, the audio, and the sandbox itself as seamless as possible. Essentially, we wanted the projections of the virtual objects to seem as though they were physically in the sand. Those seemingly physical properties of the virtual objects are what drive the sonification: properties like speed and location are tied directly to sound parameters like panning and pitch.
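The mapping Roads describes could be sketched as a small function. The specific ranges, the pentatonic quantization, and the speed-to-loudness rule below are illustrative guesses, not SND's actual mapping:

```python
def sonify(x, y, speed, table_width=1.0, table_depth=1.0):
    """Illustrative mapping: x -> stereo pan, y -> pitch, speed -> loudness."""
    pan = (x / table_width) * 2.0 - 1.0          # left/right position
    # Quantize depth to a five-note scale so drifting objects stay consonant.
    scale = [0, 2, 4, 7, 9]                      # major pentatonic intervals
    step = int((y / table_depth) * len(scale)) % len(scale)
    pitch = 48 + scale[step]                     # MIDI-style note number
    amplitude = min(1.0, speed)                  # faster objects sound louder
    return pan, pitch, amplitude

print(sonify(0.5, 0.0, 0.2))  # → (0.0, 48, 0.2)
```

Quantizing the continuous position onto a fixed scale is one way a generative system can keep indeterminate input musically coherent, which fits the drone-like character the creators describe.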
What is your hope with the project going into the future?
Roads: I’d love to see the SND break into public spaces—somewhere where people can interact with it. Beyond the SND, I’d love to continue making interactive projects with this group, as we’ve got a pretty good workflow.
Van Dyke: My hope is that we continue to work as a team and come up with new projects. Our working relationship has been pretty great so far and we all have complementary talents.
Saxon: Since we made the project modular, I would really like to explore some of the possible variations that the system offers. We have lots of ideas for different sonic, visual, and interactive works based on the same framework. It was really wonderful installing this project for the public at Moogfest, and I look forward to the possibility of sharing it with more people in more places.
Visit the Sand Noise Device website to learn more about how you can dig your fingers into this ingenious invention.