Watching a four-armed robot with soft mallets for hands bob its little metal head to a beat improvised from music it's hearing for the first time is enough to induce a fairly grand amount of existential panic. Will human creativity become obsolete in the machine-managed future rapidly enveloping us? Will you find yourself on the floor of a classical music conservatory, beaten to death by the plushy marimba sticks of a sentient robot? Gil Weinberg, leader of the Georgia Institute of Technology's Robotic Musicianship Group, has brighter ideas in mind for the technical advances of tomorrow. Weinberg was a musician first, playing throughout high school before going on to pursue a double major in music and computer science. Toward the end of his college years, keeping the two subjects apart began to seem strange to him, and that is where his fascination with robots and their potential as musicians took hold. His work started with musically reactive computer programming and developed into physical robots as his studies continued. One month after he defended his PhD, he was offered the gig at Georgia Tech, and he has been leading a research group on robotic musicianship there ever since.
Weinberg's hopes for robotic musicianship center on providing a new approach to how we think about music. “If a robot can actually improvise and do it in a way humans can’t, my hope is that this will push music into a completely new uncharted domain,” he explains. So, how are Weinberg and his compatriots teaching robots to listen like humans and play like machines? The answer lies in deep learning, a method of teaching machines by example, loosely modeled on what we know of how the human brain works. Once a robot can recognize musical patterns, Weinberg then plays with its ability to create new pieces of music based on what it has learned. “When we started to work with deep learning, what we had is just numbers and manipulation of numbers representing notes, which created some interesting results but wasn’t mind-blowing. You want the robot to understand music semantically, so understand what is a phrase, what is a segment, understand what is tension. So deep learning is one of the ways to step higher, and get high level features, that get the robots to think the same way we do,” says Weinberg.
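The learn-by-example loop Weinberg describes can be illustrated in miniature. His group uses deep learning; the sketch below substitutes a deliberately simpler technique, a first-order Markov chain over MIDI pitches, just to show the same two steps: learn transition patterns from an example melody, then improvise a new phrase by sampling them. All names and the training lick here are illustrative, not the group's actual code or data.

```python
import random

def learn_transitions(melody):
    """Learning by example: record which note tends to follow which."""
    table = {}
    for current, following in zip(melody, melody[1:]):
        table.setdefault(current, []).append(following)
    return table

def improvise(table, start, length, seed=0):
    """Generate a new phrase by repeatedly sampling learned transitions."""
    rng = random.Random(seed)
    phrase = [start]
    for _ in range(length - 1):
        options = table.get(phrase[-1])
        if not options:  # no learned continuation from this note
            break
        phrase.append(rng.choice(options))
    return phrase

# A short blues-scale lick as the "example" (MIDI note numbers).
lick = [60, 63, 65, 66, 65, 63, 60, 63, 60]
table = learn_transitions(lick)
print(improvise(table, start=60, length=8))
```

The generated phrase stays inside the vocabulary of the example it learned from, which is exactly the limitation Weinberg points at: this kind of note-level statistics yields "interesting results," but nothing that understands a phrase, a segment, or tension.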
The Robotic Musicianship Group has worked on a variety of projects since Weinberg's involvement began in 2003. Two of the most notable are Shimon, the aforementioned marimba-playing robot, which can perceive, interact, and improvise with human musicians, and the “Smart Arm,” a robotic prosthesis that gives amputee drummers the ability to play again: an arm with two drumsticks attached, one triggered by sensing movement in the performer’s existing muscles, the other able to improvise based on what’s being played and push the drummer to respond creatively. Both projects focus on creating opportunities for compelling and creative interaction between machines and humans.
Weinberg's goal with his work is to create robots that can incite the feelings all good art does. He hopes an encounter with one of his projects will send shivers down your spine, put you in a state of awe, and bring on either giggles or a sweeping wave of sadness. Our association with robots is typically one devoid of feeling: they seem cold and threatening, computational and unapproachable. We flock to art to provide us with an experience, a bit of humanity captured in a song or on canvas. So if art really is holding a mirror up to nature, then these strange metal beings, running on programming designed to imitate our madcap leaps in logic, might actually provide the truest picture. Going forward, Weinberg will continue to hold a mirror up to us and create robots in that image, one genetic algorithm at a time.
Click here to learn more about Georgia Institute of Technology’s Robotic Musicianship Group.