Without hands and other fine motor skills, it’s extremely difficult to play music, or even produce it on modern software and hardware. Most of us don’t have to worry about life without such motor skills, but for those living with disabilities, making music can be frustrating. This is why interaction designer and creative coder Andreas Refsgaard has created Eye Conductor, a music production app that uses facial tracking to translate the human eye’s gaze into musical notes or beats in a drum sequencer, while specific facial movements can trigger effects and octave transposition. With Eye Conductor, the Danish app maker hopes he can give people living with disabilities a new means of making music.
Refsgaard tells The Creators Project that the app’s origins go back to his high school years when he was a caretaker for a young boy who suffered from muscular dystrophy. As his caretaker, Refsgaard was essentially the boy’s extended body. So all the physical tasks the boy was unable to do, Refsgaard did for him.
“A lot of the practical stuff was really straightforward, like picking up stuff, helping him with his clothes, turning on switches, etc.,” Refsgaard says. “But helping him express himself creatively was really difficult: It doesn’t make sense to play an instrument or paint on somebody else's behalf. So I think that experience was very important, even though it is a while back.”
More recently, Refsgaard had been messing around with interactive sound installations, where different sensor inputs resulted in different sound outputs. Having this know-how gave him the confidence to make a tool that would allow people to express themselves by playing music.
Refsgaard was inspired by a project called EyeWriter, a low-cost eye-tracking system originally designed for the paralyzed graffiti artist TEMPT1. For prototyping, he used Kyle McDonald’s FaceOSC, an application that streams face-tracking data over OSC for quick prototyping in a range of environments.
Eye Conductor users can start by building a drumbeat, then play an instrument of their choice on top of it. They can also choose to play single notes or chords, or add an arpeggiator that stays in sync with the beat.
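To illustrate the sync described above, here is a minimal sketch of how an arpeggiator can stay locked to a step-sequenced beat by deriving its note times from the same BPM clock. The function and parameter names are illustrative assumptions, not Eye Conductor's actual code:

```python
# Sketch: an arpeggiator locked to the drum sequencer's grid.
# All names here are illustrative, not Eye Conductor's code.

def arpeggiate(chord, bpm, steps=8, step_division=4):
    """Return (time_in_seconds, midi_note) events for a run of steps.

    step_division=4 means sixteenth notes at the given BPM, so each
    arpeggio note lands exactly on the sequencer's subdivisions.
    """
    seconds_per_beat = 60.0 / bpm
    step_length = seconds_per_beat / step_division
    # Cycle through the chord tones, one per grid step
    return [(i * step_length, chord[i % len(chord)]) for i in range(steps)]

# A C-major triad (MIDI notes 60, 64, 67) arpeggiated at 120 BPM:
events = arpeggiate([60, 64, 67], bpm=120)
```

Because both the beat and the arpeggio are computed from the same tempo, they can never drift apart, which matches the "in sync with the beat" behavior the app describes.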
Raising the eyebrows transposes all played notes up a full octave, while opening the mouth adds a delay, reverb, or filter effect to the instrument being played. Thresholds for these facial gestures can be adjusted and saved to fit the unique abilities of each user. Currently the app is intended for live use, letting people play solo or with others.
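The threshold mechanism above can be sketched in a few lines: continuous face-tracking values (eyebrow raise, mouth opening) are compared against adjustable per-user thresholds to trigger discrete musical actions. The class, field names, and values below are assumptions for illustration, not Eye Conductor's API:

```python
# Sketch of the per-user threshold idea: continuous gesture
# measurements become discrete triggers. Names are assumptions.

class GestureMapper:
    def __init__(self, eyebrow_threshold=8.0, mouth_threshold=4.0):
        # Thresholds are adjustable and could be saved per user,
        # so smaller movements can trigger the same actions.
        self.eyebrow_threshold = eyebrow_threshold
        self.mouth_threshold = mouth_threshold

    def process(self, notes, eyebrow_raise, mouth_height):
        """Return (possibly transposed) notes and whether the effect is on."""
        if eyebrow_raise > self.eyebrow_threshold:
            notes = [n + 12 for n in notes]  # up one octave (12 semitones)
        effect_on = mouth_height > self.mouth_threshold
        return notes, effect_on

# A user with limited eyebrow mobility gets a lower threshold:
mapper = GestureMapper(eyebrow_threshold=6.5)
notes, fx = mapper.process([60, 64, 67], eyebrow_raise=7.2, mouth_height=1.0)
```

Storing one `GestureMapper` configuration per user is one plausible way to realize the "adjusted and saved" behavior the article mentions.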
“It is still only a prototype intended to show the possibilities of the technology, so in the future I hope to expand it, so you can add more tracks and instruments as well as export your compositions,” Refsgaard adds. “But for now it is mainly an instrument and a jamming tool.”
“I believe that the ability to express oneself artistically should be available to all, regardless of physical disabilities or challenges,” he says. “Therefore I wanted to create a solution that operated in the same domain as traditional instruments. Something that gives people a lot of freedom, but also requires them to practice, just like a regular instrument.”
Most of the users who tested the system were unable to speak, but Refsgaard says that, judging by the looks on their faces, it was obvious how much they enjoyed the experience. He learned that controlling things with the eyes can be tiring, but that didn’t stop some users, who were more than happy to keep going.
Click here to see more work by Andreas Refsgaard.