Sonic Evolution #4 - Meet the Human Cyborg that Can Hear Colors
Neil Harbisson, an artist born completely color-blind, can’t tell blue from green. This is a problem; his job revolves around creating harmony between primary and tertiary colors, and he wanted to find a fix. So, while studying at Plymouth University in the UK, he developed software that associated colors with different sounds and had it drilled into his head.
The device, called the eyeborg, is attached to Neil’s skull, making him the first legally recognized cyborg. It includes a camera that picks up colors and communicates them to his brain as musical tones, plus Wi-Fi and Bluetooth capability that allows the eyeborg to operate wirelessly. Now, ten years later, he can hear color.
We talked to Neil about the evolution of the eyeborg, picking up undiscovered colors and why drilling a hole into your skull may be a pretty smart choice.
Noisey: What is the eyeborg?
Neil: It’s an antenna that picks up colors and sends them to the back of my head. I have an entry inside my skull that allows me to hear the frequencies of the colors through bone conduction.
What was life like before you built the device?
Seeing everything in grayscale had always been normal to me until I was told that I couldn’t see color. The project started because I wanted to find a way to perceive colors.
How has it impacted your life knowing that you couldn’t experience colors fully?
I felt like there was a lot of information I was missing, because color is used as a code and to describe everyday things; I was reminded constantly that I was missing this layer of life. In some cases it made me feel socially excluded because I didn’t understand much of what people were saying. If someone asked me if I had seen a man with blue eyes and brown hair, I wouldn’t know. It made me feel disconnected from everyone else.
Right, and how did you start to turn your body into a cyborg?
It originally started as an art project at Plymouth University with Adam Montandon. He was studying cybernetics and I was studying music. He gave a talk about how technology could be used in different ways, and afterwards I spoke to him about starting a project to extend my senses. In 2004, I was connected to a computer and its webcam with a pair of headphones. The software managed to pick up twenty-five colors and interpret them as tones. Later I tried to find other people to develop the sensor to pick up more colors. For the last ten years I’ve been collaborating with people to upgrade it to the point where it’s now a body part.
How would you describe the vibrations of the eyeborg as they come through to you?
When I hear things through the antenna it feels like it’s my brain creating sounds. It depends on the frequency of the vibration; ultraviolet color has a high frequency so I can really feel the vibration of that color whereas infrared differs because it’s a very low frequency.
Tell me how the device has impacted your life.
It makes me feel closer to nature and other animal species because I perceive infrared and ultraviolet colors which only insects and other animals can interpret. Hearing through bone conduction makes me closer to whales and dolphins because they also perceive sound in that way. The fact I have an antenna also makes me feel closer to insects that have antennas. With this technology in my body I don’t feel closer to robots, I feel closer to animals.
That sounds insane! How has it affected your art?
To me there’s no difference between sound and sight; it’s a union between what I hear and what I see. When I listen to or compose music, each note feels like a perception of color. I can’t really compose music without thinking of color. When I paint I combine color and sound, so there’s no separation between visual art and music.
Did your musical training help you remember all the frequencies of the colors?
My training never went as far as differentiating frequencies in such a precise way. Color has three hundred and sixty notes in an octave, whereas music only has twelve notes in an octave. I had to train myself the moment I started using the software. It took me three years to memorize all the microtones.
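The scale Neil describes can be sketched in a few lines: dividing a single octave into 360 microtones, one per hue degree, instead of the 12 semitones of the standard musical scale. This is an illustrative sketch only; the base frequency and the exact mapping the eyeborg uses are not specified in the interview, so both are assumptions here.

```python
def hue_to_frequency(hue_degrees, base_freq=349.23):
    """Map a hue angle (0-360 degrees) to a tone frequency in Hz.

    One full octave (a doubling of frequency) is split into 360
    equal microtonal steps, one per degree of hue. base_freq is a
    hypothetical reference tone (F4 here), not the eyeborg's actual
    calibration.
    """
    return base_freq * 2 ** (hue_degrees / 360)

# Adjacent hues differ by only 1/360 of an octave -- thirty times
# finer than the 1/12-octave semitone steps a musician trains on,
# which is why memorizing them took years.
step_ratio = hue_to_frequency(1) / hue_to_frequency(0)
semitone_ratio = 2 ** (1 / 12)
```

A hue of 360 degrees lands exactly one octave above the base tone, mirroring how the color wheel wraps around on itself.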
What's next for the eyeborg? Are you going to upgrade it?
We’re trying to connect a telephone call to my head, but there are still some things to work out. The overall aim is to explore the use of Bluetooth and Wi-Fi so I can have bone-conducted internet connections and telephone calls through the antenna. This could also be used to transmit colors from anywhere in the world: if someone in New York wanted to send me the colors of Times Square, they could just connect to my head and send them directly to my skull.
You’re incredible. Thanks Neil.
Dan is on Twitter. Follow him - @KeenDang.