I Controlled Google Glass With My Mind

“This is about using the mind and trying to develop an interface that you think through rather than tap through.”
July 10, 2014, 4:52pm
The author trying out MindRDR

Brain-machine interfaces have been around for a while, even in the consumer market, but they haven’t quite yet lived up to their sci-fi promise. Various headsets enable you to track your brain activity or play simple games, but the real allure of mind control tech is in its promise to communicate with devices in a manner as seamless as telekinesis.

This is what MindRDR, a new app that pairs Google Glass with a NeuroSky EEG headset to create a simple mind-controlled interface, offers: just concentrate for a few seconds, and you can take a picture with the power of your mind.


It’s a limited application, but it’s pulled off with pleasing simplicity. I tried it out at the office of This Place, the Shoreditch-based company behind the tech. Wearing Glass and the EEG headset, I set about concentrating, opting to do times tables in my head. As my brain worked (as George Osborne knows too well, "7 times 8" is particularly difficult), I watched a line progress up the Glass screen. When it reached the top—a process that only took a few seconds—the device took a picture of This Place director Ben Aldred.

Av. Meditation: 42 Av. Attention: 47 #throughmind #throughglass

— MindRDR (@mind_rdr) July 9, 2014

I thought for a while again, and it sent the picture to Twitter, along with some stats on my ability to focus (turns out I’m about as good at relaxing as I am at concentrating).

The simple task is a proof of concept of what a mind interface could do, creative director Chloe Kirton explained. “This is about using the mind and trying to develop an interface that you think through rather than tap through,” she said.

The idea for the interface was inspired by the limitations of Google Glass, a device she nevertheless thinks is “fantastic.” “One of the limitations that we noticed was you need to have a really high level of dexterity and movement to actually be able to use Google Glass,” she said. “Sure, it does have the voice control, but there are times when that does just not work for you and you have to use your hand to navigate, which obviously means you have to use your arm.”

And despite what Minority Report might have you believe, waving your arm around all the time is actually really tiring, not to mention a difficult task for people with limited mobility. That’s where the idea of mind control came in. MindRDR is basically a Glass app that turns the EEG readings from the NeuroSky headset (communicated via Bluetooth) into a scale. In this application, when a certain point on that scale is reached, Glass takes a picture.
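The actual MindRDR source is on GitHub, but the logic described above can be sketched in a few lines. This is an illustrative mock-up, not the real app's code: NeuroSky's eSense "attention" metric does arrive as a 0–100 value, but the threshold, step size, and function names here are assumptions.

```python
THRESHOLD = 70  # hypothetical attention level that counts as "concentrating"
STEP = 20       # hypothetical progress gained (or lost) per reading

def run_session(readings, threshold=THRESHOLD, step=STEP):
    """Walk through a stream of 0-100 attention readings.

    Return the index of the reading at which the on-screen scale
    tops out and the capture would fire, or None if it never does.
    """
    progress = 0
    for i, attention in enumerate(readings):
        if attention >= threshold:
            progress += step                     # the line climbs while you concentrate
        else:
            progress = max(0, progress - step)   # and falls back when you relax
        if progress >= 100:
            return i                             # Glass takes the picture here
    return None
```

A few seconds of sustained focus, e.g. `run_session([80, 85, 90, 80, 88])`, would trigger a capture on the fifth reading; a distracted stream like `[30, 20, 40]` never would.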

You don't even have to have the two devices on the same person. Aldred wore the Google Glass while I kept the NeuroSky, so that my thoughts took a picture of myself through his Glass. I'm not sure exactly what the practical applications of that could be, but I guess it could make for some fun if you want to pretend to telekinetically hack your friend's device.


Av. Meditation: 52 Av. Attention: 58 #throughmind #throughglass

— MindRDR (@mind_rdr) July 9, 2014

Of course, it’s not quite as easy as just thinking “take a picture”—you have to put some effort into it. Kirton said it gets easier with practice, though trying to use it while talking or with your eyes closed is difficult. The rather anxious-looking guy who tried the device before me had the opposite problem; his agitated energy accidentally took a photo when he didn’t mean to.

And it’s definitely got other drawbacks. For one, there’s the discomfort—and fashion disgrace—of wearing two face-computers at the same time. Luckily, the clunkiest bit of the Glass headset is on the opposite side to the main lump of the NeuroSky, so with a little finagling you can get them both behind your ears. But I wouldn’t exactly want to go out in public like that.

Then there are the usual problems with Google Glass: when I got to try it, the device was low on battery and kept overheating. Aldred had to hold it under the air conditioning unit before it would agree to play.

When we got it going, I still had to go through the whole “OK Glass” voice control system to get to the app. But it’s still an intriguing first step to a thought-controlled device, and the possibilities are many.

Creative director Chloe Kirton wearing the device

“Where are we going next? We’re not quite sure. We’re seeing a million different avenues that we can go in,” said Kirton. While it’s fun for anyone to play Jedi with, a brain-powered interface has obvious medical applications, and the team is in discussions with various organisations about how people with physical disabilities and brain disorders might benefit from this kind of technology. They report that Stephen Hawking, who has motor neurone disease, has shown an interest in its development.

From today, they’ve released the app as open source on GitHub, in the hope that the next steps for the tech will become clearer as people make use of it. Users will, however, need to invest in Glass and a NeuroSky headset to give it a go.

I left This Place keen to see how this thought interface, or others like it, might progress. Taking and sharing a picture is a cute example, but the real test will be to make a device that can do everything you’d expect it to do normally, solely via brain controls. And preferably with a little less plastic around your forehead.

We’re already sharing some of our biology with our devices; soon we could be sharing some of our neurology too.