I was standing in a serene, sunny atrium when suddenly a loud crash echoed behind me. Instinctively, I whipped around. A massive Tyrannosaurus Rex had rounded the corner and was running full speed right at me.
I froze. If there's one thing Jurassic Park taught us, it's that a T-Rex can't see you if you don't move. The colossal dinosaur stopped within a few feet of me and bent its 10-foot head, blinking as it inspected me. Each of its protruding teeth was the length of my forearm. My heart was racing. I suddenly noticed that I had crouched down to the floor. I was acutely aware that even though this was happening in virtual reality, my body had reacted as though it were real.
What if this Cretaceous VR demo had taken that sense of real to the next level by factoring in my body's real-time physical response to determine what the T-Rex did next?
Using data feeds such as a brain-wave-detecting electroencephalogram (EEG) or excitement-measuring galvanic skin response, we can algorithmically infer cognitive states like stress level, basic sentiment, focus, and attention. And since game engines render scenes in real time, developers could effectively equip the T-Rex to sense my fear. It could react to it, homing in as I became increasingly terrified. It could even require me to relax if I wished not to be eaten.
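At its core, a fear-reactive dinosaur reduces to a mapping from an inferred arousal score to game behavior. Here is a minimal sketch of that idea; the function name, thresholds, and behavior labels are entirely hypothetical, not from any real VR SDK:

```python
# Hypothetical sketch: map a normalized stress estimate (0.0-1.0),
# as might be inferred from EEG or galvanic skin response, to the
# T-Rex's next action inside a game loop. All names are illustrative.

def choose_trex_action(stress: float) -> str:
    """Pick the dinosaur's next behavior from the player's stress level."""
    if stress < 0.3:
        return "lose_interest"   # calm player: the T-Rex wanders off
    elif stress < 0.7:
        return "inspect"         # uneasy player: it leans in and sniffs
    else:
        return "attack"          # terrified player: it strikes

# A crude per-frame loop: each new reading nudges the creature's behavior.
readings = [0.2, 0.5, 0.9]
actions = [choose_trex_action(s) for s in readings]
print(actions)  # ['lose_interest', 'inspect', 'attack']
```

In a real engine the stress score would be smoothed over time and calibrated per player, but the control flow would look much like this.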
Oculus Story Studio calls 2015 "Year Zero for VR." Year one will then see the launch of the world's first widely affordable and portable VR device. Today's VR is but the beginning of a rapidly evolving universe, most manifestations of which have yet to be imagined.
"Hybrid" virtual reality experiences, which interact with data from the physical world, loom close on the horizon. One source of data is obvious: our brains and bodies. In the years to come, we'll integrate biometric signals such as brain waves, heart rate, and muscle contraction to make virtual worlds feel impossibly real. "Our bodies will become tools for more intricate interactions," says Autodesk's director of emerging technology, Brian Pene.
Some hints of what this will look like are already here. The game Mindflex Duel, for example, allows you to challenge a friend to a battle of brain control. As you willfully oscillate between states of relaxation and concentration, a stream of air levitates a ball. Your brain waves move it through hoops in a physical game.
Consumer-grade EEG sensors, which measure electrical activity in the brain, make such toys possible. For most of its 90-year history, EEG was confined to neuroscience labs. In recent decades, researchers have made great progress decoding the neural activity patterns that correspond to events such as recognizing an image or having a sexual climax. EEG products designed for the general public are at the forefront of bringing brain biometrics to the masses, and one place they'll see it is virtual reality: in games, job training, and therapy, among other applications.
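Under the hood, the "relaxation" and "concentration" scores in toys like Mindflex typically come from the relative power of EEG frequency bands, with alpha (8-12 Hz) associated with relaxation. The following is a toy sketch of that computation on a synthetic signal; real devices filter, deblink, and artifact-reject far more carefully:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power in [low, high] Hz via a plain FFT.
    A crude relaxation index: compare alpha-band power against
    other bands. Illustrative only, not a clinical method."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic one-second "EEG": a strong 10 Hz (alpha) oscillation plus noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 12)   # dominated by the 10 Hz component
beta = band_power(eeg, fs, 13, 30)   # mostly noise
print(alpha > beta)  # True: the "relaxed" reading wins
```

A head-to-head game like Mindflex Duel would simply compare two players' scores each frame and move the ball toward the weaker one.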
A wasp lands on your arm. Do you freak out? Swat it? Move slowly and hope it flies away? Turns out there's a specific part of the brain that calls your attention to such matters: the locus coeruleus (LC). It's the sole source of the neurotransmitter norepinephrine, a stress hormone that essentially hijacks your awareness and prompts you to make a decision. This process of capturing your focus is called "attention orienting," explains Paul Sajda, a professor of biomedical engineering at Columbia whose research centers on how the human brain decides what's important.
While we can't decode detailed thoughts, we can recognize where and when attention is focused. "We can strategically serve mediating stimuli," explains Sajda, "and present a person with an interesting choice at the ideal moment."
Showing people real-time biofeedback data while they consciously try to modify it can dramatically facilitate changes in the brain. In one Stanford study, participants experiencing pain focused on turning off the sensation while watching real-time visualizations of neural activity from their own brains. Those who got the biofeedback were more successful at down-regulating their pain. This begins to reveal the degree to which you can, and do, control your own mind.
Such applications are only beginning to appear in VR, where public hype centers mostly on games and cinematics. But VR offers the potential to use data about our own cognitive state in real time to strategically alter and improve our own minds. VR becomes a tool to turn yourself into the person you want to be.
Sajda points out that the LC is conveniently wired not only to the cerebral cortex but also to the muscles that control pupil dilation. This is why poker players often wear sunglasses. One curious experiment by Sajda and his team demonstrates the potential of detecting and acting on attention orienting. They measured the LC response times of baseball players through pupil dilation, looking at how long it took each athlete to register a pitch and decide when and where to swing. For reference, 25 milliseconds (25/1,000 of a second) can make the difference between the major and minor leagues. Using attentional-shift data, recruiters could pre-screen for talent. More interestingly, they could use real-time biometric feedback from a player's brain to train faster response times.
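Measuring a response time from pupillometry can be sketched, in heavily simplified form, as finding the first sample where pupil diameter exceeds a baseline threshold. All numbers below are synthetic, and real analyses smooth the trace, remove blinks, and fit pupil-response models:

```python
import statistics

def response_latency_ms(pupil, fs, baseline_n=20, k=3.0):
    """Toy onset detection: latency is the first sample where pupil
    diameter exceeds baseline mean + k standard deviations.
    `pupil` is a list of diameters, `fs` the sampling rate in Hz."""
    base = pupil[:baseline_n]
    mu = statistics.mean(base)
    sd = statistics.stdev(base)
    for i in range(baseline_n, len(pupil)):
        if pupil[i] > mu + k * sd:
            return 1000.0 * i / fs   # convert sample index to ms
    return None  # no dilation detected

# Synthetic trace at 1000 Hz: flat ~3 mm baseline, dilation at sample 250.
pupil = [3.0, 3.01] * 125 + [3.5] * 50
latency = response_latency_ms(pupil, 1000)
print(latency)  # 250.0
```

A training loop would show the athlete this number after every pitch, closing the biofeedback loop the paragraph describes.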
Conor Russomanno, founder of OpenBCI (open-source brain-computer interface), thinks that before we see widespread adoption of brain-controlled VR, we'll see muscle-controlled VR through EMG (electromyography). Much like EEG, EMG detects electrical activity, except that it reads skeletal muscle flexion, which emits a vastly stronger signal than the brain does, as a 2012 Old Spice muscle-music video conveniently demonstrates. The signal is also easy to control consciously: simply flex a muscle.
Say you were to pair EMG sensors with a virtual experience, freeing yourself from pesky limitations like physics. Companies such as Athos are entering the muscle sensor market, building EMG-inlaid workout gear. How long until hackers repurpose it for VR? You could flex a few muscles, raise your arms, and fly away as Iron Man, for example.
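The repurposing imagined here could start as simply as thresholding the rectified EMG amplitude, since a flexed muscle's signal dwarfs the resting one. A toy sketch with made-up numbers and an illustrative threshold:

```python
import numpy as np

def is_flexed(emg_window, threshold=0.5):
    """Rectify-and-average an EMG window. A flexed muscle emits a far
    larger signal than one at rest, so a simple mean-absolute-value
    threshold suffices for a toy controller. Values are illustrative."""
    return float(np.mean(np.abs(emg_window))) > threshold

# Simulated windows: near-silence at rest, big oscillations during a flex.
rest = 0.05 * np.ones(100)
flex = np.sin(np.linspace(0, 20 * np.pi, 100))  # mean |.| near 2/pi

print(is_flexed(rest), is_flexed(flex))  # False True
```

Wire both arms' sensors to a "thrust" control in the engine, and the Iron Man takeoff is a flex away.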
Meanwhile, new hardware deepens the illusion. Consider haptics, which can be woven into wearables that deliver tactile feedback through vibration. Haptic feedback dramatically improves immersion thanks to a phenomenon known as the disassociation effect. University of Pittsburgh professor Greg Siegle explains that when you see through someone else's eyes, watch that person being touched, and simultaneously feel a touch at the same spot on your own body, your brain reconstructs the scenario as though you were there. Receiving a virtual touch today implies that someday you will be able to reach out and feel something you're virtually touching.
Getting a physical sensation from a virtual world is trippy, and we're already seeing this used in porn. Sex toys can sync to VR porn, effectively allowing viewers (participants?) to feel what they see. Maurice Op de Beek of Amsterdam-based erotic technology company Kiiroo says the company is working with consumer EEG manufacturer Emotiv on "the possibility of letting a girl control her vibrator exactly the way she wants it (via EEG) or that another person can control it remotely."
They're also looking into machine learning to decode signals of preference. "Some people have a hard time saying 'I like that.' With EEG, they don't have to. You can detect what they like and when they are engaged," providing intimate feedback to a lover, even if they're an ocean away. And as Kiiroo's system begins to learn what you like and dislike, it could in theory make recommendations—a Netflix-esque Kama Sutra for the modern era.
Adventure isn't limited to sex. Solomon Rogers, founder of European VRX studio Rewind, says we're in for an overhaul of extreme sports—it's only a matter of time until you can shred slopes as Shaun White, your heart rate factoring into your speed and EEG-detected focus determining the craziness of your stunts.
It's not all adrenaline and orgasms. Today's tech is severely limited by sensor quality. EEG electrodes pick up the cleanest signal when pressed against the scalp with conductive gel, and getting high-quality readings is a pain in the ass.
Price is another significant barrier. Most EEG researchers use systems costing thousands or tens of thousands of dollars. But consumer EEGs are catching up, and with scale, price and quality will improve. Algorithms must also advance if we want to pick up on more nuanced cognitive states and power more personalized VR experiences. Fortunately, we live in a hacker's world: most of today's hardware ships with a software development kit, or SDK, and virtual reality is emerging as a platform driven by a growing global community of creatives and coders.
When you feel a touch in a virtual world, is the experience still purely virtual? As multitudes of technologies begin feeding rich biometric data into VR—brain waves, muscle contraction, heart rate—and pump feedback to you, we begin to blur the lines between real and virtual.
Brian Pene remarks that we're often presented with two scenarios for the future: man versus machine or man as machine. He thinks we will merge. And I think he's right. VR will learn from you. And you will learn from it.
The idea of becoming one with the virtual reality matrix may be scary. But shh, relax. You're going to like it.
Jacked In is a series about brains and technology. Follow along here.