I Stuck My Hands into a Virtual Reality Interface and Felt The Future
Image: Leap Motion


We don't usually go gaga for gadgets, but Leap Motion's new haptic feedback controller rules.

Yesterday, using a Leap Motion-enabled Oculus Rift, I experienced something groundbreaking: I could see and use my hands in virtual reality. The experiences were practical, psychedelic, and playful, but also rather stunning to behold, even after several previous VR experiences.

With virtual reality going commercial—a number of new consumer headsets will hit the market in 2015 or sooner—the big question becomes: when will we be able to actually feel and control our virtual environments? That's where haptic feedback comes in.


Haptic technology virtually approximates the feelings or sensations of movement, vibration, and other tactile forces we take for granted in our three-dimensional reality. On the motion end of things, which is the tech most ready for the market, think of the motion-controlled screens in Minority Report, Avatar, and Iron Man.

The team behind Leap Motion, a small controller that senses hand and finger movement, has recently been busy adapting the device for the Oculus Rift and other VR headsets (PC-only in beta).

Bringing your hands into VR is a game changer. Image: Leap Motion

In this early iteration of motion-controlled virtual reality, Leap Motion's sensors track hand and finger movement at 300 frames per second. Leap Motion's software then maps that movement into the Oculus Rift as CGI skeletal hand models. In Leap Motion's augmented reality mode, the real world and virtual reality overlapped, with my hands appearing ghostly and luminous. In other demos, my hands looked robotic or knowingly virtual.
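The pipeline described above—sense the hand at a high frame rate, then remap each joint into the headset's view—can be sketched roughly like this. This is a toy illustration with hypothetical data and function names, not the actual Leap Motion SDK:

```python
# Minimal sketch of head-mounted hand tracking (hypothetical, simplified):
# a tracker reports hand-joint positions in its own coordinate frame;
# to draw the skeletal hand in VR, every joint in every frame is
# translated into the headset's space.

def track_to_world(joint, mount_offset=(0.0, 0.0, -0.08)):
    """Shift a tracker-space joint (x, y, z in meters) into headset
    space, assuming the sensor sits on the headset's faceplate at a
    fixed offset. A real system would also apply the headset's
    rotation; a fixed offset is the simplest possible stand-in."""
    x, y, z = joint
    ox, oy, oz = mount_offset
    return (x + ox, y + oy, z + oz)

# One made-up "frame" of joint data for a palm and an index fingertip:
frame = {"palm": (0.0, 0.25, 0.1), "index_tip": (0.02, 0.30, 0.05)}
world = {name: track_to_world(p) for name, p in frame.items()}
```

At 300 frames per second this mapping runs fast enough that the rendered skeleton hands appear to move in lockstep with your real ones, which is what sells the illusion.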

While it's still early in the device's development, which meant it had to be wired to a laptop and had some glitchiness here and there, the potential is obvious.

In the first demo, I was able to bat some balls around with my hands, not unlike a cat. Fun enough, but then Leap Motion's David Oh asked me, "Do you want to go on an acid trip?" I could only say, "Yes."

Oh transported me into a new virtual world, where I was able to manipulate luminous particles of swirling light. Imagine playing with galaxies or, on a smaller scale, dust particles suspended in a sunbeam, and you'd almost be there. It was trippy enough seeing my virtual hands, but even more so that I was able to control magic light beams in 360 degrees. When Motherboard's Derek Mead tried it, he said he started to hallucinate, which sounds about right.


Controlling buttons virtually is really cool. Image: Leap Motion

In the next demo I flew like Superman through a series of layered, Tron-like grids. While the program was very basic, it still let me veer left, right, up, and down, as well as stop, drop into slow motion, or simply float and hover amid the white grids, all controlled by hand motions. The fact that your hands are tracked and displayed in the Oculus environment made for an incredibly immersive experience. I was also able to play around inside a hyper-colored virtual recreation of CERN's Large Hadron Collider called, naturally enough, Collider.

To propel myself through it, I pinched my virtual fingers to create triangles that slingshotted me forward. If I clapped my hands, the scene shifted colors and shapes across its full 360 degrees. By putting my hands together, then slowly opening them, I created an amorphous sphere that grew larger the farther I drew my hands apart.

While all of this was fun and occasionally mesmerizing, the practical VR widget that let me control a Minority Report-style computer interface within a 360-degree virtual space might have been the most mind-bending. The first thing I saw was a bunch of lorem ipsum text hanging on a blue screen atop the dark augmented reality background. By gesturing up or down with my hand, I could scroll as if I were using a smartphone's touch screen. Behind me was another screen enabled for up/down scrolling.

The end result is the ability to manipulate virtual environments.


But the widget's most impressive feature was an array of three on/off buttons, beneath which hung a fader, which looked something like the volume slider on the iPhone's Control Center. Holding my left hand up, I could tap the three buttons, turning them off and on, and move the slider left-right with ease. While there obviously wasn't a sensation of touch, the widget's 3D buttons and slider—all floating there in virtual reality—gave me a pretty immersive and convincing sense of control.

It's a bit hard to describe just how cool this is. Basically, imagine living inside an omnipresent virtual workspace that you can manipulate by poking the air in front of your face—except those gestures are tracked and modeled in a virtual environment with more or less endless possibilities. I can't imagine setting up dozens of touchscreen monitors around my head in real life, but in the virtual world, it made sense.

Sure, this particular app was basic, and the virtual accuracy of real hand movements wasn't always there, but it was interesting to see the next step in VR. I couldn't help but think of motion control's applications, even if it doesn't have the tactile sensation of smartphone navigation. It's also still relatively new, which leads to some curious glitching here and there.

"It's an early prototype. I don't hold this against it at all, but we were in a bright room and the thing kept glitching and at one point there were like, phantom disembodied hands crawling up this guy's shirt," Motherboard's Adrianne Jeffries said after trying it. "I was thinking of it for work, but I guess you could use it for games too. I feel like it's going to get really, really good. Someone should take the prototype and make a glitch art exhibit out of it before that happens, though."

The controller itself mounts onto just about any VR headset. Image: Leap Motion

But imagine users flicking through, say, a Netflix menu to find the movie they'd like to virtually watch. Gamers could use it to move through menus and dashboards, or interact with objects with ease. Educators, especially in science and medicine, would suddenly have a virtual tool that could give students something close to hands-on experience. None of this factors in voice control, which would pair well with motion-enabled VR experiences.

Looking forward, as the cameras and sensors behind tech like Leap Motion improve, along with virtual GUIs and environments, the sense of control will only sharpen, and we'll all take a big step toward living in the old Metaverse. All it should take is some creatively ambitious developers using this technology to make it happen.