Virtual Reality Shopping Is On Its Way With Sixense

History might repeat itself, "first as tragedy, then as farce," but technology often starts doing something just for fun, and ends up doing something serious. For instance, the computerized closet in the 1995 movie Clueless is real: you can totally "try on" outfits without actually changing or even being in the same room as the clothes. As counterintuitive as it sounds, that same technology can be used to help stroke victims recover mobility and strength.

Danny Woodall is the creative director at a company called Sixense Entertainment, where they've been working on VR for both shopping and physical therapy. They use an electromagnetic motion tracking system instead of a 3D camera, which Woodall told me solves virtual reality's "sight-line problem." I called him up and we chatted about why they chose to go that route, why virtual shoe shopping tops regular shoe shopping, and where they found the virtual room that Woodall trashed with a drone.

I haven't heard about electromagnetic cameras; a lot of people have talked about using Intel cameras for motion tracking, but those are 3D cameras. Can you explain the difference?

3D cameras use RealSense™ depth sensing, and people have been using those as a solution for tracking limbs or fingers. What we do is different. The main reason those don't work super great in VR is that, as soon as you move your hand out of the camera's field of view, you no longer have a hand. You can't turn a doorknob if you aren't looking at it.

What we provide is more like a game controller. It's a base station that you put on your desk, and that station puts out an electromagnetic field with about a nine-foot radius. The controllers are in STEM Packs, which can be mounted on anything, and they pick up where they are within that magnetic field. With some fancy math, we send the data wirelessly back to the host with super low latency, so everything stays updated for five devices very quickly. For VR that's super important, because there's no issue with occlusion: if you turn around, say, and the base station is behind you, your hands can still be tracked through your body. There's no need to optically see the tracker. That's a good solution for VR.
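Woodall's description boils down to a simple data flow: each tracked device reports a pose inside the base station's roughly nine-foot field, and the host keeps all five updated every frame, whether or not a device is visible. Here is a minimal Python sketch of that idea; the `Pose` type, the radius check, and the device names are all hypothetical illustrations, not Sixense's actual API:

```python
import math
from dataclasses import dataclass

FIELD_RADIUS_FT = 9.0  # approximate working radius of the base station's field


@dataclass
class Pose:
    # Position in feet, with the base station at the origin.
    x: float
    y: float
    z: float


def in_range(pose: Pose) -> bool:
    """A tracker reports a valid pose anywhere inside the field radius.

    Unlike an optical system, no line of sight to the base station is
    required, so a hand behind your back is still tracked.
    """
    return math.sqrt(pose.x**2 + pose.y**2 + pose.z**2) <= FIELD_RADIUS_FT


def update_frame(poses: dict[str, Pose]) -> dict[str, Pose]:
    """One host-side update: keep every in-range device, occluded or not."""
    return {name: p for name, p in poses.items() if in_range(p)}


# Five devices; whether any of them is "behind" the user is irrelevant here.
devices = {f"stem_{i}": Pose(x=1.0 * i, y=0.0, z=0.5) for i in range(5)}
tracked = update_frame(devices)
print(len(tracked))  # all five still tracked -> 5
```

The contrast with a camera-based tracker would be an extra visibility test in `update_frame`, which is exactly the check this approach gets to skip.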

Very cool. You guys built the tracker?

So we do the hardware side, and we had a Kickstarter that we finished a year and a half ago. We've been shipping out hardware to our backers. We have prototype units in the field. It goes a bit deeper, and then, of course, we have software solutions. My job has been to create compelling demos that show people what virtual reality can be like once you have full-motion input and intuitive input, right? So we've made a bunch of demos, some are games, and then some are like the vRetail project, which is geared towards a digital retail experience.

Can you explain vRetail?

At a lot of places you can go to, like Best Buy, you can buy products, but you get limited interaction at most of those places. You can see them and hold them in your hand, though. Of course, they have hundreds of other items they could sell, but they don't have the floor space for them, so you can only get those online. Then you have places like Amazon where you can buy whatever you want. They make returning an item so easy that a lot of people will just buy something and send it back if it doesn't work out. For companies like Amazon, or anyone doing the retail experience in a traditional 2D browser, anything that reduces returns would be a huge financial win.

For those of us who are geared toward the consumer side, what we want to show is that once you have these items in front of you, and you can grab them and hold them, you start to appreciate them on a much more personal level. Like, you've actually held them and, in some cases, actually interacted with the item. I think what we showed you was mostly shoes, but we have a toy shopping experience too. In one of those we made a drone you can pick up off a shelf and fly around as if you actually had a drone.

I remember watching that drone knocking stuff over. What room was that modeled after?

The toy store room was a demo that Epic made to show off a realistic environment. It happened to be a great environment to piggyback our shopping experience on, because it's like you're in someone's apartment with a couch and a TV, and they had done a lot of polish on it, so it was a great starting point for us to build on top of.

What are your other projects? Where else do you see this working?

We're involved with a bunch of different people working in rehabilitation, including a company that uses our tracking technology with stroke patients. They can track a patient's limbs and measure progress: if someone can only raise their arm so many degrees, they can tell over time whether that's improved, and they can learn a lot from the data the tracking gives them.

They also have a bunch of fun games, like juggling, that get you moving and make you want to move. I don't know if you've ever been in rehab, but it's painful, and you don't want to do it. The payoff is way down the road, and it's hard to see immediate results, so a lot of people end up not doing a good job at rehab. In this case, these guys have gone the extra step and tried to make the rehabilitation activities fun and exciting, so you want to stick with them and get better.

We also do things with simulation and training. There are people doing welding simulators: trainees put on an Oculus Rift and use our tracking solution with a controller so they can practice welding in a safe environment before they ever go into the field and touch a real torch. Because of the accuracy and precision of the device, the simulator can tell what kind of weld they're doing and what needs to be done to improve their technique.

You can quantify and track how well someone is welding?

You can tell how fast they're moving, how close they are, the temperature, all of that stuff. So they have enough information to really help people and train people how to weld before they even have to touch a torch—which is important because that can be dangerous.
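The metrics he mentions, speed and distance, can be derived from nothing more than timestamped torch poses. A minimal sketch of that derivation, with made-up sample numbers; a real simulator would stream poses from the tracker and also model things like temperature:

```python
import math

# Hypothetical pose samples for a torch tip: (time_s, x_mm, y_mm, z_mm),
# with the workpiece surface at z = 0. These numbers are invented for
# illustration only.
samples = [
    (0.0, 0.0, 0.0, 8.0),
    (0.5, 5.0, 0.0, 7.5),
    (1.0, 10.0, 0.0, 8.5),
]


def travel_speed_mm_s(samples):
    """Average speed of the torch tip along the weld path."""
    path = sum(math.dist(a[1:], b[1:]) for a, b in zip(samples, samples[1:]))
    return path / (samples[-1][0] - samples[0][0])


def mean_standoff_mm(samples):
    """Average tip-to-workpiece distance (here, just the z coordinate)."""
    return sum(s[3] for s in samples) / len(samples)


print(round(travel_speed_mm_s(samples), 1))  # 10.1
print(mean_standoff_mm(samples))             # 8.0
```

Comparing numbers like these against a target range for a given weld type is one plausible way a simulator could turn raw tracking data into feedback on technique.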

So what special computing does this all require?

We use the processing from the CPU, but nothing too unique; we're just pushing CPUs harder than they're used in normal applications. One thing we're adding a lot of is 3D audio, spatialized audio. It's much more advanced than what we've used with gaming speakers so far. It's different in VR, because the listener is moving around, and things have to sound like they're coming from a specific place and distance, and the new stuff that's coming out for space and distance really puts you there. But it pushes the CPUs really hard, to the point where you can only use eight at a time.

It's absolutely amazing. We had a demo where you're blocking laser bolts with laser swords—you get where I'm going with that—but we attached some 3D sounds to a virtual drone that was flying around, and you could listen, close your eyes and follow the VR drone like that. In the past we've been able to get you to look right when something comes out of the right speaker, but there's no fidelity in that. You can cue someone to look that way, but you can't make someone stare right at it.

I've even seen people navigate to one ringing phone among other phones on a table. They go right to it based on the 3D audio. It's come a long way, and it's only getting better.
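Spatialized audio can pin a source to a point like that because the engine renders per-source cues such as distance attenuation and the tiny difference in arrival time between the two ears. A toy Python sketch of those two cues, assuming a flat 2D plane and ignoring the HRTF filtering real spatializers apply on top:

```python
import math

SPEED_OF_SOUND_M_S = 343.0
EAR_SPACING_M = 0.2  # rough distance between the ears


def spatial_cues(listener_xz, source_xz):
    """Two cues a spatializer renders per source: inverse-distance gain,
    and the interaural time difference (ITD) that tells the brain which
    way to turn toward the sound."""
    dx = source_xz[0] - listener_xz[0]
    dz = source_xz[1] - listener_xz[1]
    dist = math.hypot(dx, dz)
    gain = 1.0 / max(dist, 1.0)  # simple inverse-distance rolloff
    # Approximate path difference between the ears from the bearing angle
    # (dz is "straight ahead"; positive ITD means the right ear hears first).
    itd_s = EAR_SPACING_M * math.sin(math.atan2(dx, dz)) / SPEED_OF_SOUND_M_S
    return gain, itd_s


# A drone two meters ahead and two to the right: quieter than one at arm's
# length, and arriving at the right ear slightly before the left.
gain, itd = spatial_cues((0.0, 0.0), (2.0, 2.0))
```

Recomputing cues like these every frame for a moving listener, per source, is what makes the workload scale with the number of simultaneous sounds.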

What's next?

Our main focus is getting the hardware out there to everyone, and then our software team is going to be working on really awesome experiences.

The Intel® Software Innovator program supports innovative independent developers who display an ability to create and demonstrate forward-looking projects. Innovators take advantage of speaking and demo opportunities at industry events and developer gatherings.

The Intel® Developer Zone offers tools and how-to information to enable cross-platform app development through platform and technology information, code samples, and peer expertise in order to help developers innovate and succeed. Join communities for the Internet of Things, Android*, Intel® RealSense™ Technology, Modern Code, Game Dev and Windows* to download tools, access dev kits, share ideas with like-minded developers, and participate in hackathons, contests, roadshows, and local events.