

The Programmer's Greatest Challenge Solved in 'Space Between'

Water has been called "the animator's nightmare" and "the programmer's greatest challenge," but when it was time to create a game that demonstrated what you can do when you swap a game controller for your bare hands (and, uh, your head), Justin Link and Chronosapien Interactive didn't shy away from the challenge. They dove right in and set their game, Space Between, underwater.

I called Link up to see what he and his young company had planned, and he told me that, with the support he's getting, his game might still be in the sea, but the sky's the limit.


Tell me about Space Between. What do you like about it? What were you trying to do?
Originally, we did Space Between for the Perceptual Computing Challenge. At the time, we were trying to create something that had a really nice aesthetic, because that's something that we felt was missing from the contest. But we were also trying to create something that was a sampler for technology that showed off some of its capabilities, with some of the different data you could get.

We did the original demo and we ended up placing second in the contest. After that, Intel contacted us to develop the full version, but we kept the same approach: strong aesthetic and something of a sampler for RealSense™. We were going for the Wii Sports of RealSense™, but in an underwater-themed environment.

Water comes up a lot as a way of demonstrating the next layer of tech. You remember Wave Race 64? Anyway, how does Space Between demonstrate the full breadth of capabilities? Was it just the 3D camera that you were trying to demonstrate?
It centers around the RealSense™ camera. I think what we were going for, and the stuff we developed, was based around really natural and intuitive gestures. We didn't want to shoehorn an old game style into a new interface, because from what we've seen, that doesn't work very well.

So, one of the things we were looking at was how you could control these creatures in a way where you could just walk right up and understand what you should be doing. That was our approach. We spent a lot of our time developing and exploring. On the technical side of things, we used a lot of out-of-the-box features: things for tracking hand position and head position, and things like that. We did have to create a custom gesture so we could have kind of a wave motion that the player would do.
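
Link doesn't go into the details of that custom wave gesture, but a minimal sketch of one common approach (counting left-right direction reversals in the tracked hand's horizontal position over a short window) might look like this. The thresholds and the function itself are illustrative, not Chronosapien's actual code:

```python
def detect_wave(xs, min_swings=3, min_step=0.01):
    """Return True if a sequence of hand x-position samples contains
    enough left-right direction reversals to count as a wave.

    xs         -- recent horizontal positions from a hand tracker
    min_swings -- reversals required before we call it a wave
    min_step   -- movement below this is treated as sensor jitter
    """
    swings = 0
    prev_sign = 0
    for a, b in zip(xs, xs[1:]):
        step = b - a
        if abs(step) < min_step:
            continue  # ignore jitter between frames
        sign = 1 if step > 0 else -1
        if prev_sign and sign != prev_sign:
            swings += 1  # the hand changed direction
        prev_sign = sign
    return swings >= min_swings
```

In practice a detector like this runs over a sliding window of the last second or so of samples, so a steady hand or a single swipe never triggers it.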


But to be honest, most of the development that we did with RealSense™ wasn't really technical. It was one of those things where, once you have everything on top of your API, you can access the data pretty easily. But a lot of development time was trying to figure out what works and what doesn't. Figuring out how to give the player feedback, for example, was a really big problem. Everyone's used to a mouse and keyboard or a game controller, but nobody really knows how to do gesture controls, so we had to design around that.

What's the objective of the game, and can you only play as the turtle or what?
You actually play as five different creatures or machines. The turtle is the first one in an area called the sunlight zone. Basically, the objective of that game is to swim and collect coins. There are no enemies, but you're avoiding obstacles. There's no lose condition. It's more exploration.

The next game is the twilight zone. To give some context, what you're doing is progressing deeper into the ocean, and each one of these zones is actually a layer in the ocean: sunlight zone, twilight zone, midnight zone, the abyss, and then the trenches. In the second one, the twilight zone, you play as a whale. You lean left or right, or forward or backward, to turn left or right, dive, or ascend. You're sort of fixed on a rail, and like the turtle one, you're just exploring and collecting points. It's more an experience than a game.
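
That lean-to-steer scheme boils down to mapping the tracked head's offset from a calibrated center onto steering axes. Here's a hedged sketch of what such a mapping could look like; the dead zone, gain, and function names are all assumptions for illustration, not the game's real values:

```python
def lean_to_steering(head_x, head_z, center_x=0.0, center_z=0.0,
                     dead_zone=0.03, gain=2.0):
    """Map head offsets from a calibrated center pose to (turn, pitch)
    commands in [-1, 1]. Leaning left/right turns the whale; leaning
    forward/back dives or ascends."""
    def axis(offset):
        if abs(offset) < dead_zone:
            return 0.0  # ignore small, unintentional sway
        value = (abs(offset) - dead_zone) * gain
        return min(1.0, value) * (1 if offset > 0 else -1)
    turn = axis(head_x - center_x)   # side-to-side lean
    pitch = axis(head_z - center_z)  # forward/back lean
    return turn, pitch
```

The dead zone matters here for exactly the feedback problem Link mentions: without one, the player's natural sway constantly nudges the creature, and the controls feel broken even though the tracking is working.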


The next one though—the midnight zone. You control a school of fish using hand position and openness. And what that will do is allow you to expand or contract your school of fish when you open or close your hand. You can also sort of swim around your environment. That game has a scrolling environment which means you have to navigate it without getting caught or stuck. You also have sharks that are swimming through the environment trying to eat you. That one's definitely a game. It's win or lose. In each of the games, you're collecting points.
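
The open-hand/closed-hand control maps naturally to a single "openness" value, which many hand-tracking SDKs report as a number from fist to open palm. A minimal sketch of driving the school's spread that way, with made-up radii and smoothing constant rather than anything from the actual game, could be:

```python
def school_radius(openness, min_r=0.5, max_r=3.0):
    """Linearly map hand openness in [0, 1] (0 = fist, 1 = open palm)
    to the spread radius of the fish school."""
    openness = max(0.0, min(1.0, openness))  # clamp noisy readings
    return min_r + (max_r - min_r) * openness

def smooth(current, target, alpha=0.2):
    """Exponential smoothing applied each frame, so the school eases
    toward the new radius instead of popping between sizes."""
    return current + alpha * (target - current)
```

Each frame the game would read openness, compute the target radius, and call `smooth` to move the school toward it; the fish then steer toward points on that radius around the hand's tracked position.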

Why this medium?
I think what's really appealing about working in this medium is that it's completely wide open. That's one of the things. There's total creative freedom, and you're not really locked in what has been done before, because nothing has been done before. But also, for me, working with emerging technology, it's really about those things being the future of how we interact with computers… so I like doing things that have a foot in the door of the next generation…

Where did you start off?
I started at a media research lab in Central Florida doing similar things, working on simulations, educational games, and research applications. I was doing apps with gesture cameras, mixed-reality HMDs, touchscreens, and touch tables, stuff like that. I was there for three or four years before moving on.

It's something that's changed just in the last handful of years. 3D cameras are appearing in the wild before people even know they want them. It's cool you're getting a chance to explore that before people even have that on their minds. What's changed in this space in the time that you've been involved?
I think the biggest thing that's changed is that bigger companies like Intel, Microsoft, and Facebook are supporting emerging technology. When I first started out, it was usually a hack. The first thing I created was a touch table, and I had to build it from scratch and hack it to get it to do what I wanted. I had to have an infrared camera and lights. There were no consumer infrared cameras, so I had to bust into a PlayStation Eye and take out the filter. On the software side, it was much more complicated. All of these libraries were made in other university environments, and they were full of bugs, probably. It wasn't a very good system.

Now, you have these APIs that are supported by big companies, and they do what they say they're going to do. It's all documented and clean and much easier to work with, so you can start creating right away.

The Intel® Software Innovator program supports innovative independent developers who display an ability to create and demonstrate forward-looking projects. Innovators take advantage of speakership and demo opportunities at industry events and developer gatherings.

The Intel® Developer Zone offers tools and how-to information to enable cross-platform app development through platform and technology information, code samples, and peer expertise in order to help developers innovate and succeed. Join communities for the Internet of Things, Android*, Intel® RealSense™ Technology, Modern Code, Game Dev and Windows* to download tools, access dev kits, share ideas with like-minded developers, and participate in hackathons, contests, roadshows, and local events.