


Gesture-Based Interfaces Will Provide Immersive Infotainment Experiences

With a 3D camera and a projector, gesture-based computing could provide infotainment hubs for public use

On International Women's Day, Intel® sponsored events in 16 cities around the world to celebrate women in technology. Ranging from panel discussions at large conferences to small-group, hands-on workshops, the events aimed to foster a sense of confidence and camaraderie among women in a notoriously male-dominated field. Working with allies like the Global Geekettes, the events also spread the word about Intel's Code for Good initiative, which pairs developers inclined towards volunteerism with a like-minded non-profit organization to better the world through programming.


In Civitanova Marche, Italy, for example, an audience of about 30 (mostly) female, established and aspiring techies gathered for an afternoon-long talk on e-commerce, software development and new media, led by five distinguished women speakers. One of those speakers was Francesca Tosi, an Italian software developer and Intel® Innovator. She presented a slideshow on her work with perceptual computing and the Intel® RealSense™ camera.

Such cameras are making it possible for users to experience games and other digital media in all kinds of new ways. Tosi's Florence-based team is at the forefront of these exciting new applications. One of her recent projects involves using an Intel® Edison Board, an Intel® RealSense™ camera, and a projector to transform any ordinary flat surface into an interactive "infotainment" system, which allows users to do things like play chess or surf the web using hand gestures and speech recognition. Motherboard spoke with Tosi about this project and the potential of perceptual computing.

Can you tell me a little bit about your background?

From a professional point of view, I'm a freelance web and mobile developer with a math background. I mostly work in the field of rendering and computer vision in C/C++, Java, and JavaScript. In 2008, I co-founded one of the first Italian Google Developer Groups, GDG-Firenze, and a couple of years ago I co-founded another GDG in the area where I'm working right now, GDG Ancona. I think I invest so much energy in these communities, and in organizing conferences, tech talks, and hackathons, because I strongly believe that interaction and cooperation are fundamental for a developer to become part of the evolution of new technologies.


How would you describe your latest project?

It's an interactive infotainment system. With the support of a projector (which will eventually be 3D capable), an Intel® RealSense™ camera, and some computational power, it provides the end-user with access to a mix of digital media or gaming using any generic surface, such as a table or wall, as both an input and output device.

The system is intended for public places, like pubs, restaurants or coffee shops. For example, if we were in a restaurant drinking our coffee around a table, we would be able to see the digital media being projected on that table. Using the capabilities of the Intel® RealSense™ technology, which can capture and recognize users' gestures, we'd also be able to interact with the media to have a more immersive experience.

Providing a natural interaction system for infotainment, natural in the sense of not requiring any physical touch, is a challenge. We're facing the problem of dropping both the "input" and the "output" devices for interaction, turning the user's body into the control input and any ordinary surface into the output. It's challenging and, in our opinion, awesome.
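One common way to implement the surface-as-output half of that idea is to calibrate a homography between the camera's view and the projector's image, so that anything the camera sees on the table can be mapped back to projector pixels. The sketch below is purely illustrative, not Tosi's code: it uses OpenCV (which she does not mention by name), and the corner coordinates are made-up placeholders.

```cpp
// Minimal sketch: map camera pixels to projector pixels via a homography.
// Assumes the four corners of the projected image have been located in the
// camera frame (e.g. by projecting markers). The coordinates below are
// placeholder values, not measurements from Tosi's system.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <iostream>
#include <vector>

int main() {
    // Where the projected image's corners appear in the camera image (placeholders).
    std::vector<cv::Point2f> cameraCorners = {
        {112.f, 84.f}, {538.f, 96.f}, {524.f, 402.f}, {98.f, 390.f}};

    // The same corners in projector coordinates (a 1280x720 projected image).
    std::vector<cv::Point2f> projectorCorners = {
        {0.f, 0.f}, {1280.f, 0.f}, {1280.f, 720.f}, {0.f, 720.f}};

    // Homography that warps camera coordinates into projector coordinates.
    cv::Mat H = cv::findHomography(cameraCorners, projectorCorners);

    // Example: a fingertip detected at camera pixel (300, 250).
    std::vector<cv::Point2f> fingertip = {{300.f, 250.f}}, onScreen;
    cv::perspectiveTransform(fingertip, onScreen, H);

    std::cout << "Fingertip maps to projector pixel "
              << onScreen[0].x << ", " << onScreen[0].y << "\n";
    return 0;
}
```

Once a hand or fingertip has been located in the camera image, the same transform tells the system which projected element it is hovering over.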

Intel® RealSense™ and a projector enable the gesture-based interface

Where did you get the idea for this?

I love to drink cappuccino for breakfast at the bar, and I really enjoy consuming infotainment. I thought it would be great to be able to interact naturally with it using simple gestures and have it directly on the table where I sit.


What kinds of games have you tried with the system so far?

The basic idea is to use a video projector and an Intel® RealSense™ camera to turn any reasonably smooth surface into a sort of wide touchscreen, so almost any table game can be reproduced. In our still-naïve (but, in our opinion, promising) experiments, a round table can become a nice Risk-like map, where you can set up a battlefield using the fancy features of the camera that can distinguish objects, and drive the armies using simple gestures. Also, a simple, but amusing, multiplayer Ping-Pong game can be set up on a table, even with our experimental hardware.
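The "wide touchscreen" behaviour is typically built on a reference depth map of the empty surface: a pixel counts as a touch when the live depth reading sits only a couple of centimetres above the table. The following sketch is a generic illustration of that test, not the team's actual code; the buffer layout and the millimetre thresholds are assumptions.

```cpp
// Sketch of depth-based "touch" detection on a projected surface.
// background and current are depth maps in millimetres (0 = invalid reading),
// as a RealSense-style depth camera would produce. Thresholds are guesses.
#include <cstdint>
#include <vector>

struct TouchParams {
    uint16_t minHeightMm = 8;   // ignore sensor noise hugging the surface
    uint16_t maxHeightMm = 30;  // above this it's a hover, not a touch
};

// Returns a mask the same size as the inputs: true where a touch is detected.
std::vector<bool> detectTouches(const std::vector<uint16_t>& background,
                                const std::vector<uint16_t>& current,
                                TouchParams p = {}) {
    std::vector<bool> mask(current.size(), false);
    for (size_t i = 0; i < current.size(); ++i) {
        if (background[i] == 0 || current[i] == 0) continue;  // invalid pixels
        if (current[i] >= background[i]) continue;            // nothing above the table
        uint16_t height = background[i] - current[i];         // height above the surface
        mask[i] = (height >= p.minHeightMm && height <= p.maxHeightMm);
    }
    return mask;
}
```

In practice the resulting mask would be cleaned up (for example with a connected-components pass) and its blobs mapped through a camera-to-projector calibration to decide which game square or button was pressed.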

What hardware does this system require?

We can't do without Intel® RealSense™ and some high-performance computing boards to track the players in real time and augment their table experience. So, for that part, we use an Intel® Edison Board. The projector is the least demanding part.

And what software do you rely on?

Intel® RealSense™ SDK, of course, and a lot of ad-hoc, hand-written code.
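As an illustration of what the camera side of such code can look like, here is a minimal depth-capture loop. It deliberately uses the open-source librealsense2 C++ API rather than the Intel® RealSense™ SDK Tosi refers to, so the calls differ from what her team actually writes; treat it as a sketch of the general shape only.

```cpp
// Basic depth-capture loop using the open-source librealsense2 C++ API.
// Illustration only: Tosi's project used the Intel RealSense SDK of the time,
// whose API differs from this library.
#include <librealsense2/rs.hpp>
#include <iostream>

int main() try {
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_DEPTH, 640, 480, RS2_FORMAT_Z16, 30);
    pipe.start(cfg);

    for (int i = 0; i < 300; ++i) {                 // ~10 seconds at 30 fps
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::depth_frame depth = frames.get_depth_frame();

        // Distance (in metres) to whatever is at the centre of the frame --
        // the raw ingredient for hand tracking and touch detection.
        float d = depth.get_distance(depth.get_width() / 2,
                                     depth.get_height() / 2);
        std::cout << "Centre depth: " << d << " m\n";
    }
    return 0;
} catch (const rs2::error& e) {
    std::cerr << "RealSense error: " << e.what() << "\n";
    return 1;
}
```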

How long have you been working with RealSense cameras?

Quite a while, actually. I published a paper about the Intel® RealSense™ camera in February 2015 on the Intel® Developer Zone. I started experimenting just after I joined the program the year before.

One of the projects I became involved in while part of the Intel® Innovator Program was on augmented reality in web applications. Having the opportunity to explore the possibilities of devices capable of measuring depth, such as RealSense™, fascinated me so much that I began reading and learning as much as possible about this awesome technology.


Having a 3D camera, a projector and a computer at each table in a café sounds pretty expensive. Do you think it can be made cost effective?

We do not necessarily anticipate it being installed at each individual table, but rather as a sort of attraction at a few of the tables, perhaps in an entertainment zone. It is not so unusual to have interactive games like the Wii or Xbox in a bar or restaurant. I think that, over time, as the technology becomes more prevalent, the costs will drop accordingly.

What are some other challenges you ran into with this project?

Dealing with 3D is not so easy. I really love working with gesture and speech recognition, but it's a challenge. We're working with a lot of difficult algorithms, and sometimes you have problems with noise or incomplete data due to ambient conditions.
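For what it's worth, a common first line of defence against that kind of noise is a small median filter that skips invalid (zero) depth readings. The sketch below shows the idea; it is a generic illustration rather than the specific filtering used in this project.

```cpp
// One simple, generic way to tame noisy or incomplete depth data: replace each
// pixel with the median of the valid (non-zero) readings in its 3x3 neighbourhood.
// Illustration only; not the filtering pipeline from Tosi's project.
#include <algorithm>
#include <cstdint>
#include <vector>

std::vector<uint16_t> medianFilterDepth(const std::vector<uint16_t>& depth,
                                        int width, int height) {
    std::vector<uint16_t> out(depth.size(), 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            std::vector<uint16_t> valid;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                    uint16_t v = depth[ny * width + nx];
                    if (v != 0) valid.push_back(v);  // skip holes (invalid readings)
                }
            }
            if (valid.empty()) continue;  // no information nearby: leave the hole
            std::nth_element(valid.begin(), valid.begin() + valid.size() / 2,
                             valid.end());
            out[y * width + x] = valid[valid.size() / 2];
        }
    }
    return out;
}
```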

What do you think this gesture recognition tech will look like in, say, 3 years? What will we be able to do that we can't do right now? What are you looking forward to most?

In our experiments, the current cameras are accurate enough for general gaming, and with reasonably powerful computing resources (which are not so expensive nowadays) the performance is good enough too. What we are looking for next is an evolution in long-distance measurement.

Do you think people will soon be able to bring their own camera and projector set-up to, say, a table at a park, and just do it themselves? Could it be made cheap and portable enough?

This is a very good question, and a good idea. Unfortunately, the biggest problem with using our system outdoors is that a regular projector cannot produce a visible image in strong light. But, with the spread of VR devices, you can easily imagine replacing the projector with a VR display system and using the Intel® RealSense™ camera to track the players. Intel® RealSense™ technology works well indoors and outdoors, so it could be an idea to explore.

The Intel® Developer Zone offers tools and how-to information to enable cross-platform app development through platform and technology information, code samples, and peer expertise in order to help developers innovate and succeed. Join communities for the Internet of Things, Android*, Intel® RealSense™ Technology, Modern Code and Game Dev to download tools, access dev kits, share ideas with like-minded developers, and participate in hackathons, contests, roadshows, and local events.