How Will the Robots of the Future See Us?
VFX artists make an educated guess about machine vision in Oliver Daly's new sci-fi short, 'Miles.'
All images courtesy Territory Studio.
Whether they're relentlessly attacking or helping us get dressed, robots will use cameras, sensors, and software to look at the world, which means they'll see things from a very different perspective than humans do. The interface design in animator-turned-filmmaker Oliver Daly's new boy-meets-robot short, Miles, is a thought experiment in how tomorrow's machines could "see" the world through data.
"Although developed by humans, [a robot] might perceive human faces differently," posits Territory Studio Head of 3D Peter Eszenyi, who was part of the team that built the POV shots for a robotic character named M.A.X. The team also includes Territory's Creative Director David Sheldon-Hicks, motion designer Ryan Rafferty-Phelan, and artist Andrew Gant, whose work with Kinect we've featured on The Creators Project before. Here's a shot-by-shot breakdown of the interface design they built for the film:
In contrast to Territory's process of creating large-scale films like Guardians of the Galaxy and Jupiter Ascending, Hicks explains that working on Daly's small, crowd-funded short film allowed for a much more direct line of communication with the director, a huge boon when it came to developing the look and feel of M.A.X.'s robo-vision. "Oliver had a strong vision of what he needed, but gave us enough freedom to really innovate, and he was open to being challenged, but equally challenged back. I think due to the relatively small scale of team, we were all able to take risks and experiment, something that’s not always easy on a big production," he says. As a result, they could try new things with the Kinect to visualize the abstract ideas they'd been working with.
"Oliver and I wanted to create a digital dreamscape of wires and points," Gant recounts. "Like the team at Territory, we all shared a love of fractals, frames, pointclouds, and the like. We loved getting lost inside the detailed world of a 3D scan on a computer, and each scan from the Kinect Depth Kit has its own beautiful imperfections and details, which are always fun to explore within every 3D program. This was the basis for our original look for the eyesight of M.A.X. (our hero robot)."
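The point clouds Gant describes come from back-projecting a depth image into 3D space. As a rough illustration of the idea (not Territory's actual pipeline), here is a minimal sketch in Python that turns a depth map into a point cloud using the standard pinhole camera model; the intrinsic values are assumptions in the ballpark of a first-generation Kinect, not calibrated numbers.

```python
import numpy as np

# Assumed intrinsics, roughly Kinect-v1-like; real values come from calibration.
FX, FY = 594.2, 591.0   # focal lengths in pixels (hypothetical)
CX, CY = 320.0, 240.0   # principal point for a 640x480 sensor (hypothetical)

def depth_to_pointcloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an (H, W) depth map in meters to an (N, 3) point cloud.

    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Pixels with zero depth (no sensor reading) are dropped -- these holes
    are the "beautiful imperfections" visible in raw Kinect scans.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]   # keep only valid readings

# Toy 4x4 "scan": a flat surface 2 m away with one dropped reading.
demo = np.full((4, 4), 2.0)
demo[0, 0] = 0.0
cloud = depth_to_pointcloud(demo)
print(cloud.shape)   # (15, 3): 16 pixels minus the one hole
```

Each surviving row is one 3D point; rendering those points as sprites or wireframes is what gives scanned footage its fragmentary, data-stream look.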
Since M.A.X. turns out to be a military experiment—perhaps the fictional great-grandson of Boston Dynamics' robot dog, Spot—fractals and glitches aren't the entire basis for its vision: Eszenyi researched military heads-up displays and the vision of predatory animals, and drew upon Territory's previous work on sci-fi films like Ridley Scott's Prometheus, to develop the theory of robot vision he settled on for Miles.
We already know about robots that can draw like humans and that can shoot a perfect game of beer pong, but how advanced is robotic eyesight today? "I think we’re closer to this type of sophistication than we realize," Hicks says. "The footage of remote controlled predator drones and walking robot dogs that can be seen on YouTube give us an idea of what can be achieved already."
Gant takes it a step further, suggesting that "Miles-like technology is definitely already out there, which makes this thought fun and scary at the same time..." He adds, "I would love to get inside the ‘minds’ of these bots and see what the data streams look like before they are translated for humans—I’m sure it's beautiful!"
See their theoretical robot sight in action in the Miles short film, which you can watch below. Daly and his team are hoping to expand the all-too-brief narrative into a feature-length version, for which Daly is still seeking funding.