Last winter's Rogue One came pretty close to bringing an actor back from the dead. Peter Cushing, who died in 1994, appears in CGI form, "reprising" his role as Grand Moff Tarkin from the original Star Wars. Cushing's longtime secretary was overwhelmed when she saw the digital version of the deceased actor.
There's always something unsettling about these exercises in recreating live-action characters in digital form—they never quite manage to climb out of the uncanny valley, so they end up being a little creepy rather than life-like. That may be about to change. A computer scientist at the University of British Columbia is working to create an algorithm that simulates the movement of skin, one of the last hurdles to creating truly life-like digital characters (still no word on when CGI eyes will stop being so scary).
Dinesh Pai and his team have combined new and old tech to capture how skin folds, stretches, wrinkles, and bounces on the body. The process is a lot like traditional motion capture, but Pai's focus is more specific. "Typically when people do motion capture, they're not trying to capture the motion of the skin, they're trying to capture the motion of the bones and the skeleton of the body," he said.
I was surprised to hear how much of this work has to be done from scratch. "One of the main challenges is that it's really never been done before, so we are developing a lot of the measurement methods and estimation methods for the first time," Pai told me.
Skin is also inherently tricky, and its qualities differ from person to person. It's thin, it stretches, it deforms, and it moves in individualistic ways, making its movement really difficult to map effectively. "There isn't one thing called 'skin'," Pai said.
Digital people have been popping up a lot in live-action movies, with the first serviceable human stand-in coming in the form of a de-aged Jeff Bridges in 2010's Tron: Legacy. Bridges' digital counterpart was incredibly detailed and realistic, but still didn't blend seamlessly with his genuine human co-stars.
The tech had already gotten noticeably better five years later when Ant-Man recreated a Wall Street-era CGI Michael Douglas. That same year, Furious 7 featured a number of shots with a CGI Paul Walker after the actor died suddenly part-way through filming. It was a small step from there to animating Cushing entirely for Rogue One.
Now that we have a solid (if imperfect) baseline, it'll be great to see Pai's research iron out those last remaining kinks in digital characters. Another Tron sequel is rumoured to be in the works, and smart money's on Lucasfilm and Disney continuing to ride the nostalgia wave in future Star Wars outings.
Then there are non-human characters onscreen. Pai's company Vital Mechanics has already licensed its software to the visual effects studio Image Engine Design for their creature effects in last year's Fantastic Beasts and Where to Find Them. More films will likely follow suit. If James Cameron wants his Avatar sequels to be as innovative as the first installment, he might want to consider this kind of CGI skin. (He seems to be banking on glasses-free 3D at the moment.)
Pai hopes that his work will extend beyond the movie screen. "Our main goal is actually to try to build realistic models of human skin and soft tissues not just for visual effects," he said. "We are interested in seeing how a better model of human skin and soft tissues can be used to design better products." Those products may include things like wearable technologies and hand-held tools. Down the line this could lead to advances in plastic surgery, said Pai.
Special effects serve a practical function for the research, too. "There are some things that you know that you're doing right when it looks good in animation," said Pai. "The test of 'does it look good on the screen?' is itself actually a very valuable one."