We've come a long way from the days of infuriating tank controls in video games, but it can still often feel like you're steering a whale rather than a stealthy warrior in Assassin's Creed. Now a new method published by Daniel Holden, a researcher at Ubisoft Montreal, uses machine learning to solve one of the biggest headaches still plaguing developers.
Right now, nearly every game relies on a character control system that does its best to seamlessly blend a bunch of fixed animation loops together. The problem is that this method tends to fall apart when it has to account for the near-infinite variety of player inputs and environments. Check out the latest Mass Effect for some less-than-seamless results.
Holden and his co-authors, Taku Komura and Jun Saito, suggest a new method that uses a neural network—a computer program that teaches itself to estimate things from input data (in this case, body positions and where to go next). Rather than shifting between a collection of canned animations, their system "takes as input user controls, the previous state of the character, the geometry of the scene, and automatically produces high quality motions that achieve the desired user control."
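To make that description concrete, here is a toy sketch (not the authors' code, and with purely illustrative input sizes and names) of the core idea: a single trained network maps the gamepad controls, the character's previous pose, and samples of the surrounding terrain directly to the next pose, with no blending of pre-baked animation clips.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions -- the real system's inputs are richer.
N_CONTROL = 4    # e.g. desired direction and speed from the player
N_STATE = 60     # e.g. previous-frame joint positions
N_TERRAIN = 12   # e.g. ground heights sampled around the character
N_POSE = 60      # predicted next-frame joint positions
HIDDEN = 256

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(0, 0.1, (HIDDEN, N_CONTROL + N_STATE + N_TERRAIN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (N_POSE, HIDDEN))
b2 = np.zeros(N_POSE)

def next_pose(controls, prev_state, terrain):
    """One forward pass: user controls, previous character state, and
    scene geometry in; the character's next pose out."""
    x = np.concatenate([controls, prev_state, terrain])
    h = np.maximum(0.0, W1 @ x + b1)  # ReLU hidden layer
    return W2 @ h + b2

pose = next_pose(np.zeros(N_CONTROL), np.zeros(N_STATE), np.zeros(N_TERRAIN))
print(pose.shape)
```

Run every frame, a loop like this replaces the usual clip-blending machinery: the network itself decides how the character moves given where the player wants to go and what the ground looks like.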
The co-authors will be presenting the paper at this year's SIGGRAPH, a conference for computer graphics tech.
The impressive demos in the video speak for themselves, though. Instead of the glitchy stanky-leg animations that typically occur whenever a player suddenly switches directions, the neural network method allows for seamless sidestepping that would make Michael Jackson proud. Whether the new method becomes common practice in the industry remains to be seen. But here's hoping it leads to Assassin's Creed and Mass Effect characters that move around like humans rather than twerking robots.