Is it possible to use motion capture data to refine character movements?


I'm no expert when it comes to this kind of thing, and what BM have accomplished already is incredibly impressive (especially considering they laid the groundwork 10 years ago), but I've seen some developments lately that seem like they were specifically designed for a project like this.

This does use some neural network shenanigans, but it's running in real time:
There's a more in-depth look at that system here.

To summarize, it's a deep learning model trained extensively on mocap data. The characters are basically just ragdolls, and the AI learns to apply forces to the skeleton to move them around - which seems reminiscent of how Exanima handles things.
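Just to illustrate the idea (this is my own minimal sketch, not the actual system from the video or anything Bare Mettle uses): the core of the technique is a policy function that maps the ragdoll's current state (joint angles and velocities) to one torque per joint, with the physics integration doing the rest. In a real system the policy is a trained neural network rewarded for matching mocap clips; here it's just an untrained linear map with made-up weights, which is enough to show the control loop.

```python
# Sketch of torque-driven ragdoll control. All names and numbers here are
# illustrative assumptions, not taken from any real engine.
import math
import random

random.seed(0)

NUM_JOINTS = 4
DT = 1.0 / 60.0  # 60 Hz simulation step

# Hypothetical "policy" weights: torque[i] = sum_j W[i][j] * state[j].
# A trained system would learn these from mocap data instead.
W = [[random.uniform(-0.1, 0.1) for _ in range(2 * NUM_JOINTS)]
     for _ in range(NUM_JOINTS)]

def policy(state):
    """Map the state vector [angles + velocities] to one torque per joint."""
    return [sum(w * s for w, s in zip(row, state)) for row in W]

def step(angles, velocities):
    """One explicit-Euler step of a torque-driven skeleton.

    Assumes unit inertia per joint and adds simple damping so the
    sketch stays numerically stable.
    """
    state = angles + velocities
    torques = policy(state)
    new_vel = [v + (t - 0.5 * v) * DT for v, t in zip(velocities, torques)]
    new_ang = [a + v * DT for a, v in zip(angles, new_vel)]
    return new_ang, new_vel

# Run the loop: the character "moves" purely through applied torques.
angles = [0.0] * NUM_JOINTS
velocities = [0.1] * NUM_JOINTS
for _ in range(120):  # two seconds of simulation
    angles, velocities = step(angles, velocities)
```

The point is that animation falls out of physics plus a control signal, rather than being keyframed - which is why this approach seems like a natural fit for a simulation-driven game.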

There's also this, which is a similar technique applied to a more traditional non-simulated character.

Now even if it is possible to implement something like this, I imagine it would take a very long time to refine and get the game back into a somewhat balanced state - not to mention taking time away from content development and actually finishing up Exanima. Still, I thought it might be worth considering in the long term.


The sad fact is that technology will outpace this game, and you'll be able to reproduce all the fancy physics they spent 10+ years on in a few weeks with AI. And since they built their engine themselves, they won't be able to use any of this tech anyway.
