Sebastian Starke

R&D [SIGGRAPH 2018] Mode-Adaptive Neural Networks for Quadruped Motion Control


Awesome! Is the 3ms you show the required runtime for the ANN to generate motion from an input, as seen on the buttons at the top of the screen? Do you think it can become faster?

Personally, I think the result is worth the time, but game designers should start thinking about how this tech can create new gaming experiences while dealing with its limitations on quantity.


No, that is the m/s (meters per second) value, a user control that specifies how fast the character should move. The locomotion gait then results automatically from this ;)
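To make that concrete, here is a purely hypothetical C++ sketch of how a user-specified target speed in m/s might be packed into the network's control input together with trajectory samples. The struct, field names, and layout are illustrative assumptions, not the actual input format from the paper; the point is simply that no explicit gait label appears in the input, so the gait has to emerge from speed and trajectory.

```cpp
// Hypothetical sketch only: field names, sizes and layout are assumptions for
// illustration, not the control representation actually used in the paper.
#include <Eigen/Dense>
#include <iostream>
#include <vector>

struct ControlInput {
    float targetSpeed;                        // user-specified speed in m/s
    std::vector<Eigen::Vector2f> trajectory;  // sampled future root positions
    std::vector<Eigen::Vector2f> directions;  // sampled future facing directions
};

// Flatten the control input into a single network input vector.
// Note that no discrete gait label (walk/trot/gallop) is included:
// producing the right gait is left to the network.
Eigen::VectorXf toInputVector(const ControlInput& c) {
    const int n = static_cast<int>(c.trajectory.size());
    Eigen::VectorXf x(1 + 4 * n);
    int i = 0;
    x(i++) = c.targetSpeed;
    for (int k = 0; k < n; ++k) {
        x(i++) = c.trajectory[k].x();
        x(i++) = c.trajectory[k].y();
        x(i++) = c.directions[k].x();
        x(i++) = c.directions[k].y();
    }
    return x;
}

int main() {
    ControlInput c{2.5f,
                   {{0.0f, 0.0f}, {0.5f, 0.0f}, {1.0f, 0.1f}},
                   {{1.0f, 0.0f}, {1.0f, 0.0f}, {0.99f, 0.1f}}};
    Eigen::VectorXf x = toInputVector(c);
    std::cout << "input vector size: " << x.size() << "\n";  // 1 + 4*3 = 13
    return 0;
}
```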


Ouch, haha :)

But some performance numbers would be very interesting for game devs! (They're not mentioned in the paper either.)

I work on physical simulation of bipeds. There, balancing is such a tight constraint that natural locomotion comes out automatically, and no machine learning seems necessary for the basics. But for more complex motion beyond walking/running, I think both approaches need to be combined.


Yeah, these guys at UBC do some nice work - just look for DeepMimic, which will be presented at the same venue where we are presenting our quadruped research. As for performance numbers: the computational requirement for the network is about 2ms per frame on a gaming laptop. It runs on the CPU right now with the Eigen library - running it on the GPU gives a significant speedup, of course. From what I have heard, NVIDIA made a demo using the PFNN on humans with ~100 characters in parallel - so I think it's already applicable for practical use :)
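For context on where a per-frame number like 2ms comes from, below is a minimal, self-contained C++/Eigen sketch of a dense feed-forward pass on the CPU. The layer sizes, the ELU activation, and the timing are assumptions chosen for the example - this is not the actual MANN inference code, just an illustration of the kind of matrix-vector work Eigen ends up doing each frame.

```cpp
// Illustrative only: layer sizes and activation are assumptions, not the
// network from the paper. Shows the per-frame cost of dense CPU inference.
#include <Eigen/Dense>
#include <chrono>
#include <iostream>

using Eigen::MatrixXf;
using Eigen::VectorXf;

// ELU activation: x for x > 0, exp(x) - 1 otherwise.
static VectorXf elu(const VectorXf& x) {
    Eigen::ArrayXf a = x.array();
    return ((a > 0.0f).select(a, a.exp() - 1.0f)).matrix();
}

int main() {
    // Hypothetical dimensions: ~480 inputs, two hidden layers of 512, ~360 outputs.
    MatrixXf W1 = MatrixXf::Random(512, 480);  VectorXf b1 = VectorXf::Random(512);
    MatrixXf W2 = MatrixXf::Random(512, 512);  VectorXf b2 = VectorXf::Random(512);
    MatrixXf W3 = MatrixXf::Random(360, 512);  VectorXf b3 = VectorXf::Random(360);

    VectorXf input = VectorXf::Random(480);

    auto t0 = std::chrono::high_resolution_clock::now();
    VectorXf h1  = elu(W1 * input + b1);
    VectorXf h2  = elu(W2 * h1 + b2);
    VectorXf out = W3 * h2 + b3;               // linear output layer
    auto t1 = std::chrono::high_resolution_clock::now();

    std::cout << "forward pass: "
              << std::chrono::duration<double, std::milli>(t1 - t0).count()
              << " ms, output size " << out.size() << "\n";
    return 0;
}
```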


It's really amazing to see how animation tools are evolving; I can't wait to see what games can be made from this.

So what is the expected and hoped-for path that this awesome software will take after SIGGRAPH? Are you planning on making your own software with this, or is your aim for it to be used directly with game engines?

Looks brilliant.


I'm just working on my PhD, that's all ;) However, I think there is a good chance that people might want to integrate this because it's developed directly inside Unity. We will continue research on other relevant projects for character animation using AI techniques - so let's see... :)


Hi Sebastian, this is very interesting! Is there an electronic version of the paper that we can read? I would like to learn more. Thanks!

