Postures and Gestures: an approach to skeletal animation


I've been looking into skeletal animation - my 3D models being mainly humanoid, built in MS3D - and wondering about ways to store the information. I think I've come up with a method, and I wanted to see what people thought of it before I implement it.

Assume a skeletal model where each vertex is associated with a single bone, and the bones form a simple tree with a single root. Bone orientation information can then be split into two groups: poses (static arrangements of bones) and gestures (the more traditional 'keyframed' movement of bones over time). Any given pose or gesture affects only a subset of the bones in the skeleton; as such, the legs of my model could be posed independently of the arms. This should allow complex poses to be set up by 'mixing' other poses together, as each pose/gesture has a blending weight.

How's this as a general plan?

Superpig - saving pigs from untimely fates, and when he's not doing that, runs The Binary Refinery.
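A minimal sketch of the pose-mixing idea described above, in C++. It assumes bones are identified by index and, purely for illustration, stores a single float angle per bone rather than a full quaternion or matrix; the struct and function names are hypothetical, not from any existing library.

```cpp
#include <map>
#include <vector>

// A pose affects only a subset of bones: bone index -> joint angle.
// (A single float per bone for illustration; a real skeleton would
// store a quaternion or rotation matrix instead.)
struct Pose {
    std::map<int, float> boneAngles;
    float weight;  // blending weight used when mixing poses
};

// Mix several poses: each bone's final angle is the weighted average
// of the angles from the poses that actually define that bone. Bones
// untouched by every pose simply don't appear in the result.
std::map<int, float> mixPoses(const std::vector<Pose>& poses) {
    std::map<int, float> accum, totalWeight;
    for (const Pose& p : poses) {
        for (const auto& kv : p.boneAngles) {
            accum[kv.first] += kv.second * p.weight;
            totalWeight[kv.first] += p.weight;
        }
    }
    std::map<int, float> result;
    for (const auto& kv : accum)
        result[kv.first] = kv.second / totalWeight[kv.first];
    return result;
}
```

Because overlapping poses are averaged by weight, a "legs" pose and an "arms" pose with disjoint bone sets combine without interfering, while two poses touching the same bone blend smoothly.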

A problem I see - and I don't know how this is usually handled - is that an animation affecting one part of the model often ought to influence other parts of the model as well for the result to look natural.

For instance, the legs are running while the torso is looking up. The running stance should change a little depending on the position of the torso.

This might not have been so important a few years back, but I think the quality of today's games demands that this kind of realism be achieved.

I'm thinking that in some cases, some joints could actually store more than one orientation: one used for basic movement, while the other is used as an adjustment in other animations. How that would make the model look when blending between standing still, walking, jogging and an all-out run, I have no idea.
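One way to picture the two-orientation idea is a base orientation plus an additive adjustment, scaled by a blend factor. This is a hypothetical sketch (simplified to one angle per joint), not a description of how any particular engine does it:

```cpp
// A joint carrying two orientations: the primary animation's angle,
// and an extra angle used as an adjustment by other animations.
struct Joint {
    float baseAngle;    // orientation from the basic movement
    float adjustAngle;  // correction layered on top of it
};

// Blend factor 0 gives the pure base movement; 1 applies the full
// adjustment. Intermediate values could ease between stances.
float effectiveAngle(const Joint& j, float adjustBlend) {
    return j.baseAngle + j.adjustAngle * adjustBlend;
}
```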

[edited by - Waverider on May 6, 2003 12:17:17 PM]

Cheesegrater: basically, yes, making implementation fairly simple. However, a gesture doesn't have to have all the bones defined on every keyframe - that is, a gesture defines a set of bones which is a subset of the whole model, and each pose within that gesture uses a subset of *those* bones. Each bone is interpolated between the keyframes where it's actually defined.

Waverider: So perhaps a system for defining rules about which pose gets used when? Organizing poses and gestures into groups, and saying 'pick one from this group, depending on the model's current state'? That way, different running gestures could be used depending on the pose the top of the body is using.
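A tiny sketch of that rule idea: a group holds several variants of the same logical gesture, keyed by the model's current state, with a fallback. The state strings and gesture names here are made up for illustration.

```cpp
#include <map>
#include <string>

// A gesture group: variants of one logical action (e.g. different
// running gestures), selected by the model's current state.
struct GestureGroup {
    std::map<std::string, std::string> variantByState;
    std::string fallback;  // used when no rule matches the state

    const std::string& pick(const std::string& state) const {
        auto it = variantByState.find(state);
        return it != variantByState.end() ? it->second : fallback;
    }
};
```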

Of course, I can't animate for poop, so any talk of realism has to be taken with a pinch of salt.

Superpig
- saving pigs from untimely fates, and when he's not doing that, runs The Binary Refinery.

I don't know how to do it in MS3D, but DirectX's skinned mesh format allows you to assign more than one joint to a vertex using weights. So in a sense, when the legs move "independently", you can still account for the movement of the torso.
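That weighted multi-joint assignment is the standard linear blend skinning idea. A minimal sketch, with positions simplified to 1D floats and joint transforms reduced to offsets (a real implementation would multiply 3D vertex positions by weighted joint matrices):

```cpp
#include <vector>

// One joint's influence on a vertex; weights over all influences
// of a vertex are assumed to sum to 1.
struct Influence {
    int joint;
    float weight;
};

// Deform a vertex by blending the transforms (here: simple offsets)
// of every joint that influences it, scaled by the weights.
float skinVertex(const std::vector<Influence>& influences,
                 const std::vector<float>& jointOffsets, float v) {
    float out = 0.0f;
    for (const auto& inf : influences)
        out += inf.weight * (v + jointOffsets[inf.joint]);
    return out;
}
```

A vertex near the hip, for example, could take half its movement from a leg bone and half from the torso, so torso motion bends the skin smoothly instead of leaving a hard crease.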

Mete Ciragan reports that it's being written into MS3D version 2. Which is all good, because as a registered MS3D 1 user I might get a discount.
