ne0ndrag0n

Skeletal Animation Transforms are Incorrect: Totally Stumped!


I've been working on skeletal animation for five weeks now and have narrowed the problem down to the vertex shader. The problem: everything looks correct according to the data sent to the GPU and to what Blender is telling me. Only, it isn't.

 

To start, I have a simple, 12-vertex mesh with four bones inside:

O9guFCw.png

 

In the initial animation I have, the top part (the brighter half) has the fourth bone within it. It simply tilts to the left and back. Here is the animation at the keyframe where it is fully-tilted:

Y59wIvN.png

 

To test the bone system in my game engine, even independent of any animation, I simply set the top bone to this keyframe statically by rotating it ~89.113 degrees along the Y-axis. This is identical to the way the top bone is rotated in Blender. Unfortunately, I don't get identical results:

DezvaTh.png

 

The transform ends up looking highly exaggerated and incorrect. Here's what I've already verified:

  • Weights and bone indices per vertex are identical to the way Blender portrays them. I have verified this by using apitrace to check the values of the "in" attributes in my vertex shader.
  • The uniform of all bone transforms is being properly sent to the shader (the upload path is sketched just after this list). My shader has bone 0 as always identity (for boneless models). Bone 1 is the root bone at the base of this rectangular mesh and affects the lower verts the most. Bone 2 is unused and not placed on any vertex; bones 3 and 4 are both properly associated with the top verts of the mesh. All transforms are identity save for the top bone, which is only "rotate -89.113 along the Y-axis". Also verified with apitrace.
  • Model loads from file correctly (FBX using Assimp). There are duplicate vertices with the same location for different normals, but these all have identical weights per location.
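
The upload path is roughly the following; this is a sketch with illustrative names (uploadBones, program), not my exact engine code:

// Sketch: uploading the mat4 array for the "bones" uniform.
#include <vector>
#include <GL/glew.h>                // or whichever GL loader is in use
#include <glm/glm.hpp>
#include <glm/gtc/type_ptr.hpp>     // glm::value_ptr

void uploadBones( GLuint program, const std::vector< glm::mat4 >& bones )
{
  glUseProgram( program );
  GLint location = glGetUniformLocation( program, "bones" );
  // One call uploads the whole array; GL_FALSE because GLM matrices are already column-major.
  glUniformMatrix4fv( location, (GLsizei)bones.size(), GL_FALSE, glm::value_ptr( bones[ 0 ] ) );
}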

 

Next, I aimed to narrow this down by trying to match the info Blender gives me for a single iteration of the vertex shader. To test this, I take the values sent to the vertex shader (captured with apitrace) and perform the same transforms in GLM, where I can log the results.

 

First, I take the model and place it in the desired keyframe (with the -89.113 deg rotation). Then, to get its vertex locations in this keyframe, I permanently apply the deform to the mesh. Next, I pick one vertex and test it using the same algorithm I use in my vertex shader, on the CPU, where I can print the values. If this matches what is in Blender, then my method is right.

 

The vertex I select is (-0.5,-0.5,4.0). At this keyframe (according to Blender), it should be (-0.95638, -0.5, 2.63086). Meaning, when all is said and done with my transforms, this is where the vertex shader should put it:

xr5ikfB.png

 

So copying off my vertex shader...

#version 330 core
layout (location = 0) in vec3 position; // The position variable has attribute position 0
layout (location = 1) in vec3 normal; // This is currently unused
layout (location = 2) in vec2 texture;
layout (location = 3) in ivec4 boneIDs;
layout (location = 4) in vec4 boneWeights;

out vec2 fragTexture;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

uniform mat4 bones[ 16 ];

void main()
{
  mat4 boneTransform =
    ( bones[ boneIDs[ 0 ] ] * boneWeights[ 0 ] ) +
    ( bones[ boneIDs[ 1 ] ] * boneWeights[ 1 ] ) +
    ( bones[ boneIDs[ 2 ] ] * boneWeights[ 2 ] ) +
    ( bones[ boneIDs[ 3 ] ] * boneWeights[ 3 ] );

  mat4 mvp = projection * view * model;
  gl_Position = mvp * boneTransform * vec4( position, 1.0f );

  fragTexture = texture;
}

I created this test code with GLM, just to compute the "boneTransform" matrix and transform the vertex (-0.5, -0.5, 4.0):

// Headers this snippet needs (glm::toMat4 and glm::to_string are GTX extensions)
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>   // glm::angleAxis
#include <glm/gtx/quaternion.hpp>   // glm::toMat4
#include <glm/gtx/string_cast.hpp>  // glm::to_string

glm::mat4 id( 1.0f ); // ID 0
glm::mat4 bone( 1.0f ); // ID 1
glm::mat4 bone002( 1.0f ); // ID 2
glm::mat4 bone003( 1.0f ); // ID 3

// Keyframe is set to rotate bone003 -89.113 degrees along Y
bone003 *= glm::toMat4( glm::angleAxis( (float)glm::radians( -89.113 ), glm::vec3( 0.0f, 1.0f, 0.0f ) ) );

glm::mat4 xform =
 ( bone002 * 0.087f ) +
 ( bone003 * 0.911f ) +
 ( id * 0 ) +
 ( id * 0 );

glm::vec4 point = xform * glm::vec4( glm::vec3( -0.5f, -0.5f, 4.0f ), 1.0f );

Log::getInstance().debug( "Assert", glm::to_string( point ) );

The result I get is clearly not where Blender says it should be. However, it looks more like where the vertex is going in the original screenshot, which is a start, I guess:

7elrGdS.png

 

And just for good measure, here's the keyframe for that bone as well as its bone weights for the chosen vertex as reported by Blender. Note how the bone transform matrix names correlate with the items above (bone001 is deliberately omitted as it has no influence on any vert):

SkE1dLy.png

 

So... I'm stumped. What am I missing or doing wrong? Is Blender giving me incorrect information or weighting it wrong? That's the only thing I can think of at this point.

Edited by ne0ndrag0n


This behaviour looks a lot like something I was dealing with myself about a month ago. If the joint rotation's origin isn't correct, the vertices of the mesh stretch rather than rotate around a common point. In my case it was due to how I was composing the final transformation matrix for the bone.

 

The bone has a bind-pose transformation matrix, and I needed to apply the desired local rotation. For it to work correctly I had to decompose the bind-pose matrix into separate rotation and translation matrices and apply the local rotation in between the two.

 

E.g.: if the bind-pose matrix is made up of R x T and the desired local rotation is Y, the final transform needed to be R x Y x T, not Y x (R x T) or (R x T) x Y.
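
Written out with GLM, keeping the same left-to-right order as above, the sandwich looks roughly like this (just a sketch; the matrices here are illustrative placeholders, not code from a real project):

#include <glm/glm.hpp>

// In practice these come from decomposing the bone's bind pose and from the animation keyframe.
glm::mat4 bindRotation( 1.0f );    // R
glm::mat4 bindTranslation( 1.0f ); // T
glm::mat4 localRotation( 1.0f );   // Y

// The local rotation sits between the bind pose's rotation and translation parts.
glm::mat4 finalBoneTransform = bindRotation * localRotation * bindTranslation;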

 

It's confusing, because none of the sources I consulted mentioned the need for anything like this, but without it, I couldn't get it to work properly.


Unfortunately I don't think that's the issue; I'm pretty sure the matrices go out properly. The only bone that changes is Bone.003; this matrix goes out as "rotate -89.113 degrees along the Y axis". The brief test I did is in an "artificial" environment where I set up the matrices as they are supposed to go into the vertex shader; I'm aiming to get the same result Blender has for a specific vertex.

 

What I can't seem to understand is what Blender is doing to make this vertex transform to the proper location of (-0.95638, -0.5, 2.63086). When I apply the theory, I get an invalid result. Where is Blender even getting the valid result, and am I doing this properly?

Edited by ne0ndrag0n


Actually that is your issue:

 

bone003 *= glm::toMat4( glm::angleAxis( (float)glm::radians( -89.113 ), glm::vec3( 0.0f, 1.0f, 0.0f ) ) )

This is taking an input vertex and rotating the vector from the origin of the model to that point. You must translate so the bone sits at (0,0,0), apply the rotation so that it operates on the vector from the bone origin to the actual vertex, and then translate back to the bone origin.
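
In GLM terms that would look roughly like this (a sketch; the bone-origin value here is only illustrative, use the actual bind-pose position of the bone being rotated):

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>  // glm::translate
#include <glm/gtc/quaternion.hpp>        // glm::angleAxis, glm::mat4_cast

glm::vec3 boneOrigin( 0.0f, 0.0f, 3.0f ); // illustrative

glm::mat4 rotation = glm::mat4_cast( glm::angleAxis( glm::radians( -89.113f ), glm::vec3( 0.0f, 1.0f, 0.0f ) ) );

// Applied right to left: move the bone origin to (0,0,0), rotate about Y, move back.
glm::mat4 boneTransform =
  glm::translate( glm::mat4( 1.0f ), boneOrigin ) *
  rotation *
  glm::translate( glm::mat4( 1.0f ), -boneOrigin );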

 



Unless I misunderstood something, isn't this what I'm already doing? The reason "bone003" is set this way in the "unit-test"-like function above is that this is what is sent to the shader after my pose transforms are computed. In the "live" part of the code, each bone matrix sent to my shader (uniform mat4 bones[16]) is composed using this method:

 

Assume the above model has four bones, located at (0,0,0) ("Bone"), (0,0,1) ("Bone.001"), (0,0,2) ("Bone.002"), and (0,0,3) ("Bone.003"). The bind pose is set as these four transformation matrices in a hierarchy in a "Pose" object. When asking the pose object for a given matrix (let's say Bone.002), the matrix is computed relative to its parents and returned. So by default, asking the pose object for "Bone.003" will give you a transform matrix that translates an object by Z+3. There is another pose object as well, the current pose, and by default this is just a copy of the bind pose object.

 

When sending an index of the bone uniform, I multiply the inverse of a bind-pose bone by the same bone in the current pose. So for Bone.003 this results in:

shaderBone.003 = inverse(bindPoseBone * bindPoseBone.001 * bindPoseBone.002 * bindPoseBone.003) 
                      * (currentPoseBone * currentPoseBone.001 * currentPoseBone.002 * currentPoseBone.003)

With no animation playing this just results in an identity matrix. With currentPoseBone.003 set to rotate ~89 degrees along Y (as well as its original translation of Z+3), the inverse of bindPoseBone.003 strips out the translation, leaving only the 89 degree rotation.
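
Sketched in GLM (illustrative names, not the engine's actual API; each bindPose*/currentPose* matrix is that bone's local transform relative to its parent), the computation is:

// Accumulated (model-space) transform of Bone.003 in each pose, multiplied down the hierarchy.
glm::mat4 bindChain    = bindPoseBone * bindPoseBone001 * bindPoseBone002 * bindPoseBone003;
glm::mat4 currentChain = currentPoseBone * currentPoseBone001 * currentPoseBone002 * currentPoseBone003;

// What gets uploaded into Bone.003's slot of the "bones" uniform.
glm::mat4 shaderBone003 = glm::inverse( bindChain ) * currentChain;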

 

In "bone003" from what you cited above, this is what I did: created a transformation matrix that ONLY rotates the vertex 89 degrees around Y. And that's what I assumed the bone uniform had to be. Is this an incorrect method of calculating the bone uniforms?

Edited by ne0ndrag0n


 

"created a transformation matrix that ONLY rotates the vertex 89 degrees around Y."

 

Correct, it will rotate the vector from (0,0,0) to the vertex around the global Y axis. You want to rotate the vertex about the bone's axis. So you must bring the vertex to the bone's origin by subtracting the bone translation, then rotate, then translate back to the bone position.

 

My matrices are loaded as a set of translations and a set of rotations. I apply the inverse of the bone's bind-pose translation, then the inverse bind-pose rotation, then the actual bone rotation for the frame (in your instance a Y-rotation). Then I take it back to the bone's bind-pose position. Finally I apply any actual translation that occurred for the animation frame (for instance, if jumping, this would be the root bone translating upwards).

// Reading right to left: inv_translate (inverse bind-pose translation), inv (inverse bind-pose
// rotation), the frame's own rotation, translate (back to the bind-pose position), and
// frame_translate (any translation the animation itself applies, e.g. a jumping root bone).
*(bone->Animation_Frames[j]) = frame_translate*translate*(*bone->Animation_Frames[j])*inv*inv_translate;

All you have applied in your test case is one of those matrices, which corresponds to a Y-rotation about the center of your model, not the center of Bone 3.
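
Applied to the GLM test snippet earlier in the thread, that would look roughly like the following. This is only a sketch: it assumes Bone.003's bind-pose origin is at (0,0,3) and that Bone.002 stays identity at this keyframe, and whether the result then matches Blender exactly also depends on the exact weights and on the rotation really being about global Y.

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>  // glm::translate
#include <glm/gtc/quaternion.hpp>        // glm::angleAxis, glm::mat4_cast

// Rotate about Y *at Bone.003's origin*: bring the origin to (0,0,0), rotate, move back.
glm::vec3 bone003Origin( 0.0f, 0.0f, 3.0f );
glm::mat4 rotation = glm::mat4_cast( glm::angleAxis( glm::radians( -89.113f ), glm::vec3( 0.0f, 1.0f, 0.0f ) ) );
glm::mat4 bone003 =
  glm::translate( glm::mat4( 1.0f ), bone003Origin ) *
  rotation *
  glm::translate( glm::mat4( 1.0f ), -bone003Origin );

glm::mat4 bone002( 1.0f ); // not rotated at this keyframe

// Blend exactly as the vertex shader does, with the weights Blender reports
// (note they sum to 0.998 here, not 1.0).
glm::mat4 xform = ( bone002 * 0.087f ) + ( bone003 * 0.911f );

glm::vec4 point = xform * glm::vec4( -0.5f, -0.5f, 4.0f, 1.0f );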
