Not a question... just my findings and work-around solutions!
I'm going to write this down and post it while it's fresh in my mind...
Having spent several months (I'm ashamed to say) writing my own class for parsing x files and displaying skinned meshes, I've realised that the thing that held me back the most was not being sure exactly how the data is stored in the x file.
I read the excellent chapter on skinned meshes in Frank D. Luna's "Introduction to 3D Game Programming with DirectX 11", which explained the mechanism of animating the mesh about as well as any article I've read, and there are plenty more tutorials out there. The one problem with it is that it uses a custom file type to hold the mesh data. This means you either have to write your own file exporter for your modelling software (Blender in my case) or learn to interpret one of the file types it already exports. I had an experiment with Assimp to import the data but got inconsistent results (certainly my fault, looking back at it!) so decided to bite the bullet and start from scratch.
I'm not going to write a tutorial on creating the whole class here. Like I say, there are plenty of tutorials explaining where and when to use the offset matrix, lerping between quaternions etc. I realised that most of my time had been wasted trying to work out why my final bone transformations were all at crazy angles when in fact I hadn't loaded the correct matrices and quaternions in the first place!
My biggest hurdles were finding out which of the many matrices listed in the x file were the offset matrices I needed, and converting from Blender's "Z is up" coordinate system to DirectX's "Y is up" system. This is straightforward enough for static meshes but becomes a bit more of a mental challenge with matrices and quaternions.
There are some parts of the x file that I ignored completely. I'm sure they are there for a good reason but I couldn't figure it out! (and I got my class to work without them)
So, this is to be about how I got the correct data out of my x file rather than how to animate the mesh....
To start with some notes on static meshes...
When exporting I used the following settings ticked in the Blender 2.67 x file exporter... (the 2.67 is important as it has a new script for exporting x files)
Export Meshes
Export Normals
Export UV Coordinates
Export Materials
Everything else is unticked - but you must make sure your Blender scene does not contain anything else such as cameras or lights. Also, I set the mesh object's shading property (Object tools) to flat rather than smooth. That gives us the same number of normals as faces in our mesh. I then wrote a function to smooth the normals once the mesh was loaded.
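To give an idea of that smoothing step, here's a minimal sketch (the Vec3 struct and function name are my own stand-ins, not D3DX types): it averages the normals of every vertex sharing a position, then renormalizes.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Smooth flat normals by averaging the normals of every vertex that shares
// (approximately) the same position, then renormalizing. O(n^2), fine for
// small meshes; a spatial hash would speed this up.
void SmoothNormals(const std::vector<Vec3>& pos, std::vector<Vec3>& nrm)
{
    const float eps = 1e-5f;
    std::vector<Vec3> sum(nrm.size(), Vec3{0, 0, 0});
    for (size_t i = 0; i < pos.size(); ++i)
        for (size_t j = 0; j < pos.size(); ++j)
            if (std::fabs(pos[i].x - pos[j].x) < eps &&
                std::fabs(pos[i].y - pos[j].y) < eps &&
                std::fabs(pos[i].z - pos[j].z) < eps) {
                sum[i].x += nrm[j].x; sum[i].y += nrm[j].y; sum[i].z += nrm[j].z;
            }
    for (size_t i = 0; i < nrm.size(); ++i) {
        float len = std::sqrt(sum[i].x * sum[i].x + sum[i].y * sum[i].y +
                              sum[i].z * sum[i].z);
        if (len > 0.0f)
            nrm[i] = Vec3{sum[i].x / len, sum[i].y / len, sum[i].z / len};
    }
}
```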
Your x file should have the mesh data start something like...
Frame MyMesh {
FrameTransformMatrix {
1.000000, 0.000000, 0.000000, 0.000000,
0.000000, 1.000000, 0.000000, 0.000000,
0.000000, 0.000000, 1.000000, 0.000000,
0.000000, 0.000000,-0.407193, 1.000000;;
}
Mesh { // MyMesh mesh
576;
0.195090; 0.980785;-1.000000;,
0.000000; 1.000000;-1.000000;,
0.000000; 0.000000; 1.000000;,
0.000000; 0.000000; 1.000000;,
.
.
.
.
I skipped the FrameTransformMatrix and started reading data at "576;". This is the number of vertices in the mesh. Each vertex's coordinates are then listed in batches of three. Now, here's the important bit... We are changing from Blender's coordinate system to DirectX's, so you need to read the three numbers in x, z, y order. So vertex 0 will be at:
x = 0.195090, y = -1.000000, z = 0.980785
I read these straight into my vertex data one vertex at a time.
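The swizzle is tiny but easy to get wrong, so here it is as a sketch (Vec3 and the function name are my own stand-ins):

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// Blender writes each position as three numbers in its Z-up order.
// Reading them in x, z, y order converts to DirectX's Y-up system:
// the first number stays x, the second becomes z, the third becomes y.
Vec3 ReadBlenderPosition(float a, float b, float c)
{
    return Vec3{a, c, b};  // x = a, y = c, z = b
}
```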
At the end of the vertex data the x file lists the number of faces in the mesh...
.
.
.
-1.000000;-3.000000; 1.000000;,
0.000000;-2.000000; 1.500000;;
192;
3;0,1,2,,
3;7,6,5,,
4;11,10,9,12,,
4;15,14,13,17,
.
.
.
In this case 192 faces. The following list becomes your index buffer, but you can't read them straight in like with the vertex buffer. This is because Blender stores faces made from 3 or 4 vertices. The first number in each line tells you how many vertices make up the next face.
This means the first face of my mesh has three vertices (vertex 0, vertex 1 & vertex 2)
The second face also has three vertices (vertex 7, vertex 6 & vertex 5)
The third face has four vertices and needs splitting into two triangles.
Happily this is easy enough...
The first one is made of vertex 11, vertex 10 and vertex 9.
The second is made of vertex 9, vertex 12 and vertex 11.
And so on... So my index data looks like:
0,1,2,7,6,5,11,10,9,9,12,11,15,14,13,13,17,15... and so on...
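A quick sketch of that face-splitting logic (function name is my own):

```cpp
#include <cassert>
#include <vector>

// Convert one of Blender's mixed triangle/quad faces into triangle indices.
// Each face arrives as its list of vertex indices; a quad (a,b,c,d) is split
// into triangles (a,b,c) and (c,d,a), matching the 11,10,9 / 9,12,11 example.
void AppendFace(const std::vector<int>& face, std::vector<int>& indices)
{
    if (face.size() == 3) {
        indices.insert(indices.end(), face.begin(), face.end());
    } else if (face.size() == 4) {
        int tris[6] = { face[0], face[1], face[2], face[2], face[3], face[0] };
        indices.insert(indices.end(), tris, tris + 6);
    }
}
```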
After this comes the mesh normals...
.
.
.
3;92,91,90,,
3;95,94,93,;
MeshNormals { // MyMesh normals
192;
-1.000000; 0.000000; 0.000000;,
0.000000; 1.000000;-0.000000;,
1.000000; 0.000000;-0.000000;,
.
.
.
The number of mesh normals should match the number of faces, 192 in this case. If you didn't set the Object shading property in Blender to flat these numbers won't match. I read these into a temporary std::vector<D3DXVECTOR3> in x, z, y order so that my std::vector looks like
myvector[0] = D3DXVECTOR3(-1.000000, 0.000000, 0.000000)
myvector[1] = D3DXVECTOR3(0.000000, -0.000000, 1.000000)
myvector[2] = D3DXVECTOR3(1.000000, 0.000000, 0.000000)
...
At the end of the list of normals comes a list that assigns each vertex with one of the normals we've just loaded...
.
.
.
0.866024; 0.000000; 0.500002;,
0.866026;-0.000000; 0.500000;;
192;
3;0,0,0,,
3;1,1,1,,
4;2,2,2,2,,
4;3,3,3,3,,
.
.
.
This means the normal for our first 3 vertices in our vertex buffer is the D3DXVECTOR3 stored in myvector[0],
the normal for our next 3 vertices is the D3DXVECTOR3 stored in myvector[1],
the normal for our next 4 vertices is the D3DXVECTOR3 stored in myvector[2],
the normal for our next 4 vertices is the D3DXVECTOR3 stored in myvector[3],
... and so on.
Once we have these copied to our vertex data we can delete our temporary std::vector. We won't need it again.
After this comes the Texture Coordinates, and this is nice and simple...
.
.
.
3;190,190,190,,
3;191,191,191,;
} // End of MyMesh normals
MeshTextureCoords { // MyMesh UV coordinates
576;
0.703402; 0.278490;,
0.689923; 0.268594;,
.
.
.
576 UV coordinates are listed (the count should match the number of vertices) in u, v order. These can be read straight into our vertex data.
Lastly the material data should look something like:
.
.
.
Material Material {
0.640000; 0.640000; 0.640000; 1.000000;;
96.078431;
0.500000; 0.500000; 0.500000;;
0.000000; 0.000000; 0.000000;;
TextureFilename {"MyTexture.bmp";}
}
} // End of MyMesh material list
.
.
.
I restricted myself to one material per mesh to keep things simple, and only bothered to load the specular power (96.078431), to copy to the shader, and the texture file name.
And that's it for static meshes.
What we need to load in addition to this for a skinned mesh is:
A bone hierarchy
skin weights
offset matrices
animation data
To export these from Blender we need to tick a few more settings in the x file exporter:
Export Skin Weights
Export Armature Bones
Export Rest Position
Export Animations
I found it useful to make sure each bone had at least a keyframe at the start and end of the animation in Blender before exporting.
The bone hierarchy can be worked out from the beginning of the x file...
.
.
.
Frame Root {
FrameTransformMatrix {
1.000000, 0.000000, 0.000000, 0.000000,
0.000000,-0.000000, 1.000000, 0.000000,
0.000000, 1.000000, 0.000000, 0.000000,
0.000000, 0.000000, 0.000000, 1.000000;;
}
Frame Armature {
FrameTransformMatrix {
1.000000, 0.000000, 0.000000, 0.000000,
0.000000, 1.000000, 0.000000, 0.000000,
0.000000, 0.000000, 1.000000, 0.000000,
0.000000, 0.000000, 0.407193, 1.000000;;
}
Frame Armature_Bone {
FrameTransformMatrix {
1.000000, 0.000000, 0.000000, 0.000000,
0.000000, 0.000000, 1.000000, 0.000000,
0.000000,-1.000000, 0.000000, 0.000000,
0.000000, 0.000000, 0.000000, 1.000000;;
}
Frame Armature_Bone_001 {
FrameTransformMatrix {
0.168060,-0.301078, 0.938674, 0.000000,
0.873177, 0.487403, 0.000000, 0.000000,
-0.457512, 0.819628, 0.344807, 0.000000,
0.000000, 1.135515,-0.000000, 1.000000;;
}
} // End of Armature_Bone_001
Frame Armature_Bone_002 {
FrameTransformMatrix {
-0.236762,-0.425095,-0.873635, 0.000000,
-0.873635, 0.486582, 0.000000, 0.000000,
0.425095, 0.763238,-0.486582, 0.000000,
0.000000, 1.135515,-0.000000, 1.000000;;
}
} // End of Armature_Bone_002
} // End of Armature_Bone
.
.
.
I ignored all of these matrices but used this structure to load the names of each bone (Armature_Bone, Armature_Bone_001 & Armature_Bone_002) and work out the hierarchy.
Armature_Bone is our root bone.
In this case Armature_Bone_001 & Armature_Bone_002 are both children of the root bone.
If Armature_Bone_002 had been listed before the "// End of Armature_Bone_001" then it would have been a child of Armature_Bone_001.
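One way to work out that hierarchy is a stack that tracks the current parent as the Frame blocks open and close. This is a sketch under some assumptions of my own: the file has been read into one statement per line, the FrameTransformMatrix blocks have already been skipped, and only frames whose names start with "Armature_" count as bones (the Bone struct and names are mine, not part of the format):

```cpp
#include <cassert>
#include <string>
#include <vector>

struct Bone { std::string name; int parent; };  // parent == -1 for the root bone

// A stack of bone indices tracks nesting: an opening "Frame <name> {" is a
// child of whatever valid bone is nearest the top of the stack, and each
// line starting with '}' pops one level (comments after the brace are fine).
std::vector<Bone> ReadHierarchy(const std::vector<std::string>& lines)
{
    std::vector<Bone> bones;
    std::vector<int> stack;  // index into 'bones', or -1 for non-bone frames
    for (const std::string& line : lines) {
        if (line.rfind("Frame ", 0) == 0) {
            size_t end = line.find(' ', 6);
            std::string name = (end == std::string::npos)
                                   ? line.substr(6) : line.substr(6, end - 6);
            if (name.rfind("Armature_", 0) == 0) {
                int parent = -1;
                for (size_t i = stack.size(); i-- > 0;)
                    if (stack[i] >= 0) { parent = stack[i]; break; }
                bones.push_back(Bone{name, parent});
                stack.push_back((int)bones.size() - 1);
            } else {
                stack.push_back(-1);
            }
        } else if (!line.empty() && line[0] == '}') {
            if (!stack.empty()) stack.pop_back();
        }
    }
    return bones;
}
```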
Further down the skin weights are listed like so:
.
.
.
SkinWeights {
"Armature_Bone";
192;
0,
1,
2,
.
.
.
This indicates that Armature_Bone influences 192 vertices, and then goes on to list which ones.
At the end of this list is another list of float values
.
.
.
190,
191;
1.000000,
0.032200,
0.480000,
1.000000,
1.000000,
.
.
.
...which tell us how much Armature_Bone influences each of the listed vertices (the skin weights).
So Armature_Bone has an influence of 1.0 over vertex[0] but only 0.0322 over vertex[1] etc.
Once we have this list for each bone we need to loop through them all to work out which are the four most influential bones for each vertex (and to store those bone IDs and skin weights in each vertex's data).
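That selection step might look like this sketch (the Influence struct and function name are mine; renormalizing the surviving weights so they sum to 1 is an extra step I'd recommend, though the post above doesn't require it):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct Influence { int bone; float weight; };

// For one vertex, keep only the four most influential bones and renormalize
// their weights. 'influences' pairs a bone index with the weight that bone's
// SkinWeights block gave this vertex.
std::vector<Influence> TopFourInfluences(std::vector<Influence> influences)
{
    std::sort(influences.begin(), influences.end(),
              [](const Influence& a, const Influence& b) {
                  return a.weight > b.weight;
              });
    if (influences.size() > 4) influences.resize(4);
    float total = 0.0f;
    for (const Influence& in : influences) total += in.weight;
    if (total > 0.0f)
        for (Influence& in : influences) in.weight /= total;
    return influences;
}
```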
At the end of the list of skin weights is the offset matrix for that bone...
.
.
.
1.000000,
1.000000;
0.168060, 0.873177,-0.457512, 0.000000,
-0.938673,-0.000000,-0.344807, 0.000000,
-0.301078, 0.487403, 0.819628, 0.000000,
0.464475,-0.751920,-1.264446, 1.000000;;
} // End of Armature_Bone skin weights
.
.
.
Now here we need to be careful because this matrix is in Blender's coordinate system.
The conversion to DirectX goes like this...
Blender's Matrix:
{_11, _12, _13, _14,
_21, _22, _23, _24,
_31, _32, _33, _34,
_41, _42, _43, _44}
becomes in DirectX:
{_11, _13, _12, _14,
_31, _33, _32, _34,
_21, _23, _22, _24,
_41, _43, _42, _44}
So the offset matrix above that we store for Armature_Bone would look like:
{ 0.168060,-0.457512, 0.873177, 0.000000,
-0.301078, 0.819628, 0.487403, 0.000000,
-0.938673,-0.344807,-0.000000, 0.000000,
0.464475,-1.264446,-0.751920, 1.000000; }
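That reshuffle is just swapping rows 2 and 3 and columns 2 and 3 of the matrix. A sketch, using a plain row-major float[4][4] instead of D3DXMATRIX (the function name is mine):

```cpp
#include <cassert>

// Swap rows 2 & 3 and columns 2 & 3 of a 4x4 row-major matrix to convert
// from Blender's Z-up layout to DirectX's Y-up layout - exactly the
// _11,_13,_12,... reshuffle described above.
void BlenderToDirectX(float m[4][4])
{
    for (int c = 0; c < 4; ++c) {  // swap rows 2 and 3
        float t = m[1][c]; m[1][c] = m[2][c]; m[2][c] = t;
    }
    for (int r = 0; r < 4; ++r) {  // swap columns 2 and 3
        float t = m[r][1]; m[r][1] = m[r][2]; m[r][2] = t;
    }
}
```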
Finally we need to load the bone animations. This is listed towards the end of the x file and begins with:
.
.
.
AnimationSet Global {
Animation {
{Armature}
AnimationKey { // Rotation
0;
50;
0;4;-1.000000, 0.000000, 0.000000, 0.000000;;,
1;4;-1.000000, 0.000000, 0.000000, 0.000000;;,
.
.
.
"AnimationSet Global {" tells us we are looking at the start of animation data, and the following animation data is for {Armature}. We can ignore this and skip forward to the animation data for our bones, so we search on for...
.
.
.
49;3; 0.000000, 0.000000, 0.407193;;;
}
}
Animation {
{Armature_Bone}
AnimationKey { // Rotation
0;
50;
0;4;-0.668282,-0.331680,-0.466460,-0.475187;;,
1;4;-0.668557,-0.331353,-0.466652,-0.474719;;,
2;4;-0.669386,-0.330367,-0.467231,-0.473306;;,
.
.
.
This is a list of key frames for the rotation of Armature_Bone (our root bone)
The "0;" tells us that the key frames are for rotation (as opposed to position or scale)
There are "50;" key frames. Each key frame is a time position (frame number in Blender) and then "4;" numbers representing the quaternion of the rotation.
Again, we are converting Blender's coordinate system to DirectX so we read each quaternion in w, x, z, y order. So the first three quaternions we will store for Armature_Bone above would be:
Time Position = 0, w = -0.668282, x = -0.331680, y = -0.475187, z = -0.466460
Time Position = 1, w = -0.668557, x = -0.331353, y = -0.474719, z = -0.466652
Time Position = 2, w = -0.669386, x = -0.330367, y = -0.473306, z = -0.467231
Scale is read in the same way, remembering to swap the order of y and z
.
.
.
AnimationKey { // Scale
1;
50;
0;3; 1.000000, 0.100000, 1.000000;;,
1;3; 1.000000, 0.100300, 1.000000;;,
.
.
.
The "1;" tells us that these key frames are for scaling. There are "50;" key frames with a time position and "3;" numbers to represent the scaling in x, z, y order.
Time Position = 0, x = 1.000000, y = 1.000000, z = 0.100000
Time Position = 1, x = 1.000000, y = 1.000000, z = 0.100300
And again for position...
.
.
.
}
AnimationKey { // Position
2;
50;
0;3; 0.000000, 1.135515,-0.000000;;,
1;3; 0.000000, 1.135515,-0.000000;;,
.
.
.
The "2;" indicates position key frames so this becomes:
Time Position = 0, x = 0.000000, y = -0.000000, z = 1.135515
Time Position = 1, x = 0.000000, y = -0.000000, z = 1.135515
All this key frame data then needs sorting so that each key frame consists of a single time position, a rotation quaternion, a position and a scale.
Each bone animation contains all the key frames (50 in this case) for one bone.
Therefore we should have the same number of bone animations as bones in our mesh.
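The merged record and the read-time swizzles can be sketched like this (the struct layout and function names are my own; Vec3/Quat stand in for D3DXVECTOR3/D3DXQUATERNION):

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };

// One assembled key frame: the rotation, scale and position keys that share
// a time position, merged into a single record.
struct Keyframe { int time; Quat rotation; Vec3 scale; Vec3 position; };

// The Blender-to-DirectX swizzle happens on read: quaternions arrive as
// w, x, z, y and vectors as x, z, y, so the last two numbers swap places.
Quat ReadQuat(const float q[4]) { return Quat{q[0], q[1], q[3], q[2]}; }
Vec3 ReadVec(const float v[3])  { return Vec3{v[0], v[2], v[1]}; }
```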
This has become quite a bit longer than expected, but hopefully it might save someone else struggling along with the wrong matrices for months!
I'm reluctant to post my source files as my work-around solutions mean the code will probably not work with all x files, possibly from earlier versions of Blender or other modelling software. I'd rather have it more finished and stable before posting. Sorry, but I hope this post is of some use!
Ian Walters.