Creating a custom model format that supports skeletal animation - feedback wanted


I'm trying to create a custom model format that supports skeletal animation. I want something that's relatively easy to work with, but also something that is usable in a "real" project. I want this to be something that can be useful to other people, both for beginners trying to learn more about skeletal animation (so ease of use is important) and possibly to indie developers working on commercial games. It probably won't be "advanced" enough for AAA titles, but I can live with that. [smile] Here's what I came up with so far:
// General Notes:
// This is a binary format.
// All counts (num vertices/indices/etc.) are signed integers.
// Strings are nul-terminated.
// Matrices have 16 floats and are stored row after row.

header - a version number and maybe an ID string

num vertices
for each vertex
    position - 3 floats
    normal - 3 floats
    tex coords - 2 floats
    weights - 3 floats 
    indices - 4 ubytes

// The default values for unused weights are: 1, 0, 0.

num indices
for each index
    index - ushort

num materials
for each material
    ambient - 4 floats
    diffuse - 4 floats
    specular - 4 floats
    emissive - 4 floats
    specular power - float
    texture name - string[256] (just a texture name, no path)

num parts
for each part
    vertex start - int
    vertex count - int
    index start - int
    index count - int
    material index - int (-1 if no material)

num bone influences - int (0 - 4)

// Zero bone influences means there's no animation data, that is,
// this is a static model

num palette entries
for each entry
    bone name - string[64]
    offset matrix - matrix

num bones
for each bone
    name - string[64]
    local transform - matrix
    sibling index - int
    child index - int

// The first bone in the array is the root.
// The name can be "(Empty Name)" if the name in the original model was "",
// or "(NULL Name)" if the name in the original model was NULL.
// Sibling/child indices are -1 if there's no sibling/child.

num animations
for each animation
    name - string[64]
    duration - float
    
    num bones participating in this animation sequence
    for each bone
        bone name - string[64]
        
        num rotation keyframes
        for each keyframe
            time - float
            rotation quaternion - 4 floats (x, y, z, w)
        
        num scale keyframes
        for each keyframe
            time - float
            scale vector - 3 floats
        
        num translation keyframes
        for each keyframe
            time - float
            translation vector - 3 floats

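For concreteness, here's a rough sketch of how the per-vertex record above might map to a C++ struct and be read back (illustrative names, assuming a little-endian machine and 32-bit IEEE floats; the implied 4th weight, 1 minus the sum of the stored three, is my assumption based on the "3 weights / 4 indices" layout). The field order matches the file layout and these types leave no padding, so the whole block can be read with one fread:

#include <cstdint>
#include <cstdio>
#include <vector>

struct Vertex {
    float   position[3];
    float   normal[3];
    float   texCoords[2];
    float   weights[3];     // 4th weight is implied: 1 - (w0 + w1 + w2)
    uint8_t boneIndices[4];
};                          // 48 bytes, no padding with these types

// Read the vertex block; 'count' is the "num vertices" field read earlier.
std::vector<Vertex> readVertices(FILE* f, int32_t count) {
    std::vector<Vertex> verts(count);
    fread(verts.data(), sizeof(Vertex), count, f);
    return verts;
}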
I tried to come up with a format that is relatively easy to parse, even if it means the actual file is a bit bigger than it could have been. My experience with skeletal animation and model formats is very limited, so I was hoping to get some feedback, as well as suggestions for improvement.

Before I get to my specific questions, I should probably mention that I'm also working on a converter that converts .X files to this format, so most of my knowledge and ideas come from working with .X files and the D3DX animation API. I also plan to write a few demos that show how to load, render and do animation with this format, and I hope to release the source code to both the demos and the converter.

Now for the questions:

- Should there be an offset table specifying the offset to the vertices/indices/etc. from the beginning of the file?
- Should each part store its vertices/indices, or should they be stored in one large vertex/index buffer and each part will store offsets/counts?
- The material structure seems too basic to do anything interesting. How should I extend it? Store multiple textures? Store the name of an .fx file? Anything else?
- Should the data be stored in several files? One file for the rendering data - vertices, indices, materials, parts, the number of bone influences and possibly the offset transforms (since these are unique to each model?). One file for the "skeleton" data - the bone hierarchy and the array of bone names (this array is used to specify the name of the ith bone, which you need to know to correctly build the matrix palette). One file for the animation data - the keyframes. If you have several characters in a game that all use the same model, you probably want to share the rendering data, but not the skeleton data, since each character animates independently from the others. I also think the animation data can be shared between different models. They would have to have a similar shape, but this is not too uncommon.

That's all I have so far. I would really appreciate any feedback, both good and bad.

Hey Gage. Some quick comments for you.

The first bytes in a data file should specify the format. For consistency, use four unsigned character bytes and use ASCII. Format-specific information like version numbers, etc., can come after that. That supports a quick, universal check that the file being loaded is in the expected format.
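For example, a signature check might look like this ("SAMF" is just a placeholder tag, not something the spec defines):

#include <cstdio>
#include <cstring>

// Verify the 4-byte ASCII signature before reading anything format-specific.
bool checkSignature(FILE* f) {
    char sig[4];
    if (fread(sig, 1, 4, f) != 4)
        return false;
    return memcmp(sig, "SAMF", 4) == 0;
}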

If this is to support cross-platform use, you should define whether it uses big-endian or little-endian storage, either as part of the spec or as a flag in the file itself. Also, if cross-platform is intended, you need to define all data types in terms of number of bytes. E.g., you didn't define "ushort." With today's advancing computers, double-precision floats are often faster than single-precision. They can be converted during loading, but, in binary format, I still need to know how many bytes to read in.

Probably not important: You should define whether wide characters are supported, or just make the flat statement that all characters are ASCII.

Is there a reason to specify all counts as signed integers? In particular, if you specify "signed" for all indices, then you can't take full advantage of 32-bit index buffers.

You need to specify whether the model is in a right-hand or left-hand system. There have been no end of problems with mesh file formats in this regard. SMD and several other formats, for instance, are right-handed. X-files are left-handed (one of the few).

Right-hand and left-hand would apply to how matrices are to be handled, also.

for each vertex
If the intent is to support static meshes, or for quick testing, you should add a color to each vertex.

By specifying "ubyte" for the bone indices, you're restricting the influences to the first 256 bones. That's probably sufficient[ATTENTION] [SMILE] but for a consistent format, the bone index data-type should match the "num bones" data-type.

It appears that you're limiting the max bone influences per vertex to 4. That's probably sufficient but not flexible. See next comment.

You need something to indicate the number of bone influences for a vertex, even if you require the storage space for unused influences. Otherwise, it will require unnecessary testing or looping. If you intend that bone weights will be 0 to flag "non-influences," then either the loading code will have to count the number of bone weights > 0 to determine the number of bone influences, or the animation loop will have to access bone matrices it's not going to use and do a lot of multiplications by 0.
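To illustrate the point, here's a CPU-skinning sketch where an explicit per-vertex count lets the loop stop early instead of multiplying by zero weights (types and names are illustrative; the weights array is assumed already expanded to 4 entries, with the implied 4th filled in at load time):

#include <cstdint>

struct Vec3 { float x, y, z; };
struct Mat4 { float m[16]; }; // row-major, row-vector convention as in D3D

// p * M for a point (w = 1), row-vector style.
static Vec3 transformPoint(const Vec3& p, const Mat4& t) {
    return {
        p.x * t.m[0] + p.y * t.m[4] + p.z * t.m[8]  + t.m[12],
        p.x * t.m[1] + p.y * t.m[5] + p.z * t.m[9]  + t.m[13],
        p.x * t.m[2] + p.y * t.m[6] + p.z * t.m[10] + t.m[14],
    };
}

// Skin one position; the loop never touches palette entries whose
// weight would be zero.
static Vec3 skinPosition(const Vec3& pos, const float weights[4],
                         const uint8_t bones[4], int numInfluences,
                         const Mat4* palette) {
    Vec3 out = {0, 0, 0};
    for (int i = 0; i < numInfluences; ++i) {
        Vec3 p = transformPoint(pos, palette[bones[i]]);
        out.x += p.x * weights[i];
        out.y += p.y * weights[i];
        out.z += p.z * weights[i];
    }
    return out;
}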

num indices
See comment above about matching data-types. "num indices" is signed int, actual indices are ushort. "num indices" and actual indices should probably be unsigned 32bit if you want to support 32bit index buffers.

num parts
Are these subsets of the mesh? If so:

If you intend that the format supports indexed meshes, then you shouldn't specify start index and count for vertices and indices. This forces duplicate vertices, which you probably want to avoid. Many vertices will be shared between subsets in an indexed mesh.

A subset need only be defined by vertex indices and a material. Those indices will not be in serial order.

Maybe:
for each part: number of indices, list of indices

num bone influences
Why not just "0" or "1" as a flag? The number of influences is per-vertex anyway.

num palette entries and num bones
Any reason to have these separate?

If they're separate, it'll require the loading code to do string searches to match up offsets with locals, etc. In general, the format should support minimizing loading code burden. The file is created once but it may be loaded many times.

How about adding a "parent" index for each bone? That will allow more flexibility in the loading and animation code. I.e., if I want to determine the combined matrix of a bone, I would otherwise have to search all bones to find the bone that has that bone as a child. That, also, relieves some of the burden from the application.
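A sketch of what a parent index buys you, assuming the bone array is ordered so a parent always precedes its children (Mat4/multiply are illustrative; combined = local * parentCombined in D3D's row-vector convention):

struct Mat4 { float m[16]; };

static Mat4 multiply(const Mat4& a, const Mat4& b) { // row-major a * b
    Mat4 r = {};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row * 4 + col] += a.m[row * 4 + k] * b.m[k * 4 + col];
    return r;
}

// One linear pass, no tree searches: parent[i] is -1 for the root.
void computeCombined(const Mat4* local, const int* parent,
                     Mat4* combined, int numBones) {
    for (int i = 0; i < numBones; ++i) {
        if (parent[i] < 0)
            combined[i] = local[i];
        else
            combined[i] = multiply(local[i], combined[parent[i]]);
    }
}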

animation frames
You separate all the quats from the vectors, etc. It's true that minimizes storage, but it requires bookkeeping on the part of the loading app to keep track of 3 different lerps that may or may not have the same time key. Putting all the information for each time-step in one structure requires more storage but simplifies the animation code.
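The one-structure-per-time-step layout being suggested would look something like this (illustrative only; it assumes the three channels share timestamps, which is questioned later in the thread):

struct Keyframe {
    float time;
    float rotation[4];    // quaternion x, y, z, w
    float scale[3];
    float translation[3];
};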

Quote:
Should there be an offset table specifying the offset to the vertices/indices/etc. from the beginning of the file?
I wouldn't think that would be useful. It'll just put more burden on the loading code and allow additional errors.
Quote:
Should each part store its vertices/indices, or should they be stored in one large vertex/index buffer and each part will store offsets/counts?
Indices and materials are all that's required to specify a subset. Let the application decide how to store the vertices and indices. Sometimes one big vertex buffer will be the way to go. Sometimes it may be better to use multiple buffers with index offsets, etc. Don't require duplicated vertices for each subset.
Quote:
material structure .. Store the name of an .fx file?
Interesting. Perhaps not a name but some sort of recommended processing, so the user could apply just the fixed pipeline if desired, or a default vertex shader. If you get into effects and shaders, then you have to think about what shader versions are supported, individual machine capabilities, etc.
Quote:
the offset transforms (since these are unique to each model?)
Offset transforms are unique to a bone hierarchy, not a mesh. The bone influences for a vertex are sufficient and specify which bone offset will be used.
Quote:
Should the data be stored in several files?
You might want to consider having an "animation-only" flag for the file so animations could be stored and loaded separately. Animations can be applied to different meshes (provided the meshes were developed with the same bone hierarchy, which is common).
Quote:
you probably want to share the rendering data, but not the skeleton data, since each character animates independently from the others.
I would expect it to be the other way 'round. Unique meshes for different types of characters but many characters having similar animations ("run", "walk", etc.).

Again, I emphasize the specification of a right-hand versus left-hand statement. That will support conversion of various modeling formats correctly.

Whew.

For the material part: don't include any material information in the mesh itself. Textures, shaders, diffuse/ambient/emissive/specular colours, powers, and so on are all material-specific and not mesh-specific, and you should be able to apply any material to any mesh. Perhaps the most you should include material-wise is a default material name which the mesh should use if not assigned any material manually. This material name could then correspond to a material you've created via other means (such as from a script).

Quote:
Textures (etc.) .. are all material-specific and not mesh-specific, and you should be able to apply any material to any mesh
I'd have to disagree. Materials are very mesh-specific. E.g., vertex texture coordinates usually apply to a specific texture.

@Gage64-
EDIT: Rats! I wasn't thinking through the problem of duplicate vertices in subsets correctly. It does look like you'll have to have duplicate vertices for vertex locations that separate subsets because each vertex has just one set of texture coordinates. Unless you want to specify an array of tex coords for each vertex, a set of tex coords for each subset the vertex is used in.


[Edited by - Buckeye on August 19, 2009 12:54:00 PM]

Buckeye: Thanks! This is just the sort of feedback I was hoping to get - and there's so much of it!

You made a lot of great points and suggestions. I have some questions about some of them, but I'm kind of tired so they will have to wait until tomorrow or maybe Friday.

But I will definitely follow up on this! Your comments make me believe that with enough work, I might actually create something that will be useful to other people. Working on this has been frustrating at times because .X files and the D3DX animation API keep throwing new surprises at me, but with this kind of feedback I might actually produce something decent.


Quote:
Quote:
Textures (etc.) .. are all material-specific and not mesh-specific, and you should be able to apply any material to any mesh

I'd have to disagree. Materials are very mesh-specific. E.g., vertex texture coordinates usually apply to a specific texture.


The problem is that some of the models I've worked with don't specify material information very well. For example, tiny_4anim.x from the SDK has 1 subset but 2 materials, and only the second one contains a texture name. The two materials have different lighting properties, but AFAIK there's nothing in the file that states which material should be applied.

Yeah, you just may be able to create a usable format. Stranger things have happened. [SMILE]

I don't have tiny_4anim.x. But, if it contains a MeshMaterialList, that's the table that specifies which vertices (or faces? subsets? I can't remember) each material applies to.

Also, we posted simultaneously. I edited my previous comment with regard to vertices that are "shared" between subsets.

EDIT: It depends on the loader, but, if a MeshMaterialList is not present, it may just default to material 0 for all subsets.

EDIT2: There doesn't seem to be a reason to specify a binary-only format. The file header, after the file-type signature (first 4 bytes), could contain a "binary" or "text" flag, also in ASCII.

Having a text format would have several benefits.

Most important: during testing phases, you can create a text file much more easily than a binary file!

In addition, text file loading and parsing, though a lot more complicated, allows for differences in machine floating point precision without having to know how many bytes "float" is on any particular machine (a cross-platform thing). That is, I can read the text of "3.14159" without worrying about how many bytes it's supposed to be or whether it's big-endian or little-endian.
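The difference in a nutshell (a sketch; the binary reader has to commit to a byte width and byte order up front, the text reader doesn't):

#include <cstdio>
#include <cstring>

// Text: the C library converts "3.14159" to the native representation.
float readFloatText(FILE* f) {
    float v = 0.0f;
    fscanf(f, "%f", &v);
    return v;
}

// Binary: must know the value is 4 bytes, little-endian, IEEE-754.
float readFloatBinaryLE(FILE* f) {
    unsigned char b[4];
    fread(b, 1, 4, f);
    unsigned int bits = b[0] | (b[1] << 8) | (b[2] << 16)
                      | ((unsigned int)b[3] << 24);
    float v;
    memcpy(&v, &bits, sizeof v);
    return v;
}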

If you haven't guessed, I've been down the road of loading and converting a lot of different file formats!


[Edited by - Buckeye on August 19, 2009 12:20:22 PM]

Quote:
Original post by Buckeye
... Unless you want to specify an array of tex coords for each vertex, a set of tex coords for each subset the vertex is used in.


That sounds like it would only complicate the loading code, and the only advantage I see is that it will reduce the file size (probably not by much). So I think I'll just stick to using duplicate vertices.

Also, I don't actually duplicate any vertices in my conversion code. I just copy the vertices straight from the vertex buffer of the mesh, so any duplicate vertices are already there.

Quote:
That sounds like it would only complicate the loading code, and the only advantage I see is that it will reduce the file size (probably not by much). So I think I'll just stick to using duplicate vertices.

Sorry about that. My comment on tex coord array was sort of a joke as it would, indeed, complicate the bookkeeping!

Cross-posted again: I added some comments about text-optional vs binary-only format in my previous post.

EDIT: Again about duplicate vertices. If you load an x-file with multiple subsets and then use D3DXMeshSaveToFile (or whatever it is), it seems to just blindly output a complete vertex set for each subset, whether that subset has a different material or not! There's usually a long list of duplicate vertices.

Vertices, even if in different subsets, need not be duplicated if every subset they're used in has the same material.

Subsets may have to be created to limit the number of bones-per-subset for shader capability reasons. Those subsets may still use the same material, however.

NOTE: The duplicate vertices comments are just my engineering way of thinking. When something's going to be created once (a file) but used many times, the burden of processing should be on the file creator, not the loader. Duplicate vertices require more memory, possibly multiple (unnecessary) vertex buffer swaps each frame, etc. You can ignore it all if you want. [WINK]

Quote:
Original post by Buckeye
Quote:
Textures (etc.) .. are all material-specific and not mesh-specific, and you should be able to apply any material to any mesh
I'd have to disagree. Materials are very mesh-specific. E.g., vertex texture coordinates usually apply to a specific texture.


Correct, but I think the material itself should be separate from the mesh. For example, take the way Ogre handles this:

// this texture unit is found in a material called "some_material"
texture_unit
{
    texture some_texture.tga
    // this texture applies to texture coordinate set 1
    tex_coord_set 1
}


Then, the mesh would simply have a default material name "some_material" without any further material information, which is contained within the material itself.

Quote:
I think the material itself should be separate from the mesh.

Maybe I don't understand what your intent is or what you mean by "mesh." Do you mean in the same file, or stored with the mesh data itself?

It appears you mean storing texture information in a separate file and specifying only a reference to a texture in the "main" file which would be "dereferenced" to load the texture. Is that correct?

If so, that could result in a lot of headaches. If I change tex coords in my model using one texture, it will not render correctly if a different texture is used. If that's not the intent, why not specify the texture directly?

Am I missing your point entirely?

Quote:
Original post by Buckeye
If you load an x-file with multiple subsets and then use D3DXMeshSaveToFile (or whatever it is), it seems to just blindly output a complete vertex set for each subset, whether that subset has a different material or not! There's usually a long list of duplicate vertices.


I don't use D3DXMeshSaveToFile. I save all the data myself, so the subset boundaries are preserved.


Quote:
Subsets may have to be created to limit the number of bones-per-subset for shader capability reasons. Those subsets may still use the same material, however.


For simplicity, I've decided not to bother with this, because it would require something like the bone combination table, and I don't want to force the users of this format to use that table. If you want to create additional subsets to save on shader constants (possibly duplicating vertices in the process), you'll have to write code to do that.

Quote:
Vertices, even if in different subsets, need not be duplicated if every subset they're used in has the same material.


I'm not sure what you mean. Aside from saving on shader constants, the only reason I can think of to duplicate a vertex is when you have two vertices that share the same position, but don't share tex coords or normals or something else. Otherwise, why would you duplicate it? (again, aside from saving on shader constants).

Quote:
Original post by Buckeye
Quote:
I think the material itself should be separate from the mesh.

Maybe I don't understand what your intent is or what you mean by "mesh." Do you mean in the same file, or stored with the mesh data itself?

It appears you mean storing texture information in a separate file and specifying only a reference to a texture in the "main" file which would be "dereferenced" to load the texture. Is that correct?

Correct. In-code you will have a Mesh and you will have a Material, and the mesh can store a pointer to the Material it will use (and, by default, the pointer will point to the "default" material specified within the mesh file).

Quote:

If so, that could result in a lot of headaches. If I change tex coords in my model using one texture, it will not render correctly if a different texture is used.

Correct. You will need to change the texture in the material script.
Quote:
If that's not the intent, why not specify the texture directly?

What happens if you want to change the texture on a certain mesh? Are you going to recompile the whole mesh file? What if you want the same model, but simply with 2 different skins? Putting the material info directly into the mesh file is extremely limiting.

Quote:
Are you going to recompile the whole mesh file?
Just a clarification, they're not compiled, they're loaded. If you mean "regenerate," the answer is yes.
Quote:
What if you want the same model, but simply with 2 different skins?
Put 2 different materials in the material array and, when rendering one skin or another, change the material. Because they're in the same file, I would have some assurance that both textures would properly skin the mesh (i.e., the tex coords are compatible).

If you want to be able to extend the number of skins by changing the "texture reference" file, how do you assure the new texture will properly skin the mesh?

I realize you can easily modify a texture with some assurance that it will properly skin the model, then add that texture reference to the "texture reference" file. So, instead of regenerating the mesh file, you regenerate the "texture reference" file. A matter of milliseconds either way, so I'm not sure I see the advantage.

How does the model "know" how many skins are available? It could count the number of entries in the "texture reference" file, but it could also ask for the count of materials available and avoid an extra file manipulation.

Quote:
Original post by Buckeye
Quote:
Are you going to recompile the whole mesh file?
Just a clarification, they're not compiled, they're loaded. If you mean "regenerate," the answer is yes.

compile: (v) To put together; to assemble; to make by gathering things from various sources [wink]

Quote:
Put 2 different materials in the material array and, when rendering one skin or another, change the material. Because they're in the same file, I would have some assurance that both textures would properly skin the mesh (i.e., the tex coords are compatible).

What I'm saying is that it's not dynamic that way. What if you want to make a certain object (only a specific instance) glow by giving it an emissive value greater than 0? Etc.

Quote:

If you want to be able to extend the number of skins by changing the "texture reference" file, how do you assure the new texture will properly skin the mesh?

The texture comes from the modeling application you used to UV map the mesh. Of course it's going to properly skin the mesh.

Quote:

So, instead of regenerating the mesh file, you regenerate the "texture reference" file. A matter of milliseconds either way, so I'm not sure I see the advantage.

It's much easier to edit a small line in a human-editable text file than to recompile a binary mesh file.

Quote:

How does the model "know" how many skins are available? It could count the number of entries in the "texture reference" file, but it could also ask for the count of materials available and avoid an extra file manipulation.

It doesn't. The skins are up to the application, not the mesh. The mesh would be the car, the material would be the paint job for that specific instance of the car. You can have a yellow Lamborghini, and you can have a black Lamborghini with orange flames on the side.

You make some good points with respect to flexibility. Guess it's a matter of preference. Having to keep track of multiple files with no guarantee they're compatible could lead to more problems than the flexibility warrants.
Quote:
The texture comes from the modeling application you used to UV map the mesh.
If I change the UVs, the mesh file has to be revised anyway. I thought maybe you were suggesting being able to modify textures without changing the mesh UVs. That has some merit.

I think storing the texture IDs with the mesh is a good idea, as they are fairly specific to the mesh; however, I would keep the material data separate. Perhaps store textures with a particular type identifier: Diffuse, Normal, AO, Spec, to name a few.

Keep materials as a separate file which holds shader parameters, shader IDs, texture channels, and any other artist-specific piece of data. This way the material can map whatever textures it needs by type to the textures stored within the mesh.

Quote:
Original post by Buckeye
I don't have tiny_4anim.x. But, if it contains a MeshMaterialList, that's the table that specifies which vertices (or faces? subsets? I can't remember) each material applies to.


Looking at tiny_4anim.x (which is fortunately an ASCII file), it has two MeshMaterialLists. The first has 8 faces and 1 material. The second has 6841 faces and 1 material (this one contains the texture).

At runtime, when I examine the mesh contents with the debugger, I see that both materials are loaded, but there is only one subset, and it contains 6841 faces. It looks like the 8 faces from the first MeshMaterialList aren't loaded, but the material is.

Any idea why this is happening?

EDIT: Nevermind. The problem is elsewhere, but that's a topic for a separate thread.


On a side note, I'm only trying to write a converter for .X files because I thought it would be easier than writing an exporter. Now I'm not so sure...

[Edited by - Gage64 on August 22, 2009 10:32:10 AM]

Quote:
Original post by Buckeye
The first bytes in a data file should specify the format. For consistency, use four unsigned character bytes and use ASCII. Format-specific information like version numbers, etc., can come after that. That supports a quick, universal check that the file being loaded is in the expected format.


Good point, I will add that in.

Quote:
If this is to support cross-platform use, you should define whether it uses big-endian or little-endian storage, either as part of the spec or as a flag in the file itself.


I was going to leave that for later, but you're right. Do you think it should be big-endian or little-endian?

Quote:
Also, if cross-platform is intended, you need to define all data types in terms of number of bytes. E.g., you didn't define "ushort."


Agreed, I'll correct this.

Quote:
Probably not important: You should define whether wide characters are supported, or just make the flat statement that all characters are ASCII.


OK, I think I'll stick to just ASCII at this point.

Quote:
Is there a reason to specify all counts as signed integers? In particular, if you specify "signed" for all indices, then you can't take full advantage of 32-bit index buffers.


By "counts" I meant the number of indices (and verices, materials, etc.) not the indices themselves. I decided on signed ints because that is the type used most often by most programmers, so I figured I would use it unless I have a really good reason not to.

By the way, I was going to use 16-bit indices, but you're right that using 32-bit would be more flexible. I think I'll also add a flag indicating if the indices are 16-bit or not.

Quote:
You need to specify whether the model is in a right-hand or left-hand system. There have been no end of problems with mesh file formats in this regard. SMD and several other formats, for instance, are right-handed. X-files are left-handed (one of the few).

Right-hand and left-hand would apply to how matrices are to be handled, also.


Good point. Since I'm using D3D (and I think the way the data is arranged is more D3D friendly than OpenGL) I'll use a left-handed system and the D3D matrix layout.
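(For reference, if the converter ever has to ingest a right-handed source like SMD, the usual recipe, sketched below under the assumption that only handedness differs, is to negate the z of positions and normals and flip the triangle winding. This is the common conversion, not something the spec mandates:)

#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };

void rightToLeftHanded(std::vector<Vec3>& positions,
                       std::vector<Vec3>& normals,
                       std::vector<uint16_t>& indices) {
    for (Vec3& p : positions) p.z = -p.z;
    for (Vec3& n : normals)   n.z = -n.z;
    // Reverse winding: swap the last two indices of each triangle.
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3)
        std::swap(indices[i + 1], indices[i + 2]);
}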

Quote:
for each vertex
If the intent is to support static meshes, or for quick testing, you should add a color to each vertex.


OK.

Quote:
By specifying "ubyte" for the bone indices, you're restricting the influences to the first 256 bones. That's probably sufficient[ATTENTION] [SMILE] but for a consistent format, the bone index data-type should match the "num bones" data-type.


I'm not sure why that would be more consistent, but are you saying I should change the type of "num bones" or the bone indices themselves?

Quote:
It appears that you're limiting the max bone influences per vertex to 4. That's probably sufficient but not flexible. See next comment.


I've just always heard that 4 is enough in practice, but I guess it wouldn't hurt to change this.

Quote:
You need something to indicate the number of bone influences for a vertex, even if you require the storage space for unused influences. Otherwise, it will require unnecessary testing or looping. If you intend that bone weights will be 0 to flag "non-influences," then either the loading code will have to count the number of bone weights > 0 to determine the number of bone influences, or the animation loop will have to access bone matrices it's not going to use and do a lot of multiplications by 0.


But that's exactly what "num bone influences" is for?

Quote:
num indices
See comment above about matching data-types. "num indices" is signed int, actual indices are ushort. "num indices" and actual indices should probably be unsigned 32bit if you want to support 32bit index buffers.


About the indices, I agree, but I'm not sure why it's important that "num indices" be unsigned as well?

Quote:
num parts
Are these subsets of the mesh? If so:

If you intend that the format supports indexed meshes, then you shouldn't specify start index and count for vertices and indices. This forces duplicate vertices, which you probably want to avoid. Many vertices will be shared between subsets in an indexed mesh.

A subset need only be defined by vertex indices and a material. Those indices will not be in serial order.

Maybe:
for each part: number of indices, list of indices


OK, I'll remove the offset and count for the vertices. I just used them since they are specified in D3DXATTRIBUTERANGE, but I guess there's no real need for them.

By the way, should I store all indices in one big list and specify an offset and a count for each part, or should each part contain a list of its indices?

Quote:
num bone influences
Why not just "0" or "1" as a flag? The number of influences is per-vertex anyway.


Are you saying that each vertex should be able to have a different number of bone influences? Wouldn't that make vertex blending much slower? Actually wouldn't it require VS 3.0 to implement?

Quote:
num palette entries and num bones
Any reason to have these separate?


Yes - the former is sometimes smaller than the latter. At least it has been for all the models I tried. I guess the reason for this is that not all bones are indexed by vertices. They only exist to serve as parents for other bones or to allow objects (like a weapon) to be attached to the model.

Quote:
If they're separate, it'll require the loading code to do string searches to match up offsets with locals, etc. In general, the format should support minimizing loading code burden. The file is created once but it may be loaded many times.


Yes, but if my previous comment is correct, this is unavoidable.

Quote:
How about adding a "parent" index for each bone? That will allow more flexibility in the loading and animation code. I.e., if I want to determine the combined matrix of a bone, I would otherwise have to search all bones to find the bone that has that bone as a child. That, also, relieves some of the burden from the application.


But each bone always stores its associated combined transform (in the runtime structures, not in the file), and it's always up to date because you recursively update the bone hierarchy before using any of the bones. Unless having a parent index can be useful for something else?
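That recursive update is just a walk over the child/sibling pointers. A sketch (Bone fields are illustrative; Mat4/multiply are a row-major matrix and ordinary matrix product, with combined = local * parentCombined in D3D's row-vector convention):

struct Mat4 { float m[16]; };

static Mat4 multiply(const Mat4& a, const Mat4& b) { // row-major a * b
    Mat4 r = {};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row * 4 + col] += a.m[row * 4 + k] * b.m[k * 4 + col];
    return r;
}

struct Bone {
    Mat4  local;
    Mat4  combined;
    Bone* firstChild;   // null if none
    Bone* nextSibling;  // null if none
};

void updateHierarchy(Bone* bone, const Mat4& parentCombined) {
    if (!bone) return;
    bone->combined = multiply(bone->local, parentCombined);
    updateHierarchy(bone->firstChild, bone->combined);   // children inherit ours
    updateHierarchy(bone->nextSibling, parentCombined);  // siblings inherit the parent's
}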

Quote:
animation frames
You separate all the quats from the vectors, etc. It's true that minimizes storage, but it requires bookkeeping on the part of the loading app to keep track of 3 different lerps that may or may not have the same time key. Putting all the information for each time-step in one structure requires more storage but simplifies the animation code.


That's what I wanted to do. But looking at the keyframe data for some of the models I used, some of the quat keyframes had a different timestamp than the vector keyframes, so I can't really collapse them into one structure with just one timestamp.
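So each channel stays its own track, and sampling finds the bracketing keys per track. A sketch for a 3-float track (rotations would slerp a quaternion instead; keys are assumed sorted by time):

#include <cstddef>
#include <vector>

struct VecKey { float time; float v[3]; };

// Evaluate the track at time t, clamping outside the keyed range.
void sampleTrack(const std::vector<VecKey>& keys, float t, float out[3]) {
    if (keys.empty()) return;
    if (t <= keys.front().time) {
        for (int i = 0; i < 3; ++i) out[i] = keys.front().v[i];
        return;
    }
    if (t >= keys.back().time) {
        for (int i = 0; i < 3; ++i) out[i] = keys.back().v[i];
        return;
    }
    std::size_t k = 1;
    while (keys[k].time < t) ++k;   // first key at or after t
    float s = (t - keys[k - 1].time) / (keys[k].time - keys[k - 1].time);
    for (int i = 0; i < 3; ++i)
        out[i] = keys[k - 1].v[i] + s * (keys[k].v[i] - keys[k - 1].v[i]);
}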

Quote:
Quote:
Should there be an offset table specifying the offset to the vertices/indices/etc. from the beginning of the file?
I wouldn't think that would be useful. It'll just put more burden on the loading code and allow additional errors.


It would be handy if you wanted to load just the skeleton data or just the animation data.

Quote:
Quote:
Should the data be stored in several files?
You might want to consider having an "animation-only" flag for the file so animations could be stored and loaded separately. Animations can be applied to different meshes (provided the meshes were developed with the same bone hierarchy, which is common).


That's why I wanted to use the offset table. I'm not sure how you would do this using just a flag.




BTW, I also need to come up with an extension for the file name. I thought about SAM (Simple Animated Model), but I'd love to hear some other suggestions.

Also, I'd love to hear more opinions about what material properties I should store, and if I should store them in a separate file or not.

First: I still suggest you support text formats as well as binary. That can be specified in the header info, after the file signature. Having a text format allows for loading values without worrying about endianness. I.e., when reading "1.234" as text, the machine will store it properly. In binary, when loading 0x47ff0876, the endianness makes a difference.

Text format will make debugging much easier, both for creating and loading the file.

So, speaking of little- vs. big-endian: I'd say specify little-endian as that is the Intel format and machines using DirectX are more likely to be Intel.

Number of bone influences
The x-file format does specify a maximum number of skinweights per-vertex and per-face in the XSkinMeshHeader, but it doesn't limit the number (that I know of). The number of influences for a vertex is determined by the SkinWeights (if I remember), which specifies which vertices are influenced by a particular bone. A vertex index can appear in a SkinWeight list up to the max number of skinweights per-vertex.

I assumed the constant you specify as "num bone influences" was intended as a maximum. If so, maybe you should rename it "max num bone influences" and put it before the vertex data.

So, any particular animated vertex can have any number of bone influences up to the max. And it's probably easiest to create storage for the max number of influences with each vertex (as you have).

I'm just saying you're limiting that max number to 4. If you run across an x-file that has max 5 influences per-vertex, you'll have to abort.

Maybe something like:

max num bone influences (int)
for each vertex
(pos, normal, tex coords, color)
num-influences
weights[max-num-influences - 1] (float)
indices[max-num-influences] (same units as the bone indices)

That would allow calculation of the vertex matrices with better efficiency.

I used a fixed number of bone influences (VS 2) but I don't see why the num-influences couldn't be a vertex attribute, even in VS2. Having num-influences as a vertex attribute would allow VS > 2 more flexibility and a lot of people have VS > 2 now.

num palette and num bones
It's true that the number of bones that influence vertices is probably less than the total number of bones. But, during loading, a frame will be created for every bone in the palette. That frame will have storage for the local transform, child and sibling indices, whether they're used or not (I would think).

Then the loader will have to look up each frame when it's encountered in the "num bones" loop. Extra time and potential for errors. Why not just load an entire bone frame at once, even if some of the entries are empty? As far as filesize is concerned, a single frame may be smaller due to the need to specify the bone name in the "num bones" section. Either way, it's a matter of just a few bytes and a single frame simplifies loading.

animation only files
With regard to offsets: probably doesn't make a big difference, I guess.

If a text format is supported, offsets can't be used. However, animation-only could be flagged by finding "num vertices" = 0, perhaps.

separation of quats and vectors in animations
Hadn't run across files with separate keyframes. That pretty well answers that, unless you want to include your own lerping routines in your conversion utility?? [SMILE] (NOT)

Quote:
Original post by Buckeye
First: I still suggest you support text formats as well as binary. That can be specified in the header info, after the file signature. Having a text format allows for loading values without worrying about endianness. I.e., when reading "1.234" as text, the machine will store it properly. In binary, when loading 0x47ff0876, the endianness makes a difference.

Text format will make debugging much easier, both for creating and loading the file.


I agree, I just want to finish the file spec and converter before thinking about how exactly to structure the text format.

Quote:
So, speaking of little- vs. big-endian: I'd say specify little-endian as that is the Intel format and machines using DirectX are more likely to be Intel.


Sounds good.

Quote:
Number of bone influences
The x-file format does specify a maximum number of skinweights per-vertex and per-face in the XSkinMeshHeader, but it doesn't limit the number (that I know of). The number of influences for a vertex is determined by the SkinWeights (if I remember), which specifies which vertices are influenced by a particular bone. A vertex index can appear in a SkinWeight list up to the max number of skinweights per-vertex.

I assumed the constant you specify as "num bone influences" was intended as a maximum. If so, maybe you should rename it "max num bone influences" and put it before the vertex data.

So, any particular animated vertex can have any number of bone influences up to the max. And it's probably easiest to create storage for the max number of influences with each vertex (as you have).

I'm just saying you're limiting that max number to 4. If you run across an x-file that has max 5 influences per-vertex, you'll have to abort.

Maybe something like:

max num bone influences (int)
for each vertex
(pos, normal, tex coords, color)
num-influences
weights[max-num-influences - 1] (float)
indices[max-num-influences] (same units as the bone indices)

That would allow calculation of the vertex matrices with better efficiency.

I used a fixed number of bone influences (VS 2) but I don't see why the num-influences couldn't be a vertex attribute, even in VS2. Having num-influences as a vertex attribute would allow VS > 2 more flexibility and a lot of people have VS > 2 now.


Well, I would still like this to be usable with VS 2.0. But I guess I can do both: have a "max num bone influences" and have a "num influences" field per vertex, and anyone using VS 2.0 can just ignore that field.

Quote:
num palette and num bones
It's true that the number of bones that influence vertices is probably less than the total number of bones. But, during loading, a frame will be created for every bone in the palette. That frame will have storage for the local transform, child and sibling indices, whether they're used or not (I would think).

Then the loader will have to look up each frame when it's encountered in the "num bones" loop. Extra time and potential for errors. Why not just load an entire bone frame at once, even if some of the entries are empty? As far as filesize is concerned, a single frame may be smaller due to the need to specify the bone name in the "num bones" section. Either way, it's a matter of just a few bytes and a single frame simplifies loading.


I'm not sure what you mean. What is "the entire bone frame"? Is it the bone hierarchy tree? And what do you mean by "empty entries"?

Quote:
separation of quats and vectors in animations
Hadn't run across files with separate keyframes. That pretty well answers that, unless you want to include your own lerping routines in your conversion utility?? [SMILE] (NOT)


Sorry but again, I'm not sure what you mean. If I have a quat keyframe and a vector keyframe that have different timestamps, I can't possibly collapse them to a single structure. Also, I'm not even sure if the number of quat keyframes and vector keyframes is always the same (I don't remember if I checked that for the models I tried).

Quote:
I would still like this to be usable with VS 2.0. But I guess I can do both: have a "max num bone influences" and have a "num influences" field per vertex, and anyone using VS 2.0 can just ignore that field.

Yeah, that's what I was trying to get at. I think it's possible (I haven't done it) to use per-vertex num-influences as a vertex attribute in VS 2.0. But that's really outside this discussion anyhow. [SMILE]
Quote:
What is "the entire bone frame"? Is it the bone hierarchy tree? And what do you mean by "empty entries"?

Sorry. I was using the term "bone frame" to describe the structure used to store each bone's data (name, transform, child, etc.). A structure for holding that data is (usually) created for each bone in the palette, whether or not that bone is an "influence" bone.

So, it's the difference between:

for each bone in palette:
create a data structure
load in palette data (name, offset)

for each bone in hierarchy:
read in name
look up the frame by name
fill in additional information (local matrix, child, sibling)

and

for each bone in palette:
create a data structure
load in all the data (name, offset, local matrix, child, sibling)
// for non-influence bones, hierarchy data will be "empty." That is,
// matrix may be all zeroes or identity, and child=sibling=-1 or something

Quote:
If I have a quat keyframe and a vector keyframe that have different timestamps, I can't possibly collapse them to a single structure.

Gotta do some thinking about this. What I'm thinking (and I may be all wet):

An animation set has a time range. All keyframes in that animation set have to have timestamps that fall in that range. For each bone there is an array of timestamped variables which may be quats, vectors, etc. If there's a quat with a timestamp, why can't you lerp the vector info to get the value for the same timestamp and store it all in one entry?

Alternatively, if the above lerping can be done, store transforms rather than separate quats and vectors?

EDIT: As implied in a previous post, this is probably too complicated to be worth it.

I think we're using some of the terms in different ways. When I say bone hierarchy, I'm talking about the tree/hierarchy of bones (what you're referring to as frames), where each bone has a name, local transform, child/sibling pointers, and a combined transform.

When I'm talking about the matrix palette, I'm talking about an array of matrices. Actually I'm talking about several arrays at once (I should have been more explicit here, sorry): an array of offset transforms, an array of pointers to the combined transform matrices that are in the frames, and the array of final transforms.

(Actually, looking back I think I sometimes used the terms inconsistently - I sometimes used the term "bone" when referring to matrices, and sometimes when referring to frames. Sorry about that as well.)

The three arrays are of the same size (what I was referring to as "num palette entries"), and that size may be smaller than the number of frames in the frame hierarchy. For example, tiny.x has 35 palette entries and 48 frames.

To build the bone hierarchy, there's no need to do any name lookups. In the file, each frame stores the index of the child/sibling (this index has nothing to do with the bone indices stored in the vertices), and you just follow those indices when building the hierarchy.

To build the various matrix arrays, and to get a pointer to a local transform for each keyframe, you do need to perform name lookups...

Wait, maybe you don't. In the file, the frame hierarchy is stored as an array, and you can choose not to convert it to a tree structure. If you do this, it is possible to store frame indices rather than frame names (of course, these indices have nothing to do with the indices stored in each vertex, since those are indices for the array of final transforms, which is arranged in a different order).

Is this what you meant? Is it even close? [smile]

Personally, I'm less comfortable with this. I prefer to work with a "real" tree that uses pointers rather than with a tree represented as an array, even if it forces me to do name lookups. Also, they are done only at load time, so there's no performance penalty.
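Either way, building the pointer tree from the stored child/sibling indices is only a few lines. A sketch, with illustrative field names:

#include <cstddef>
#include <vector>

struct FrameRecord {              // as stored in the file
    int childIndex;               // -1 if none
    int siblingIndex;             // -1 if none
    // name, local transform, ...
};

struct Frame {                    // runtime node
    Frame* child   = nullptr;
    Frame* sibling = nullptr;
};

std::vector<Frame> buildTree(const std::vector<FrameRecord>& recs) {
    std::vector<Frame> frames(recs.size());
    for (std::size_t i = 0; i < recs.size(); ++i) {
        if (recs[i].childIndex >= 0)
            frames[i].child = &frames[recs[i].childIndex];
        if (recs[i].siblingIndex >= 0)
            frames[i].sibling = &frames[recs[i].siblingIndex];
    }
    return frames;                // frames[0] is the root, per the spec
}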

I certainly agree with your general description of the arrays. That's what you can eventually end up with. Although I don't do it exactly the same way, your file format doesn't prevent other methods of loading and storing information.

It seems the palette still needs to be loaded so attachment points can be added to the arrays by applications that need them.

By the way, I still think you should start with a text format! That's certainly up to you. I'm just speaking from experience writing conversion programs.

With a binary format still in the debugging stage, you're going to load some funky values and wonder where they came from. With a text format, you can look at the text you just loaded and pretty quickly determine where the problem lies.

I didn't read much in this thread, but here's one piece of advice: store quaternions for your bone transformations, not matrices. It's 1/4 the size! Seriously, most of the size in a skeletal animation file is from all the bone transformations, so cutting those by 75% is really, really beneficial.

Just my $0.02:

- I agree with using text file formats! They make debugging and development much faster and easier. Also, if you're worried about space, you can always use standard compression like gzip. That will be faster than a binary format anyway (because you'll save so much disk time).

- Have you looked at existing formats? Cal3D and Quake's MD5 formats are very nice, and may do what you want. I was very impressed with Cal3D and disappointed that no one seems to have picked it up: http://gna.org/projects/cal3d/ I'm building a game engine now and would use the Cal3D format, except there don't appear to be any models for it available. [Interestingly, both Cal3D and Quake switched to text file format.]

