Creating a custom model format that supports skeletal animation - feedback wanted

Quote:Original post by Buckeye
If you load an x-file with multiple subsets and then use D3DXSaveMeshToX (or whatever it's called), it seems to just blindly output a complete vertex set for each subset, whether that subset has a different material or not! There's usually a long list of duplicate vertices.


I don't use D3DXSaveMeshToX. I save all the data myself, so the subset boundaries are preserved.


Quote:Subsets may have to be created to limit the number of bones-per-subset for shader capability reasons. Those subsets may still use the same material, however.


For simplicity, I've decided not to bother with this, because it would require something like the bone combination table, and I don't want to force users of this format to use that table. If you want to create additional subsets to save on shader constants (possibly duplicating vertices in the process), you'll have to write code to do that.

Quote:Vertices, even if in different subsets, need not be duplicated if every subset they're used in has the same material.


I'm not sure what you mean. Aside from saving on shader constants, the only reason I can think of to duplicate a vertex is when two vertices share the same position but not the same tex coords, normals, or some other attribute. Otherwise, why would you duplicate it? (Again, aside from saving on shader constants.)
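To illustrate that rule in code, a minimal sketch (the struct and helper are mine, not part of the format): two source vertices can share one entry in the vertex buffer only if every attribute matches, not just the position.

    #include <cstring>

    struct Vertex {
        float pos[3];
        float normal[3];
        float uv[2];
    };

    // Hypothetical merge test: exact match on all attributes. A vertex
    // sitting on a UV seam fails this test and must be duplicated;
    // identical vertices shared between subsets need not be.
    bool CanMerge(const Vertex& a, const Vertex& b) {
        return std::memcmp(&a, &b, sizeof(Vertex)) == 0;
    }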
Quote:Original post by Buckeye
Quote:I think the material itself should be separate from the mesh.

Maybe I don't understand what your intent is or what you mean by "mesh." Do you mean in the same file, or stored with the mesh data itself?

It appears you mean storing texture information in a separate file and specifying only a reference to a texture in the "main" file which would be "dereferenced" to load the texture. Is that correct?

Correct. In-code you will have a Mesh and you will have a Material, and the mesh can store a pointer to the Material it will use (and, by default, the pointer will point to the "default" material specified within the mesh file).
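A minimal sketch of that arrangement, with hypothetical types (nothing here is part of the file spec itself):

    #include <string>

    struct Material {
        std::string textureName;   // resolved to an actual texture at load time
        float diffuse[4];
        float emissive[4];
    };

    struct Mesh {
        // ... geometry loaded from the mesh file ...
        Material* material;   // points at the file's default material unless
                              // the application swaps in another one
    };

Swapping skins at runtime is then just a pointer assignment: mesh.material = &otherMaterial;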

Quote:
If so, that could result in a lot of headaches. If I change tex coords in my model using one texture, it will not render correctly if a different texture is used.

Correct. You will need to change the texture in the material script.
Quote:If that's not the intent, why not specify the texture directly?

What happens if you want to change the texture on a certain mesh? Are you going to recompile the whole mesh file? What if you want the same model, but simply with 2 different skins? Putting the material info directly into the mesh file is extremely limiting.
Quote:Are you going to recompile the whole mesh file?
Just a clarification, they're not compiled, they're loaded. If you mean "regenerate," the answer is yes.
Quote:What if you want the same model, but simply with 2 different skins?
Put 2 different materials in the material array and, when rendering one skin or another, change the material. Because they're in the same file, I would have some assurance that both textures would properly skin the mesh (i.e., the tex coords are compatible).

If you want to be able to extend the number of skins by changing the "texture reference" file, how do you assure the new texture will properly skin the mesh?

I realize you can easily modify a texture with some assurance that it will properly skin the model, then add that texture reference to the "texture reference" file. So, instead of regenerating the mesh file, you regenerate the "texture reference" file. A matter of milliseconds either way, so I'm not sure I see the advantage.

How does the model "know" how many skins are available? It could count the number of entries in the "texture reference" file, but it could also ask for the count of materials available and avoid an extra file manipulation.


Quote:Original post by Buckeye
Quote:Are you going to recompile the whole mesh file?
Just a clarification, they're not compiled, they're loaded. If you mean "regenerate," the answer is yes.

compile: (v) To put together; to assemble; to make by gathering things from various sources [wink]

Quote:Put 2 different materials in the material array and, when rendering one skin or another, change the material. Because they're in the same file, I would have some assurance that both textures would properly skin the mesh (i.e., the tex coords are compatible).

What I'm saying is that it's not dynamic that way. What if you want to make a certain object (only a specific instance) glow by giving it an emissive value greater than 0? Etc.

Quote:
If you want to be able to extend the number of skins by changing the "texture reference" file, how do you assure the new texture will properly skin the mesh?

The texture comes from the modeling application you used to UV map the mesh. Of course it's going to properly skin the mesh.

Quote:
So, instead of regenerating the mesh file, you regenerate the "texture reference" file. A matter of milliseconds either way, so I'm not sure I see the advantage.

It's much easier to edit a small line in a human-editable text file than to recompile a binary mesh file.

Quote:
How does the model "know" how many skins are available? It could count the number of entries in the "texture reference" file, but it could also ask for the count of materials available and avoid an extra file manipulation.

It doesn't. The skins are up to the application, not the mesh. The mesh would be the car, the material would be the paint job for that specific instance of the car. You can have a yellow Lamborghini, and you can have a black Lamborghini with orange flames on the side.
You make some good points with respect to flexibility. Guess it's a matter of preference. Having to keep track of multiple files with no guarantee they're compatible could lead to more problems than the flexibility warrants.
Quote:The texture comes from the modeling application you used to UV map the mesh.
If I change the UVs, the mesh file has to be revised anyway. I thought maybe you were suggesting being able to modify textures without changing the mesh UVs. That has some merit.


I think storing the texture IDs with the mesh is a good idea, as they are fairly specific to the mesh; however, I would keep the material data separate. Perhaps store textures with a particular type identifier: Diffuse, Normal, AO, Spec, to name a few.

Keep materials as a separate file that holds shader parameters, shader IDs, texture channels, and any other artist-specific data. This way the material can map whatever textures it needs, by type, to the textures stored within the mesh.
Quote:Original post by Buckeye
I don't have tiny4_anim.x. But, if it contains a MeshMaterialList, that's the table that specifies which vertices (or faces? subsets? I can't remember) each material applies to.


Looking at tiny4_anim.x (which is fortunately an ASCII file), it has two MeshMaterialLists. The first has 8 faces and 1 material. The second has 6841 faces and 1 material (this one contains the texture).

At runtime, when I examine the mesh contents with the debugger, I see that both materials are loaded, but there is only one subset, and it contains 6841 faces. It looks like the 8 faces from the first MeshMaterialList aren't loaded, but the material is.

Any idea why this is happening?

EDIT: Nevermind. The problem is elsewhere, but that's a topic for a separate thread.


On a side note, I'm only trying to write a converter for .X files because I thought it would be easier than writing an exporter. Now I'm not so sure...

[Edited by - Gage64 on August 22, 2009 10:32:10 AM]
Quote:Original post by Buckeye
The first bytes in a data file should specify the format. For consistency, use four unsigned character bytes and use ASCII. Format-specific information like version numbers, etc., can come after that. That supports a more universal and quick check that the file being loaded is in the expected format.


Good point, I will add that in.
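A minimal sketch of such a check, assuming C stdio loading code ("SAM1" is a placeholder signature, not a settled choice from this thread):

    #include <cstdio>
    #include <cstring>

    bool CheckSignature(std::FILE* f) {
        char magic[4];
        if (std::fread(magic, 1, 4, f) != 4)
            return false;                           // file too short
        return std::memcmp(magic, "SAM1", 4) == 0;  // reject unknown formats early
    }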

Quote:If this is to support cross-platform use, you should define whether it uses big-endian or little-endian storage, either as part of the spec or as a flag in the file itself.


I was going to leave that for later, but you're right. Do you think it should be big-endian or little-endian?

Quote:Also, if cross-platform is intended, you need to define all data types in terms of number of bytes. E.g., you didn't define "ushort."


Agreed, I'll correct this.
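One way to pin the sizes down in C++ loading code is to name the file's types via <cstdint>, so "ushort" and friends are unambiguous on every platform:

    #include <cstdint>

    typedef uint16_t file_u16;   // e.g. 16-bit indices
    typedef uint32_t file_u32;   // e.g. counts, 32-bit indices
    typedef int32_t  file_i32;   // signed counts, if the spec keeps them signed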

Quote:Probably not important: You should define whether wide characters are supported, or just make the flat statement that all characters are ASCII.


OK, I think I'll stick to just ASCII at this point.

Quote:Is there a reason to specify all counts as signed integers? In particular, if you specify "signed" for all indices, then you can't take full advantage of 32-bit index buffers.


By "counts" I meant the number of indices (and verices, materials, etc.) not the indices themselves. I decided on signed ints because that is the type used most often by most programmers, so I figured I would use it unless I have a really good reason not to.

By the way, I was going to use 16-bit indices, but you're right that using 32-bit would be more flexible. I think I'll also add a flag indicating if the indices are 16-bit or not.
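A rough sketch of how a loader might honor that flag, widening 16-bit indices to 32 bits in memory so the rest of the code only ever sees one index type (function and parameter names are mine):

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    std::vector<uint32_t> ReadIndices(std::FILE* f, uint32_t numIndices,
                                      bool indicesAre16Bit) {
        std::vector<uint32_t> indices(numIndices);
        if (indicesAre16Bit) {
            for (uint32_t i = 0; i < numIndices; ++i) {
                uint16_t idx = 0;
                std::fread(&idx, sizeof(idx), 1, f);
                indices[i] = idx;   // widen on load
            }
        } else {
            std::fread(indices.data(), sizeof(uint32_t), numIndices, f);
        }
        return indices;
    }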

Quote:You need to specify whether the model is in a right-hand or left-hand system. There have been no end of problems with mesh file formats in this regard. SMD and several other formats, for instance, are right-handed. X-files are left-handed (one of the few).

Right-hand and left-hand would apply to how matrices are to be handled, also.


Good point. Since I'm using D3D (and I think the way the data is arranged is more D3D friendly than OpenGL) I'll use a left-handed system and the D3D matrix layout.

Quote:for each vertex
If the intent is to support static meshes, or for quick testing, you should add a color to each vertex.


OK.

Quote:By specifying "ubyte" for the bone indices, you're restricting the influences to the first 256 bones. That's probably sufficient[ATTENTION] [SMILE] but for a consistent format, the bone index data-type should match the "num bones" data-type.


I'm not sure why that would be more consistent, but are you saying I should change the type of "num bones" or of the bone indices themselves?

Quote:It appears that you're limiting the max bone influences per vertex to 4. That's probably sufficient but not flexible. See next comment.


I've just always heard that 4 is enough in practice, but I guess it wouldn't hurt to change this.
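For reference, the fixed-four vertex layout under discussion looks roughly like this (field names are mine; the per-vertex count is the one being debated below):

    #include <cstdint>

    struct SkinnedVertex {
        float   pos[3];
        float   normal[3];
        float   uv[2];
        uint8_t numInfluences;   // how many of the 4 slots are in use
        uint8_t boneIndices[4];  // unused slots hold 0
        float   weights[4];      // unused slots hold 0.0f
    };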

Quote:You need something to indicate the number of bone influences for a vertex, even if you require the storage space for unused influences. Otherwise, it will require unnecessary testing or looping. If you intend that bone weights will be 0 to flag "non-influences," then either the loading code will have to count the number of bone weights > 0 to determine the number of bone influences, or the animation loop will have to access bone matrices it's not going to use and do a lot of multiplications by 0.


But isn't that exactly what "num bone influences" is for?

Quote:num indices
See comment above about matching data-types. "num indices" is signed int, actual indices are ushort. "num indices" and actual indices should probably be unsigned 32-bit if you want to support 32-bit index buffers.


About the indices, I agree, but I'm not sure why it's important that "num indices" be unsigned as well?

Quote:num parts
Are these subsets of the mesh? If so:

If you intend that the format supports indexed meshes, then you shouldn't specify start index and count for vertices and indices. This forces duplicate vertices which might be a good thing to avoid. Many vertices will be shared between subsets in an indexed mesh.

A subset need only be defined by vertex indices and a material. Those indices will not be in serial order.

Maybe:
for each part: number of indices, list of indices


OK, I'll remove the offset and count for the vertices. I just used them since they are specified in D3DXATTRIBUTERANGE, but I guess there's no real need for them.

By the way, should I store all indices in one big list and specify an offset and a count for each part, or should each part contain a list of its indices?
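One point in favor of a single shared index list with a per-part offset and count: each part then maps directly onto one draw call. A rough sketch, with hypothetical struct names:

    #include <cstdint>

    struct MeshPart {
        uint32_t startIndex;   // offset into the shared index buffer
        uint32_t indexCount;   // multiple of 3 for triangle lists
        uint32_t materialId;
    };

    // Rendering a part is then a single range draw:
    // device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVertices,
    //                              part.startIndex, part.indexCount / 3);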

Quote:num bone influences
Why not just "0" or "1" as a flag? The number of influences is per-vertex anyway.


Are you saying that each vertex should be able to have a different number of bone influences? Wouldn't that make vertex blending much slower? Actually, wouldn't it require VS 3.0 to implement?

Quote:num palette entries and num bones
Any reason to have these separate?


Yes - the former is sometimes smaller than the latter. At least it has been for all the models I tried. I guess the reason for this is that not all bones are indexed by vertices. They only exist to serve as parents for other bones or to allow objects (like a weapon) to be attached to the model.
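In other words, the shader palette only needs entries for bones that vertices actually index, plus a small mapping table. A sketch of that runtime relationship (all names are mine, and Bone is a stand-in for whatever runtime bone type you use):

    #include <cstdint>
    #include <d3dx9.h>
    #include <vector>

    struct Bone { D3DXMATRIX combined; /* ... */ };  // hypothetical runtime bone

    std::vector<Bone> bones;                 // size == num bones (all of them)
    std::vector<uint32_t> paletteToBone;     // size == num palette entries
    std::vector<D3DXMATRIX> offsets;         // inverse bind-pose, palette order

    // Filling shader constants touches only the palette entries:
    void FillPalette(D3DXMATRIX* shaderMatrices) {
        for (size_t i = 0; i < paletteToBone.size(); ++i)
            shaderMatrices[i] = offsets[i] * bones[paletteToBone[i]].combined;
    }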

Quote:If they're separate, it'll require the loading code to do string searches to match up offsets with locals, etc. In general, the format should support minimizing loading code burden. The file is created once but it may be loaded many times.


Yes, but if my previous comment is correct, this is unavoidable.

Quote:How about adding a "parent" index for each bone? That will allow more flexibility in the loading and animation code. I.e., if I want to determine the combined matrix of a bone, I would otherwise have to search all bones to find the bone that has that bone as a child. That, also, relieves some of the burden from the application.


But each bone always stores its associated combined transform (in the runtime structures, not in the file), and it's always up to date because you recursively update the bone hierarchy before using any of the bones. Unless having a parent index can be useful for something else?
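One thing a parent index does buy you: if bones are stored so that a parent always precedes its children, the combined transforms can be computed in one flat loop instead of a recursive tree walk. A sketch under that ordering assumption (types are mine):

    #include <d3dx9.h>
    #include <vector>

    struct Bone {
        int        parent;     // -1 for the root
        D3DXMATRIX local;      // animated local transform
        D3DXMATRIX combined;   // local * parent's combined
    };

    void UpdateHierarchy(std::vector<Bone>& bones) {
        for (size_t i = 0; i < bones.size(); ++i) {
            if (bones[i].parent < 0)
                bones[i].combined = bones[i].local;
            else  // parent is already up to date because it comes earlier
                bones[i].combined = bones[i].local * bones[bones[i].parent].combined;
        }
    }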

Quote:animation frames
You separate all the quats from the vectors, etc. It's true that supports a minimization of storage but requires bookkeeping on the part of the loading app to keep track of 3 different lerps that may or may not have the same time key. Putting all the information for each time-step in one structure requires more storage but simplifies the animation code.


That's what I wanted to do. But looking at the keyframe data for some of the models I used, some of the quat keyframes had a different timestamp than the vector keyframes, so I can't really collapse them into one structure with just one timestamp.
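So each channel stays its own track with its own time keys, sampled independently. A sketch of sampling one vector track (quaternion tracks would slerp instead; keys are assumed sorted by time and non-empty):

    #include <d3dx9.h>
    #include <vector>

    struct VecKey { float time; D3DXVECTOR3 value; };

    D3DXVECTOR3 SampleTrack(const std::vector<VecKey>& keys, float t) {
        if (t <= keys.front().time) return keys.front().value;
        if (t >= keys.back().time)  return keys.back().value;
        size_t i = 1;
        while (keys[i].time < t) ++i;   // find the bracketing key pair
        float s = (t - keys[i - 1].time) / (keys[i].time - keys[i - 1].time);
        return keys[i - 1].value + s * (keys[i].value - keys[i - 1].value);
    }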

Quote:
Quote:Should there be an offset table specifying the offset to the vertices/indices/etc. from the beginning of the file?
I wouldn't think that would be useful. It'll just put more burden on the loading code and allow additional errors.


It would be handy if you wanted to load just the skeleton data or just the animation data.

Quote:
Quote:Should the data be stored in several files?
You might want to consider having an "animation-only" flag for the file so animations could be stored and loaded separately. Animations can be applied to different meshes (provided the meshes were developed with the same bone hierarchy, which is common).


That's why I wanted to use the offset table. I'm not sure how you would do this using just a flag.



BTW, I also need to come up with an extension for the file name. I thought about SAM (Simple Animated Model), but I'd love to hear some other suggestions.

Also, I'd love to hear more opinions about what material properties I should store, and if I should store them in a separate file or not.
First: I still suggest you support a text format as well as binary. That can be specified in the header info, after the file signature. Having a text format allows for loading values without worrying about endianness. I.e., when reading "1.234" as text, the machine will store it properly. In binary, when loading 0x47ff0876, the endianness makes a difference.

Text format will make debugging much easier, both for creating and loading the file.

So, speaking of little- vs. big-endian: I'd say specify little-endian as that is the Intel format and machines using DirectX are more likely to be Intel.
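With the file fixed as little-endian, a big-endian host just swaps on read and everyone else does nothing. A minimal sketch (BIG_ENDIAN_HOST is a hypothetical build flag, not a standard macro):

    #include <cstdint>

    uint32_t SwapIfBigEndian(uint32_t v) {
    #if defined(BIG_ENDIAN_HOST)
        return (v >> 24) | ((v >> 8) & 0xFF00u) |
               ((v << 8) & 0xFF0000u) | (v << 24);
    #else
        return v;   // little-endian host: the file's layout matches memory
    #endif
    }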

Number of bone influences
The x-file format does specify a maximum number of skinweights per-vertex and per-face in the XSkinMeshHeader, but it doesn't limit the number (that I know of). The number of influences for a vertex is determined by the SkinWeights blocks (if I remember correctly), each of which specifies which vertices are influenced by a particular bone. A vertex index can appear in SkinWeights lists up to the max number of skinweights per-vertex.

I assumed the constant you specify as "num bone influences" was intended as a maximum. If so, maybe you should rename it "max num bone influences" and put it before the vertex data.

So, any particular animated vertex can have any number of bone influences up to the max. And it's probably easiest to create storage for the max number of influences with each vertex (as you have).

I'm just saying you're limiting that max number to 4. If you run across an x-file that has max 5 influences per-vertex, you'll have to abort.

Maybe something like:
max num bone influences (int)
for each vertex
   (pos, normal, tex coords, color)
   num-influences
   weights[max-bone-influences-1] (float)
   indices[max-num-influences] (same units as the bone indices)

That would allow calculation of the vertex matrices with better efficiency.

I used a fixed number of bone influences (VS 2) but I don't see why the num-influences couldn't be a vertex attribute, even in VS2. Having num-influences as a vertex attribute would allow VS > 2 more flexibility and a lot of people have VS > 2 now.

num palette and num bones
It's true that the number of bones that influence vertices is probably less than the total number of bones. But, during loading, a frame will be created for every bone in the palette. That frame will have storage for the local transform, child and sibling indices, whether they're used or not (I would think).

Then the loader will have to look up each frame when it's encountered in the "num bones" loop. Extra time and potential for errors. Why not just load an entire bone frame at once, even if some of the entries are empty? As far as filesize is concerned, a single frame may be smaller due to the need to specify the bone name in the "num bones" section. Either way, it's a matter of just a few bytes and a single frame simplifies loading.

animation only files
With regard to offsets: probably doesn't make a big difference, I guess.

If a text format is supported, offsets can't be used. However, animation-only could be flagged by finding "num vertices" = 0, perhaps.

separation of quats and vectors in animations
Hadn't run across files with separate keyframes. That pretty well answers that, unless you want to include your own lerping routines in your conversion utility?? [SMILE] (NOT)


Quote:Original post by Buckeye
First: I still suggest you support a text format as well as binary. That can be specified in the header info, after the file signature. Having a text format allows for loading values without worrying about endianness. I.e., when reading "1.234" as text, the machine will store it properly. In binary, when loading 0x47ff0876, the endianness makes a difference.

Text format will make debugging much easier, both for creating and loading the file.


I agree, I just want to finish the file spec and converter before thinking about how exactly to structure the text format.

Quote:So, speaking of little- vs. big-endian: I'd say specify little-endian as that is the Intel format and machines using DirectX are more likely to be Intel.


Sounds good.

Quote:Number of bone influences
The x-file format does specify a maximum number of skinweights per-vertex and per-face in the XSkinMeshHeader, but it doesn't limit the number (that I know of). The number of influences for a vertex is determined by the SkinWeights blocks (if I remember correctly), each of which specifies which vertices are influenced by a particular bone. A vertex index can appear in SkinWeights lists up to the max number of skinweights per-vertex.

I assumed the constant you specify as "num bone influences" was intended as a maximum. If so, maybe you should rename it "max num bone influences" and put it before the vertex data.

So, any particular animated vertex can have any number of bone influences up to the max. And it's probably easiest to create storage for the max number of influences with each vertex (as you have).

I'm just saying you're limiting that max number to 4. If you run across an x-file that has max 5 influences per-vertex, you'll have to abort.

Maybe something like:
max num bone influences (int)
for each vertex
   (pos, normal, tex coords, color)
   num-influences
   weights[max-bone-influences-1] (float)
   indices[max-num-influences] (same units as the bone indices)

That would allow calculation of the vertex matrices with better efficiency.

I used a fixed number of bone influences (VS 2) but I don't see why the num-influences couldn't be a vertex attribute, even in VS2. Having num-influences as a vertex attribute would allow VS > 2 more flexibility and a lot of people have VS > 2 now.


Well, I would still like this to be usable with VS 2.0. But I guess I can do both: have a "max num bone influences" and have a "num influences" field per vertex, and anyone using VS 2.0 can just ignore that field.
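On the CPU side (or in a shader profile that can loop on a per-vertex value), the count simply bounds the blend loop; a VS 2.0 path can ignore it and always run the fixed maximum, since zero weights contribute nothing. A sketch using the SkinnedVertex layout sketched earlier in the thread:

    #include <d3dx9.h>

    D3DXVECTOR3 BlendPosition(const SkinnedVertex& v, const D3DXMATRIX* palette) {
        D3DXVECTOR3 out(0.0f, 0.0f, 0.0f), tmp;
        const D3DXVECTOR3* pos = reinterpret_cast<const D3DXVECTOR3*>(v.pos);
        for (int i = 0; i < v.numInfluences; ++i) {   // or a fixed 4
            D3DXVec3TransformCoord(&tmp, pos, &palette[v.boneIndices[i]]);
            out += tmp * v.weights[i];                // weight the bone's result
        }
        return out;
    }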

Quote:num palette and num bones
It's true that the number of bones that influence vertices is probably less than the total number of bones. But, during loading, a frame will be created for every bone in the palette. That frame will have storage for the local transform, child and sibling indices, whether they're used or not (I would think).

Then the loader will have to look up each frame when it's encountered in the "num bones" loop. Extra time and potential for errors. Why not just load an entire bone frame at once, even if some of the entries are empty? As far as filesize is concerned, a single frame may be smaller due to the need to specify the bone name in the "num bones" section. Either way, it's a matter of just a few bytes and a single frame simplifies loading.


I'm not sure what you mean. What is "the entire bone frame"? Is it the bone hierarchy tree? And what do you mean by "empty entries"?

Quote:separation of quats and vectors in animations
Hadn't run across files with separate keyframes. That pretty well answers that, unless you want to include your own lerping routines in your conversion utility?? [SMILE] (NOT)


Sorry but again, I'm not sure what you mean. If I have a quat keyframe and a vector keyframe that have different timestamps, I can't possibly collapse them to a single structure. Also, I'm not even sure if the number of quat keyframes and vector keyframes is always the same (I don't remember if I checked that for the models I tried).
