cskelton

[SOLVED]Bone weights as unsigned bytes won't work


I've written a Max exporter and a corresponding model loader in my engine for animated models. In an effort to save some GPU memory, I tried sending the bone weights as UBYTE4N and the bone indices as UBYTE4. Now I've seen this work in several people's engines, and I know it's common practice, but for some reason in my engine the bone indices work just fine, while the bone weights give me errors and crash the program. Any ideas?

Here are the errors that D3D dumps when I try to use bone weights as unsigned bytes:

Direct3D9: Decl Validator: X290: (Element Error) (Decl Element [3]) Declaration can't map to fixed function FVF because the type for this element is unsupported.
Direct3D9: Decl Validator: X249: (Element Error) (Decl Element [3]) Declaration can't map to fixed function FVF because blendweight must use D3DDECLTYPE_FLOAT1/2/3/4.

Thanks in advance,
c

[Edited by - cskelton on November 6, 2007 5:11:59 PM]
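Edit: to be concrete, the kind of declaration I mean looks roughly like this. The offsets and the surrounding elements are just an illustrative sketch, not my exact layout; only the last two elements matter here.

// Illustrative D3D9 vertex declaration for a skinned vertex (assumes d3d9.h).
// Layout assumed: float3 position, float3 normal, float2 UV, then the packed skinning data.
D3DVERTEXELEMENT9 elements[] =
{
    { 0,  0, D3DDECLTYPE_FLOAT3,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION,     0 },
    { 0, 12, D3DDECLTYPE_FLOAT3,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,       0 },
    { 0, 24, D3DDECLTYPE_FLOAT2,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD,     0 },
    { 0, 32, D3DDECLTYPE_UBYTE4N, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDWEIGHT,  0 }, // element [3]: the one the validator rejects
    { 0, 36, D3DDECLTYPE_UBYTE4,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDINDICES, 0 }, // this one works fine
    D3DDECL_END()
};

// 'device' is the engine's IDirect3DDevice9* (illustrative).
IDirect3DVertexDeclaration9* vertex_decl = NULL;
device->CreateVertexDeclaration(elements, &vertex_decl);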

Looks like the fixed function pipeline can only use floats for bone weights. You would have to write your own shader to support bytes for bone weights.
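Roughly speaking (a sketch, not checked against your exporter), the only weight element the fixed-function path will take is one of the float types the validator names:

// Blend-weight element types the fixed-function pipeline accepts:
// D3DDECLTYPE_FLOAT1/2/3/4 only, per the validator message.
// The offset (32 here) is just an example and depends on the rest of the vertex layout.
D3DVERTEXELEMENT9 ffp_blend_weights =
    { 0, 32, D3DDECLTYPE_FLOAT4, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDWEIGHT, 0 };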

Or I think you have to encode it as a color which would be 4 bytes. Have you checked the DXSDK skinning example?


As dgreen02 mentioned, encode the data inside a DWORD (4 weights) and declare it as a D3DCOLOR.

When you write the shader, you'll get four floats nicely in the 0-1 range, which is exactly what you want, so you don't need to do any conversions.
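Something along these lines, as a quick sketch (names and the offset are just examples, not from any particular engine):

// Pack four 0-1 weights into one DWORD, then declare that DWORD as D3DCOLOR (assumes d3d9.h).
// D3DCOLOR_ARGB expects 0-255 integers, so scale the weights first; the shader then
// reads the element back as a float4 in the 0-1 range with no extra work.
DWORD PackWeights(float w0, float w1, float w2, float w3)
{
    return D3DCOLOR_ARGB(
        (BYTE)(w3 * 255.0f + 0.5f),   // alpha -> .w in the shader
        (BYTE)(w0 * 255.0f + 0.5f),   // red   -> .x
        (BYTE)(w1 * 255.0f + 0.5f),   // green -> .y
        (BYTE)(w2 * 255.0f + 0.5f));  // blue  -> .z
}

// The matching element in the vertex declaration (the offset is an example):
D3DVERTEXELEMENT9 packed_weights =
    { 0, 32, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDWEIGHT, 0 };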

Cheers!

It took me the longest time to fix this a few months back. But I guess that had much to do with the fact that I was using a bad X file. Here are some relevant snippets from my code. It may not be the best way, but I couldn't see anything simpler working out.

// This is called right after ConvertToIndexedBlendedMesh:

// Fix UBYTE4 support: re-declare the blend indices as D3DCOLOR
D3DVERTEXELEMENT9 decl[MAX_FVF_DECL_SIZE];
skinned_mesh->GetDeclaration(decl);
{
    int i = 0;
    // D3DDECL_END() terminates the declaration with Stream == 0xFF
    while (decl[i].Stream != 0xFF)
    {
        if (decl[i].Usage == D3DDECLUSAGE_BLENDINDICES)
        {
            decl[i].Type = D3DDECLTYPE_D3DCOLOR;
            break;
        }
        ++i;
    }
}
skinned_mesh->UpdateSemantics(decl);


// And this is how the shader handles the input.
// This is a simple two-bone-influence shader, but the principle stands

VS_OUTPUT VS_SkeletalBlend(float3 pos           : POSITION0,
                           float1 blend_weight  : BLENDWEIGHT0,
                           float4 blend_indices : BLENDINDICES0,
                           float3 normal        : NORMAL0,
                           float2 tex_coord     : TEXCOORD0)
{
    // Unpack the D3DCOLOR back into four integer bone indices
    int4 indices = D3DCOLORtoUBYTE4(blend_indices);

    int ind0 = indices.x;
    int ind1 = indices.y;
    float weight0 = blend_weight;
    float weight1 = 1 - weight0; // two influences, so the second weight is implied
    // ...
}


Admiral

Thanks guys. I'll have to give that a try.

I have my own shader, but of course by the time everything gets to the shader it's been converted to a float, so there's really nothing on that end that needs to change, right (if I can get it working with UBYTE4Ns, that is)? Are you sure it can't be done with UBYTE4N? Like I said in the original post, I've looked at code that declares it that way, and I can't find any perceptible difference between our engines (there are differences, of course, but none that should affect this). What could they possibly be doing that I'm not?

Once again, thanks for the help. I'm going to try to implement that in the meantime, but I'd like to see if I can get it working with UBYTE4N, which would give me the same result with a little less code.
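Actually, one more thing I'm going to double-check on my end (my own guess, not something anyone suggested above) is whether the device even reports UBYTE4N in its declaration type caps, something like:

// Check whether the device supports the UBYTE4N / UBYTE4 vertex element types (assumes d3d9.h).
// 'device' is the engine's IDirect3DDevice9*.
D3DCAPS9 caps;
device->GetDeviceCaps(&caps);

bool has_ubyte4  = (caps.DeclTypes & D3DDTCAPS_UBYTE4)  != 0;
bool has_ubyte4n = (caps.DeclTypes & D3DDTCAPS_UBYTE4N) != 0;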

My problem seems to be that it ONLY likes the weights as floats (not even as D3DCOLOR), and even when it is 4 floats, it works with my "regular" renderer shader (forward rendering, or whatever you want to call it), but when I switch to deferred (which has the exact same VS for calculating the animated verts) it stops working. Any ideas? I'm guessing it's some D3D init setting that I have, but I've looked at several different init functions and compared them to mine, and I can't find anything different.

Thanks in advance,
C

I fixed it. It appears that I was using too many shader constants, and as a result things went screwy. Stupid mistake on my part.
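In case anyone finds this thread later, this is roughly the sanity check I should have done up front (the constant counts are just examples; the real numbers depend on the engine):

// Compare the number of float4 vertex-shader constant registers the device exposes
// against what the skinning shader actually needs (e.g. the bone matrix palette).
D3DCAPS9 caps;
device->GetDeviceCaps(&caps);

const UINT available = caps.MaxVertexShaderConst;        // float4 registers on this device
const UINT bones     = 80;                               // example palette size
const UINT needed    = bones * 3 /* 4x3 matrices */ + 8; // plus transforms, lights, etc.

if (needed > available)
{
    // Too many constants: shrink the bone palette or split the mesh into smaller draws.
}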
