[SOLVED] Bone weights as unsigned bytes won't work

Started by cskelton. 8 comments, last by cskelton 16 years, 5 months ago.
I've written a Max exporter and a corresponding model loader in my engine for animated models. In an effort to save some GPU memory, I tried sending the bone weights as UBYTE4N and the bone indices as UBYTE4. Now I've seen this work in several people's engines, and I know it's common practice, but for some reason in my engine the bone indices work just fine, while the bone weights give me errors and crash the program. Any ideas?

Here are the errors that D3D dumps when I try to use bone weights as unsigned bytes:

Direct3D9: Decl Validator: X290: (Element Error) (Decl Element [3]) Declaration can't map to fixed function FVF because the type for this element is unsupported.
Direct3D9: Decl Validator: X249: (Element Error) (Decl Element [3]) Declaration can't map to fixed function FVF because blendweight must use D3DDECLTYPE_FLOAT1/2/3/4.

Thanks in advance,
c

[Edited by - cskelton on November 6, 2007 5:11:59 PM]
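For context, the declaration being described would look something like this (a minimal sketch; the offsets, the single stream, and the `device` pointer are assumptions rather than anything from the post). The blend weights end up as element [3], which matches the element index in the validator errors:

#include <d3d9.h>

// Sketch of a skinned-vertex declaration with byte-packed weights/indices.
// Offsets assume position, normal and texcoord are laid out first.
D3DVERTEXELEMENT9 decl[] =
{
    { 0,  0, D3DDECLTYPE_FLOAT3,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION,     0 },
    { 0, 12, D3DDECLTYPE_FLOAT3,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,       0 },
    { 0, 24, D3DDECLTYPE_FLOAT2,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD,     0 },
    { 0, 32, D3DDECLTYPE_UBYTE4N, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDWEIGHT,  0 }, // element [3]: the one the validator rejects
    { 0, 36, D3DDECLTYPE_UBYTE4,  D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDINDICES, 0 },
    D3DDECL_END()
};

IDirect3DVertexDeclaration9* vertex_decl = NULL;
device->CreateVertexDeclaration(decl, &vertex_decl); // 'device' is an assumed IDirect3DDevice9*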
Looks like the fixed function pipeline can only use floats for bone weights. You would have to write your own shader to support bytes for bone weights.
Steve 'Sly' Williams  Monkey Wrangler  Krome Studios
turbo game development with Borland compilers
Or I think you have to encode it as a color, which would be 4 bytes. Have you checked the DX SDK skinning example?

As dgreen02 mentioned, encode the data inside a DWORD (four weights) and declare it as D3DCOLOR.

When you write the shader, you'll get four floats nicely in the 0-1 range, which is exactly what you want, and you don't need to do any conversions.
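A minimal sketch of what that packing might look like (the helper name and the element offset are made up for illustration). D3DCOLOR is ARGB-packed, and D3DDECLTYPE_D3DCOLOR expands to (R, G, B, A) in the shader, already normalized to the 0-1 range:

#include <d3d9.h>

// Hypothetical helper: pack four 0-255 byte weights into a D3DCOLOR dword.
// The shader then receives float4(w0, w1, w2, w3) / 255 in BLENDWEIGHT0.
DWORD PackWeights(BYTE w0, BYTE w1, BYTE w2, BYTE w3)
{
    return D3DCOLOR_ARGB(w3, w0, w1, w2);
}

// Matching declaration element (offset is hypothetical):
// { 0, 32, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BLENDWEIGHT, 0 }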

Cheers!
It took me the longest time to fix this a few months back. But I guess that had much to do with the fact that I was using a bad X file. Here are some relevant snippets from my code. It may not be the best way, but I couldn't see anything simpler working out.

// This is called right after ConvertToIndexedBlendedMesh:
// Fix UBYTE4 support
D3DVERTEXELEMENT9 decl[MAX_FVF_DECL_SIZE];
skinned_mesh->GetDeclaration(decl);
{
	int i = 0;
	while (decl[i].Stream != 0xFF)   // 0xFF stream marks D3DDECL_END()
	{
		if (decl[i].Usage == D3DDECLUSAGE_BLENDINDICES)
		{
			decl[i].Type = D3DDECLTYPE_D3DCOLOR;
			break;
		}
		++i;
	}
}
skinned_mesh->UpdateSemantics(decl);


// And this is how the shader handles the input.
// This is a simple two-bone-influence shader, but the principle stands
VS_OUTPUT VS_SkeletalBlend(float3 pos           : POSITION0,
                           float1 blend_weight  : BLENDWEIGHT0,
                           float4 blend_indices : BLENDINDICES0,
                           float3 normal        : NORMAL0,
                           float2 tex_coord     : TEXCOORD0)
{
    int4 indices = D3DCOLORtoUBYTE4(blend_indices);

    int ind0 = indices.x;
    int ind1 = indices.y;

    float weight0 = blend_weight;
    float weight1 = 1 - weight0;

    // ...
}


Admiral
Ring3 Circus - Diary of a programmer, journal of a hacker.
Thanks guys. I'll have to give that a try.

I have my own shader, but of course by the time everything gets to the shader it's been converted to a float, so there's really nothing on that end that needs to change, right (if I can get it working with UBYTE4Ns, that is)? Are you sure it can't be done with UBYTE4N? Like I said in the original post, I've looked at code that declares it that way, and I can't find any perceptible difference between our engines (there are differences, of course, but none that should affect this). What could they possibly be doing that I'm not?

Once again, thanks for the help. I'm going to try that in the meantime, but I'd still like to see if I can get it working with UBYTE4N, which would give me the same result with a little less code.
Oh, and I am using vertex declarations instead of FVF, but it seems to want to map to FVF under the hood anyway.
My problem seems to be that it ONLY likes the weights as floats (not even as D3DCOLOR), and even when it is 4 floats, it works with my "regular" renderer shader (forward rendering, or whatever you want to call it), but when I switch to deferred (which has the exact same VS for calculating the animated verts) it stops working. Any ideas? I'm guessing it's some D3D init setting that I have, but I've looked at several different init functions, compared them to mine, and I can't find anything different.

Thanks in advance,
C
I fixed it. It appears that I was using too many shader constants, and as a result things went screwy. Stupid mistake on my part.
I'm now using UBYTE4 for BLENDINDICES and UBYTE4N for BLENDWEIGHT in my animation.
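For anyone hitting the same wall, it may be worth checking a couple of device caps up front: whether the hardware supports the UBYTE4/UBYTE4N declaration types at all, and how many vertex shader constant registers are available, since blowing past that limit was the actual culprit here. A minimal sketch (the `device` pointer is assumed):

#include <d3d9.h>

D3DCAPS9 caps;
device->GetDeviceCaps(&caps); // 'device' is an assumed IDirect3DDevice9*

// Declaration-type support for byte-packed indices/weights.
bool ubyte4_ok  = (caps.DeclTypes & D3DDTCAPS_UBYTE4)  != 0;
bool ubyte4n_ok = (caps.DeclTypes & D3DDTCAPS_UBYTE4N) != 0;

// Upper bound on vertex shader constant registers (e.g. 256 for vs_2_0).
DWORD max_vs_constants = caps.MaxVertexShaderConst;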

This topic is closed to new replies.
