Kinju

Need help with Vertex Blending


Hi all, I would like to test vertex blending, without shaders and without indexed matrices. My vertex structure is:

    struct Vertex
    {
        float x, y, z;     // D3DDECLUSAGE_POSITION / D3DDECLTYPE_FLOAT3
        float weight[3];   // D3DDECLUSAGE_BLENDWEIGHT / D3DDECLTYPE_FLOAT3
        DWORD color;       // D3DDECLUSAGE_COLOR / D3DDECLTYPE_D3DCOLOR
    };

I enable vertex blending like this:

    g_pd3dDevice->SetRenderState( D3DRS_INDEXEDVERTEXBLENDENABLE, FALSE );
    g_pd3dDevice->SetRenderState( D3DRS_VERTEXBLEND, D3DVBF_2WEIGHTS );

To begin, I set all vertex weights to (1.0f, 0.0f, 0.0f). Since the weights select between D3DTS_WORLD, D3DTS_WORLD1 and D3DTS_WORLD2, I presume that setting the first weight to 1.0f and the others to 0.0f is the same as disabling vertex blending. Is that true? To be safe, I also set the D3DTS_WORLD1 and D3DTS_WORLD2 matrices to identity. Still, my geometry is not well-formed :'(

What is incorrect? Thanks a lot.
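For reference, a minimal sketch of the fixed-function setup described above, assuming d3d9.h/d3dx9.h and an already-initialized IDirect3DDevice9* named g_pd3dDevice. With D3DVBF_2WEIGHTS the runtime reads two weights per vertex and derives the third as 1 - w0 - w1, so weights of (1.0f, 0.0f, 0.0f) should indeed leave only D3DTS_WORLD contributing:

    D3DXMATRIX identity;
    D3DXMatrixIdentity( &identity );

    // Non-indexed, weighted blending between up to three world matrices.
    g_pd3dDevice->SetRenderState( D3DRS_INDEXEDVERTEXBLENDENABLE, FALSE );
    g_pd3dDevice->SetRenderState( D3DRS_VERTEXBLEND, D3DVBF_2WEIGHTS );

    // D3DTS_WORLDMATRIX(0) is the same slot as D3DTS_WORLD.
    g_pd3dDevice->SetTransform( D3DTS_WORLDMATRIX(0), &identity );
    g_pd3dDevice->SetTransform( D3DTS_WORLDMATRIX(1), &identity );
    g_pd3dDevice->SetTransform( D3DTS_WORLDMATRIX(2), &identity );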

Quote:
I enable vertex blending like this:
g_pd3dDevice->SetRenderState( D3DRS_INDEXEDVERTEXBLENDENABLE, FALSE);


If you enable vertex blending, shouldn't that be TRUE?

No, because indexed vertex blending is not yet supported by graphics cards; it has been replaced by shaders.

In fact, I solved my problem: it was the weight loader that was incorrect. I think I still have a problem with setting the D3DTS_WORLD1 and D3DTS_WORLD2 matrices.

My D3DTS_VIEW matrix is:
1 0 0 0
0 1 0 0
0 0 1 -40
0 0 0 1

My first bone matrix is:
1 0 0 0
0 1 0 0
0 0 1 10
0 0 0 1

My second bone matrix is:
1 0 0 0
0 1 0 0
0 0 1 20
0 0 0 1

When I set world matrix 0, I invert my matrix.
For world matrices 1 and 2, I suppose I have to invert those matrices too, but I don't know whether I also have to multiply by the "previous" matrix.

Ex:
D3DTS_WORLD1 = inverse(inverse(D3DTS_WORLD0) * Bone1.GetMatrix())
D3DTS_WORLD2 = inverse(inverse(D3DTS_WORLD1) * Bone2.GetMatrix())

Something like this. I will try tonight.
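For what it's worth, the usual pattern for fixed-function blending is not to chain inverses of the previous world slots: each world slot gets its own bone's inverse bind-pose ("offset") matrix multiplied by that bone's current combined transform. A sketch under that assumption follows; the BoneInfo fields are hypothetical names, not from the post above, and the multiplication order follows the D3DX row-vector convention:

    // Hypothetical per-bone data for illustration.
    struct BoneInfo
    {
        D3DXMATRIX offset;    // inverse of the bone's bind-pose matrix
        D3DXMATRIX combined;  // current transform, already chained through the parents
    };

    void SetBlendMatrices( IDirect3DDevice9* device, const BoneInfo* bones, DWORD count )
    {
        for( DWORD i = 0; i < count; ++i )
        {
            // Row vectors: v_skinned = v_bind * offset * combined.
            D3DXMATRIX blend = bones[i].offset * bones[i].combined;
            device->SetTransform( D3DTS_WORLDMATRIX( i ), &blend );
        }
    }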

Quote:
I think I have still a problem with setting the D3DTS_WORLD1 and D3DTS_WORLD2 matrix.


As far as I know, when using shaders, calls like SetTransform() are ignored, so you'll have to pass the matrices as shader constants.
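If you do move to shaders later, that would look something like the sketch below; the register index and matrix names are assumptions for illustration. Note that HLSL expects column-major constants by default, hence the transpose:

    // Hypothetical: upload a world-view-projection matrix into vertex shader
    // constant registers c0-c3 (register 0, four float4 vectors).
    D3DXMATRIX wvp = world * view * proj;
    D3DXMatrixTranspose( &wvp, &wvp );
    g_pd3dDevice->SetVertexShaderConstantF( 0, (const float*)&wvp, 4 );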
