_Plotinus

Normals & World Scaling Transform



I've got a Blender-created mesh exported to a .x file, and code that loads it in and displays it. I've got a directional light facing it and specular lighting enabled. Everything works fine until I try using a scaling world transform, at which point the normals go crazy.

(A) This works fine:

  D3DXMatrixIdentity(mat);
  d3dDevice.SetTransform(D3DTS_WORLD, mat);

(B) This scales the mesh to 1/2 size as expected, but the normals get screwed up, and a large part of the mesh gets the specular lighting:

  D3DXMatrixScaling(mat, 0.5, 0.5, 0.5);
  d3dDevice.SetTransform(D3DTS_WORLD, mat);

(C) Oddly, setting _44 (4th row, 4th column) to 2 with the rest at identity actually works: the mesh gets scaled to 1/2 size, and the normals are still fine:

  D3DXMatrixIdentity(mat);
  mat._44 := 2.0;
  d3dDevice.SetTransform(D3DTS_WORLD, mat);

However, while (C) works, I'm loath to just use it, because (1) I've never seen an explanation of what column 4 of the matrix actually is, nor have I ever seen an example where anyone other than me has set _44 to any value other than 1 or 0; and (2) by everything I've read and seen in examples, (B) should work without screwing up the normals, so its failure to do so makes me think I've got something badly wrong somewhere.

I've tried this, with the same results, both without setting the view and projection transforms at all, and with setting them as follows:

  vEyePt := D3DXVector3(0.0, 0.0, -5.0);
  vLookatPt := D3DXVector3Zero;
  vUpVec := D3DXVector3(0.0, 1.0, 0.0);
  D3DXMatrixLookAtLH(matView, vEyePt, vLookatPt, vUpVec);
  d3dDevice.SetTransform(D3DTS_VIEW, matView);

  D3DXMatrixPerspectiveFovLH(matProj, D3DX_PI/4, 1.0, 1.0, 100.0);
  d3dDevice.SetTransform(D3DTS_PROJECTION, matProj);

I've hand-checked a few of the normals in the .x file to make sure they were unit length, using SQRT(x^2 + y^2 + z^2), and they all came to 1 (or at least 0.999998, etc.).

I've been scouring Google and doing trial-and-error coding for about 18 hours over the past two days on this, and I'm still where I started (except for discovering (C) by just plugging a number into _44 to see what would happen), so any enlightenment, reference, or guesses anyone can provide would be greatly appreciated.
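(In case it helps anyone repeat that check in code rather than by hand, this is all it amounts to; a throwaway sketch, not part of my loader, assuming the normal components are available as Singles:)

  { Unit-length check for a normal: a well-formed normal should come
    out at ~1.0, give or take floating-point noise like 0.999998. }
  function NormalLength(nx, ny, nz: Single): Single;
  begin
    Result := Sqrt(nx * nx + ny * ny + nz * nz);
  end;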

Quote:
Original post by BornToCode
You can turn on the normalize-normals render state and that should fix your problem.
I believe it is D3DRS_NORMALIZE


Bless you, BornToCode. It's actually D3DRS_NORMALIZENORMALS, and yes, that fixed it right up.

Thank you!
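For reference, here's roughly what the working setup looks like now (a minimal sketch from memory, with the literal 1 standing in for TRUE in the render-state call):

  { Scaling world transform plus the render state that fixed the
    lighting: have D3D renormalize normals after transforming them. }
  D3DXMatrixScaling(mat, 0.5, 0.5, 0.5);
  d3dDevice.SetTransform(D3DTS_WORLD, mat);
  d3dDevice.SetRenderState(D3DRS_NORMALIZENORMALS, 1);  { 1 = TRUE }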

Now, is there any reason not to use (C) above (assuming I want equal scaling in all three dimensions)? If, as it seems, scaling via _44 of the world transform instead of _11, _22, and _33 scales the mesh without multiplying the normals, and doesn't need D3DRS_NORMALIZENORMALS to effectively un-multiply them, then it should save a lot of unnecessary multiplication and thus run faster. Is this going to bite me later when I add shadows, etc., if I use _44 to do my scaling?
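For what it's worth, here's my current understanding of what _44 does, checked by hand-multiplying a point through the matrix. This is a sketch of my own reasoning, not something from the docs:

  program W44Scale;
  { D3D uses row vectors, v' = v * M, so the 4th column produces the
    output w: w' = x*_14 + y*_24 + z*_34 + w*_44. With everything at
    identity except _44 = 2, only w changes. }
  var
    x, y, z, w: Single;
  begin
    x := 1.0; y := 1.0; z := 1.0; w := 1.0;

    { The 3x3 part is identity, so x, y, z pass through; w picks up _44. }
    w := w * 2.0;

    { The pipeline's homogeneous divide then turns (1, 1, 1, 2) into
      (0.5, 0.5, 0.5): a uniform half-size scale. }
    WriteLn('scaled point: ', x / w:0:2, ' ', y / w:0:2, ' ', z / w:0:2);

    { Normals only ever go through the 3x3 part, which is still identity,
      so they stay unit length, which would explain why (C) never needed
      D3DRS_NORMALIZENORMALS. }
  end.

If that's right, the catch is that the scale only takes effect at the divide, after lighting has already run, so anything that depends on camera-space positions (point lights, fog, maybe shadow math later) might not see the scaled geometry.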
