ladzin

Weird shader behavior

Recommended Posts

Hi, I've just run into a funny issue I can't explain -- maybe some of you guys will have a clue. My vertex program doesn't work if I take the normal directly, but works perfectly when I normalize it first, i.e., in Cg: float3 inpNormal = normalize(IN.normal.xyz); I don't understand that, as the normal _is_ already normalized before being sent to the GPU, so this 'normalize' call shouldn't do anything. But taking IN.normal.xyz as-is does not work...
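
For reference, here's a minimal sketch of the kind of vertex program this is about (not my actual shader -- the appdata/vfconn structs, the lightDir uniform and the per-vertex diffuse lighting are just assumptions for illustration):

    struct appdata {
        float4 position : POSITION;
        float4 normal   : NORMAL;
    };

    struct vfconn {
        float4 hpos  : POSITION;
        float4 color : COLOR0;
    };

    vfconn main(appdata IN,
                uniform float4x4 modelViewProj,
                uniform float3   lightDir)   // assumed unit length, object space
    {
        vfconn OUT;
        OUT.hpos = mul(modelViewProj, IN.position);

        // This works:
        float3 inpNormal = normalize(IN.normal.xyz);
        // This doesn't, even though the normals are normalized on the CPU:
        // float3 inpNormal = IN.normal.xyz;

        float diffuse = max(dot(inpNormal, lightDir), 0.0);
        OUT.color = float4(diffuse, diffuse, diffuse, 1.0);
        return OUT;
    }

Switching between those two inpNormal lines is the only difference between the working and the broken version.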

Can you tell us what exactly stops working if you use IN.normal.xyz? I mean, if IN.normal.xyz really is a normalized vector, then it should definitely work. Have you tried checking the assembly of the shader? Maybe visualising the normal vector would help...
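
For instance (just a sketch that would drop into a vertex program like the one above, assuming an OUT.color output), you could write the normal out as a colour, remapped from [-1,1] to [0,1] so negative components stay visible:

    // Debug output: show the incoming normal as a colour.
    OUT.color = float4(IN.normal.xyz * 0.5 + 0.5, 1.0);

If the incoming normals really are unit length, swapping IN.normal.xyz for normalize(IN.normal.xyz) in that line should give an identical image; if the image changes, the normals arriving at the shader aren't what you think they are.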

It sounds like the normals are not unit length, and computations in the shader probably assume that they are. If you do normalize the normals before sending them to the shader, then it might be that they get scaled by a transformation matrix: if you rotate your normals in the shader and the transformation matrix contains scaling, they'll be perturbed and, for example, lighting computations will be wrong.
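
The usual fix in that case (a sketch with an assumed uniform name, not code from this thread) is to rotate the normal with the inverse-transpose of the model-view matrix and renormalise afterwards, so scaling in the transform can't leave it at a non-unit length:

    // Assumed uniform: inverse-transpose of the upper-left 3x3 of the
    // model-view matrix, computed and supplied by the application.
    uniform float3x3 modelViewIT;

    // In the vertex program: rotate the normal, then renormalise it.
    float3 eyeNormal = normalize(mul(modelViewIT, IN.normal.xyz));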
