ladzin

Weird shader behavior


Hi, I've just run into a funny issue I can't explain -- maybe some of you guys will have a clue. The thing is that my vertex program doesn't work if I take the normal directly, but works perfectly when I normalize it first, i.e., in Cg:

    float3 inpNormal = normalize(IN.normal.xyz);

I don't understand that, as the normal _is_ already normalized before being sent to the GPU, so this normalize() call shouldn't do anything. But taking IN.normal.xyz plain does not work...
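Here's a stripped-down sketch of the kind of shader I mean (the structures, uniform names, and the diffuse term are simplified stand-ins, not my exact code):

    struct appdata {
        float4 position : POSITION;
        float4 normal   : NORMAL;
    };

    struct v2f {
        float4 position : POSITION;
        float4 color    : COLOR0;
    };

    v2f main(appdata IN,
             uniform float4x4 modelViewProj,
             uniform float3   lightDir)   // assumed: object-space light direction
    {
        v2f OUT;
        OUT.position = mul(modelViewProj, IN.position);

        // This works:
        float3 inpNormal = normalize(IN.normal.xyz);
        // ...but this does not:
        // float3 inpNormal = IN.normal.xyz;

        float diffuse = max(dot(inpNormal, lightDir), 0);
        OUT.color = float4(diffuse.xxx, 1);
        return OUT;
    }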

Can you tell us what exactly stops working if you use IN.normal.xyz? I mean, if IN.normal.xyz really is a normalized vector, then it should definitely work. Have you tried checking the assembly of the shader? Maybe visualizing the normal vector would help...
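For instance, something along these lines (a debug sketch, reusing the structures from the first post) lets you inspect the normals directly on screen:

    // Debug sketch: output the incoming normal as a colour.
    // Components in [-1,1] are remapped to [0,1] so they are visible.
    v2f main(appdata IN, uniform float4x4 modelViewProj)
    {
        v2f OUT;
        OUT.position = mul(modelViewProj, IN.position);
        OUT.color = float4(IN.normal.xyz * 0.5 + 0.5, 1);
        // Alternatively, output the length: unit-length normals
        // should render as plain white.
        // OUT.color = float4(length(IN.normal.xyz).xxx, 1);
        return OUT;
    }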

It sounds like the normals are not unit length, and the computations in the shader probably assume that they are. If you do normalize the normals before sending them to the shader, then they are most likely being scaled by a transformation matrix inside the shader. If you rotate your normals in the shader and the transformation matrix contains scaling, then their lengths will change and, for example, lighting computations will be wrong.
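In Cg that fix looks something like this (a sketch; modelViewIT, the inverse transpose of the model-view matrix, is assumed to be supplied by the application -- the name is illustrative):

    // Rotate the object-space normal into eye space.
    float3 eyeNormal = mul((float3x3)modelViewIT, IN.normal.xyz);
    // If the matrix contains scaling, the result is no longer unit
    // length, so renormalize before using it in lighting:
    eyeNormal = normalize(eyeNormal);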
