Weird shader behavior

Hi, I've just run into a funny issue I can't explain -- maybe some of you guys will have a clue. My vertex program doesn't work if I take the normal directly, but works perfectly when I normalize it first, i.e., in Cg:

float3 inpNormal = normalize(IN.normal.xyz);

I don't understand that, since the normal _is_ already normalized before being sent to the GPU, so this normalize() call shouldn't do anything. But using IN.normal.xyz directly does not work...
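To make the situation concrete, here is a minimal sketch of the kind of vertex program being described; the struct names, semantics and the lightDir / modelViewProj uniforms are my assumptions, not the poster's actual code:

// Hypothetical minimal diffuse vertex program illustrating the symptom.
struct VertexIn {
    float4 position : POSITION;
    float4 normal   : NORMAL;
};

struct VertexOut {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

VertexOut main(VertexIn IN,
               uniform float4x4 modelViewProj,
               uniform float3   lightDir)
{
    VertexOut OUT;
    OUT.position = mul(modelViewProj, IN.position);

    // Works:  float3 inpNormal = normalize(IN.normal.xyz);
    // Fails:  float3 inpNormal = IN.normal.xyz;
    float3 inpNormal = normalize(IN.normal.xyz);

    // Simple Lambert term; wrong-length normals scale this incorrectly.
    float diffuse = max(dot(inpNormal, lightDir), 0.0);
    OUT.color = float4(diffuse.xxx, 1.0);
    return OUT;
}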
Can you tell us what exactly stops working when you use IN.Normal.xyz? I mean, if IN.Normal.xyz really is a normalized vector, then it should definitely work. Have you tried checking the assembly of the shader? Maybe visualizing the normal vector would help...
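One common way to visualize it (a sketch, assuming the shader writes a COLOR0 output called OUT.color) is to remap the normal from [-1, 1] to [0, 1] and emit it as the vertex color, so you can see whether it is pointing and scaled the way you expect:

// Hypothetical debug output: encode the incoming normal as a color.
OUT.color = float4(0.5 * IN.normal.xyz + 0.5, 1.0);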
It sounds like the normals are not unit length, and the computations in the shader probably assume that they are. Even if you do normalize the normals before sending them to the shader, it might be that they are scaled by a transformation matrix. If you rotate your normals in the shader and the transformation matrix contains scaling, then they'll be perturbed and, for example, lighting computations will be wrong.
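For example (a sketch only; the modelViewIT matrix and lightDir names are placeholders, assuming the shader transforms the normal before lighting):

// If modelViewIT contains scaling, the transformed normal is no longer
// unit length, so dot-product lighting terms come out too bright or too dark.
float3 n = mul((float3x3)modelViewIT, IN.normal.xyz);
n = normalize(n);   // restore unit length after the (possibly scaling) transform
float diffuse = max(dot(n, lightDir), 0.0);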

