Scaling Normals

Started by
2 comments, last by RobTheBloke 14 years, 6 months ago
I was thinking about this common scenario: you scale your geometry with a scaling matrix, but then your normals get scaled as well, which results in incorrect lighting. To combat this, we could use D3DRENDERSTATE_NORMALIZENORMALS (the OpenGL equivalent is glEnable(GL_NORMALIZE)), but this is computationally expensive. Why doesn't Direct3D/OpenGL let us provide two world transformation matrices, one for vertices and one (unscaled) for normals? I think it would be faster to normalize a matrix once and then apply it to all normals than to apply a scaling matrix to all normals and then re-normalize them.
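For context, the "separate matrix for normals" idea is exactly what the inverse-transpose (often called the normal matrix) gives you: compute it once per object from the upper 3x3 of the model matrix, and every normal comes out perpendicular to its surface even under non-uniform scale. Neither fixed pipeline exposes this directly; the sketch below is my own CPU-side illustration (the Mat3/Vec3 helpers are made up for the example, not part of either API):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<Vec3, 3>;

Vec3 mul(const Mat3& m, const Vec3& v) {
    return { m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
             m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
             m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2] };
}

double dot(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Normal matrix: inverse-transpose of the model matrix's 3x3 part.
// Built from cofactors, since (M^-1)^T = cofactor(M) / det(M).
// The cyclic-index form of the cofactor bakes in the (-1)^(i+j) sign.
Mat3 inverseTranspose(const Mat3& m) {
    auto cof = [&](int i, int j) {
        int i1 = (i + 1) % 3, i2 = (i + 2) % 3;
        int j1 = (j + 1) % 3, j2 = (j + 2) % 3;
        return m[i1][j1] * m[i2][j2] - m[i1][j2] * m[i2][j1];
    };
    double det = m[0][0]*cof(0,0) + m[0][1]*cof(0,1) + m[0][2]*cof(0,2);
    Mat3 r;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            r[i][j] = cof(i, j) / det;   // assumes det != 0 (invertible model matrix)
    return r;
}
```

You still renormalize the transformed normal if the scale is non-uniform, but the direction is now correct (naively multiplying by the model matrix tilts the normal off the surface). For a uniform scale s, the inverse-transpose just folds a single 1/s into the matrix, which matches the "normalize the matrix once" intuition.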
That's a darn good question.
Another reason to leave behind the fixed-function pipeline and move on to shaders =D
In a shader-based system you can use as many matrices in as many ways as you like.


Some other fixed-function graphics APIs (e.g. on game consoles) do allow you to specify two versions of the same matrix: the full matrix for transforming positions, and a scale- and translation-free matrix for transforming normals.
Quote: I think it would be faster to normalize a matrix once, then apply it to all normals, than to apply a scaling matrix to all normals and then re-normalizing them.


Matrix = Scale * Matrix * InverseScale

No need: just pre- and post-multiply by the scaling and inverse scaling matrices (it might be the other way around depending on the matrix order of your maths lib). Points still get scaled, but the normals are left unit length.

Things get a bit trickier if you are using scaling within a bone hierarchy. In those cases you normally correct with the inverse parent scale, unless the transform is at the end of a chain.

This topic is closed to new replies.
