Using normals when calculating lighting

4 comments, last by Asesh 12 years ago
So I understand what a normal is, but I don't really understand how it is used when calculating lighting. I understand normalizing, and what the dot product of two vectors produces, so I am not necessarily looking for a math lesson (unless there is something I am missing). Thank you!
The most basic answer is this: take the dot product of the surface normal at the point you are shading with the normalized vector towards the light source. That gives you the contribution that light makes to the perceived color at that point, provided the surface is entirely diffuse and has no specular term or more advanced BRDF.
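That dot-product rule for a purely diffuse (Lambertian) surface can be sketched in a few lines of Python; the surface point, normal, and light position below are made-up illustrative values:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def diffuse_intensity(normal, point, light_pos):
    """N . L, clamped to zero for points facing away from the light."""
    to_light = normalize(tuple(l - p for l, p in zip(light_pos, point)))
    return max(dot(normalize(normal), to_light), 0.0)

# Surface facing straight up, light directly overhead: full intensity.
print(diffuse_intensity((0, 1, 0), (0, 0, 0), (0, 10, 0)))   # → 1.0
# Light behind the surface: the clamp gives zero, not a negative value.
print(diffuse_intensity((0, 1, 0), (0, 0, 0), (0, -10, 0)))  # → 0.0
```

The `max(…, 0.0)` clamp matters: without it, surfaces facing away from the light would receive negative light and darken the ambient term.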

The wikipedia article on lambertian reflectance may help:
http://en.wikipedia....ian_reflectance
A really simple way to think of it is that you want to know how much a surface is facing towards a light. If a surface faces more towards the light it gets lit more; if it faces away from the light it gets lit less. The dot product is what you use to determine how much the normal and the light direction line up, since the dot product is 1 when they point in the same direction.
Well, just using the dot product (which is what most examples do) does not give you the angle between the two vectors. It gives a value that is related to the angle, but there is a little more math involved to get the actual angle (dividing by the magnitudes and taking the inverse cosine).
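That extra math follows from the identity a · b = |a||b|cos(θ); a minimal Python sketch with arbitrary example vectors:

```python
import math

def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def length(v):
    """Euclidean length of a vector."""
    return math.sqrt(dot(v, v))

def angle_between(a, b):
    """Recover theta from a . b = |a||b|cos(theta)."""
    return math.acos(dot(a, b) / (length(a) * length(b)))

a = (2, 0, 0)
b = (3, 3, 0)
print(math.degrees(angle_between(a, b)))  # ≈ 45 degrees
```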

Edit: Is just using the dot product of two normalized (direction) vectors sufficient?
Your normal and light direction should always be normalized; it would make no sense to use unnormalized vectors in a dot product for lighting calculations. For normalized vectors the dot product gives you the cosine of the angle between them, which is exactly what you want for lighting (for a perfectly Lambertian surface, the ratio of light reflected is the cosine of the angle between the surface normal and the incident light direction).
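Concretely, once both vectors are unit length the division by the magnitudes disappears, so the dot product *is* the cosine and no inverse cosine is needed; a short Python sketch with a light direction deliberately placed 60 degrees off the normal:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

# Normal pointing straight up; light direction 60 degrees off the normal.
n = (0.0, 1.0, 0.0)
l = normalize((math.sin(math.radians(60)), math.cos(math.radians(60)), 0.0))

# For unit vectors, dot(n, l) equals cos(60°) directly: no acos required.
print(dot(n, l))                   # ≈ 0.5
print(math.cos(math.radians(60)))  # ≈ 0.5
```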
This code might help you...


// Transform the normal into world space and normalize it
oVSOutput.m_vecNormal.xyz = normalize(mul(oVSInput.m_vecNormal, m_matInverseTransposeWorld)).xyz;
// Compute the ambient color
float3 fAmbient = m_fAmbientMaterialColor.xyz * m_fAmbientLightColor.xyz;
// Diffuse (Lambertian) term: N dot L clamped to zero, using the transformed
// normal and the normalized vector towards the light
float3 fDiffuse = max(dot(oVSOutput.m_vecNormal.xyz, normalize(m_vecLightPosition)), 0.0f) * (m_fDiffuseMaterialColor * m_fDiffuseLightColor).rgb;
oVSOutput.m_fMaterialColor.rgb = fAmbient + fDiffuse; // Combine ambient and diffuse lighting
oVSOutput.m_fMaterialColor.a = m_fDiffuseMaterialColor.a; // Copy the material's alpha component as-is
// Pass the texture coordinates through unchanged
oVSOutput.m_fTexture = oVSInput.m_fTexture;

This topic is closed to new replies.
