# Matrices, normals, lighting and cg


## Recommended Posts

Right, where to start... I have a system set up where I can draw one mesh, translate/rotate to a given offset and draw another mesh, translate/rotate again, draw a third mesh, and so on, with each child mesh inheriting its parent's rotation/translation. I also have a Cg script which handles vertex blending and vertex lighting.

Now, when I send in the first mesh in a series of meshes, the lighting is perfect, because its rotation matrix is the same as the world matrix. The problem is when the child meshes go into the script: their normals get screwed up because they are still treated as if they were in the world matrix's space, which messes up the lighting part of the script.

My first thought was to multiply the normals by the matrix passed into the script (the same matrix used to multiply the vertices) and then re-normalize to get the new normal. This results in no lighting at all on the models - they are pitch black on every poly. So now I'm trying to figure out where the problem lies - I need to figure out how to get the vertex normal into the same matrix space as the vertex. Any ideas?

And for reference, the Cg script... (yes, it is mostly just the Nvidia examples botched into one script)
void main(float3 pos        : POSITION,
          float2 tex        : TEXCOORD0,
          in float3 pos2    : TEXCOORD1,   // second keyframe position
          in float3 posN    : TEXCOORD2,   // second keyframe normal
          float3 normal     : NORMAL,
          out float4 oPos   : POSITION,
          out float4 color  : COLOR,
          out float2 oTex   : TEXCOORD0,
          uniform float4x4 ModelViewProj,
          uniform float keyFrameBlend,
          uniform float brightness,
          uniform float opacity,
          uniform float lightDirX,
          uniform float lightDirY)
{
    oTex = tex;

    // lighting vars
    float3 lightColor    = float3(1, 1, 1);
    float3 lightPosition = pos + float3(lightDirX, lightDirY, -900);
    float3 eyePosition   = pos + float3(lightDirX, lightDirY, -900);
    float3 Ka = float3(0, 0, 0);
    float3 Kd = float3(1, 1, 1);
    float3 Ks = float3(0.05, 0.05, 0.05);
    float  shininess = 16;

    // blend between the two keyframes and project
    float3 newPos = lerp(pos, pos2, keyFrameBlend);
    oPos = mul(ModelViewProj, float4(newPos, 1));

    // calculate vertex lighting - front light
    float3 P = oPos.xyz;
    float3 N = lerp(normal, posN, keyFrameBlend);

    // Compute the diffuse term
    float3 L = normalize(lightPosition - P);
    float diffuseLight = max(dot(N, L), 0);
    float3 diffuse = Kd * lightColor * diffuseLight;

    // Compute the specular term
    float3 V = normalize(eyePosition - P);
    float3 H = normalize(L + V);
    float specularLight = pow(max(dot(N, H), 0), shininess);
    if (diffuseLight <= 0) specularLight = 0;
    float3 specular = Ks * lightColor * specularLight;

    color.xyz = (Ka + diffuse + specular) * brightness;
    color.w = opacity;
}



##### Share on other sites
Quote:
 Original post by Damocles:
Right, where to start... I have a system set up where I can draw one mesh, translate/rotate to a given offset and draw another mesh, translate/rotate, draw a third mesh, and so on and so on. Each child mesh inheriting its parent's rotation/translation.

So, I'm not completely sure I understand, but I think what you mean is that you are doing a hierarchical transform. In other words, your transforms are stacked on top of each other, as in skeletal systems? I'm going to assume that is the case in my answer.
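A hierarchical (stacked) transform like the one described above can be sketched in a few lines of plain Python. The matrices and offsets below are hypothetical, just to show that a child's world matrix is its parent's world matrix times its local matrix; the same idea applies to the 4×4 matrices passed to the shader:

```python
# Sketch of a stacked transform hierarchy: child_world = parent_world * child_local.
# Homogeneous 2D (3x3) matrices keep the example short; 3D uses 4x4 identically.

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# parent: translate by (5, 0)
parent = [[1, 0, 5],
          [0, 1, 0],
          [0, 0, 1]]

# child, relative to the parent: translate by (0, 3)
child_local = [[1, 0, 0],
               [0, 1, 3],
               [0, 0, 1]]

child_world = mat_mul(parent, child_local)

# The child inherits the parent's offset and ends up at (5, 3) in world space.
print([row[2] for row in child_world[:2]])  # [5, 3]
```

Each mesh in the chain would be drawn with its own accumulated `child_world`, which is why the matrix uploaded to the shader has to change at every level of the hierarchy.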

Quote:
 I also have a cg script which handles vertex blending and vertex lighting. Now, when I send in the first mesh in a series of meshes, the lighting is perfect because the rotation matrix is the same as the world matrix. The problem is when the child meshes go into the script, their normals get screwed up because they are still working as if they were in the world matrix. This messes up the lighting part of the script. My first thought was to multiply the normals by the matrix passed into the script (the same matrix used to multiply the vertices) and then re-normalize, to get the new normal. This results in no lighting at all on the models - they are pitch black on every poly. So now I'm trying to figure out where the problem lies - I need to figure out how to get the vertex normal into the same matrix space as the vertex. Any ideas? And for reference, the cg script... (yes, it is mostly just the Nvidia examples botched into one script) *** Source Snippet Removed ***

Another thing I'm not quite clear about is why you are lerping the normal like this: `float3 N = lerp(normal, posN, keyFrameBlend);`. I'm not very familiar with Cg, but does that mean you are blending the normal and a position together? I'm not sure that is what you want to be doing. Instead, you want to transform the normals of your mesh by the current stacked transform.

What I think you should do is take the matrix that carries your model positions into world (or view) space and multiply your model normals by the inverse transpose of its upper 3×3. (Don't use the full ModelViewProj for this - that also applies the projection, which you don't want on normals.) Note that each time you move down the transform hierarchy you should change the matrices that are passed into the Cg shader.

For information on why normals are transformed in this odd way, I googled this:

I hope that helps!

##### Share on other sites
Thanks for your help, but after applying your advice the problem changed. Instead of wrong lighting, the lighting for each vertex was either on or off, there was no in-between. My first thought was that I needed to normalize the new normals after transformation, but that just resulted in no lighting at all.

Then it dawned on me, after reading the article you linked: why go through the hassle of inverting matrices when the real problem is that matrices transform points, not vectors? So all I had to do was take the start point and end point of the normal, transform both by the matrix, and then reconstruct the normal from the difference. Worked like a charm.
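The two-point trick above can be sketched in plain Python (a hypothetical rigid transform, not the poster's actual matrices): transform the vertex and the vertex-plus-normal with the same 4×4 used for positions, subtract, and renormalize. The translation cancels in the subtraction, so for rotation/translation hierarchies this matches multiplying by the upper 3×3:

```python
# Transform the normal's start and end points with the position matrix,
# then rebuild the normal from the difference. Valid for rigid
# (rotation + translation) transforms, where no inverse is needed.
import math

def mat4_mul_point(m, p):
    # p is (x, y, z) with w assumed to be 1; only the first 3 rows matter
    x, y, z = p
    return [m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(3)]

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

# 90-degree rotation about Z, plus a translation (a rigid transform)
M = [[0.0, -1.0, 0.0,  5.0],
     [1.0,  0.0, 0.0,  2.0],
     [0.0,  0.0, 1.0, -3.0],
     [0.0,  0.0, 0.0,  1.0]]

vertex = [1.0, 0.0, 0.0]
normal = [1.0, 0.0, 0.0]

start = mat4_mul_point(M, vertex)
end   = mat4_mul_point(M, [v + n for v, n in zip(vertex, normal)])
new_normal = normalize([e - s for e, s in zip(end, start)])

# the rotation sends (1, 0, 0) to (0, 1, 0); the translation cancels out
print(new_normal)  # [0.0, 1.0, 0.0]
```

One caveat: under non-uniform scale this is equivalent to using the plain upper 3×3, so the inverse-transpose approach is still the general answer; the two-point version works here because the hierarchy only rotates and translates.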
