I've got a problem where the shading of my objects changes when my camera moves. The obvious explanation would be that I'm not transforming my light into eye space before sending it to the shader, but I am doing that. Another thing to note: when I leave my light AND normals in world space, it works fine.
    return diffuse;
}

// get the values from the G-buffer
vec4 diffusePass()
{
    vec3 pos  = vec3(texture2D(positionTexture, UV));
    vec3 norm = vec3(texture2D(normalTexture,   UV));
    vec3 diff = vec3(texture2D(diffuseTexture,  UV));
    return vec4(shadePixel(pos, norm, diff), 1.0);
}
If I don't multiply the normal by u_NormalMatrix and don't multiply the light position by the view matrix, it works fine. I can't spot why it won't work in view space.
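One way to sanity-check the setup outside the shader: in a toy numpy sketch (hypothetical vectors and rotation, not your actual scene), the diffuse term N·L is unaffected by the camera only when both the normal and the light are transformed into eye space together; transforming just one of them makes the shading change with the camera, which matches the symptom.

```python
import numpy as np

def rot_x(a):
    # rotation about the X axis, like pitching the camera down
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

view = rot_x(0.7)                       # toy "view" rotation
n_world = np.array([0.0, 1.0, 0.0])     # floor normal in world space
l_world = np.array([0.3, 0.8, 0.5])
l_world /= np.linalg.norm(l_world)      # light direction in world space

lambert_world = max(n_world @ l_world, 0.0)                    # all world space
lambert_eye   = max((view @ n_world) @ (view @ l_world), 0.0)  # both in eye space
lambert_mixed = max((view @ n_world) @ l_world, 0.0)           # normal only: bug
```

Because the view rotation is orthonormal, `lambert_eye` equals `lambert_world` exactly, while `lambert_mixed` drifts as the rotation angle changes.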
The MVP matrix consists of three components:
1. The model matrix holds the scaling, rotation, and translation of the object.
2. The view matrix holds the position, look-at target, and rotation of the camera.
3. The projection matrix handles the "presentation to the 2D screen".
It seems like you are using the model-view matrix for your normal matrix. Try using just the model matrix for transforming the normal in your shader.
The positions of the object and the light are independent of the position of the camera.
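The composition described above can be checked numerically. A toy numpy sketch (made-up translations; the identity stands in for a real projection matrix): clip position = Projection * View * Model * local position.

```python
import numpy as np

model = np.eye(4); model[:3, 3] = [0.0, 1.0, -3.0]  # object translation
view  = np.eye(4); view[:3, 3]  = [0.0, 0.0, -5.0]  # camera translation only
proj  = np.eye(4)                                   # stand-in for a projection

mvp = proj @ view @ model
p_local = np.array([1.0, 0.0, 0.0, 1.0])            # vertex in object space
p_clip = mvp @ p_local                              # -> (1, 1, -8, 1)
```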
Render the normals to the final color buffer instead of the diffuse so that you can see the actual normal X, Y, and Z values.
As you rotate the camera, each surface of each object should change color. Floors should be green if you are looking straight ahead, but turn black if you look down.
If they do not, the problem is in your normal matrix (which should be the inverse-transpose of the model-view matrix, not just the model-view matrix).
If they do look fine, the problem is in your light direction.
L. Spiro
I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid
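The inverse-transpose point is easy to verify outside the shader. A minimal numpy sketch with a non-uniform scale (toy values, not your actual matrices): transforming a normal by the model-view matrix itself bends it off the surface, while the inverse-transpose keeps it perpendicular to the transformed tangent.

```python
import numpy as np

model_view = np.diag([2.0, 1.0, 1.0])         # non-uniform scale (toy model-view)
normal = np.array([1.0, 1.0, 0.0])
normal /= np.linalg.norm(normal)              # 45-degree surface normal

naive = model_view @ normal                   # transformed by model-view: wrong
naive /= np.linalg.norm(naive)

normal_matrix = np.linalg.inv(model_view).T   # inverse-transpose
correct = normal_matrix @ normal
correct /= np.linalg.norm(correct)

tangent = model_view @ np.array([1.0, -1.0, 0.0])  # tangent of the scaled surface
# correct stays perpendicular to the surface; naive does not
```

With a uniform scale (or pure rotation), both give the same direction, which is why the bug only shows up with non-uniform scaling.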
The floor was green with the camera in its normal position, but as I pitched down the floor faded from green -> teal -> blue. That doesn't seem right? Blue would mean a (0, 0, 1) normal value, which isn't correct. I tried putting a glm::inverse() around my normal matrix but had the same results.
Perhaps I should multiply the position in my frag shader by model view matrix such as:
vec3 s = normalize(vec3(Light.Position) - vec3(u_ModelViewMatrix * vec4(pos, 1.0)));
Because I'm just passing it through my geom buffer shader unchanged:
Position = in_Position;
EDIT: The shading still changes when the camera moves. If I turn around from the origin in the scene and start tilting down, the floor and everything just turns black. And actually, I don't think putting the inverse after the transpose changed anything.
It won’t change anything unless you have non-uniform scaling in either the world matrix or the view matrix.
Also, I might have misled you in my previous post. I meant to say the inverse of the camera matrix, which is itself the view matrix; so rather than taking the inverse of the view matrix, it should be just the view matrix. It was minutes before I passed out from sheer sleepiness and I wasn't thinking straight.
It should likely fix your problem to multiply the light position by the raw view matrix.
L. Spiro
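There is a w-component detail hiding in that advice, sketched here with made-up numbers and a translation-only view matrix: a point light's position goes through the full 4x4 view matrix with w = 1, while a directional light uses w = 0 so the translation is ignored. Mixing these up is another classic source of shading that swims with the camera.

```python
import numpy as np

cam_pos = np.array([3.0, 2.0, 5.0])
view = np.eye(4)
view[:3, 3] = -cam_pos                 # translation-only view matrix (no rotation)

light_world = np.array([1.0, 4.0, 2.0])

light_eye_pos = (view @ np.append(light_world, 1.0))[:3]  # position: w = 1, translated
light_eye_dir = (view @ np.append(light_world, 0.0))[:3]  # direction: w = 0, unchanged
```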
No problem, I often do too much coding whilst half asleep.
So I've multiplied the light position by just the view matrix, and the floor no longer turns black when I rotate and look down. The other objects still seem to shade differently as I translate and rotate around the world. I'm still kind of confused, because when I leave the normal and light in world space the shading is completely static no matter where I move, but as soon as I multiply the normals and the light by the normal/view matrix the shading changes with my camera (this sounds like it is expected, but I never see this in games anywhere?). I'm not multiplying the position by the model-view matrix before sending it to the G-buffer, which seems fine, because if I do the world looks and behaves weirdly. But even if I multiply the position by the model-view matrix in the fragment shader, just to calculate the direction from the light to that position in the texture, it seems to behave the same way.
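If it helps narrow this down: the remaining symptom is what you would see if the G-buffer position is in a different space from the normal and light. A toy numpy check (made-up rotation standing in for the view matrix, made-up positions): lighting comes out identical in world space and eye space only when the position, normal, and light are all in the same space; leave the position in world space while the rest is in eye space and the result is wrong.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def lambert(n, to_light):
    l = to_light / np.linalg.norm(to_light)
    return max(n @ l, 0.0)

view = rot_x(0.5)                           # toy "view" rotation
pos_world   = np.array([0.0, 0.0, -4.0])    # surface position
light_world = np.array([2.0, 3.0, 1.0])     # light position
n_world     = np.array([0.0, 1.0, 0.0])     # surface normal

ref = lambert(n_world, light_world - pos_world)                        # all world space
eye = lambert(view @ n_world, view @ light_world - view @ pos_world)   # all eye space
mix = lambert(view @ n_world, view @ light_world - pos_world)          # position in world: bug
```

`ref` and `eye` match to floating-point precision; `mix` does not, and the error changes with the camera rotation.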