Pulling my hair out over D3D11 lighting bug

Started by
9 comments, last by kauna 11 years, 4 months ago
I have a simple model loaded and a point light source that I am trying to get to work correctly, but it's just not happening. I have probably spent around 12 hours over the past two days trying to figure out what is wrong, and I'm starting to get really frustrated, so I thought I would ask here as a last resort.

I have checked and double-checked my lighting equation many times against at least 3 different sources. At the moment I am using a constant lighting coefficient so that I know for a fact my object will be lit no matter what the distance. My light is EXPLICITLY fixed at the -30.0 position on the Z-axis, yet my object appears as if it is lit from a different direction. My camera is located at ( 0.0, 0.0, -20.0 ) so that I can verify the model is not swallowing the light, and it is not. The light is plenty far away from my model. I've changed the light/camera distance many times and have gotten the same result.

The entire front side of my object should be lit, instead it is lit from a different angle.

I use the DirectXMath library and I was VERY, VERY careful to store the const buffer matrices correctly and with the right alignment. I also made sure my HLSL byte alignment was correct by using float4. I use the row-major compiler options for HLSL so I do not need to use matrix transpose, and my code reflects that.
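For reference, here is a minimal sketch of the kind of constant buffer declaration that setup implies (the member names are illustrative, not the poster's actual code). With the row-major pack flag (D3DCOMPILE_PACK_MATRIX_ROW_MAJOR, or `/Zpr` on fxc) an XMMATRIX can be copied in without a transpose, and float4 members keep the 16-byte packing unambiguous:

```hlsl
// Hypothetical cbuffer matching the described setup.
// Compiled with D3DCOMPILE_PACK_MATRIX_ROW_MAJOR, so DirectXMath's
// row-major matrices can be memcpy'd in without XMMatrixTranspose.
cbuffer per_object : register( b0 )
{
    matrix world_matrix;       // row-major under the pack flag
    matrix view_proj_matrix;
    float4 light_pos;          // w unused; float4 avoids packing surprises
    float4 camera_pos;
};
```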

Please give my code a look and tell me where I've gone wrong, because I obviously cannot figure it out!

Bugged lighting:
[attachment=12680:bug.jpg]

Matrix update function:
[attachment=12677:update.jpg]

Vertex shader:
[attachment=12678:vs.jpg]

Pixel shader:
[attachment=12679:ps.jpg]

Make sure that normal.w = 0 in your vert shader before transforming it into world space

Also, you need to normalize the normal that comes into your fragment shader from the interpolators (since it is linearly interpolated over the triangle).

Otherwise, start shader debugging by visualizing different terms at every step in the shader (just like normal debugging) to isolate your problem.
Also, it seems your light position is in world space, but the surface position you're using is not.
I just added the two things you said in your first post, to make sure the normal is truly normalized at every step. Unfortunately, it did not fix the bug. Could you further explain your last post concerning world-space light position vs. surface position? In the simple lighting examples I've seen on the internet they made no distinction between the different coordinate spaces when doing lighting in the pixel shader, yet their lighting was displayed correctly.

Updated normal transform in vertex shader:

float4 tmp_norm = float4( input.norm.x, input.norm.y, input.norm.z, 0.0 );
output.norm = mul( tmp_norm , world_matrix );
output.norm = normalize( output.norm );


Updated normal declaration in pixel shader:

float4 norm_norm = normalize( input.norm );
float3 tmp_norm = float3( norm_norm.x, norm_norm.y, norm_norm.z );

I actually just discovered that if I move my light source a ridiculous distance away (z value of -5000.0) then it works properly. I'm cool with this, but I'm really curious as to why it would not work at a distance that should have given the object enough room, and of course I would like to be able to set my light position to normal values without having to multiply by 5000.
To rule out the possibility that your cbuffer data somehow is invalid: do things appear correct when you hard-code the matrices and/or light position directly into the shaders?
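A sketch of what that test could look like in the pixel shader (the structure mirrors the posted code only loosely; `ps_input` and the literal light value are illustrative). If the image changes with the hard-coded value, the cbuffer upload (packing, offsets, alignment) is suspect; if it looks the same, the shader math is:

```hlsl
// Debug variant: bypass the cbuffer by hard-coding the light position
// the application is supposed to upload.
static const float4 debug_light_pos = float4( 0.0f, 0.0f, -30.0f, 1.0f );

float4 PS( ps_input input ) : SV_TARGET
{
    float3 n        = normalize( input.norm.xyz );
    float3 to_light = normalize( debug_light_pos.xyz - input.pos.xyz );
    float  diffuse  = saturate( dot( n, to_light ) );
    return float4( diffuse.xxx, 1.0f ); // visualize the diffuse term directly
}
```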
You may declare your normals as float3 from the beginning, i.e. in the input vertex structure.

Also, when using HLSL it is more or less pointless to write code such as float3 tmp_norm = float3(input.norm.x, ...), since you may as well write just float3 tmp_norm = input.norm.xyz.

Since your normal is still declared as float4, it will be rotated AND translated by the world matrix whenever its w-component is non-zero, which will give you wrong normals. Setting w to 0 avoids the translation, but casting the matrix to float3x3 makes the intent explicit and removes the possibility of getting it wrong.

Here is the code which works for me: (i.vNormal is defined as float3 ):

float3 vNormal = mul(i.vNormal,(float3x3)mWorld);
vNormal = normalize(vNormal);

Cheers!
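One caveat worth noting on the snippet above (not raised in the thread): casting to float3x3 drops the translation row, which is exactly what a direction vector needs, but it is only correct for rotations and uniform scale. If the world matrix contains non-uniform scaling, normals should be transformed by the inverse-transpose of the world matrix instead. A sketch, with `world_inv_transpose` as a hypothetical extra cbuffer member:

```hlsl
// world_inv_transpose would be computed on the CPU -- with DirectXMath,
// XMMatrixTranspose( XMMatrixInverse( nullptr, world ) ) -- and uploaded
// alongside the world matrix.
float3 vNormal = normalize( mul( i.vNormal, (float3x3)world_inv_transpose ) );
```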
Thanks for the suggestions, especially the tip about using input.norm.xyz, I feel ignorant now. So, letting HLSL align the buffer automatically should be okay? My old code was float3 for the vertex shader input, but I got paranoid and switched to float4.

Is it okay for the input position to be float3, since it can only be passed to the pixel shader as a float4? Should I always set the w-component to zero when I don't need it?
In your fragment shader, the input.pos (surface position) that you're using is in projection space. However your light position is in world space. So when you compute the 'surface to light' vector (tmp_light - tmp_pos), you're doing it with coordinates in different spaces, and certainly not getting the world space direction you're looking for. Send the world position into your fragment shader and use that instead.
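A sketch of the change being described, i.e. carrying a world-space position through the vertex shader as an extra interpolant (struct layout and names are illustrative, not the poster's actual code):

```hlsl
struct vs_output
{
    float4 pos       : SV_POSITION; // clip space, consumed by the rasterizer
    float3 world_pos : TEXCOORD0;   // world space, kept for lighting
    float3 norm      : NORMAL;
};

vs_output VS( vs_input input )
{
    vs_output output;
    float4 world     = mul( float4( input.pos, 1.0f ), world_matrix );
    output.pos       = mul( world, view_proj_matrix ); // continue to clip space
    output.world_pos = world.xyz;                      // world-space copy
    output.norm      = mul( input.norm, (float3x3)world_matrix );
    return output;
}

// Pixel shader side: both ends of the vector are now in world space.
//   float3 to_light = normalize( light_pos.xyz - input.world_pos );
```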
Gotcha, so I guess that should explain why the light does not appear to be on the z-axis like it should be, it is instead a bit to the left.

When you say to send the world position to the pixel shader, do you mean in place of the one I'm already sending, or as an extra?

EDIT: Just updated my code and all of a sudden everything works like a charm! Thanks, dude, and thanks to the guy who first mentioned it, I was just too thick-headed to think about passing another position to the pixel shader. I've definitely learnt my lesson about mixing world/view/proj spaces. That being said, is there a more elegant way to handle this instead of passing two positions?
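For what it's worth, the extra interpolant is the standard answer: the SV_POSITION a pixel shader receives is in viewport space, so it can't be reused for lighting, and one position per lighting space is unavoidable in forward per-pixel shading. The only way to drop it entirely is to evaluate the lighting per vertex (Gouraud shading) and interpolate the result, at the cost of per-pixel accuracy. A sketch, with the same hypothetical names as above:

```hlsl
vs_output VS( vs_input input )
{
    vs_output output;
    float4 world = mul( float4( input.pos, 1.0f ), world_matrix );
    output.pos   = mul( world, view_proj_matrix );

    // Do the lighting here and ship only the scalar result.
    float3 n        = normalize( mul( input.norm, (float3x3)world_matrix ) );
    float3 to_light = normalize( light_pos.xyz - world.xyz );
    output.diffuse  = saturate( dot( n, to_light ) ); // interpolated to the PS
    return output;
}
```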

This topic is closed to new replies.
