
Incorrect directional lighting when rotating


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation, start a new topic.

  • You cannot reply to this topic
9 replies to this topic

#1 KaseiFox   Members   -  Reputation: 115


Posted 25 December 2012 - 02:33 AM

I've implemented a directional lighting shader, and it works while the object is still; however, when the object is rotated the light moves in an odd way. I keep thinking it's the normals, but I can't spot exactly what's wrong. I've made a GIF of my program below:

 

[GIF: f25022d5f06e6017787beef6b0babacd.jpg]

 

Snippet of my code setting the shader parameters:

 

...
D3DXMATRIX rotMat, posMat, worldMat;
D3DXMatrixRotationYawPitchRoll(&rotMat, rot.y, rot.x, rot.z);
D3DXMatrixTranslation(&posMat, pos.x, pos.y, pos.z);

D3DXMatrixTranspose(&worldMat, &(rotMat * posMat));
D3DXMatrixTranspose(&param.viewMat, &param.viewMat);
D3DXMatrixTranspose(&param.projMat, &param.projMat);

if(FAILED(devcon->Map(matrixBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource)))
{
    loadedShaders = false;
    return false;
}

mbptr = (MatrixBuffer*)mappedResource.pData;
mbptr->world = worldMat;
mbptr->view = param.viewMat;
mbptr->proj = param.projMat;

devcon->Unmap(matrixBuffer, 0);
devcon->VSSetConstantBuffers(0, 1, &matrixBuffer);
...

 

Snippet from vertex shader:

 

PixelInput LightVS(VertexInput input)
{
    PixelInput output;

    input.pos.w = 1.0f;
    output.pos = mul(input.pos, mWorld);
    output.pos = mul(output.pos, mView);
    output.pos = mul(output.pos, mProj);
    output.tex = input.tex;
    output.norm = mul(input.norm, (float3x3)mWorld);
    output.norm = normalize(output.norm);

    return output;
}

 

Snippet from pixel shader:

 

float4 LightPS(PixelInput input) : SV_TARGET
{
    float lightIntensity = saturate(dot(input.norm, lightDir));
    float4 col = ambientCol;
    if(lightIntensity > 0.0f)
        col += diffuseCol * lightIntensity;
    return saturate(col) * shaderTexture.Sample(sampleType, input.tex);
}


Thanks for any help.

 




#2 hdxpete   Members   -  Reputation: 447


Posted 25 December 2012 - 04:00 AM

I don't use separate model and view matrices in my projects, but a combined modelview matrix, so to me you're missing one multiplication. Another thing I do differently: instead of casting to a float3x3, I set the normal's w component to 0 and multiply with the full float4x4. If for some reason your normal's w component were non-zero, your lighting would be very much messed up.



#3 C0lumbo   Crossbones+   -  Reputation: 2093


Posted 25 December 2012 - 04:52 AM

Could you post the code where lightDir gets set?

 

It looks like all your shader code is doing the lighting in worldspace, which is absolutely fine, but it's often done in viewspace. If you have any code that transforms the lightDir into viewspace then that would cause problems similar to what you're seeing - lightDir should remain in worldspace.



#4 KaseiFox   Members   -  Reputation: 115


Posted 25 December 2012 - 05:40 AM

I don't use separate model and view matrices in my projects, but a combined modelview matrix, so to me you're missing one multiplication. Another thing I do differently: instead of casting to a float3x3, I set the normal's w component to 0 and multiply with the full float4x4. If for some reason your normal's w component were non-zero, your lighting would be very much messed up.

 

I've made those changes and get the exact same results :\

 

Could you post the code where lightDir gets set?

 

It looks like all your shader code is doing the lighting in worldspace, which is absolutely fine, but it's often done in viewspace. If you have any code that transforms the lightDir into viewspace then that would cause problems similar to what you're seeing - lightDir should remain in worldspace.

 

The actual value is just set once, like so:

 

out.lightDir = EV3Normalize(EVector3(1, -1, 1));

The cbuffer is set here:

 

if(FAILED(devcon->Map(lightBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource)))
{
    loadedShaders = false;
    return false;
}

lbptr = (LightBuffer*)mappedResource.pData;
lbptr->ambientCol = param.ambientCol;
lbptr->diffuseCol = param.diffuseCol;
lbptr->lightDir = param.lightDir;
lbptr->padding = 0.f;

devcon->Unmap(lightBuffer, 0);
devcon->PSSetConstantBuffers(0, 1, &lightBuffer);

When you said lighting is often done in viewspace, do you mean VSSetShader() and PSSetShader() are called once, then all objects are rendered? Because I'm only using the shaders this way so that different objects can get assigned different shaders for their respective materials. How would I go about 'combining' shaders if this is the case?



#5 SpaceRoach   Members   -  Reputation: 166


Posted 25 December 2012 - 07:06 AM

When you said lighting is often done in viewspace, do you mean VSSetShader() and PSSetShader() are called once, then all objects are rendered? Because I'm only using the shaders this way so that different objects can get assigned different shaders for their respective materials. How would I go about 'combining' shaders if this is the case?

By this, he means that your calculations should be consistent: both the normals and the light direction should lie in the same coordinate frame/space. If you're transforming your normals to view space (transforming them with the view matrix), then you must also transform the light vector from world space to view space.

Your normals are in world space alright, but as C0lumbo said, make sure that your light vector is also in world space. Maybe you accidentally multiplied it with the view matrix? (Just guessing.)



#6 KaseiFox   Members   -  Reputation: 115


Posted 25 December 2012 - 08:24 AM

When you said lighting is often done in viewspace, do you mean VSSetShader() and PSSetShader() are called once, then all objects are rendered? Because I'm only using the shaders this way so that different objects can get assigned different shaders for their respective materials. How would I go about 'combining' shaders if this is the case?

By this, he means that your calculations should be consistent: both the normals and the light direction should lie in the same coordinate frame/space. If you're transforming your normals to view space (transforming them with the view matrix), then you must also transform the light vector from world space to view space.

Your normals are in world space alright, but as C0lumbo said, make sure that your light vector is also in world space. Maybe you accidentally multiplied it with the view matrix? (Just guessing.)

The light direction isn't transformed by anything. I tried transforming the normals by a world*view matrix, and the light direction by both a world*view matrix and just a view matrix, but the same movement persists (although one time the light was a darker shade of grey, if that matters). I should also point out that I don't need to reverse the light direction, despite the fact that it points towards the sphere instead of away from it. I have no clue what's going on...



#7 MJP   Moderators   -  Reputation: 10067


Posted 25 December 2012 - 12:34 PM

I don't use separate model and view matrices in my projects, but a combined modelview matrix, so to me you're missing one multiplication. Another thing I do differently: instead of casting to a float3x3, I set the normal's w component to 0 and multiply with the full float4x4. If for some reason your normal's w component were non-zero, your lighting would be very much messed up.


Casting the matrix to a float3x3 will give you exactly the same result as converting the normal to a float4 with a w component of 0.

@KaseiFox It's very hard to tell what's wrong from your image without knowing exactly how the object is being rotated, or how you're intending to rotate it. I would assume that you're changing rot.x, rot.y, and rot.z in some way each frame, so why don't you post that code as well?


Edited by MJP, 25 December 2012 - 12:36 PM.


#8 KaseiFox   Members   -  Reputation: 115


Posted 25 December 2012 - 10:57 PM

I don't use separate model and view matrices in my projects, but a combined modelview matrix, so to me you're missing one multiplication. Another thing I do differently: instead of casting to a float3x3, I set the normal's w component to 0 and multiply with the full float4x4. If for some reason your normal's w component were non-zero, your lighting would be very much messed up.


Casting the matrix to a float3x3 will give you exactly the same result as converting the normal to a float4 with a w component of 0.

@KaseiFox It's very hard to tell what's wrong from your image without knowing exactly how the object is being rotated, or how you're intending to rotate it. I would assume that you're changing rot.x, rot.y, and rot.z in some way each frame, so why don't you post that code as well?

 

Sure. I'm only changing rot.y, and the model itself rotates correctly, which is why I'm thinking there's something wrong with how the normals are being transformed.

while(device->isRunning()) // true until the window is closed
{
    static float rot = 0.0f;
    rot += .003f;
    test->setRotation(EVector3(0, rot, 0)); // I've also tried D3DXVECTOR3 to rule out my vector class; same results.

    device->beginScene(.5f, .5f, .5f); // ClearRenderTargetView(), ClearDepthStencilView() (from device context)

    test->render(device, param); // model class; param contains view & projection matrices, along with light direction and color.

    device->endScene(); // IDXGISwapChain::Present()
    device->messageLoop(); // TranslateMessage(), DispatchMessage(), etc.
}


#9 he3117   Members   -  Reputation: 344


Posted 28 December 2012 - 02:38 AM

Hi,

I had exactly the same problem. I tracked the normal transform and found the cause.

For me, changing:

output.Normal = mul(input.Normal, World);

to

output.Normal = mul(World, input.Normal);

solved it. Alternatively, you can transpose the world matrix before multiplying.



#10 cozzie   Members   -  Reputation: 1408


Posted 28 December 2012 - 07:38 AM

I solved the same situation by multiplying my normals with the inverse-transpose of the world matrix (the light source was oriented relative to the individual objects instead of the whole scene):

 

VS_OUTPUT VS_function(VS_INPUT input)
{
    VS_OUTPUT Out = (VS_OUTPUT)0;

    float4 worldPosition = mul(input.Pos, World);
    Out.Pos = mul(worldPosition, ViewProj);

    float4 normal = mul(input.Normal, WorldInvTransp);
    float lightIntensity = dot(normal, DiffLightDir);

    Out.Color = saturate(DiffLightColor * DiffLightIntensity * lightIntensity); 
    Out.Normal = normal;
    Out.TexCoord = input.TexCoord;

    return Out;
}




