Shadow maps flipped and offset in deferred renderer

3 replies to this topic

#1 Neobim   Members   


Posted 15 January 2013 - 08:30 PM

I've been trying to implement shadow mapping into my deferred renderer, but I seem to have hit a brick wall.


My shadows are coming out mirrored with an offset on the X axis. Top left is the shadow map.




I can multiply the X coordinate by -1 and add 0.166, a seemingly arbitrary number, but testing against the shadow map's depth still doesn't work; I'm assuming this is another symptom of the same underlying problem. In that screenshot I'm just multiplying my light attenuation by the shadow map depth so I can see where the shadow map lands.


Either way it's not much of a solution, as that hack doesn't work with anything other than a directional light.



I'm not positive whether it's a problem with how I'm rendering the shadow map or with how I'm transforming coordinates into light space, but I'm pretty sure it's the latter.


I'm using a deferred renderer, so I'm doing my shadowing on a full-screen quad using data from the G-buffer. All my lighting is done in view space, and I'm reconstructing view-space positions from depth. That's working perfectly for everything else, but I thought it might be incompatible with the way I was doing shadow maps, so I tried storing full XYZ positions in the G-buffer, and also tried applying the shadow map during the G-buffer pass, but I got the same result each way.



Here's how I'm making the matrices for and rendering the shadow map:

glm::mat4 shadowProjectionMatrix = glm::ortho(-15.f, 15.f, -15.f, 15.f, -10.f, 20.f);
glm::mat4 shadowViewMatrix = glm::lookAt( light->GetDirection(), glm::vec3(0,0,0), glm::vec3(0,1,0) );
glm::mat4 shadowMatrix = shadowViewMatrix * shadowProjectionMatrix;

glViewport( 0, 0, shadowMapSize, shadowMapSize );
world->Render( &shadowViewMatrix, &shadowProjectionMatrix );



This is how I'm applying shadows in my light shader:

vec4 PositionWS = inverseCameraViewMatrix * vec4( positionVS.xyz, 1.0 );
vec4 PositionLS = shadowMatrix * PositionWS;

PositionLS = PositionLS * 0.5 + 0.5;

// PositionLS.x = PositionLS.x * -1; // flip x axis
// PositionLS.x += 0.166; // seemingly arbitrary number

float shadowDepth = texture( tShadowMap, PositionLS.xy ).z;

// temporary, display shadow map directly on image
visibility = shadowDepth; // (shadowDepth is a float; swizzling it wouldn't compile)
attenuation *= visibility;



This is really driving me insane. I've seemingly tried everything to get it to work properly, but to no avail; this is the closest I've gotten. What really frustrates me is that this shouldn't be that hard: once I put my view-space position into world space, it should just be a matter of multiplying it by the exact same matrices I use to render the shadow map in the first place.


Any help would be greatly appreciated. I'm really at the end of my rope here.

Edited by Neobim, 15 January 2013 - 08:32 PM.

#2 NumberXaero   Prime Members   


Posted 15 January 2013 - 09:48 PM

Are you including a shadow map scale/bias matrix in your calculations? http://en.wikipedia.org/wiki/Shadow_mapping

#3 Neobim   Members   


Posted 16 January 2013 - 04:14 AM

Fixed it. I did have a bias, and although it wasn't correct, it wasn't the real cause of the issue either. The underlying problem was really, really silly.


This line of code...


shadowMatrix = shadowViewMatrix * shadowProjectionMatrix;


Was reversed, and should have been:


shadowMatrix = shadowProjectionMatrix * shadowViewMatrix;


I didn't think that would make a difference, but it did.


Thanks anyway, I appreciate it!

#4 Burnt_Fyr   Members   


Posted 16 January 2013 - 09:18 AM

Matrix multiplication is not commutative, so yes, it makes a big difference.
