
Deferred omni/point light issues

5 replies to this topic

#1 TheChubu   Crossbones+   -  Reputation: 4075


Posted Yesterday, 03:33 AM

Hi! I'm running into some very annoying issues trying to implement omni lights (i.e., point lights).

 

Here is a picture of the issue: http://imgur.com/NSbIS77

 

As you can see, there are a bunch of textured meshes with an omni-directional light in the center. For some reason, only the sides facing down get lit, which makes sense for meshes that are above the light, but not for meshes that are under it (i.e., from the camera's view, those meshes should remain dark).

 

This is my vertex shader 

 

This is my fragment shader 

 

Position reconstruction works, since it's what I use in directional lighting, which works for all directions as far as I've tested.

 

For rendering the point lights I use the two-step process detailed in the Killzone 2 Deferred Rendering Extended Notes article: a geometry-only draw with a GEQUAL depth test that marks the pixels in front of the volume in the stencil mask, then another pass with the full shader I posted, using a LEQUAL depth test, restricted to the marked pixels.

 

I use bilinear interpolation to get the view ray for reconstructing the position, as described at derschmale.com - Reconstructing position from depth. Again, I've been using that technique for directional lighting without issues, albeit without the bilinear interpolation, since there I just draw a fullscreen quad and use the built-in interpolators for the corners. I also tried the bilinear interpolation on the directional light, in case I had gotten it wrong, and it works just fine too.
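For reference, a minimal sketch of what that per-fragment reconstruction could look like for a light volume. This assumes the depth texture stores linear view-space depth, and `frustumCorners`, `depthMap`, and `computePosition` are illustrative names, not necessarily the ones in my code:

```glsl
// Hypothetical fragment-shader helper. 'frustumCorners' holds the view
// rays at the four screen corners, each scaled so that z == 1.
uniform sampler2D depthMap;      // linear view-space depth
uniform vec3 frustumCorners[4];  // 0: bottom-left, 1: bottom-right,
                                 // 2: top-left,    3: top-right

vec3 computePosition(vec2 texCoord)
{
    // Bilinearly interpolate the corner rays at this fragment.
    vec3 bottom  = mix(frustumCorners[0], frustumCorners[1], texCoord.x);
    vec3 top     = mix(frustumCorners[2], frustumCorners[3], texCoord.x);
    vec3 viewRay = mix(bottom, top, texCoord.y);

    // Since viewRay.z == 1, scaling by linear depth yields the
    // view-space position of the fragment.
    float viewDepth = texture(depthMap, texCoord).r;
    return viewRay * viewDepth;
}
```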

 

For anything else you might need to know, just ask.

 


"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

 

My journals: dustArtemis ECS framework and Making a Terrain Generator



#2 kalle_h   Members   -  Reputation: 1341


Posted Yesterday, 12:46 PM

The light direction should be calculated in the pixel shader: from the pixel's reconstructed position towards the light center (the light center is constant, so calculate it on the CPU).



#3 Alundra   Members   -  Reputation: 839


Posted Yesterday, 02:25 PM

Hey TheChubu !

You have inverted the light direction in the vertex shader:

passLightDir = (mv * inPosition) - pLight.viewCenter;

You are doing: LightDir = SurfacePos - LightPos.

The correct calculation is: LightDir = LightPos - SurfacePos.

passLightDir = pLight.viewCenter - (mv * inPosition);

Since your inPosition is a vec3, you should do:

passLightDir = pLight.viewCenter - (mv * vec4(inPosition,1.0));

The reason is that without the w component set to 1.0, the matrix will only rotate the position but not translate it.
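To illustrate, under the usual homogeneous-coordinate convention (positions have w = 1, directions have w = 0):

```glsl
// With w = 1.0 the full transform (rotation + translation) applies:
vec3 viewPos = (mv * vec4(inPosition, 1.0)).xyz;  // a point

// With w = 0.0 only the rotational part acts; translation is dropped:
vec3 viewDir = (mv * vec4(inNormal, 0.0)).xyz;    // a direction
```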

You should compute the LightDir in the pixel shader to avoid problems (just output the view-space position from the vertex shader and use it in the pixel shader).

Why not compute the LightDir in the vertex shader? Because if part of your geometry is culled, a large area will lose its lighting.

Don't forget to add if( NdotL > 0.0 ) before adding the specular term, to avoid artifacts.
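The guard might look like this in a typical Blinn-Phong fragment shader (variable names here are illustrative, not from the poster's code):

```glsl
float NdotL = dot(normal, lightDir);
vec3 color = ambient;
if (NdotL > 0.0)
{
    color += diffuseColor * NdotL;
    // Only add specular when the surface actually faces the light;
    // otherwise the highlight leaks onto back-facing geometry.
    vec3 halfVec = normalize(lightDir + viewDir);
    color += specularColor * pow(max(dot(normal, halfVec), 0.0), shininess);
}
```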


Edited by Alundra, Yesterday, 02:45 PM.


#4 kalle_h   Members   -  Reputation: 1341


Posted Yesterday, 03:10 PM

You can't calculate the light direction in the vertex shader with deferred shading; you only have the actual position at the pixel shader stage (it's reconstructed from depth).



#5 Alundra   Members   -  Reputation: 839


Posted Yesterday, 03:57 PM

You can compute the view ray using this vertex shader, and use this pixel shader, when rendering the sphere geometry (use an icosahedron for better performance):

void main()
{
  gl_Position = WorldViewProjection * InVertex;
  // Perspective divide to NDC, then remap [-1,1] to [0,1]
  // to sample the G-buffer.
  vec2 NDCPosition = gl_Position.xy / gl_Position.w;
  OutTexCoord0 = 0.5 * NDCPosition + 0.5;
  // View ray with z fixed at 1: scaling it by linear view-space
  // depth in the pixel shader reconstructs the view-space position.
  OutViewRay = vec3( NDCPosition.x * TanHalfFov * Aspect,
                     NDCPosition.y * TanHalfFov,
                     1.0 );
}

Here is the pixel shader:

void main()
{
  // DepthMap must contain linear view-space depth for this to be valid.
  vec3 Position = OutViewRay * texture( DepthMap, OutTexCoord0 ).r;
  vec3 LightVec = normalize( LightPosition - Position );
  float NdotL = dot( Normal, LightVec );
  if( NdotL > 0.0 )
  {
    ...
  }
}

But it's better not to do that, because if the sphere (or icosahedron) is partially culled, the culled part will lose its lighting.

This culling problem happens when the sphere is clipped by the far plane.


Edited by Alundra, Yesterday, 04:13 PM.


#6 TheChubu   Crossbones+   -  Reputation: 4075


Posted Today, 01:48 AM

All right. I removed those computations from the vertex shader, and now I'm only doing this in the fragment shader:

vec3 viewPos = computePosition(viewDepth, texCoord);
vec3 lgtDir = lgt.viewCenter - viewPos;

Where 'viewPos' is the reconstructed view-space position of the fragment, and 'lgt.viewCenter' is the center of the bounding volume in view space (with the 'w' component set to 1 so the translation is accounted for). That should give the direction from the fragment to the center of the omni-directional light.
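A quick way to sanity-check a mismatch like this is to temporarily output the direction itself as a color (a debugging sketch; 'fragColor' is whatever the shader's output variable is):

```glsl
// Debug output: visualize the fragment-to-light direction as a color.
// A correct omni light should cycle through all hues around the light;
// a constant tint along one axis hints at a transform or UBO mismatch.
vec3 lgtDir = normalize(lgt.viewCenter - viewPos);
fragColor = vec4(0.5 * lgtDir + 0.5, 1.0);
```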

 

The only thing I've managed is to get it to light only the tops of the meshes rather than the bottoms; i.e., things only get lit in the +X and -Y directions.

 

I've revised the view matrix and it should be fine (it gets computed with the same code as for all the other meshes), and it gets uploaded to the UBO and all. I'm at a loss as to what's wrong; could I be missing something important from that Killzone 2 article?

 

EDIT: I'm starting to think there is some mismatch between the bounding volume center and the view-space light center passed through the UBO.


Edited by TheChubu, Today, 03:41 AM.






