

Point-light normal issue


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

7 replies to this topic

#1 LarryKing   Members   -  Reputation: 414


Posted 18 June 2012 - 02:31 PM

So in my attempt to shrink my G-Buffer while maintaining the most detail, I thought I had made a breakthrough.
Previously I had been using 3 render-targets for my light pre-pass implementation and 1 more render-target for shadows.

My G-Buffer / Render-targets = ~7.5 MB
  • Depth-map: R32 1024 x 576
  • Light-map: ARGB32 1024 x 576
  • Output : ARGB32 1024 x 576
  • Shadows: R16 512 x 512
What I was doing looked something like this:
  • Draw scene depth to the depth-map
  • Draw the shadow buffer to its render target
  • Draw/Calculate AO and shadows using ONLY the depth-map, to the Light-map
  • Blur the light-map using the output render-target as a temporary buffer
  • Draw scene lighting using depth
  • Draw the scene to the Output buffer using the light-map
  • Ping-pong between the light-map and output for post-process effects

Now this looked FINE, but I wasn't satisfied. I had completely ignored normals and as a result was stuck with only simple point-lights.
So I got to thinking about how I could squeeze in a normal-buffer. The most obvious idea of just making a new render-target would leave me squeezed at ~9.5 MB, which is NOT OK!
So after much pondering I finally came up with a way to do it in the render-targets I already had!
I can use the SAME G-BUFFER!

What I'm doing now, and what I'm stuck on:
  • Draw scene depth to the depth-map and scene Normals to the output render-target
  • Draw the shadow buffer to its render target
  • Draw/Calculate AO and shadows using ONLY the depth-map, to the Light-map
  • Blur the light-map using the Shadow render-target as a temporary buffer (Both AO and Shadows are only a representation of lightness / darkness and can easily be represented as just 1 byte / pixel)
  • Draw scene lighting using depth and Normals!
  • Draw the scene to the Output buffer using the light-map
  • Ping-pong between the light-map and output for post-process effects

Now theoretically this should work, and it does... sort of.
I can go all the way through the render-cycle, and when I go to draw lights at step 5, I have both a depth and a normal buffer!
Unfortunately, when I try to draw lights taking into account the normal-buffer, the lights are calculated incorrectly.

Now after much pondering, searching, and tweaking, I have yet to get over this little obstacle.
I had previously only been using point-lights, calculating the attenuation with a world position reconstructed from depth, and that worked fine.
If I remember correctly, to take normals into account for an omni-directional light I just have to multiply my current output by the dot product of the light vector and the surface normal, right?

So here's the pixel shader for the point light:
[source lang="cpp"]
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float2 screenSpace = GetScreenCoord(input.ScreenSpace);
    float lightDepth = input.Depth.x / input.Depth.y;
    float sceneDepth = tex2D(Depth, screenSpace).r;
    clip(sceneDepth > lightDepth || sceneDepth == 0 ? -1 : 1);

    // Reconstruct the world-space position from screen coordinates and depth
    float4 position;
    position.x = screenSpace.x * 2 - 1;
    position.y = (1 - screenSpace.y) * 2 - 1;
    position.z = sceneDepth;
    position.w = 1.0f;
    position = mul(position, iViewProjection);
    position.xyz /= position.w;
    // Surface world position is calculated correctly
    // Light position is in world space
    float Distance = distance(LightPosition, position);
    clip(Distance > LightRadius ? -1 : 1);

    float3 normal = (2.0f * tex2D(Normal, screenSpace).xyz) - 1.0f;
    float3 lightVector = normalize(LightPosition - position);
    float lighting = saturate(dot(normal, lightVector));
    return lighting * (1.1f - Distance / LightRadius) * LightColor;
}
[/source]

Here is the little bit of shader code that generates the Depth and Normal buffers:
[source lang="cpp"]
PreVertexShaderOutput PreVertexShaderFunction(PreVertexShaderInput input)
{
    PreVertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Depth.xy = output.Position.zw;
    output.Normal = mul(input.Normal, World);
    return output;
}

PrePixelOutput PrePixelShaderFunction(PreVertexShaderOutput input)
{
    PrePixelOutput output;
    output.Normal.xyz = (normalize(input.Normal).xyz * 0.5f) + 0.5f;
    output.Normal.a = 1;
    output.Depth = input.Depth.x / input.Depth.y;
    return output;
}
[/source]
Now, a couple of pictures would be nice to illustrate what I'm seeing.

- I overrode the final scene drawing to get the normal buffer to show up for this pic, so I am getting some sort of normal value:

[image: normal buffer]

- And here's what it looks like with the incorrect lighting; the lights are all squeezed and distorted:

[image: distorted point-lights]

- Here's what it should/used to look like in the light-buffer:

[image: previous light-buffer]

So even after typing ALL this up, I still haven't been able to figure this out...
Is it the normals? is it the light? I don't know...

Any help would be greatly appreciated!
-Thanks in advance

Edited by lastlinkx, 18 June 2012 - 07:59 PM.



#2 jefferytitan   Members   -  Reputation: 1642


Posted 18 June 2012 - 06:53 PM

I'm not positive on this, but I'm guessing that when you get lightVector, LightPosition and position are transformed differently. Maybe you need to do this line before you multiply position by iViewProjection?

#3 LarryKing   Members   -  Reputation: 414


Posted 18 June 2012 - 08:01 PM

I'm not positive on this, but I'm guessing that when you get lightVector, LightPosition and position are transformed differently. Maybe you need to do this line before you multiply position by iViewProjection?


I'm pretty positive that the calculated scene world position is correct, and the light position is a shader parameter, also in world space.
I had been doing attenuation using just the distance between these two points, which had worked fine,
leading me to think my problem is elsewhere.

#4 jefferytitan   Members   -  Reputation: 1642


Posted 18 June 2012 - 09:47 PM

Perhaps it's a terminology thing; I would assume that iViewProjection is a projection from 3D to 2D space, and therefore you'd be subtracting a 2D projection from a 3D coordinate.

#5 rdragon1   Crossbones+   -  Reputation: 1171


Posted 18 June 2012 - 09:52 PM

What space are you storing the normals in? I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color. Are all of the normals in world or view space?

Also, are you handling negative values in your buffer correctly? Typically people take values in [-1, 1], multiply by 0.5 and add 0.5 to store them in the buffer in [0, 1], then reverse that transformation when reading the normal back out.

#6 rdragon1   Crossbones+   -  Reputation: 1171


Posted 18 June 2012 - 09:54 PM

I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color.


I meant to add - since they're at ~90 degrees to each other, at least one component of the normal should be very different

#7 LarryKing   Members   -  Reputation: 414


Posted 19 June 2012 - 07:29 AM


I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color.


I meant to add - since they're at ~90 degrees to each other, at least one component of the normal should be very different


You're right, it is odd...

All I'm doing to create the normal buffer is multiplying each vertex's normal by the World matrix,
then moving the [-1, 1] range to [0, 1]. It seems pretty straightforward.

[source lang="cpp"]
PreVertexShaderOutput PreVertexShaderFunction(PreVertexShaderInput input)
{
    PreVertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Depth.xy = output.Position.zw;
    output.Normal = mul(input.Normal, World);
    return output;
}

PrePixelOutput PrePixelShaderFunction(PreVertexShaderOutput input)
{
    PrePixelOutput output;
    output.Normal.xyz = (normalize(input.Normal).xyz * 0.5f) + 0.5f;
    output.Normal.a = 1;
    output.Depth = input.Depth.x / input.Depth.y;
    return output;
}
[/source]
Yet something is throwing off my normals. The walls should appear flat.

Here are some more pics:

[images: normal buffer screenshots]

Any ideas?

#8 LarryKing   Members   -  Reputation: 414


Posted 19 June 2012 - 08:31 AM

Well, I feel like a complete idiot!
I found the error... I wasn't sending the normal-buffer to the graphics card!

I have no idea how I missed that :P

Oh, well. Live and learn!

-Thank you to everyone who looked this over



