Point-light normal issue

lastlinkx    484
So in my attempt to shrink my G-buffer while retaining as much detail as possible, I thought I had made a breakthrough.
Previously I had been using 3 render-targets for my light pre-pass implementation and 1 more render-target for shadows.

My G-buffer / render targets, ~7.5 MB total:[list]
[*]Depth-map: R32 1024 x 576
[*]Light-map: ARGB32 1024 x 576
[*]Output : ARGB32 1024 x 576
[*]Shadows: R16 512 x 512
[/list]
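Those numbers can be sanity-checked with a quick back-of-the-envelope calculation. Here's a small Python sketch (purely illustrative; the bytes-per-pixel figures are the usual ones for these formats):

```python
# Rough memory footprint of the render targets listed above.
# Bytes per pixel: R32 = 4, ARGB32 = 4, R16 = 2.
targets = [
    ("depth",   1024, 576, 4),  # R32
    ("light",   1024, 576, 4),  # ARGB32
    ("output",  1024, 576, 4),  # ARGB32
    ("shadows",  512, 512, 2),  # R16
]

def total_mb(target_list):
    return sum(w * h * bpp for _, w, h, bpp in target_list) / (1024 * 1024)

print(total_mb(targets))                                # → 7.25
# Adding a dedicated ARGB32 normal target at 1024 x 576:
print(total_mb(targets + [("normals", 1024, 576, 4)]))  # → 9.5
```

So an extra full-size normal target really would push the footprint to ~9.5 MB.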
What I was doing looked something like this:[list=1]
[*]Draw scene depth to the depth-map
[*]Draw the shadow buffer to its render target
[*]Draw/Calculate AO and shadows using ONLY the depth-map, to the Light-map
[*]Blur the light-map using the output render-target as a temporary buffer
[*]Draw scene lighting using depth
[*]Draw the scene to the Output buffer using the light-map
[*]Ping-pong between the light-map and output for post-process effects
[/list]

Now this looked FINE, but I wasn't satisfied. I had completely ignored normals and as a result was stuck with only simple point-lights.
So I got to thinking about how I could squeeze in a normal-buffer. The most obvious idea of just making a new render-target would leave me squeezed at ~9.5 MB, which is NOT OK!
So after much pondering I finally came up with a way to do it in the render-targets I already had!
I can use the SAME G-BUFFER!

What I'm doing now, and what I'm stuck on:[list=1]
[*]Draw scene depth to the depth-map and scene [b]Normals to the output render-target[/b]
[*]Draw the shadow buffer to its render target
[*]Draw/Calculate AO and shadows using ONLY the depth-map, to the Light-map
[*]Blur the light-map using the [b]Shadow render-target[/b] as a temporary buffer (Both AO and Shadows are only a representation of lightness / darkness and can easily be represented as just 1 byte / pixel)
[*]Draw scene lighting using depth [b]and Normals![/b]
[*]Draw the scene to the Output buffer using the light-map
[*]Ping-pong between the light-map and output for post-process effects
[/list]

Now theoretically this should work, and it does... sort of.
I can go all the way through the render-cycle, and when I go to draw lights at step 5, I have both a depth and a normal buffer!
Unfortunately, when I try to draw lights taking into account the normal-buffer, the lights are calculated incorrectly.

Now, after much pondering, searching, and tweaking, I still haven't gotten over this little obstacle.
Previously I had only been using point-lights, calculating the attenuation from a depth-reconstructed world position, and that worked fine.
If I remember correctly, to take normals into account for an omni-directional light I just have to multiply my current output by the dot product of the light vector and the surface normal, right?
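That N·L term is easy to check outside the shader. Here's a tiny Python sketch of the same math (illustrative only, not engine code):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_vector):
    # Saturated dot product of the unit surface normal and unit light
    # direction -- the same term as saturate(dot(normal, lightVector))
    n, l = normalize(normal), normalize(light_vector)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

up = (0.0, 0.0, 1.0)
print(lambert(up, (0.0, 0.0, 1.0)))  # light straight on -> 1.0
print(lambert(up, (1.0, 0.0, 0.0)))  # light grazing     -> 0.0
print(lambert(up, (0.0, 1.0, 1.0)))  # light at 45 deg   -> ~0.707
```

Multiplying the attenuated point-light result by this factor is exactly the change in question.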

So here's the pixel shader for the point light:
[source lang="cpp"]float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float2 screenSpace = GetScreenCoord(input.ScreenSpace);
    float lightDepth = input.Depth.x / input.Depth.y;
    float sceneDepth = tex2D(Depth, screenSpace).r;
    clip(sceneDepth > lightDepth || sceneDepth == 0 ? -1 : 1);

    // Reconstruct the world-space position from the depth buffer
    float4 position;
    position.x = screenSpace.x * 2 - 1;
    position.y = (1 - screenSpace.y) * 2 - 1;
    position.z = sceneDepth;
    position.w = 1.0f;

    position = mul(position, iViewProjection); // inverse view-projection
    position.xyz /= position.w;

    // Surface world position is calculated correctly;
    // light position is in world space.
    // Use .xyz to avoid implicit float4 truncation against the float3 light position
    float Distance = distance(LightPosition, position.xyz);
    clip(Distance > LightRadius ? -1 : 1);

    // Decode the stored [0,1] normal back to [-1,1]
    float3 normal = (2.0f * tex2D(Normal, screenSpace).xyz) - 1.0f;
    float3 lightVector = normalize(LightPosition - position.xyz);
    float lighting = saturate(dot(normal, lightVector));

    return lighting * (1.1f - Distance / LightRadius) * LightColor;
}[/source]
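One piece of that reconstruction that's easy to get backwards is the texture-coordinate-to-NDC flip at the top. A quick Python sketch of just that mapping (illustrative only):

```python
def screen_to_ndc(u, v):
    # Texture coordinates run (0,0) top-left to (1,1) bottom-right, v down;
    # NDC runs (-1,-1) to (1,1) with y up, hence the flip on v.
    return (u * 2.0 - 1.0, (1.0 - v) * 2.0 - 1.0)

print(screen_to_ndc(0.0, 0.0))  # top-left     -> (-1.0, 1.0)
print(screen_to_ndc(1.0, 1.0))  # bottom-right -> (1.0, -1.0)
print(screen_to_ndc(0.5, 0.5))  # center       -> (0.0, 0.0)
```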

Here is the little bit of shader code that generates the Depth and Normal buffers:
[source lang="cpp"]PreVertexShaderOutput PreVertexShaderFunction(PreVertexShaderInput input)
{
    PreVertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Depth.xy = output.Position.zw;
    // Rotate the normal into world space; the float3x3 cast drops the
    // translation row (this assumes World has no non-uniform scale)
    output.Normal = mul(input.Normal, (float3x3)World);

    return output;
}

PrePixelOutput PrePixelShaderFunction(PreVertexShaderOutput input)
{
    PrePixelOutput output;

    // Pack the [-1,1] normal into the [0,1] range of the render target
    output.Normal.xyz = (normalize(input.Normal).xyz * 0.5f) + 0.5f;
    output.Normal.a = 1;
    output.Depth = input.Depth.x / input.Depth.y;

    return output;
}[/source]
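The [-1,1] ↔ [0,1] packing used above round-trips cleanly, which can be checked with a small Python sketch of the same transformation pair (illustrative only):

```python
def encode_normal(n):
    # [-1,1] -> [0,1], matching (n * 0.5) + 0.5 in the pre-pass pixel shader
    return tuple(c * 0.5 + 0.5 for c in n)

def decode_normal(e):
    # [0,1] -> [-1,1], matching (2 * tex2D(...)) - 1 in the light shader
    return tuple(c * 2.0 - 1.0 for c in e)

print(decode_normal(encode_normal((0.0, 1.0, 0.0))))   # -> (0.0, 1.0, 0.0)
print(decode_normal(encode_normal((-1.0, 0.0, 1.0))))  # -> (-1.0, 0.0, 1.0)
```

With an 8-bit target each channel is also quantized to 1/255 steps, so the round trip is only approximate in practice, but more than good enough for lighting.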
A couple of pictures should help illustrate what I'm seeing.

- I overrode the final scene drawing to show the normal buffer for this pic, so I am getting some sort of normal value:

[img]http://i443.photobucket.com/albums/qq151/link125552/LightIssue2.png[/img]

- And here's what it looks like with the incorrect lighting: the lights are all squeezed and distorted.
[img]http://i443.photobucket.com/albums/qq151/link125552/LightIssue1.png[/img]

- Here's what it should (and used to) look like in the light-buffer:
[img]http://i443.photobucket.com/albums/qq151/link125552/2-Torches.png[/img]

So even after typing ALL this up, I still haven't been able to figure this out...
Is it the normals? Is it the light? I don't know...

Any help would be greatly appreciated!
[i]-Thanks in advance[/i]

jefferytitan    2523
I'm not positive on this, but I'm guessing that when you get lightVector, LightPosition and position are transformed differently. Maybe you need to do this line before you multiply position by iViewProjection?

lastlinkx    484
[quote name='jefferytitan' timestamp='1340067192' post='4950438']
I'm not positive on this, but I'm guessing that when you get lightVector, LightPosition and position are transformed differently. Maybe you need to do this line before you multiply position by iViewProjection?
[/quote]

I'm pretty positive that the calculated scene world position is correct, and the light position is a shader parameter, also in world space.
I had been doing attenuation using just the distance between these two points, which had worked fine,
leading me to think my problem is elsewhere.

jefferytitan    2523
Perhaps it's a terminology thing; I would assume that iViewProjection is a projection from 3D to 2D space, and therefore you'd be subtracting a 2D projection from a 3D coordinate.

RDragon1    1205
What space are you storing the normals in? I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color. Are all of the normals in world or view space?

Also, are you handling the storage of negative values in your buffer correctly? Typically people take values in [-1, 1], multiply by 0.5 and add 0.5 to store them in the buffer in [0, 1], then reverse that transformation when reading the normal back out.

RDragon1    1205
[quote name='rdragon1' timestamp='1340077979' post='4950480']
I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color.
[/quote]

I meant to add - since they're at ~90 degrees to each other, at least one component of the normal should be very different
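To put numbers on that: assuming Y-up world-space normals (an assumption about the scene), a flat floor and a wall encode to clearly different colors in the buffer:

```python
def encode(n):
    # [-1,1] -> [0,1], as in the pre-pass pixel shader
    return tuple(c * 0.5 + 0.5 for c in n)

floor_normal = (0.0, 1.0, 0.0)  # pointing up (assuming a Y-up world)
wall_normal  = (0.0, 0.0, 1.0)  # pointing out of the wall, 90 deg away

print(encode(floor_normal))  # -> (0.5, 1.0, 0.5), greenish
print(encode(wall_normal))   # -> (0.5, 0.5, 1.0), bluish
```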

lastlinkx    484
[quote name='rdragon1' timestamp='1340078051' post='4950481']
[quote name='rdragon1' timestamp='1340077979' post='4950480']
I find it very strange that looking at your normal buffer, the floor and the wall seem to be the same color.
[/quote]

I meant to add - since they're at ~90 degrees to each other, at least one component of the normal should be very different
[/quote]

You're right, that is odd...

All I'm doing to create the normal buffer is multiplying each vertex's normal by the World matrix,
then moving the [-1, 1] range to [0, 1]. It seems pretty straightforward.

[source lang="cpp"]PreVertexShaderOutput PreVertexShaderFunction(PreVertexShaderInput input)
{
    PreVertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Depth.xy = output.Position.zw;
    // Rotate the normal into world space; the float3x3 cast drops the
    // translation row (this assumes World has no non-uniform scale)
    output.Normal = mul(input.Normal, (float3x3)World);

    return output;
}

PrePixelOutput PrePixelShaderFunction(PreVertexShaderOutput input)
{
    PrePixelOutput output;

    // Pack the [-1,1] normal into the [0,1] range of the render target
    output.Normal.xyz = (normalize(input.Normal).xyz * 0.5f) + 0.5f;
    output.Normal.a = 1;
    output.Depth = input.Depth.x / input.Depth.y;

    return output;
}[/source]
Yet something is throwing off my normals. The walls should appear flat.

Here are some more pics:

[IMG]http://i443.photobucket.com/albums/qq151/link125552/Normal5.png[/IMG]
[IMG]http://i443.photobucket.com/albums/qq151/link125552/Normal4.png[/IMG]
[IMG]http://i443.photobucket.com/albums/qq151/link125552/Normal3.png[/IMG]

Any ideas?

lastlinkx    484
Well, I feel like a complete idiot!
I found the error... I wasn't sending the normal-buffer to the graphics card!

I have no idea how I missed that :P

Oh, well. Live and learn!

[i]-Thank you to everyone who looked this over[/i]
