Point Light Issue With Deferred Rendering

TJClifton    106
I've been working on a deferred rendering system (based on [url="http://www.catalinzima.com/tutorials/deferred-rendering-in-xna/point-lights/"]this[/url] excellent XNA tutorial) and it has been going largely well.

However, I have hit a snag with point lights. I have followed the code almost exactly (except where I have had to use DX methods in place of the XNA ones in the tutorial), but I just can't get it to work. I understand the theory, and I am sure I am doing something stupid, but I can't figure out what. It has been a couple of days now and nothing I try works. To illustrate, here's a screen capture of the problem:

[img]http://www.tjclifton.com/imgs/PointLightProblem.png[/img]

As you can see, all the pixels in the sphere are being lit equally; the light does not fade out towards the edges. From my understanding, the attenuation should handle this, making the light weaker at the edges of the sphere and stronger at the origin of the point source.

However, I am using exactly the same equations as the XNA tutorial and cannot see why it is not working for me.

Unfortunately, I only have access to the computers at my university at the moment and cannot turn on some of the debug features, as I don't have administrator privileges, so I don't have the luxury of the shader debugging features in PIX.

I have tried outputting the attenuation to the red channel in the pixel shader, though, and got this:

[img]http://www.tjclifton.com/imgs/PointLightAttenuation.png[/img]

Does this look right? To me it looks like the attenuation is the same for every pixel in the sphere, which can't be right. They all seem to get the maximum amount of light until the edge of the sphere, where it cuts off suddenly because those pixels are not being lit at all.

If anyone has any insight into this problem, it would be greatly appreciated! I can post my code if needed, but as I say, it is practically identical to that in the tutorial linked above.

Thanks

EDIT

I have output the screen-position information that is passed into the pixel shader, and I get this:

[img]http://www.tjclifton.com/imgs/ScreenPosition.png[/img]

This looks right to me, which makes me think the problem may lie in calculating the world position from the screen position. The code I have for that at the moment is:

[source lang="hlsl"]//obtain screen position
input.ScreenPosition.xy /= input.ScreenPosition.w;

//obtain texture coordinates corresponding to the current pixel and align texels to pixels
float2 texCoord = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);
texCoord -= gHalfPixel;

//get depth
float depthVal = tex2D(depthS, texCoord).r;

//compute screen-space position
float4 position;
position.xy = input.ScreenPosition.xy;
position.z = depthVal;
position.w = 1.0f;

//transform to world space
position = mul(position, gInvViewProjection);
position /= position.w;[/source]

Again, any help would be greatly appreciated! Edited by TJClifton

BCullis    1955
For starters, can you post your shader code? We know not all your code is identical, since, as you say, you've had to replace XNA methods with DX9 methods, but it would be nice to rule out any HLSL slip-ups from the start, especially since HLSL errors can go undetected more easily than DX code errors; there are a lot of "fail silently" results inside shader code, depending on assigned values and register contents.

TJClifton    106
OK, sure. I just didn't want to be one of those people who posts two pages of code with "help me" tacked on the end :)

My vertex shader is pretty simple; it just passes the positions on to the pixel shader:


[source lang="hlsl"]VSOutput VSFunction(VSInput input)
{
    VSOutput outVS;
    float4 worldPosition = mul(float4(input.Position, 1), gWorld);
    float4 viewPosition = mul(worldPosition, gView);
    outVS.Position = mul(viewPosition, gProjection);
    outVS.ScreenPosition = outVS.Position;
    return outVS;
}[/source]

And my pixel shader:

[source lang="hlsl"]float4 PSFunction(VSOutput input) : COLOR0
{
    //obtain screen position
    input.ScreenPosition.xy /= input.ScreenPosition.w;

    //obtain texture coordinates corresponding to the current pixel
    //the screen coordinates are in [-1,1]*[1,-1]
    //the texture coordinates need to be in [0,1]*[0,1]
    float2 texCoord = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);
    //align texels to pixels
    texCoord -= gHalfPixel;

    //get normal data from the normal map
    float4 normalData = tex2D(normalS, texCoord);
    //transform normal back into [-1,1] range
    float3 normal = 2.0f * normalData.xyz - 1.0f;
    //get specular power
    float specularPower = normalData.a * 255;
    //get specular intensity from the colour map
    float specularIntensity = tex2D(colourS, texCoord).a;

    //read depth
    float depthVal = tex2D(depthS, texCoord).r;

    //compute screen-space position
    float4 position;
    position.xy = input.ScreenPosition.xy;
    position.z = depthVal;
    position.w = 1.0f;
    //transform to world space
    position = mul(position, gInvViewProjection);
    position /= position.w;

    //surface-to-light vector
    float3 lightVector = gLightPosition - position.xyz;

    //compute attenuation based on distance - linear attenuation
    float attenuation = saturate(1.0f - length(lightVector) / gLightRadius);
    //normalise light vector
    lightVector = normalize(lightVector);

    //compute diffuse light
    float NdL = max(0, dot(normal, lightVector));
    float3 diffuseLight = NdL * gLightColour.rgb;

    //reflection vector
    float3 reflectionVector = normalize(reflect(-lightVector, normal));

    //camera-to-surface vector
    float3 directionToCamera = normalize(gEyePosition - position.xyz);

    //compute specular light
    float specularLight = specularIntensity * pow(saturate(dot(reflectionVector, directionToCamera)), specularPower);

    //take attenuation and light intensity into account
    return attenuation * gLightIntensity * float4(diffuseLight.rgb, specularLight);
}[/source]

And in case you want them my structs are:

[source lang="hlsl"]struct VSInput
{
    float3 Position : POSITION0;
};

struct VSOutput
{
    float4 Position : POSITION0;
    float4 ScreenPosition : TEXCOORD0;
};[/source]

TJClifton    106
I'm sure the values for texCoord are OK, as are the various values I extract from my textures, like the depth value, because when I output that information to the screen it matches my depth buffer, normal buffer and so on. I reckon I must be doing something stupid when transforming my position to world space, because I can't think what else it could be.

TJClifton    106
Also, outputting my position information after it has been transformed results in a black screen, which must mean zeros for the x, y and z values, and that surely can't be right.

TJClifton    106
I suppose I should post my C++ code as well, as it's definitely possible I've made a mistake there.

[source lang="cpp"]void DeferredRenderer::drawPointLight(D3DXVECTOR3& lightPos, D3DXVECTOR3& lightCol, float lightRadius, float lightIntensity)
{
    HR(mPLightFX->SetTexture(mhColourMap, mColourRT));
    HR(mPLightFX->SetTexture(mhNormalMap, mNormalRT));
    HR(mPLightFX->SetTexture(mhDepthMap, mDepthRT));

    D3DXMATRIX sphereWorld, S, T;
    D3DXMatrixScaling(&S, lightRadius, lightRadius, lightRadius);
    D3DXMatrixTranslation(&T, lightPos.x, lightPos.y, lightPos.z);
    sphereWorld = S * T;

    D3DXMATRIX view, proj;
    view = gCamera->view();
    proj = gCamera->proj();
    HR(mPLightFX->SetMatrix(mhPLightWorld, &sphereWorld));
    HR(mPLightFX->SetMatrix(mhPLightView, &view));
    HR(mPLightFX->SetMatrix(mhPLightProjection, &proj));

    HR(mPLightFX->SetValue(mhPLightLightPosition, &lightPos, sizeof(D3DXVECTOR3)));
    HR(mPLightFX->SetValue(mhPLightLightColour, &lightCol, sizeof(D3DXVECTOR3)));
    HR(mPLightFX->SetFloat(mhPLightLightRadius, lightRadius));
    HR(mPLightFX->SetFloat(mhPLightLightIntensity, lightIntensity));

    HR(mPLightFX->SetValue(mhEyePosition, &gCamera->pos(), sizeof(D3DXVECTOR3)));
    D3DXMATRIX inverseVP = gCamera->viewProj();
    D3DXMatrixInverse(&inverseVP, 0, &inverseVP);
    HR(mPLightFX->SetMatrix(mhInvViewProjection, &inverseVP));
    HR(mPLightFX->SetValue(mhHalfPixel, &mHalfPixel, sizeof(D3DXVECTOR2)));

    //cull front faces if the camera is inside the light volume, back faces otherwise
    D3DXVECTOR3 camToLight = gCamera->pos() - lightPos;
    float camToCentre = D3DXVec3Length(&camToLight);
    if (camToCentre < lightRadius)
    {
        HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW));
    }
    else
    {
        HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW));
    }
    HR(gd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, false));

    UINT numPasses = 0;
    HR(mPLightFX->Begin(&numPasses, 0));
    HR(mPLightFX->BeginPass(0));
    HR(mPLightFX->CommitChanges());
    drawObject(mSphere, sphereWorld);
    HR(mPLightFX->EndPass());
    HR(mPLightFX->End());
    HR(gd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, true));
    HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW));
}[/source]

drawObject() simply calls DrawSubset(0) on the sphere mesh at the moment.

TJClifton    106
All I can think is that it must have something to do with this line:

position = mul(position, gInvViewProjection);

Before that line the values seem normal, and after it everything seems to be zeroed out. But I'm sure I'm setting that parameter properly in the application code, and the handles for the effect are set up OK as well. I can't think of anywhere else that could be causing the problem, but at the same time I can't see how this could be wrong either.

Seabolt    781
The first step whenever you're seeing something like this is to bring up PIX; it lets you debug your shaders and is invaluable. Otherwise, if you're getting zeros after that line, make sure your InvViewProjection is valid before you set it on the shader, and again inside the shader. I'm sure your G-buffer is fine, and the shader looks fine, so my initial guess is bad constants.

TJClifton    106
Right, I feel like a complete idiot now, but I've found the problem.

I was certain the pixel shader was getting garbage data for gInvViewProjection, but equally sure that the matrix in the application code was fine.

I [b][i]thought[/i][/b] I'd double-checked my shader handles as well, but I think this is a good example of why coding when you're tired is a bad idea. It turns out I was passing [i]mhInvViewProjection[/i] to my shader, which is the handle for a different effect from the one I'm using for the point light. What I should have been passing in was [i]mhPLightInvViewProjection[/i]. I knew I was doing something stupid; I just couldn't figure out what!

Anyway, I now have all sorts of new problems, but the lighting value is now correct :)

Thanks again for the help!
