

Point Light Issue With Deferred Rendering


8 replies to this topic

#1 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 07:04 AM

I've been working on a deferred rendering system (based on this excellent XNA tutorial) and all has been going largely well.

However, I have hit a snag with point lights. I have followed the code almost exactly (except where I have had to use DX methods instead of the XNA ones in the tutorial), but I just can't seem to get it to work. I understand the theory and I am sure I am doing something stupid, but I just can't figure it out. It has been a couple of days now and nothing I try seems to work. Here's a screen capture to illustrate the problem:

[screenshot: the light sphere rendered with uniform brightness]

As you can see all the pixels in the sphere are being lit equally - it does not fade out at the edges. From my understanding, the attenuation should deal with this, making light weaker at the edges of the sphere and stronger at the origin of the point source.

However I am using the exact same equations as in the XNA tutorial and cannot see why it is not working for me.

Unfortunately I only have access to the computers at my university at the moment and cannot turn on some of the debug features, as I don't have administrator privileges, so I don't have the luxury of using the shader debugging features in PIX.

I have tried outputting the attenuation to the red channel in the pixel shader though and came up with this:

[screenshot: attenuation written to the red channel, constant across the sphere]

Does this look right? To me this looks like the attenuation is the same for every pixel in the sphere which should not be right. They all seem to be getting the maximum amount of light until it reaches the edge of the sphere where it cuts off suddenly because those pixels are not being lit.

If anyone has any insight on this problem it would be greatly appreciated! I can post my code if needed but as I say it's identical really to that in the tutorial I linked above.

Thanks

EDIT

I have outputted the screen position information that gets input into the pixel shader and I get this:

[screenshot: the screen-position values input to the pixel shader]

This looks right to me which makes me think that my problem may lie with calculating the world position from the screen position. The code I have for that at the moment is:

//obtain screen position
input.ScreenPosition.xy /= input.ScreenPosition.w;

//obtain texture coordinates corresponding to the current pixel and align to pixels
float2 texCoord = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);
texCoord -= gHalfPixel;

//get depth
float depthVal = tex2D(depthS, texCoord).r;

//compute screen-space position
float4 position;
position.xy = input.ScreenPosition.xy;
position.z = depthVal;
position.w = 1.0f;

//transform to world space
position = mul(position, gInvViewProjection);
position /= position.w;

Again any help would be greatly appreciated!

Edited by TJClifton, 13 September 2012 - 08:10 AM.


#2 BCullis   Crossbones+   -  Reputation: 1813


Posted 13 September 2012 - 08:07 AM

For starters, can you post your shader code? We know all your code isn't identical, since as you say you've had to replace XNA methods with DX9 methods, but it would be nice to rule out any HLSL slip-ups from the start, especially since HLSL errors can go undetected more easily than DX code errors: there are a lot of "fail silently" results inside shader code, depending on assigned values and register contents.
Hazard Pay :: FPS/RTS in SharpDX
DeviantArt :: Because right-brain needs love too

#3 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 08:23 AM

OK, sure. I just didn't want to be one of those people who posts two pages of code with "help me" added after :)

My vertex shader is pretty simple, just passes on the positions to the pixel shader:


VSOutput VSFunction(VSInput input)
{
    VSOutput outVS;
    float4 worldPosition = mul(float4(input.Position, 1), gWorld);
    float4 viewPosition = mul(worldPosition, gView);
    outVS.Position = mul(viewPosition, gProjection);
    outVS.ScreenPosition = outVS.Position;
    return outVS;
}

And my pixelShader:

float4 PSFunction(VSOutput input) : COLOR0
{
    //obtain screen position
    input.ScreenPosition.xy /= input.ScreenPosition.w;

    //obtain texture coordinates corresponding to the current pixel
    //the screen coordinates are in [-1,1]*[1,-1]
    //the texture coordinates need to be in [0,1]*[0,1]
    float2 texCoord = 0.5f * (float2(input.ScreenPosition.x, -input.ScreenPosition.y) + 1);
    //align texels to pixels
    texCoord -= gHalfPixel;

    //get normal data from the normalMap
    float4 normalData = tex2D(normalS, texCoord);
    //transform normal back into [-1,1] range
    float3 normal = 2.0f * normalData.xyz - 1.0f;
    //get specular power
    float specularPower = normalData.a * 255;
    //get specular intensity from the colorMap
    float specularIntensity = tex2D(colourS, texCoord).a;

    //read depth
    float depthVal = tex2D(depthS, texCoord).r;

    //compute screen-space position
    float4 position;
    position.xy = input.ScreenPosition.xy;
    position.z = depthVal;
    position.w = 1.0f;
    //transform to world space
    position = mul(position, gInvViewProjection);
    position /= position.w;

    //surface-to-light vector (use .xyz to avoid an implicit float4 truncation)
    float3 lightVector = gLightPosition - position.xyz;

    //compute attenuation based on distance - linear attenuation
    float attenuation = saturate(1.0f - length(lightVector) / gLightRadius);
    //normalize light vector
    lightVector = normalize(lightVector);

    //compute diffuse light
    float NdL = max(0, dot(normal, lightVector));
    float3 diffuseLight = NdL * gLightColour.rgb;

    //reflection vector
    float3 reflectionVector = normalize(reflect(-lightVector, normal));

    //camera-to-surface vector
    float3 directionToCamera = normalize(gEyePosition - position.xyz);

    //compute specular light
    float specularLight = specularIntensity * pow(saturate(dot(reflectionVector, directionToCamera)), specularPower);

    //take into account attenuation and lightIntensity
    return attenuation * gLightIntensity * float4(diffuseLight.rgb, specularLight);
}

And in case you want them my structs are:

struct VSInput
{
float3 Position : POSITION0;
};
struct VSOutput
{
float4 Position : POSITION0;
float4 ScreenPosition : TEXCOORD0;
};

#4 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 08:27 AM

I'm sure the values for texCoord are OK, as are the various values I extract from my textures (like the depth value), because when I output that information to the screen it matches up with my depth buffer, normal buffer and so on. I reckon I must be doing something stupid when transforming my position to world space, because I can't think what else it could be.

#5 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 08:31 AM

Also, outputting my position information after it has been transformed results in a black screen, which must mean zeros for the x, y and z values, and that surely can't be right.

#6 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 08:38 AM

I suppose I should post my C++ code as well as it's definitely possible I've made a mistake here.

[source lang="cpp"]
void DeferredRenderer::drawPointLight(D3DXVECTOR3& lightPos, D3DXVECTOR3& lightCol,
                                      float lightRadius, float lightIntensity)
{
    HR(mPLightFX->SetTexture(mhColourMap, mColourRT));
    HR(mPLightFX->SetTexture(mhNormalMap, mNormalRT));
    HR(mPLightFX->SetTexture(mhDepthMap, mDepthRT));

    D3DXMATRIX sphereWorld, S, T;
    D3DXMatrixScaling(&S, lightRadius, lightRadius, lightRadius);
    D3DXMatrixTranslation(&T, lightPos.x, lightPos.y, lightPos.z);
    sphereWorld = S * T;

    D3DXMATRIX view, proj;
    view = gCamera->view();
    proj = gCamera->proj();

    HR(mPLightFX->SetMatrix(mhPLightWorld, &sphereWorld));
    HR(mPLightFX->SetMatrix(mhPLightView, &view));
    HR(mPLightFX->SetMatrix(mhPLightProjection, &proj));
    HR(mPLightFX->SetValue(mhPLightLightPosition, &lightPos, sizeof(D3DXVECTOR3)));
    HR(mPLightFX->SetValue(mhPLightLightColour, &lightCol, sizeof(D3DXVECTOR3)));
    HR(mPLightFX->SetFloat(mhPLightLightRadius, lightRadius));
    HR(mPLightFX->SetFloat(mhPLightLightIntensity, lightIntensity));
    HR(mPLightFX->SetValue(mhEyePosition, &gCamera->pos(), sizeof(D3DXVECTOR3)));

    D3DXMATRIX inverseVP = gCamera->viewProj();
    D3DXMatrixInverse(&inverseVP, 0, &inverseVP);
    HR(mPLightFX->SetMatrix(mhInvViewProjection, &inverseVP));
    HR(mPLightFX->SetValue(mhHalfPixel, &mHalfPixel, sizeof(D3DXVECTOR2)));

    float camToCentre = sqrt(pow(gCamera->pos().x - lightPos.x, 2) +
                             pow(gCamera->pos().y - lightPos.y, 2) +
                             pow(gCamera->pos().z - lightPos.z, 2));
    if (camToCentre < lightRadius)
    {
        HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW));
    }
    else
    {
        HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW));
    }
    HR(gd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, false));

    UINT numPasses = 0;
    HR(mPLightFX->Begin(&numPasses, 0));
    HR(mPLightFX->BeginPass(0));
    HR(mPLightFX->CommitChanges());
    drawObject(mSphere, sphereWorld);
    HR(mPLightFX->EndPass());
    HR(mPLightFX->End());

    HR(gd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, true));
    HR(gd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CCW));
}
[/source]

drawObject() simply calls DrawSubset(0) on the sphere mesh at the moment.

#7 TJClifton   Members   -  Reputation: 106


Posted 13 September 2012 - 09:05 AM

All I can think is that it must have something to do with this line:

position = mul(position, gInvViewProjection);

Before that line the values seem normal, and after it everything seems to be zeroed out. But I'm sure I'm setting that parameter properly in the application code, and the handles are set up OK for the effect as well. I can't think of anywhere else that could be causing the problem, but at the same time I can't see how this could be wrong either.

#8 Seabolt   Members   -  Reputation: 633


Posted 13 September 2012 - 01:23 PM

So the first step whenever you're getting something like this is to bring up PIX. It'll allow you to debug your shaders and is invaluable. Otherwise, if you're getting zeros after that line, make sure your InvViewProjection is valid prior to setting it on the shader, and inside the shader. I'm sure your GBuffer is fine, and the shader looks fine, so my initial guess is bad constants.
Perception is when one imagination clashes with another

#9 TJClifton   Members   -  Reputation: 106


Posted 14 September 2012 - 08:14 AM

Right, I feel like a complete idiot now, but I've found the problem.

I was certain the pixel shader was getting garbage data for gInvViewProjection, but equally sure that the matrix in the application code was fine.

I thought I'd double-checked my shader handles as well, but I think this is a good example of why coding when you're tired is a bad idea. It turns out I was passing mhInvViewProjection to my shader, which is the handle for a different shader to the one I'm using for the point light. What I should have been passing in was mhPLightInvViewProjection. I knew I was doing something stupid, I just couldn't figure out what!

Anyway I now have all sorts of new problems but the lighting value is now correct :)

Thanks again for the help!



