majorbrot

[solved] PointLight renders as HalfSphere


Hi guys,

At the moment I'm trying to implement deferred shading in my engine (using XNA), following Catalin Zima's articles (-> http://www.catalinzi...ndering-in-xna/ ). When it comes to rendering point lights I get strange results: the light is only rendered as a half sphere. Here is a screenshot:

dsbug.png

I know that the model and the point-light effect file are correct: the sphere comes from the article, and for testing I used the effect file from the sample, getting the same results. The matrices should be right, as they are only Scale * Translation, and culling is also working.
The project is embedded in WinForms; could there be some missing properties for the ContentImporter?

Do you have any ideas about what could be wrong? I can post some code as well, just say what you need to see.
I hope someone can help. Thanks in advance,

major.

Yes, other lighting works as expected too. The terrain is editable, so there shouldn't be a problem with the normals; I would have noticed it before.

I did some deferred shading with XNA too. IIRC you need to set the DepthStencilState to None before rendering the lights, and back to Default afterwards.
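In XNA 4.0 terms that looks roughly like the sketch below. This is only an outline, assuming `device` is your GraphicsDevice and `DrawPointLights()` is a hypothetical method that draws the light volume spheres additively:

```csharp
// G-buffer pass happens first, with normal depth testing.

// Lights: disable depth read/write so the light volumes
// aren't clipped against the scene depth already in the buffer.
device.DepthStencilState = DepthStencilState.None;
device.BlendState = BlendState.Additive;

DrawPointLights();   // hypothetical: renders the light volume geometry

// Restore the defaults for the rest of the frame.
device.DepthStencilState = DepthStencilState.Default;
device.BlendState = BlendState.Opaque;
```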

You're right, that's the way I do it, too. I played a lot with DepthStencilStates: either you see no lighting at all, or that half sphere.
I even tried to rotate the model, but it always looks the same. The distance from the light to the camera and the view direction don't change anything... No idea what could be wrong.
Thanks for your replies.

I've seen this bug several times. Look over your shaders: you either forgot to normalize some vector, or you compressed something from the [-1,1] range into [0,1] and forgot to uncompress it.
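For reference, the compress/uncompress pair has to be symmetric between the shader that writes the G-buffer and the one that reads it. A minimal HLSL sketch (sampler and variable names are made up):

```hlsl
// G-buffer pass: pack a unit-length normal into a [0,1] render target
float3 n = normalize(input.Normal);
output.Normal.rgb = 0.5f * n + 0.5f;        // [-1,1] -> [0,1]

// Light pass: undo the packing before doing any lighting math
float3 normal = tex2D(NormalSampler, texCoord).rgb;
normal = normalize(2.0f * normal - 1.0f);   // [0,1] -> [-1,1]
```

If either side of this pair is missing (or applied twice), every normal ends up biased into one half-space, which produces exactly the kind of half-sphere artifact shown in the screenshot.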

Reading your post it makes sense, but after looking over my shaders I can't find anything wrong. Is it right that the mistake has to be in the terrain shader? The point-light shader was copied for testing from the sample, so I know it works. And the half point light is already in the light map, so the combine effect can't be the problem. Here's the code for my terrain shader:


PixelShaderOutput PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    PixelShaderOutput output;

    //+++++++++++++++++++++++++++++++++++
    //++++ C O L O R - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++

    //Get Colorvalues from Textures and calculate final TextureColor
    float3 rTex = tex2D(RTextureSampler, input.UV * rTile);
    float3 gTex = tex2D(GTextureSampler, input.UV * gTile);
    float3 bTex = tex2D(BTextureSampler, input.UV * bTile);
    float3 aTex = tex2D(ATextureSampler, input.UV * aTile);
    float3 baseTex = tex2D(BaseTextureSampler, input.UV * BaseTile);

    float3 baseWeight = clamp(1.0f - input.Weights.x - input.Weights.y - input.Weights.z - input.Weights.w, 0, 1);
    float3 texColor = baseWeight * baseTex;
    texColor += input.Weights.x * rTex + input.Weights.y * gTex + input.Weights.z * bTex + input.Weights.w * aTex;

    output.Color.rgb = texColor;
    output.Color.a = specularIntensity;

    //+++++++++++++++++++++++++++++++++++
    //+++ N O R M A L - S E C T I O N +++
    //+++++++++++++++++++++++++++++++++++

    //Process VertexNormal and bring it in [0,1] range
    float3 vnormal = 0.5f * (input.Normal + 1.0f);
    normalize(vnormal);

    //Get Normals from NormalMaps (already in [0,1] range)
    float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);// * 2.0 - 1.0;
    float3 rNorm = tex2D(rNormalSampler, input.UV * rTile).rgb;// * 2.0 - 1.0;
    float3 gNorm = tex2D(gNormalSampler, input.UV * gTile).rgb;// * 2.0 - 1.0;
    float3 bNorm = tex2D(bNormalSampler, input.UV * bTile).rgb;// * 2.0 - 1.0;
    float3 aNorm = tex2D(aNormalSampler, input.UV * aTile).rgb;// * 2.0 - 1.0;
    float3 normal = normalize(baseWeight * baseNorm);

    //Add Vertex- and TextureNormals up
    normal = normalize((normal + input.Weights.x * rNorm + input.Weights.y * gNorm + input.Weights.z * bNorm + input.Weights.w * aNorm) + vnormal);

    output.Normal.rgb = normal;//0.5f * (normal + 1.0f);
    output.Normal.a = specularPower;

    //+++++++++++++++++++++++++++++++++++
    //++++ D E P T H - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++

    //Depth is VertexShaderOutput.Position.z & .w
    output.Depth = input.Depth.x / input.Depth.y;
    output.Stencil = float4(1.0f, 0, 0, 1.0f);
    return output;
}


Is there something wrong with it? =/
Thank you all,

major



output.Normal.rgb = normal;//0.5f * (normal + 1.0f);


What is your render target format? Does it support storing positive and negative values? If not, you were right to output (normal + 1.0f) * 0.5f, but you have to make sure to do the inverse correctly in the light shader when you read the normal back.


//Get Normals from NormalMaps (already in [0,1] range)
float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);// * 2.0 - 1.0;

The code you're using to read the normal maps from the textures also looks suspicious. Your comment about "already in [0,1] range" is strange, because the correct range for each component of a normal vector is [-1,1]. Make sure you're reading them all correctly.

It's very handy to have debug views where you can render the normals etc. from your shaders, so you can sanity-check the values while debugging this stuff.
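A cheap way to get such a debug view is a temporary pass-through at the end of the pixel shader. A sketch, assuming `normal` holds the decoded [-1,1] normal at that point:

```hlsl
// Temporary debug output: visualize the normal as a color.
// Remap [-1,1] to [0,1] so negative components stay visible.
output.Color.rgb = 0.5f * normal + 0.5f;
output.Color.a = 1.0f;
```

With this, a correct normal buffer over flat ground should look mostly uniform light blue/purple; a hard discontinuity or a solid half usually points straight at an encode/decode mismatch.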



//Process VertexNormal and bring it in [0,1] range
float3 vnormal = 0.5f * (input.Normal + 1.0f);
normalize(vnormal);


This also looks wrong. You definitely don't want to encode the normal into the [0,1] range before normalizing the vector (and note that normalize() returns its result rather than modifying its argument). This should read: float3 vnormal = normalize(input.Normal);
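Putting those points together, the normal section could be restructured like this. This is a sketch reusing the names from the posted shader, and it assumes the normal maps are authored in the usual packed encoding (so they need the * 2 - 1 expansion): blend everything in [-1,1] space, and encode only once, right before writing the output.

```hlsl
// Blend all normals in [-1,1] space
float3 vnormal  = normalize(input.Normal);

// Normal maps are stored in [0,1], so expand them first
float3 baseNorm = tex2D(NormalSampler,  input.UV * BaseTile).rgb * 2.0f - 1.0f;
float3 rNorm    = tex2D(rNormalSampler, input.UV * rTile).rgb * 2.0f - 1.0f;
float3 gNorm    = tex2D(gNormalSampler, input.UV * gTile).rgb * 2.0f - 1.0f;
float3 bNorm    = tex2D(bNormalSampler, input.UV * bTile).rgb * 2.0f - 1.0f;
float3 aNorm    = tex2D(aNormalSampler, input.UV * aTile).rgb * 2.0f - 1.0f;

float3 normal = normalize(vnormal
    + baseWeight * baseNorm
    + input.Weights.x * rNorm + input.Weights.y * gNorm
    + input.Weights.z * bNorm + input.Weights.w * aNorm);

// Encode once, when writing to the [0,1] render target,
// and decode (* 2 - 1, then normalize) in the light shader.
output.Normal.rgb = 0.5f * normal + 0.5f;
output.Normal.a   = specularPower;
```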

That is because I output the normals into a texture to look them up in a separate effect. I can't store values in the [-1,1] range in a texture, so I transform them, and bring them back into [-1,1] in the other effect. Or am I missing something?

Edit: the render target format is Color, so it can store values in the [0,1] range.
