# [solved] PointLight renders as HalfSphere


## Recommended Posts

Hi guys,

At the moment I'm trying to implement deferred shading in my engine (using XNA), following Catalin Zima's articles (-> http://www.catalinzi...ndering-in-xna/ ). When it comes to rendering point lights I get strange results: the light is only rendered as a half sphere. Here is a screenshot:

I know that the model and the point-light effect file are correct: the sphere comes from the article, and for testing I used the effect file from the sample, getting the same results. The matrices should be right, as they are only Scale * Translation, and culling is also working.
The project is embedded in WinForms; could there be some missing properties for the ContentImporter?

Do you have any ideas about what could be wrong? I can post some code as well, just say what you need to see.
I hope someone can help. Thanks in advance,

major.

##### Share on other sites
Is the normal of your plane correct?

##### Share on other sites
Yes, other lighting works as expected too. The terrain is editable, so there shouldn't be a problem with the normals; I would have noticed it before.

##### Share on other sites
I did some deferred shading with XNA too; IIRC you need to set the DepthStencilState to None and back to Default after rendering the lights.

##### Share on other sites
You're right, that's the way I do it too. I played a lot with the DepthStencilStates: either you see no lighting at all, or that half sphere.
I even tried rotating the model, but it always looks the same. The distance from the light to the camera and the view direction don't change anything... No idea what could be wrong.
Thanks for your replies.

##### Share on other sites
I've seen this bug several times. Look over your shaders: you forgot to normalize some vector, or you compressed something from the [-1,1] range into [0,1] and forgot to uncompress it.

##### Share on other sites
Reading your post, it makes sense, but after looking over my shaders I can't find anything wrong. Is it right that the mistake has to be in the terrain shader? The point-light shader was copied from the sample for testing, so I know it works. And the half point light is already in the light map, so the combine effect can't be the problem. Here's the code for my terrain shader:

```hlsl
PixelShaderOutput PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    PixelShaderOutput output;

    //+++++++++++++++++++++++++++++++++++
    //++++ C O L O R - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++
    //Get Colorvalues from Textures and calculate final TextureColor
    float3 rTex = tex2D(RTextureSampler, input.UV * rTile);
    float3 gTex = tex2D(GTextureSampler, input.UV * gTile);
    float3 bTex = tex2D(BTextureSampler, input.UV * bTile);
    float3 aTex = tex2D(ATextureSampler, input.UV * aTile);
    float3 baseTex = tex2D(BaseTextureSampler, input.UV * BaseTile);

    float3 baseWeight = clamp(1.0f - input.Weights.x - input.Weights.y - input.Weights.z - input.Weights.w, 0, 1);
    float3 texColor = baseWeight * baseTex;
    texColor += input.Weights.x * rTex + input.Weights.y * gTex + input.Weights.z * bTex + input.Weights.w * aTex;

    output.Color.rgb = texColor;
    output.Color.a = specularIntensity;

    //+++++++++++++++++++++++++++++++++++
    //+++ N O R M A L - S E C T I O N +++
    //+++++++++++++++++++++++++++++++++++
    //Process VertexNormal and bring it in [0,1] range
    float3 vnormal = 0.5f * (input.Normal + 1.0f);
    normalize(vnormal);

    //Get Normals from NormalMaps (already in [0,1] range)
    float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);  // * 2.0 - 1.0;
    float3 rNorm = tex2D(rNormalSampler, input.UV * rTile).rgb;   // * 2.0 - 1.0;
    float3 gNorm = tex2D(gNormalSampler, input.UV * gTile).rgb;   // * 2.0 - 1.0;
    float3 bNorm = tex2D(bNormalSampler, input.UV * bTile).rgb;   // * 2.0 - 1.0;
    float3 aNorm = tex2D(aNormalSampler, input.UV * aTile).rgb;   // * 2.0 - 1.0;
    float3 normal = normalize(baseWeight * baseNorm);

    //Add Vertex- and TextureNormals up
    normal = normalize((normal + input.Weights.x * rNorm + input.Weights.y * gNorm + input.Weights.z * bNorm + input.Weights.w * aNorm) + vnormal);

    output.Normal.rgb = normal;  //0.5f * (normal + 1.0f);
    output.Normal.a = specularPower;

    //+++++++++++++++++++++++++++++++++++
    //++++ D E P T H - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++
    //Depth is VertexShaderOutput.Position.z & .w
    output.Depth = input.Depth.x / input.Depth.y;
    output.Stencil = float4(1.0f, 0, 0, 1.0f);
    return output;
}
```

Is there something wrong with it? =/
Thank you all,

major

##### Share on other sites

```hlsl
output.Normal.rgb = normal;  //0.5f * (normal + 1.0f);
```

What is your render target format? Does it support storing positive and negative values? If not, you were right to output (normal + 1.0f) * 0.5f, but you have to make sure to do the inverse correctly in the light shader when you read the normal back.
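
For what it's worth, the matched encode/decode pair might look like this (a sketch; the sampler and variable names here are placeholders, not taken from the thread):

```hlsl
// G-buffer pass: pack the world-space normal from [-1,1] into [0,1]
// so it survives an unsigned 8-bit (Color) render target.
output.Normal.rgb = 0.5f * (normal + 1.0f);

// Light pass: read it back and undo the packing BEFORE doing any math with it.
float3 normal = tex2D(normalSampler, texCoord).rgb * 2.0f - 1.0f;
normal = normalize(normal);  // re-normalize to undo 8-bit quantization error
```

The important part is that the two halves mirror each other exactly: every `* 0.5 + 0.5` on write needs a `* 2 - 1` on read.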

##### Share on other sites

```hlsl
//Get Normals from NormalMaps (already in [0,1] range)
float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);  // * 2.0 - 1.0;
```

The code you're using to read the normal maps also looks suspicious. Your comment "already in [0,1] range" is strange, because the correct range for each component of a normal vector is [-1,1]. Make sure you're reading them all correctly.

It's very handy to have debug views where you can render the normals, etc., from your shaders; being able to sanity-check the values you actually have helps a lot when debugging this kind of thing.

```hlsl
//Process VertexNormal and bring it in [0,1] range
float3 vnormal = 0.5f * (input.Normal + 1.0f);
normalize(vnormal);
```

This also looks wrong. You definitely don't want to encode the normal into the [0,1] range before normalizing the vector (also note that `normalize()` returns a value; calling it without assigning the result does nothing). This should read `float3 vnormal = normalize(input.Normal);` Edited by rdragon1

##### Share on other sites
That is because I output the normals to a texture to look them up in a separate effect. I can't store values in the [-1,1] range in a texture, so I transform them, and bring them back to [-1,1] in the other effect. Or am I missing something?

Edit: The render target format is Color, so it can store values in the [0,1] range. Edited by majorbrot

##### Share on other sites
You're missing that you're calling normalize() on these encoded [0,1] vectors, and that's not doing what you want.

Keep everything in its natural [-1,1] form until just before you write it to the render target, then decode it on the other end when you read it from the texture.
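
Concretely, assuming the names from the terrain shader posted above, the normal section could be reordered along these lines (a sketch, not a drop-in replacement):

```hlsl
// Work in [-1,1] space throughout; pack only at the very end.
float3 vnormal  = normalize(input.Normal);  // no [0,1] packing here
// Decode each normal-map sample back to [-1,1] as it is read.
float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile).rgb * 2.0f - 1.0f;
// ... decode rNorm/gNorm/bNorm/aNorm the same way, blend with the weights ...
float3 normal   = normalize(vnormal + baseWeight * baseNorm /* + weighted map normals */);

output.Normal.rgb = 0.5f * (normal + 1.0f);  // pack once, on write
```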

##### Share on other sites

I can't store values in [-1,1] range in a Texture, so i transform it and bring it back in [-1,1] in the other effect. Or am i missing something?
The part where you decode your normal maps back to the [-1,1] range is the `// * 2.0 - 1.0;` part of your code, which you've commented out for some reason? As above, when you call normalize on these [0,1]-range (packed) normals, you're destroying them. You should be doing all your normal math with values in the [-1,1] range, not the compressed [0,1] range.

Also, you're not performing a tangent-space rotation on your normal-map values; you're just averaging them with the vertex normal. This means you're not treating them as tangent-space normal maps (where blue points out of a flat surface), but effectively as object-space normal maps (where blue is +Z, regardless of surface orientation).

If the Y-axis is up in your engine, and you're using blue-ish normal maps, then your code is going to produce a normal that's facing sideways (tangential to your plane) instead of sticking up out of the plane, which explains your image. Edited by Hodgman
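
A common tangent-space sketch, assuming the vertex shader also passes a `Tangent` vector (which the posted code doesn't have yet):

```hlsl
// Build a TBN basis from interpolated vertex attributes and rotate the
// tangent-space normal-map sample into world space.
float3 N = normalize(input.Normal);
float3 T = normalize(input.Tangent - N * dot(input.Tangent, N));  // Gram-Schmidt re-orthogonalize
float3 B = cross(N, T);  // bitangent; sign may need flipping depending on UV handedness

float3 mapNormal = tex2D(NormalSampler, input.UV).rgb * 2.0f - 1.0f;  // decode [0,1] -> [-1,1]
float3 worldNormal = normalize(mapNormal.x * T + mapNormal.y * B + mapNormal.z * N);
```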

##### Share on other sites

Also, you're not performing a tangent-space rotation on your normal-map values

Indeed. I would eliminate sampling from normal map textures for now and get your lighting working with vertex normals first. You have a bit more work to do before you can use normal maps (assuming you're using tangent-space normal maps).

##### Share on other sites
I think so too... I just commented out the normal-map stuff, and for now it's working, although it seems a bit hacky, because I have to take the negative normal from the vertex. So I'll try to figure out what that is, then I can go a step further and try the normal mapping. For now all normal calculations are done on the CPU, but is there an easy way of doing it on the GPU? I'm relatively new to this, so there's a lot to learn ;)

Thanks a lot, you really helped me out.

##### Share on other sites

because i have to take the negative normal from the vertex.

I'm going to guess that your normals are in one space, and your lighting is being done in another space. Are you doing your light calculation in world space? If so, did you transform your normal into world space from (presumably) object space? Or maybe you have an up-axis wrong in some context. Or maybe your light direction vector is pointing in the wrong direction (if you're trying to calculate "N dot L", L should point from the surface to the light)
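
For illustration, the usual world-space form of that calculation (all names here hypothetical):

```hlsl
// L must point FROM the surface TO the light, and both vectors must be
// in the same space (here: world space) and normalized.
float3 L = normalize(lightPosition - worldPosition);
float3 N = normalize(worldNormal);

float NdotL = saturate(dot(N, L));  // saturate clamps back-facing results to 0
float3 diffuse = lightColor * NdotL;
```

If L is accidentally built the other way around (light to surface), the lit and unlit halves swap, which looks exactly like lighting the wrong hemisphere.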

For now all NormalCalculations are done on the CPU, but is there an easy way of doing it on the GPU? I'm relatively new to this, so theres a lot to learn ;)

Depends. What do you mean by "NormalCalculations"? Edited by rdragon1

##### Share on other sites
The light calculations are all done in world space, so I have a vector for the directional light that is not transformed in any way, and the normals from the terrain aren't either; they're just passed from the vertex to the pixel shader.

With the N dot L thing, do you mean that technically the sunlight vector should point up? Then that seems to be wrong; I thought it would just be the "right direction".

With normal calculation I mean calculating the vertex normals of the terrain. Right now they are calculated when the terrain is created, recalculated when it changes, and then passed as vertex data. My question was whether it is possible to calculate them in the vertex shader. If I remember correctly, I read somewhere that this could be done, but I don't know how, because you don't have access to the neighboring vertices.

Thanks, all this stuff makes me feel so newbish again...

Edit: I think the calculation in the vertex shader could be done if I had a heightmap. But that's missing, so I see no chance besides generating one.
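
If a heightmap did exist, a common sketch is central differences via vertex texture fetch (this assumes vs_3_0 and a texture format that supports vertex fetch; `HeightSampler`, `uv`, `texelSize`, and `cellSize` are assumed names):

```hlsl
// Sample the four axis neighbors of this vertex in the heightmap.
// tex2Dlod is required in a vertex shader (no gradients available).
float hl = tex2Dlod(HeightSampler, float4(uv - float2(texelSize, 0), 0, 0)).r;  // left
float hr = tex2Dlod(HeightSampler, float4(uv + float2(texelSize, 0), 0, 0)).r;  // right
float hd = tex2Dlod(HeightSampler, float4(uv - float2(0, texelSize), 0, 0)).r;  // down
float hu = tex2Dlod(HeightSampler, float4(uv + float2(0, texelSize), 0, 0)).r;  // up

// For a Y-up height field, the normal is proportional to
// (-dh/dx, 1, -dh/dz); cellSize is the world-space grid spacing.
float3 normal = normalize(float3(hl - hr, 2.0f * cellSize, hd - hu));
```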

Edit edit: Got the normal mapping working, and it looks good to me:

Thank you again, it helped me so much. Edited by majorbrot