
## [solved] PointLight renders as HalfSphere


15 replies to this topic

### #1 majorbrot (Members)


Posted 07 July 2012 - 08:15 AM

Hi guys,

At the moment I'm trying to implement deferred shading in my engine (using XNA), following Catalin Zima's articles (-> http://www.catalinzi...ndering-in-xna/ ). When it comes to rendering point lights I get strange results: the light is only rendered as a half sphere. Here is a screenshot:

I know that the model and the point-light effect file are correct: the sphere model comes from the article, and for testing I used the effect file from the sample, with the same results. The matrices should be right, as they are only Scale * Translation, and culling is also working.
The project is embedded in WinForms; could there be some missing properties for the ContentImporter?

Do you have any ideas about what could be wrong? I can post some code as well, just say what you need to see.
Hope someone can help. Thanks in advance,

major.

### #2 Hodgman (Moderators)


Posted 07 July 2012 - 08:18 AM

Is the normal of your plane correct?

### #3 majorbrot (Members)


Posted 07 July 2012 - 08:28 AM

Yes, other lighting works as expected too. The terrain is editable, so there shouldn't be a problem with the normals; I would have noticed it before.

### #4 Saoblol (Members)


Posted 07 July 2012 - 12:24 PM

I did some deferred shading with XNA too. IIRC you need to set the DepthStencilState to None before rendering the lights and back to Default afterwards.
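For reference, this is roughly what that state bracketing looks like in XNA 4.0; `DrawPointLights()` is a hypothetical helper standing in for your light-volume draw calls:

```csharp
// Light volumes must not write or test against the depth buffer,
// otherwise geometry in front of the sphere clips the light.
GraphicsDevice.DepthStencilState = DepthStencilState.None;
GraphicsDevice.BlendState = BlendState.Additive;   // accumulate lights

DrawPointLights();   // hypothetical: draws the sphere volume per light

// Restore the defaults before drawing anything depth-tested again.
GraphicsDevice.DepthStencilState = DepthStencilState.Default;
GraphicsDevice.BlendState = BlendState.Opaque;
```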

### #5 majorbrot (Members)


Posted 07 July 2012 - 02:02 PM

You're right, that's the way I do it too. I played around a lot with the DepthStencilStates: either you see no lighting at all, or that half sphere.
I even tried rotating the model, but it always looks the same. The distance from the light to the camera and the view direction don't change anything... No idea what could be wrong.

### #6 rdragon1 (Members)


Posted 07 July 2012 - 02:27 PM

I've seen this bug several times. Look over your shaders: either you forgot to normalize some vector, or you compressed something from the range [-1,1] into [0,1] and forgot to uncompress it.

### #7 majorbrot (Members)


Posted 08 July 2012 - 02:25 AM

Reading your post it makes sense, but after looking over my shaders I can't find anything wrong. Is it right that the mistake has to be in the terrain shader? The PointLightShader was copied from the sample for testing, so I know it works. And the half point light is already in the lightmap, so the combine effect can't be the problem. Here's the code for my terrain shader:

```hlsl
PixelShaderOutput PixelShaderFunction(VertexShaderOutput input)
{
    PixelShaderOutput output;

    //+++++++++++++++++++++++++++++++++++
    //++++ C O L O R - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++

    //Get Colorvalues from Textures and calculate final TextureColor
    float3 rTex = tex2D(RTextureSampler, input.UV * rTile);
    float3 gTex = tex2D(GTextureSampler, input.UV * gTile);
    float3 bTex = tex2D(BTextureSampler, input.UV * bTile);
    float3 aTex = tex2D(ATextureSampler, input.UV * aTile);
    float3 baseTex = tex2D(BaseTextureSampler, input.UV * BaseTile);

    float3 baseWeight = clamp(1.0f - input.Weights.x - input.Weights.y - input.Weights.z - input.Weights.w, 0, 1);
    float3 texColor = baseWeight * baseTex;
    texColor += input.Weights.x * rTex + input.Weights.y * gTex + input.Weights.z * bTex + input.Weights.w * aTex;

    output.Color.rgb = texColor;
    output.Color.a = specularIntensity;

    //+++++++++++++++++++++++++++++++++++
    //+++ N O R M A L - S E C T I O N +++
    //+++++++++++++++++++++++++++++++++++

    //Process VertexNormal and bring it in [0,1] range
    float3 vnormal = 0.5f * (input.Normal + 1.0f);
    normalize(vnormal);

    //Get Normals from NormalMaps (already in [0,1] range)
    float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);// * 2.0 - 1.0;
    float3 rNorm = tex2D(rNormalSampler, input.UV * rTile).rgb;// * 2.0 - 1.0;
    float3 gNorm = tex2D(gNormalSampler, input.UV * gTile).rgb;// * 2.0 - 1.0;
    float3 bNorm = tex2D(bNormalSampler, input.UV * bTile).rgb;// * 2.0 - 1.0;
    float3 aNorm = tex2D(aNormalSampler, input.UV * aTile).rgb;// * 2.0 - 1.0;
    float3 normal = normalize(baseWeight * baseNorm);

    normal = normalize((normal + input.Weights.x * rNorm + input.Weights.y * gNorm + input.Weights.z * bNorm + input.Weights.w * aNorm) + vnormal);

    output.Normal.rgb = normal;//0.5f * (normal + 1.0f);
    output.Normal.a = specularPower;

    //+++++++++++++++++++++++++++++++++++
    //++++ D E P T H - S E C T I O N ++++
    //+++++++++++++++++++++++++++++++++++

    output.Depth = input.Depth.x / input.Depth.y;
    output.Stencil = float4(1.0f, 0, 0, 1.0f);
    return output;
}
```


Is there something wrong with it? =/
Thank you all,

major

### #8 rdragon1 (Members)


Posted 08 July 2012 - 02:27 AM

```hlsl
output.Normal.rgb = normal;//0.5f * (normal + 1.0f);
```

What is your render target format? Does it support storing positive and negative values? If not, you were right to output (normal + 1.0f) * 0.5f, but you have to make sure to do the inverse correctly in the light shader when you read the normal.
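As a sketch (sampler and variable names here are illustrative, not from your code), the encode and the decode have to mirror each other:

```hlsl
// G-buffer pass: pack the signed normal into [0,1] just before output.
// A plain Color (R8G8B8A8) target can only store [0,1].
output.Normal.rgb = normal * 0.5f + 0.5f;

// Light pass: unpack back to [-1,1] right after reading the texture.
float3 normal = normalize(tex2D(normalMapSampler, texCoord).rgb * 2.0f - 1.0f);
```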

### #9 rdragon1 (Members)


Posted 08 July 2012 - 02:30 AM

```hlsl
//Get Normals from NormalMaps (already in [0,1] range)
float3 baseNorm = tex2D(NormalSampler, input.UV * BaseTile);// * 2.0 - 1.0;
```

The code you're using to read the normal maps from textures also looks suspicious. Your comment about "already in [0,1] range" is strange, because the correct range for each component of a normal vector is [-1,1]. Make sure you're reading them all correctly.

It's very handy to have debug views where you can render the normals etc. from your shaders to sanity-check the values; it helps a lot when debugging this stuff.

```hlsl
//Process VertexNormal and bring it in [0,1] range
float3 vnormal = 0.5f * (input.Normal + 1.0f);
normalize(vnormal);
```

This also looks wrong: you definitely don't want to encode the normal into the [0,1] range before normalizing the vector. This should read `float3 vnormal = normalize(input.Normal);`.
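Side by side (note that in HLSL `normalize()` returns a new vector; calling it as a bare statement, as the shader above does, discards the result):

```hlsl
// Wrong: packs into [0,1] first, and the bare normalize() call does nothing.
//   float3 vnormal = 0.5f * (input.Normal + 1.0f);
//   normalize(vnormal);

// Right: normalize the actual [-1,1] normal and keep it in that range
// until the final write to the render target.
float3 vnormal = normalize(input.Normal);
```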

Edited by rdragon1, 08 July 2012 - 02:32 AM.

### #10 majorbrot (Members)


Posted 08 July 2012 - 02:39 AM

That is because I output the normals into a texture to look them up in a separate effect. I can't store values in the [-1,1] range in a texture, so I transform them and bring them back to [-1,1] in the other effect. Or am I missing something?

Edit: the RenderTargetFormat is Color, so it can store values in the [0,1] range.

Edited by majorbrot, 08 July 2012 - 02:43 AM.

### #11 rdragon1 (Members)


Posted 08 July 2012 - 03:00 AM

You're missing that you're calling normalize() on these encoded [0,1] vectors, and that's not doing what you want.

Keep everything in its natural [-1,1] form until just before you write it to the render target, then decode it on the other end when you read it from the texture.

### #12 Hodgman (Moderators)


Posted 08 July 2012 - 03:00 AM

> I can't store values in [-1,1] range in a Texture, so i transform it and bring it back in [-1,1] in the other effect. Or am i missing something?

The part where you decode your normal maps back to the [-1,1] range is the "// * 2.0 - 1.0;" part of your code, which you've commented out for some reason. As above, when you call normalize on these [0,1]-range (packed) normals, you're destroying them. You should be doing all your normal math with values in the [-1,1] range, not the compressed [0,1] range.

Also, you're not performing a tangent-space rotation on your normal-map values and are just averaging them with the vertex normal. This means that you're not using tangent-space normal maps (where blue points out of a flat surface), but are actually treating them as object-space normal maps (where blue is +Z, regardless of surface orientation).

If the Y-axis is up in your engine, and you're using blue-ish normal maps, then your code is going to produce a normal that's facing sideways (tangential to your plane) instead of sticking up out of the plane, which explains your image.
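A sketch of what the missing tangent-space rotation would look like, assuming the vertex shader passes a world-space normal, tangent and binormal (those inputs are illustrative; the shader in this thread doesn't have them yet):

```hlsl
// Decode the tangent-space sample into [-1,1] (blue ~ +Z, out of the surface).
float3 nTex = tex2D(rNormalSampler, input.UV * rTile).rgb * 2.0f - 1.0f;

// Rows of the matrix are the world-space tangent frame at this pixel.
float3x3 tangentToWorld = float3x3(normalize(input.Tangent),
                                   normalize(input.Binormal),
                                   normalize(input.Normal));

// Rotate the sampled normal out of tangent space into world space.
float3 worldNormal = normalize(mul(nTex, tangentToWorld));
```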

Edited by Hodgman, 08 July 2012 - 03:17 AM.

### #13 rdragon1 (Members)


Posted 08 July 2012 - 03:22 AM

> Also, you're not performing a tangent-space rotation on your normal-map values

Indeed. I would eliminate sampling from normal map textures for now and get your lighting working with vertex normals first. You have a bit more work to do before you can use normal maps (assuming you're using tangent-space normal maps).

### #14 majorbrot (Members)


Posted 08 July 2012 - 03:46 AM

I think so too... I just commented out the normal-map stuff, and for now it's working, although it seems a bit hacky, because I have to take the negative normal from the vertex. So I'll try to figure out what that is, then I can go a step further and try the normal mapping. For now all normal calculations are done on the CPU, but is there an easy way of doing it on the GPU? I'm relatively new to this, so there's a lot to learn ;)

Thanks a lot, you really helped me out.

### #15 rdragon1 (Members)


Posted 08 July 2012 - 03:53 AM

> because i have to take the negative normal from the vertex.

I'm going to guess that your normals are in one space and your lighting is being done in another. Are you doing your light calculation in world space? If so, did you transform your normal into world space from (presumably) object space? Or maybe you have an up-axis wrong in some context. Or maybe your light direction vector points the wrong way (if you're trying to calculate "N dot L", L should point from the surface to the light).
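A minimal world-space sketch (names like `lightPosition` and `input.WorldPos` are illustrative):

```hlsl
// Transform the object-space normal into world space (assumes uniform scale).
float3 N = normalize(mul(input.Normal, (float3x3)World));

// For a point light, L points from the surface toward the light...
float3 L = normalize(lightPosition - input.WorldPos);
// ...and for a directional "sun", L is the direction the light comes FROM:
// float3 L = -normalize(sunDirection);

float NdotL = saturate(dot(N, L));
float3 diffuse = lightColor.rgb * NdotL;
```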

> For now all NormalCalculations are done on the CPU, but is there an easy way of doing it on the GPU? I'm relatively new to this, so theres a lot to learn ;)

Depends. What do you mean by "NormalCalculations" ?

Edited by rdragon1, 08 July 2012 - 03:55 AM.

### #16 majorbrot (Members)


Posted 08 July 2012 - 05:12 AM

The light calculations are all done in world space, so I have a vector for the directional light that is not transformed in any way, and the normals from the terrain aren't either; they're just passed from the vertex to the pixel shader.

With the N dot L thing, you mean that technically the sunlight points up? Then that seems to be wrong; I thought it would just be the "right direction".

With normal calculation I mean calculating the vertex normals of the terrain. Right now they are calculated when the terrain is created, recalculated when it changes, and then passed as vertex data. My question was whether it is possible to calculate them in the vertex shader. If I remember correctly, I read somewhere that this could be done, but I don't know how, because you don't have access to the neighboring vertices.

Thanks, all this stuff makes me feel so newbish again...

Edit: I think the calculation in the vertex shader could be done if I had a heightmap. But that's missing, so I see no chance besides generating one.
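If you do generate a heightmap, the vertex-shader version could look roughly like this central-differences sketch (`HeightSampler`, `texelSize`, `gridSpacing` and `heightScale` are assumed names; vertex texture fetch needs vs_3_0 / XNA's HiDef profile):

```hlsl
// Sample the four neighbours of this vertex in the heightmap.
float hL = tex2Dlod(HeightSampler, float4(uv - float2(texelSize, 0), 0, 0)).r * heightScale;
float hR = tex2Dlod(HeightSampler, float4(uv + float2(texelSize, 0), 0, 0)).r * heightScale;
float hD = tex2Dlod(HeightSampler, float4(uv - float2(0, texelSize), 0, 0)).r * heightScale;
float hU = tex2Dlod(HeightSampler, float4(uv + float2(0, texelSize), 0, 0)).r * heightScale;

// Central differences: slope in X and Z, with 2 * gridSpacing being the
// world-space distance between the two sampled neighbours.
float3 normal = normalize(float3(hL - hR, 2.0f * gridSpacing, hD - hU));
```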

Edit 2: Got the normal mapping working, looks good to me:

Thank you again, it helped me so much.

Edited by majorbrot, 08 July 2012 - 09:50 AM.
