Deferred shading: Light artifacts


I've just got my deferred shading pipeline to the point where I'm starting to get images out, but there's a problem I can't figure out: when I light my scene with a spotlight I get a pixel-thin outline around objects. [screenshot] Here is a slightly more complex scene with the same problem: [screenshot] All the geometry gets a thin edge of lighter pixels. Here is the pixel shader I use in the lighting pass of the deferred shader:
float4x4 SpotLightTransform;

float3 light_position;
float3 light_direction;

struct VS_Output {
	float4 position	: POSITION;
	float2 uv 	: TEXCOORD0;
};

float4 PS_Spot( VS_Output IN ) : COLOR0 {
	float3 view_pos = tex2D( texture_position_sampler, IN.uv ).rgb;
	float3 normal = normalize( tex2D( texture_normal_sampler, IN.uv ).rgb );
	float3 albedo = tex2D( texture_albedo_sampler, IN.uv ).rgb;
	
	float4 spot_pos = mul( float4( view_pos, 1 ), SpotLightTransform );
	
	float u = ( spot_pos.x / spot_pos.w ) / 2.0f + 0.5f;
	float v = ( spot_pos.y / spot_pos.w ) / 2.0f + 0.5f;
	
	// Clamp light contribution to spotlight texture
	float4 spot_light = 
		u >= 0 && u <= 1 && v >= 0 && v <= 1 ?
		tex2D( texture_spot_sampler, float2( u, 1 - v ) ) :
		float4( 0, 0, 0, 0 );
	
	float3 L = normalize( light_position - view_pos );
	float NL = max( dot( normal, L ), 0 );

	return float4( albedo * NL * spot_light.rgb * spot_light.a, 1 );
}
(I have removed the texture samplers and the trivial vertex shader for readability.) My G buffers are D3DFMT_A16B16G16R16F. Can anyone help me track down this bug? /Promethium

When you're calculating texture coordinates from normalized device coordinates (like you're doing), you need to offset your texture coordinate by half a pixel to account for the fact that pixels and texels don't have the same alignment in D3D9.
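As a sketch, the conversion from clip space to a texture coordinate with the half-pixel correction looks roughly like this (`ScreenSize` is an assumed shader constant holding the render-target dimensions in pixels):

```hlsl
float2 ScreenSize;  // assumed constant: render-target width/height in pixels

float2 ClipToUV( float4 clip_pos )
{
	// Perspective divide gives NDC in [-1, 1]
	float2 ndc = clip_pos.xy / clip_pos.w;
	// Map to [0, 1]; V is flipped because D3D texture space has V pointing down
	float2 uv = float2( ndc.x, -ndc.y ) * 0.5f + 0.5f;
	// D3D9 half-pixel offset: pixel centers sit half a pixel off texel centers
	return uv + 0.5f / ScreenSize;
}
```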

Also FYI, you can save a little bit of calculation by passing the projected position you get in your vertex shader on to your pixel shader, rather than re-calculating it.

Quote:
Original post by MJP
When you're calculating texture coordinates from normalized device coordinates (like you're doing), you need to offset your texture coordinate by half a pixel to account for the fact that pixels and texels don't have the same alignment in D3D9.

Yeah, good point. However, changing my spotlight lookup to

tex2D( texture_spot_sampler, float2( u, 1 - v ) + 0.5f / 512.0f )

(the spotlight texture is 512x512) didn't have much effect. Should it use the width/height of the G buffers instead? Or should the half-pixel offset be added somewhere else?

Quote:
Original post by MJP
Also FYI, you can save a little bit of calculation by passing the projected position you get in your vertex shader on to your pixel shader, rather than re-calculating it.

I don't understand this one: I don't calculate any projected position in the vertex shader, since I'm just rendering a fullscreen quad.

VS_Output VS_Main( VS_Input IN ) {
	VS_Output OUT;
	OUT.position = float4( IN.position, 1 );
	OUT.uv = IN.uv;
	return OUT;
}
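For what it's worth, if the offset belongs here, I'd guess it would use the G-buffer dimensions rather than the spotlight texture's, something like this (`GBufferSize` being a new constant holding the G-buffer width/height in pixels):

```hlsl
float2 GBufferSize;  // assumed constant: G-buffer width/height in pixels

VS_Output VS_Main( VS_Input IN ) {
	VS_Output OUT;
	OUT.position = float4( IN.position, 1 );
	// Shift the UVs by half a texel of the G buffer so each fragment
	// samples the texel that corresponds to its own pixel (D3D9 only)
	OUT.uv = IN.uv + 0.5f / GBufferSize;
	return OUT;
}
```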

Quote:
Original post by Promethium
I don't understand this one: I don't calculate any projected position in the vertex shader, since I'm just rendering a fullscreen quad.
*** Source Snippet Removed ***


Ahh I'm sorry, I thought you were doing something else at first. Don't mind me.

Quote:
Original post by Promethium
(The spotlight texture is 512x512) didn't have much effect. Should it be the width/height of the G buffers? Or should the half-pixel be added somewhere else?


That's most likely the problem. Try making all the render targets (except the geometry's material textures, of course) match the screen resolution.

If you don't want the cost of matching them (most likely a performance vs. quality trade-off), you can also try blurring them (first, enable bilinear filtering for the RTTs if you haven't already).

Good luck with it
Dark Sylinc

Quote:
Original post by Matias Goldberg
If you don't want the cost of matching them (most likely a performance vs. quality trade-off), you can also try blurring them (first, enable bilinear filtering for the RTTs if you haven't already).


This got me thinking, and you had it, only in reverse. [smile] The problem was that I was using bilinear filtering when sampling the position and normal buffers. This blurred the border between different normals, creating false normals along geometry edges. Disabling the filtering removed the artifacts (well, almost; there's still a bit left).
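For anyone hitting the same thing: disabling the filtering amounts to declaring the G-buffer samplers with point filtering, roughly like this (texture and sampler names are illustrative):

```hlsl
// Point-filtered sampler for a G-buffer render target, so position and
// normal values are never interpolated across geometry edges.
texture texture_normal;
sampler texture_normal_sampler = sampler_state
{
	Texture   = <texture_normal>;
	MinFilter = POINT;
	MagFilter = POINT;
	MipFilter = NONE;
	AddressU  = CLAMP;
	AddressV  = CLAMP;
};
```

The same applies when setting sampler states from the API side instead of an effect file.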

Thanks for helping!
