Deferred rendering issue

22 comments, last by Noxil 11 years ago

Hello

I have an issue in my deferred renderer (Dx9).
As seen in the attached image, there is a small black line between my meshes.
The line becomes visible depending on the distance between the camera and the meshes.

The problem seems to be in the point light shader, since if I remove the light output and just draw the color on the screen quad, everything looks fine.

GBuffer:


VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPos  = mul( float4(input.Position.xyz, 1.0f), World );
    output.Position  = mul( worldPos, ViewProjection );
    output.UV        = input.UV;
    // Rotate the normal with the upper 3x3 of World (assumes no non-uniform
    // scale; otherwise the inverse-transpose of World is needed)
    output.Normal    = mul( input.Normal, (float3x3)World );
    output.Depth.x   = output.Position.z;
    output.Depth.y   = output.Position.w;
 
    return output;
}
 
PixelShaderOutput PixelShaderFunction(VertexShaderOutput input)
{ 
    PixelShaderOutput output;
 
    output.Color = tex2D( DiffuseMapSampler, input.UV );
 
    // Renormalize after interpolation, then pack [-1,1] into [0,1]
    output.Normal.xyz = ( normalize(input.Normal) + 1.0f ) * 0.5f;    
    output.Normal.a   = 1.0f;
 
    float D = input.Depth.x / input.Depth.y;
    output.Depth = float4( D, D, D, 1.0f );
 
    output.Glow = tex2D( GlowMapSampler, input.UV );
 
    return output;
}
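One thing worth double-checking in the G-buffer pass: the normal arrives in the pixel shader denormalized by interpolation, so it should be renormalized before being packed into [0,1]. A small Python sketch of that pack/unpack round trip (the helper names are mine, not part of the shader):

```python
import math

def encode_normal(n):
    """Normalize, then pack a normal from [-1, 1] into the [0, 1] G-buffer range."""
    length = math.sqrt(sum(c * c for c in n))
    unit = [c / length for c in n]          # renormalize after interpolation
    return [(c + 1.0) * 0.5 for c in unit]

def decode_normal(stored):
    """Unpack: 2 * stored - 1, then renormalize (matches the light shader)."""
    n = [2.0 * c - 1.0 for c in stored]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

# An interpolated (non-unit) normal survives the round trip as a unit vector.
roundtrip = decode_normal(encode_normal([0.0, 0.9, 0.0]))
```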

Point Light:


 
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPos  = mul( float4(input.Position.xyz, 1.0f), World );
    output.Position  = mul( worldPos, ViewProjection );
    output.ScreenPosition  = output.Position;
 
    return output;
}
 
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{ 
    //obtain screen position
    input.ScreenPosition.xy /= input.ScreenPosition.w;
 
    //obtain textureCoordinates corresponding to the current pixel
    //the screen coordinates are in [-1,1]*[1,-1]
    //the texture coordinates need to be in [0,1]*[0,1]
    float2 uv = 0.5f * ( float2(input.ScreenPosition.x,-input.ScreenPosition.y) + 1.0f );
 
    //allign texels to pixels
    uv += HalfPixel;
 
    //get normal data from the normalMap
    float4 normalData = tex2D( NormalMapSampler, uv );
 
    //tranform normal back into [-1,1] range
    float3 normal = normalize( 2.0f * normalData.xyz - 1.0f );
 
    //read depth
    float depth = tex2D( DepthMapSampler, uv ).r;
 
    //compute screen-space position
    float4 position;
    position.xy = input.ScreenPosition.xy;
    position.z = depth;
    position.w = 1.0f;
 
    //transform to world space
    position = mul( position, InversedViewProjection );
    position /= position.w;
 
    //surface-to-light vector
    float3 lightVector = LightLocation - position.xyz;
 
    //compute attenuation based on distance - linear attenuation
    float attenuation = saturate( 1.0f - (length(lightVector) / LightRadius) );
 
    //normalize light vector
    lightVector = normalize( lightVector ); 
 
    //compute diffuse light
    float NdL = saturate( dot(normal, lightVector) );
    float3 diffuseLight = NdL * Color.rgb * Intensity;
 
    return float4( diffuseLight, 1.0f ) * attenuation;
}
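As a sanity check on the screen-to-UV remap used above, the corner cases can be worked through by hand. A small Python sketch of the same arithmetic (the helper name is mine):

```python
def ndc_to_uv(x, y):
    """Map post-divide screen coords in [-1,1] x [1,-1] to texture coords in [0,1] x [0,1]."""
    return (0.5 * (x + 1.0), 0.5 * (-y + 1.0))

top_left     = ndc_to_uv(-1.0,  1.0)  # (0.0, 0.0)
center       = ndc_to_uv( 0.0,  0.0)  # (0.5, 0.5)
bottom_right = ndc_to_uv( 1.0, -1.0)  # (1.0, 1.0)
```

The separate HalfPixel offset then accounts for Direct3D 9's half-texel alignment between render targets and textures.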

Any help is much appreciated! :)


I think first you should figure out how to get rid of the giant red lines...

screenRenderTarget -= redLines;

something like that

Sorry, I have to be an idiot every once in a while; it keeps me feeling healthy. I had a similar problem in OpenGL once, and it was because my meshes were slightly overlapping, creating strange artifacts. Maybe try spacing the meshes slightly further apart.

Hey, thanks for the reply.

I have double- and triple-checked: the meshes don't overlap.

Any other suggestions?

What's the near and far clip set to?

-=[Megahertz]=-

They are set to 1.0f and 1000.0f.
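For reference, with those values the z/w depth stored in the G-buffer is strongly non-linear: most of the [0,1] range is spent close to the near plane. A small Python sketch, assuming a standard D3D-style perspective projection:

```python
def stored_depth(z_view, near=1.0, far=1000.0):
    """z/w after a standard D3D perspective projection: f/(f-n) * (1 - n/z)."""
    return (far / (far - near)) * (1.0 - near / z_view)

# Most of the [0,1] depth range is used up near the camera:
d10  = stored_depth(10.0)   # ~0.901 at 10 units
d500 = stored_depth(500.0)  # ~0.999 at 500 units
```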

There are a couple of things to try. First, change your background color from black to something else and see whether the lines change color with it, or whether they stay black and are also visible outside the meshes.

If the lines appear black even with a different background color, the problem is likely with the texture you are mapping onto the mesh.

Try changing your texture filtering settings, and if that doesn't help, try switching the texture for something else. If the texture coordinates you send to the shader don't match the texture, you can get strange artifacts like these as well.
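One deferred-specific filtering pitfall worth spelling out: if the depth (or normal) G-buffer sampler uses linear filtering, a sample taken on the seam between two meshes blends depths from both surfaces, and the reconstructed world position lies on neither of them, which can show up as a dark edge in the lit image. A small Python sketch of the effect (the texel values are made up):

```python
def lerp(a, b, t):
    """Linear interpolation, as bilinear texture filtering does per axis."""
    return a + (b - a) * t

# Depth texels on either side of a mesh seam.
near_depth, far_depth = 0.30, 0.90

# Point filtering returns one of the true surface depths...
point_sample = near_depth

# ...but linear filtering half-way across the seam invents a depth that lies
# on neither surface, so the reconstructed position is wrong there.
linear_sample = lerp(near_depth, far_depth, 0.5)
```

In D3D9 terms, that means setting MinFilter/MagFilter to POINT on the samplers the light pass uses to read the G-buffer.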

Interesting. If I change the background to white, change the texture to, for example, a striped texture, and only draw the color buffer on the screen quad, I can see a white line instead...

I don't understand. The mesh half-extent is 12.0f, 0.5f, 12.0f, so it's super easy to hardcode the positions exactly.

[Edit]:

Forget what I just said... I exported a new mesh; the previous one had problems with its UVs. The problem still remains with the point light.

I can see a black line, and it stays black no matter what the background color is.

A friend of mine said it might be a floating-point precision problem: in the vertex shader, a vertex that is supposed to be at 1.0f in Y might end up at 0.99998f, and that might be causing the artifact.

Can this be the issue, and if so, how can I go about fixing it?
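On the precision theory itself: 32-bit floats represent small integer-valued coordinates exactly, and two meshes that share vertex positions and are transformed by the same matrices should produce matching results, so an error as large as 0.00002 would be surprising. A small Python sketch round-tripping values through 32-bit storage (pure stdlib):

```python
import struct

def to_float32(x):
    """Round-trip a Python float through 32-bit storage, as the GPU would hold it."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# A coordinate like 12.0 is stored exactly in float32: no loss at all.
exact = to_float32(12.0)

# The spacing between adjacent float32 values near 1.0 is about 1.19e-7,
# two orders of magnitude smaller than the 0.00002 error suggested above.
eps = to_float32(1.0 + 1e-7) - 1.0
```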

I've never had precision issues in vertex shaders, so I don't think that would be the problem. Try rendering your scene in wireframe; it might bring some geometry issues to light.

Could you dump the surfaces of the G-buffer? At such a small scale, I doubt there are floating-point issues, unless you are doing something really bad.

This topic is closed to new replies.
