Noxil

Deferred rendering issue


Hello

 

I have an issue in my deferred renderer (DX9).
As seen in the attached image, there is a small black line between my meshes.
The line's visibility depends on the distance between the camera and the meshes.

 

The problem seems to be in the point light shader, since if I remove the light output and just draw the color on the screen quad, everything looks fine.

 

GBuffer:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPos  = mul( float4(input.Position.xyz, 1.0f), World );
    output.Position  = mul( worldPos, ViewProjection );
    output.UV  = input.UV;
    output.Normal = mul( input.Normal, (float3x3)World );
    output.Depth.x  = output.Position.z;
    output.Depth.y  = output.Position.w;
 
    return output;
}
 
PixelShaderOutput PixelShaderFunction(VertexShaderOutput input)
{ 
    PixelShaderOutput output;
 
    output.Color = tex2D( DiffuseMapSampler, input.UV );
 
    output.Normal.xyz = ( input.Normal + 1.0f ) * 0.5f;    
    output.Normal.a = 1.0f;
 
    float D = input.Depth.x / input.Depth.y;
    output.Depth = float4( D, D, D, 1.0f );
 
    output.Glow = tex2D( GlowMapSampler, input.UV );
 
    return output;
}

 

Point Light:

 
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPos  = mul( float4(input.Position.xyz, 1.0f), World );
    output.Position  = mul( worldPos, ViewProjection );
    output.ScreenPosition  = output.Position;
 
    return output;
}
 
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{ 
    //obtain screen position
    input.ScreenPosition.xy /= input.ScreenPosition.w;
 
    //obtain textureCoordinates corresponding to the current pixel
    //the screen coordinates are in [-1,1]*[1,-1]
    //the texture coordinates need to be in [0,1]*[0,1]
    float2 uv = 0.5f * ( float2(input.ScreenPosition.x,-input.ScreenPosition.y) + 1.0f );
 
    //align texels to pixels
    uv += HalfPixel;
 
    //get normal data from the normalMap
    float4 normalData = tex2D( NormalMapSampler, uv );
 
    //transform normal back into [-1,1] range
    float3 normal = normalize( 2.0f * normalData.xyz - 1.0f );
 
    //read depth
    float depth = tex2D( DepthMapSampler, uv ).r;
 
    //compute screen-space position
    float4 position;
    position.xy = input.ScreenPosition.xy;
    position.z = depth;
    position.w = 1.0f;
 
    //transform to world space
    position = mul( position, InversedViewProjection );
    position /= position.w;
 
    //surface-to-light vector
    float3 lightVector = LightLocation - position.xyz;
 
    //compute attenuation based on distance - linear attenuation
    float attenuation = saturate( 1.0f - (length(lightVector) / LightRadius) );
 
    //normalize light vector
    lightVector = normalize( lightVector ); 
 
    //compute diffuse light
    float NdL = saturate( dot(normal, lightVector) );
    float3 diffuseLight = NdL * Color.rgb * Intensity;
 
    return float4( diffuseLight, 1.0f ) * attenuation;
}

 

Any help is much appreciated! :)


I think first you should figure out how to get rid of the giant red lines...

 

screenRenderTarget -= redLines;

 

something like that

 

Sorry, I have to be an idiot every once in a while... keeps me feeling healthy. I had a similar problem in OpenGL once, and it was because my meshes were slightly overlapping, creating strange artifacts. Maybe try spacing the meshes slightly farther apart.


There are a couple of things to try. First, change your background color from black to something else and see if the lines change color with it, or if they stay black and you can also see them outside the meshes.

 

If that is the case (the lines appear black even with a different background color), the problem is likely with the texture you are mapping onto the mesh.

 

Try changing your texture filtering settings, and if that doesn't help, try switching the texture for something else. If you send the shader texture coordinates that don't match the texture, you can also get strange artifacts like these.
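In a DX9 deferred renderer, filtering on the g-buffer lookups is a common culprit: bilinear filtering can blend neighbouring texels (including the clear color) into the sampled normal or depth right at mesh edges. A hedged sketch of point-filtered sampler states, assuming effect-file declarations like the `NormalMapSampler` used above (the `NormalMap` parameter name is an assumption, not taken from the post):

```hlsl
// Sketch: sample the g-buffer with point filtering and clamped addressing,
// so neighbouring texels (or the clear colour) are never blended in.
sampler NormalMapSampler = sampler_state
{
    Texture   = <NormalMap>;   // assumed texture parameter name
    MinFilter = POINT;
    MagFilter = POINT;
    MipFilter = NONE;
    AddressU  = CLAMP;
    AddressV  = CLAMP;
};
```

The same states would apply to `DepthMapSampler`; depth in particular should never be bilinearly filtered, since blending two depths produces a position that lies on neither surface.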


Interesting: if I change the background to white, change the texture to, for example, a striped texture, and only draw the color buffer on the screen quad, I can see a white line instead...

I don't understand...
The mesh half-extent is (12.0f, 0.5f, 12.0f), so it's super easy to hardcode the positions to be exact.

 

[Edit]:

Forget what I just said...
I exported a new mesh; the previous mesh had problems with its UVs.
The problem still remains with the point light.

I can see a black line, and it stays black no matter what the background color is.

Edited by Noxil


A friend of mine said it might be a problem with floating-point loss.
In the vertex shader, I guess, a vertex that is supposed to be at 1.0f in Y might end up at 0.99998f, and that might be causing the problem.

Can this be the issue, and if so, how can I go about fixing it?
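One cheap way to test this theory is to snap world-space positions to a coarse grid in the g-buffer vertex shader; if the line disappears, the seam really is a precision gap between vertices that were meant to be coincident. A hedged debug-only sketch, dropped into the `VertexShaderFunction` shown earlier (the grid size is an arbitrary assumption):

```hlsl
float4 worldPos = mul( float4(input.Position.xyz, 1.0f), World );

// Debug-only: snap to a 1/256-unit grid so vertices that are *meant* to be
// coincident (e.g. 1.0f vs 0.99998f) land on exactly the same position.
worldPos.xyz = round( worldPos.xyz * 256.0f ) / 256.0f;

output.Position = mul( worldPos, ViewProjection );
```

If the line survives the snapping, the vertices are not the problem and the g-buffer sampling or packing is a better suspect.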


I've never had issues with precision in vertex shaders, so I don't think that would be the issue. Try rendering your scene in wireframe; it might bring some geometry issues to light.

Does the black line stay in the same position in the world when you move your camera? Which attributes in the gbuffer are changing at that location? Is there a triangle edge at that location? Are you using MSAA at all?
 
There are some shader operations that can result in NaN, like x/0, 0^0, x^y where y < 0, etc.
You can add some debug checks before these 'dangerous' operations when testing, e.g.
 ...
    if( position.w == 0 )
        return float4(1,0,0,1);//red for debugging
    position /= position.w;
...
Edited by Hodgman


The black line is right at the edge of the mesh, where the two meshes would intersect if they were to overlap.
The line flickers and becomes more or less visible as I move the camera around.

I'm not using MSAA.


Add checks for unwanted values and replace them with something else; once you find a spot where this fixes the error, you can start moving the check around to find out where the bad value originates.
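In the point-light shader above, that could look like the following sketch: replace a suspicious value with a loud debug colour, then move the check upstream until the bad pixels light up. (The threshold is an arbitrary assumption.)

```hlsl
// Sketch: flag degenerate g-buffer normals instead of shading with them.
float3 n = 2.0f * normalData.xyz - 1.0f;
if ( dot(n, n) < 0.001f )
    return float4( 1.0f, 0.0f, 1.0f, 1.0f ); // magenta marks the bad pixels
float3 normal = normalize( n );
```

If the line turns magenta, the g-buffer write (or the clear colour bleeding through filtering) is producing near-zero normals at those pixels.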


Thanks for all your tips and feedback.
I think I have narrowed down the problem.

In my screen-quad shader, if I only draw the normals from the gbuffer, I get the black line.

So somehow, when I draw the normals to the gbuffer, something goes wrong on a few pixels.
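One hedged guess at that symptom: the interpolated per-vertex normal is packed without re-normalizing, so a shortened vector decodes badly after the `(n + 1) * 0.5` encoding, and the error is worst exactly along triangle edges. A sketch of a defensive g-buffer write:

```hlsl
// Sketch: re-normalize the interpolated normal before packing it,
// since interpolation between vertex normals shortens the vector.
float3 n = normalize( input.Normal );
output.Normal = float4( 0.5f * ( n + 1.0f ), 1.0f );
```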


Strange, you said that the gbuffer was ok and now the normal is wrong :)

The normal is in world space, as I can see in the point light shader (you should switch to view space, but that's another topic).

Do you just dump the mesh normal, or do you use some kind of tangent frame for a tangent-space normal map? If you use a tangent frame, do you have mirroring in the UVs, and did you forget to split the vertices where the UV direction changes?


If you have PIX, you can output your corrupted g-buffer to the back buffer and run a single-frame capture on it. Then run the "Debug this pixel" feature on the black pixels by right-clicking them. That will let you step through the shader to see what is wrong.

