Deferred Ambient Light via CubeMap, Edge Artefacts


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

7 replies to this topic

#1 bombshell93   Members   -  Reputation: 206


Posted 03 April 2013 - 06:12 AM

Here is a screenshot of the problem, followed by the cubemap I'm using (I made a content processor to convert it to a TextureCube, since I'm using XNA, though it does it strangely and the faces come out flipped, so the cubemap may look odd).
[Image: rendering_artefacts_by_pushbombshell-d60...]
[Image: cubemap_by_pushbombshell-d60a34i.jpg]
Ignore the dark pixel; that's the model's UVs. What's really ticking me off is the edge. It only happens with my ambient light shader; the directional light and the point lights have no issue, which confused me because it looks so much like the half-pixel offset issue.
Here is my shader. All the buffers are set to PointClamp sampling.
 

sampler DiffuseSampler : register(s0);
sampler SpecularSampler : register(s1);
sampler NormalSampler : register(s2);

texture Environment;
float ambientStrength;
float2 GBufferSize;

sampler EnvironmentSampler = sampler_state
{
    texture = <Environment>;
    mipfilter = LINEAR;
    minfilter = LINEAR;
    magfilter = LINEAR;
};

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
    float4 ProjectPos : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    // Full-screen quad: map clip-space XY [-1,1] to UV [0,1] (Y flipped),
    // then apply the D3D9 half-pixel offset so texels line up with pixels.
    output.Position = input.Position;
    output.UV = output.Position.xy * float2(0.5f, -0.5f) + 0.5f;
    output.UV += float2(1.0f / GBufferSize.x, 1.0f / GBufferSize.y) * 0.5f;
    output.ProjectPos = output.Position;

    return output;
}

// Reconstruct a normal from its two-component spheremap encoding
// stored in the G-buffer.
float3 decode(float2 enc)
{
    float4 nn = float4(enc, 0, 0) * float4(2, 2, 0, 0) + float4(-1, -1, 1, -1);
    float l = dot(nn.xyz, -nn.xyw);
    nn.z = l;
    nn.xy *= sqrt(l);
    return nn.xyz * 2 + float3(0, 0, -1);
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float4 diffuseSample = tex2D(DiffuseSampler, input.UV);
    float4 specularSample = tex2D(SpecularSampler, input.UV);
    float4 normalSample = tex2D(NormalSampler, input.UV);
    float3 normal = decode(normalSample.xy);

    // Look up ambient lighting from the environment cubemap along the normal.
    float AmbientIntensity = ambientStrength;
    return float4(texCUBE(EnvironmentSampler, normal).xyz * diffuseSample.xyz * AmbientIntensity, 1);
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

The problem gets worse as the ambient light increases. I've done some messing around and noticed it seems worse going from a red surface to another surface, so I'm convinced it's getting wrong normals: what should be the flat backdrop is getting the normals of the spheres and the ship above it, again somewhat like the half-pixel offset issue. But I've used the exact same UV calculation in my directional light shader and point light shader, with none of this issue.

If anyone notices anything, knows the problem, or has any debugging advice, anything to help me around this would be great; about a day has been lost fiddling to find the issue.
Thanks in advance,
Bombshell

 




#2 jcabeleira   Members   -  Reputation: 686


Posted 04 April 2013 - 03:40 AM

That happens because of mipmapping and the lack of correct derivatives. In forward rendering, pixels are processed in 2x2 groups; even if some of the pixels don't fall inside the primitive, they are still processed as if they did, just to provide you with the screen-space derivatives that you need to perform mipmapping.

In deferred rendering you don't have such derivatives: you're still processing pixels in 2x2 groups, but you're reading your data from a buffer, so at the edges of objects some pixels may fall inside a different primitive than the one you'd expect.

The simplest solution is to disable mipmapping for that texture, or to supply your G-buffer with the screen-space derivatives and use them when sampling from the texture.
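Neither fix is shown in the thread itself, but the explicit-derivative variant could look roughly like this sketch. It is hypothetical: it assumes the geometry pass wrote smooth per-object derivatives of the normal (`dNdx`/`dNdy`) into the G-buffer, and `texCUBEgrad` is not available under ps_2_0.

```hlsl
// Hypothetical sketch: sample the cubemap with derivatives fetched from
// the G-buffer rather than the hardware's per-quad finite differences,
// which are meaningless across object edges in a deferred renderer.
float3 AmbientLookup(float3 normal, float3 dNdx, float3 dNdy)
{
    // dNdx/dNdy: screen-space derivatives of the normal, stored by the
    // geometry pass (an assumption; not part of the original G-buffer layout).
    return texCUBEgrad(EnvironmentSampler, normal, dNdx, dNdy).xyz;
}
```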



#3 bombshell93   Members   -  Reputation: 206


Posted 04 April 2013 - 06:23 AM

Mipmapping is disabled on all the render targets, and to be sure I made a sampler state with point sampling and a maximum mip level of 0, but the problem persists.



#4 Hodgman   Moderators   -  Reputation: 30415


Posted 04 April 2013 - 06:45 AM

To be doubly sure, you could use texCUBElod, with a w coordinate of 0.
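Applied to the ambient shader earlier in the thread, that suggestion amounts to forcing mip level 0 explicitly. As a sketch (the w component of the float4 coordinate selects the LOD, and `texCUBElod` requires compiling under ps_3_0 rather than ps_2_0):

```hlsl
// Force mip level 0 explicitly: the w component of the float4
// coordinate is the LOD. Requires ps_3_0.
float3 ambient = texCUBElod(EnvironmentSampler, float4(normal, 0)).xyz;
```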

#5 bombshell93   Members   -  Reputation: 206


Posted 04 April 2013 - 07:17 AM

That did the trick nicely, thanks so much. I thought it was a problem sampling from the G-buffers and that the shader using texCUBE had nothing to do with it.



#6 Hodgman   Moderators   -  Reputation: 30415


Posted 04 April 2013 - 08:59 PM

If texCUBElod fixed it, then this means that your earlier code that disabled mipmap generation and that set point filtering wasn't actually working, so I'd investigate those failures while you're at it! ;-)

#7 bombshell93   Members   -  Reputation: 206


Posted 04 April 2013 - 09:37 PM

Well, the sampler state is set; I just don't think it works properly on cubemaps. I've fiddled around and nothing I do changes this (possibly a limitation of XNA), which is a bit of a pain, as I'd intended to stick to Shader Model 2 but had to move to 3 in order to use texCUBElod.



#8 bombshell93   Members   -  Reputation: 206


Posted 05 April 2013 - 04:57 AM

So I figured making another thread would be pointless, as it'd likely die after one or two replies.
I'm adding SSAO, and my random sample vectors transform their way to screen-space UVs fine, but I need to rotate them randomly per pixel so the ambient occlusion doesn't look like messed-up shadows.
From what I understand, I would transform the sample vector into the normal's space, rotate its X and Y by sampling a noise texture, and transform it back into world space, but how would I do this? (If anyone has a link to an article or something, that would be great, as this seems like something I'd want to learn for later use too.)

EDIT: I eventually found something for this. Though the math is not as firmly in my head as I'd like, I have managed to make the rotation matrix, and the AO is looking good now.
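The thread never shows the rotation matrix that was built, but the usual construction for per-pixel SSAO kernel rotation is a sketch along these lines. It is hypothetical: `NoiseSampler` and `NoiseScale` (a small RGB noise texture tiled across the screen) are assumptions, not part of the original code.

```hlsl
// Hypothetical sketch: build an orthonormal basis around the normal,
// tilted by a random vector from a tiled noise texture, via
// Gram-Schmidt orthogonalisation. Rotating each kernel sample through
// this basis breaks up the banding that makes AO look like shadows.
float3x3 SampleRotation(float3 normal, float2 screenUV)
{
    // Random vector in [-1,1]^3 from the noise texture (assumed inputs).
    float3 rand = tex2D(NoiseSampler, screenUV * NoiseScale).xyz * 2 - 1;

    // Remove the component of rand along the normal, normalise,
    // and complete the basis with a cross product.
    float3 tangent = normalize(rand - normal * dot(rand, normal));
    float3 bitangent = cross(normal, tangent);
    return float3x3(tangent, bitangent, normal);
}

// Usage: float3 dir = mul(kernelSample, SampleRotation(normal, uv));
```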


Edited by bombshell93, 05 April 2013 - 10:17 PM.




