
## Recommended Posts

I've had these artefacts for a while and have only recently come back to look at them. My deferred pipeline isn't exactly streamlined at the moment; there is no elegant use of the buffers etc. I basically have 4 buffers, and this is the output for a frame:

[Images: position buffer, normal buffer, diffuse buffer, light buffer, and the final result]

I'm aware that the camera angle in the result shot is slightly more vertical; this is because the artefact is more apparent there. The light shader is as follows:
//	Light properties.
int		LightType;
float4	LightDiffuseColour;
float4	LightSpecularColour;
float4	LightAmbientColour;

float3	LightPosition;
float3	LightDirection;
float	LightRange;
float	LightFalloff;
float	LightTheta;
float	LightPhi;

float4x4 LightTransform;

float3	CameraPosition;

float4 ClearColor = float4( 0,0,0,0 );

texture PosMapTexture;
texture NormalMapTexture;

sampler PosMapSampler =
sampler_state
{
Texture = PosMapTexture;
};

sampler NormalMapSampler =
sampler_state
{
Texture = NormalMapTexture;
};

struct VS_INPUT
{
float4 Pos:     POSITION;
};

struct VS_OUTPUT
{
float4 Pos:            POSITION;
float2 TexCoord:       TEXCOORD0;
};

struct PS_INPUT
{
float2 TexCoord:       TEXCOORD0;	// float2, matching the VS output
};

struct PS_OUTPUT
{
float4 Color    : COLOR0;
};

VS_OUTPUT vs_main( VS_INPUT In )
{
VS_OUTPUT Out;

Out.Pos           = float4(In.Pos.xy,0.0,1.0);
Out.TexCoord.x    = (Out.Pos.x+1.0) * 0.5;
Out.TexCoord.y    = 1.0 - ((Out.Pos.y+1.0) * 0.5);

return Out;
}

PS_OUTPUT ps_main( PS_INPUT In )
{
PS_OUTPUT Out;

float4 Pos    = tex2D(PosMapSampler,In.TexCoord);

if (Pos.w==0.0) // If nothing is drawn at this pixel, just output clear color
{
Out.Color = ClearColor;
}
else
{
float3	Normal	= normalize( tex2D( NormalMapSampler, In.TexCoord ).xyz );

// Pos is float4 (w flags coverage); use .xyz for the world-space position.
float3	ToLight	= normalize( LightPosition - Pos.xyz );
float3	Reflect	= normalize( 2.0f * Normal * saturate(dot( Normal, ToLight )) - ToLight );
float3	View	= normalize( CameraPosition - Pos.xyz );

float	Dot		= saturate(dot( Normal, ToLight ));
float4	Diffuse	= LightDiffuseColour * Dot;
// Saturate before pow: pow with a negative base is undefined.
float	SpecMod	= pow( saturate(dot( Reflect, View )), 8 );
float4	Specular = SpecMod * LightSpecularColour;

float4	FinalColour = LightAmbientColour + Diffuse + Specular * Dot;

float	Dist	= length( LightPosition - Pos.xyz );
float	Atten	= saturate( Dist / LightRange );

Out.Color = FinalColour * (1 - pow( Atten, 4 ));
}
return Out;
}

technique DefaultTechnique
{
pass P0
{
VertexShader = compile vs_2_0 vs_main();
PixelShader  = compile ps_2_0 ps_main();
}
}
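For reference, the NDC-to-UV mapping in vs_main above can be sanity-checked outside the shader. A minimal sketch in Python (the function name is mine, not from the shader):

```python
def ndc_to_uv(x, y):
    # Mirrors vs_main: map clip-space XY in [-1, 1] to texture
    # coordinates in [0, 1], flipping Y (texture V grows downward).
    u = (x + 1.0) * 0.5
    v = 1.0 - (y + 1.0) * 0.5
    return u, v
```

The top-left corner (-1, 1) should land on UV (0, 0), and the bottom-right corner (1, -1) on UV (1, 1).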


The banding of the colours on the floor underneath the crates is, I think, a knackered floor model; I'm more interested in the edges of the crates looking a bit blocky. Can anyone shed any light on what is going on? Thanks!

##### Share on other sites
Need to know a few things:
Are you performing gamma correction at all?
What precision are your render targets? fp16, fp32?
Are you using world-space for your positions, normals?

##### Share on other sites
>>Are you performing gamma correction at all?

Nope.

>>What precision are your render targets? fp16, fp32?

A32B32G32R32F for all targets.

>>Are you using world-space for your positions, normals?

Yes for both.

What confuses me the most is that the RenderMonkey deferred example is seemingly the same; there appears to be no difference. The screen-aligned quad data is the same, and the vertex shader for the light pass is the same. Also, the light buffer is the only one presenting the artefacts.

Thanks,

[Edited by - Dave on September 9, 2009 2:04:59 PM]

##### Share on other sites
Ok, so some more info. I have made sure that all texture sampling is POINT. I have made sure that the geometry getting to the shader is correct using PIX. It is only when I do a fullscreen quad pass that I get the issue.

##### Share on other sites
Quote:
 Original post by Dave: Ok, so some more info. I have made sure that all texture sampling is POINT. I have made sure that the geometry getting to the shader is correct using PIX. It is only when I do a fullscreen quad pass that I get the issue.

Are you subtracting 0.5 from your vertex positions? I remember having to do this when mapping texels->pixels otherwise you get distortion similar to yours.
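For context: in D3D9, a full-screen quad's clip-space corners map to pixel corners, while texel centers sit half a texel in, so sampling a same-sized render target lands between texels unless the quad is shifted by half a pixel. Half a pixel in NDC is 0.5 * (2.0 / width) = 1.0 / width. A minimal sketch of the shift, assuming the D3D9 convention (the function name is mine):

```python
def half_pixel_offset(ndc_x, ndc_y, width, height):
    # Shift a full-screen-quad vertex by half a pixel in NDC.
    # D3D9 convention: left in X, up in Y (NDC Y points up while
    # screen-space Y points down).
    return ndc_x - 1.0 / width, ndc_y + 1.0 / height
```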

##### Share on other sites
I am not presently; this was suggested before and I did try it. It seemed to make no difference. I will try again, but note that the RenderMonkey example does not do this and has no artefacts of this nature.

##### Share on other sites
Ah jeez, well I'll be damned.

Thanks.

##### Share on other sites
Was that the problem? If so, glad it's fixed.

Now, if you want the best results from your lighting, I highly recommend you look into gamma correction. It's trivial to do in a shader, but drastically improves the quality of the final results.

##### Share on other sites
It was the half-pixel offset issue. :)

I will definitely look into gamma correction. Thanks a lot, mate.

##### Share on other sites
Quote:
 Original post by n00body: Now, if you want the best results from your lighting, I highly recommend you look into gamma correction. It's trivial to do in a shader, but drastically improves the quality of the final results.

Would you care to elaborate a little, maybe show an image?
I am also interested in this, but always had the impression that you need textures that take this into account as well.

##### Share on other sites
It would be interesting to know the general theory behind it. I wasn't going to look into it for a bit, but while we're here :D.

##### Share on other sites
Quote:
Original post by B_old
Quote:
 Original post by n00body: Now, if you want the best results from your lighting, I highly recommend you look into gamma correction. It's trivial to do in a shader, but drastically improves the quality of the final results.
Would you care to elaborate a little, maybe show an image?
I am also interested in this, but always had the impression that you need textures that take this into account as well.
Quote:
 Original post by Dave: It would be interesting to know the general theory behind it. I wasn't going to look into it for a bit, but while we're here :D.
Here you go:
http://http.developer.nvidia.com/GPUGems3/gpugems3_ch24.html

Your textures should all be in a linear colour space (unless your hardware supports sRGB textures), and then after performing all of your lighting (using linear math) the final rendering is gamma corrected for the display.

##### Share on other sites

To give you some idea of the difference that gamma correction makes, here are a few convenient visual aids:

[Figure 1: 1, without correction; 2, with correction; 3, difference]

If you want the short, short version, it looks like this:

For all color textures (albedo, specular, emissive):
albedo.rgba = tex2D(colorTexture, texcoord.st).rgba;
albedo.rgb = pow(albedo.rgb, 2.2);

For final output to back-buffer:
outColor.rgb = pow(outColor.rgb, 1.0 / 2.2);

Now, 2.2 is an average gamma value across multiple systems, and is the value JPEGs assume so they look reasonably consistent across said platforms. However, you can fudge it a little to a gamma of 2.0 and shave off a few instructions, at a very minor loss of quality.
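To put a number on that quality loss, here is a quick check (mine, not from the post) of the worst-case difference between decoding with a 2.2 gamma and the cheaper squaring approximation:

```python
def max_gamma_approx_error(samples=1024):
    # Largest absolute difference between x**2.2 and x*x over [0, 1].
    worst = 0.0
    for i in range(samples + 1):
        x = i / samples
        worst = max(worst, abs(x ** 2.2 - x * x))
    return worst
```

The maximum error works out to about 0.035 (near x ≈ 0.62), i.e. roughly 9 out of 255 levels at the worst point.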

Color textures:
albedo.rgba = tex2D(colorTexture, texcoord.st).rgba;
albedo.rgb = albedo.rgb * albedo.rgb; // square() is not native; just multiply

Final Output:
outColor.rgb = sqrt(outColor.rgb);

EDIT:
I found out that the sqrt() version is actually more expensive, because the compiler will use the RSQ instruction, forcing it to follow up with RCP on each of the components.

In general, each system (Windows, Mac, PS3, 360, etc.) has its own gamma value, and you'll need to use that if you want the best quality. You must also be aware of your art assets that use colors (textures, vertex colors, color constants, etc.), as the colors will only be correct for the system on which they were authored. So you need to process them and store the colors in the target system's gamma space, or you will have many unexplained issues.

You must also be aware that linear colors need at least fp16 precision for storage. So you couldn't get away with storing them in an rgb8 texture. That's why color textures are stored in gamma-space, because it will work just fine with rgb8.
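A quick way to see why, counting how many 8-bit codes land in the dark end of the range under each encoding (an illustrative sketch of mine, not from the post):

```python
def codes_below(threshold, gamma):
    # Count 8-bit codes whose decoded *linear* value falls below
    # `threshold`. gamma=1.0 models storing linear values directly
    # in an rgb8 texture; gamma=2.2 models gamma-space storage.
    return sum(1 for n in range(256) if (n / 255.0) ** gamma < threshold)
```

With a threshold of 0.01 in linear light, direct linear storage in rgb8 gets only 3 codes, while gamma-2.2 storage gets 32: the encoding spends its precision where the eye needs it, which is why gamma-space rgb8 textures hold up while linear ones band.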

Bottom line, you want to start taking care of this early in your project, so you don't have to fix it down the road and have hundreds/thousands of art assets to convert/re-author. If you don't perform gamma correction, then the burden falls on your artists to compensate for bad tech, leading to frustration from piles and piles of special-case hacks on their part.

Hope that helps.

[Edited by - n00body on October 20, 2009 1:50:53 PM]