Banding Problem

4 comments, last by Erik Rufelt 8 years, 6 months ago

I've been trying to fix a banding problem I've had for quite a while, but I'm not making much progress. I found this post a while back: http://www.gamedev.net/topic/640733-direct3d-11-deferred-shading-banding/

One suggestion there was that much of the banding would be hidden by textures. I've placed textures on the floors and it does help, but the banding is still noticeable. I've also done gamma correction. Are there any other techniques that may help remove the banding?

[Screenshot: cmPyJoH.jpg]

Edit: I'm currently experimenting with adding random values to my image, as someone suggested in the linked post. This does help a bit, but it makes the resulting picture very grainy.


When people say "texture", they mean adding high-frequency detail to fool the eye.

You need, 1- to take a hard look at your whole pipeline for any place that could reduce precision. If you have multiple blending passes, be careful when you take 8-bit-per-channel data and increase its brightness in a later pass (by multiplying by a value greater than 1). If you have textures, make sure they don't already have banding (that they are stored in sRGB format and have enough high-frequency detail, especially for compressed textures, whose compression may have reduced their precision).

2- If you have perf to burn, do all your calculations at higher precision (use a 16-bit float, 10-bit float, or 10-bit integer render target), then convert to the 8-bit-per-channel format as a last pass: either with a straight conversion (if the banding was introduced by the aforementioned blending or rescaling), or with dithering.
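As a sketch of that last conversion step, assuming a full-screen pass reading a 16-bit float render target into an 8-bit back buffer (the texture/sampler names and the hash-based noise function are illustrative, not from the thread):

```hlsl
// Final full-screen pass: sample the high-precision scene, add +/- half an
// 8-bit step of per-pixel noise, and let the 8-bit target quantize the result.
Texture2D<float4> g_hdrScene    : register(t0);
SamplerState      g_pointSampler : register(s0);

// Cheap screen-space hash; any decent noise source works here.
float rand(float2 p) {
    return frac(sin(dot(p, float2(12.9898f, 78.233f))) * 43758.5453f);
}

float4 main(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target {
    float4 color = g_hdrScene.Sample(g_pointSampler, uv);
    // One 8-bit step is 1/255; center the noise around zero so the
    // average brightness is unchanged.
    float dither = (rand(pos.xy) - 0.5f) / 255.0f;
    return float4(color.rgb + dither, color.a);
}
```

The noise randomizes which side of the quantization boundary each pixel lands on, trading the hard band edges for fine grain.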

gamma correction as well.

In what sense? By default your desktop is displayed (implicitly) in sRGB (based on the monitor's default assumption). That format increases precision in the dark areas, which is where most banding is visible, so the recommendation is to NOT touch the default color correction/color space. Changing the monitor/scanout color space to a "linear" format (with SetGammaRamp(), for example) will definitely increase banding. Linear is still recommended for lighting computations, which is where the hardware sRGB support (or a shader-based conversion) comes in. If you have to make brightness adjustments, do them in the shader as much as possible rather than through the output ramp.

Perhaps you've already seen it, but I think this presentation gives some solutions for fixing banding.

Just as an FYI (I'm sure the OP knows this, since they referenced the other thread): the difference between the bands in the image you posted is 1 RGB value, so this isn't a precision issue.

It's something that needs to be fixed with noise, textures, dithering, or maybe a gamma adjustment, which would have the effect of lightening the darker tones.


gamma correction as well.

In what sense?

I'm only doing a very simple gamma correction in my shaders, using 2.2 as my constant gamma value:


// c_gammaValue is a shader constant set to 2.2
float4 gammaToLinear(float4 color) {
  return float4(pow(color.rgb, (float3)c_gammaValue), color.a);
}

float4 linearToGamma(float4 color) {
  return float4(pow(color.rgb, (float3)(1.0f / c_gammaValue)), color.a);
}
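For reference, if the goal is to match the display's actual sRGB curve rather than the pow-2.2 approximation, the standard piecewise sRGB transfer function looks like this (a sketch, not from the thread; the function names are illustrative):

```hlsl
// Exact sRGB encode/decode per IEC 61966-2-1: a short linear segment near
// black, a 2.4 power curve elsewhere. The select is componentwise in HLSL.
float3 linearToSrgb(float3 c) {
    float3 lo = c * 12.92f;
    float3 hi = 1.055f * pow(c, 1.0f / 2.4f) - 0.055f;
    return (c <= 0.0031308f) ? lo : hi;
}

float3 srgbToLinear(float3 c) {
    float3 lo = c / 12.92f;
    float3 hi = pow((c + 0.055f) / 1.055f, 2.4f);
    return (c <= 0.04045f) ? lo : hi;
}
```

In D3D11 the `_SRGB` texture and render-target formats perform these conversions in hardware on read and write, which avoids doing them manually in the shader.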

Perhaps you've already seen it, but I think this presentation gives some solutions for fixing banding.

I actually hadn't seen this before and it was very helpful. Thank you for posting this!

After implementing random dithering from the post I referenced in my original post, I was able to almost completely remove the banding.

[Screenshot: QntdcRQ.jpg]

As you can see, the dithering introduces a graininess to the image. Any idea of how to remove this graininess?
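One thing that tends to reduce the *perceived* grain (without changing bit depth) is swapping uniform white noise for a more structured pattern, such as interleaved gradient noise; a sketch, not from the thread:

```hlsl
// Interleaved gradient noise (Jorge Jimenez): a screen-space pattern whose
// energy is concentrated at high frequencies, so it reads as a fine, even
// texture rather than random grain.
float interleavedGradientNoise(float2 pixel) {
    return frac(52.9829189f * frac(dot(pixel, float2(0.06711056f, 0.00583715f))));
}

// Drop-in replacement for a uniform random value in a dither pass,
// still scaled to one 8-bit step:
//   float dither = (interleavedGradientNoise(pos.xy) - 0.5f) / 255.0f;
```

Blue-noise textures or an ordered (Bayer) pattern serve the same purpose: the dither amplitude stays the same, but the eye is much less sensitive to these patterns than to white noise.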

Like phil_t said, it can't be done (at your current bit depth); the adjacent bands already have the closest possible colors that can be displayed.

If your monitor supports it, you can get much better results by using R10G10B10A2 as the back-buffer format, or even R16G16...etc. if the driver has that and converts it nicely for the monitor. Some monitors without native 30-bit color also do high-quality dithering for you, so using that depth can help even on technically 24-bit monitors.

EDIT: Note that these usually only make a difference in true fullscreen mode, not in windowed mode.

This topic is closed to new replies.
