
# Banding artefacts (caused by attenuation?)


## Recommended Posts

Hi,
I am experiencing annoying banding artefacts on monochrome surfaces. I have narrowed the cause down to the attenuation over distance to the light source. At least I think it's that, since I made sure that the ambient, diffuse, specular and all other render targets are floating point, so I shouldn't lose precision there. Playing around with my simple Phong shading also had no effect on the banding artefacts...

This is how I calculate the attenuation; L is the not-yet-normalized light vector:

  return 1 / (factor * max(dot(L, L), 0.0f) + 1) - g_LightCutoff;
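To see why this can produce bands, the same formula can be evaluated on the CPU. This is just a quick sketch; the `factor` and `g_LightCutoff` values here are made up, so substitute your own:

```python
# CPU-side sketch of the attenuation formula above.
# factor and g_LightCutoff are hypothetical values.
factor = 0.5
g_LightCutoff = 0.01

def attenuation(dist):
    # dot(L, L) in the shader is the squared length of the
    # unnormalized light vector, i.e. the squared distance.
    d2 = max(dist * dist, 0.0)
    return 1.0 / (factor * d2 + 1.0) - g_LightCutoff

# On a large flat surface the intensity changes by far less than one
# 8-bit step (1/255) between neighbouring pixels, so whole runs of
# pixels quantize to the same value and show up as bands.
levels = [round(attenuation(d / 100.0) * 255) for d in range(0, 200)]
```

Printing `levels` shows many consecutive distances collapsing onto the same 8-bit value, which is exactly the banding pattern.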


I am posting because I need some inspiration on where else to look for possible causes, since I do want at least some sort of attenuation for good ol' physics' sake (not that my attenuation calculation has much to do with the physics I was taught at university)...

In a "real" scene with more interesting texturing, the banding is rather hard or even impossible to spot. It is more a conceptual matter of not wanting those artefacts at all.
Here are some images to show you what I mean. If you look closely you can see the banding on the floor and the green wall:

Attenuation and fall off from the spot light enabled

Attenuation enabled, but no fall off from the spot light

---

Load up the screenshots in photoshop and slide a colour picker across the bands:

The R/G/B values only ever change by +/- 1 unit at a time, meaning these really are the smallest changes in colour that it's possible to represent in 8 bit.

To all the people who say that 24-bit colour is good enough for the human eye: you are wrong :lol: :wink: This is why the industry has been, very slowly, migrating to 30-bit colour over the past decade (and probably will for the next decade too).

Usually with textures applied, this isn't a problem. If you're making a flat-coloured game though, and it's still bothering you, you can address it with dithering. In your shader, your colours are represented in floating point, so they're very smooth. After you return them, the GPU quantizes them to the texture's format (8 bits per channel in this case). So before returning, add a random value between -0.5/255 and +0.5/255 to each channel -- that will cause the GPU to randomly round up or down when quantizing, which will smooth out the bands in the gradient and replace them with noise, which should make them harder to perceive.
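As a sanity check, that rounding behaviour can be simulated on the CPU. A minimal sketch (the function names are made up, and a real shader would typically use a screen-space hash or noise texture rather than `random()`):

```python
import random

def quantize(value):
    # GPU-style quantization of a [0, 1] float to an 8-bit channel.
    return round(value * 255)

def quantize_dithered(value, rng=random.random):
    # Add uniform noise of +/- half a quantization step (0.5/255)
    # before rounding, so the value rounds up or down at random.
    noise = (rng() - 0.5) / 255.0
    return min(255, max(0, round((value + noise) * 255)))

# A very shallow gradient: without dithering, every sample lands on
# the same 8-bit value (one solid band); with dithering, the two
# neighbouring values get mixed, which reads as noise, not bands.
gradient = [0.5 + i * 1e-5 for i in range(200)]
plain = [quantize(v) for v in gradient]
dithered = [quantize_dithered(v) for v in gradient]
```

In a real shader you would derive the noise from the pixel coordinate (a hash or a tiled blue-noise texture) so the pattern is stable per pixel instead of shimmering every frame.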

---

Yes, I also thought about dithering, I just wanted to make sure I hadn't missed something obvious (like the colour picker trick... simple, awesome, effective, and it totally made me drop my jaw since I couldn't come up with it myself).

Well, now that I know for sure that it's the quantization when rendering to the window's back buffer, I can get things done again, thanks a lot!

So I guess the banding is most visible in darker images due to the contrast enhancement for darker regions in human vision? I think I heard something like this in a lecture on visualization...

The last time I heard about 30-bit colour depth was in 2009, when Nvidia released a paper about support for it in the Quadro series -- thanks for the reminder, too.

Edited by Wh0p