Direct3D 11 Deferred Shading Banding

18 comments, last by WFP 11 years ago

As far as I know, the A2R10G10B10 format is supported in full-screen modes and should give less banding if your monitor supports more than 8 bits per channel. The DX SDK even had a sample about it.

The problem you are describing is difficult to handle, but typically you won't notice it, since most surfaces are textured and texturing hides the problem pretty well.

Cheers!
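For reference, a minimal sketch (not from the thread) of how such a 10-bit back buffer could be requested in D3D11 through DXGI; the helper name and the 60 Hz double-buffered full-screen settings are placeholder assumptions:

```cpp
#include <windows.h>
#include <d3d11.h>

// Hypothetical helper: fills a swap-chain description that asks for a
// 10-bit-per-channel back buffer (DXGI_FORMAT_R10G10B10A2_UNORM).
DXGI_SWAP_CHAIN_DESC Make10BitSwapChainDesc(HWND hwnd, UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = width;
    desc.BufferDesc.Height = height;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per color channel
    desc.BufferDesc.RefreshRate.Numerator   = 60;
    desc.BufferDesc.RefreshRate.Denominator = 1;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage  = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount  = 2;
    desc.OutputWindow = hwnd;
    desc.Windowed     = FALSE; // the 10-bit format is typically used in full-screen modes
    desc.SwapEffect   = DXGI_SWAP_EFFECT_DISCARD;
    return desc;
}
```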

What monitor on earth, except a handful, supports more than 8 bits per channel? Those things are fantastically expensive, and supporting that is obviously not a consumer thing.

Some type of dithering is definitely the way to go here.


Well, I wouldn't say fantastically expensive (my monitor is less than 700 euros), though maybe the feature can't be found on typical consumer equipment.

In my opinion the OP doesn't need to bother with this problem. Like I said, texturing etc. will hide it. Has anyone tried dithering for this kind of problem?

Cheers!


A 10-bit display for 700 euros would be a pretty amazing deal. Even then, not many people spend 700 euros ($900 USD) on a monitor, probably less than 1% of people. Textures will only hide it a lot of the time, not all of the time.

FWIW, most LCDs say they're 8-bit, but are actually 6-bit with dynamic contrast...

To stress my previous answer: the problem described isn't that bad when surfaces are textured. Going 10-bit isn't the solution, since it's not a commonly supported format and typical hardware doesn't drive it.

[edit] Well, at least you can get an 8-bit screen with the A-FRC technique (temporal dithering) that gives something like 10-bit output :)

Cheers!

Banding is normal even with a 10-bits-per-component display. The only solution is to apply a dithering pass to the picture. Because Floyd-Steinberg is not really applicable, we use cheaper solutions. The most effective one is to add some noise to the color at the tone-mapping and HDR-to-LDR stage.

So in your pipeline, you accumulate light in linear space in a 16F render target (a half-precision float is really enough for adding lighting). Then, when moving from linear to gamma, from 16F to RGBA8, you use the formula gammaColor = pow(linearColor + noise1, 1.0f / 2.2f) + noise2. You can build a small noise texture (like 64x64), crafted well enough to avoid a visible pattern; you can also use dynamic noise to give a film-grain look that isn't constant.

By tweaking noise1 and noise2, you add/remove a small amount of value, enough to let the HDR-to-LDR conversion push some pixels one step up and some one step down.

After that, the human eye will do the color integration and the banding will disappear magically :)
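A minimal CPU-side sketch of that formula, just to illustrate the math (in a real pipeline this lives in the tone-mapping pixel shader; the function name and the clamping are my own assumptions):

```cpp
#include <cmath>
#include <cstdint>

// Dithered linear-to-gamma conversion for one channel.
// noise1/noise2 are small random offsets, e.g. sampled from a 64x64 noise texture.
uint8_t DitheredGammaQuantize(float linearColor, float noise1, float noise2)
{
    // Linear -> gamma, with noise injected before and after the pow()
    float gammaColor = std::pow(linearColor + noise1, 1.0f / 2.2f) + noise2;

    // Clamp and quantize the high-precision value down to 8 bits
    gammaColor = std::fmin(std::fmax(gammaColor, 0.0f), 1.0f);
    return static_cast<uint8_t>(gammaColor * 255.0f + 0.5f);
}
```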

Are you gamma-correcting this at all? The easiest way to do that is to use an sRGB backbuffer/RTV. You'll also need to de-gamma any color inputs to counteract the darkening effect.

EDIT: A (0.5/255) bias alternating across pixels/frames will also help somewhat. The idea is that you shift brightness for pixels that sit in between the two displayed color steps, so your eye does a little extra temporal integration as they flip-flop over time.
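For illustration, a minimal sketch of one way to get such an sRGB render target view in D3D11, assuming the swap chain was created with the back-buffer format DXGI_FORMAT_R8G8B8A8_UNORM_SRGB (the helper name is hypothetical):

```cpp
#include <windows.h>
#include <d3d11.h>

// Creates an RTV over the swap chain's sRGB back buffer. Writes to this RTV
// are gamma-encoded by the hardware, so the shader can stay in linear space.
HRESULT CreateSrgbBackBufferRtv(ID3D11Device* device,
                                IDXGISwapChain* swapChain,
                                ID3D11RenderTargetView** rtvOut)
{
    ID3D11Texture2D* backBuffer = nullptr;
    HRESULT hr = swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                                      reinterpret_cast<void**>(&backBuffer));
    if (FAILED(hr))
        return hr;

    // nullptr desc: the view inherits the texture's sRGB format
    hr = device->CreateRenderTargetView(backBuffer, nullptr, rtvOut);
    backBuffer->Release();
    return hr;
}
```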


FWIW, most LCDs say they're 8-bit, but are actually 6-bit with dynamic contrast...

Absolutely. Actually, an 8-bit display with a classic LDR pipeline is not supposed to show banding at all, and that is the origin of the term "true color". Basically, we see banding in recent years because of an industry regression.

There's not too much performance impact.

Ahem, surely not on your graphics card. But run it on an integrated Sandy Bridge, Fusion, or Intel GMA and you will get your slowdown. Bandwidth is the biggest thing that bites you when it comes to portability across categories of hardware, because once it's overloaded the framerate just plummets like lead sucked into a black hole.

Unfortunately, developing on a recent mid/high-range graphics card will always hide that fact.

Also beware of using more than 2 render targets at once. Drivers may say they support up to 4, or even 8 for DX10 hardware, but in practice it's sometimes very poorly implemented on low-end cards. (I'm thinking of the horrible integrated GeForce 6100, for example, and the even worse FireGL 5600.)

OK, here I did a test with a little C++ and the FreeImage library. I rendered a test cube in Blender with low lighting and no textures, then exported it as a 16-bit EXR. I quantized the image to 8 bits:

[attachment=14483:test.png]

On the top left you see the result of floor(pixelColor * 255.0f + 0.5f), and on the right you see floor(pixelColor * 255.0f + rand), where rand is between 0 and 1. At the bottom I adjusted the contrast to show it off better. The random bias is added after gamma correction but before quantization, so I don't think there is any way to use an sRGB backbuffer for this technique.
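For reference, the two quantizers being compared boil down to something like this (a plain C++ sketch without the FreeImage loading code; the function names are mine):

```cpp
#include <cmath>
#include <cstdint>
#include <cstdlib>

// Round-to-nearest quantization (top-left image).
uint8_t QuantizeRound(float pixelColor) // pixelColor in [0, 1], already gamma-corrected
{
    return static_cast<uint8_t>(std::floor(pixelColor * 255.0f + 0.5f));
}

// Random dither added before truncation (top-right image).
uint8_t QuantizeDithered(float pixelColor)
{
    float r = static_cast<float>(std::rand()) / static_cast<float>(RAND_MAX); // in [0, 1]
    float v = std::floor(pixelColor * 255.0f + r);
    return static_cast<uint8_t>(std::fmin(std::fmax(v, 0.0f), 255.0f));
}
```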

Just wanted to post a follow-up and say thanks to all the great replies here.

Gamma correcting turned out to be a huge help in this, and is something I will never overlook in my pipeline again!

Here is a more recent image with gamma correction and no noise added. On my LG monitor, there is just a small bit of banding noticeable, but on my Samsung second monitor there is practically none visible at all. I'm all but certain that when I get around to adding a bit of dithering or noise in there the banding will be almost completely unnoticeable.

Thanks again all!
