
Direct3D 11 Deferred Shading Banding


19 replies to this topic

#1 WFP   Members   -  Reputation: 483


Posted 23 March 2013 - 09:12 AM

Greetings,

 

I am having some issues with my deferred shading implementation that I am spinning my wheels on.

 

The back buffer format I am using is:

DXGI_FORMAT_R8G8B8A8_UNORM.

 

My GBuffer render targets are:

 

GBuffer Setup    R            G            B            A
SV_Target0       normal.x     normal.y     normal.z     specularPower
SV_Target1       diffuse.r    diffuse.g    diffuse.b    ambient.r
SV_Target2       specular.r   specular.g   specular.b   ambient.g
SV_Target3       position.x   position.y   position.z   ambient.b
 

SV_Target0 is DXGI_FORMAT_R32G32B32A32_FLOAT

SV_Target1 is DXGI_FORMAT_R8G8B8A8_UNORM

SV_Target2 is DXGI_FORMAT_R8G8B8A8_UNORM

SV_Target3 is DXGI_FORMAT_R32G32B32A32_FLOAT

 

The texture used for my light pass (additive blending) is also DXGI_FORMAT_R32G32B32A32_FLOAT.

 
I know some of the R32G32B32A32_FLOAT textures may be overkill, but I'm just trying to get things working before worrying about saving on bandwidth and other similar concerns. With such simple scenes (one sphere and a plane for the ground, oh my!) there's not much performance impact. :)
 
I have Googled around and tried a number of different fixes such as using a DXGI_FORMAT_R16G16B16A16_FLOAT back buffer format and changing the GBuffer and light pass render target formats around, but so far everything still has the ugly banding present.

 

Here is an example of what is going on.

ugly banding r8g8b8a8_unorm_bbuffer.PNG

 

Any insight into possible fixes I may have missed would be greatly appreciated. I may be missing something obvious and have just been staring at it for too long. If there is any other information I can supply to help troubleshoot this issue, just let me know.

 

Thanks!

 




#2 lipsryme   Members   -  Reputation: 985


Posted 23 March 2013 - 11:40 AM

Have you tried putting a texture on the plane? :)

I think that would plainly fix the problem you're having. (You are talking about the banding at the light's falloff on the ground, right?)

You might want to read this: http://en.wikipedia.org/wiki/Colour_banding. Basically, it has to do with the monitor not being able to produce the gradient steps in between.

Anyway try to put a texture on it.



#3 AgentC   Members   -  Reputation: 1219


Posted 23 March 2013 - 11:53 AM

Analyzing the picture in an image editor shows the 8-bit color value decreasing by one at each band, so you simply don't get more color resolution from an 8-bit backbuffer. The result would be the same with forward rendering.


Every time you add a boolean member variable, God kills a kitten. Every time you create a Manager class, God kills a kitten. Every time you create a Singleton...

Urho3D (engine)  Hessian (C64 game project)


#4 powly k   Members   -  Reputation: 630


Posted 23 March 2013 - 11:54 AM

Indeed, that seems to be just the unavoidable banding you always get with such a slowly changing gradient on current monitors. Another thing you can do is apply some very slight noise; it won't be visible as noise but will smooth out the gradient.



#5 WFP   Members   -  Reputation: 483


Posted 23 March 2013 - 12:02 PM

lipsryme: Yes, I should have mentioned it, but the texture I'm using for the floor plane is just a 1 x 1 pixel white texture tiled over the entire thing. The texture's format is X8R8G8B8, but a few moments ago I also tried changing that to A16G16B16R16F just to see if it would help. It did not.

 

AgentC: Interesting. What about the fact that the bands are still there when using the DXGI_FORMAT_R16G16B16A16_FLOAT back buffer format? I've posted a picture. Please disregard the fact that it isn't gamma corrected (it's automatically put into linear color space, see here: http://msdn.microsoft.com/en-us/library/windows/desktop/ff471325(v=vs.85).aspx). At this point, is it basically just the fact that the monitor is incapable of producing colors between those banded values? By the way, you're right: the image is produced identically when I switch to the forward renderer.

 

powly k:  That's a good idea, I may try that out if it turns out to be strictly a monitor-related issue.

 

Edited to include image and forward rendering comment.

Attached Thumbnails

  • ugly banding r16g16b16a16_float_bbuffer.PNG

Edited by WFP, 23 March 2013 - 12:04 PM.


#6 lipsryme   Members   -  Reputation: 985


Posted 23 March 2013 - 12:18 PM

Well, of course, what I was trying to say was not to use a solid color as the texture but an actual "texture", because you will only see color banding clearly on those solid-color gradients. Giving the backbuffer more precision won't help either; as powly k also pointed out, it's a (monitor) panel issue. Btw, that doesn't mean your monitor sucks xD; probably 90%+ of them will show this behavior.


Edited by lipsryme, 23 March 2013 - 12:19 PM.


#7 WFP   Members   -  Reputation: 483


Posted 23 March 2013 - 12:27 PM

lipsryme: OK, I see what you were saying now, and yes, using a "real" texture does alleviate the problem (for example, the sphere in the first picture I posted doesn't have an issue). I was just making sure I wasn't doing something incorrectly elsewhere to cause the bands, but it definitely looks to be monitor limitations.

 

Thank you and all others who posted for your help with this!



#8 AgentC   Members   -  Reputation: 1219


Posted 23 March 2013 - 01:30 PM

What about the fact that the bands are still there when using the DXGI_FORMAT_R16G16B16A16_FLOAT back buffer format?

 

Not 100% sure, but I would guess that Windows composites/blits a float backbuffer down to 8 bits per channel before displaying it.




#9 Chris_F   Members   -  Reputation: 1937


Posted 23 March 2013 - 03:02 PM

Of course there is going to be banding. Using a 16-bit back buffer isn't going to make any difference, since it's still going to get quantized to 8 bits when displayed. You would have to render to an HDR buffer and then perform some type of dithering operation when converting to 8-bit LDR for display.
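To make the idea concrete, here is a small Python sketch (my own, not from this thread) of quantizing a slow gradient to 8 bits with and without a sub-LSB random offset; the dithered version trades the long runs of identical values (the bands) for many small single-step transitions:

```python
import random

def quantize(v):
    # plain round-to-nearest 8-bit quantization
    return int(v * 255.0 + 0.5)

def quantize_dithered(v, rng):
    # replace the fixed 0.5 rounding offset with uniform noise in [0, 1)
    return max(0, min(255, int(v * 255.0 + rng.random())))

rng = random.Random(42)
# a gradient spanning only ~2.5 8-bit steps across 1024 pixels
gradient = [0.5 + 0.01 * i / 1023.0 for i in range(1024)]

plain = [quantize(v) for v in gradient]
dithered = [quantize_dithered(v, rng) for v in gradient]

# plain quantization produces long runs of identical values (the bands);
# dithering breaks them up into many single-step transitions
transitions_plain = sum(a != b for a, b in zip(plain, plain[1:]))
transitions_dithered = sum(a != b for a, b in zip(dithered, dithered[1:]))
print(transitions_plain, transitions_dithered)
```

The eye averages the noisy single-step flicker back into a smooth ramp, which is why the noise itself is barely visible at half-LSB amplitude.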



#10 kauna   Crossbones+   -  Reputation: 2154


Posted 23 March 2013 - 03:29 PM

As far as I know, the A2R10G10B10 format is supported in full-screen modes and should give less banding if your monitor supports more than 8 bits per channel. The DX SDK even had a sample about it.

 

The problem you are describing is difficult to handle, but typically you won't notice it, since most surfaces are textured and texturing hides the problem pretty well.

 

Cheers!


Edited by kauna, 23 March 2013 - 04:30 PM.


#11 Frenetic Pony   Members   -  Reputation: 1183


Posted 23 March 2013 - 04:53 PM

As far as I know, the A2R10G10B10 format is supported in full-screen modes and should give less banding if your monitor supports more than 8 bits per channel. The DX SDK even had a sample about it.

 

The problem you are describing is difficult to handle, but typically you won't notice it, since most surfaces are textured and texturing hides the problem pretty well.

 

Cheers!

 

What monitors on earth, except a handful, support more than 8 bits per channel? Those things are fantastically expensive, and supporting that is obviously not a consumer thing.

 

Some type of dithering is definitely the way to go here.



#12 kauna   Crossbones+   -  Reputation: 2154


Posted 23 March 2013 - 08:25 PM

Well, I wouldn't say fantastically expensive (my monitor is less than 700 euros), though maybe the feature can't be found on typical consumer equipment.

 

In my opinion, the OP doesn't need to bother with this problem. Like I said, texturing etc. will hide it. Has anyone tried dithering for this kind of problem?

 

Cheers!


Edited by kauna, 23 March 2013 - 08:26 PM.


#13 Chris_F   Members   -  Reputation: 1937


Posted 23 March 2013 - 09:10 PM

Well, I wouldn't say fantastically expensive (my monitor is less than 700 euros), though maybe the feature can't be found on typical consumer equipment.

 

In my opinion, the OP doesn't need to bother with this problem. Like I said, texturing etc. will hide it. Has anyone tried dithering for this kind of problem?

 

Cheers!

 

A 10-bit display for 700 euros would be a pretty amazing deal. Even then, not many people spend 700 euros ($900 USD) on a monitor; probably less than 1% of people do. Textures will only hide it a lot of the time, not all of the time.


Edited by Chris_F, 23 March 2013 - 09:11 PM.


#14 Hodgman   Moderators   -  Reputation: 27466


Posted 23 March 2013 - 09:26 PM

FWIW, most LCDs say they're 8-bit, but are actually 6-bit with dynamic contrast...

#15 kauna   Crossbones+   -  Reputation: 2154


Posted 24 March 2013 - 08:10 AM

I'll restate my previous answer: the problem described isn't that bad when surfaces are textured. Going 10-bit isn't the solution, since it's not a typical format and isn't supported by typical hardware.

 

[edit] Well, at least you can get an 8-bit screen with the A-FRC technique that gives something like 10-bit output :)

 

Cheers!


Edited by kauna, 24 March 2013 - 08:49 AM.


#16 galop1n   Members   -  Reputation: 226


Posted 24 March 2013 - 11:46 AM

Banding is normal even with 10-bit-per-component displays. The only solution is to apply a dithering pass to the picture. Because Floyd-Steinberg isn't really applicable, we use cheaper solutions. The most effective one is to add some noise to the color at the tone mapping / HDR-to-LDR stage.

 

So, in your pipeline, you accumulate light in linear space in a 16F render target (a small float is really enough for lighting). Then, when moving from linear to gamma, from 16F to RGBA8, you use the formula gammaColor = pow( linearColor + noise1, 1.f/2.2f) + noise2. You can build a small noise texture (like 64x64), carefully created to avoid visible patterns; you can also use dynamic noise to give a non-constant film-grain look.

 

By tweaking noise1 and noise2, you add/remove a small amount of value, enough to let the HDR-to-LDR conversion move some pixels one step up or down.

 

After that, the human eye will do the color integration and the banding will disappear magically :)
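As a rough illustration of that formula, here is a Python sketch (my own; the noise amplitudes of about half an LSB each are assumptions, and a real implementation would do this per pixel in the shader). Averaged over many frames, the dithered 8-bit output lands very close to the exact, unquantized gamma value:

```python
import random

def to_gamma_dithered(linear_color, rng):
    # noise amplitudes (~0.5 LSB each) are my assumption, not from the post
    noise1 = (rng.random() - 0.5) / 255.0   # pre-gamma noise
    noise2 = (rng.random() - 0.5) / 255.0   # post-gamma noise
    gamma = max(0.0, linear_color + noise1) ** (1.0 / 2.2) + noise2
    return max(0, min(255, int(gamma * 255.0 + 0.5)))  # quantize to 8 bits

rng = random.Random(1)
linear = 0.2                                  # arbitrary linear-space value
exact = (linear ** (1.0 / 2.2)) * 255.0       # the unquantized gamma level
frames = [to_gamma_dithered(linear, rng) for _ in range(4000)]
average = sum(frames) / len(frames)
print(exact, average)   # the temporal average tracks the exact value
```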



#17 InvalidPointer   Members   -  Reputation: 1364


Posted 25 March 2013 - 02:18 PM

Are you gamma-correcting this at all? The easiest way is to use an sRGB backbuffer/RTV. You'll also need to degamma any color inputs to counteract the darkening effect.

 

EDIT: A (0.5/255) bias alternating across pixels/frames will also help somewhat. The idea is that you shift the brightness of pixels that sit between two displayed color steps, so your eye does a little extra temporal integration as they flip-flop over time.
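A toy Python model of that alternating-bias idea (my own sketch; the value 128.4/255 is just an arbitrary in-between level): a pixel sitting between two 8-bit steps flip-flops across frames, and its temporal average lands between the steps instead of snapping to one of them:

```python
def display_level(value, frame):
    # alternate a half-LSB bias across frames (0.5/255 on odd frames)
    bias = 0.5 / 255.0 if frame % 2 else 0.0
    return int(min(1.0, value + bias) * 255.0 + 0.5)

value = 128.4 / 255.0             # sits between 8-bit steps 128 and 129
plain = int(value * 255.0 + 0.5)  # constant rounding: always 128
frames = [display_level(value, f) for f in range(10)]
average = sum(frames) / len(frames)
print(plain, average)   # flip-flopping averages to 128.5 instead of 128
```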


Edited by InvalidPointer, 25 March 2013 - 02:26 PM.

clb: At the end of 2012, the positions of jupiter, saturn, mercury, and deimos are aligned so as to cause a denormalized flush-to-zero bug when computing earth's gravitational force, slinging it to the sun.

#18 Lightness1024   Members   -  Reputation: 677


Posted 26 March 2013 - 09:05 AM

FWIW, most LCDs say they're 8-bit, but are actually 6-bit with dynamic contrast...

Absolutely. Actually, 8-bit displays with classic LDR content are not supposed to show banding at all; that is the origin of the term "true color". Basically, we see banding in recent years because of an industry regression.

 

there's not too much performance impact.

Hmm, surely not on your graphics card, but run it on an integrated Sandy Bridge, or Fusion, or an Intel GMA, and you will get your slowdown. Bandwidth is the biggest hit that bites you when it comes to portability across categories of hardware, because once you exceed capacity the framerate just plummets like lead sucked into a black hole.

Unfortunately, developing on a recent mid/high-range graphics card will always hide that fact.

Also beware of using more than 2 render targets at once. Drivers may say they support up to 4, or even 8 for DX10 hardware, but in practice on low-end cards it's sometimes very poorly implemented (I'm thinking of the horrible integrated GeForce 6100, for example, and the worse-than-shoddy FireGL 5600).



#19 Chris_F   Members   -  Reputation: 1937


Posted 26 March 2013 - 08:51 PM

OK, here I did a test with a little C++ and the FreeImage library. I rendered a test cube in Blender with low lighting and no textures, exported it as a 16-bit EXR, and quantized the image to 8 bits:

 

test.png

 

Top left you see the result of outputting floor(pixelColor * 255f + 0.5f); on the right, floor(pixelColor * 255f + rand), where rand is between 0 and 1. At the bottom I adjusted the contrast to show it off better. The random bias is added after gamma correction but before quantization, so I don't think there is any way to use an sRGB backbuffer with this technique.
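For what it's worth, a quick Python check (my own sketch, not Chris_F's actual code) that the floor(pixelColor * 255 + rand) quantizer is unbiased: its expectation over many draws equals the unquantized level, so the dithering introduces no brightness shift, only noise:

```python
import random

def quantize_random(x, rng):
    # rand is uniform in [0, 1), per the post's floor(x * 255 + rand)
    return int(x * 255.0 + rng.random())

rng = random.Random(7)
x = 0.3137                        # arbitrary value between two 8-bit steps
target = x * 255.0                # ~79.99, the unquantized level
samples = [quantize_random(x, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(target, mean)   # the sample mean converges to the unquantized value
```

Each sample is either of the two neighboring 8-bit levels, chosen with probability proportional to how close the true value sits to each; that proportional mix is exactly what keeps the expectation unbiased.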



#20 WFP   Members   -  Reputation: 483


Posted 27 March 2013 - 10:45 AM

Just wanted to post a follow-up and say thanks to all the great replies here.

 

Gamma correcting turned out to be a huge help in this, and is something I will never overlook in my pipeline again!

 

Here is a more recent image with gamma correction and no noise added. On my LG monitor there is just a small bit of banding noticeable, but on my Samsung second monitor there is practically none visible at all. I'm all but certain that when I get around to adding a bit of dithering or noise, the banding will be almost completely unnoticeable.

 

Thanks again all!

 

 

Attached Thumbnails

  • gammacorrected_no_noise.PNG




