jameszhao00

Screen Color Precision


I was testing a custom deferred renderer. The following were in the scene:
1. A box
2. A point light

Here's an image of the screen and the corresponding blurred image.



Note the blurring did not remove the banding.

Also, I've debugged this in PIX, and although each pixel returns a different output value (the values differ around the sixth decimal place), they're 'polarized' (what's the correct word?) by the time they get to the screen.

From this, I've concluded that the banding is a result of the display precision limitations.

In general, are there ways to make this issue disappear (probably unlikely), or at least make it less noticeable?
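
To illustrate, a minimal sketch in plain C++ (not the renderer's code; the sample values are made up) of how two outputs that differ by much less than one 1/255 step end up as the same 8-bit value:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    // Two hypothetical pixel-shader outputs that differ only in the
    // fifth/sixth decimal place -- far less than one 8-bit step (1/255).
    float a = 0.500001f;
    float b = 0.500050f;

    // What an 8-bit UNORM target effectively stores: round(x * 255).
    int qa = static_cast<int>(std::round(a * 255.0f));
    int qb = static_cast<int>(std::round(b * 255.0f));

    std::printf("%f -> %d\n", a, qa);  // 0.500001 -> 128
    std::printf("%f -> %d\n", b, qb);  // 0.500050 -> 128
    return 0;
}
```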

There is banding. The word you are looking for is aliasing. In your case it happens because a float value - your color channel values from the pixel shader - will be rounded to the target format's resolution (the bit depth of the color channels). I'm judging this from your PNG screenshot, which has 24 bits per pixel, i.e. 8 bits per channel: [0..1] gets scaled to [0..255].

To remove this banding "automatically" you have to render to a format which has more than these 8 bits per channel. This has to be supported by your graphics card: most (modern) cards can render even to a float-valued target, but frame buffer formats (the device back buffer which will be sent to your monitor) higher than 8 bits per channel are what you need in this case. Not all cards can do that (e.g. my GeForce 8500 GT can only do 8 bits and, well, less).
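
For illustration, a minimal D3D11-style sketch of creating such a higher-bit-depth target, here with 16-bit float channels (the helper name CreateHdrTarget is just mine; the same idea applies under D3D10):

```cpp
#include <d3d11.h>

// Sketch: create a render target with 16-bit float channels so lighting
// results keep far more precision than an 8-bit target.  The final copy
// to the back buffer is still limited by the swap chain format.
void CreateHdrTarget(ID3D11Device* device, UINT width, UINT height,
                     ID3D11Texture2D** tex,
                     ID3D11RenderTargetView** rtv,
                     ID3D11ShaderResourceView** srv)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R16G16B16A16_FLOAT;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    device->CreateTexture2D(&desc, nullptr, tex);
    device->CreateRenderTargetView(*tex, nullptr, rtv);
    device->CreateShaderResourceView(*tex, nullptr, srv);
}
```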

Then: your monitor must also cope with such a higher color resolution. I'm not sure, but AFAIK the relevant term is contrast. I also don't know what connection quality (HDMI, DVI, SVGA?) is needed to make this work.

You can "remove" banding in lower resolutions by dithering.
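
For example, a minimal sketch of an ordered (Bayer) dither, written as plain C++ rather than shader code (quantize_dithered is just an illustrative name): perturb each value by less than one 8-bit step before quantizing, so hard band edges dissolve into a fine pattern the eye averages out.

```cpp
#include <cstdint>

// 4x4 Bayer matrix; entries 0..15 become per-pixel rounding biases
// spread over one quantization step.
static const int kBayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

// value: linear output in [0,1]; x, y: pixel coordinates.
// Instead of rounding every pixel with the same 0.5 bias, the bias
// varies per pixel, so the average over a small area approximates the
// original value much more closely than hard rounding does.
uint8_t quantize_dithered(float value, int x, int y)
{
    float bias = (kBayer4[y & 3][x & 3] + 0.5f) / 16.0f; // in (0,1)
    int q = static_cast<int>(value * 255.0f + bias);     // truncation == floor here
    if (q < 0)   q = 0;
    if (q > 255) q = 255;
    return static_cast<uint8_t>(q);
}
```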

Edit: Forgot to tell you that blurring does not remove the banding.

This could be due to a million things, but the one that comes to mind first is gamma correction. Are you gamma correcting? Or... perhaps you are over gamma correcting? The picture you provided doesn't show much black. Perhaps you could show one that goes from white to black?

Quote:
Original post by unbird
...
Edit: Forgot to tell you that blurring does not remove the banding.


Yes, my intention in showing that blurred image was to prove that blurring doesn't remove the banding, hence it's probably an output precision issue :)

No, I'm not gamma correcting. I should probably try that. Is setting the DXGI format to blah_blah_sRGB enough?
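
For concreteness, a rough sketch of what requesting an sRGB back buffer looks like (D3D10/11-era BitBlt-model swap chain; the helper name DescribeSrgbSwapChain and its parameters are just placeholders):

```cpp
#include <d3d11.h>

// Sketch only: describe a swap chain whose back buffer uses an sRGB format,
// so the hardware converts the shader's linear output to sRGB on write.
DXGI_SWAP_CHAIN_DESC DescribeSrgbSwapChain(HWND hwnd, UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC scd = {};
    scd.BufferCount       = 1;
    scd.BufferDesc.Width  = width;
    scd.BufferDesc.Height = height;
    scd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB; // instead of ..._UNORM
    scd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    scd.OutputWindow      = hwnd;
    scd.SampleDesc.Count  = 1;
    scd.Windowed          = TRUE;
    scd.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;
    // Pass this to D3D11CreateDeviceAndSwapChain, then create the render
    // target view with the same _SRGB format.
    return scd;
}
```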

Having a proper linear-lighting/gamma-correction pipeline is key. Most diffuse albedo maps are going to be in sRGB space, which means you will need to load them as an sRGB format so that they get converted to linear values when sampled in the shader. However normal maps and other tool-generated textures will often be linear. Whenever you render to an 8-bit render target (including the back buffer), you'll want to use an sRGB format so that the linear values output from your shader get converted to sRGB.
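
For reference, the per-channel conversion those sRGB formats perform, transcribed as plain C++ (the hardware applies it automatically when sampling an sRGB-format texture and when writing to an sRGB-format render target):

```cpp
#include <cmath>

// Standard sRGB <-> linear conversion for one channel in [0,1].
float srgb_to_linear(float c)
{
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

float linear_to_srgb(float c)
{
    return (c <= 0.0031308f) ? c * 12.92f
                             : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}
```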

Properly calibrating your monitor can also be very important. There are guides on the internet that you can try and use, or there are professional-quality tools available for purchase.

The Gaussian image looks very smooth on my monitor. There is no noticeable banding unless you look really closely and really hard.

The RAW image has horrible banding.

Perhaps it is your monitor? Some LCD displays have 6-bit depth instead of 8. In any case, it could just as easily be your monitor or its calibration, not the image.

Your image has all the possible values in the range. It doesn't skip even one color in a 'band' that I could find, but goes ... 253 252 251 250 249 ..., so any visible banding is either because your eyes can see the difference between those colors or because your monitor isn't good enough. I believe many cheaper monitors actually don't even have 256 levels, but fake it with patterns.
You could try to fix it with dithering. I saved the following image by converting your image to 16 bits per channel in Photoshop, then blurring it and converting it back to 8 bits. The result seems to have picked up some dithering that hides the bands.

Forgot to mention how one can see banding: zoom in and out of the picture quickly and repeatedly. The eye can detect changes in contrast better when something moves. Also: contrast is perceived worse in the center of your focus. Try this on a clear night: look slightly past a weakly shining star. You will see it better than if you were looking directly at it.
