Screen Color Precision

I was testing a custom deferred renderer. The following were in the scene:
1. A box
2. A point light

Here's an image of the screen and the corresponding blurred image:



Note the blurring did not remove the banding.

Also, I've debugged this in PIX, and although each pixel returns a different output value (to six decimal places), the values are 'polarized' (what's the correct word?) by the time they get to the screen.

From this, I've concluded that the banding is a result of the display precision limitations.

In general, are there ways to make this issue disappear (probably unlikely), or at least make it less noticeable?

There is banding. The word you are looking for is aliasing. In your case it happens because a float value - your color channel values from the pixel shader - will be rounded to the target format's resolution (the bit depth of the color channels). I'm judging this from your PNG screenshot, which has 24 bits per pixel, i.e. 8 bits per channel: [0..1] gets scaled to [0..255].
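As a tiny illustration (my own example values, not taken from the renderer in question), two float outputs that differ by less than one 8-bit step (1/255) collapse to the same byte:

[code]
// Two slightly different pixel shader outputs end up as the same 8-bit value
// once the [0..1] float is rounded to [0..255].
#include <cmath>
#include <cstdio>

int main()
{
    float a = 0.500001f;                  // hypothetical shader output
    float b = 0.501900f;                  // a nearby but different output
    long  qa = std::lround(a * 255.0f);   // what an R8G8B8A8 target stores
    long  qb = std::lround(b * 255.0f);
    std::printf("%f -> %ld, %f -> %ld\n", a, qa, b, qb);   // both print 128
    return 0;
}
[/code]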

To remove this banding "automatically" you have to render to a format which has more than these 8 bits per channel. This has to be supported by your graphics card: most (modern) cards can render even to a float-valued target, but frame buffer formats (the device back buffer which gets sent to your monitor) higher than 8 bits per channel are what you need in this case. Not all cards can do that (e.g. my GeForce 8500 GT can only do 8-bit and, well, less).
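In D3D11 terms, rendering to a higher-precision offscreen target might look roughly like this (a sketch only: device is assumed to be an existing ID3D11Device*, the format and sizes are placeholders, and support should be verified, e.g. with ID3D11Device::CheckFormatSupport; for the back buffer itself you would need something like DXGI_FORMAT_R10G10B10A2_UNORM):

[code]
// Sketch: a 16-bit-per-channel float render target instead of an 8-bit one.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = 1280;                            // placeholder size
desc.Height           = 720;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_R16G16B16A16_FLOAT;  // 16 bits per channel
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D*        tex = nullptr;
ID3D11RenderTargetView* rtv = nullptr;
device->CreateTexture2D(&desc, nullptr, &tex);           // error checking omitted
device->CreateRenderTargetView(tex, nullptr, &rtv);
[/code]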

Then your monitor must be able to cope with such a higher color resolution (its color depth, if that's the right term). I don't know what connection type (HDMI, DVI, SVGA?) is needed to make this work, though.

You can "remove" banding in lower resolutions by dithering.

Edit: Forgot to tell you that blurring does not remove the banding.

This could be due to a million things, but the one that comes to mind first is gamma correction. Are you gamma correcting? Or... perhaps you are over-correcting? The picture you provided doesn't show much black. Perhaps you could show one that goes from white to black?

Quote:
Original post by unbird
...
Edit: Forgot to tell you that blurring does not remove the banding.


Yes, my intention in showing the blurred image was to prove that blurring doesn't remove the banding, hence it's probably an output precision issue :)

No, I'm not gamma correcting. I should probably try that. Is setting the DXGI format to blah_blah_sRGB enough?

Having a proper linear-lighting/gamma-correction pipeline is key. Most diffuse albedo maps are going to be in sRGB space, which means you will need to load them as an sRGB format so that they get converted to linear values when sampled in the shader. However, normal maps and other tool-generated textures will often be linear. Whenever you render to an 8-bit render target (including the back buffer), you'll want to use an sRGB format so that the linear values output from your shader get converted to sRGB.
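As a rough sketch of the texture-loading side in D3D11 (not mjp's code: albedoTex is a hypothetical ID3D11Texture2D* created with a matching TYPELESS format), declaring the view as sRGB makes the hardware linearize the texels when the shader samples them:

[code]
// Tell the hardware the texels are sRGB-encoded; samples arrive linearized.
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format                    = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
srvDesc.ViewDimension             = D3D11_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MostDetailedMip = 0;
srvDesc.Texture2D.MipLevels       = 1;                   // placeholder mip count

ID3D11ShaderResourceView* albedoSRV = nullptr;
device->CreateShaderResourceView(albedoTex, &srvDesc, &albedoSRV);
[/code]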

Properly calibrating your monitor can also be very important. There are guides on the internet that you can try and use, or there are professional-quality tools available for purchase.

The Gaussian image looks very smooth on my monitor. There is no noticeable banding unless you look really closely and really hard.

The RAW image has horrible banding.

Perhaps it is your monitor? Some LCD displays have 6-bit depth instead of 8. In any case, it could just as easily be your monitor or its calibration, not the image.

Your image has all the possible values in the range. It doesn't skip even one color in a 'band' that I could find, but goes ... 253 252 251 250 249 ..., so any visible banding is either because your eyes can see a difference between those colors or because your monitor isn't good enough. I believe many cheaper monitors actually don't even have 256 levels, but fake it with patterns.
You could try to fix it with dithering. I saved the following image by converting your image to 16 bits per channel in Photoshop, then blurring it and converting it back to 8 bits. It seems to have some dithering that hides the bands.

Forgot to mention how one can see banding: zoom in and out of the picture quickly and repeatedly. The eye detects changes in contrast better when something moves. Also, contrast is perceived worse at the center of your focus. Try this on a clear night: look slightly past a weakly shining star, and you will see it better than if you were looking directly at it.

Found a fix (not really a fix, but more of a hack):

Turn off / minimize light attenuation for point lights. (The light attenuation was creating the elongated lighting gradient, which was in turn being 'aliased' by the 8-bit output.)
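For illustration only (the function name and falloff constants below are made up, not the renderer's actual code), the point is that a long, slow falloff quantizes into visible 8-bit steps, while a flattened or clamped falloff does not:

[code]
// Sketch: smooth distance attenuation vs. the "hack" of flattening it.
float Attenuation(float distance, float range, bool attenuationEnabled)
{
    // Smooth falloff: a long, gentle gradient that bands badly at 8 bits.
    float att = 1.0f / (1.0f + 0.1f * distance + 0.01f * distance * distance);

    // The hack: a flat response inside the light's range, so there is no
    // long gradient left to band.
    float flat = (distance < range) ? 1.0f : 0.0f;

    return attenuationEnabled ? att : flat;
}
[/code]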

I also enabled sRGB output and everything looks brighter(/better). Thank you allingm/mjp.

I retract my previous statement upon seeing the image inside Photoshop.

I also noticed that gradients in Photoshop (even black/white) contain color dithering, which eliminates any perceivable banding.



Without some form of dithering, bands will always be there.

Did you fix the gamma problem? Whether or not it is causing your banding, you should take care of it in my opinion.

If you don't know anything about gamma, you can take a look at the following presentation. It's my favorite on the subject.
http://filmicgames.com/archives/6 (the presentation is on the "here" link)

sRGB will help. Textures are usually in sRGB and they need to be converted to linear on texture read. Also, any render target that is 8-bit should be sRGB if you want more definition at low values, i.e. the dark parts (for instance your diffuse buffer or your lighting buffer, but not your normal buffer). Finally, your back buffer should be in sRGB because this is what your monitor expects. DirectX 10 and 11 will automatically do the conversion for you on write/read if you specify the texture as an _sRGB format.
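The back buffer part of that might look roughly like this (a D3D11-era sketch with illustrative values; hwnd is assumed to be your window handle):

[code]
// Sketch: an sRGB back buffer, so linear shader output is gamma-encoded on write.
DXGI_SWAP_CHAIN_DESC scDesc = {};
scDesc.BufferCount       = 1;
scDesc.BufferDesc.Width  = 1280;                          // placeholder size
scDesc.BufferDesc.Height = 720;
scDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
scDesc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
scDesc.OutputWindow      = hwnd;
scDesc.SampleDesc.Count  = 1;
scDesc.Windowed          = TRUE;
scDesc.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;
[/code]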

If you have an image specified as floating-point values and you want to convert them to integers without appreciable banding, you can use the type of dithering techniques that were common when people needed to use only 256 colors in pictures.

One basic algorithm starts with the top-left corner, rounds its color to an integer color, computes the difference between the desired color and the integer color, and propagates the difference to neighboring pixels: Half to the pixel to the right and half to the pixel below. Now do the same thing with the next pixel (e.g., the one on the right). When you are done with the row, move to the next row and do the same thing.
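A direct sketch of that algorithm in C++ (single channel for brevity; run it once per channel for RGB; this is the simple half-right/half-below diffusion described above, not full Floyd-Steinberg):

[code]
#include <algorithm>
#include <cmath>
#include <vector>

// 'src' holds linear values in [0,1], row-major, width*height entries.
std::vector<unsigned char> DitherToBytes(std::vector<float> src, int width, int height)
{
    std::vector<unsigned char> out(src.size());
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            int   i       = y * width + x;
            float desired = std::clamp(src[i], 0.0f, 1.0f) * 255.0f;
            float rounded = std::round(desired);
            out[i]        = (unsigned char)rounded;

            float error = (desired - rounded) / 255.0f;         // back to [0,1] units
            if (x + 1 < width)  src[i + 1]     += 0.5f * error; // half to the right
            if (y + 1 < height) src[i + width] += 0.5f * error; // half to the pixel below
        }
    }
    return out;
}
[/code]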

Another, easier alternative is to have an image with values between -0.5 and +0.5 for each pixel (three values per pixel if you use RGB). You add the value from this image to each pixel before you round to an integer.
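Which might look something like this (a sketch; 'noise' is a hypothetical per-pixel value taken from such an image, in units of one 8-bit step):

[code]
#include <algorithm>
#include <cmath>

// Add the per-pixel offset in [-0.5, +0.5] before rounding to 8 bits.
unsigned char QuantizeWithNoise(float value01, float noise)
{
    float v = std::round(value01 * 255.0f + noise);
    return (unsigned char)std::clamp(v, 0.0f, 255.0f);
}
[/code]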

But you are probably being obsessive and the banding is actually not that big a deal once you have 8 bits per channel.

@allingm
Yeah, I did. I set the framebuffer format to rgba8_srgb. Also, I fixed my laptop's gamma (which was horrendously off, by the way). The banding is barely noticeable now. Thank you for the suggestion; I didn't know gamma could make such a big difference.

I'm not doing light prepass (assuming that's what the lighting buffer you mentioned is for). But if I were doing light prepass, that lighting buffer shouldn't be in gamma space, should it?

@alvaro
Thanks I'll check them out.

Hmm, if it's a deferred renderer then perhaps it's caused by linear filtering of the depth texture. You should change it to use point sampling instead. Second, maybe you need to store the normal in a float buffer (such as RG16F) and store viewspacenormal.xy instead. Hope this helps.
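In D3D11 terms those two suggestions might look roughly like this (a sketch with illustrative names, not code from the renderer in question):

[code]
// 1) Point-sample the depth texture instead of filtering it.
D3D11_SAMPLER_DESC sampDesc = {};
sampDesc.Filter         = D3D11_FILTER_MIN_MAG_MIP_POINT;  // no interpolation of depth
sampDesc.AddressU       = D3D11_TEXTURE_ADDRESS_CLAMP;
sampDesc.AddressV       = D3D11_TEXTURE_ADDRESS_CLAMP;
sampDesc.AddressW       = D3D11_TEXTURE_ADDRESS_CLAMP;
sampDesc.ComparisonFunc = D3D11_COMPARISON_NEVER;
sampDesc.MaxLOD         = D3D11_FLOAT32_MAX;

ID3D11SamplerState* pointSampler = nullptr;
device->CreateSamplerState(&sampDesc, &pointSampler);

// 2) Store viewspacenormal.xy in a two-channel 16-bit float G-buffer target
//    (the "RG16F" above), i.e. DXGI_FORMAT_R16G16_FLOAT, and reconstruct z
//    in the shader.
[/code]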

Gamma space / sRGB is just a compression scheme. Your eyes detect changes in light by percentage; therefore, they are more sensitive to changes in the darks, values closer to 0.0f. If we have to store data in an 8-bit channel we could just store it directly (linear), or, if we realize the above, we could scrunch the numbers so that there is more precision near 0.0f and less near 1.0f. We do this with a function similar to pow( RGB, 1.0f / 2.2f ) for linear to gamma, and pow( RGB, 2.2f ) for gamma to linear. The sRGB texture formats in DirectX 10 & 11 simply do this compression and decompression on write and read for you. Notice that there is no sRGB variant of the floating point formats because they already have enough precision.
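In code, those two conversions are simply the following (using the 2.2 power approximation mentioned above rather than the exact piecewise sRGB curve):

[code]
#include <cmath>

float LinearToGamma(float c) { return std::pow(c, 1.0f / 2.2f); } // encode for storage/display
float GammaToLinear(float c) { return std::pow(c, 2.2f); }        // decode before lighting math
[/code]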

Now, with the above in mind, we realize we should use sRGB anywhere we want more fidelity near 0.0f. This is typically done for diffuse textures (i.e. photo-like content) and for your monitor output, but you can do it anywhere you'd like. Would it be good for normals? Seeing as we want a uniform distribution (linear), I would say no. How about a lighting buffer? Well, we want dark lights to be more precise, right? Basically, the rule of thumb is that if your eyes are going to see it, you should compress it. One place of debate, however, might be specular values, since they are visual and mathematical at the same time.

[quote name='jameszhao00' timestamp='1293114730' post='591224']
I was testing a custom deferred renderer. The following were in the scene:
1. A box
2. A point light

[/quote]

It should be noted that when drawing a series of adjacent 8-bit RGB values, your brain will detect any variation in the rate of change in color and perceive an edge there, which amplifies the effect. Basically, your brain is capable of detecting the derivative of the sequence and telling you 'hey, there's an edge here'. This effect can be called banding, though I tend to reserve that word for when the values aren't numerically adjacent (say, for instance, after a gamma correction step) in whatever the frame buffer format is.

i.e.


[font="Courier New"]0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 4, 5, 5, 5
[/font][font="Courier New"]

Causes our brain to notice the different in rate of change between 2..3 and 3..4
[/font]

I don't see any banding either. It's perfectly smooth in both cases. Your laptop LCD probably doesn't support the full color gamut. Some LCDs are very poor with their color gamut support. I've got a secondary LCD in here that can't show all the different colors in the gmail interface. They all get rounded to the nearest displayable color, which makes all the subtle shades show up as the same color.

If the LCD monitor doesn't have a wide gamut backlight, you won't see all the possible shades. Some very poor LCDs only support ~16 bit color, and everything looks dithered to hell.

[quote name='andur' timestamp='1294727660' post='4757111']
Do you have any color post processing turned on, on your monitor itself? Those modes can add in weird banding artifacts and the like. I don't see any banding on my monitor.
[/quote]
Or any other strange setting, for that matter? For example, my monitor has a "Response Time" setting. Changing it from "Normal" to "Fast" or "Very Fast" causes it to reduce the 8 bits per channel to something that looks like 4 bits per channel.

Most LCD TN panels have really crappy color quality by my standards; I read they often use far less than 8 bits per channel (for the sake of response times, they will tell you). They make me miss my 24" 11-year-old CRT! I will get an IPS LCD one day.

By the way, I can see the banding on both a desktop LCD and a laptop LCD (both TN panels). The banding on the laptop is less noticeable, but I wonder if that is because of better processing or just because of lower brightness (nits).

Or... accept the ugly truth: 8 bits per channel is good, but still not enough precision to get a perfectly nice and smooth gradient.
You can do a simple test: just render a black-to-white gradient without any kind of dithering, where each individual color gets, let's say, 5 or more pixels (i.e. a 1280-pixel-wide gradient), and see how many bands you can count on your monitor. If you count around 256 different bands then the display is not the problem... it's just precision!
There is an old 10-bit scanout sample in the Microsoft DirectX SDK that can show you how much better a 10-bits-per-channel gradient can look (after all, you get 4 times more precision and still use a 32-bit render target).
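A minimal sketch of such a test gradient (written out as a PGM file here just so any image viewer can show it; rendering the same ramp in-engine works equally well):

[code]
#include <cstdio>

int main()
{
    // 1280 pixels wide: every 8-bit grey level gets 5 pixels.
    const int width = 1280, height = 128;
    std::FILE* f = std::fopen("gradient.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n255\n", width, height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            std::fputc((x * 256) / width, f);   // 0..255, no dithering
    std::fclose(f);
    return 0;
}
[/code]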
