What is this phenomenon called?

8 comments, last by Krypt0n 11 years, 7 months ago
I'm not really sure how to describe it, but it's in every game, 2D and 3D, and also on wallpapers and everything. Whenever there is a sky, space, or some kind of nebula graphic, it looks as if it's made of layers splatted on one another. Or when there's a sky with a light gradient in it, it's NEVER smooth; even in the newest games you can notice the gradient is made of belts of pixels of the same color. Is this some sort of automatic GPU optimization to even out similar nearby pixels or something?
Color Banding - very annoying, and it can be reduced through dithering. However, different contrast settings on monitors make it hard to predict what's going to look ugly or not. Something that looks fine on my monitor may look outright ugly on yours, or vice versa. I encounter this a lot - I scan an image and edit it, and it looks perfectly fine on my desktop, but on my laptop the monitor contrast between the colors is much higher, and the parts I edited stick out very clearly.

If/when we move to 16 bits per color channel (with both hardware and software to support it), it should be greatly reduced if not eradicated entirely (unless someone has insane vision). Eradicate the gradient color banding, I mean - not the ugly contrast of my scanned images, which I have to touch up manually to blend the edited parts with the non-edited ones.
A lot of cheap LCD screens make this effect worse -- the standard monitor output is 24-bit sRGB (8 bits / 256 colors per channel, on a curve that dedicates more bits to the dark areas, where humans are good at picking up on color banding). However, cheap LCD screens convert that signal down to just 18 bits (6 bits / 64 colors per channel).
They try to hide this flaw by dynamically ramping the brightness/contrast settings up or down so that, for example, if you're looking at an image with only color values from 0-64 or 128-192, the monitor will be able to display it perfectly. The problem is that with very vivid / high-contrast images these hacks don't work, and the monitor shows its true colors.
Also, some monitors have different color curves (for instance mine has "Standard", "Gaming", "Scenery", "Theater", "Night View" modes) which may alleviate the problem... but in general, yeah, 48/64-bit color would be the ultimate fix - I doubt anyone would perceive disturbing gradients at that resolution.
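Just to illustrate the truncation, here's a rough C++ sketch (not what any real panel's firmware actually does - real 6-bit panels usually hide it with temporal dithering/FRC on top):

[code]
// Rough sketch: an 8-bit ramp truncated to 6 bits per channel,
// roughly how a cheap 18-bit panel treats a 24-bit signal.
#include <cstdio>
#include <cstdint>

int main()
{
    for (int x = 0; x < 16; ++x)
    {
        uint8_t in  = static_cast<uint8_t>(x);   // smooth 8-bit ramp: 0, 1, 2, ...
        uint8_t out = (in >> 2) << 2;            // 6-bit panel: low 2 bits dropped
        std::printf("8-bit %3d -> displayed as %3d\n", in, out);
    }
    // Every 4 consecutive input values collapse to the same output value,
    // so a gentle gradient turns into visible bands.
    return 0;
}
[/code]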

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

It's a problem due to liquid crystals that can't be oriented very precisely. CRTs had a true 24-bit display, and there was no banding. MVA LCDs are better for viewing angles, and IPS is better for colors. TV screens are mostly IPS; computer monitors are mostly TN, which is the poorest of all LCD technologies - the only advantage is the speed.

computer monitors are mostly TN, which is the poorest of all LCD technologies - the only advantage is the speed

Don't forget price. TN panels are hideously cheap.

But if the banding is an issue for you, cheaper IPS panels can be had from about $150 these days, with some very decent ones coming in around the $400 mark. Even though the cheaper IPS displays are also 6-bit (+2 dither) panels, they seem to suffer much less from banding and other colour artefacts than cheap TN panels do.

And if you happen to have money to burn, $1,000+ will net you a true 10-bit panel, which (at least in theory) makes banding a thing of the past.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

There's always some amount of banding - the colours are quantized to a lesser precision than your eye can distinguish, after all.
e.g. in this image here, I've painted a green gradient from ~8 to ~30 over a large area, which means that each discrete level of green covers quite a thick horizontal band.
On any monitor, you should be able to see the different bands of green as horizontal lines -- also, your brain should trick you with the 'Mach bands' optical illusion, where each constant band appears as if it has a gradient inside it which is lighter at the bottom of the band and darker at the top.
n.b. I've dithered the right-hand side of the image -- so the left side should show bands, while the right side should have no bands, but instead look noisy.
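If you want to reproduce that kind of test yourself, here's a rough C++ sketch along the same lines (the dimensions and noise amount are just illustrative, and it's not the exact image above): a shallow green gradient from ~8 to ~30 written out as a PPM, with simple random dithering applied to the right half before quantization.

[code]
// Rough sketch: shallow green gradient (~8..30) over the image height,
// with random dithering on the right half before quantizing to 8 bits.
// Output is a binary PPM file that most image viewers can open.
#include <cstdio>
#include <cstdlib>

int main()
{
    const int W = 512, H = 512;
    std::FILE* f = std::fopen("gradient.ppm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", W, H);

    for (int y = 0; y < H; ++y)
    {
        // Continuous green level between ~8 (top) and ~30 (bottom).
        float green = 8.0f + 22.0f * (float(y) / float(H - 1));
        for (int x = 0; x < W; ++x)
        {
            float g = green;
            if (x >= W / 2)   // dither the right half only
                g += float(std::rand()) / float(RAND_MAX) - 0.5f;  // +/- 0.5 of noise
            int q = int(g + 0.5f);                                 // quantize to an integer level
            if (q < 0)   q = 0;
            if (q > 255) q = 255;
            unsigned char px[3] = { 0, (unsigned char)q, 0 };
            std::fwrite(px, 1, 3, f);
        }
    }
    std::fclose(f);
    return 0;
}
[/code]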

And if you happen to have money to burn, $1,000+ will net you a true 10-bit panel, which (at least in theory) makes banding a thing of the past.
Can you recommend one? I was looking for them, but couldn't really find a good one (except one directly from Dolby, which is way more expensive).
I'd love to disable tone mapping and output HDR directly to my display :) - with dithering at 10 bits/channel the quality should be really superb.

I'd love to disable tone mapping and output HDR directly to my display
N.B. you still need tone-mapping for a 10-bit display as it's only 1024 levels, whereas normal human vision is a massive range from 1/1,000,000th of a candle up to 1,000,000 candles.
From 1 millionth to 1/100th of a candle is "night-vision", which you have to simulate with a tone-mapper (like this one), as it won't actually kick in unless you're in a completely black room -- it takes a while for you to switch into this vision mode, but once you do, your rod-cells can fire with as little input as a single photon. It's very sensitive, though you lose all colour perception and only see 'blue' light.
So even on a theoretical true HDR float-32 display, low-light night scenes would have to be tone-mapped, or they'd just look black in regular viewing conditions.

From 1/100th to 3 candles is regular dimly lit vision. 3 candles up to 1M candles is regular daytime vision, which is way too large a range to display without tone-mapping.
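As a rough illustration of what a tone-mapper has to do (this is a generic Reinhard-style operator, not the one from the link above, and the exposure value is just an arbitrary knob):

[code]
// Minimal sketch: a Reinhard-style operator compressing an unbounded scene
// luminance into [0,1] before quantizing to the display's bit depth.
#include <cstdio>
#include <cmath>

// Map scene luminance (arbitrary HDR units) to a displayable [0,1] value.
float tonemapReinhard(float luminance, float exposure)
{
    float l = luminance * exposure;   // adapt to the current viewing range
    return l / (1.0f + l);            // compress the remaining range into [0,1]
}

int main()
{
    const float exposure = 0.18f;     // hypothetical "key" of the scene
    const int   levels   = 1024;      // 10-bit display: 1024 output steps

    // Scene values spanning many orders of magnitude, e.g. moonlight to sunlight.
    const float scene[] = { 0.001f, 0.1f, 1.0f, 100.0f, 10000.0f, 1000000.0f };
    for (float l : scene)
    {
        float mapped = tonemapReinhard(l, exposure);
        std::printf("luminance %12.3f -> display level %4d / %d\n",
                    l, (int)(mapped * (levels - 1) + 0.5f), levels);
    }
    return 0;
}
[/code]

Without the compression step, everything below the top few stops of that range would quantize to the bottom handful of the 1024 levels and read as black.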

[quote name='Krypt0n' timestamp='1347286803' post='4978579']
I'd love to disable tone mapping and output HDR directly to my display
N.B. you still need tone-mapping for a 10-bit display as it's only 1024 levels, whereas normal human vision is a massive range from 1/1,000,000th of a candle up to 1,000,000 candles.
[/quote]While you could sense a massive range, you cannot do it all at the same time; your eyes adapt to just a subband of that range. If I can use a range of 10 bits, I hope to have enough "brightness" to play with to blind you and yet render darker areas (with dithering, yet without tonemapping) to allow your eyes to adapt to them after a few seconds.
I don't plan to show you the night sky and the sun at the same time, so I don't plan to tonemap that massive range to only 1024 levels. (I admit, I was actually looking for a 12-bit HDR display, but I guess 10 bits will be better than the usual 6 bits.)

I've previously made some tests with the RAW picture mode of my DSLR, and 12 bits, while not perfectly covering the whole range, can capture quite a good range without massive banding.
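To make the idea concrete, here's a rough sketch of what I have in mind (purely illustrative numbers, nothing from a real driver or display API): a plain linear exposure scale into the display's 10-bit range, plus a little dither noise before quantization, and no tone-mapping curve at all.

[code]
// Rough sketch: no tone-mapping curve, just a linear exposure scale into
// the display's 10-bit range, plus ~0.5 LSB of dither noise before
// quantization to hide banding. All values are illustrative.
#include <cstdio>
#include <cstdlib>
#include <cstdint>

// Quantize a linear [0,1] value to a 10-bit level, optionally dithered.
uint16_t toTenBit(float value, bool dither)
{
    float v = value * 1023.0f;
    if (dither)
        v += float(std::rand()) / float(RAND_MAX) - 0.5f;  // +/- 0.5 LSB of noise
    if (v < 0.0f)    v = 0.0f;
    if (v > 1023.0f) v = 1023.0f;
    return (uint16_t)(v + 0.5f);
}

int main()
{
    const float exposure = 1.0f / 50.0f;  // hypothetical: the scene currently spans ~0..50
    const float sceneLuminance = 3.7f;    // some HDR value coming out of the renderer

    uint16_t level = toTenBit(sceneLuminance * exposure, true);
    std::printf("output level: %d / 1023\n", (int)level);
    return 0;
}
[/code]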

This topic is closed to new replies.
