
What is this phenomenon called?


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

9 replies to this topic

#1 mrheisenberg   Members   -  Reputation: 356


Posted 09 September 2012 - 08:16 PM

I'm not really sure how to describe it, but it's in every game, 2D and 3D, and also in wallpapers and everything else. Whenever there is a sky, space, or some kind of nebula graphic, it looks as if it's made of layers splatted on top of one another. Or when there's a sky with a light gradient in it, it's NEVER smooth, even in the newest games: you can see that the gradient is made of bands of same-colored pixels. Is this some sort of automatic GPU optimization that evens out similar nearby pixels, or something?

#2 Servant of the Lord   Crossbones+   -  Reputation: 18283


Posted 09 September 2012 - 08:42 PM

Color banding - very annoying, and it can be reduced through dithering. However, different contrast settings on monitors make it hard to predict what will look ugly and what won't. Something that looks fine on my monitor may look outright ugly on yours, or vice versa. I encounter this a lot - I scan an image and edit it, and it looks perfectly fine on my desktop, but on my laptop the contrast between the colors is much higher, and the parts I edited stick out very clearly.
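To make the dithering idea concrete, here's a rough Python sketch (not from the thread, purely illustrative): a smooth 0-1 gradient is quantized to 8 levels, once directly (producing hard bands) and once with random noise of about one quantization step added before rounding.

```python
import random

def quantize(v, levels=8):
    """Map a 0.0-1.0 value to one of `levels` discrete steps (banding)."""
    return round(v * (levels - 1)) / (levels - 1)

def quantize_dithered(v, levels=8, rng=random.random):
    """Add noise of up to one quantization step before rounding.
    On average the output still tracks v, but neighbouring pixels
    land on different steps, breaking up the hard band edges."""
    step = 1.0 / (levels - 1)
    noisy = v + (rng() - 0.5) * step
    return quantize(min(1.0, max(0.0, noisy)), levels)

# A smooth gradient of 256 samples...
gradient = [i / 255 for i in range(256)]
banded   = [quantize(v) for v in gradient]
dithered = [quantize_dithered(v) for v in gradient]

# Hard quantization leaves only 8 flat bands; dithering trades
# them for noise whose local average follows the true gradient.
print(len(set(banded)))  # 8 distinct values
```

The eye averages the noise back into an apparent gradient, which is why dithered output looks smoother than banded output even though both use the same 8 levels.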

If/when we move to 16 bits per color channel (with both hardware and software to support it), gradient color banding should be greatly reduced, if not eradicated entirely (unless someone has insane vision). The banding, I mean - not the ugly contrast of my scanned images, which I still have to touch up manually to blend the edited parts with the non-edited ones.

Edited by Servant of the Lord, 09 September 2012 - 08:43 PM.



#3 Hodgman   Moderators   -  Reputation: 28747


Posted 09 September 2012 - 11:32 PM

A lot of cheap LCD screens make this effect worse -- the standard monitor output is 24-bit sRGB (8 bits / 256 levels per channel, on a curve that dedicates more precision to the dark areas, where humans are good at picking up on color banding). However, cheap LCD screens convert that signal down to just 18 bits (6 bits / 64 levels per channel).
They try to hide this flaw by dynamically ramping the brightness/contrast settings up and down, so that, for example, if you're looking at an image that only contains color values from 0-64 or 128-192, the monitor can display it perfectly. The problem is that with very vivid / high-contrast images these hacks don't work, and the monitor shows its true colors.
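A minimal sketch of the truncation described above, assuming the panel simply drops the two low bits of each 8-bit channel (real panels vary in how they do this):

```python
def to_6bit(channel_8bit):
    """Drop the two low bits, as a 6-bit LCD panel effectively does,
    then rescale back to the 0-255 range for comparison."""
    six = channel_8bit >> 2          # 256 levels -> 64 levels
    return (six << 2) | (six >> 4)   # rescale so 63 maps back to 255

# Neighbouring 8-bit values that differ only in the low two bits
# become identical on the panel -- that's where the extra banding
# in smooth gradients comes from.
displayed = sorted({to_6bit(v) for v in range(256)})
print(len(displayed))  # 64 distinct levels instead of 256
```

Each surviving level has to cover four adjacent input levels, so every band in a gradient becomes roughly four times thicker.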

Edited by Hodgman, 09 September 2012 - 11:34 PM.


#4 Bacterius   Crossbones+   -  Reputation: 8352


Posted 10 September 2012 - 01:02 AM

Also, some monitors have different color curves (for instance, mine has "Standard", "Gaming", "Scenery", "Theater", and "Night View" modes), which may alleviate the problem... but in general, yeah, 48/64-bit color would be the ultimate fix - I doubt anyone would perceive disturbing gradients at that precision.



#5 Lightness1024   Members   -  Reputation: 704


Posted 10 September 2012 - 04:28 AM

It's a problem due to liquid crystals, which can't be oriented very precisely. CRTs had true 24-bit display, and there was no banding. MVA LCDs are better for viewing angles, and IPS panels are better for colors. TV screens are mostly IPS; computer monitors are mostly TN, which is the poorest of all LCD technologies - the only advantage is speed.

#6 swiftcoder   Senior Moderators   -  Reputation: 9773


Posted 10 September 2012 - 05:27 AM

computer monitors are TN, which is the poorest of all LCD technologies; the only advantage is speed

Don't forget price. TN panels are hideously cheap.

But if banding is an issue for you, cheaper IPS panels can be had from about $150 these days, with some very decent ones coming in around the $400 mark. Even though the cheaper IPS displays are also 6-bit (+2-bit dithering) panels, they seem to suffer much less from banding and other colour artefacts than cheap TN panels do.

And if you happen to have money to burn, $1,000+ will net you a true 10-bit panel, which (at least in theory) makes banding a thing of the past.

Tristam MacDonald - Software Engineer @Amazon - [swiftcoding]


#7 Hodgman   Moderators   -  Reputation: 28747


Posted 10 September 2012 - 05:35 AM

There's always some amount of banding - the colours are quantized to less precision than your eye can distinguish, after all.
E.g. in this image here, I've painted a green gradient from ~8 to ~30 over a large area, which means that each discrete level of green covers quite a thick horizontal band.
On any monitor, you should be able to see the different bands of green as horizontal lines -- also, your brain should trick you with the 'Mach bands' optical illusion, where each constant band appears to have a gradient inside it, lighter at the bottom of the band and darker at the top.
N.B. I've dithered the right-hand side of the image -- so the left side should show bands, while the right side should have no bands, but instead look noisy.
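A sketch of how an image like the one described could be generated (the original image isn't reproduced here): the left half is hard-quantized, the right half is randomly dithered, and the result is written as a plain-text PGM. The 8-to-30 gradient range follows the post; the file name and dimensions are illustrative.

```python
import random

W, H = 256, 128
rows = []
for y in range(H):
    # A shallow gradient, roughly 8..30 as in the post, so each
    # discrete level covers a thick horizontal band.
    target = 8 + 22 * y / (H - 1)
    row = []
    for x in range(W):
        if x < W // 2:
            row.append(round(target))  # left half: hard bands
        else:
            # right half: +-0.5 noise before rounding (dithered)
            row.append(round(target + random.random() - 0.5))
    rows.append(row)

# Write a plain-text (ASCII) PGM, viewable in most image viewers.
with open("bands.pgm", "w") as f:
    f.write(f"P2\n{W} {H}\n255\n")
    for row in rows:
        f.write(" ".join(map(str, row)) + "\n")
```

Opening the file should show the effect described: clean horizontal bands (and the Mach-band illusion) on the left, noise with no visible banding on the right.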

Edited by Hodgman, 10 September 2012 - 05:38 AM.


#8 Krypt0n   Crossbones+   -  Reputation: 2361


Posted 10 September 2012 - 08:20 AM

And if you happen to have money to burn, $1,000+ will net you a true 10-bit panel, which (at least in theory) makes banding a thing of the past.

Can you recommend one? I was looking for them, but couldn't really find a good one (except one directly from Dolby, which is way more expensive).
I'd love to disable tone mapping and output HDR directly to my display :) - with dithering at 10 bits/channel, the quality should be really superb.

#9 Hodgman   Moderators   -  Reputation: 28747


Posted 10 September 2012 - 08:34 AM

I'd love to disable tone mapping and output HDR directly to my display

N.B. you still need tone mapping for a 10-bit display, as it's only 1024 levels, whereas normal human vision covers a massive range, from 1/1,000,000th of a candle up to 1,000,000 candles.
From 1 millionth to 1/100th of a candle is "night vision", which you have to simulate with a tone-mapper (like this one), as it won't actually kick in unless you're in a completely black room -- it takes a while for your eyes to switch into this mode, but once they do, your rod cells can fire with as little input as a single photon. It's very sensitive, though you lose all colour perception and only see 'blue' light.
So even on a theoretical true-HDR float-32 display, low-light night scenes would have to be tone-mapped, or they'd just look black in regular viewing conditions.

From 1/100th of a candle to 3 candles is regular dimly-lit vision. 3 candles up to 1M candles is regular daytime vision, which is far too large a range to display without tone mapping.
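To make the compression concrete, here's a hedged sketch using a simple global Reinhard-style curve (one of many possible operators, not necessarily what any particular game uses) to squash a huge luminance range into 10-bit output. The unit labels and the white point are illustrative.

```python
def reinhard(lum, white=1e6):
    """Simple extended-Reinhard operator: compresses scene luminance
    (in arbitrary 'candle-like' units) into the 0..1 display range,
    with `white` as the luminance that maps to pure white."""
    return lum * (1 + lum / white**2) / (1 + lum)

def to_10bit(display):
    """Quantize a 0..1 display value to one of a panel's 1024 levels."""
    return round(display * 1023)

# A 10-bit panel has only 1024 output levels, so a scene spanning
# roughly 1e-6 .. 1e6 'candles' must be compressed before quantization.
for lum in (1e-6, 0.01, 3.0, 1e3, 1e6):
    print(lum, "->", to_10bit(reinhard(lum)))
```

This illustrates Hodgman's point: without tone mapping (or with a curve like this applied naively), the entire night-vision range below ~1/100th of a candle collapses to output level 0 and simply looks black.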

Edited by Hodgman, 10 September 2012 - 08:41 AM.


#10 Krypt0n   Crossbones+   -  Reputation: 2361


Posted 10 September 2012 - 08:45 AM


I'd love to disable tone mapping and output HDR directly to my display

N.B. you still need tone-mapping for a 10-bit display as it's only 1024 levels, whereas normal human vision is a massive range from 1/1,000,000th of a candle up to 1,000,000 candles.

While you can sense a massive range, you can't do it all at the same time - your eyes adapt to just a sub-band of that range. If I can use a range of 10 bits, I hope to have enough "brightness" to play with to blind you, and yet still render darker areas (with dithering, but without tone mapping) so that your eyes adapt to them after a few seconds.
I don't plan to show you the night sky and the sun at the same time, so I don't plan to tone-map that massive range down to only 1024 levels. (I admit, I was actually looking for a 12-bit HDR display, but I guess 10 bits will still be better than the usual 6 bits.)

I previously ran some tests with the RAW picture mode of my DSLR, and 12 bits, while not perfectly covering the whole range, can capture quite a good range without massive banding.



