Arithmetic vs. geometric mean avg luminance during nighttime scenes


Greetings, all.

I was wondering if we could discuss this issue a bit. For the purposes of simple exposure control, it seems common to store the log of the luminance of each pixel in your luminance texture. That way, when you sample the 1x1 mip map level and exponentiate it, you end up with the geometric mean luminance of the scene. This is done to prevent small, bright lights from dimming the scene.
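In case it helps to see the math written out, here's a quick CPU-side Python sketch of that log-average (the small delta guard against log(0) is my own addition):

```python
import math

# Log-average ("geometric mean") luminance, computed on the CPU for
# clarity. On the GPU, this is the log() you write into the luminance
# texture and the exp() you apply after sampling the 1x1 mip.
def geometric_mean_luminance(luminances, delta=1e-4):
    # delta keeps log() finite on pure-black pixels
    log_sum = sum(math.log(delta + lum) for lum in luminances)
    return math.exp(log_sum / len(luminances))
```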

I find that this works really well, but perhaps a little too well. I am using a 16F texture, and so the brightest pixel I can have is 65504. If I have a really dark nighttime scene, such that things look barely visible without any lights, and then I point a bright spotlight at the player (just a disc with a 65504-unit emission), it hardly affects the exposure at all. I would expect a bright light to sort of blind the player a bit and ruin her dark adaptation, so that the rest of the scene looks really dark. I have found that the light needs to cover nearly 20% of the pixels on the screen before it begins to have this effect.

So I switched over to using an arithmetic mean (just got rid of the log/exp on each end), and now it works more like what I would expect.
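To illustrate with made-up numbers (a 10,000-pixel frame where 1% of the pixels are a spotlight disc pegged at the 16F max), here's roughly what the two means do:

```python
import math

# Hypothetical dark scene: 1% spotlight pixels at the half-float max,
# the rest very dark.
scene = [65504.0] * 100 + [0.01] * 9900

arith = sum(scene) / len(scene)
geo = math.exp(sum(math.log(1e-4 + x) for x in scene) / len(scene))

print(arith)  # ~655   -> the light dominates and crushes the exposure
print(geo)    # ~0.012 -> barely budges from the 0.01 base level
```

The geometric mean needs the light to cover a big fraction of the frame before it moves at all, which matches what I'm seeing.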

If you were in my shoes, would you switch to an arithmetic mean, or would you try to find exposure settings that will work better with a geometric mean?

FWIW, my exposure-control/calibration function is just hdrColor*key/avgLum, where key is an artist-chosen float value, and avgLum is the mean luminance (float). After that, I'm tone mapping with Hable's filmic curve. If you have any suggestions on how to improve it, that would be most helpful. I suppose I could also experiment with histograms and so forth, but I'm not sure if they're meant to solve this particular problem.
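For reference, here's the whole chain as a Python sketch. The constants A through F and the white point W are the ones Hable published for the Uncharted 2 curve, as best I remember them; treat the exact values as illustrative:

```python
# Hable/Uncharted 2 filmic curve constants (from his published posts).
A = 0.15  # shoulder strength
B = 0.50  # linear strength
C = 0.10  # linear angle
D = 0.20  # toe strength
E = 0.02  # toe numerator
F = 0.30  # toe denominator
W = 11.2  # linear white point

def hable(x):
    return ((x * (A * x + C * B) + D * E) /
            (x * (A * x + B) + D * F)) - E / F

def tone_map(hdr_channel, key, avg_lum):
    exposed = hdr_channel * key / avg_lum  # the calibration step above
    return hable(exposed) / hable(W)       # normalize so W maps to white
```

The division by hable(W) just normalizes the curve so an input at the white point comes out as 1.0.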


One bright pixel shouldn't blind the player; that doesn't happen in real life either. It's actually a problem: because it doesn't blind you, your eye doesn't try to adapt to it, but it can still burn into your retina (e.g., sparks from welding, or a solar eclipse, where tons of people look straight at it and get burnt retinas, every time).

But in games (and especially in movies) you usually don't tone map a raw frame with a few bright pixels; you do your post-processing first, and then you apply some bloom/glow. And while we fake the bloom out to some fixed radius (using limited kernels to save time), ideally you'd derive the radius from the camera settings and the brightness.

So if you had a pixel in the night at 65504, it would 'bleed' out to maybe 20% of the screen, and if you tone map that, it works out fine again. (You probably don't want just one bright pixel like that anyway, as it would end up as one hell of an aliasing artifact; an 8x8 block of pixels at that brightness bleeding out to 20% of all pixels doesn't sound that crazy to me. And if you apply motion blur, you extend the bounds of the bright pixel even further.)
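To put rough numbers on that, here's a toy Python calculation (the 10,000-pixel frame and the crude energy-conserving smear are my own assumptions, just to show the effect on the log-average):

```python
import math

def geo_mean(pixels, delta=1e-4):
    return math.exp(sum(math.log(delta + p) for p in pixels) / len(pixels))

# 10,000-pixel frame: an 8x8 (64-pixel) light at the 16F max, rest dark.
before = [65504.0] * 64 + [0.01] * 9936

# Crude energy-conserving bloom: the same total energy smeared
# evenly across 20% of the frame (2,000 pixels).
spread = 64 * 65504.0 / 2000
after = [spread + 0.01] * 2000 + [0.01] * 8000

print(geo_mean(before))  # ~0.011 -> the log-average barely sees the light
print(geo_mean(after))   # ~0.12  -> after bloom it reacts, ~10x brighter
```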

It's really not just about tone mapping; you need a lot of post-processing effects to get it looking right (I mean, to get it looking movie-like).

Sorry for the delay. I've been doing schoolwork the past couple of days and had to wait until I got a chance to get back to this.

I definitely do not want one or two bright pixels to blind the player. However (and maybe this just has to do with the values I'm choosing for my lights), this hasn't really been a problem for me. If I have a spotlight shining toward the player, and the spotlight is far enough away that it only takes up a 16x16 pixel area on the screen, it has almost no noticeable effect on the exposure value. It's only when I get close enough to the light that I would expect significant dimming that I actually see it occur.

Your advice about layering in multiple post-processing effects makes sense. Bloom is really important for indicating brightness.

