
# Looking for HDR Clarification


14 replies to this topic

### #1 nfactorial - Members - Reputation: 583

Posted 20 January 2013 - 04:38 PM

Hiya,

In my spare time I'm working on a game engine and an associated editing tool as a self-improvement exercise. I'm currently attempting to implement HDR rendering, and I have it working up to a point, but I think I am missing some minor detail. My engine uses deferred rendering, which is working fine, and I added HDR support over the last evening. This is the first time I've looked at either deferred rendering or HDR, so it's most likely my own fault or misunderstanding.

So, I'm rendering a simple scene which consists of some terrain, a sky and a couple of other things. After I've combined my lighting in my deferred shader and added the transparent objects (into a R16G16B16A16 floating point target) I then compute the luminance for the scene and finally tonemap the render target onto the back buffer.
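As a side note on the luminance pass: the post doesn't say which coefficients the engine uses, but a common choice is the Rec. 709 weights. A minimal scalar sketch in Python (the weights here are an assumption, not taken from the post):

```python
def luminance(r, g, b):
    # Rec. 709 luminance weights (an assumption; the post does not say
    # which coefficients the engine's luminance pass uses).
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```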

My issue is that the HDR tonemap process is always lifting the image's brightness so that it always looks like it's the middle of the day. For example, if I reduce the sun's brightness to a very low value, the scene should be dark, because there isn't much light. However, the HDR tonemapper determines that the scene is very dark and lifts the brightness until the scene becomes too bright.

To show what I mean, here is an image showing the scene in low lighting conditions with HDR disabled:

This next image shows the same scene, with the same lighting, with HDR enabled:

Now, I understand what HDR is doing and that the luminance is low so, therefore, the HDR tonemap process tries to fit the image into the correct range, raising the brightness. My issue is that, I *want* the image to remain dark as the lighting *is* dark. The situation deteriorates if I adjust the camera to look down, as the sky is artificially raising the average luminance in the previous image, so when that disappears off the screen the HDR thinks the scene is even darker, and thus brightens it even further (screenshot below, with the *same* lighting conditions).

For example, if it is the middle of the night I do not want the tonemap to adjust the scene's brightness so much that it looks like it's the middle of the day!

A thought I had is that, perhaps, there should be another luminance value that I can specify that tells the system 'This is the luminance I'm expecting' and I have a max( expectedLuminance, averageLuminance) in the shader, but that seems to go against the grain of what HDR is supposed to achieve. So, I think I'm missing a minor detail in the theory and wondered if anyone could point me in the right direction.

Thanks,

n!


### #2 Hodgman - Moderators - Reputation: 22471

Posted 20 January 2013 - 06:17 PM

The same thing would happen with a real camera - if you use an arbitrarily long exposure at night, it will look like day. In real life though, as you increase your exposure time, you're also reducing your frames per second and increasing motion blur, so when filming there is a practical limit to how long you can expose a shot for.

You can think of your auto-exposure code not as graphics code, but as AI code for a cameraman ;) Putting some arbitrary constraints on an AI so that it makes seemingly intelligent decisions is fine.

### #3 nfactorial - Members - Reputation: 583

Posted 20 January 2013 - 06:43 PM

Ahhh, so nothing in particular I've fluffed up then. I'm using the tone map from a DX sample based on Reinhard:

color.rgb *= MIDDLE_GRAY / ( luminance + 0.001f );
color.rgb *= ( 1.0f + vColor / LUM_WHITE );
color.rgb /= ( 1.0f + vColor );

I guess I should do a bit more research into that, then - into what the norm is for controlling it, and the alternatives. I wasn't sure if it was something I'd misunderstood earlier in the pipeline.
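To make the behaviour concrete for later readers, here is a scalar Python sketch of that operator (MIDDLE_GRAY and LUM_WHITE are arbitrary illustrative values, and vColor is folded into color, since they turn out to be the same variable later in the thread). The exposure step divides by the average luminance, which is exactly what lifts dark scenes:

```python
MIDDLE_GRAY = 0.72  # illustrative value, not from the post
LUM_WHITE = 1.5     # illustrative value, not from the post

def tonemap(color, avg_luminance):
    # Exposure step: dividing by the average luminance is what lifts
    # dark scenes -- the lower the average, the larger the multiplier.
    c = color * MIDDLE_GRAY / (avg_luminance + 0.001)
    # Reinhard-style curve with a white point, as in the DX sample.
    return c * (1.0 + c / LUM_WHITE) / (1.0 + c)

# The same pixel value comes out much brighter in a dark scene:
night = tonemap(0.05, avg_luminance=0.05)
day = tonemap(0.05, avg_luminance=0.5)
```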

Thanks for the reply!

n!

### #4 larspensjo - Members - Reputation: 1517

Posted 21 January 2013 - 02:13 AM

Are you using the average luminance of the scene?

It may not be what you want, and it can also add some shader cost. That is, as you turn around, the same part of the scene will be displayed with varying brightness.

In my case, I use a constant (but calibratable) white point value. You can see some of my experiences in the HDR section in http://ephenationopengl.blogspot.de/2012/10/deferred-shader-part-2.html.

Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/

### #5 nfactorial - Members - Reputation: 583

Posted 21 January 2013 - 03:27 AM

Hey,

Yeah, I'm calculating the average luminance of the scene and using that in the Reinhard calculation. I do want to swap to a filmic tonemap operator eventually, but I figured there isn't much point switching while it isn't working as desired with Reinhard. One step at a time, as they say.

My problem with a fixed constant, unless I'm misunderstanding, is that there wouldn't be any auto-eye adjustment present. I did consider having various regions in the scene where the constant could be altered, but that just 'feels' wrong in my head, and none of the presentations I've looked at (various ones from DICE, Crytek, Naughty Dog etc) mention anything similar in their engine. For example, if I set up a really bright area in my scene and move into it, the rest of the scene gets darkened while the HDR adjusts to display the bright area and this works perfectly as I expected. However, when moving into a dark area it gets brightened way too much such that it no longer looks like a dark area at all.

I am considering calculating the min, max and average luminance, and using those values to control the tonemapping, or having the tonemap adjust based on how dark the average luminance is. For example, rather than remap a low-luminance scene to [0...1], it might remap it to [0...0.5] or something. I guess I'll play with it more this evening after work.

Thanks!

n!

### #6 Bacterius - Crossbones+ - Reputation: 6614

Posted 21 January 2013 - 05:29 AM

I think the idea is to limit the amount of tonemapping to account for the inability of the eye to compensate when it's just too dark. That way, when you go into a dark room in your engine, the engine will want to raise everything by a factor of, say, X, but your eye may only increase perceived brightness by Y < X, and thus the scene will still appear somewhat dark. There should be some formulation of this around the internet, but I can't find it.

I agree with Hodgman, it should be AI code and not graphics code.

Edited by Bacterius, 21 January 2013 - 05:30 AM.

"The best comment is a deleted comment."

### #7 nfactorial - Members - Reputation: 583

Posted 21 January 2013 - 12:28 PM

With more thought throughout the day, I think that the MIDDLE_GRAY value used in the calculation shouldn't be a constant, but should instead be a variable based on the calculated average luminance. Not a simple linear mapping but, perhaps, a [logarithmic] curve of some description. So, in low lighting conditions MIDDLE_GRAY is also low, though higher than the average luminance (narrowing to zero as the luminance approaches zero), and it escalates with the luminance. I'll need to play with that for a bit, but that's how it's starting to make sense in my head (whether I'm wrong or right, I guess I'll find out ;)).
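For what it's worth, a published curve along exactly these lines is the automatic key estimate from Krawczyk et al. 2005, where the key grows logarithmically with the average luminance. A sketch (assuming I have the formula right):

```python
import math

def auto_key(avg_luminance):
    # Automatic key ("middle grey") as a function of average luminance,
    # after Krawczyk et al. 2005: dark scenes get a low key and so stay
    # dark after tone mapping, instead of being lifted to middle grey.
    return 1.03 - 2.0 / (2.0 + math.log10(avg_luminance + 1.0))
```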

Thanks for the replies so far everyone!

n!

### #8 Hodgman - Moderators - Reputation: 22471

Posted 21 January 2013 - 06:57 PM

In my last game, we just put lum = min(a, max(b, lum)) before the tone-mapper, so that the average lum has limits. If it's 0, then it instead gets clamped at b, and if it's 1000000 it gets clamped at a. The artists could then play with a/b to find decent limits so that dark caves didn't end up looking like daylight.

I'm using the tone map from a DX sample based on Reinhard:

color.rgb *= MIDDLE_GRAY / ( luminance + 0.001f );
color.rgb *= ( 1.0f + vColor / LUM_WHITE );
color.rgb /= ( 1.0f + vColor );

I'm a bit confused with your code; you seem to have two colour inputs: color and vColor? What are they?

Despite my lack of understanding, we can probably break this out into two lines of math. The first is your exposure, and the second is the tonemapper. I find that trying to think about these two ideas separately makes it easier for me, but YMMV:
C' = C * Grey / Lum
final = C' * ( 1 + V / White ) / ( 1 + V );

I'm calculating the average luminance of the scene and using that in the reinhard calculation.

Are you averaging the luminance values, or averaging log(luminance) values? The latter is known as the geometric mean (as opposed to the regular, arithmetic mean), and is usually used in these calculations.
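A minimal Python sketch of the log-average (the small delta offset is an assumption, added to avoid log(0) on black pixels):

```python
import math

def log_average_luminance(lums, delta=1e-4):
    # Geometric mean via the log-average: exp(mean(log(delta + L))).
    # A handful of very bright pixels skews this far less than the
    # plain arithmetic mean does.
    return math.exp(sum(math.log(delta + L) for L in lums) / len(lums))
```

For example, for pixels with luminances 1 and 100 the arithmetic mean is 50.5, while the log-average is roughly 10, so one bright sky pixel doesn't dominate the exposure.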

With more thought throughout the day, I think that the MIDDLE_GRAY value used in the calculation shouldn't be a constant, but should instead be a variable based on the calculated average luminance.

In photography, that "MIDDLE_GRAY" value is sometimes called the "key", or a zone in the Ansel Adams system, or even "mood" because speaking artistically, it gives the final tone-mapped image a different mood, as to whether it's a day or night shot, etc... Technically, I think it's supposed to be the value that an 18% diffuse grey ball would have if it were present in your shot.

Edited by Hodgman, 21 January 2013 - 07:04 PM.

### #9 nfactorial - Members - Reputation: 583

Posted 22 January 2013 - 02:50 AM

I'm a bit confused with your code; you seem to have two colour inputs: color and vColor? What are they?

Yeah, that was my bad, sorry. I copied the code and thought it would be easier without the v prefix, so I changed it to 'color' inside the post, and forgot to change the others; it's all the same variable, 'color'.

Are you averaging the luminance values, or averaging log(luminance) values? The latter is known as the geometric mean (as opposed to the regular, arithmetic mean), and is usually used in these calculations.

The posted images are using averaged luminance values, but I also tried calculating with the log(luminance) values and didn't get much difference in the output images.

In photography, that "MIDDLE_GRAY" value is sometimes called the "key", or a zone in the Ansel Adams system, or even "mood" because speaking artistically, it gives the final tone-mapped image a different mood, as to whether it's a day or night shot, etc... Technically, I think it's supposed to be the value that an 18% diffuse grey ball would have if it were present in your shot.

And yup - reading some articles around the net (http://mynameismjp.wordpress.com/2010/04/30/a-closer-look-at-tone-mapping/ for example), I noticed one talked about "where the geometric mean (log average) of scene luminance is calculated and used to scale the luminance of each pixel. With this approach a 'key value' is user-controlled, and is meant to be chosen based on whether the scene is 'high-key' (bright, low contrast) or 'low-key' (dark, high contrast)", which leads me to think that the key value (MIDDLE_GRAY, as you say) is not a constant for all scenes. If I change the MIDDLE_GRAY value to a low value (such as 0.2 or 0.3), my dark scene remains dark, but then my bright areas do not adjust.

Though I am maybe misunderstanding the point. However, just to stress: the HDR looks like it's working, it's just working too well, making my dark images too bright. It works great when I move into really bright areas, which also leads me to think the key value cannot be a constant across all scenes. For example, running Crysis 2 or another game with HDR rendering, the scene luminance does not change noticeably whilst I look around. Though I realise it'll be less drastic there, as they will adjust the luminance over time, whereas it's currently instant in my editor (I can add time to the adjustment once I'm happy).

I didn't get to play with my HDR last night, due to friends interrupting my life (curse them!!!), so I'll make another attempt tonight.

Thanks again for the reply!

n!

### #10 nfactorial - Members - Reputation: 583

Posted 22 January 2013 - 03:22 AM

In my last game, we just put lum = min(a, max(b, lum)) before the tone-mapper, so that the average lum has limits. If it's 0, then it instead gets clamped at b, and if it's 1000000 it gets clamped at a. The artists could then play with a/b to find decent limits so that dark caves didn't end up looking like daylight.

Sorry, I was supposed to quote this bit in the previous reply. This is pretty close to what I came up with yesterday, but I was considering something more like lum = computeLuminance( lum ), where computeLuminance would look like:

float computeLuminance( float luminance )
{
    return min( a, max( b, luminance ) );
}

Your version uses a simple clamp, whereas I was considering more of a curve computation instead. That's where I got to with my thinking yesterday, anyway.

Thanks again,

n!

### #11 nfactorial - Members - Reputation: 583

Posted 22 January 2013 - 03:55 AM

Ugh, I had luminance on my mind too much in that post.

The correct name for the example function would be "float computeKey( float luminance )". So, I guess it is different to what you were suggesting.

Thanks,

n!

### #12 larspensjo - Members - Reputation: 1517

Posted 22 January 2013 - 05:46 AM

color.rgb *= ( 1.0f + vColor / LUM_WHITE );
color.rgb /= ( 1.0f + vColor );

The modified Reinhard equation, equation 4, uses the square of the white point, doesn't it? That is, the first factor should be ( 1.0f + vColor / (LUM_WHITE * LUM_WHITE) )?

Or am I missing something?
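For reference, a scalar sketch of that modified operator (equation 4 in Reinhard et al. 2002) with the white point squared; a nice property to test against is that an input at the white point maps to exactly 1.0:

```python
def reinhard_modified(L, L_white):
    # Modified Reinhard, eq. 4: Ld = L * (1 + L / L_white^2) / (1 + L).
    # Luminances at or above L_white burn out to >= 1.0.
    return L * (1.0 + L / (L_white * L_white)) / (1.0 + L)
```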

Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/

### #13 nfactorial - Members - Reputation: 583

Posted 22 January 2013 - 07:10 AM

Ahh, that pdf is fantastic and looks like it discusses everything I'm having issues with, thanks! Not sure how I didn't find it with all the Reinhard googling I did over the weekend; I found a completely different tonemap.pdf that wasn't as helpful. Very useful, thanks - I will sit down and look through it thoroughly this evening.

I think you're correct, to some degree; I can't test it too much in my lunch hour, but changing a few things makes some parts behave better and other parts behave worse. That could be down to me just plugging it in without much thought, though. The document certainly looks like it'll be very useful when I get the time later.

Thanks again,

n!

### #14 nfactorial - Members - Reputation: 583

Posted 22 January 2013 - 03:12 PM

Yes, that pdf has been very useful so far, thank you. I've managed to get my scene behaving at least in the way I'm expecting; it's not perfect yet, but that's mostly playing with numbers. I've switched, for now, to using the rather simple Ld(x,y) = L(x,y) / (1 + L(x,y)), and I'll work my way through everything with some more tests, so that it's all clear in my head. But my scenes now remain dark when it's dark, and bright when it's bright.
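That simple operator, as a scalar Python sketch:

```python
def reinhard_simple(L):
    # Ld(x, y) = L(x, y) / (1 + L(x, y)): maps [0, inf) into [0, 1),
    # and leaves dark values almost unchanged (Ld ~= L for small L),
    # which is why dark scenes now stay dark.
    return L / (1.0 + L)
```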

A couple of sample screenshots with the modified tonemap operation (they're not so pretty unfortunately, I'm no artist); I've removed the water, as its shader needs to be updated to handle HDR lighting.

This shot shows a scene with a bright sunlight:

And this screenshot shows the same scene with low lighting conditions:

As you can see, the dark lighting now remains dark. I'm still not sure what is wrong with the original tonemap I copied from the DX SDK, but at least I understand what this one is doing, and it makes sense to me.

Thanks again,

n!

### #15 MJP - Moderators - Reputation: 8525

Posted 22 January 2013 - 07:24 PM

At work we just let the artists pick the value for different scenes, and they just use it as a parameter to the auto-exposure system. I've seen some algorithms for automatically estimating a key value based on a luminance histogram, but in my experience generating a histogram with the required number of buckets just wasn't worth the cost. We also allow the artists to specify min and max exposure values in order to limit the auto-exposure to a certain range, which is functionally similar to what Hodgman described. However we expose all of the exposure values in log2 space so that they're more intuitive.
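A sketch of that last idea (the names here are hypothetical): clamping the auto-exposure in log2 ("stops") space means the artist limits are intuitive, since +1 EV doubles the brightness:

```python
def clamped_exposure(auto_ev, min_ev, max_ev):
    # Clamp the auto-exposure value in log2 ("stops") space, then
    # convert to a linear multiplier: +1 EV doubles scene brightness.
    ev = max(min_ev, min(max_ev, auto_ev))
    return 2.0 ** ev
```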

