# Question about tone mapping / HDR


## Recommended Posts

I've been trying to implement HDR but have been having some trouble figuring out the correct approach.  I have bloom working, but not tone mapping.  I am using DirectX 11.

I have read several sources but they each say different things.  I currently have a shader that takes the texture from the first pass and renders to the luminance texture, then finds the average luminance by generating all of the mip maps.  The final pass applies the luminance scaling (along with the bloom).

I am confused why other articles and samples compute the average luminance by downsampling the texture multiple times.  Why is it not enough to just call deviceContext::GenerateMips()?  I saw another article suggesting using the compute shader to do this.

I am also confused about what I should be storing as the luminance.  This article talks about finding the average log of the pixel color and some color weights: http://msdn.microsoft.com/en-us/library/windows/desktop/bb173486(v=vs.85).aspx.  But this post says nothing about logarithmic operations when finding the average scene luminance, yet does use the exponential operator when computing the final color: http://www.gamedev.net/topic/532960-hdr-lighting--bloom-effect/

This paper, linked from the Microsoft article, goes into more detail on the equations, but doesn't talk about how they relate to the shaders at all: http://www.cs.utah.edu/~reinhard/cdrom/tonemap.pdf

I tried looking at the samples in the deprecated DirectX SDK (the compute-shader HDR sample).  That version just averages the color of the pixels directly, weights them by (.299, .587, .114, 0) to get the average luminance, then scales the color of the final pixel linearly.  Running the sample looks decent, but when I run the same approach in my program, because blue is weighted so low, looking at the sky makes everything else extremely bright.  Weighting each RGB channel equally just makes the final image darker in most cases and doesn't produce any interesting effect.

Which formula should I be using to get the average luminance, and which formula should I be using to convert the unmapped pixel color to the mapped pixel color given the average luminance?

##### Share on other sites

"I am confused why other articles and samples compute the average luminance by downsampling the texture multiple times.  Why is it not enough to just call deviceContext::GenerateMips()?  I saw another article suggesting using the compute shader to do this." - I suspect this amounts to the same thing, but you don't know for sure how GenerateMips is implemented.  It's probably going to use the GPU and downsample the texture multiple times in exactly the same way, but if it happens to do the work on the CPU then that could be a massive performance cost.  I suspect you're fine to use GenerateMips if you find it's working for you (although perhaps you should check that it behaves how you expect on most devices), but it makes sense that articles and papers are more explicit about the required process.

"I am also confused what I should be storing as the luminance.  This article talks about finding the average log of the pixel color and some color weights http://msdn.microsoft.com/en-us/library/windows/desktop/bb173486(v=vs.85).aspx, but this post says nothing of logarithmic operations when finding the average scene luminance http://www.gamedev.net/topic/532960-hdr-lighting--bloom-effect/, but does use the exponential operator when computing the final color." - I'm not sure, but I think this depends on how you're encoding the HDR.  Older implementations would do clever things to achieve HDR with 8888 render targets, such as encoding a logarithmic scaling factor in the alpha channel.  More modern implementations just use float or half-float render targets instead.  What you need to do depends on your implementation; most likely you're using half-float render targets and you just need a simple average.

"I tried looking at the samples in the deprecated directx SDK (the compute shader HDR sample), and that version just averages the color of the pixels directly and weights them by (.299, .587, .114, 0) to get the average luminance, then scales the color of the final pixel linearly.  Running the sample looks decent, but when I run it in my program, because blue is weighted so low, looking at the sky makes everything else extremely bright.  Weighting each RGB value equally just makes the final image darker in most cases and doesn't produce any interesting effect." - I think it's correct that blue should have a lower weight when converting RGB to luminance.  Perhaps your sky just isn't correctly calibrated against the rest of your scene.  Probably most of your scene is dynamically lit by your HDR system while your sky is just a pre-baked 8888 skybox texture; if so, you'd need to make your sky a lot brighter.  You could do that by forcing up the values in the shader you use when rendering it, or by using a sky texture format that can contain HDR values (half-float or something) and getting your artist to tweak it.
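For reference, the weighting the SDK sample uses is the Rec. 601 luma approximation.  A minimal CPU-side sketch (the function name is ours):

```cpp
// Rec. 601 luma weights, as used by the SDK sample mentioned above.
// A pure-blue pixel contributes only 0.114 to the average luminance,
// while a white pixel contributes 1.0.
float luma601(float r, float g, float b) {
    return 0.299f * r + 0.587f * g + 0.114f * b;
}
```

With these weights a sky pixel of (0, 0, 1) measures only 0.114, so a frame dominated by sky reports a very low average luminance and the auto-exposure brightens everything else -- which matches the behaviour described above.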

##### Share on other sites

My sky is just a gradient with pure white (1, 1, 1) at the horizon, and pure blue (0, 0, 1) at the top.  When a large portion of the blue part is in view, the rest of the scene becomes too bright.  Should I be making the sky brighter than one?

Also, I still don't understand where the formula for exposure from the other post comes into play in relation to all of the other descriptions of HDR.

This one: `float4 exposed = 1.0 - pow( 2.71, -( vignette * unexposed * exposure ) );`
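For what it's worth, that expression looks like the exponential exposure curve 1 - e^(-exposure * x); the 2.71 is just an approximation of e.  A scalar sketch (names are ours, and we've assumed the vignette factor is folded into `exposure`):

```cpp
#include <cmath>

// The quoted curve, written with std::exp instead of pow(2.71, ...);
// 2.71 only approximates e = 2.71828...  This maps [0, inf) into
// [0, 1), so arbitrarily bright HDR inputs never clip.
float expose(float unexposed, float exposure) {
    return 1.0f - std::exp(-unexposed * exposure);
}
```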

##### Share on other sites

Hiya,

You should be using HDR values for your scene; if your colours are all in the range [0...1] then you don't need HDR tone mapping.

So, as an example, if your brightest pixel is 10,10,10 (instead of 1,1,1) then your blue would be (0,0,10) or higher.

n!
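To make that concrete, here's a minimal sketch of the global Reinhard operator from the paper linked earlier in the thread (function and parameter names are ours; 0.18 is the paper's "middle grey" key value):

```cpp
// Global Reinhard operator: scale by keyValue / averageLuminance,
// then compress with L / (1 + L), which always lands in [0, 1).
float reinhardTonemap(float luminance, float averageLuminance,
                      float keyValue = 0.18f) {
    float scaled = luminance * keyValue / averageLuminance;
    return scaled / (1.0f + scaled);
}
```

A pixel at the average luminance maps to roughly 0.15, and even a pixel 1000x brighter than average stays just below 1.0 instead of clipping.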

##### Share on other sites

Thanks for the response.  Your clarification on the formulas helps.

In addition to that, I think I'm actually having an issue generating (or sampling) the mip maps correctly.

Originally I was doing this in the final shader:

`float4 luminance = luminanceTexture.SampleLevel(g_samPoint, float2(.5f, .5f), 10); // luminanceTexture should have the average luminance at the lowest mip level`

But sampling at the 10th level returns zero in most cases -- it only seems to average the luminance values near the center of the texture.  I tried sampling at levels in between and it seems to be interpolating or something.  Rendering the luminance texture to the screen with the pixel's texture coordinates looks like this at different mip levels.

The brightness of the texture changes based on the position of the sphere on screen.  When I sample at the lowest level, the screen is correctly all one color, but it changes when I move the circle around.

Edited by Vexal
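One thing worth checking here (an assumption on our part, not something the thread confirms): a hard-coded mip level 10 is only the 1x1 level for a chain whose largest dimension is around 1024.  For a full chain the index of the 1x1 level is floor(log2(max(width, height))):

```cpp
#include <algorithm>

// Index of the 1x1 mip in a full chain for a width x height texture;
// equivalent to floor(log2(max(width, height))).
int lastMipLevel(int width, int height) {
    int level = 0;
    for (int s = std::max(width, height); s > 1; s /= 2)
        ++level;
    return level;
}
```

So `SampleLevel(..., 10)` reads the 1x1 level only at a 1024-class resolution; at other sizes it reads a larger mip, which could explain seeing partially averaged values.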

##### Share on other sites

I figured out that the mip mapping is not correctly averaging the pixels.  Looking at the mip levels in Visual Studio shows this.

The color of the final pixel changes based on the position of the objects on the screen.  The closer the objects are to the center, the brighter the final pixel.  In this example, the final pixel ends up being zero and I cannot figure out why.

Level 0:

Level 3:

Level 5:

Level 9:

And finally, level 10 is just a single black pixel.  I checked it with the color tool and it's (0, 0, 0).

##### Share on other sites

The brightness of the texture changes based on the position of the sphere on screen.  When I sample at the lowest level, the screen is correctly all one color, but it changes when I move the circle around.

This shouldn't happen if the mip-maps are being generated correctly...
Another issue could be the texture format used -- are you using an FP16, or other "higher than 8-bit channels" format?

Level 9:
And finally, level 10 is just a single black pixel.  I checked it with the color tool and it's 0,0,0

Ok, that shows it jumping from a 3px wide texture to a 1px wide texture. Depending on how they're generating the mip-maps, they might be failing to take the right-most pixel into account (just averaging the two left-most pixels). That's very bad :/

You can work around this by calculating the mip-levels yourself instead of using the automatic method. The performance cost will be the same (assuming you do it the same way as the automatic method does it), but your code will obviously be more complex, and you'll have full control over the calculations.

Edited by Hodgman
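A 1D sketch of that workaround, on the CPU for clarity (a real implementation would do this in a pixel or compute shader, once per axis).  The point is that when the source width is odd, the last destination pixel has to absorb the leftover source pixel instead of dropping it:

```cpp
#include <vector>
#include <cstddef>

// One 2:1 reduction of a row of luminance values that still counts the
// trailing pixel when the width is odd -- the suspected failure mode
// above, where a 3px level averaged only its two left-most pixels.
std::vector<float> downsampleRow(const std::vector<float>& src) {
    if (src.size() < 2) return src;
    std::size_t dstW = src.size() / 2;  // D3D-style: 3px -> 1px
    std::vector<float> dst(dstW);
    for (std::size_t i = 0; i < dstW; ++i) {
        float sum = src[2 * i] + src[2 * i + 1];
        float n = 2.0f;
        if (i == dstW - 1 && src.size() % 2 == 1) {
            sum += src[2 * i + 2];  // absorb the leftover odd pixel
            n = 3.0f;
        }
        dst[i] = sum / n;
    }
    return dst;
}
```

With this, a 3px row of {1, 2, 3} reduces to the true average 2.0 rather than 1.5.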

##### Share on other sites

Does your texture contain just luminance, or log(luminance)?

##### Share on other sites

Right now it just contains luminance.
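If it helps, the practical difference is that a plain average is dominated by a handful of very bright pixels, while the log-average from the Reinhard paper (a geometric mean) is much less sensitive to them.  A CPU sketch comparing the two (names and the delta epsilon are ours):

```cpp
#include <cmath>
#include <cstddef>

// Plain arithmetic mean of luminance values.
float arithmeticMean(const float* lum, std::size_t n) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i) sum += lum[i];
    return sum / static_cast<float>(n);
}

// Log-average (geometric mean) as in the Reinhard paper; delta avoids
// taking log(0) on pure-black pixels.
float logAverage(const float* lum, std::size_t n, float delta = 1e-4f) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i) sum += std::log(delta + lum[i]);
    return std::exp(sum / static_cast<float>(n));
}
```

For three dim pixels at 0.1 plus one at 100.0, the arithmetic mean is about 25 -- the whole frame would be exposed for that single bright pixel -- while the log-average is about 0.56.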
