question about HDR luminance calculation
Good evening,
I'm following the DirectX SDK HDR sample for doing some nice HDR tone mapping, and I noticed that to calculate luminance, it first calculates luminance for a scaled-down version of the scene, then averages all of those luminance values together to get the average luminance for the scene.
The equation they use to calculate luminance for a single pixel is:
lum = log(dot(colorValue, LUMINANCE_VECTOR) + 0.0001f)
I'm wondering if, instead of doing the luminance calculation and then averaging, I could average my color values first and then do a single luminance calculation on that average color, and still get a correct luminance for the scene?
That's not going to work, because the average of the logs is not the log of the average: log(a) + log(b) = log(a * b), not log(a + b). Averaging the log-luminances gives you the log of the geometric mean of the scene, while taking the luminance of the averaged color would give you the arithmetic mean. (The dot product itself is linear, so that part would commute with averaging; it's the log that breaks it.)
Ah that makes sense; oh well it was worth a shot :P
I was hoping for a quicker way to calculate luminance, but I guess there's no way to really avoid that log.
Yup, you have to convert to luminance first before averaging. But the upside of that is that the surfaces you use for downsampling will be smaller, since they only need one or two channels.
Also if you're interested in seeing the math for luminance calculation and tone-mapping, it all comes from this paper.
Just be aware that the luminance equation (eqn. 1) in that paper is printed incorrectly: the 1/N belongs inside the exponential. The correct log-average is:

Lw_avg = exp( (1/N) * sum( log(delta + Lw(x, y)) ) )