HDR gamma correction

3 comments, last by LeGreg 8 years, 6 months ago

Hi,

I understand that with texture fetches in the shader, we normally want to apply gamma correction so that we can do our lighting calculations in the fragment shader in linear space.

What I don't quite understand is, do we strictly have to do it for absolutely all textures/image formats?

Let's say I have a cubemap in .HDR format; isn't that already stored in linear space? Does that mean I do not have to apply gamma correction whenever I fetch from my cubemap?


I've never heard of .hdr format, and google searches don't seem to indicate it is any kind of GPU texture format. Can you provide more information?

HDR images are already linear, yes. The only images that need gamma correction are sRGB images (typically 8 bpc), which are your "normal" images in common image formats. Essentially you can just tell the API that the textures are sRGB and the GPU will handle the correction for you.
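For reference, the correction the hardware applies when you sample an sRGB texture is the standard piecewise sRGB-to-linear curve. A minimal Python sketch of that decode (just the math from the sRGB specification, not any particular API's code):

```python
def srgb_to_linear(c):
    """Decode one sRGB channel value in [0, 1] to linear light.

    This is the piecewise conversion the GPU performs automatically
    when a texture is created with an sRGB internal format.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-gray in sRGB (0.5) is roughly 0.214 in linear light.
print(round(srgb_to_linear(0.5), 3))  # 0.214
```

Note how nonlinear the mapping is: sRGB 0.5 lands at about 21% linear intensity, which is why filtering or lighting directly on sRGB values gives wrong results.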


I've never heard of .hdr format, and google searches don't seem to indicate it is any kind of GPU texture format. Can you provide more information?

I might not have worded it properly: I have 6 images in .hdr format for a cubemap; I didn't mean that .hdr is a cubemap format. You can create floating-point textures from .hdr images directly; it's one of the image formats that can store floating-point precision, unlike JPEG/PNG.
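As an aside, Radiance .hdr files get their extended range by storing each pixel as four bytes: 8-bit RGB mantissas plus a shared 8-bit exponent (RGBE). A minimal sketch of the decode, assuming the convention used by most common loaders:

```python
def rgbe_to_float(r, g, b, e):
    """Decode one RGBE pixel (Radiance .hdr) to linear floating-point RGB.

    The three mantissa bytes share one exponent byte, biased by 128,
    with an extra factor of 1/256 to normalize the 8-bit mantissas.
    """
    if e == 0:
        return (0.0, 0.0, 0.0)  # special case: exponent 0 means black
    scale = 2.0 ** (e - 128 - 8)
    return (r * scale, g * scale, b * scale)

# Mantissa 128 with exponent 129 decodes to 1.0 per channel.
print(rgbe_to_float(128, 128, 128, 129))  # (1.0, 1.0, 1.0)
```

The decoded values are linear light, which is why no gamma-style decode step is needed when you upload them into a floating-point texture.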

HDR images are already linear, yes. The only images that need gamma correction are sRGB images (typically 8 bpc), which are your "normal" images in common image formats. Essentially you can just tell the API that the textures are sRGB and the GPU will handle the correction for you.

Thanks, I thought that was the case.
Additional note,


People have pointed out that if your data is in a non-linear sRGB format, filtering it directly is going to be incorrect (resulting in more aliasing or odd differences in brightness). On the other hand, doing an sRGB-to-linear conversion before filtering does NOT give you a correct result either, especially when a high dynamic range is involved.
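A quick way to see the first half of that: average one black and one white texel, the way a bilinear tap halfway between them would. A small Python sketch using the standard sRGB conversion formulas:

```python
def srgb_to_linear(c):
    # Standard piecewise sRGB decode.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Standard piecewise sRGB encode (inverse of the above).
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Filtering the stored sRGB values directly:
avg_srgb = (black + white) / 2

# Decoding to linear first, filtering, then re-encoding for display:
avg_linear = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(avg_srgb, 3), round(avg_linear, 3))  # 0.5 0.735
```

The two answers differ substantially (0.5 vs roughly 0.735 in display space), which is exactly the brightness shift you get when the hardware filters raw sRGB texels.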


Arguably you can't get a correct (perceptual space) result with pre-filtering. Pre-filtering happens in two places: first when computing the mipmaps (which may be done offline), and second when doing the bilinear taps/anisotropic filtering. Both steps assume (wrongly, in the case of HDR content) that the filtering and shading steps can be done in either order.


The only time they are (strictly) commutative is when the shading step is a linear operation. Tonemapping alone is not linear, except approximately over small ranges. For example, with a sigmoid function (if your tonemapping curve looks like a sigmoid), there's a narrow middle band where things look almost linear/affine, but near zero and at higher intensities the curve flattens out, and we lose commutativity for any range of values that reaches those regions.
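To make the non-commutativity concrete, here's a small Python sketch using the simple Reinhard curve x/(1+x) as a stand-in sigmoid-shaped tonemapper (chosen for illustration; your tonemapper may differ):

```python
def tonemap(x):
    # Reinhard operator: nearly linear close to 0, flattens for large x.
    return x / (1.0 + x)

# Two HDR texel values with a large intensity gap.
a, b = 0.1, 10.0

# Order 1: filter first (as the hardware does), then shade/tonemap.
filter_then_shade = tonemap((a + b) / 2)

# Order 2: shade/tonemap each texel, then filter (the "correct"
# perceptual-space result that pre-filtering cannot produce).
shade_then_filter = (tonemap(a) + tonemap(b)) / 2

print(round(filter_then_shade, 3), round(shade_then_filter, 3))  # 0.835 0.5
```

With a 100:1 intensity gap, the two orders disagree wildly (about 0.835 vs 0.5); with two values close together in the sigmoid's middle band they would nearly agree, which is exactly the small-range/affine case described above.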


What can you do? Very little, actually (you can live with it). Unless you forgo hardware filtering completely, you have to rely on this pre-filtering no matter what. Doing it in "linear" (lighting) space is slightly less wrong than doing it in sRGB space, but not by much.


You could, in theory, do supersampling (doing all the operations at a higher resolution, then downsampling) so that the textures are filtered in perceptual space, at least a little. But that's usually considered too expensive to do by default. You could also ask your artists to make textures that are very flat and don't have much contrast (at all levels of minification); that way, no matter what calculations your shaders and tonemapping phase perform, the final values will be close together, and the function that transforms one to the other can be approximated by a linear function. That is of course very limiting and may not fit your content at all. You could also, in theory, use a wider support for your filtering function (at the cost of extra blurriness).


(This problem also affects other types of calculations that are not linear or affine, such as lighting calculations from normal maps, or almost anything else we do in shaders these days.)


The end result is not necessarily going to be terribly wrong, of course. But you may end up with more aliasing and artifacts than you would have liked.

