HDR programming

7 comments, last by Hodgman 5 years, 7 months ago

I got a book called "High Dynamic Range Imaging" (published by Elsevier) and read some pages. I learned that HDR requires deep color (10- or 12-bit color depth).

I heard that many vendors and independent developers have implemented HDR features in their games and applications because the UHD standard requires deep color depth (HDR) to eliminate noticeable color banding. I have seen color banding on SDR displays and want to get rid of it.

I googled HDR for OpenGL but read that it requires Quadro or FireGL cards. How do people get HDR working on consumer video cards?

That's why I want to implement HDR in my OpenGL programs.

Tim


No, you don't need those expensive cards to do this; consumer cards have supported the color formats needed for HDR for a long time.

 

Basically, you render your scene color into an 11/16/32-bit-per-channel buffer instead of the usual 8-bit one, and you make sure that you don't clamp your colors to 1.0.

During the post-processing stage, you tonemap that buffer and get a result that is displayable on LDR displays.

 

HDR rendering is also useful for effects such as bloom and light adaptation.
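In OpenGL terms, a minimal sketch of that setup looks something like this (just an illustration, assuming a GL 3.x+ context and loader are already initialized; function and variable names here are made up):

```c
#include <GL/glew.h>  /* or whatever GL loader you already use */

/* Create a floating-point color target for the scene pass (sketch).
 * GL_RGBA16F (or GL_R11F_G11F_B10F / GL_RGBA32F) can store values above 1.0. */
GLuint create_hdr_framebuffer(int width, int height, GLuint *out_color_tex)
{
    GLuint hdrFBO, hdrColorTex;
    glGenFramebuffers(1, &hdrFBO);
    glGenTextures(1, &hdrColorTex);

    glBindTexture(GL_TEXTURE_2D, hdrColorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
                 GL_RGBA, GL_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, hdrColorTex, 0);
    /* ...attach a depth buffer and check for GL_FRAMEBUFFER_COMPLETE here... */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    *out_color_tex = hdrColorTex;
    return hdrFBO;
}

/* Each frame: render the scene into this FBO without clamping, then bind the
 * default framebuffer and draw a full-screen quad that samples the color
 * texture and tonemaps it down to a displayable range. */
```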

2 hours ago, Sword7 said:

I googled HDR for OpenGL but read that it requires Quadro or FireGL cards.

Unfortunately a lot of information on the internet is woefully out of date.

For example, Valve's original HDR implementation was for their Lost Coast tech demo, and that ran on consumer cards over a decade ago.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

56 minutes ago, swiftcoder said:

Unfortunately a lot of information on the internet is woefully out of date.

For example, Valve's original HDR implementation was for their Lost Coast tech demo, and that ran on consumer cards over a decade ago.

OK, I got it. Thanks for the information. But that old HDR implementation still tonemaps down to LDR output (which causes color banding); I reviewed screenshots of it and noticed some banding. What about HDR10 and Dolby Vision (10/12-bit color output), which the UHD standard requires? There are HDR10 monitors on the market now, and they offer a wider color range that reduces color banding.

So there are two separate (but related) topics here: HDR rendering, and HDR output for displays. Depending on your exact Google queries, you might find information about one or both of them.

HDR rendering has been popular in games ever since the last generation of consoles (PS3/XB360) came out. The basic idea there is to perform lighting and shading calculations internally using values that can be outside the [0, 1] range, which is most easily done using floating-point values. Performing lighting without floats seems silly now, but historically GPUs did a lot of lighting calculations with limited-precision fixed-point numbers. Support for storing floating-point values (including writing, reading, filtering, and blending) was also very patchy 10-12 years ago, but is now ubiquitous. Storing floating-point values isn't strictly necessary for HDR rendering (Valve famously used a setup that didn't require it), but it certainly makes things much simpler (particularly performing post-processing like bloom and depth of field in HDR). You can find a lot of information about this out there now that it's very common.
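To make that concrete, the final step of an HDR rendering pipeline on an SDR display maps those wide-range values back down to [0, 1]. A simplified per-channel version, shown as plain C (the Reinhard operator and the fixed exposure are just one common choice, not the only way to do it):

```c
#include <math.h>

/* Map one HDR channel value (linear, possibly much greater than 1.0) to a
 * displayable [0,1] value: scale by exposure, apply the Reinhard curve,
 * then gamma-encode for an 8-bit SDR output. */
float tonemap_channel(float hdr, float exposure)
{
    float scaled = hdr * exposure;            /* artistic or auto-exposure scale   */
    float ldr    = scaled / (1.0f + scaled);  /* Reinhard: maps [0, inf) -> [0, 1) */
    return powf(ldr, 1.0f / 2.2f);            /* approximate gamma/sRGB encode     */
}
```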

HDR output for displays is a relatively new topic. This is all about how the application sends its data to be displayed, and the format of that data. With older displays, a game would typically render with a rather wide HDR range (potentially going from the dark of night to full daytime brightness if using a physical intensity scale) and then use a set of special mapping functions (usually consisting of exposure + tone mapping) to squish that down into the limited range of the display.

The basic idea of HDR displays is that you remove the need for "squishing things down", and have the display take a wide range of intensity values in a specially-coded format (like HDR10). In practice that's not really the case: these displays have a wider intensity range than previous displays, but still nowhere near wide enough to represent the full range of possible intensity values (imagine watching a TV as bright as the sun!). So either the application or the display itself still needs to compress the dynamic range somehow, with each approach having various trade-offs. I would recommend reading or watching this presentation by Paul Malin for a good overview of how all of this works.

As for actually sending HDR data to a display on a PC, it depends on whether the OS and display driver support it. I know that Nvidia and Windows definitely support it, with DirectX having native API support. For OpenGL I believe you have to use Nvidia's extension API (NVAPI). Nvidia has some information here and here.
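To give a flavor of the "specially-coded format" part: HDR10 signals are encoded with the SMPTE ST 2084 "PQ" transfer function before being quantized to 10 bits. Here is a sketch of just the encode step in plain C (input is linear luminance normalized so that 1.0 corresponds to 10,000 nits; the Rec.2020 color conversion that HDR10 also expects is omitted):

```c
#include <math.h>

/* SMPTE ST 2084 (PQ) encode, as used by HDR10.
 * 'linear' is luminance normalized so that 1.0 == 10,000 nits.
 * Returns the non-linear signal value in [0,1] that gets quantized
 * to 10 bits for output. */
float pq_encode(float linear)
{
    const float m1 = 2610.0f / 16384.0f;          /* 0.1593017578125 */
    const float m2 = 2523.0f / 4096.0f * 128.0f;  /* 78.84375        */
    const float c1 = 3424.0f / 4096.0f;           /* 0.8359375       */
    const float c2 = 2413.0f / 4096.0f * 32.0f;   /* 18.8515625      */
    const float c3 = 2392.0f / 4096.0f * 32.0f;   /* 18.6875         */

    float y = powf(linear, m1);
    return powf((c1 + c2 * y) / (1.0f + c3 * y), m2);
}
```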

Be aware that using HDR output isn't necessarily going to fix your banding issues. If fixing banding is your main priority, I would suggest making sure that your entire rendering pipeline is set up in a way that avoids common sources of banding. The most common source is storing color data without the sRGB transfer curve applied to it; that curve acts like a sort of compression function that ensures darker color values have sufficient precision in an 8-bit encoding. It's also possible to mask banding through careful use of dithering.
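In OpenGL, "storing color data with the sRGB transfer curve applied" usually means giving your 8-bit targets an sRGB internal format and letting the hardware do the encode on write. A rough sketch (assuming a core-profile context; names are just illustrative):

```c
#include <GL/glew.h>  /* or whatever GL loader you already use */

/* An 8-bit render target that stores color with the sRGB transfer curve,
 * so dark values keep enough precision to avoid visible banding. */
GLuint create_srgb_target(int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* With GL_FRAMEBUFFER_SRGB enabled, writes to an sRGB attachment are
     * linear->sRGB encoded by the hardware, and sampling the texture later
     * converts sRGB->linear automatically. */
    glEnable(GL_FRAMEBUFFER_SRGB);
    return tex;
}

/* Adding a tiny amount of noise (ordered or blue-noise dither) just before
 * the final 8-bit quantization is another effective way to mask banding. */
```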

OK, I got it. I googled it and found that information on NVIDIA's website. NVIDIA says that all 900- and 1000-series GPUs support HDR (deep color) output for HDR displays, and it mentions HDMI 2.0. Do they also support HDR output over DisplayPort? Some day I will buy a new HDR monitor and try it. Thanks.

I forgot to mention something about OS-level HDR support. Windows 10 currently supports HDR in full-screen mode, and Microsoft plans to add system-wide HDR support soon. Do Linux and Mac support HDR output? I also have Ubuntu 18.04, and I heard that Mesa added 10 bpc output for HDR support.

3 hours ago, Sword7 said:

NVIDIA says that all 900- and 1000-series GPUs support HDR (deep color) output for HDR displays, and it mentions HDMI 2.0. Do they also support HDR output over DisplayPort?

Nvidia supports HDR over both DisplayPort and HDMI.

However, be careful when shopping for monitors - quite a few HDR monitors on the market support HDR over HDMI but not DisplayPort (despite having both ports).

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

5 hours ago, Sword7 said:

Do Linux and Mac support HDR output?

Most Apple products use the DCI-P3 color space now (instead of sRGB or Rec.709), which is a different "wide gamut" color space. From memory, their newer products support HDR10 (UHDTV) and Dolby Vision output, too.

