What will change with HDR monitors?

Started by
16 comments, last by Hodgman 7 years, 2 months ago

I have a BenQ BL3201PT 32" UHD display that supports 10bit color channels on the Windows desktop. In the nVidia control panel settings, it shows up with an output color depth of 10bpc in addition to 8bpc. Here is a screenshot of the control panel setting (found on the web; that one even appears to have a 12bpc option).

As mentioned above, I agree that simply having 10bpc over 8bpc doesn't mean much unless the actual color space the monitor can output also increases. This BenQ does not support the Rec.2020 color space (or even come close), since I don't see any difference between 10bpc and 8bpc in anything, i.e. 8bpc does not produce banding, nor does 10bpc mode produce any more brightness/luminance/colors/(dynamic) contrast. The TomsHardware review confirms that the display covers only 71.84% of the Adobe RGB 1998 color space (and Rec.2020 is far larger than Adobe RGB 1998), and that the 10bpc mode is implemented with FRC (frame rate control), so the "supports 10bpc" label on a display is not worth much on its own.

I'm looking forward to seeing, first hand, a display that supports the full Rec.2020 space. Does anyone know if such displays are actually available for purchase yet?


Is 10-bit color presentation supported by current OS desktop compositors? I was under the impression that, at least on Windows, it is not, and that dirty OGL "hacks" are needed .__.

It is recommended practice on Windows to use 10bit backbuffers!

I have a BenQ BL3201PT 32" UHD display that supports 10bit color channels on Windows desktop ... the 10bpc mode is implemented with FRC (frame rate control) [AKA 8bit with dithering]
...I don't see any difference in anything with 10bpc vs 8bpc, i.e. 8bpc does not produce banding

Bold bit is the problem :)
8bpc does produce very perceptible banding in contrived conditions. Try opening Photoshop and making a linear gradient from black to 100% green, then zoom in on a dark section of it -- the individual bands of green should be easily distinguishable.
[image: dark section of a green gradient, showing visible banding]
Real 10bit, or even fake (dithered) 10bit-via-8bit, should relieve this somewhat. So even within the existing sRGB color space, 10bit is a good improvement over 8bit.
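The banding arithmetic above can be sketched numerically. This is a toy Python illustration (not a rendering API): quantize a dark slice of a linear gradient at 8 and 10 bits per channel and count how many distinct output codes it collapses to -- fewer codes means wider, more visible bands.

```python
def distinct_levels(lo, hi, samples, bits):
    """Quantize `samples` evenly spaced values in [lo, hi] to `bits` bits
    per channel and return the number of distinct output codes."""
    max_code = (1 << bits) - 1
    codes = set()
    for i in range(samples):
        v = lo + (hi - lo) * i / (samples - 1)
        codes.add(round(v * max_code))
    return len(codes)

# A dark slice of the gradient: green channel from 0% to 5% intensity.
print(distinct_levels(0.0, 0.05, 10_000, 8))   # 14 codes -> coarse bands
print(distinct_levels(0.0, 0.05, 10_000, 10))  # 52 codes -> much finer steps
```

Roughly 4x the codes over the same dark range is why 10bit (real or dithered) softens the banding.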

If you go to a wider gamut, such as Rec.2020, then you also have to add more bits (more than 8); otherwise you'll end up with less precision than existing 8bit sRGB. In the end, you don't get that much extra precision (but you do get a better gamut)! 10bit sRGB or 12bit Rec.2020 would be ideal :)

[edit] the intense white surrounding that dark image ruins your eye's perception of it. Hold your hands up to frame the dark green section (and block out that white light), and you should be able to make out an obvious diagonal line (or two!) in the middle, separating the different shades of green.
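The precision-vs-gamut trade-off can be made concrete with a toy calculation. The 60% figure below is purely an illustrative assumption (the real fraction of a Rec.2020 channel that sRGB occupies depends on the primaries and the channel), but the shape of the argument holds: stretching the same code space over a wider range leaves fewer codes for the old range.

```python
def codes_for_subrange(bits, fraction):
    """Number of quantization codes left for a sub-range covering
    `fraction` of the full channel range at `bits` bits."""
    return int((1 << bits) * fraction)

# Illustrative assumption: the sRGB range occupies ~60% of a wider
# Rec.2020-style channel.
print(codes_for_subrange(8, 0.6))   # 153 codes -- worse than 8bit sRGB's 256
print(codes_for_subrange(10, 0.6))  # 614 codes -- comfortably better than 256
```

This is the sense in which 8bit Rec.2020 would be a precision downgrade, while 10bit recovers it with headroom.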

Looking forward to seeing a display first hand that would support the full Rec.2020 space. Anyone knows if such displays are actually available for purchase yet?

As far as I know, current tech can only display about 97% of the DCI color space (which covers only part of Rec.2020's). Manufacturers say the next generation of quantum dot/LED displays should make it possible to reach 90% of the Rec.2020 color space.

Hopefully these new monitors will be better calibrated than sRGB ones, because I expect them to be much more difficult to calibrate. Plus, there is still no dedicated way to make D3D or OGL use ICC profiles...

Is 10-bit color presentation supported by current OS desktop compositors?

It is recommended practice on Windows to use 10bit backbuffers!

Yes, a 10-bit back-buffer is not an issue, but what DXGI format does the DWM use? I was under the impression that it is still an 8-bit color buffer.

"Recursion is the first step towards madness." - "Skeggöld, Skálmöld, Skildir ro Klofnir!"
Direct3D 12 quick reference: https://github.com/alessiot89/D3D12QuickRef/

Looking forward to seeing a display first hand that would support the full Rec.2020 space. Anyone knows if such displays are actually available for purchase yet?

Manufacturers say the next generation of quantum dot/LED displays should make it possible to reach 90% of the Rec.2020 color space.

Unfortunately that will probably take a while to filter down to South Africa (for broad usage).

But progress is nice. Maybe I will be able to skip straight from my standard/normal/run-of-the-mill monitor to something like this :)

Never say Never, Because Never comes too soon. - ryan20fun

Disclaimer: Each post of mine is intended as an attempt to help and/or bring some meaningful insight to the topic at hand. Due to my nature, my good intentions will not always be plainly visible. I apologise in advance and assure you I mean no harm and do not intend to insult anyone.

Mac OS X recently gained system-wide support for 10bpc and higher color output, and the 2014 and 2015 iMac displays are capable of it ("deep color", as well as "wide color" via P3 rather than sRGB, and extended dynamic range output).

This page has some further info about how it's handled there https://developer.apple.com/library/mac/releasenotes/MacOSX/WhatsNewInOSX/Articles/MacOSX10_11_2.html

10bit linear is worse than 8bit gamma, so sRGB will stay.
These aren't "HDR monitors" -- that's a marketing buzzword...
HDR photography usually combines three or more 10-16 bit images to create a 32bit floating point image, which can be tonemapped back to 8bit sRGB.
HDR games usually use 16bit floating point rendering, and tonemap it to 8/10bit sRGB.
10bits is not HDR.
These monitors have been around for a while under the name "deep color", not "HDR".
Software support for them has been around for 10 years already. You just make a 10_10_10_2 backbuffer instead of 8_8_8_8! Lots of games already support this.
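The HDR-game path described above can be sketched in a few lines of Python. This uses the simple global Reinhard operator as the tonemapper (one common choice, not the only one), and it omits gamma encoding for clarity -- a real pipeline would apply the sRGB transfer function before quantizing.

```python
def reinhard(l):
    """Simple global Reinhard tonemap: maps [0, inf) into [0, 1)."""
    return l / (1.0 + l)

def tonemap_to_bits(hdr_value, bits):
    """Tonemap a floating-point HDR value and quantize it to the
    backbuffer's bit depth (gamma encoding omitted for clarity)."""
    max_code = (1 << bits) - 1
    return round(reinhard(hdr_value) * max_code)

# A bright HDR value of 4.0 (well above display white):
print(tonemap_to_bits(4.0, 8))   # 204 -- 4/5 of the 8-bit range
print(tonemap_to_bits(4.0, 10))  # 818 -- same ratio, finer steps
```

The point is that the dynamic range lives in the float render target; the 8/10bit backbuffer only ever sees the tonemapped [0, 1) result.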


10 10 10 2? So only two partial transparency levels (01 and 10), besides opaque (11) and transparent (00)?? So, no fog effects...

10 10 10 2? So only two partial transparency levels (01 and 10), besides opaque (11) and transparent (00)?? So, no fog effects...

This is for the final back-buffer that's sent to the monitor - it doesn't require transparency.
You can use other formats for the rest of your rendering pipeline, such as 16 16 16 16.
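A quick sketch of how such a pixel is laid out may help. The bit layout below (alpha in the top 2 bits, red in the low bits) is an assumption matching the usual DXGI R10G10B10A2 convention; the point is that the 2-bit alpha really does allow only 4 levels, which is fine for a final backbuffer that never blends.

```python
def pack_10_10_10_2(r, g, b, a):
    """Pack one pixel as 10 bits each of R/G/B plus 2 bits of alpha
    (assumed layout: A in bits 30-31, B in 20-29, G in 10-19, R in 0-9)."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

# Alpha can only be 0, 1/3, 2/3, or 1 -- but the backbuffer doesn't care.
pixel = pack_10_10_10_2(1023, 512, 0, 3)
print(hex(pixel))  # 0xc00803ff
```

Fog, smoke, and other blended effects happen earlier, in the 16 16 16 16 (or similar) render targets, and only the final opaque result is written into this format.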

This topic is closed to new replies.
