What will change with HDR monitors?


A week ago AMD announced that they were working with their partners to push "HDR" monitors into the consumer market.
These monitors should support 10-bit RGB color instead of the usual 8 bits per channel, and should be able to display higher contrast and brightness to make use of these 2 additional bits.

 

My question is: what are the consequences for display algorithms and content creation?

I have never seen a 10-bit monitor so far and thus have no idea how the extra colors "look". I've often been told that the human eye can't differentiate more than 256 shades of a single primary color.

On the other hand, games have a tonemap pass that turns 10+ bits of color per channel into values displayable on a monitor by mimicking the behavior of the eye or of a camera; with a 10-bit monitor this pass could happen "in the eye", with the viewer being dazzled by the monitor. However this sounds uncomfortable and may conflict with some electronic post-processing such as dynamic contrast, not to mention incorrectly configured displays or poorly lit environments.
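For concreteness, here is a minimal sketch of what such a tonemap pass boils down to, assuming a simple Reinhard operator and a hand-picked exposure (the names and structure are illustrative, not any particular engine's code):

#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };

// Map an unbounded linear HDR value into [0,1) so it can be quantized to the
// display's bit depth. Exposure is assumed to be chosen by hand here; real
// engines derive it from the average scene luminance.
Color tonemapReinhard(Color hdr, float exposure)
{
    auto map = [exposure](float c) {
        c *= exposure;
        return c / (1.0f + c);          // Reinhard operator: compresses highlights
    };
    return { map(hdr.r), map(hdr.g), map(hdr.b) };
}

// Quantize a [0,1] value to an N-bit integer (8 bits today, 10 bits on deep-colour panels).
unsigned quantize(float c, unsigned bits)
{
    const float maxVal = float((1u << bits) - 1u);
    return unsigned(std::round(std::clamp(c, 0.0f, 1.0f) * maxVal));
}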

 

Additionally, will traditional RGBA8 textures fit into an engine designed around 10-bit displays?

Thanks!

By the way, how will gamma interact here? As far as I know, gamma is a legacy of the CRT era, where there was a non-linear response between the electric signal and the output light power. However, the new colorspaces can't be completely displayed by old analog-only CRTs, so why is gamma kept instead of transmitting linear color only and letting the monitor's embedded electronics turn it into voltage?

10-bit linear is worse than 8-bit gamma, so sRGB will stay.
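For reference, the standard sRGB transfer function: the non-linear encoding spends more code values on dark tones, where the eye is most sensitive, which is why 8-bit gamma-encoded output can look smoother than 10-bit linear.

#include <cmath>

// Encode a linear value in [0,1] with the sRGB transfer function.
float linearToSrgb(float lin)
{
    return (lin <= 0.0031308f) ? 12.92f * lin
                               : 1.055f * std::pow(lin, 1.0f / 2.4f) - 0.055f;
}

// Decode an sRGB-encoded value in [0,1] back to linear.
float srgbToLinear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}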

These aren't "HDR monitors", that's just a marketing buzzword...
HDR photography usually combines three or more 10-16 bit images to create a 32-bit floating point image, which can be tonemapped back to 8-bit sRGB.
HDR games usually use 16-bit floating point rendering, and tonemap it to 8/10-bit sRGB.

10 bits is not HDR.
These monitors have been around for a while under the name "deep color", not "HDR".

Software support for them has been around for 10 years already. You just make a 10_10_10_2 backbuffer instead of 8_8_8_8! Lots of games already support this.
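As a rough Direct3D 11 sketch (assuming an existing window handle hwnd, and a made-up helper name; the only real difference from the usual setup is the backbuffer format):

#include <d3d11.h>

// Create a device and swap chain with a 10:10:10:2 backbuffer instead of 8:8:8:8.
bool createDeepColorSwapChain(HWND hwnd, ID3D11Device** device,
                              ID3D11DeviceContext** context, IDXGISwapChain** swapChain)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount       = 2;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per colour channel
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow      = hwnd;
    desc.SampleDesc.Count  = 1;
    desc.Windowed          = TRUE;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context);
    return SUCCEEDED(hr);
}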

 

Aye, though in this case "HDR" as a buzzword has now moved towards meaning the DCI colorspace with 10-bit input requirements. Or rather it means that and possibly more, and there's an argument among display makers as to what it should mean (what's the colorspace? what contrast ratio should be required? what's the minimum brightness in nits? etc.). Regardless, there's a new digital display standard to go with it, getting rid of the old analog stuff. The article in question is really vague as to what AMD even plans on doing to support "HDR" beyond incidentally moving to DisplayPort 1.3. Honestly, for GPUs the only thing I can think of to do is the automatic conversion between the new gamma curves and linear, because higher-bit backbuffers for output are, as you pointed out, a software matter, and 10-bit has been supported for a while now.
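For what it's worth, the curve usually discussed in this context is the SMPTE ST 2084 "PQ" function; the article doesn't name it, so treat this sketch as illustrative of what "new gamma curve to linear" conversion means:

#include <cmath>

// SMPTE ST 2084 ("PQ") encoding: maps linear luminance, normalized so that
// 1.0 = 10000 nits, to a non-linear signal in [0,1] suitable for 10/12-bit output.
float linearToPq(float y)
{
    const float m1 = 2610.0f / 16384.0f;           // ~0.1593
    const float m2 = 2523.0f / 4096.0f * 128.0f;   // ~78.84
    const float c1 = 3424.0f / 4096.0f;            // ~0.8359
    const float c2 = 2413.0f / 4096.0f * 32.0f;    // ~18.85
    const float c3 = 2392.0f / 4096.0f * 32.0f;    // ~18.69
    const float yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0f + c3 * yp), m2);
}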


Since the original question was "What will change with HDR monitors?": if the monitors truly are HDR (like Hodgman said, "HDR" is quite overused by marketing... what "HDR monitor" really means "depends"), then what will change for sure is your electricity bill. A monitor that can show a picture with a dynamic range around 4-10 times higher is bound to consume much more electricity when the "shiny stuff" lights up the pixels.

 

Funny how the "staying green" slogan is also important.

Edited by Matias Goldberg



 

If the monitors had a high enough dynamic range then our GPUs might, in contrast, use less power, as they won't need to do things like tonemapping, bloom, lens flares and so on; the monitor and our eyes will do it automatically, so we may actually save electricity in the long run! (Although we may lose overall due to people accidentally setting fire to their homes and losing their eyesight after displaying a rendered sun on their monitor.)


Ok so I actually went and read an article about the announcement -- http://www.extremetech.com/gaming/219038-amds-radeon-technologies-group-highlights-next-gen-display-standards-new-freesync-capabilities

Apparently an "HDR" display:

* uses 10 bits per channel instead of 8.

* is 4 times brighter than current screens (maybe 40 times in the future).

* and maybe uses Rec.2020 instead of sRGB or Rec.709.

 

10bpc is nice, but not new -- as mentioned earlier, it's called "deep colour" and it's been around for a decade in the consumer market, though uptake has been low.

Lots of 8bpc displays can actually already accept 10bpc signals, which they display via dithering.

On that note - lots of cheap LCD displays are actually only 6bpc, but they use dithering to display the 8bpc signal!
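Roughly, that kind of dithering boils down to something like this sketch (a 2x2 ordered/Bayer dither from 8 bits down to 6 bits per channel; panel hardware obviously does it differently, this is just to show the idea):

#include <cmath>

// Quantize an 8-bit value down to 6 bits with a per-pixel threshold offset,
// so that neighbouring pixels average out to roughly the original value.
unsigned char ditherTo6Bit(unsigned char value8, int x, int y)
{
    static const float bayer2x2[2][2] = { { 0.0f, 0.5f }, { 0.75f, 0.25f } };
    const float step = 255.0f / 63.0f;                   // one 6-bit step, in 8-bit units
    float level = std::floor(value8 / step + bayer2x2[y & 1][x & 1]);
    if (level > 63.0f) level = 63.0f;
    return (unsigned char)level;                         // 6-bit result, 0..63
}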

 

Brighter screens are always nice, but lots of people will turn them down anyway. All that matters is whether the screen can produce 256 (or now 1024) discernible steps between its min and max brightness, spaced correctly as per its colour space. However, it is important to note that colour spaces such as sRGB do actually define the brightness of the display and the ambient brightness of the viewer's room -- the ambient brightness will actually affect the way that you perceive colour!

10 bits gives us 4x more values than 8 bits, so a 4x increase in brightness makes sense... but if they're going to ramp that up to 40x brighter, then we'll need more than 10bpc...

 

Switching to Rec.2020 instead of sRGB is going to be a massive pain in the ass -- unless some kind of API is created where you can ask the monitor what colour space it's using.

 

At the moment, sRGB is great because it's the standard for PC graphics -- it's the colour space of the internet. There's no existing way to ask your monitor what colour space it's using, but because sRGB exists and everyone accepted it as the standard, it's safe for games to simply assume that outputting sRGB values is the right thing(tm) to do.

If a new standard shows up on the scene, all that goes out the window. This would be OK if we can query the monitor -- the game's tonemapper can be configured to convert to either sRGB or Rec.2020 -- but if we can't query the monitor, we can't ship artwork that's going to look the way the artists intended.
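If such a query existed, the tonemapper's last step would just pick the matching output transform. A sketch of the gamut conversion half of that switch (the coefficients are the commonly published BT.709-to-BT.2020 matrix, rounded here, applied to linear values before gamma encoding):

struct Rgb { float r, g, b; };

// Convert linear Rec.709/sRGB primaries to linear Rec.2020 primaries.
Rgb rec709ToRec2020(Rgb c)
{
    return {
        0.6274f * c.r + 0.3293f * c.g + 0.0433f * c.b,
        0.0691f * c.r + 0.9195f * c.g + 0.0114f * c.b,
        0.0164f * c.r + 0.0880f * c.g + 0.8956f * c.b,
    };
}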

At the last studio I worked for, we had a testing room with multiple properly calibrated displays, and a few "factory default" ones (usually with all the stupid "enhancement" sliders at 100%, oversaturating everything). All of the textures and post-fx would be tuned to look perfect (as decided by the art director) on the properly calibrated screens, while still looking adequate on the badly tuned "factory default" ones. Adding a new colour space adds a 3rd batch of screens to tune for, but now two of those batches are "authoritative", and in a tug of war against each other.

Edited by Hodgman


Is 10-bit color presentation supported by current OS desktop compositors? I was under the impression that, at least on Windows, it is not, and dirty OpenGL "hacks" are needed.

 

Anyway, I really do not care about panel brightness; I do not use settings over 100-250 nits (depending on the environment lighting): I had laser eye surgery last year, and higher brightness tires my eyes a lot.

 

What would be nice is having decent blacks without IPS glow or the greyish shine of TN panels...

 

Talking about APIs and hardware: what about 10-bit color compression? Is that supported by DXT/BC or by ASTC? Because I find 10-bit presentation really useless without 10-bit textures.
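For what it's worth, there is no dedicated 10-bit UNORM block-compressed format, but BC6H stores half-float HDR data. A quick Direct3D 11 sketch for probing support (illustrative helper name, not a recommendation):

#include <d3d11.h>

// Ask the device whether BC6H (block-compressed half-float HDR) textures are usable.
bool supportsBc6h(ID3D11Device* device)
{
    UINT support = 0;
    if (FAILED(device->CheckFormatSupport(DXGI_FORMAT_BC6H_UF16, &support)))
        return false;
    return (support & D3D11_FORMAT_SUPPORT_TEXTURE2D) != 0;
}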

Edited by Alessio1989
