What will change with HDR monitors?


A week ago AMD announced that they were working with their partners to push "HDR" monitors into the consumer market.
These monitors should support 10-bit RGB color instead of the usual 8 bits per channel, and should be able to display higher contrast and brightness to make use of these 2 additional bits.

My question is: what are the consequences for display algorithms and content creation?

I have never seen a 10-bit monitor so far and thus have no idea how the extra colors "look". I've often been told that the human eye can't differentiate more than 256 shades of a single primary color.

On the other hand, games have a tonemap pass that turns 10+ bits of color per channel into values displayable on a monitor by mimicking the eye's or a camera's behavior; with a 10-bit monitor this pass could happen "in the eye", with the viewer actually being dazzled by the monitor. However, this sounds uncomfortable and may conflict with some electronic post-processing such as dynamic contrast, not to mention incorrectly configured displays or poorly lit environments.
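To make the question concrete, here is roughly what I mean by the tonemap pass (a minimal sketch of my own with made-up names, shown as CPU code; in a real engine it would run in a pixel shader):

// Minimal illustration of a tonemap pass: exposure, a simple Reinhard curve,
// then gamma encoding for an 8-bit (or 10-bit) display.
#include <cmath>

struct Color { float r, g, b; };              // linear HDR values, may exceed 1.0

Color Tonemap(Color hdr, float exposure)
{
    auto reinhard = [](float c) { return c / (1.0f + c); };   // compress to [0,1)
    Color mapped = { reinhard(hdr.r * exposure),
                     reinhard(hdr.g * exposure),
                     reinhard(hdr.b * exposure) };
    // Gamma-encode (rough 1/2.2 approximation of the display curve).
    mapped.r = std::pow(mapped.r, 1.0f / 2.2f);
    mapped.g = std::pow(mapped.g, 1.0f / 2.2f);
    mapped.b = std::pow(mapped.b, 1.0f / 2.2f);
    return mapped;   // quantised to 8 or 10 bits per channel by the backbuffer
}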

Additionally, will traditional RGBA8 textures fit into an engine designed around 10-bit displays?


So the wrong thing to focus on is the "10-bit monitor" part, at least initially. The right thing to ask is: what is the colorspace? A "colorspace" represents the range of colors a display can reproduce. Here's a handy chart:

[Image: CIE chromaticity chart with the Rec.709, DCI, and Rec.2020 colorspace triangles overlaid]

The total area of the chart represents the colors a human eye can see; the triangles represent different "colorspaces". The corners of each triangle are the shades of red, green, and blue that get mixed together to create the colors in between. Right now we are at the smallest triangle, Rec.709, and have been practically since color TV was invented. The "bits", as in "10-bit, 8-bit" etc., come in when you want to display the colors in between the corners of the triangle.

Right now we have (for the most part) 8-bit monitors; in binary that works out to each color (red, green, blue) having 256 shades to combine together to make the colors in between. For Rec.709 that's fine, you won't see banding (mostly). But when we get to bigger triangles, we need more shades of each color to cover the space in between unless we want banding. E.g.:

[Image: sky gradient with visible banding artifacts]

This is supposed to be smooth, but there aren't enough colors to represent a smooth change to our eye, so we see obvious "jumps" in color. That's where the extra bits come in: more colors to put in between. Ten bits offers 1024 shades of each color, and is enough for the middle triangle, the DCI colorspace, which is what movies are (ideally) projected in at theaters. It's also what the first wave of "HDR!" screens support.
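If you want to convince yourself of the arithmetic, here's a throwaway sketch (my own, not taken from any tool) that quantises a smooth ramp at different bit depths and counts how many distinct steps survive:

// Toy illustration of why more bits reduce banding: quantise a smooth
// 4096-pixel gradient to n bits and count the distinct output values.
#include <cmath>
#include <cstdio>
#include <set>

int main()
{
    for (int bits : {8, 10, 12})
    {
        const float levels = float((1 << bits) - 1);
        std::set<int> distinct;
        for (int i = 0; i < 4096; ++i)
        {
            float v = i / 4095.0f;                       // smooth 0..1 ramp
            distinct.insert(int(std::round(v * levels)));
        }
        std::printf("%2d bits -> %zu distinct steps across the ramp\n",
                    bits, distinct.size());
    }
}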

Unfortunately, or fortunately depending on your view, there's also the bigger triangle, the Rec.2020 colorspace, which is what's supposed to be supported eventually but couldn't quite make it out this year. Covering that area without banding would need 12 bits per color. Which "colorspace" will win is a complicated mess and no one knows. Regardless, now that I've covered what's actually going on, on to the question.

For one part of production, the shades of color and the 10+ bits, the transition is going to be easy. Right now albedo textures generally take up 24 bits (8 bits per color channel). In order to cover the bigger triangle properly, you just up the bits, maybe to, say, 11/11/10, and hope people don't notice banding if the textures are dithered. Other things also get upped: some people still use 10 bits per color channel for HDR rendering (think of going into a bigger triangle virtually, then shrinking it back down), which shows some banding on 8-bit output, but for 10+ bit displays the minimum HDR render target will probably be 16 bits per channel. So, more rendering power, ouch, but relatively easy to do. Though it should be noted that right now GPUs have automatic "support" for Rec.709, automatically converting back and forth between its gamma curve and linear values (which are easier and more correct to do maths with), while the bigger triangles have different gamma curves that will need to be handled either manually or by new GPUs that do it quickly and automatically.
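For reference, the automatic conversion mentioned above is, in practice, the sRGB transfer curve (which shares Rec.709's primaries) that GPUs apply for *_SRGB texture and render-target formats. Written out by hand it looks like this (a sketch only; a wider colorspace would swap in its own curve):

// The conversion GPUs do automatically for *_SRGB formats, written out by hand.
#include <cmath>

float SrgbToLinear(float s)      // s in [0,1], gamma-encoded
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

float LinearToSrgb(float l)      // l in [0,1], linear light
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}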

Asset creation will be the hard part. Right now every studio does Rec.709 textures, colorspace, etc. The cameras for texture sampling are set up for Rec.709, the camera color checkers are, the monitors are, the GPUs are, the modeling software is. All of that will have to be replaced, ideally from the moment real production begins (years in advance of release for triple-A games), to produce assets that are meant to be displayed in the higher colorspaces. Now, you can convert older textures into the newer spaces automatically, but that doesn't mean artists are going to be happy with the results. It might take a while to go over all the textures manually to get them to look acceptable. You might also see games that keep older 8-bit textures while claiming to cover the higher colorspaces (according to marketing...), just using lighting values that reach into the wider gamut. Obviously not ideal, but I wouldn't doubt that at least one or more games will go for it.

But ideally you'd have all assets designed from the start to cover whatever higher colorspace is used. With (right now) quite slow adoption of "HDR!" screens, little to no support from anything else (Netflix, image formats, etc.), and the need for more processing power, I'd say you aren't going to see many games supporting "HDR!" for years, and quite possibly not uniformly until yet another new generation of consoles (or whatever, depending on how long these last).

Hopefully that covers everything you, and anyone else for that matter, wanted to know.

Thanks!

By the way, how will gamma interact here? As far as I know, gamma is a legacy of the CRT era, where there was a non-linear response between the electric signal and the output light power. However, the new colorspaces can't be completely displayed by old analog-only CRTs, so why is gamma kept instead of transmitting linear color only and letting the monitor's embedded electronics turn it into voltages?
10bit linear is worse than 8bit gamma, so sRGB will stay.
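A quick way to see why (my own back-of-the-envelope comparison): look at how big the first quantisation step above black is, measured in linear light, for each encoding.

// Compare the first step above black, in linear light, for 8-bit sRGB vs 10-bit linear.
#include <cstdio>

int main()
{
    // Smallest non-zero 8-bit sRGB code, converted to linear (sRGB is value/12.92 near zero).
    float srgbStep   = (1.0f / 255.0f) / 12.92f;   // ~0.0003
    // Smallest non-zero 10-bit linear code.
    float linearStep = 1.0f / 1023.0f;             // ~0.00098, roughly 3x coarser in the darks
    std::printf("8-bit sRGB first step:    %f\n", srgbStep);
    std::printf("10-bit linear first step: %f\n", linearStep);
}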

These aren't "HDR monitors"; that's a marketing buzzword...
HDR photography usually combines three or more 10-16 bit images to create a 32-bit floating point image, which can be tonemapped back to 8-bit sRGB.
HDR games usually use 16-bit floating point rendering, and tonemap it to 8/10-bit sRGB.

10bits is not HDR.
These monitors have been around for a while under the name "deep color", not "HDR".

Software support for them has been around for 10 years already. You just make a 10_10_10_2 backbuffer instead of 8_8_8_8! Lots of games already support this.
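For example, in Direct3D 11 terms (my own sketch, not code from any particular game), the only change is the DXGI format you request for the swap chain:

// D3D11 sketch: a 10-bit backbuffer is just a different DXGI format.
#include <windows.h>
#include <d3d11.h>

DXGI_SWAP_CHAIN_DESC MakeDeepColourSwapChainDesc(HWND hwnd, UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = width;
    desc.BufferDesc.Height = height;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // instead of DXGI_FORMAT_R8G8B8A8_UNORM
    desc.SampleDesc.Count  = 1;
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount       = 2;
    desc.OutputWindow      = hwnd;
    desc.Windowed          = TRUE;
    // Pass this to D3D11CreateDeviceAndSwapChain / IDXGIFactory::CreateSwapChain as usual.
    return desc;
}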


Aye, though in this case "HDR" as a buzzword has now moved towards meaning the DCI colorspace with 10-bit input requirements. Or rather, it means that and possibly more, and there's an argument among display makers as to what it should mean (what's the colorspace? what contrast ratio should be required? what's the minimum brightness in nits? etc.). Regardless, there's a new digital display standard to go with it, getting rid of the old analog stuff. The article in question is really vague as to what AMD even plans on doing to support "HDR" beyond incidentally moving to DisplayPort 1.3. Honestly, for GPUs the only thing I can think of to do is that automatic conversion between the new gamma curves and linear, because higher-bit backbuffers for output are, as you pointed out, a software matter, and 10-bit has been supported for a while now.
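For what it's worth, the transfer curve the emerging HDR standards are built around is SMPTE ST 2084, the "PQ" curve. Here's a sketch of the encode side (my own illustration; the constants are from the published spec):

// Sketch of the SMPTE ST 2084 "PQ" encode: absolute luminance in, 0..1 signal out.
#include <cmath>

float LinearToPQ(float nits)             // luminance in cd/m^2
{
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;

    float y  = nits / 10000.0f;          // PQ is defined up to 10,000 nits
    float yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0f + c3 * yp), m2);
}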

Since the original question was "What will change with HDR monitors?": if the monitors truly are HDR (like Hodgman said, "HDR" is quite overused by marketing... what an "HDR monitor" really means "depends"), then what will change for sure is your electricity bill. A monitor that can show a picture with a dynamic range around 4-10 times higher is bound to consume much more electricity when the "shiny stuff" lights up the pixels.

Funny how the "staying green" slogan is also important here.


If the monitors had a high enough dynamic range then our GPUs might, in contrast, use less power, as they wouldn't need to do things like tonemapping, bloom, lens flares and so on; the monitor and our eyes would do it automatically, so we may actually save electricity in the long run! (Although we may lose overall due to people accidentally setting fire to their homes and losing their eyesight after displaying a rendered sun on their monitor.)

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

10-bit monitors aren't HDR. An HDR monitor is one that makes you go blind if you look at a picture of the sun.

Ok so I actually went and read an article about the announcement -- http://www.extremetech.com/gaming/219038-amds-radeon-technologies-group-highlights-next-gen-display-standards-new-freesync-capabilities

Apparently a "HDR" display:

* uses 10 bit per channel instead of 8 bit.

* is 4 times brighter than current screens (maybe 40 times in the future).

* and maybe uses the Rec.2020 instead of sRGB or Rec.709.

10bpc is nice, but not new -- as mentioned earlier, it's called "deep colour" and it's been around for a decade in the consumer market, though uptake has been low.

Lots of 8bpc displays can actually already accept 10bpc signals, which they display via dithering.

On that note - lots of cheap LCD displays are actually only 6bpc, but they use dithering to display the 8bpc signal!

Brighter screens are always nice, but lots of people will turn them down anyway. All that matters is whether the screen can produce 256 (or now 1024) discernible steps between its min and max brightness, spaced correctly as per its colour space. However, it is important to note that colour spaces such as sRGB do actually define the brightness of the display, and the ambient brightness in the viewer's room -- the ambient brightness will actually affect the way that you perceive colour!

10 bits gives us 4x more values than 8 bits, so a 4x increase in brightness makes sense... but if they're going to ramp that up to 40x brighter, then we'll need more than 10bpc...
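Running that same back-of-the-envelope logic as code (my own toy calculation):

// If every 4x increase in brightness needs 4x as many code values,
// how many bits per channel does a 40x brighter display need?
#include <cmath>
#include <cstdio>

int main()
{
    const int baseCodes = 256;                       // 8 bits at today's brightness
    for (int brightness : {4, 40})
    {
        double codes = double(baseCodes) * brightness;
        double bits  = std::ceil(std::log2(codes));
        std::printf("%2dx brighter -> ~%.0f bits per channel\n", brightness, bits);
    }
}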

Switching to Rec.2020 instead of sRGB is going to be a massive pain in the ass -- unless there is some kind of API created where you can ask the monitor what colour space it's using.

At the moment, sRGB is great because it's the standard for PC graphics -- it's the colour space of the internet. There's no existing way to ask your monitor what colour space it's using, but because sRGB exists and everyone accepted it as the standard, it's safe for games to simply assume that outputting sRGB values is the right thing(tm) to do.

If a new standard shows up on the scene, all that goes out the window. This would be ok if we can query the monitor -- the game's tonemapper can be configured to either convert to sRGB or to Rec.2020 -- but if we can't query the monitor, we can't ship artwork that's going to look the way the artists intended.
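The conversion itself isn't the hard part -- on linear RGB it's just a 3x3 matrix (sketch below using the commonly published, approximate BT.709-to-BT.2020 coefficients); the hard part is knowing which space the display actually wants:

// Sketch of the gamut conversion a tonemapper would tack on for a Rec.2020 display.
// Input and output are *linear* RGB; the target transfer curve still has to be
// applied afterwards before writing to the backbuffer.
struct Rgb { float r, g, b; };

Rgb Rec709ToRec2020(Rgb c)
{
    return {
        0.6274f * c.r + 0.3293f * c.g + 0.0433f * c.b,
        0.0691f * c.r + 0.9195f * c.g + 0.0114f * c.b,
        0.0164f * c.r + 0.0880f * c.g + 0.8956f * c.b,
    };
}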

At the last studio I worked for, we had a testing room with multiple properly calibrated displays, and a few "factory default" ones (usually with all the stupid "enhancement" sliders at 100%, oversaturating everything). All of the textures and post-fx would be tuned to look perfect (as decided by the art director) on the properly calibrated screens, while still looking adequate on the badly tuned "factory default" ones. Adding a new colour space adds a 3rd batch of screens to tune for, but now two of those batches are "authoritative", and in a tug of war against each other.

Is 10-bit color presentation supported by current OS desktop compositors? I was under the impression that, at least on Windows, it is not, and that dirty OpenGL "hacks" are needed.

Anyway, I really do not care about panel brightness; I do not use settings over 100-250 nits (depending on the environment lighting). I had laser eye surgery last year, and higher brightness tires my eyes a lot.

What would be nice is decent blacks, without IPS glow or TN greyish shine...

Talking about APIs and hardware: what about 10-bit color compression? Is that supported in DXT/BC or in ASTC? Because I find 10-bit presentation rather useless without 10-bit textures.

"Recursion is the first step towards madness." - "Skegg?ld, Skálm?ld, Skildir ro Klofnir!"
Direct3D 12 quick reference: https://github.com/alessiot89/D3D12QuickRef/

