sRGB means no gamma param in game settings?


Hi,
Using the sRGB gamma curve, does that mean no gamma setting is needed in the game settings?
From what I understand that should be correct, since monitors are now calibrated for it, but better to ask.
Thanks


The "gamma" setting usually corresponds to some tweak in the final curve applied to the frame image before presenting it to the display. It can still be useful even with sRGB standards, because monitors (TV's especially) are all over the place in terms of calibration. Many come out of the factory with terrible settings designed to make the display stand out in a brightly-lit store, and are a very poor match for in-home viewing conditions. Others have been tweaked in various ways by their users. Even with proper calibration different displays can have wildly different characteristics that may deviate from standards, and so it can be a good idea to have the sure help calibrate your game so that you're not getting your blacks crushed or your whites clipped.

Yeah, we can ensure that all of our game artists have good-quality, calibrated sRGB monitors, so that all of our source data is good sRGB data... but we have no control over the user's monitor.

As above, monitors are all over the place. Users might have a crappy one that doesn't follow the sRGB rules, or perhaps they've just tweaked the settings to be different.

Human perception of colour also depends on the ambient light level in the room. If you look at the same picture in a bright room and a dark room, you will perceive different colours! sRGB assumes "typical office lighting" as the background, so a user playing games in a dark room might actually need a gamma tweak to get something that looks like you expect it to.

Also, sRGB is the standard for computer monitors, but not the standard for TVs :( If your user is playing games on their TV, you need to output Rec.601, Rec.709 or Rec.2020 (depending on the TV) instead of sRGB :angry: A simpler option is to just give the user an output gamma slider and treat gamma 2.2 as the default/middle setting. Then use sRGB internally to encode all your data, and use the user's gamma value for the final conversion back to 8-bit before display.
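A minimal sketch of that final conversion in HLSL, assuming the scene has already been rendered into a linear float render target (the cbuffer layout and the gUserGamma name are made up for the example):

// Final fullscreen pass: encode the linear scene for display using the
// user's gamma slider instead of the exact piecewise sRGB curve.
cbuffer DisplaySettings : register(b0)
{
    float gUserGamma; // slider value, 2.2 = default/middle setting
};

Texture2D    gLinearScene : register(t0); // linear (e.g. R16G16B16A16_FLOAT) scene
SamplerState gPointClamp  : register(s0);

float4 PSPresent(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float3 linearColor = gLinearScene.Sample(gPointClamp, uv).rgb;
    // Convert back to 8-bit-friendly display values with the user's curve.
    float3 encoded = pow(saturate(linearColor), 1.0f / gUserGamma);
    return float4(encoded, 1.0f);
}

Note that the backbuffer would be bound as a plain UNORM (non-sRGB) format here, so the hardware doesn't apply a second sRGB encode on top of the manual pow().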

OK, that looks like a solid solution: keep all the data in sRGB, use the hardware sRGB-to-linear conversion when sampling in the shader, and on output use pow(linear, 1.0f/Gamma) with the user's gamma value.
Too bad TVs aren't using the sRGB standard :( Will they one day?
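For completeness, the "hardware sRGB to linear" part of that plan is mostly a matter of texture formats rather than shader code; a sketch, assuming D3D11-style bindings:

// Sampling side: if the albedo texture is bound through an *_SRGB view
// (e.g. DXGI_FORMAT_R8G8B8A8_UNORM_SRGB), the hardware converts
// sRGB -> linear automatically during Sample(), so no manual pow() is needed.
Texture2D    gAlbedo       : register(t0);
SamplerState gLinearFilter : register(s0);

float3 FetchAlbedoLinear(float2 uv)
{
    // Returned value is already linear; do lighting math on it directly.
    return gAlbedo.Sample(gLinearFilter, uv).rgb;
}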

