Reading linearized Texture as sRGB, then doing Gamma conversion: Different result


One thing still puzzles me. Most HDR samples I see do not use sRGBWrite, but perform the pow in their tonemapping instead. Why is that? Is there an advantage to doing it manually?

Ideally, everyone's TV/monitor would follow the sRGB standard exactly, but unfortunately, many don't. Worse, many do, but the factory default settings use extreme contrast/gamma, and the user is unlikely to go through the settings and activate sRGB mode :(

Because of this, lots of games show you a 'calibration' image and a slider to choose a gamma value (e.g. "Move the slider until the left image is barely visible").
If you do the final gamma correction manually, you can use this custom gamma value.
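To make that concrete, here is a rough C++ sketch (the names and the way it's wired up are mine, not from this thread) of the per-channel math the end of the tonemapping shader does when you apply the calibrated gamma yourself instead of letting the hardware sRGB write apply the fixed sRGB curve:

```cpp
#include <cmath>

// Hypothetical helper mirroring what the tonemapping pixel shader would do
// per channel: convert a linear, tonemapped value to a display value using
// the gamma picked with the calibration slider (typically somewhere between
// 1.8 and 2.4), instead of the fixed curve from D3DRS_SRGBWRITEENABLE.
float LinearToDisplay(float linearValue, float calibratedGamma)
{
    return std::pow(linearValue, 1.0f / calibratedGamma);
}
```

In a real renderer you'd upload 1/calibratedGamma as a shader constant and do the pow in the pixel shader; the C++ version above is only there to show the math.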

On the console games I've worked on, we routinely test on many different monitors and TVs, and have found that most look ok with gamma values all the way from 1.8 to 2.4!

According to the CardCaps.xls file included with the DirectX SDK, sRGB Writes and Reads on A8R8G8B8 surfaces are supported on almost all hardware except some released in 2002 (GeForce4 MX 420). Is that true, or do recent adapters exist which don't support it?

As of D3D10 it became a required feature, so all modern cards will support it.
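If you still target D3D9-class hardware and want to be safe, you can query support at runtime rather than trusting CardCaps.xls. A minimal sketch, assuming you already have an IDirect3D9* lying around (the function name is mine):

```cpp
#include <d3d9.h>

// Check whether the default adapter supports sRGB reads from A8R8G8B8
// textures and sRGB writes to A8R8G8B8 render targets.
bool SupportsSrgbReadWrite(IDirect3D9* d3d)
{
    // sRGB texture reads (D3DSAMP_SRGBTEXTURE).
    HRESULT readOk = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_SRGBREAD, D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8);

    // sRGB render-target writes (D3DRS_SRGBWRITEENABLE).
    HRESULT writeOk = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_SRGBWRITE,
        D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8);

    return SUCCEEDED(readOk) && SUCCEEDED(writeOk);
}
```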
