
## Reading linearized Texture as sRGB, then doing Gamma conversion: Different result


### #1 Tubos (Members)

Posted 31 July 2014 - 08:12 AM

Hi,

I just switched my Direct3D9 renderer to linear color.

As a test, I rendered a quad with an image. The quad itself uses no shaders. In the sampler state, I'm setting D3DSAMP_SRGBTEXTURE to 1.

Before presenting, I copy the render target to the backbuffer with a gamma adjustment of 2.2. Here is the gamma adjustment shader ("Texture" refers to the render target rendered previously):



```hlsl
sampler Texture = sampler_state
{
    SRGBTexture = false;
};

float4 GammaAdjustOnly( float2 Tex : TEXCOORD0 ) : COLOR0
{
    // NB: pow is applied to all four channels, including alpha
    return pow(tex2D(Texture, Tex), 1.0 / 2.2);
}

technique GammaAdjust
{
    pass P0
    {
        PixelShader = compile ps_2_0 GammaAdjustOnly();
    }
}
```


This should produce exactly the same result as before, when I wasn't using sRGB textures. However, there is a difference: with sRGB, the image looks more greyish and a bit brighter.

I am not using sRGB writes; instead I'm using an FP16 buffer. To confirm that I hadn't messed anything else up, I removed the gamma adjustment and the sRGB sampler state while leaving everything else the same. Sure enough, it looks correct then.

Where does the difference come from? After all, "(pixel ^ 2.2) ^ (1 / 2.2)" should be equal to "pixel", right?

Here are some more details: This is on Windows 7, with Direct3D9 via SlimDX, running in windowed mode.


Edited by Tubos, 31 July 2014 - 08:19 AM.

### #2 osmanb (Members)

Posted 31 July 2014 - 08:38 AM

pow(x, 2.2) is an approximation of the sRGB decoding process, but that's not exactly what the hardware does. The real transfer function is piecewise: a linear segment near black followed by a power segment with exponent 2.4. Check out http://en.wikipedia.org/wiki/SRGB for the exact formula.

### #3 Tubos (Members)

Posted 31 July 2014 - 08:50 AM

Ah, that means DirectX does the conversion using the exact sRGB formula, while my shader uses the approximation.

So a good solution would be to disable the sRGB sampler state, and do the pow(x, 1.0 / 2.2) manually in every pixel shader, right?

Then I'm working with slightly incorrect texture values in the linear space, but the final output should look identical to the input image.
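Indeed, as long as both directions use the same 2.2 power, the round trip cancels out exactly (up to floating-point precision). A quick sanity check with an illustrative value:

```python
def approx_decode(c):
    # pow(x, 2.2) decode, matched to the manual pow(x, 1/2.2) encode
    return c ** 2.2

def approx_encode(c):
    return c ** (1.0 / 2.2)

# With matched approximations the round trip is the identity:
x = 64 / 255.0
print(abs(approx_encode(approx_decode(x)) - x))   # ~0
```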

### #4 kauna (Members)

Posted 31 July 2014 - 10:00 AM

Well, typically it's better to let the GPU do the conversion, since there is dedicated hardware for it (i.e. it's free, whereas pow isn't). But if manual conversion suits your needs better, do as you wish.

Cheers!

### #5 Chris_F (Members)

Posted 31 July 2014 - 10:06 AM

> So a good solution would be to disable the sRGB sampler state, and do the pow(x, 1.0 / 2.2) manually in every pixel shader, right?

No, don't apply gamma in the pixel shader unless you have a good reason to. That's why sRGB formats exist.

Edited by Chris_F, 31 July 2014 - 01:09 PM.

### #6 mark ds (Members)

Posted 31 July 2014 - 10:20 AM

The best solution would be to use sRGB source images, let the hardware convert them to linear and store the result in the offscreen buffer, do any blending etc. in that linear buffer, then render to an sRGB backbuffer, letting the hardware convert back to sRGB (which is [nominally] what your monitor expects).
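The reason blending belongs in the linear buffer can be shown numerically: averaging two texels directly on their sRGB values gives a different (darker) result than averaging in linear space and re-encoding. A sketch with illustrative values, using the piecewise sRGB formulas:

```python
def srgb_to_linear(c):
    # exact piecewise sRGB decode
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # exact piecewise sRGB encode (inverse of the above)
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Blending (averaging) directly on sRGB-encoded values:
gamma_space_blend = (black + white) / 2

# Converting to linear, blending there, then re-encoding:
linear_space_blend = linear_to_srgb(
    (srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(gamma_space_blend, linear_space_blend)  # the linear blend is lighter
```

A 50/50 blend of black and white done in linear space encodes to roughly 0.735 rather than 0.5, which matches what a physically correct average of the two intensities should look like on screen.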

Edited by mark ds, 31 July 2014 - 10:23 AM.

### #7 Samith (Members)

Posted 31 July 2014 - 10:21 AM

> Ah, that means DirectX does the conversion using an exact sRGB conversion, while my shader uses the approximated version.
>
> So a good solution would be to disable the sRGB sampler state, and do the pow(x, 1.0 / 2.2) manually in every pixel shader, right?
>
> Then I'm working with slightly incorrect texture values in the linear space, but the final output should look identical to the input image.

You should use D3DRS_SRGBWRITEENABLE when you render, that way you don't need to do the pow(x, 1.0/2.2) approximation in your shader.

### #8 Tubos (Members)

Posted 31 July 2014 - 12:59 PM

> The best solution would be to use sRGB source images, let the hardware convert to linear and store in the offscreen buffer, do any blending etc in the linear buffer, then send to an sRGB backbuffer allowing the hardware to convert back to sRGB (which is what you monitor [nominally] is).

Ok! I'm now doing that, and the result looks exactly right.

One thing still puzzles me. Most HDR samples I see do not use sRGBWrite, but perform the pow in their tonemapping instead. Why is that? Is there an advantage to doing it manually?

### #9 mark ds (Members)

Posted 31 July 2014 - 01:54 PM

Sticking to sRGB, whilst technically the correct thing to do, doesn't always give the final look an artist wants (they're a fussy bunch!). So, for the final output, whatever looks best is the right thing to do, which means a manual pow before presentation (or, more likely, a series of pre-calculated lookups, often involving 3D textures).

On the other hand, things like photo editing packages would normally stay strictly within sRGB (or some other specified colour space).

Edited by mark ds, 31 July 2014 - 01:56 PM.

### #10 Tubos (Members)

Posted 01 August 2014 - 05:08 AM

Thank you!

According to the CardCaps.xls file included with the DirectX SDK, sRGB Writes and Reads on A8R8G8B8 surfaces are supported on almost all hardware except some released in 2002 (Geforce 4 MX 420). Is that true, or do recent adapters exist which don't support it?

### #11 Hodgman (Moderators)

Posted 01 August 2014 - 05:56 AM

> One thing still puzzles me. Most HDR samples I see do not use sRGBWrite, but perform the pow in their tonemapping instead. Why is that? Is there an advantage to doing it manually?

Ideally, everyone's TV/monitor would follow the sRGB standard exactly, but unfortunately, many don't. Worse, many do, but the factory default settings use extreme contrast/gamma, and the user is unlikely to go through the settings and activate sRGB mode.

Because of this, lots of games show you a 'calibration' image and a slider to choose a gamma value (e.g. "Move the slider until the left image is barely visible").
If you do the final gamma correction manually, you can apply this custom gamma value.

On the console games I've worked on, we routinely test on many different monitors and TVs, and have found that most look ok with gamma values all the way from 1.8 to 2.4!
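To see how much that range matters, here is a quick sketch (illustrative values only) of how one linear mid-grey encodes under display gammas across the 1.8 to 2.4 range:

```python
# The same linear intensity lands at noticeably different display
# values depending on the assumed display gamma; this spread is
# why a per-user calibration slider helps.
linear = 0.5
encoded = {g: linear ** (1.0 / g) for g in (1.8, 2.2, 2.4)}
for g, v in sorted(encoded.items()):
    print(g, round(v, 3))
```

The encoded value for the same mid-grey spans roughly 0.68 to 0.75 across that range, a visible brightness difference.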

> According to the CardCaps.xls file included with the DirectX SDK, sRGB Writes and Reads on A8R8G8B8 surfaces are supported on almost all hardware except some released in 2002 (Geforce 4 MX 420). Is that true, or do recent adapters exist which don't support it?

As of D3D10 it became a required feature, so all modern cards will support it.
