KaiserJohan

sRGB on diffuse textures or gbuffer color texture?


Reading the following article: http://john-chapman-graphics.blogspot.hu/2013_01_01_archive.html

 

The last two paragraphs made me curious:

 

 

 

For a deferred renderer there is a pitfall which programmers should be aware of. If we linearize a non-linear input texture, then store the linear result in a g-buffer prior to the lighting stage we will lose all of the low-intensity precision benefits of having non-linear data in the first place. The result of this is just horrible - take a look at the low-intensity ends of the gradients in the left image below:

 

[Image fig4.jpg: side-by-side gradients; the left (linearised-then-stored) image shows banding at its low-intensity end]

 

 

 

Clearly we need to delay the gamma correction of input textures right up until we need them to be linear. In practice this means writing non-linear texels to the g-buffer, then gamma correcting the g-buffer as it is read at the lighting stage. As before, the driver can do the work for us by using an sRGB format for the appropriate g-buffer targets, or correcting them manually.

 

 

From all the other resources I've read, you should declare your input diffuse textures as *_SRGB and your g-buffer color texture as non-sRGB, which contradicts this.

 

Which is correct? Should the diffuse textures be sRGB, or should the g-buffer color texture be sRGB (effectively delaying the conversion until it is sampled in the lighting stage)?

Edited by KaiserJohan


Both?

 

You want your diffuse textures to be sRGB so that low-end precision is maintained in the texture data; the diffuse textures are stored in non-linear/sRGB format.

After they're sampled in the shader the values are linear, but you want to re-encode them back into an sRGB target so that low-end precision is also maintained as the g-buffer is written. So your diffuse/albedo g-buffer render target should be sRGB too.

When the g-buffer is read later in the frame for the lighting pass, continue to use an sRGB-formatted SRV and the data will be linearised again, ready for use in the lighting calculations.
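In D3D11 terms, that setup looks roughly like the sketch below (resource names and dimensions are illustrative, not from the thread; the g-buffer is created typeless so both its RTV and SRV can use the sRGB view format):

```cpp
#include <d3d11.h>

// Illustrative sketch of the format choices described above.
void FillGBufferDescs() {
    // Source diffuse texture: stored as sRGB; sampling returns linear values.
    D3D11_TEXTURE2D_DESC diffuseDesc = {};
    diffuseDesc.Width = 1024;
    diffuseDesc.Height = 1024;
    diffuseDesc.MipLevels = 1;
    diffuseDesc.ArraySize = 1;
    diffuseDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
    diffuseDesc.SampleDesc.Count = 1;
    diffuseDesc.Usage = D3D11_USAGE_DEFAULT;
    diffuseDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    // Albedo g-buffer: typeless resource, sRGB views on both ends.
    D3D11_TEXTURE2D_DESC gbufferDesc = diffuseDesc;
    gbufferDesc.Format = DXGI_FORMAT_R8G8B8A8_TYPELESS;
    gbufferDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = {};
    rtvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB; // linear->sRGB on write
    rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;

    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB; // sRGB->linear on read
    srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    srvDesc.Texture2D.MipLevels = 1;
}
```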

 

OK that makes sense.

The "intermediate" buffer that contains all the output from lighting calculations should be linear though right? For the sake of blending? With a texture format something like DXGI_FORMAT_R16G16B16A16_UNORM for precision.

And then finally the backbuffer should be DXGI_FORMAT_R8G8B8A8_UNORM_SRGB, so once all lighting is done, CopyResource() from intermediate to backbuffer should automatically convert it back from linear to non-linear, right?

 

Thanks

The "intermediate" buffer that contains all the output from lighting calculations should be linear though right? For the sake of blending? With a texture format something like DXGI_FORMAT_R16G16B16A16_UNORM for precision.

 

No. The whole point of deferred rendering/lighting is to delay the lighting calculations until the very last pass and do them in the pixel shader. This means that you don't store the lighting calculations anywhere. Instead, you have multiple "g-buffers" - one of those is the diffuse g-buffer. What the article you linked to seems to be talking about in the paragraphs you quoted is this diffuse g-buffer.

 

I don't remember exactly what the meaning of the _SRGB formats is in D3D, but I believe to achieve what the article says, both your diffuse g-buffer and also all of your source diffuse textures should have a non-_SRGB format, to avoid an SRGB->linear transformation, and preserve the color data from the textures in the g-buffer.

 

Then, when reading from the diffuse g-buffer to do the lighting calculations, it should have an _SRGB format, so the GPU does the SRGB->linear transformation, which is needed for lighting.

 

So basically, you are delaying the SRGB->linear transformation of diffuse data till the very last pass of the deferred lighting.

 

EDIT: After thinking about it a bit, I don't think it is possible to do this in D3D, because you can't change the color format of the g-buffer texture in D3D. You would have to create a new g-buffer texture with the _SRGB format and then copy the old g-buffer into it. But I also do not see any benefit to doing this in D3D - all textures are floating point, so there is no precision loss. I believe that article is intended for OpenGL, where the intermediate g-buffer textures can (or always do) have an integer format instead of floating point; in that case, converting the linear diffuse colors to integers would cause the type of precision loss shown in the image you posted.

Edited by tonemgub



So basically, you are delaying the SRGB->linear transformation of diffuse data till the very last pass of the deferred lighting.
…which is the pass that writes into the 16-bit-per-channel lighting buffer.

No. The whole point of deferred rendering/lighting is to delay the lighting calculations until the very last pass, and do the calculations in the pixel shader. This means that you don't store the lighting calculations anywhere.

 

Not sure I follow this one - isn't the whole point to be able to run multiple passes? For example, run several point light passes, an ambient pass and maybe a directional light pass in sequence, and accumulate the output color? Isn't that why you need to work in linear space, so you can additively blend the output from the various lighting passes? And then you need a texture in linear space, which only at the end, after all lighting is done, gets converted to sRGB when copied into the backbuffer?

 

I don't remember exactly what the meaning of the _SRGB formats is in D3D, but I believe to achieve what the article says, both your diffuse g-buffer and also all of your source diffuse textures should have a non-_SRGB format, to avoid an SRGB->linear transformation, and preserve the color data from the textures in the g-buffer.

 

Oh, that makes sense - having the source diffuse textures _SRGB and the g-buffer RTV _SRGB is just unnecessary, since you just sample the diffuse texture and output to the g-buffer, going sRGB->linear and then linear->sRGB again for no reason.

 

 

Then, when reading from the diffuse g-buffer to do the lighting calculations, it should have an _SRGB format, so the GPU does the SRGB->linear transformation, which is needed for lighting.

 

So only the diffuse g-buffer's SRV should be _SRGB, while the actual texture and its RTV (per my first paragraph) should be non-sRGB?

 

Thanks

Edited by KaiserJohan


 

 

If you assume that linear->sRGB and sRGB->linear conversions are done by dedicated circuits, and are therefore free, then it's not something to worry about.

Either:
Mark source texture views, g-buffer render-target, g-buffer texture view, and backbuffer render-target as sRGB:
(Textures)-- sRGB->Linear --[To GBuffer Shader]-- Linear->sRGB --(GBuffer)-- sRGB->Linear --[Lighting shader]-- no change --(Lighting buffer)-- no change --[Tonemap]-- Linear->sRGB --(Backbuffer)

Or:
Mark source texture views and g-buffer render-target as linear (even though they're not!), and mark the g-buffer texture view and backbuffer render-target as sRGB:
(Textures)-- no change --[To GBuffer Shader]-- no change --(GBuffer)-- sRGB->Linear --[Lighting shader]-- no change --(Lighting buffer)-- no change --[Tonemap]-- Linear->sRGB --(Backbuffer)

 

Awesome, that was exactly what I was looking for.

 

As long as it's guaranteed to be a free operation (maybe only older or cheaper cards lack this feature?), the first option seems clearer / less deceptive.
