_Flame_

OpenGL GL_FRAMEBUFFER_SRGB and framebuffer


Hello.

I don't quite understand how the combination of GL_FRAMEBUFFER_SRGB and a framebuffer object works.

Without an additional framebuffer it is straightforward: linear color values are converted to sRGB on output.

But when I use an additional framebuffer, things get a bit weird.

1) If I attach a GL_RGB image format to the framebuffer, then glEnable(GL_FRAMEBUFFER_SRGB) ignores the framebuffer and affects only the screen render.

2) If I use a GL_SRGB framebuffer image with glDisable(GL_FRAMEBUFFER_SRGB) (disabled for both the framebuffer and the screen), then the screen is black.

3) If I use GL_SRGB with glEnable(GL_FRAMEBUFFER_SRGB) only for the framebuffer pass, then the final screen image looks like it has no gamma correction.

4) If I use GL_SRGB with glEnable(GL_FRAMEBUFFER_SRGB) for both the framebuffer pass and the screen render, then the final screen image has gamma correction.

Point 3 is probably because OpenGL sees that the framebuffer image is sRGB and converts it back to linear space when it is read, so the resulting image stays in the original linear space.

Point 4 is similar to point 3, but there is a final conversion to sRGB, so the resulting image ends up in the correct sRGB space.

What about point 2? Is it because the values are written as linear but then decoded as if they were sRGB, so the image becomes even darker?

And what about point 1? Why is the framebuffer not affected by glEnable(GL_FRAMEBUFFER_SRGB) in that case?
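For reference, a minimal sketch of the setup I am describing (context creation and shaders omitted; width and height are placeholders):

GLuint fboTex = 0, fbo = 0;
glGenTextures(1, &fboTex);
glBindTexture(GL_TEXTURE_2D, fboTex);
// Case 1 uses GL_RGB here; cases 2-4 use an sRGB format (GL_SRGB8 as the sized form).
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, fboTex, 0);

// Toggled per pass in the cases above.
glEnable(GL_FRAMEBUFFER_SRGB);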

Green_Baron
Point 1: probably because the texture attached to the framebuffer does not have the correct format.

"If [GL_FRAMEBUFFER_SRGB is] enabled and the value of GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the framebuffer attachment corresponding to the destination buffer is GL_SRGB, the R, G, and B destination color values (after conversion from fixed-point to floating-point) are considered to be encoded for the sRGB color space and hence are linearized prior to their use in blending."

from: https://www.khronos.org/registry/OpenGL-Refpages/gl4/
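You can query what the driver actually reports for the attachment. A quick sketch (fbo and the attachment point assumed from your setup; printf from <cstdio>):

GLint encoding = 0;
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                      GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
                                      &encoding);
// Reports GL_SRGB only if the attachment has an sRGB internal format;
// otherwise GL_LINEAR, and the encode-on-write is skipped for that framebuffer.
printf("attachment encoding: %s\n", encoding == GL_SRGB ? "GL_SRGB" : "GL_LINEAR");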

Green_Baron

Point 2: Should work, but no conversion should take place.

How do you assemble the framebuffer? Do you check it for completeness? You could clear it to a color, just to be sure that it is used even when no rendering takes place ...
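Something like this (a sketch, assuming your FBO is bound as the current framebuffer):

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
    fprintf(stderr, "framebuffer incomplete: 0x%x\n", status);
// Clear to a loud color so you can see whether the buffer is used at all.
glClearColor(1.0f, 0.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);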

What do you mean by "only for framebuffer" and "final screen image"? Do you use a windowing API? Do you render to your own framebuffer object and then blit to the default framebuffer? Or do you render independently to your own framebuffer (with sRGB enabled) and then to the default framebuffer (I mean the one that is displayed on screen, depending on your windowing API)? That would explain 3 and 4, if you disable sRGB or if it is not enabled for the default framebuffer.

Do you blit the image from the sRGB-enabled framebuffer to the default framebuffer (the one displayed on screen), or are the two used for different purposes?

I have observed that things work well on my GTX 970, but not on the notebook that has an old Intel HD 4000. There I get an error when checking the framebuffer for completeness. I have not checked deeper into what causes the error ... maybe missing support or so.

_Flame_
2 hours ago, Green_Baron said:

Point 1: probably because the texture attached to the framebuffer does not have the correct format. [...]

Yes, I have seen this, but it is a bit strange behaviour. The default (screen) framebuffer doesn't have such a limitation, so in my opinion it would make sense for the texture framebuffer to behave the same way.

50 minutes ago, Green_Baron said:

Point 2: Should work, but no conversion should take place. How do you assemble the framebuffer? Do you check it for completeness? [...]

Point 2 probably works, but I assume it converts the colors to linear space, which makes them very dark, since the original colors are in linear space already. "Only for framebuffer" means I enable GL_FRAMEBUFFER_SRGB only for the texture framebuffer pass. I use GLFW. I do not blit. I render in two passes: first I draw to a texture framebuffer, and then I draw to the default framebuffer using a quad textured with the result of the first pass.
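Roughly like this (a sketch; fbo/fboTex are from my setup, and drawScene()/drawScreenQuad() stand in for my own code):

// Pass 1: render the scene into the texture framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glEnable(GL_FRAMEBUFFER_SRGB);  // encode on write only if the attachment is sRGB
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
drawScene();

// Pass 2: draw a fullscreen quad sampling that texture into the default framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
// Left enabled here -> case 4 (gamma-correct output); disabled -> case 3.
glClear(GL_COLOR_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, fboTex);  // a GL_SRGB texture is linearized when sampled
drawScreenQuad();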

Green_Baron

Have you tried setting the window hint for the default framebuffer? Something like GLFW_SRGB_CAPABLE?

Another suggestion: when you are done rendering, blit your renderbuffer over to the default framebuffer for display, if your setup allows. Then you only have to care about the color conversion in your own buffer.
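Sketches of both suggestions (width/height and fbo as in your setup):

// Before creating the window: request an sRGB-capable default framebuffer.
glfwWindowHint(GLFW_SRGB_CAPABLE, GLFW_TRUE);

// After rendering into the FBO: blit it to the default framebuffer.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);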

_Flame_
8 hours ago, Green_Baron said:

Have you tried setting the window hint for the default framebuffer? Something like GLFW_SRGB_CAPABLE? [...]

Why? It works fine; I just want to understand why it works this way.
