
Render depth to texture issue


I'm implementing a deferred shader and am running into issues reconstructing the 3D view-space position of a fragment from the depth buffer and the screen-space position. If I attach a depth renderbuffer for depth testing and write the depth values myself to a colour attachment, everything works as expected, like so:

[source]
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthBuffer);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, w, h);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depthBuffer);

...

glGenTextures(1, &texDepth);
glBindTexture(GL_TEXTURE_2D, texDepth);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, w, h, 0, GL_RGB, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT2_EXT, GL_TEXTURE_2D, texDepth, 0);

GLenum buffers[] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT, GL_COLOR_ATTACHMENT2_EXT};
glDrawBuffers(3, buffers);

/////////////////////////////////////////////////////////////////////////////////////////////////////////

// G Buffer fragment shader
gl_FragData[0] = vec4(oDiffuse, 1.0);
gl_FragData[1] = vec4(0.5 * (normalize(oNormal) + 1.0), 1.0);
gl_FragData[2] = vec4(oDepth.x / oDepth.y, 1.0, 1.0, 1.0);

/////////////////////////////////////////////////////////////////////////////////////////////////////////

// Lighting fragment shader
float Depth = texture2D(iTexDepth, oTexCoords).r;
vec4 Position = vec4(oTexCoords.x * 2.0 - 1.0, oTexCoords.y * 2.0 - 1.0, Depth, 1.0);

Position = mtxInvProj * Position;
Position /= Position.w;
[/source]
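
For context, oDepth in the fragment shader above isn't shown being set; it's presumably the clip-space z and w passed down from the vertex shader. A minimal sketch of that vertex shader, under that assumption (variable names are illustrative, not from the original post):

[source]
// Vertex shader sketch: pass clip-space z and w so the fragment shader
// can write z/w (NDC depth) into the depth colour target.
varying vec2 oDepth;    // assumed to be vec2(clip.z, clip.w)

void main()
{
    vec4 clipPos = gl_ModelViewProjectionMatrix * gl_Vertex;
    oDepth = clipPos.zw;          // z and w before the perspective divide
    gl_Position = clipPos;
}
[/source]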

However, when I try to use GL_DEPTH_ATTACHMENT_EXT as the attachment point (see the listing below), I get incorrect results: lighting changes with camera position, triangles facing away from the light source are lit, and so on. When I display just the depth buffer, data is being written, but the values seem much more "bunched together" than when using GL_COLOR_ATTACHMENT2_EXT. For example, if I move the camera towards the mesh with the latter, the mesh's depth values "pop" into view much more gradually than with the former, so I figured that the incorrect result when reconstructing the view-space position for a given fragment is throwing off my point lighting. Any ideas?

[source]

glGenTextures(1, &texDepth);
glBindTexture(GL_TEXTURE_2D, texDepth);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, w, h, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
glTexParameteri (GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE, GL_LUMINANCE);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, texDepth, 0);

GLenum buffers[] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT};
glDrawBuffers(2, buffers);

/////////////////////////////////////////////////////////////////////////////////////////////////////////

// G Buffer fragment shader
gl_FragData[0] = vec4(oDiffuse, 1.0);
gl_FragData[1] = vec4(0.5 * (normalize(oNormal) + 1.0), 1.0);

/////////////////////////////////////////////////////////////////////////////////////////////////////////

// Lighting fragment shader
float Depth = texture2D(iTexDepth, oTexCoords).r;
vec4 Position = vec4(oTexCoords.x * 2.0 - 1.0, oTexCoords.y * 2.0 - 1.0, Depth, 1.0);

Position = mtxInvProj * Position;
Position /= Position.w;
[/source]


[quote]However, when I try to use GL_DEPTH_ATTACHMENT_EXT as the attachment point, I get incorrect results: lighting changes with camera position, triangles facing away from the light source are lit, and so on. When I display just the depth buffer, the values seem much more "bunched together" than when using GL_COLOR_ATTACHMENT2_EXT.[/quote]

My first guess would be that you write linear depth when using the colour attachment, whereas OpenGL writes non-linear depth to the depth attachment. I solved this problem by using both: the depth attachment for z-testing, and a single colour attachment storing depth (two half-floats) and the normal (two half-floats, compressed).
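
For illustration, a setup along those lines might look roughly like the sketch below. The texture name, the RGBA16F format, and the packing (depth in RG, encoded normal in BA) are my assumptions, not the poster's actual code:

[source]
// Sketch: depth renderbuffer used purely for z-testing, plus one half-float
// colour attachment holding linear depth (packed into RG) and a
// two-component encoded normal (BA). GL_RGBA16F_ARB comes from ARB_texture_float.
GLuint texDepthNormal;                        // hypothetical texture name
glGenTextures(1, &texDepthNormal);
glBindTexture(GL_TEXTURE_2D, texDepthNormal);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, w, h, 0, GL_RGBA, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT1_EXT, GL_TEXTURE_2D, texDepthNormal, 0);

// The depth attachment is never sampled; it only provides depth testing.
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthBuffer);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, w, h);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depthBuffer);
[/source]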

Yeah, I had a suspicion it might be a non-linear depth issue. I wanted to avoid having an extra buffer just to write linear depth, and instead read directly from the depth buffer used for z-testing. Is there any way to get a linear depth value from a non-linear one?
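
One common way to do this with a standard perspective projection is to invert the projection's depth mapping in the shader. A minimal sketch, assuming uniforms uNear and uFar (names introduced here) hold the near and far plane distances:

[source]
// Sketch: recover a linear view-space depth from a hardware depth value in [0,1].
uniform float uNear;   // assumed uniform: near plane distance
uniform float uFar;    // assumed uniform: far plane distance

float LinearizeDepth(float depth)          // depth sampled from the depth texture
{
    float zNdc = depth * 2.0 - 1.0;        // window [0,1] -> NDC [-1,1]
    return (2.0 * uNear * uFar) / (uFar + uNear - zNdc * (uFar - uNear));
}
[/source]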

I fixed the issue. For those who may stumble across this thread in the future with the same problem, the key is this passage in the spec:

[quote]After clipping and division by w, depth coordinates range from -1 to 1, corresponding to the near and far clipping planes. glDepthRange specifies a linear mapping of the normalized depth coordinates in this range to window depth coordinates. Regardless of the actual depth buffer implementation, window coordinate depth values are treated as though they range from 0 through 1 (like color components). Thus, the values accepted by glDepthRange are both clamped to this range before they are accepted.[/quote]

As the values in the depth buffer are stored in the [0,1] range, they must be converted to the [-1,1] range before the unprojection, like so: Depth = 2.0 * Depth - 1.0;
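
Putting it together, the corrected lighting-pass reconstruction looks roughly like this (a sketch based on the listings above, reusing the same variable names):

[source]
// Lighting fragment shader (sketch): remap window-space depth from [0,1]
// to NDC [-1,1] before unprojecting with the inverse projection matrix.
float Depth = texture2D(iTexDepth, oTexCoords).r;
Depth = Depth * 2.0 - 1.0;

vec4 Position = vec4(oTexCoords * 2.0 - 1.0, Depth, 1.0);
Position = mtxInvProj * Position;
Position /= Position.w;   // view-space position of the fragment
[/source]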
