subi211

OpenGL Copying the depth buffer to a texture for use in post-processing


When I caught myself about to hurl my laptop across the room in frustration, I knew it was time to ask someone. It seems pathetic, given that this is such a fundamental thing that should just work, but I've been googling for two days now, including reading most of this forum going back ten years, and I cannot get this to work.

I'm just trying to copy the depth buffer to a texture that I can pass to a shader. At the moment I have no shaders active; I just render the scene, copy the depth buffer to a texture, then render that texture in 2D. So far, the texture resolutely stays white, unless I don't copy anything to it, in which case it stays grey.

So, here are the relevant highlights of the code (it doesn't do much more than this; it's just a test app). First, the various depth flags:

glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDepthFunc(GL_LEQUAL);
glClearDepth(1.0f);
glEnable(GL_DEPTH); // note: GL_DEPTH is not a valid glEnable cap; this line only raises GL_INVALID_ENUM


Then the depth texture creation:

unsigned int uiDepthTextureID;
glGenTextures(1, &uiDepthTextureID);
glBindTexture(GL_TEXTURE_2D, uiDepthTextureID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 1024, 1024, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);


(I have tried various values other than GL_DEPTH_COMPONENT, including GL_DEPTH_COMPONENT24 and GL_LUMINANCE; this is just the state of the code as I write this.)

The render (with a zn of 1 and a zf of 10000), g_p_vtxCube being an interleaved array:

glInterleavedArrays(GL_T2F_C4F_N3F_V3F, 0, g_p_vtxCube);
glDrawElements(GL_TRIANGLES, g_uiNumIndices, GL_UNSIGNED_SHORT, g_p_usCube);


The depth copy:

glBindTexture(GL_TEXTURE_2D, uiDepthTextureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 0, 0, 1024, 1024, 0);


And the 2D render:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 1024, 768, 0, -1, 1); // I know, but I prefer the top-left as origin.

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex3f(0, 0, 0);
glTexCoord2f(1, 0); glVertex3f(256, 0, 0);
glTexCoord2f(1, 1); glVertex3f(256, 256, 0);
glTexCoord2f(0, 1); glVertex3f(0, 256, 0);
glEnd();


Now, I have changed every single type, mode, flag and value in there to every permutation I can find on the web, with no success. The texture stays white. It's like the depth values aren't getting saved, but the behaviour of the scene shows that they must be, so it must be the copy. The behaviour is the same on cards old and new, ATI and NVIDIA, and, for that matter, on Windows and Linux.

Can anyone tell me what idiotic thing I'm doing or not doing? I don't care if it makes me feel stupid. In fact, I might register www.howtocopythedepthbuffertoatextureinopengl.com and leave the answer there for future generations, so no-one has to go through this pain ever again. :)

Cheers.

[Edited by - subi211 on December 7, 2010 4:28:18 AM]

I think you need to:

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

before you call glCopyTexImage2D.

Though I would use an FBO so you render the depth directly to a texture (render-to-texture). You'll find more on this if you look into FBOs (framebuffer objects).

Don't forget to set them back afterwards:
glDrawBuffer(GL_BACK);
glReadBuffer(GL_FRONT);

No luck with GL_NONE, I'm afraid.

I'll try with an FBO, I assume you mean because the depth element on those is on a separate texture made with glGenRenderbuffersEXT? And that if I just glBindTexture that texture in the post-process step I can access it like any other in a shader?

You create the depth texture the same way you create a color texture for the color attachments. So something along the lines of...
GLuint depthTexture; // glGenTextures takes a GLuint*
glGenTextures (1, &depthTexture);
glBindTexture (GL_TEXTURE_2D, depthTexture);

glTexImage2D (GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, screenwidth, screenheight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);

glFramebufferTexture2D (GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, depthTexture, 0);

Or the equivalent with the EXT functions.

Okay, got it. For reference, here's the FBO creation code:

// Create a framebuffer object. (Core-profile names used throughout
// for consistency; the original mixed EXT functions with core enums.)
glGenFramebuffers(1, &m_uiFramebufferID);
glBindFramebuffer(GL_FRAMEBUFFER, m_uiFramebufferID);

// Round the size up to the next power of two.
unsigned int uiFBOWidth = 32;
while(uiFBOWidth < max(w, h))
{
    uiFBOWidth *= 2;
}

// Create the colour texture.
glGenTextures(1, &m_uiRenderbufferID);
glBindTexture(GL_TEXTURE_2D, m_uiRenderbufferID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, uiFBOWidth, uiFBOWidth, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, m_uiRenderbufferID, 0);

// Create the depth texture.
glGenTextures(1, &m_uiRenderbufferDepthID);
glBindTexture(GL_TEXTURE_2D, m_uiRenderbufferDepthID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, uiFBOWidth, uiFBOWidth, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_uiRenderbufferDepthID, 0);

// Check everything worked.
GLenum eStatus = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(eStatus != GL_FRAMEBUFFER_COMPLETE)
{
    return false;
}


That gives me the depth information in m_uiRenderbufferDepthID, and when I bind and render that I can see the depth buffer.

As a further question, I know the depth effect is subtle (unless the camera is close) as depth buffers aren't linear. I wasn't expecting it to be THAT subtle, though; in fact, I can't see how there can be enough difference to be useful in post-processing. Is there anything to be done about that?

You can write a custom depth value into a one-channel texture by using a GL_R32F texture format instead, and write to that channel in the shader:

float depth = (-viewPos.z-near)/(far-near);

http://www.gamerendering.com/category/rendering-methods/page/3

I did a quick check:

glBindTexture(GL_TEXTURE_2D, sgl.m_uiRenderbufferDepthID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

float* p_fTexture = (float*)malloc(1024 * 1024 * sizeof(float));
glGetTexImage(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, GL_FLOAT, p_fTexture);

FILE* p_f = fopen("DEPTH.RAW", "wb");
for(int i = 0; i < (1024 * 1024); i++)
{
    // Raise each value to a power to stretch out the range near 1.0.
    // (The original had pow(10.0f, p_fTexture), missing the [i] index.)
    unsigned char c = (unsigned char)(powf(p_fTexture[i], 10.0f) * 255.0f);
    fwrite(&c, 1, 1, p_f);
}
fclose(p_f);
free(p_fTexture);


... and the information IS there, so no worries. I shall adjust my z-values for a better range.

Thanks guys!

[Edited by - subi211 on December 7, 2010 9:29:38 AM]
