Incomplete render buffer for depth texture

2 comments, last by tseval 14 years, 7 months ago
Hi folks,

I have a little problem with a depth texture FBO. I want to render to a depth buffer (for shadow map rendering) and use the following code to initialize the FBO:

glGenTextures(1, &depth_tex_);
glBindTexture(GL_TEXTURE_2D, depth_tex_);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
             depth_size_ * num_splits_, depth_size_, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, (GLvoid*)NULL);

// Set up FBO with the depth texture as target.
glGenFramebuffersEXT(1, &depth_fb_);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, depth_fb_);

// Attach texture to framebuffer depth buffer
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depth_tex_, 0);

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);

This code seems to work nicely, since I get my shadows and everything. However, glCheckFramebufferStatusEXT returns GL_FRAMEBUFFER_UNSUPPORTED_EXT on my Linux computers with NVIDIA 8700 and 7950 graphics cards, while on a MacBook Pro with an NVIDIA 9600 it returns GL_FRAMEBUFFER_COMPLETE_EXT. As I said, the code works even though the status check reports an error condition on Linux, but I would like to know what is going on here. Can anyone see what I have done wrong, or is this a driver problem or something?

Cheers
http://www.opengl.org/wiki/Common_Mistakes#Creating_a_Texture
Sig: http://glhlib.sourceforge.net
An open source GLU replacement library, much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
What V-man wanted to say is that you have to set the texture filtering in any case; see the sketch below. Another thing that can go wrong on some graphics cards is the GL_DEPTH_COMPONENT24 format: some cards only support a combined depth and stencil format, which in most cases is GL_DEPTH24_STENCIL8_EXT.
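Here is a minimal sketch of what that fix might look like, reusing the depth_tex_, depth_size_ and num_splits_ names from the original post; the GL_NEAREST/GL_CLAMP_TO_EDGE values are one reasonable choice for a shadow map, not the only one:

glGenTextures(1, &depth_tex_);
glBindTexture(GL_TEXTURE_2D, depth_tex_);
// The default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR; with only
// mip level 0 allocated the texture is incomplete, and some drivers then
// report GL_FRAMEBUFFER_UNSUPPORTED_EXT for the attachment.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
             depth_size_ * num_splits_, depth_size_, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);

// If plain GL_DEPTH_COMPONENT24 turns out to be unsupported, a combined
// depth/stencil image can be tried instead (requires EXT_packed_depth_stencil):
// glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8_EXT,
//              depth_size_ * num_splits_, depth_size_, 0,
//              GL_DEPTH_STENCIL_EXT, GL_UNSIGNED_INT_24_8_EXT, NULL);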

Did you try calling glGetError after the texture creation and image upload calls? Something like the snippet below would do.
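A small hypothetical helper for that kind of check; the checkGLError name and the message format are made up for illustration:

#include <cstdio>

static void checkGLError(const char* where)
{
    // glGetError returns one queued error flag per call, so drain them all.
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
}

// Usage after each suspect call:
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
             depth_size_ * num_splits_, depth_size_, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);
checkGLError("glTexImage2D");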
------------------------------------
I always enjoy being rated up by you ...
Great! That solved it, thanks a lot :-D

This topic is closed to new replies.
