Incomplete render buffer for depth texture
Hi folks,
I have a little problem with a depth texture FBO. I want to render to a depth buffer (for shadow map rendering) and use the following code to initialize the FBO.
glGenTextures(1, &depth_tex_);
glBindTexture(GL_TEXTURE_2D, depth_tex_);
// Allocate storage only (NULL data); all splits sit side by side in one 2D texture.
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
             depth_size_*num_splits_, depth_size_, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
// Set up the FBO with the depth texture as its only attachment.
glGenFramebuffersEXT(1, &depth_fb_);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, depth_fb_);
// Attach the texture to the framebuffer's depth attachment point.
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depth_tex_, 0);
// No color attachment, so disable color reads and writes.
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);
GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
This code seems to work nicely, since I get my shadows and everything. However, glCheckFramebufferStatusEXT returns GL_FRAMEBUFFER_UNSUPPORTED_EXT on my Linux machines with NVIDIA 8700 and 7950 graphics cards, while on a MacBook Pro with an NVIDIA 9600 it returns GL_FRAMEBUFFER_COMPLETE_EXT.
As I said, the code works even though the status check reports an error condition on Linux, but I would like to know what is actually going on here...
Can anyone see what I have done wrong here, or is this a driver problem or something?
Cheers
What V-man wanted to say is that you have to set the texture filtering in any case: the default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, and since a depth render target normally has no mipmaps, the texture is incomplete and some drivers then report the whole FBO as unsupported. Another thing that can go wrong on some graphics cards is the GL_DEPTH_COMPONENT24 format. Some cards only support a combined depth and stencil format, which is in most cases GL_DEPTH24_STENCIL8_EXT.
Did you try using glGetError after the texture creation and image set calls?