how to do Vertex texture fetch for depth texture generated by FBO? is it possible?

4 comments, last by V-man 15 years, 6 months ago
The following code generates the depth texture, using an FBO depth attachment:

glGenTextures(1, &depthTextureID);
glBindTexture(GL_TEXTURE_2D, depthTextureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glGenFramebuffersEXT(1, &framebufferID);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, framebufferID);

// Set up some renderbuffer state
glGenRenderbuffersEXT(1, &renderbufferID);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, renderbufferID);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT32, 1024, 1024);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, renderbufferID);

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, 1024, 1024, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, depthTextureID, 0);

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

GLenum fboStatus = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if (fboStatus != GL_FRAMEBUFFER_COMPLETE_EXT)
{
    fprintf(stderr, "FBO Error!\n");
}
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

Today only the GL_LUMINANCE_FLOAT32_ATI or GL_RGBA_FLOAT32_ATI texture internal formats can be used for VTF (is that right?), so how can I use VTF with the generated depth texture? If I use

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, 1024, 1024, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);

then I get "FBO Error". Thanks a lot.
It's not strictly true that only GL_LUMINANCE_FLOAT32_ATI and GL_RGBA_FLOAT32_ATI are supported for VTF; it depends on the driver/card. For one, I know that some non-floating-point formats are supported on current NVIDIA Mac drivers.
The best way is to try out each format and see what happens :)
GL_LUMINANCE_FLOAT32_ATI and GL_RGBA_FLOAT32_ATI are part of the old ATI_texture_float extension. Feel free to move up to GL_ARB_texture_float.
As far as supported formats, it depends on the GPU. I don't know what Gf8 and Gf9 support.
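To make the extension point concrete: the old ATI_texture_float tokens and the newer ARB_texture_float tokens share the same enum values, so moving up to the ARB extension is purely a spelling change. A minimal sketch (token values taken from the two extension specs):

```c
#include <assert.h>

/* Token values from the ATI_texture_float and ARB_texture_float specs:
   the ARB names alias the older ATI ones exactly. */
#define GL_RGBA_FLOAT32_ATI      0x8814
#define GL_RGBA32F_ARB           0x8814
#define GL_LUMINANCE_FLOAT32_ATI 0x8818
#define GL_LUMINANCE32F_ARB      0x8818
```

So a glTexImage2D call written against one extension keeps working unchanged under the other.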
Certain ATI cards will return 0 for
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &MaxVertexTextureImageUnits);

You can use GL_DEPTH_COMPONENT32 and it will work, except that the vertex shader will run in software mode.
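A sketch of how that query is typically turned into a path decision at startup (the helper name use_vtf_path is made up for this example, it is not a GL function):

```c
#include <assert.h>

/* maxVertexTextureImageUnits is the value you got back from
 *   glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &maxUnits);
 * once a GL context is current. */
int use_vtf_path(int maxVertexTextureImageUnits)
{
    /* 0 units means no vertex texture fetch at all (the ATI case above);
       a positive count means VTF exists, though sampling an unsupported
       internal format can still drop the vertex shader to software mode. */
    return maxVertexTextureImageUnits > 0;
}
```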
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Thanks, V-man.

I use a GeForce 7300 card:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, 1024, 1024, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);

Now the depth texture's internal format is GL_DEPTH_COMPONENT32. Because VTF needs GL_RGBA32F_ARB or GL_LUMINANCE32F_ARB, the depth texture could not be used for VTF. Is that right?

Is the conclusion that a depth texture cannot be used for VTF correct or not?
If this returns a non-zero value
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &MaxVertexTextureImageUnits);
then you can use vertex texture fetch. That's all that GL tells you.

If the GPU supports GL_DEPTH_COMPONENT32 sampling from your VS, then it is ok.
If not, it will run in software mode.
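For reference, a minimal GLSL vertex shader doing the actual fetch might look like the sketch below (the sampler and uniform names are made up for the example). In a vertex shader there is no derivative information, so the explicit-LOD function texture2DLod must be used; with the default GL_LUMINANCE depth texture mode, the depth value comes back in the red channel.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical vertex shader: displaces each vertex along Y by the
   depth value fetched from the FBO depth texture. */
static const char *depth_vtf_vs =
    "uniform sampler2D depthTex;   /* the FBO depth texture */\n"
    "uniform float heightScale;\n"
    "void main()\n"
    "{\n"
    "    float d = texture2DLod(depthTex, gl_MultiTexCoord0.st, 0.0).r;\n"
    "    vec4 displaced = gl_Vertex + vec4(0.0, d * heightScale, 0.0, 0.0);\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * displaced;\n"
    "}\n";
```

If the query above returned a positive unit count and the format is hardware-supported, this runs on the GPU; otherwise the driver silently falls back to software.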

This topic is closed to new replies.
