# Render Buffer

## Recommended Posts

Hi all, I am currently trying to render a scene into a 16-bit depth buffer and read the result back into main memory using glReadPixels, with and without GL_PIXEL_PACK_BUFFER_EXT. But somehow the readback does not work. Here is how I am initializing the buffers:
```cpp
mInternalFormat = GL_DEPTH_COMPONENT16;
mFormat         = GL_DEPTH_COMPONENT;
mType           = GL_UNSIGNED_SHORT;

if (!mBuffersInitialized) {
    glGenTextures(1, &mDBOTexture);
    glGenTextures(1, &mFBOTexture);
    glGenFramebuffersEXT(1, &mFBO);
    glGenRenderbuffersEXT(1, &mDBO);
}

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, mFBO);

// initialize color texture
glBindTexture(target, mFBOTexture);
glTexParameteri(target, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(target, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(target, 0, GL_RGBA, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);

// initialize depth renderbuffer
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, mDBO);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, mInternalFormat, mWidth, mHeight);

// initialize depth texture
glBindTexture(target, mDBOTexture);
glTexParameteri(target, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(target, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(target, 0, mInternalFormat, mWidth, mHeight, 0, mFormat, mType, NULL);

// attach color texture and depth renderbuffer to the FBO
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, target, mFBOTexture, 0);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, mDBO);

glDrawBuffer(GL_NONE); // no color buffer dest
glReadBuffer(GL_NONE); // no color buffer src

checkDepthbufferStatus();

if (!mBuffersInitialized) {
    glGenBuffersARB(N_MAX_BUFFERS, mPixelBuffer);
    mBuffersInitialized = true;
}

// allocate storage for each pixel-pack buffer so glReadPixels
// has somewhere to write its data
for (unsigned int iBuffer = 0; iBuffer < N_MAX_BUFFERS; ++iBuffer) {
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, mPixelBuffer[iBuffer]);
    glBufferDataARB(GL_PIXEL_PACK_BUFFER_EXT,
                    mWidth * mHeight * sizeof(GLushort),
                    NULL, GL_STREAM_READ_ARB);
}

glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, 0);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
```


This is how I am trying to read back:

```cpp
glReadPixels(0, 0, mWidth, mHeight, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, buffer);
```


It would be nice if somebody could help me. Cheers, Pascal

##### Share on other sites
How do you know it isn't working? What are the "symptoms"?

Also, if you call glGetError() at key points in your code, you can identify any errors that occur while you are setting up your buffers and whatnot.

For example, you may want to check for errors after you generate the buffers, after you bind them, after you call glFramebufferTexture2DEXT(), etc., to determine where the error happens.

##### Share on other sites
I assume you are working on an ATI card...

Everything on ATI cards that has anything to do with depth textures is totally buggy, so use a shader to render the depth into an FBO; that is the only approach that works on every ATI card.

See my last thread for reference:

http://www.gamedev.net/community/forums/topic.asp?topic_id=446689

##### Share on other sites
Thank you for the quick response.

I checked for GL errors after almost every line, and besides the error after:

```cpp
//  glReadBuffer(GL_FRONT); // glError 1280 (GL_INVALID_ENUM)
```

which I don't use anymore (though it probably should be there), everything works fine. I also use this code:
```cpp
GLenum status = (GLenum) glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
switch (status) {
case GL_FRAMEBUFFER_COMPLETE_EXT:
    break;
case GL_FRAMEBUFFER_UNSUPPORTED_EXT:
    cout << "Unsupported framebuffer format\n";
    break;
case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT:
    cout << "Framebuffer incomplete, missing attachment\n";
    break;
case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT:
    cout << "Framebuffer incomplete, attached images must have same dimensions\n";
    break;
case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT:
    cout << "Framebuffer incomplete, attached images must have same format\n";
    break;
case GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT:
    cout << "Framebuffer incomplete, missing draw buffer\n";
    break;
case GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT:
    cout << "Framebuffer incomplete, missing read buffer\n";
    break;
default:
    cout << "Framebuffer unknown error!\n";
    //assert(0);
}
return status == GL_FRAMEBUFFER_COMPLETE_EXT;
```

to check the framebuffer, and again everything looks fine. There are two reasons why I assume the readback is the problem:
- The rendering takes some time...?!
- The buffer I want to write the data into remains unchanged, even when it already contains data.

My hardware is a GeForce FX 5900 Ultra. The driver version is 8776, running on Kubuntu.
What I am trying to do is modify two 16-bit images on the hardware and read the result back. First I used float buffers, but unfortunately I had to use 32 bit, and it is only possible to read back 4 channels at once, which is terribly slow. So my idea was to render single images into a 16-bit depth buffer and read them back. If anybody has a better idea of how to render the images fast without losing information, I would be glad. Oh, and if the interpolation worked (outside the shader), it would be great.

Pascal
