
Render Buffer


Bugbuster    122
Hi everyone, I am currently trying to render a scene into a 16-bit depth buffer and read the result back into main memory using glReadPixels, both with and without GL_PIXEL_PACK_BUFFER_EXT. But somehow the readback does not work. Here is how I am initializing the buffers:
  mInternalFormat = GL_DEPTH_COMPONENT16;
  mFormat         = GL_DEPTH_COMPONENT;
  mType           = GL_UNSIGNED_SHORT;

  if(!mBuffersInitialized){
    glGenTextures(1,&mDBOTexture);
    glGenTextures(1,&mFBOTexture);
    glGenFramebuffersEXT(1, &mFBO);
    glGenRenderbuffersEXT(1,&mDBO);
  }

  glBindFramebufferEXT (GL_FRAMEBUFFER_EXT, mFBO);
  //initialize texture
  glBindTexture(target,mFBOTexture);
  glTexParameteri(target,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
  glTexParameteri(target,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
  glTexImage2D(target,0,GL_RGBA,512,512,0,GL_RGBA,GL_UNSIGNED_BYTE,0);

  //initialize depth renderbuffer
  glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, mDBO);
  glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, mInternalFormat, mWidth, mHeight);

  glBindTexture(target, mDBOTexture);
  glTexParameteri(target, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
  glTexParameteri(target, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
  glTexImage2D(target, 0, mInternalFormat, mWidth, mHeight, 0, mFormat, mType, NULL);

  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, mFBO);
  glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,GL_COLOR_ATTACHMENT0_EXT,target,mFBOTexture, 0);
  glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, mDBO);

//  glReadBuffer(GL_FRONT); //glError 1280
  glDrawBuffer(GL_NONE); // no color buffer dest
  glReadBuffer(GL_NONE); // no color buffer src

  checkDepthbufferStatus();

  if(!mBuffersInitialized){
    glGenBuffersARB(N_MAX_BUFFERS, mPixelBuffer);
    mBuffersInitialized = true;
  }

  for (unsigned int iBuffer = 0; iBuffer < N_MAX_BUFFERS; ++iBuffer){
      glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, mPixelBuffer[iBuffer]);
      glBufferDataARB(GL_PIXEL_PACK_BUFFER_EXT, width*height*getDepth(), NULL, GL_STATIC_READ);
  }

  glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, 0);
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

This is how I am trying to read it back:
  glReadPixels(0, 0, mWidth, mHeight, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT,buffer);
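
For the GL_PIXEL_PACK_BUFFER_EXT path, here is a minimal sketch of how the readback could look, assuming the pixel buffers were created as in the setup code above (mPixelBuffer[0] is used only as an example):

  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, mFBO);
  glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, mPixelBuffer[0]);

  // with a pack buffer bound, the last argument is a byte offset into the
  // buffer object, not a client memory pointer
  glReadPixels(0, 0, mWidth, mHeight, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, 0);

  // map the buffer to access the data (blocks until the transfer has finished)
  GLushort* depth = (GLushort*)glMapBufferARB(GL_PIXEL_PACK_BUFFER_EXT, GL_READ_ONLY_ARB);
  if(depth){
    // ... process mWidth*mHeight 16-bit depth values ...
    glUnmapBufferARB(GL_PIXEL_PACK_BUFFER_EXT);
  }

  glBindBufferARB(GL_PIXEL_PACK_BUFFER_EXT, 0);
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);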

It would be nice if somebody could help me. Cheers, Pascal

smitty1276    560
How do you know it isn't working? What are the "symptoms"?

Also, if you call glGetError() after some key points in your code, you can identify any errors that occur while you are setting up your buffers and whatnot.

For example, you may want to check for errors after you generate the buffers, after you bind them, and after you call glFramebufferTexture2DEXT(), etc., to determine where the error happens.
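
Something along these lines would do it (the helper name and the usage points are just a suggestion, not code from the original post):

  #include <cstdio>

  // print every pending GL error, tagged with the call site
  static void checkGLError(const char* where)
  {
      GLenum err = glGetError();
      while(err != GL_NO_ERROR){
          fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
          err = glGetError(); // several errors can be queued up
      }
  }

  // usage:
  //   glFramebufferTexture2DEXT(...);
  //   checkGLError("glFramebufferTexture2DEXT");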

Ingrater    187
I assume you are working on an ATI card...

Everything that has anything to do with depth textures is totally buggy on ATI cards, so use a shader to render the depth texture into an FBO; that is the only way that works on every ATI card.

See my last thread for reference:

http://www.gamedev.net/community/forums/topic.asp?topic_id=446689
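
Roughly, the idea is to sample the depth texture in a fragment shader and write the value into a color attachment of a second FBO, then read that color buffer back instead. A minimal sketch of such a pass (this GLSL is an assumption, not the code from the linked thread):

  // hypothetical depth-copy pass; depthTex is the depth texture of the first FBO
  const char* depthCopyFrag =
      "uniform sampler2D depthTex;\n"
      "void main()\n"
      "{\n"
      "    float d = texture2D(depthTex, gl_TexCoord[0].xy).r;\n"
      "    gl_FragColor = vec4(d, d, d, 1.0);\n"
      "}\n";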

Bugbuster    122
Thank you for the quick response.

I checked for GL errors after almost every line, and besides the error after

// glReadBuffer(GL_FRONT); //glError 1280

which I don't use anymore (although it probably should be there), everything works fine. I also use this code:

GLenum status;
status = (GLenum) glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
switch(status)
{
  case GL_FRAMEBUFFER_COMPLETE_EXT:
    break;
  case GL_FRAMEBUFFER_UNSUPPORTED_EXT:
    cout << "Unsupported framebuffer format\n";
    break;
  case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT:
    cout << "Framebuffer incomplete, missing attachment\n";
    break;
  case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT:
    cout << "Framebuffer incomplete, attached images must have same dimensions\n";
    break;
  case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT:
    cout << "Framebuffer incomplete, attached images must have same format\n";
    break;
  case GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT:
    cout << "Framebuffer incomplete, missing draw buffer\n";
    break;
  case GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT:
    cout << "Framebuffer incomplete, missing read buffer\n";
    break;
  default:
    cout << "Framebuffer unknown error!\n";
    //assert(0);
}
return status == GL_FRAMEBUFFER_COMPLETE_EXT;




to check the framebuffer, and again everything looks fine. There are two reasons why I assume the readback is the problem:
- The rendering takes some time...?!
- The buffer I want to write the data into stays unchanged, even when it already contains data.

My hardware is a GeForce FX 5900 Ultra, driver version 8776, running on Kubuntu.
What I am trying to do is modify two 16-bit images on the hardware and read the result back. At first I used float buffers, but unfortunately I had to use 32 bits, and it is only possible to read back 4 channels at once, which is terribly slow (rough numbers below). So my idea was to render the single images into a 16-bit depth buffer and read them back. If anybody has a better idea of how to render the images quickly without losing information, I would be glad to hear it. Oh, and it would be great if the interpolation worked (outside the shader).
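
As a back-of-the-envelope comparison of the two readback paths (assuming 512x512 images, as in the texture setup above):

  const size_t w = 512, h = 512;
  const size_t rgbaFloatBytes = w * h * 4 * sizeof(float);  // 4 channels * 32 bit = 4 MB per image
  const size_t depth16Bytes   = w * h * sizeof(GLushort);   // 1 channel * 16 bit = 512 KB per image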

Pascal
