
FrameBuffer mixing with BackBuffer


#1 maxest (Members, Reputation: 294) - Posted 12 August 2012 - 07:24 AM

I read here (http://stackoverflow.com/questions/5279123/normal-back-buffer-render-to-depth-texture-in-opengl-fbos) that it is impossible to mix the backbuffer with a custom GL framebuffer object. I ran into this problem today, but it appears that some sort of mixing does work.

I have a few passes in my engine. The first one is early-z, followed by a pass that writes out normals and depths. Early-z renders to the backbuffer, but the normal-depth pass switches the render target (*). What is interesting is that the normal-depth pass uses the early-z results to shade only the pixels that are visible, but in theory that should not work: it is the backbuffer's z-buffer that was filled, whereas at that point I have a new framebuffer object bound, which has no z-buffer renderbuffer or texture attached at all.

On the other hand, if I skip the early-z pass and leave only the normal-depth pass that sets the render target, further rendering gets corrupted (note that in this case, since there is no early-z, the normal-depth pass has to write z values itself). So it appears that reading from the backbuffer's z-buffer is possible, but writing simultaneously to a framebuffer color attachment and to the backbuffer's z-buffer is not. Is that correct?

I am aware that my description might be a bit cryptic, so if necessary I can describe it in more detail and post more code.

(*) the function looks like this:
void CRenderer::setRenderTarget(const CRenderTarget *renderTarget)
{
    if (renderTarget == NULL)
    {
        // render to the window's backbuffer
        glViewport(0, 0, CApplication::getScreenWidth(), CApplication::getScreenHeight());
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    }
    else
    {
        // render to the off-screen FBO, with the target's texture as color attachment 0
        glViewport(0, 0, renderTarget->width, renderTarget->height);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, offScreenFramebuffer);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, renderTarget->texture, 0);
    }
}
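A portable way to get the behavior described above, instead of relying on the window's depth buffer carrying over, is to give the FBO its own depth attachment and render the early-z pass into the FBO as well. A minimal sketch using the same EXT entry points as the code above (offScreenFramebuffer, width, and height are assumed from the surrounding code):

```c
/* Create a depth renderbuffer and attach it to the off-screen FBO,
 * so depth written by the early-z pass is available to the depth
 * test when the normal-depth pass renders into the same FBO. */
GLuint depthRb;
glGenRenderbuffersEXT(1, &depthRb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthRb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, width, height);

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, offScreenFramebuffer);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, depthRb);
```

The renderbuffer's dimensions must match the color attachment's, or the FBO will be incomplete.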


#2 maxest (Members, Reputation: 294) - Posted 22 August 2012 - 01:25 PM

Bump. Anyone?

#3 V-man (Members, Reputation: 805) - Posted 22 August 2012 - 01:37 PM

So you are saying that you are rendering to a texture (FBO) that doesn't have a depth buffer, yet it appears to be using the depth buffer from the window? That should not be possible. Even if it works on your machine, there is no guarantee it will work on other hardware.

Also, if you don't have a depth buffer attached to your FBO, you should disable depth testing. http://www.opengl.org/wiki/Framebuffer_Object_Examples#Color_only
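The color-only setup from that wiki page boils down to turning off the depth test and depth writes while the depth-less FBO is bound. A hedged sketch (offScreenFramebuffer is a placeholder name taken from the earlier code):

```c
/* Color-only FBO: there is no depth attachment, so disable both
 * the depth test and depth writes while it is bound to make the
 * intent explicit and avoid depending on driver behavior. */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, offScreenFramebuffer);
glDisable(GL_DEPTH_TEST);
glDepthMask(GL_FALSE);

/* ... draw color-only passes here ... */

/* Restore state before returning to the window's framebuffer,
 * which does have a depth buffer. */
glDepthMask(GL_TRUE);
glEnable(GL_DEPTH_TEST);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
```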
Sig: http://glhlib.sourceforge.net - an open source GLU replacement library, much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);



