stramit

OpenGL Cross Platform FBO depth buffer issue


Hi. As an experiment I'm trying to write a simple deferred shader in LWJGL. Everything was going quite well until I decided to make an attempt at early z-culling. The result works perfectly on Windows, but not on OS X (ignore the normals channel as it's not finished yet). The screenshots are split into quadrants: normals | colour on the top row, depth | blank on the bottom. Windows: windows split. Apple: apple split.

I do two passes at present to split my scene. The first renders the depth component, the second renders the colour and the normals. To avoid having to use pbuffers and the hackery associated with the NV render-to-depth-texture functionality, I settled on FBOs, as I can create a depth texture and just bind that to the screen for my lighting passes. As can be seen from the pictures, something is not quite right in the Apple rendering. If anyone can offer me any advice I would be very happy to listen.

Now some code. Setting up the FBO:
    // Generate and bind the FBO
    IntBuffer intBuffer = BufferUtils.createIntBuffer( 1 );
    EXTFramebufferObject.glGenFramebuffersEXT( intBuffer );
    myFBOId = intBuffer.get();

    EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, myFBOId );

    // Attach the existing colour texture to colour attachment 0
    EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT, textureType, getTextureID( "fbo0" ), 0 );

    // Create the depth texture
    int textureID = TextureLoader.createTextureID();
    t = new Texture( textureID, width, height, textureType );

    GL11.glBindTexture( textureType, textureID );
    GL11.glTexParameteri( textureType, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR );
    GL11.glTexParameteri( textureType, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR );

    // Allocate storage for a 24-bit depth texture (no initial data)
    GL11.glTexImage2D( textureType, 0, GL14.GL_DEPTH_COMPONENT24, width, height, 0,
        GL11.GL_DEPTH_COMPONENT, GL11.GL_FLOAT, (ByteBuffer)null );

    // Attach the depth texture to the FBO's depth attachment point
    EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_DEPTH_ATTACHMENT_EXT, textureType, textureID, 0 );

    fboErrorCheck();
    EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0 );
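
fboErrorCheck() is a helper that isn't posted here; a minimal sketch of what such a completeness check might look like, assuming it just queries glCheckFramebufferStatusEXT:

    // Hypothetical sketch of an FBO completeness check (the actual
    // fboErrorCheck() implementation was not posted in this thread).
    private void fboErrorCheck()
    {
        int status = EXTFramebufferObject.glCheckFramebufferStatusEXT(
            EXTFramebufferObject.GL_FRAMEBUFFER_EXT );

        if( status != EXTFramebufferObject.GL_FRAMEBUFFER_COMPLETE_EXT )
        {
            // Any other status (incomplete attachment, unsupported format combination, ...)
            // means the current attachments cannot be rendered to.
            throw new RuntimeException( "FBO incomplete, status: " + status );
        }
    }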




The setup is quite standard, I believe, with a 24-bit depth component. Rendering the depth pass:
    fbo1.activate();

    GL11.glViewport( 0, 0, width, height );
    GL11.glMatrixMode( GL11.GL_PROJECTION );
    GL11.glLoadIdentity();
    GLU.gluPerspective( 45.0f, aspectRatio, 1.0f, 7000.0f );
    GL11.glMatrixMode( GL11.GL_MODELVIEW );
    GL11.glLoadIdentity();

    // OpenGL render states configuration
    GL11.glCullFace( GL11.GL_BACK );
    GL11.glEnable( GL11.GL_CULL_FACE );
    GL11.glDisable( GL11.GL_BLEND );
    GL11.glDepthFunc( GL11.GL_LEQUAL );
    GL11.glEnable( GL11.GL_DEPTH_TEST );
    GL11.glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );
    GL11.glClearDepth( 1.0 );
    GL11.glColorMask( true, true, true, true );
    GL11.glDepthMask( true );
    GL11.glClear( GL11.GL_DEPTH_BUFFER_BIT | GL11.GL_COLOR_BUFFER_BIT );

    GL11.glColorMask( false, false, false, false );
    GL11.glDepthMask( true );
    GL11.glDisable( GL11.GL_LIGHTING );

    GameManager.getGameManager().getCameraManager().getCamera( "main").lookAt();

    scene.draw();

    GL11.glColorMask( true, true, true, true );
    GL11.glDepthMask( false );
    fbo1.deactivate();




Rendering the colour / normals pass:
    fbo1.activate();
    deferredSplitShader.activate();
    // Note: the IntBuffer has 3 slots but only 2 are filled, so the third
    // draw buffer entry is left as 0 (GL_NONE)
    IntBuffer temp = BufferUtils.createIntBuffer( 3 );
    temp.put( 0, EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT );
    temp.put( 1, EXTFramebufferObject.GL_COLOR_ATTACHMENT1_EXT );
    temp.rewind();
    GL20.glDrawBuffers( temp );

    GL11.glPushAttrib( GL11.GL_VIEWPORT_BIT );
    GL11.glViewport( 0, 0, width, height );

    GL11.glLoadIdentity();

    GL11.glPushMatrix();

    //configure camera
    GameManager.getGameManager().getCameraManager().getCamera( "main").lookAt();

    scene.draw();

    GL11.glPopMatrix();
    GL11.glPopAttrib();
    fbo1.deactivate();

    deferredSplitShader.deactivate();




This code executes on both machines; the problem is that it just doesn't look right on the Apple. Any help would be much appreciated.

Further research has shown me that if I turn down the depth texture precision:

GL11.glTexImage2D( textureType, 0, GL14.GL_DEPTH_COMPONENT16, width, height, 0, GL11.GL_DEPTH_COMPONENT, GL11.GL_FLOAT, (ByteBuffer)null );

using GL14.GL_DEPTH_COMPONENT16 instead of GL_DEPTH_COMPONENT24, the problem is not as pronounced (although it is still present).

So I'm pretty sure this is a depth buffer issue; I just have no idea how to solve it.
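
One thing worth checking (a sketch, not something from the original thread) is how many depth bits the driver actually allocated for the texture, since it is free to substitute a different internal precision than the one requested:

    // Hypothetical check: query the real depth size of the bound depth texture.
    // Assumes the depth texture target is GL_TEXTURE_2D and textureID is the
    // ID created in the setup code above.
    IntBuffer depthBits = BufferUtils.createIntBuffer( 16 );
    GL11.glBindTexture( GL11.GL_TEXTURE_2D, textureID );
    GL11.glGetTexLevelParameter( GL11.GL_TEXTURE_2D, 0, GL14.GL_TEXTURE_DEPTH_SIZE, depthBits );
    System.out.println( "Depth texture bits: " + depthBits.get( 0 ) );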

Are you using one FBO with two attachments, or two FBOs with one attachment each? I would use one FBO with two attachments, or one attachment and one renderbuffer for the depth. I personally use a renderbuffer with my FBO for depth...

I use one FBO. The FBO has 3 colour attachments and 1 depth attachment. As I am trying to render depth to a texture, it is infeasible to use a renderbuffer, as renderbuffers can't be bound as textures.
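
For reference, attaching the extra colour attachments to the same FBO would look roughly like this (the "fbo1" / "fbo2" texture names are placeholders; only attachment 0 appears in the setup code posted above):

    // Sketch: hook up two more colour textures to the same FBO so
    // GL_COLOR_ATTACHMENT1_EXT / GL_COLOR_ATTACHMENT2_EXT can be used with glDrawBuffers.
    // "fbo1" / "fbo2" are hypothetical texture names.
    EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_COLOR_ATTACHMENT1_EXT, textureType, getTextureID( "fbo1" ), 0 );
    EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_COLOR_ATTACHMENT2_EXT, textureType, getTextureID( "fbo2" ), 0 );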

Quote:
Original post by stramit
I use one FBO. The FBO has 3 colour attachments and 1 depth attachment. As I am trying to render depth to a texture, it is infeasible to use a renderbuffer, as renderbuffers can't be bound as textures.


Hmm I am using it...


//depth buffer setup code here
glGenRenderbuffersEXT(1, &depth_rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth_rb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, depthBufferBitDepth, texWidth, texHeight);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);
glGenTextures(1, &depthtexture);
glBindTexture(texture_target, depthtexture);




Quote:
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);
glGenTextures(1, &depthtexture);
glBindTexture(texture_target, depthtexture);



What are you specifying as the texture_target in this example? I don't see what possible value could bind a texture to an active renderbuffer. I thought the point was that a renderbuffer is for when you don't need the buffer as a texture, and that a texture image should be used when you want to bind the buffer as a texture.

Looks like z-fighting to me, which is why changing your depth precision affects it. Not sure why it's occurring if you are rendering *exactly* the same geometry. Are you rendering *exactly* the same geometry? In your depth pass you look like you're using fixed-function T&L, whereas for the deferred pass you are (obviously) using programmable T&L. This may be causing the problem. In your vertex shader you must use ftransform() to make sure the transformed vertices match the fixed-function vertices.

If that's not the problem, I suggest you cheat and nudge your objects slightly towards the camera, in the same way you'd stop z-fighting when shadow mapping.
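
A common way to apply that kind of bias is glPolygonOffset during the depth-only pass; a rough sketch (the offset values are placeholders to tune, not something from this thread):

    // Hypothetical depth-bias workaround: push the depth-only pass slightly
    // further from the camera so the later colour pass wins the GL_LEQUAL test.
    GL11.glEnable( GL11.GL_POLYGON_OFFSET_FILL );
    GL11.glPolygonOffset( 1.0f, 1.0f );  // factor and units need tuning per scene

    scene.draw();                        // the depth-only draw from the first pass

    GL11.glDisable( GL11.GL_POLYGON_OFFSET_FILL );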

Vertex shader for the colour pass (it unfortunately needs to be two passes on OS X, as the X1600 Mobility in the MacBook Pro does not support glDrawBuffers, so I do a normals pass after this one):

void main()
{
    // Mandatory outputs
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
}



Frag shader:

uniform sampler2D diffuseTex;

void main()
{
    gl_FragColor = vec4(texture2D(diffuseTex, gl_TexCoord[0].st).rgb, 0);
}



Quite standard if I do say so myself.

(And just a note: even though the CPU code posted above uses glDrawBuffers, the screenshots are both from the fallback path.)


Thank you very much for the input, coordz. I will be trying this ASAP.

It is a very frustrating issue; if I use a renderbuffer instead of the depth texture there are no issues... it's just that I can't bind a renderbuffer as a texture :(
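
For what it's worth, this is roughly what sampling the depth attachment in a later lighting pass looks like, which is exactly what a renderbuffer can't do. A sketch, assuming the depth texture ID from the setup code (here called depthTextureID) and a 2D texture target:

    // Hypothetical lighting-pass binding (not from the original post): the FBO's
    // depth attachment is bound and sampled like any other texture.
    GL13.glActiveTexture( GL13.GL_TEXTURE0 );
    GL11.glBindTexture( GL11.GL_TEXTURE_2D, depthTextureID );
    // Make sure sampling returns raw depth values rather than a shadow-compare result.
    GL11.glTexParameteri( GL11.GL_TEXTURE_2D, GL14.GL_TEXTURE_COMPARE_MODE, GL11.GL_NONE );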


Quote:
Original post by stramit
*** Source Snippet Removed ***

What are you specifying as the texture_target in this example? I don't see what possible value could bind a texture to an active renderbuffer. I thought the point was that a renderbuffer is for when you don't need the buffer as a texture, and that a texture image should be used when you want to bind the buffer as a texture.


GL_TEXTURE_2D

That doesn't make any sense though. You generate a texture ID, then bind the texture to the current context for use as a texture, and that is all. You haven't attached the texture as a target for the depth channel of the renderbuffer.

I have no idea how the code you posted actually renders the depth channel to the texture. If there is something I don't understand here then please enlighten me, as I'd be very interested to know. (You also haven't even created room in memory for the texture, or given the texture a type, using glTexImage2D.)

Unless there is some automagical way whereby binding an empty texture while a renderbuffer is active creates the texture and automatically binds it to the currently active renderbuffer.
