Hi,
As an experiment I'm trying to write a simple deferred shader in LWJGL. Everything was going quite well until I made an attempt at early z-culling. It works perfectly on Windows, but not on OS X (ignore the normals channel, as it isn't finished yet):
normals | colour
-----------------
depth | blank
Windows:
Apple:
I currently do two passes to split my scene: the first renders the depth component, the second renders the colour and the normals. To avoid PBuffers and the hackery associated with the NV render-to-depth-texture path, I decided on FBOs, since I can create a depth texture and just bind that to the screen for my lighting passes.
As can be seen from the pictures, something is not quite right in the Apple rendering. If anyone can offer me any advice I would be very happy to listen.
Now some code:
Setting up the FBO:
// Generate and bind the FBO
IntBuffer intBuffer = BufferUtils.createIntBuffer( 1 );
EXTFramebufferObject.glGenFramebuffersEXT( intBuffer );
myFBOId = intBuffer.get();
EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, myFBOId );

// Attach the previously created colour texture to attachment 0
EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT, textureType, getTextureID( "fbo0" ), 0 );

// Create a 24-bit depth texture and attach it as the depth attachment
int textureID = TextureLoader.createTextureID();
t = new Texture( textureID, width, height, textureType );
GL11.glBindTexture( textureType, textureID );
GL11.glTexParameteri( textureType, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR );
GL11.glTexParameteri( textureType, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR );
GL11.glTexImage2D( textureType, 0, GL14.GL_DEPTH_COMPONENT24, width, height, 0,
        GL11.GL_DEPTH_COMPONENT, GL11.GL_FLOAT, (ByteBuffer) null );
EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT,
        EXTFramebufferObject.GL_DEPTH_ATTACHMENT_EXT, textureType, textureID, 0 );

// Verify completeness, then unbind
fboErrorCheck();
EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0 );
This is quite standard, I believe, with a 24-bit depth component.
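The fboErrorCheck() above isn't shown, so for reference here is a self-contained sketch of the kind of check it might do (class and method names are mine; the enum values come from the EXT_framebuffer_object spec). In real code the status would come from EXTFramebufferObject.glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT). GL_FRAMEBUFFER_UNSUPPORTED_EXT in particular varies by driver, so it's worth ruling out on the Mac:

```java
public class FboStatus {
    // Status enum values from the EXT_framebuffer_object specification.
    static final int GL_FRAMEBUFFER_COMPLETE_EXT = 0x8CD5;
    static final int GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT = 0x8CD6;
    static final int GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT = 0x8CD7;
    static final int GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT = 0x8CD9;
    static final int GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT = 0x8CDA;
    static final int GL_FRAMEBUFFER_UNSUPPORTED_EXT = 0x8CDD;

    // Translate a glCheckFramebufferStatusEXT result into a readable message.
    static String describe( int status ) {
        switch ( status ) {
            case GL_FRAMEBUFFER_COMPLETE_EXT: return "complete";
            case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT: return "incomplete attachment";
            case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT: return "no attachments";
            case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT: return "attachments have mismatched dimensions";
            case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT: return "attachments have incompatible formats";
            case GL_FRAMEBUFFER_UNSUPPORTED_EXT: return "format combination unsupported by this driver";
            default: return "unknown status 0x" + Integer.toHexString( status );
        }
    }
}
```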
Rendering the depth pass:
fbo1.activate();
GL11.glViewport( 0, 0, width, height );
GL11.glMatrixMode( GL11.GL_PROJECTION );
GL11.glLoadIdentity();
GLU.gluPerspective( 45.0f, aspectRatio, 1.0f, 7000.0f );
GL11.glMatrixMode( GL11.GL_MODELVIEW );
GL11.glLoadIdentity();

// OpenGL render states configuration
GL11.glCullFace( GL11.GL_BACK );
GL11.glEnable( GL11.GL_CULL_FACE );
GL11.glDisable( GL11.GL_BLEND );
GL11.glDepthFunc( GL11.GL_LEQUAL );
GL11.glEnable( GL11.GL_DEPTH_TEST );
GL11.glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );
GL11.glClearDepth( 1.0 );

// Clear with both masks enabled
GL11.glColorMask( true, true, true, true );
GL11.glDepthMask( true );
GL11.glClear( GL11.GL_DEPTH_BUFFER_BIT | GL11.GL_COLOR_BUFFER_BIT );

// Depth-only pass: disable colour writes
GL11.glColorMask( false, false, false, false );
GL11.glDepthMask( true );
GL11.glDisable( GL11.GL_LIGHTING );
GameManager.getGameManager().getCameraManager().getCamera( "main" ).lookAt();
scene.draw();

// Restore colour writes and freeze the depth buffer for the next pass
GL11.glColorMask( true, true, true, true );
GL11.glDepthMask( false );
fbo1.deactivate();
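One thing worth being explicit about for an early-z prepass: the second geometry pass re-submits the same geometry, so its fragments arrive at exactly the depths already in the buffer. GL_LEQUAL (as set above) lets them pass, where GL_LESS would reject them all. A toy illustration of the two comparisons (class and method names are mine):

```java
public class DepthFunc {
    // glDepthFunc(GL_LESS): a fragment at exactly the stored depth fails,
    // so a second pass over the same scene would draw nothing.
    static boolean less( float incoming, float stored ) { return incoming < stored; }

    // glDepthFunc(GL_LEQUAL): fragments at the prepass depths still pass.
    static boolean lequal( float incoming, float stored ) { return incoming <= stored; }
}
```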
Rendering the colour / normals:
fbo1.activate();
deferredSplitShader.activate();

// Route the shader's outputs to the two colour attachments
IntBuffer temp = BufferUtils.createIntBuffer( 2 );
temp.put( 0, EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT );
temp.put( 1, EXTFramebufferObject.GL_COLOR_ATTACHMENT1_EXT );
temp.rewind();
GL20.glDrawBuffers( temp );

GL11.glPushAttrib( GL11.GL_VIEWPORT_BIT );
GL11.glViewport( 0, 0, width, height );
GL11.glLoadIdentity();
GL11.glPushMatrix();

// Configure the camera and draw against the depth laid down in the first pass
GameManager.getGameManager().getCameraManager().getCamera( "main" ).lookAt();
scene.draw();

GL11.glPopMatrix();
GL11.glPopAttrib();
fbo1.deactivate();
deferredSplitShader.deactivate();
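Also worth noting: LWJGL's GL20.glDrawBuffers uses the buffer's remaining elements as the count, so the buffer should hold exactly the attachments in use, or stray trailing entries get passed along too. A small helper sketch (class and method names are mine; real LWJGL code would need a direct buffer from BufferUtils.createIntBuffer rather than a heap one):

```java
import java.nio.IntBuffer;

public class DrawBuffers {
    // Enum values from the EXT_framebuffer_object spec; attachment 0 is 0x8CE0.
    static final int GL_COLOR_ATTACHMENT0_EXT = 0x8CE0;
    static final int GL_COLOR_ATTACHMENT1_EXT = 0x8CE1;

    // Pack exactly the given attachments into a buffer sized to fit,
    // rewound and ready to hand to GL20.glDrawBuffers.
    // (Heap buffer here so the example runs standalone.)
    static IntBuffer attachments( int... enums ) {
        IntBuffer buf = IntBuffer.allocate( enums.length );
        buf.put( enums );
        buf.rewind();
        return buf;
    }
}
```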
This code executes on both machines; the problem is that it just doesn't look right on the Apple. Any help would be much appreciated.