Polygon Flickering
Hey all,
For some reason, whenever I render my scene, the polygons seem to randomly "turn" themselves on and off, so they flicker. Some stay off for a while, some manage to stay on for a while. This is a weird problem, but it seems that OpenGL isn't rendering all the polygons that I send to it. This isn't a Z-buffer fighting problem: I've tried it with a 32-bit Z-buffer and the scene is only near the camera. I've tried turning off face culling, depth testing, all sorts of things, yet to no avail. Has anyone else ever had this problem of polygons rendering seemingly at random? How did you solve it?
Any and all help is appreciated.
Phil.
Are you using double buffering and swapping buffers? If not, that's probably your problem.
Billy
Hmm, you should post some code here, especially the context-creation stuff, or the event handling...
Maybe you activated some settings that make double buffering impossible.
Is the scene animated in any way?
OK, here's my init code:
// SETUP OPENGL
int pixel_format;
PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(pfd),           // Size of this structure
    1,                     // Version of this structure
    PFD_DRAW_TO_WINDOW |   // Draw to window (not to bitmap)
    PFD_SUPPORT_OPENGL |   // Support OpenGL calls in window
    PFD_DOUBLEBUFFER,      // Double-buffered mode
    PFD_TYPE_RGBA,         // RGBA colour mode
    bit_depth,             // Choose supported bit depth
    0,0,0,0,0,0,           // Not used to select mode
    0,0,                   // Not used to select mode
    0,0,0,0,0,             // Not used to select mode
    24,                    // Size of depth buffer
    0,                     // Not used to select mode
    0,                     // Not used to select mode
    PFD_MAIN_PLANE,        // Not used to select mode
    0,                     // Not used to select mode
    0,0,0                  // Not used to select mode
};

// Choose a pixel format that best matches that described in pfd
g_log.TempLog( "choosing pixel format...\n" );
pixel_format = ChoosePixelFormat( hdc, &pfd );
char pix_format[64];
sprintf( pix_format, "pixel format %d chosen.\n", pixel_format );
g_log.TempLog( pix_format );

// Set the pixel format for the device context
g_log.TempLog( "setting pixel format..." );
if( !SetPixelFormat( hdc, pixel_format, &pfd ) )
{
    g_log.ErrorLog( "cannot set pixel format" );
    g_log.TempLog( "failed.\n" );
    return 0;
}
g_log.TempLog( "done.\n" );

g_log.TempLog( "getting opengl rendering context.\n" );
glRC = wglCreateContext( hdc );
if( glRC )
    g_log.TempLog( "making rendering context current.\n" );
else
{
    g_log.TempLog( "cannot get openGL rendering context.\n" );
    g_log.ErrorLog( "cannot get openGL rendering context.\n" );
    return 0;
}
wglMakeCurrent( hdc, glRC );

// enable alpha blending
g_log.TempLog( "enabling alpha blending.\n" );
glEnable( GL_BLEND );

// set shading to smooth
glShadeModel( GL_SMOOTH );

// enable texturing
g_log.TempLog( "enabling texturing.\n" );
glEnable( GL_TEXTURE_2D );

// enable face culling and depth testing
glFrontFace( GL_CW );
glEnable( GL_DEPTH_TEST );
glEnable( GL_CULL_FACE );

// enable lighting
glEnable( GL_LIGHTING );
glEnable( GL_COLOR_MATERIAL );
glColorMaterial( GL_FRONT, GL_AMBIENT_AND_DIFFUSE );

// enable vertex array rendering
glEnableClientState( GL_VERTEX_ARRAY );
// enable texture coordinate array rendering
glEnableClientState( GL_TEXTURE_COORD_ARRAY );
// enable vertex colour array rendering
glEnableClientState( GL_COLOR_ARRAY );
// enable normal array rendering
glEnableClientState( GL_NORMAL_ARRAY );

// set clear colour, viewport and projection matrix
glClearColor( 1.0f, 1.0f, 1.0f, 1.0f );
glViewport( 0, 0, screen_width, screen_height );

// set view matrices and screen dimensions
g_log.TempLog( "setting projection matrix.\n" );
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
g_log.TempLog( "using perspective projection.\n" );
gluPerspective( 90.0, float(width/height), 0.0, 1000.0 );
glMatrixMode( GL_MODELVIEW );
glLoadIdentity();
glHint( GL_PROJECTION, GL_NICEST );
g_log.TempLog( "OpenGL init done.\n" );

// set the draw queue to use opengl
draw_queue.SetAPI( GRAPHICS_API_OPENGL );
// set the light manager to use opengl
lights.SetAPI( GRAPHICS_API_OPENGL );
// send the draw queue a pointer to the texture manager
draw_queue.SetTextureManagerPointer( &textures );
draw_queue.SetLightManagerPointer( &lights );

float ambient[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
glLightModelfv( GL_LIGHT_MODEL_AMBIENT, ambient );
That's the OpenGL init code; I also set the display mode just before it. Ignore the g_log.TempLog() calls and some of the stuff at the end (draw_queue, lights, etc.); those are just logging and other internal engine stuff.
Now, the randomness of which polygons get rendered is independent of the scene moving, although it does get much worse when it moves.
Here's the code that actually displays the scene (it's just a simple heightmap I set up):
int CTerrainManager::RenderTerrain()
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

    // This is just a texture that was loaded in previously
    glBindTexture( GL_TEXTURE_2D, 0 );

    for( int z = 0; z < height-1; z++ )
    {
        glBegin( GL_TRIANGLE_STRIP );

        // UL vertex
        float colour = buffer[z*width]/256.0f;
        float tx = -((width*10.0f)/2);
        float tz = -(z*10.0f);
        glColor3f( 0, colour, 0 );
        glTexCoord2f( tx/width, tz/height );
        glVertex3f( tx, buffer[z*width]/26.0f, tz );

        // LL vertex
        colour = buffer[(z+1)*width]/256.0f;
        tx = -(width*10.0f)/2;
        tz = -((z+1)*10.0f);
        glColor3f( 0, colour, 0 );
        glTexCoord2f( tx/width, tz/height );
        glVertex3f( tx, buffer[(z+1)*width]/26.0f, tz );

        // starting vertices for this row
        for( int x = 0; x < width-1; x++ )
        {
            // UR vertex
            float colour = buffer[(x+1)+z*width]/256.0f;
            float tx = -(width*10.0f)/2+((x+1)*10.0f);
            float tz = -(z*10.0f);
            glColor3f( 0, colour, 0 );
            glTexCoord2f( tx/width, tz/height );
            glVertex3f( tx, buffer[(x+1)+z*width]/26.0f, tz );

            // LR vertex
            colour = buffer[(x+1)+(z+1)*width]/256.0f;
            tx = -(width*10.0f)/2+((x+1)*10.0f);
            tz = -((z+1)*10.0f);
            glColor3f( 0, colour, 0 );
            glTexCoord2f( tx/width, tz/height );
            glVertex3f( tx, buffer[(x+1)+(z+1)*width]/26.0f, tz );
        }

        glEnd();
    }
    return 1;
}
SwapBuffers() is called after this in a separate function.
Any ideas? I've tried using simple polygons (i.e. a single square) and it still does it, and I have no idea why!
You have:

gluPerspective( 90.0, float(width/height), 0.0, 1000.0 );

Change that to:

gluPerspective( 90.0, float(width/height), 10.0, 1000.0 );
It does indeed seem to be a Z-fighting issue: gluPerspective requires a near plane greater than zero, and as the near plane approaches zero the depth buffer loses essentially all of its precision.
This topic is closed to new replies.