Vsync with GLee


Hi Guys,

I'm again stuck on something I shouldn't be, having spent way too much time staring myself blind at something I thought was simple. I got it to work some time ago, but lost my code for it. I've even written a small app to demonstrate the issue.

I can't for the world make my windowed GL context start up with vsync enabled (the vsync state ends up being whatever the driver decides) - yet once I enter game mode (glutEnterGameMode()) and go back to windowed mode, vsync is on. I set the swap interval as the first thing after these two very important lines:


// Initialize glut
glutInit(&argc, argv);
// Initialize glee
GLeeInit();

Followed by:


glutInitWindowSize(gl_current_width, gl_current_height);
glut_window_hWnd = glutCreateWindow(window_title.c_str());
glutInitDisplayMode(GLUT_DEPTH | GLUT_RGB | GLUT_DOUBLE | GLUT_ALPHA | GLUT_STENCIL);

// Set swap interval
#ifdef _WIN32
 char *extensions = (char*)glGetString( GL_EXTENSIONS );
 if( strstr( extensions, "WGL_EXT_swap_control" ) == 0 )
 {
  return; // Error: WGL_EXT_swap_control extension not supported on this computer
 }
 else
 {
  wglSwapIntervalEXT = (GLEEPFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress( "wglSwapIntervalEXT" );
  if( wglSwapIntervalEXT )
  {
   wglSwapIntervalEXT(1); // swap interval of 1 = wait for one vertical retrace per buffer swap
  }
 }
#endif
#ifdef __linux__
 glXSwapIntervalSGI(1);    // Nix
#endif

I can't make it work without changing the display mode. Should I be calling wglSwapIntervalEXT somewhere else?

I think GLee already supplies wglSwapIntervalEXT, and I've noticed that it ends up pointing at the same procedure, but I just wanted to make sure.
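
If GLee exposes its usual per-extension flag for this one (I'm assuming the name GLEE_WGL_EXT_swap_control, following GLee's GLEE_<extension> convention), the whole check could shrink to something like:

#ifdef _WIN32
 // Sketch: relies on GLee having loaded the entry point and set its
 // extension flag during GLeeInit().
 if( GLEE_WGL_EXT_swap_control )
 {
  wglSwapIntervalEXT(1);
 }
#endif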

Calling wglSwapIntervalEXT(1) should've been sufficient.
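
One way to sanity-check that the call actually took effect is to read the interval back - wglGetSwapIntervalEXT() is part of the same WGL_EXT_swap_control extension. A sketch (assuming GLee has loaded that entry point too):

#ifdef _WIN32
 wglSwapIntervalEXT(1);
 int interval = wglGetSwapIntervalEXT(); // expect 1 if the driver accepted the request
#endif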

I've tested on four cards: an nVidia GTX 275, an AMD Radeon 6500M, an nVidia Quadro 3000M and an Intel HD 4500.

All exhibit the same behaviour (also, all run Win7 64-bit, which might have something to do with it, although I can't see why).

EDIT: Why, thank you for removing half of my post! :)

EDIT 2: Sanitized code, removed religious glError checks...


Have you tried enabling VSync after creating the window (in GLUT I think that's glutCreateWindow(), not sure though), if it wasn't created before?

I've never actually tried having VSync enabled at startup; I've only made it possible to enable it by pressing a key.
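
Roughly like this (just a sketch - the key binding, flag name and callback name are all illustrative):

static bool vsync_enabled = false;

void on_keyboard(unsigned char key, int x, int y)
{
 if( key == 'v' ) // toggle vsync on 'v'
 {
  vsync_enabled = !vsync_enabled;
#ifdef _WIN32
  wglSwapIntervalEXT(vsync_enabled ? 1 : 0);
#endif
 }
}

// registered once the window (and thus the context) exists:
// glutKeyboardFunc(on_keyboard);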

Whoops, thanks! Looks like I accidentally erased that line when I first cleaned up tabs and spaces in my post. (I've fixed it now.) :)

I also tested setting the swap interval before glutCreateWindow(); that results in GL error 1282 (GL_INVALID_OPERATION) - I guess because the context doesn't exist yet at that point.
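
That would make sense: wglSwapIntervalEXT() operates on the current context, and before glutCreateWindow() there is none. A guard like this (sketch; using wglGetCurrentContext() as the test) makes that explicit:

#ifdef _WIN32
 // wglGetCurrentContext() returns NULL until a GL context has been
 // created and made current, so only set the interval after that.
 if( wglGetCurrentContext() != NULL )
 {
  wglSwapIntervalEXT(1);
 }
#endif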

Now here's hoping someone has tried starting up with vsync enabled...

EDIT: No, it was there all along, and in the wrong place too. I'm just blind...

I managed to fix it! You drew my attention to glutCreateWindow(), and then I looked at the order of the other statements.

I had also thought it was weird to call glutInitDisplayMode() after glutCreateWindow(). Anyway, this is the order I ended up with, and it works! :)

glutInitDisplayMode(GLUT_DEPTH | GLUT_RGB | GLUT_DOUBLE | GLUT_ALPHA | GLUT_STENCIL);
glutInitWindowSize(gl_current_width, gl_current_height);
glut_window_hWnd = glutCreateWindow(window_title.c_str());
// Set swap interval...
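
For anyone finding this later, the whole startup boils down to the sketch below. One assumption on my part: I've also moved GLeeInit() to after glutCreateWindow(), since GLee itself needs a current context to query the extension string:

glutInit(&argc, argv);
glutInitDisplayMode(GLUT_DEPTH | GLUT_RGB | GLUT_DOUBLE | GLUT_ALPHA | GLUT_STENCIL);
glutInitWindowSize(gl_current_width, gl_current_height);
glut_window_hWnd = glutCreateWindow(window_title.c_str()); // a context exists from here on
GLeeInit();                                                // safe to query/load extensions now
#ifdef _WIN32
wglSwapIntervalEXT(1);                                     // vsync enabled from the first frame
#endif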

(Also, it's incredible what a difference a trip to the man pages can make. That and the religious error checking L. Spiro suggested are two of the best pieces of advice I've gotten for OpenGL.)
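
(For reference, a sketch of the kind of check I mean - the CHECK_GL name is my own, not from any library:)

#include <cstdio>

// Prints any pending GL error with the file/line it was detected at;
// sprinkle after suspect GL calls.
#define CHECK_GL() \
 do { \
  GLenum err = glGetError(); \
  if( err != GL_NO_ERROR ) \
   fprintf(stderr, "GL error 0x%04X at %s:%d\n", (unsigned)err, __FILE__, __LINE__); \
 } while(0)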

