Hi Guys,
I'm stuck again on something I shouldn't be, and I've spent way too much time staring myself blind at something I thought was simple. I had it working some time ago, but I've since lost that code. I've even written a small app to demonstrate the issue.
I can't for the life of me make my windowed GL context start up with vsync enabled (the vsync state ends up being whatever the driver decides) - yet once I enter game mode (glutEnterGameMode) and go back to windowed mode, vsync is suddenly on. I set vsync as the first thing after these two very important lines:
// Initialize glut
glutInit(&argc, argv);
// Initialize glee
GLeeInit();
Followed by:
glutInitWindowSize(gl_current_width, gl_current_height);
glut_window_hWnd = glutCreateWindow(window_title.c_str());
glutInitDisplayMode(GLUT_DEPTH | GLUT_RGB | GLUT_DOUBLE | GLUT_ALPHA | GLUT_STENCIL);
// Set swap interval
#ifdef _WIN32
char *extensions = (char*)glGetString( GL_EXTENSIONS );
if( strstr( extensions, "WGL_EXT_swap_control" ) == 0 )
{
    return; // Error: WGL_EXT_swap_control extension not supported on your computer.
}
else
{
    wglSwapIntervalEXT = (GLEEPFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress( "wglSwapIntervalEXT" );
    if( wglSwapIntervalEXT )
    {
        wglSwapIntervalEXT(1);
    }
}
#endif
#ifdef __linux__
glXSwapIntervalSGI(1); // *nix
#endif
I can't make it work without changing the display mode. Should I be calling wglSwapIntervalEXT somewhere else?
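By "somewhere else" I mean something like deferring the call into the display callback, once the window's context is definitely current - a rough sketch of what I have in mind (set_vsync_once and g_vsync_applied are names I've made up just for this example, and it assumes GLee has already loaded the pointers):

// Sketch: apply the swap interval from the first display callback, after the
// GLUT window's context is guaranteed to be current.
#include "GLee.h"    // as in the rest of my code
#include <GL/glut.h>

static bool g_vsync_applied = false;

static void set_vsync_once()
{
    if( g_vsync_applied )
        return;
    g_vsync_applied = true;

#ifdef _WIN32
    if( wglSwapIntervalEXT )
        wglSwapIntervalEXT(1);   // request swap-on-vblank
#endif
#ifdef __linux__
    glXSwapIntervalSGI(1);
#endif
}

static void display()
{
    set_vsync_once();            // only does real work on the first frame
    // ... normal rendering ...
    glutSwapBuffers();
}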
I think GLee already supplies wglSwapIntervalEXT, and I've noticed that both end up targeting the same procedure, but I just wanted to make sure. Calling wglSwapIntervalEXT(1) should have been sufficient.
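This is the kind of sanity check I mean (Windows only, just a sketch; check_swap_interval_pointer is a name made up for this example, and it has to run after GLeeInit with the context current):

#ifdef _WIN32
#include "GLee.h"
#include <cstdio>

// Sanity check: does GLee's wglSwapIntervalEXT point at the same procedure
// that a manual wglGetProcAddress lookup returns?
static void check_swap_interval_pointer()
{
    PROC manual = wglGetProcAddress( "wglSwapIntervalEXT" );
    if( (PROC)wglSwapIntervalEXT == manual )
        std::printf( "GLee and wglGetProcAddress agree\n" );
    else
        std::printf( "GLee and wglGetProcAddress differ\n" );
}
#endif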
I've tested on four cards: an nVidia GTX 275, an AMD Radeon 6500M, an nVidia Quadro 3000M and an Intel HD 4500. All exhibit the same behaviour (and all run Win7 64-bit, which might have something to do with it, although I can't see why).
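For reference, the difference is easy to see by timing successive swaps of a trivial scene with a throwaway display callback like this (g_last_ms is made up for the example); with vsync on the frame time settles around the refresh period (~16-17 ms at 60 Hz), with it off it runs essentially unthrottled:

#include <GL/glut.h>
#include <cstdio>

static int g_last_ms = 0;

// Crude vsync check: print the time between buffer swaps.
static void display()
{
    glClear( GL_COLOR_BUFFER_BIT );
    glutSwapBuffers();

    int now = glutGet( GLUT_ELAPSED_TIME );   // milliseconds since glutInit
    std::printf( "frame time: %d ms\n", now - g_last_ms );
    g_last_ms = now;

    glutPostRedisplay();                      // render continuously
}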
EDIT: Why, thank you for removing half of my post!
EDIT 2: Sanitized the code, removed the religious glGetError checks...