Vsync with Linux (SDL+GL)

I am using Ubuntu Gutsy, and I have SDL set up my OpenGL context and render a spinning cube. I have been looking for a way to make my app use vsync without having to go into NVClock or NVIDIA X Server Settings and set it manually. I have searched GameDev as well as the net for "OpenGL vsync linux", "OpenGL vsync sdl linux", and a few other terms, but most responses end up suggesting a WGL extension, which is Windows-only (AFAIK). I think those came up in my search because "linux" appears somewhere on most of the pages while the real question on the page is about Windows.

I am at a loss here. Is there another extension I can use? An SDL function? Should I implement a frame rate limiter (and what about different refresh rates)? Should I just force vsync in the driver using the apps above? Any insight will help tremendously! Thanks!

Extra info that might or might not help:
Not using Mesa
GeForce 8800 GTX (or GTS, forgot which)
Latest NVIDIA driver

[Edited by - Imgelling on February 7, 2008 7:28:16 PM]
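As a point of reference for the frame-rate-limiter idea: a minimal sketch is below, assuming SDL 1.2; the TARGET_FPS value and the render_frame() function are placeholders, not anything from this thread. Note that SDL_GetTicks()/SDL_Delay() only have millisecond granularity, and a limiter like this only caps the rate near a target; it does not align swaps with the monitor's retrace, so tearing can remain.

    #include <SDL/SDL.h>

    #define TARGET_FPS 60               /* placeholder; ideally match the display's refresh */

    extern void render_frame(void);     /* hypothetical; your drawing code goes here */

    void run_loop(void)
    {
        const Uint32 frame_ms = 1000 / TARGET_FPS;   /* ~16 ms per frame at 60 FPS */

        for (;;)
        {
            Uint32 start = SDL_GetTicks();

            render_frame();
            SDL_GL_SwapBuffers();

            /* Sleep off whatever is left of this frame's time slice. */
            Uint32 elapsed = SDL_GetTicks() - start;
            if (elapsed < frame_ms)
                SDL_Delay(frame_ms - elapsed);
        }
    }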
my blog contains ramblings and what I am up to programming wise.
I used to think there was a GLX function for this, but the last time I searched I couldn't find anything. I believe changing the vsync setting is not possible, but I'm not 100% sure. Even when looking at other people's code, they have wglSwapIntervalEXT for Windows but nothing in their code for Linux.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Well, thanks for your input; it was basically what I had found, but appreciated. So, what should I do? Should I just write the frame rate limiter, or what?

After my own research I couldn't find anything either. After watching the random default screensavers, some seem to run erratically (no vsync, with shearing, etc.) and some seem to do just fine, which makes me think some enable vsync or just happen to refresh at the right time. I remember that in DOS with mode 13h (I'm old) there was a way, through assembly, to check whether the monitor was in vertical refresh; does that still work today, or even under Linux? I need to go digging for the code.
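For anyone curious, the DOS-era trick was to poll bit 3 of the VGA input status register at port 0x3DA; roughly the following C (inp() comes from <conio.h> on DOS compilers, so this is historical reference only). It needs direct port I/O, which a normal process under Linux/X does not get, as a reply below notes.

    #include <conio.h>          /* inp(); DOS compilers only */

    #define VGA_STATUS 0x3DA
    #define VRETRACE   0x08     /* bit 3: vertical retrace in progress */

    void wait_for_vsync(void)
    {
        /* If we're already inside a retrace, wait for it to end... */
        while (inp(VGA_STATUS) & VRETRACE)
            ;
        /* ...then wait for the next retrace to begin. */
        while (!(inp(VGA_STATUS) & VRETRACE))
            ;
    }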

I would take this to the Unix forum, but there doesn't seem to be much activity there. Any other info I should know? Maybe an SDL technique?
my blog contains ramblings and what I am up to programming wise.
I hope this helps. I found it in the Secret Maryo source I'm working on:

    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

    // if VSync is enabled
    if( initialised && pPreferences->video_vsync )
    {
        SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );
    }
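One thing to keep in mind: SDL 1.2's GL attributes only take effect if they are set before the SDL_SetVideoMode() call that creates the OpenGL context, and SDL_GL_SWAP_CONTROL itself only exists in SDL 1.2.10 and newer.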
So now I have....
    // create a new window
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );  // Suggested code here

    SDL_Surface* screen = SDL_SetVideoMode( SCREENW, SCREENH, 32, SDL_OPENGL );
    if ( !screen )
    {
        printf( "Unable to set %ix%i video: %s\n", SCREENW, SCREENH, SDL_GetError() );
        return 1;
    }

Hmmm, still doesn't work for me. But I am new to Linux (went to a dedicated Linux machine about two months ago) and am trying to re-learn OpenGL on this new OS.

Am I placing the SetAttribute call in the wrong place? The cube still spins rapidly; at 60 Hz it should take about 3 seconds to go through 180 degrees, and it's WAY faster than that.

Thanks for your help folks!

And if it helps, I am swapping with SDL_GL_SwapBuffers().
my blog contains ramblings and what I am up to programming wise.
You could try GLFW. It has the function:

    void glfwSwapInterval( int interval );
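If it helps, here is roughly how that looks with the GLFW 2.x API of that era; this is a minimal sketch, and the window parameters are arbitrary, not taken from this thread.

    #include <GL/glfw.h>

    int main( void )
    {
        if( !glfwInit() )
            return 1;

        /* 640x480, 8 bits per channel, 24-bit depth, no stencil, windowed */
        if( !glfwOpenWindow( 640, 480, 8, 8, 8, 8, 24, 0, GLFW_WINDOW ) )
        {
            glfwTerminate();
            return 1;
        }

        glfwSwapInterval( 1 );   /* request one vblank wait per buffer swap */

        while( glfwGetWindowParam( GLFW_OPENED ) )
        {
            /* ... draw the cube ... */
            glfwSwapBuffers();
        }

        glfwTerminate();
        return 0;
    }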
It is worth taking a look at what

    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );

does under the hood. I can't check it right now.

You can't do the DOS-style retrace check; direct port I/O and BIOS interrupts aren't available to a normal process. Everything needs to be done through X function calls or some other lib such as GLX.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Look at the nvidia-settings source code. There is the NV-CONTROL X server extension; you can use it to control everything that nvidia-settings does.

SDL_GL_SWAP_CONTROL requires the GLX_MESA_swap_control extension, but NVIDIA's driver doesn't support it.

Use the extension GLX_SGI_swap_control (function glXSwapIntervalSGI) instead; it's supported on most cards I've used (although, IIRC, ATI's Linux drivers still don't support it). FYI, it's also what the GLFW library uses internally.
    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 ); // <-- Only works on Windows (and only with SDL v1.2.10 or higher).
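If you want to stay with SDL rather than linking against GLX directly, one option is to pull glXSwapIntervalSGI out of the driver yourself once the context exists, via SDL_GL_GetProcAddress. A sketch under that assumption (the enable_vsync name is mine; strictly, you should also verify that GLX_SGI_swap_control appears in the GLX extension string, since a looked-up pointer alone doesn't guarantee support):

    #include <SDL/SDL.h>

    typedef int (*PFN_SWAPINTERVALSGI)( int );

    /* Call after SDL_SetVideoMode( ..., SDL_OPENGL ) has created the GL context. */
    int enable_vsync( void )
    {
        PFN_SWAPINTERVALSGI swap_interval =
            (PFN_SWAPINTERVALSGI)SDL_GL_GetProcAddress( "glXSwapIntervalSGI" );

        if( !swap_interval )
            return -1;              /* extension not exposed by the driver */

        return swap_interval( 1 );  /* returns 0 on success per GLX_SGI_swap_control */
    }

And for a code-free workaround on NVIDIA's driver, setting the __GL_SYNC_TO_VBLANK=1 environment variable before launching the app forces vsync on.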
