OpenGL Vsync with Linux (SDL+GL)

Posted by Imgelling


I am using Ubuntu Gutsy, and I have SDL set up my OpenGL context and render a spinning cube. I have been looking for a way to make my app use vsync without having to go into NVCLOCK or the NVIDIA X Server Settings and set it manually. I have searched GameDev as well as the net for "OpenGL vsync linux", "OpenGL vsync sdl linux", and a few other terms, but most responses end up saying to use a WGL extension, which is Windows-based (AFAIK). I think these came up in my search because "linux" appears somewhere on most of the pages while the real question on the page is about Windows.

I am at a loss here. Is there another extension I can use? An SDL function? Should I implement a frame rate limiter (and what about different refresh rates)? Should I just force vsync in the driver using the apps above? Any insight will help tremendously! Thanks!

Extra info that might or might not help:
Not using Mesa
GeForce 8800 GTX (or GTS, forgot which one)
Latest NVIDIA driver

[Edited by - Imgelling on February 7, 2008 7:28:16 PM]

I used to think there was a GLX function for this, but the last time I searched, I couldn't find anything. I believe changing the vsync setting is not possible, but I'm not 100% sure. Even when looking at other people's code, they have wglSwapIntervalEXT for Windows but nothing in their code for Linux.

Well, thanks for your input; it was basically what I found, but appreciated. So what should I do? Should I just make a frame rate limiter (see the sketch below)? Or what?

After my own personal research I couldn't find anything either. Watching random default screen savers, some seem to run erratically (no vsync, tearing, etc.) and some seem to do just fine, which makes me think some enable vsync or just happen to refresh at the right time. I remember that in DOS, with mode 13h (I'm old), there was a way through asm to check whether the monitor was in vertical refresh; does that still work today, or even under Linux? I need to go digging for the code.

I would take this to the Unix forum, but there doesn't seem to be much activity there. Any other info I should know? Maybe an SDL technique?
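
For reference, a minimal sketch of the frame-limiter option mentioned above, assuming SDL 1.2 and a fixed 60 Hz target (the window size and loop structure are placeholders, not from the original post). Note that SDL_Delay's granularity is often around 10 ms, and that capping the frame rate this way does not actually synchronize swaps with the monitor's retrace, so tearing can still occur:

#include <SDL/SDL.h>

int main( int argc, char* argv[] )
{
    const Uint32 target_frame_ms = 1000 / 60; // assumed 60 Hz target
    int running = 1;

    if ( SDL_Init( SDL_INIT_VIDEO ) != 0 )
        return 1;
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    if ( !SDL_SetVideoMode( 640, 480, 32, SDL_OPENGL ) )
        return 1;

    while ( running )
    {
        Uint32 frame_start = SDL_GetTicks();
        SDL_Event event;

        while ( SDL_PollEvent( &event ) )
            if ( event.type == SDL_QUIT )
                running = 0;

        // ... draw the spinning cube here ...
        SDL_GL_SwapBuffers();

        // Sleep off whatever remains of this frame's time budget.
        Uint32 elapsed = SDL_GetTicks() - frame_start;
        if ( elapsed < target_frame_ms )
            SDL_Delay( target_frame_ms - elapsed );
    }

    SDL_Quit();
    return 0;
}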

I hope this helps. I found it in the Secret Maryo source I'm working on:

// note: in SDL 1.2, GL attributes must be set before SDL_SetVideoMode()
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

// if vsync is enabled in the preferences
if( initialised && pPreferences->video_vsync )
{
    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );
}

So now I have:

// create a new window
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 ); // suggested code here
SDL_Surface* screen = SDL_SetVideoMode( SCREENW, SCREENH, 32, SDL_OPENGL );
if ( !screen )
{
    printf( "Unable to set %ix%i video: %s\n", SCREENW, SCREENH, SDL_GetError() );
    return 1;
}




Hmmm, still doesn't work for me. But I am new to Linux (I went to a dedicated machine about 2 months ago) and am trying to re-learn OpenGL on this new OS.

Am I placing the SetAttribute call in the wrong place? The cube still spins rapidly; it should take about 3 seconds to go through 180 degrees at 60 Hz, and it's WAY, WAY faster than that.

Thanks for your help, folks!

And if it helps, I am swapping with SDL_GL_SwapBuffers().
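
One way to see whether the request actually took effect, assuming your SDL 1.2 build lets you read this attribute back (a diagnostic suggestion, not from the thread): query SDL_GL_SWAP_CONTROL after SDL_SetVideoMode succeeds.

int swap_control = 0;

// Read the attribute back after SDL_SetVideoMode() to see whether the
// driver actually honored the swap-control request.
if ( SDL_GL_GetAttribute( SDL_GL_SWAP_CONTROL, &swap_control ) == 0 )
    printf( "SDL_GL_SWAP_CONTROL = %i\n", swap_control ); // 0 means vsync is off
else
    printf( "Could not query SDL_GL_SWAP_CONTROL: %s\n", SDL_GetError() );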

It is worth taking a look at what

SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );

does under the hood. I can't do it right now.

You can't poll the retrace the way you did back in the mode 13h days. Everything needs to go through X function calls or some other lib such as GLX.
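
If it helps, here is a sketch of roughly what such a lookup amounts to in raw GLX, assuming the GLX_MESA_swap_control path (the Display and screen must come from your own X connection, e.g. via SDL_GetWMInfo; that plumbing is assumed here):

#include <string.h>
#include <GL/glx.h>

typedef int ( *PFNGLXSWAPINTERVALMESA )( unsigned int );

// Check for GLX_MESA_swap_control and, if advertised, ask for one swap
// per vertical retrace. Returns 0 on success, -1 if unavailable.
static int try_mesa_swap_control( Display* dpy, int screen, unsigned int interval )
{
    const char* exts = glXQueryExtensionsString( dpy, screen );
    PFNGLXSWAPINTERVALMESA swap_interval;

    if ( !exts || !strstr( exts, "GLX_MESA_swap_control" ) )
        return -1; // extension not advertised

    swap_interval = ( PFNGLXSWAPINTERVALMESA )
        glXGetProcAddressARB( (const GLubyte*)"glXSwapIntervalMESA" );
    if ( !swap_interval )
        return -1;

    return swap_interval( interval ) == 0 ? 0 : -1;
}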

Look at the nvidia-settings source code. There is an NV-CONTROL X server extension; you can use it to control everything that nvidia-settings does.

SDL_GL_SWAP_CONTROL requires the GLX_MESA_swap_control extension, but NVIDIA's driver doesn't support it.
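
A minimal sketch of that route, assuming the libXNVCtrl library bundled with the nvidia-settings source (link with -lXNVCtrl -lX11; header paths can vary by distribution). NV_CTRL_SYNC_TO_VBLANK is the same attribute the "Sync to VBlank" checkbox toggles:

#include <stdio.h>
#include <X11/Xlib.h>
#include <NVCtrl/NVCtrl.h>
#include <NVCtrl/NVCtrlLib.h>

int main( void )
{
    Display* dpy = XOpenDisplay( NULL );
    int event_base, error_base;

    if ( !dpy )
        return 1;

    // Make sure the NV-CONTROL extension is actually present.
    if ( !XNVCTRLQueryExtension( dpy, &event_base, &error_base ) )
    {
        fprintf( stderr, "NV-CONTROL extension not found\n" );
        XCloseDisplay( dpy );
        return 1;
    }

    // Enable "Sync to VBlank" for OpenGL on X screen 0.
    XNVCTRLSetAttribute( dpy, 0 /* screen */, 0 /* display mask */,
                         NV_CTRL_SYNC_TO_VBLANK, 1 );
    XFlush( dpy );

    XCloseDisplay( dpy );
    return 0;
}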

Use the extension GLX_SGI_swap_control (function glXSwapIntervalSGI); it's supported on most cards I've used (although IIRC ATI's Linux drivers still don't support it). FYI, it's also used internally by the GLFW library.
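
A sketch of that call, loading the function through glXGetProcAddressARB so no extension header is needed. Run it after the GL context exists (i.e. after SDL_SetVideoMode); the error handling here is an assumption for illustration:

#include <GL/glx.h>

typedef int ( *PFNGLXSWAPINTERVALSGI )( int );

// GLX_SGI_swap_control: an interval of 1 asks the driver to sync buffer
// swaps to the vertical retrace. Returns 0 on success, -1 if unavailable.
static int enable_vsync_sgi( void )
{
    PFNGLXSWAPINTERVALSGI swap_interval = ( PFNGLXSWAPINTERVALSGI )
        glXGetProcAddressARB( (const GLubyte*)"glXSwapIntervalSGI" );

    if ( !swap_interval )
        return -1; // extension/function not available
    return swap_interval( 1 ) == 0 ? 0 : -1;
}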
