glutIdleFunc, glutDisplayFunc, and Vsync

1 comment, last by ElectroDruid 15 years, 10 months ago
Hiya, I'm trying to work out the exact relationship between glutIdleFunc, glutDisplayFunc, and vsync, so that I can properly decouple my game logic updates from my rendering updates, but I'm a bit confused as to how it all works.

This is what I think should happen, based on the documentation I've read: glutMainLoop() calls whatever function you register with glutIdleFunc() every time it goes round the loop, with the possible exception of iterations in which input events are triggered. (I'm not sure why it wouldn't just call it every iteration; perhaps it does.) I imagine the innards of glutMainLoop() to be a glorified while(true) { /* stuff */ }, so I'd expect the idle callback to get called as often as the processor allows. I'd expect the glutDisplayFunc callback to be called every time the monitor needs refreshing, i.e. assuming not too much strain on the CPU or graphics card, it would match the monitor refresh rate with vsync on, and run at some higher rate with it off.

This is what I actually observe, although I'm not sure why, or whether I've got it right: the callbacks registered with both glutIdleFunc() and glutDisplayFunc() are affected by vsync. Both get called 60 times a second or thereabouts with vsync on (my monitor refresh rate being 60Hz), and both get called much more often with it off.

Questions, then:

1 - Do my observations sound right to you? Are both the idle and display callbacks clamped to vsync when it's on, and left to run free when it's off?

2 - If I'm right, what's the reasoning behind deliberately slowing the idle func when vsync is on? Vsync is a property of the monitor hardware, so why should it affect the running speed of the rest of the game logic? Wouldn't it make more sense to clamp the display func but leave the idle func running as often as it can? Have I misunderstood the reasons behind the relationships here?
3 - Is there a generally accepted way, under GLUT, to have one piece of code (in my case, the game logic called from the glutIdleFunc callback) update at a rate chosen by the coder, driven perhaps by a custom high-precision timer (I've played about with glutTimerFunc, and its accuracy is awful, so I've written my own), independently of the monitor refresh rate, while still having the display function governed by vsync to prevent tearing and other nasty artifacts?
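For reference, here's roughly how I've got things wired up. This is only a minimal sketch assuming freeglut; update_logic() stands in for my game logic and isn't a real GLUT function:

```cpp
#include <GL/glut.h>  // freeglut assumed

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    // ... draw the scene here ...
    glutSwapBuffers();  // with vsync on, this is where the wait for the retrace happens
}

void idle()
{
    // update_logic();   // hypothetical game-logic update
    glutPostRedisplay(); // mark the window for redraw; display() runs on the next loop pass
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("vsync test");
    glutDisplayFunc(display);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}
```

As I understand it, with vsync on it's the buffer swap inside display() that blocks until the retrace, which might explain why the idle callback ends up throttled too: it only gets called when the loop isn't waiting inside display().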
"We two, the World and I, are stubborn fellows at loggerheads, and naturally whichever has the thinner skull will get it broken" - Richard Wagner
I don't think decoupling game logic and rendering is a good idea. If you do, you need to make sure that only one of them has access to your game's data at any given time (using mutexes, for example). Otherwise you'll most likely run into problems where the position of one of your entities gets updated halfway through rendering it, so one half of the entity gets drawn in one place and the other half somewhere else.

Now if you want to run your logic at intervals of time that don't coincide with your vsync, you can always run your logic code multiple times before rendering each frame. I'm not familiar with GLUT (so my example won't use it), but I believe this should look familiar to you:

void gameloop()
{
    bool live = true;
    const float logic_interval = .1f; // fixed time between logic updates, in seconds
    float accumulator = 0.f;          // unsimulated time carried over between frames
    float cur_time = getTime(), prev_time, dt;
    while (live)
    {
        prev_time = cur_time;
        cur_time = getTime();
        dt = cur_time - prev_time;
        collect_input();
        accumulator += dt;
        while (accumulator >= logic_interval) // one logic step per interval elapsed
        {
            update_logic();
            accumulator -= logic_interval;
        }
        draw_frame();
        swap_buffers();
    }
}
Quote: Original post by CrimsonSun
I don't think decoupling game logic and rendering is a good idea.

It's an extremely good idea if you want a game with a stable physics engine, or one featuring networked multiplayer: basically, any game that would benefit from a fixed update rate for the game logic, regardless of differences in rendering hardware, whether vsync is on, and so forth. Not every game loop needs to be decoupled, but every professional game I've seen has done it.

Quote: Original post by CrimsonSun
If you do, you need to make sure that only one has access to your game's data at a given time (by using mutexes, for example), otherwise you'll most likely end up running into problems where the position of one of your entities gets updated halfway through rendering it and thus one half of the entity gets drawn in one place and one half in another.

I wasn't talking about multithreading particularly (I don't get the impression that GLUT deals with multithreading anyway) so there wouldn't be any need for mutexes. The update function and the render function would be atomic, so you'd know that each function can run all the way through without getting interrupted by the other. (Also, it's possible if you're careful to have multithreaded games work perfectly without any mutexes at all, but that's another story :) )

For what it's worth, I'm looking into doing something like this:

http://www.gaffer.org/game-physics/fix-your-timestep

But whereas that code uses a nice straightforward while(!quit) loop, GLUT doesn't seem to be quite so straightforward in how it loops, which is why I've been trying to pin down its exact behaviour, and the reasons behind it.
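Adapting that article's accumulator idea, the carry-over arithmetic can at least be kept separate from GLUT entirely, which makes it easier to reason about. This is just a sketch: pending_steps, getTime, and update_logic are my own hypothetical names, not part of GLUT or the article:

```cpp
#include <cmath>

// Given the wall-clock time dt since the last frame and a fixed logic step,
// return how many logic updates are now due, storing the remainder in
// 'accumulator' so no simulated time is lost between frames.
// (pending_steps is a hypothetical helper, not a GLUT function.)
int pending_steps(double& accumulator, double dt, double step)
{
    accumulator += dt;
    int steps = 0;
    while (accumulator >= step)
    {
        accumulator -= step;
        ++steps;
    }
    return steps;
}

// Under GLUT this would presumably be driven from the idle callback:
//
//   void idle()
//   {
//       double now = getTime();                        // custom high-precision timer
//       int n = pending_steps(acc, now - last, 0.01);  // e.g. 100Hz logic
//       last = now;
//       while (n--) update_logic();
//       glutPostRedisplay();  // let vsync pace the actual draw in display()
//   }
```

That way the logic rate is fixed by logic/step size rather than by whatever rate GLUT happens to call the idle func, which sounds like the behaviour I'm after.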
"We two, the World and I, are stubborn fellows at loggerheads, and naturally whichever has the thinner skull will get it broken" - Richard Wagner

This topic is closed to new replies.
