SDL: Regulating frame rate

9 comments, last by Joakim Thor 10 years, 11 months ago

Hey

I have followed most of the beginner tutorials on SDL that I can find and am finally getting around to making my first game. Something I want to do now is move from beginner standard toward industry standard, and the first step is switching from controlling frame rate by ticks to using elapsed time, since I have seen lots of comments saying this is much better.

I cannot, however, find anything out there that explains how to do this, so I am posting here hoping for a basic description of how to do it and, ideally, a code snippet.

Thanks in advance!


Didn't really understand what you want... Do you want to limit your game's frame rate, for instance to 60 frames per second? Or could it be that you want to control your game by time (which is the standard), so that, for instance, the movement of a character over 2 seconds is the same for someone running at 30 FPS and someone else running at 200 FPS?

Currently working on a scene editor for ORX (http://orx-project.org), using kivy (http://kivy.org).

If you have a good level of understanding with what you have learnt so far, and feel comfortable with your language, these two articles provide quite detailed explanations of what (I assume) you would like to learn.

They cover the topics of using delta time in your games, implementing a fixed time-step, and rendering using interpolation. I have read through them a few times in the past but have yet to put them into practice (studying Maths in my spare time at the moment). The logic seems quite sound, but I can't recommend them from a practical perspective.

I hope this helps,

Stitchs.

EDIT: I just realised I never linked the actual articles; I do apologise:

http://gafferongames.com/game-physics/fix-your-timestep/

http://www.koonsolo.com/news/dewitters-gameloop/
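
In case a concrete picture helps before you read them, here is a minimal sketch of the kind of fixed time-step loop those articles describe (the names running, Update and Render, and the 16 ms step, are just placeholders for this example, not anything taken from the articles):

// Fixed time-step sketch: simulate in constant 16 ms slices, render once per loop.
const Uint32 STEP_MS = 16;           // one simulation step = 16 milliseconds
Uint32 previous = SDL_GetTicks();
Uint32 accumulator = 0;

while (running)
{
    Uint32 now = SDL_GetTicks();
    accumulator += now - previous;   // real time we still owe the simulation
    previous = now;

    while (accumulator >= STEP_MS)   // catch up in fixed-size steps
    {
        Update(STEP_MS);             // game logic always sees the same step size
        accumulator -= STEP_MS;
    }

    Render();                        // draw once per loop iteration
}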

See Lazy Foo's tutorials. First this one, then this one. Might want to look at this article also.

Eh, so at the moment I am using the approach where you call GetTicks, work out if the frame is going too fast, and delay it if it is, the way the above tutorials show it.
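
To show roughly what I mean, my current loop is basically the usual cap-and-delay pattern (just a sketch; FRAME_MS, running and the loop body are placeholder names, not my exact code):

// Frame cap sketch: if the frame finished early, sleep off the remainder.
const Uint32 FRAME_MS = 1000 / 60;     // target roughly 60 frames per second

while (running)
{
    Uint32 frameStart = SDL_GetTicks();

    // ... handle events, update, render ...

    Uint32 elapsed = SDL_GetTicks() - frameStart;
    if (elapsed < FRAME_MS)
        SDL_Delay(FRAME_MS - elapsed); // wait out the rest of the frame
}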

I have read in quite a few places that it's best to handle your animation and similar things using time passed instead of frames passed, so I want to do it that way but cannot find anything on it.

Is that clearer?

Stitchs, you didn't link the articles in your post; I would love to see them though!

Thanks again

SDL_GetTicks() returns the number of milliseconds since SDL was initialized.

If, in your update loop, you save the delta:


Uint32 time = SDL_GetTicks();
Uint32 delta = time - lastTime;       // milliseconds since the previous frame
lastTime = time;                      // lastTime is a variable that you keep between frames

// move something 15 units per second in x
float deltaSeconds = 0.001f * delta;  // ms to sec
myObject.x += 15.0f * deltaSeconds;

If you have a low FPS, delta time will be high; if the FPS is high, delta time will be low. Using delta time means the movement of myObject will be frame rate independent.

Eh, so at the moment I am using the approach where you call GetTicks, work out if the frame is going too fast, and delay it if it is, the way the above tutorials show it.

That's the way the first tutorial I linked to shows it, yes.

I have read in quite a few places that it's best to handle your animation and similar things using time passed instead of frames passed, so I want to do it that way but cannot find anything on it.

That's the way the second tutorial I linked to does it. :)

Thanks for both your posts, I finally have something to go on! :)

I did go through that tutorial, Servant of the Lord, but it confused me because, as far as I can tell, the timer was not delaying anything; it was just counting and resetting.

I am most definitely wrong; I just didn't understand it, sorry.

Thanks again!

No problem. If you don't understand any of it, let us know what parts and we can explain it.

I personally like to keep my 'deltas' as floats, with 1.0 being one second of time - I find them easier to manage that way.


amountToMoveThisFrame = (MovementSpeedOverOneSecond * deltaTime);
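
With SDL_GetTicks() that deltaTime would be computed something like this (a sketch; lastTicks, player and playerSpeedPerSecond are just names picked for the example):

Uint32 ticks = SDL_GetTicks();
float deltaTime = (ticks - lastTicks) / 1000.0f;  // milliseconds -> seconds, so 1.0f == one second
lastTicks = ticks;                                // remember for the next frame

player.x += playerSpeedPerSecond * deltaTime;     // e.g. 15.0f units per second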

Oh, so a delay isn't actually used; that makes much more sense. Thanks so much!

This topic is closed to new replies.
