Beshon

SDL Regulating frame rate

10 posts in this topic

Hey

 

I have followed most of the beginner tutorials on SDL that I can find, and am finally getting around to making my first game. Something I want to do now is move from beginner standard to industry standard, and the first step is to stop regulating the frame rate by ticks and start using time instead, as I have seen lots of comments saying this is much better.

 

I cannot, however, find anything out there that explains how to do this, so I am posting here in the hope of a basic description of how to do it, and ideally a code snippet.

 

Thanks in advance!


I didn't really understand what you want. Do you want to limit your game's frame rate, for instance to 60 frames per second? Or do you want to control your game by time (which is the standard), so that, for instance, the movement of a character over 2 seconds is the same for someone running at 30 FPS and someone else running at 200 FPS?


If you have a good level of understanding of what you have learnt so far, and feel comfortable with your language, these two articles provide quite detailed explanations of what (I assume) you would like to learn.

 

They cover using delta time in your games, implementing a fixed time-step, and rendering using interpolation. I have read through them a few times in the past but have yet to put them into practice (studying Maths in my spare time at the moment). The logic seems quite sound, but I can't vouch for them from a practical perspective.

 

I hope this helps,

 

Stitchs.

 

EDIT: I just realised I never linked the actual articles, I do apologise:

 

http://gafferongames.com/game-physics/fix-your-timestep/

 

http://www.koonsolo.com/news/dewitters-gameloop/
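
To give a flavour of what the first article describes, here is a minimal sketch of a fixed time-step loop with render interpolation, adapted to SDL. updateGame() and renderGame() are hypothetical placeholders for your own functions, not from any library:

const Uint32 STEP_MS = 16;             // simulation always advances ~16 ms
Uint32 previous = SDL_GetTicks();
Uint32 accumulator = 0;
bool running = true;                   // set to false on SDL_QUIT

while (running)
{
    Uint32 now = SDL_GetTicks();
    accumulator += now - previous;     // bank the real time that passed
    previous = now;

    while (accumulator >= STEP_MS)     // consume it in fixed slices
    {
        updateGame(STEP_MS / 1000.0f); // each update is a deterministic 16 ms
        accumulator -= STEP_MS;
    }

    // the leftover fraction of a step can be used to interpolate rendering
    float alpha = accumulator / (float)STEP_MS;
    renderGame(alpha);
}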

Edited by stitchs

Eh, so at the moment I am using the approach where you use GetTicks, work out if the game is running too fast, and delay it if it is, the way the above tutorials show it.

 

I have read in quite a few places that it's best to handle your animation and things like that using time passed instead of frames passed, so I want to do it this way, but I cannot find anything on it.

 

Is that clearer? 

 

Stitchs, you didn't post the articles in your post; I would love to see them though!

 

Thanks again

Edited by Beshon

SDL_GetTicks returns the number of milliseconds since the program started.

If, in your update loop, you save the delta:

Uint32 time = SDL_GetTicks();
Uint32 delta = time - lastTime;      // milliseconds since the last frame
lastTime = time;                     // lastTime is a variable that you keep between frames

// move something 15 units per second in x
float deltaSeconds = 0.001f * delta; // ms to seconds
myObject.x += 15.0f * deltaSeconds;

If you have a low FPS, delta time will be high; if the FPS is high, delta time will be low. Using delta time means the movement of myObject is frame rate independent. For example, at 50 FPS delta is about 20 ms, so the object moves 15 * 0.02 = 0.3 units that frame; at 150 FPS it only moves about 0.1 units per frame, but either way it covers 15 units over a full second.

Edited by HermanssoN

"Eh, so at the moment I am using the approach where you use GetTicks, work out if the game is running too fast, and delay it if it is, the way the above tutorials show it."

That's the way the first tutorial I linked to shows it, yes.
 

"I have read in quite a few places that it's best to handle your animation and things like that using time passed instead of frames passed, so I want to do it this way, but I cannot find anything on it."

That's the way the second tutorial I linked to does it. :)


Thanks for both your posts, I finally have something to go on! :)

 

I did go through that tutorial, Servant of the Lord, but it confused me because, as far as I can tell, the timer was not delaying anything; it was just counting and resetting.

 

I am most definitely wrong, I just didn't understand it, sorry.

 

Thanks again!


No problem. If you don't understand any of it, let us know what parts and we can explain it.

I personally like to keep my 'deltas' as floats, with 1.0 being one second of time - I find them easier to manage that way.

 

 

float amountToMoveThisFrame = (MovementSpeedOverOneSecond * deltaTime); // deltaTime in seconds
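
A minimal sketch of how such a float delta might be produced from SDL_GetTicks(); the names here are mine, not from any particular library:

Uint32 lastTicks = 0; // set to SDL_GetTicks() once at startup

float getDeltaSeconds()
{
    Uint32 now = SDL_GetTicks();
    float delta = (now - lastTicks) / 1000.0f; // 1.0f == one second
    lastTicks = now;
    return delta;
}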

Oh, so delay isn't actually used; that makes much more sense, thanks so much!

As others have mentioned, you usually don't use SDL_Delay() to regulate frame rate. While it is a quick and easy way to stop a game running too fast, the results will vary on computers slower or faster than your own.
 
Usually velocity is measured in pps, pixels per second. If you want something to move 10 pixels per second, you set velocity = 10. deltaTime is how long it took to run a frame. If your game takes 1 second to run 1 frame, you get:
 
velocity * deltaTime = 10 pps * 1000 ms (= 1 s) = 10 ppf (10 pixels per frame)
 
0.5 s to run 1 frame would be:
 
velocity * deltaTime = 10 pps * 500 ms (= 0.5 s) = 5 ppf
 
So if your game runs faster, your character moves a shorter distance per frame; if your game runs slower, it moves a longer distance per frame. If the game runs at double the speed, the character only moves 5 ppf; if the game runs at 1 s/frame, he moves 10 ppf. This should make sense. It makes gameplay run equally fast on different computers, even with differing frame rates. This is what many others said before, but no one mentioned pixels per second or frames per second, which, IMHO, makes this subject easier to think about.

 

Tip: when using SDL, everything is measured in ms, NOT seconds, hence why I used ms in my explanation. When calculating deltaTime you would do something like:

 

Uint32 previousFrameTime = 0; // kept between frames

Uint32 getDeltaTime()
{
   Uint32 currentTime = SDL_GetTicks();
   Uint32 deltaTime = currentTime - previousFrameTime;
   previousFrameTime = currentTime; // call SDL_GetTicks() only once per frame
   return deltaTime;
}

 

This would return 1000 if the time between each frame is 1 second. Doing position += velocity * getDeltaTime() will scale your velocity by 1000 if you defined your velocity as pps. You must either define your velocity as ppms (pixels per millisecond) or divide deltaTime by 1000:

 

float getDeltaTime()
{
   Uint32 currentTime = SDL_GetTicks();
   Uint32 deltaTime = currentTime - previousFrameTime;
   previousFrameTime = currentTime;
   return deltaTime / 1000.0f;  // seconds instead of milliseconds
                                // (must be a float division: integer division
                                // by 1000 would return 0 for any frame that
                                // takes less than a second)
}

 

Division is an expensive operation, you might say, but considering the circumstances it's a piss in the ocean (as we say in Sweden, instead of "it doesn't matter"). Dividing by 1000 once every frame isn't going to cause your game to melt down. New game programmers tend to prematurely optimize everything, myself included, and it's easy to think dividing by 1000 every single frame is bad code. It isn't.
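
To tie it together, a hypothetical usage sketch; player and PLAYER_SPEED are placeholder names of mine:

const float PLAYER_SPEED = 10.0f;        // pps (pixels per second)

// somewhere in the game loop:
float deltaSeconds = getDeltaTime();     // the float version above
player.x += PLAYER_SPEED * deltaSeconds; // same speed at 30 or 200 FPS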

Edited by Kuxe
