## SDL Regulating frame rate

Old topic!

Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

10 replies to this topic

### #1 Beshon  Members

Posted 09 May 2013 - 07:55 AM

Hey

I have followed most of the beginner SDL tutorials I could find and am finally getting around to making my first game. Something I want to do now is move from beginner standard to industry standard, and the first step is switching from controlling frame rate by ticks to using elapsed time, as I have seen lots of comments from people saying this is much better.

I cannot, however, find anything out there that explains how to do this, so I am posting here for a basic description of how to do it and, hopefully, a code snippet.

### #2 KnolanCross  Members


Posted 09 May 2013 - 11:42 AM

I didn't really understand what you want. Do you want to limit your game's frame rate, for instance to 60 frames per second? Or do you want to control your game by time (which is the standard), so that, for instance, a character's movement over 2 seconds is the same for someone running at 30 FPS and someone else running at 200 FPS?

Currently working on a scene editor for ORX (http://orx-project.org), using kivy (http://kivy.org).

### #3 stitchs  Members


Posted 09 May 2013 - 12:40 PM

If you have a good level of understanding with what you have learnt so far, and feel comfortable with your language, these two articles provide quite detailed explanations of what (I assume) you would like to learn.

They cover using delta time in your games, implementing a fixed time-step, and rendering with interpolation. I have read through them a few times in the past but have yet to put them into practice (studying maths in my spare time atm). The logic seems quite sound, but I can't recommend them from a practical perspective.

I hope this helps,

Stitchs.

EDIT: I just realised I never linked the actual articles, I do apologise:

http://gafferongames.com/game-physics/fix-your-timestep/

http://www.koonsolo.com/news/dewitters-gameloop/

Edited by stitchs, 10 May 2013 - 10:32 AM.

### #4 Servant of the Lord  Members


Posted 09 May 2013 - 01:32 PM

See Lazy Foo's tutorials. First this one, then this one. Might want to look at this article also.

It's perfectly fine to abbreviate my username to 'Servant' or 'SotL' rather than copy+pasting it all the time.
All glory be to the Man at the right hand... On David's throne the King will reign, and the Government will rest upon His shoulders. All the earth will see the salvation of God.
Of Stranger Flames -

### #5 Beshon  Members


Posted 09 May 2013 - 02:17 PM

Eh, so at the moment I am using the approach where you use GetTicks, work out if the game is going too fast, and delay it if it is, the way the above tutorials show it.

I have read in quite a few places that it's best to handle your animation and the like using time passed instead of frames passed, so I want to do it this way but cannot find anything on it.

Is that clearer?

Stitchs, you didn't post the articles in your post; I would love to see them though!

Thanks again

Edited by Beshon, 09 May 2013 - 02:18 PM.

### #6 HermanssoN  Members


Posted 09 May 2013 - 02:44 PM

GetTicks returns the number of milliseconds since the program started.

If, in your update loop, you save the delta:

```cpp
int time = GetTicks();
int delta = time - lastTime; // milliseconds since the previous frame
lastTime = time;             // lastTime is a variable you keep between frames

// move something 15 units per second in x
float deltaSeconds = 0.001f * delta; // ms to sec
myObject.x += 15 * deltaSeconds;
```

If you have a low FPS, delta time will be high; if the FPS is high, delta time will be low. Using delta time means the movement of myObject will be frame rate independent.

Edited by HermanssoN, 09 May 2013 - 02:46 PM.

### #7 Servant of the Lord  Members


Posted 09 May 2013 - 02:51 PM

> Eh, so at the moment I am using the approach where you use GetTicks, work out if the game is going too fast, and delay it if it is, the way the above tutorials show it.

That's the way the first tutorial I linked to shows it, yes.

> I have read in quite a few places that it's best to handle your animation and the like using time passed instead of frames passed, so I want to do it this way but cannot find anything on it.

That's the way the second tutorial I linked to does it.


### #8 Beshon  Members


Posted 09 May 2013 - 03:01 PM

Thanks for both your posts, I finally have something to go on!

I did go through that tutorial, Servant of the Lord, but it confused me because, as far as I can tell, the timer was not delaying anything; it was just counting and resetting.

I am most definitely wrong; I just didn't understand it, sorry.

Thanks again!

### #9 Servant of the Lord  Members


Posted 09 May 2013 - 03:07 PM

No problem. If you don't understand any of it, let us know what parts and we can explain it.

I personally like to keep my 'deltas' as floats, with 1.0 being one second of time - I find them easier to manage that way.

amountToMoveThisFrame = (MovementSpeedOverOneSecond * deltaTime);
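One way to get such float-second deltas, sketched here with std::chrono so it runs standalone (an SDL program would typically divide SDL_GetTicks() differences by 1000.0f instead):

```cpp
#include <chrono>

// Returns the time elapsed since the previous call, in seconds,
// with 1.0f meaning one second.
float getDeltaSeconds() {
    using Clock = std::chrono::steady_clock;
    static Clock::time_point last = Clock::now(); // kept between calls
    Clock::time_point now = Clock::now();
    float delta = std::chrono::duration<float>(now - last).count();
    last = now;
    return delta;
}
```

Then `amountToMoveThisFrame = MovementSpeedOverOneSecond * getDeltaSeconds();` works directly with speeds expressed in units per second.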


### #10 Beshon  Members


Posted 09 May 2013 - 03:31 PM

Oh, so delay isn't actually used; that makes much more sense. Thanks so much!

### #11 stillLearning()  Members


Posted 10 May 2013 - 10:00 AM

As others have mentioned, you usually don't use SDL_Delay() to regulate frame rate. While this is a quick and easy way to fix a game running too fast, the results will vary on computers slower or faster than your own.

Usually velocity is measured in pps, pixels per second. If you want something to move 10 pixels per second, you set velocity = 10. deltaTime is how long it took to run a frame. If your game takes 1 second to run 1 frame, you get:

velocity * deltaTime = 10 pps * 1000 ms (= 1 s) = 10 ppf (10 pixels per frame)

0.5 s to run 1 frame would be:

velocity * deltaTime = 10 pps * 500 ms (= 0.5 s) = 5 ppf

So if your game runs faster, your character moves less per frame; if your game runs slower, your character moves further per frame. If the game runs at double the speed, the character only moves 5 ppf; at 1 s per frame he moves 10 ppf. This should make sense. It makes gameplay run equally fast on different computers, just with differing frame rates. This is what many others said before, but no one mentioned pixels per second or frames per second, which, IMHO, makes this subject easier to think about.

Tip: when using SDL everything is measured in ms, NOT seconds, hence why I used ms in my explanation. When calculating deltaTime you would do something like:

```cpp
Uint32 getDeltaTime()
{
    Uint32 deltaTime = SDL_GetTicks() - previousFrameTime;
    previousFrameTime = SDL_GetTicks(); // previousFrameTime is kept between frames
    return deltaTime;
}
```


This would return 1000 if the time between frames is 1 sec. Doing position += velocity * getDeltaTime() will scale your velocity by 1000 if you defined your velocity in pps. You must either define your velocity in ppms (pixels per millisecond) or divide deltaTime by 1000:

```cpp
float getDeltaTime()
{
    Uint32 deltaTime = SDL_GetTicks() - previousFrameTime;
    previousFrameTime = SDL_GetTicks();
    return deltaTime / 1000.0f; // seconds instead of milliseconds
                                // (divide by 1000.0f, not 1000: integer division
                                // would truncate any frame under 1 s to 0)
}
```


Division is an expensive operation, you might say, but considering the circumstances it's a piss in the ocean (as we say in Sweden, instead of "it doesn't matter"). Dividing by 1000 once every frame isn't going to cause your game to melt down. New game programmers tend to prematurely optimize everything, myself included, and it's easy to think dividing by 1000 every single frame is bad code. It isn't.

Edited by Kuxe, 10 May 2013 - 10:19 AM.

If you don't understand the stuff written here, please sharpen your C++ skills.
