SDL Regulating frame rate

As others have mentioned, you usually don't use SDL_Delay() to regulate framerate. While it is a quick and easy way to fix a game running too fast, the results will vary on computers that are slower or faster than your own.
Usually velocity is measured in pps (pixels per second). If you want something to move 10 pixels per second, you set velocity = 10. deltaTime is how long it took to run a frame. If your game takes 1 second to run 1 frame, you get:
velocity * deltaTime = 10 pps * 1000 ms = 10 pps * 1 s = 10 ppf (10 pixels per frame)
0.5 s to run 1 frame would be:
velocity * deltaTime = 10 pps * 500 ms = 10 pps * 0.5 s = 5 ppf
So if your game runs faster, your character moves a shorter distance per frame; if your game runs slower, your character moves farther per frame. If the game runs at double the speed, the character only moves 5 ppf; if it runs at 1 s per frame, he moves 10 ppf. This should make sense. This makes gameplay run equally fast on different computers, even though the framerates differ. This is what many others have said before, but no one mentioned pixels per second or frames per second, which, IMHO, makes this subject easier to think about.
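For example, here is a minimal sketch of a frame-rate-independent position update (positionX, velocityX and update() are just illustrative names, not anything from SDL):


float positionX = 0.0f;            // current x position, in pixels
float velocityX = 10.0f;           // desired speed, in pixels per second (pps)

void update(float deltaSeconds)    // time the last frame took, in seconds
{
   positionX += velocityX * deltaSeconds;   // a 1.0 s frame moves 10 px, a 0.5 s frame moves 5 px
}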

Tip: When using SDL, everything is measured in ms, NOT seconds, which is why I used ms in my explanation. When calculating deltaTime you would do something like:


Uint32 previousFrameTime = 0;   // timestamp of the previous frame, in milliseconds

Uint32 getDeltaTime()
{
   Uint32 currentTime = SDL_GetTicks();
   Uint32 deltaTime = currentTime - previousFrameTime;   // elapsed ms since the last frame
   previousFrameTime = currentTime;
   return deltaTime;
}

This would return 1000 if the time between frames is 1 second. Doing position += velocity * getDeltaTime() will therefore scale your velocity by 1000 if you defined your velocity in pps. You must either define your velocity in ppms (pixels per millisecond) or divide deltaTime by 1000, using floating-point division so sub-second frames don't truncate to 0:


float getDeltaTime()
{
   Uint32 currentTime = SDL_GetTicks();
   Uint32 deltaTime = currentTime - previousFrameTime;
   previousFrameTime = currentTime;
   return deltaTime / 1000.0f;           // seconds instead of milliseconds (float, so it doesn't truncate to 0)
}
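
To give an idea of how it fits together, here is a rough sketch of a main loop using getDeltaTime(); the movement line reuses the illustrative positionX and velocityX from above, and everything besides the timing pattern is a placeholder:


previousFrameTime = SDL_GetTicks();      // prime the timer so the first delta isn't huge

bool running = true;
while (running)
{
   float deltaSeconds = getDeltaTime();           // seconds since the previous frame
   positionX += velocityX * deltaSeconds;         // frame-rate independent movement

   // ... handle events, update the rest of the game, render ...
}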

"Division is an expensive operation," you might say, but considering the circumstances it's a piss in the ocean (as we say in Sweden, meaning "it doesn't matter"). Dividing by 1000 once every frame isn't going to cause your game to melt down. New game programmers, myself included, tend to prematurely optimize everything, and it's easy to think dividing by 1000 every single frame is bad code. It isn't.

If you don't understand the stuff written here, please sharpen your C++ skills.

