Time-based programming
Hi, I'm writing a 2D platform game, but I've now run into a problem.
I first programmed it to run as fast as possible (frame-based, no time factor), which ran smoothly. I have now made it time-based: animations advance using the delta time, and movement is updated with position += speed * deltatime. If I don't run other applications in the background it runs decently, but when some background applications are present the movement isn't that smooth (the frame skipping within the animation isn't noticeable). It is only a 1.4 GHz machine, though. I'm using Direct3D with the ID3DXSprite interface.
I do collision detection against background tiles: the algorithm checks whether speed * deltatime can be added to the position, and if there is less room than that, only the maximum available pixels are added to the position. So far no problem.
But when the main character is jumping there's no such limitation. The speed is decreased during the jump, but when the system slows down mid-jump, deltatime increases significantly, which I think results in a higher jump during a slowdown of the computer. Any suggestions on how to handle this? I don't think checking how much time has passed is an option, because with a huge delay the character could already be falling in the next frame (having already passed its apex) while the current frame is still at the start of the jump.
I also read about waiting for a number of clock ticks, but that doesn't give consistent movement when there's a delay (same problem as running as fast as possible, frame-based). I also read a thread saying that time-based programming is preferable for 3D games and frame-based programming for 2D; I don't know how far that is the standard.
Further, I'm thinking about how to handle the case where the next animation frame is wider than the current one (I want to use differently sized frames) and there's no room left. The objects have then already run into each other, so anticipating is no longer possible, which is what I do during the check on horizontal movement. I think I can figure it out myself, but any tips on that would come in handy.
Thanks
One of the easiest solutions is to use a fixed timestep.
Let's say you have a deltatime of 30 ms and a fixed timestep of 10 ms (100 logic frames per second). Then you would simply run the update function 3 times during that frame.
Pseudo code:
int timestep = 10;  // ms of game time per logic update
int carryOver = 0;  // leftover ms from the previous frame
while (!done) {
    int deltatime = calculateDT() + carryOver;
    for (int i = 0; i < deltatime / timestep; i++)
        update();
    carryOver = deltatime % timestep;
}
Edit: you can/should of course also add safeguards that stop consuming time if the game lags behind too much (so that the logic can catch up).
(For multiplayer games it is better to try to resync with the server, though.)
OK, that's useful. I'm going to try the fixed-timestep algorithm. I was wondering, what is the normal way of implementing such a safeguard? In my main loop I determine how much time has passed since the last frame (I get it from my timer object). So when, for example, 10 frames of at least 20 ms have passed, I should pause the game?
This topic is closed to new replies.