opinions on frame based vs time based movement

36 comments, last by graveyard filla 19 years, 10 months ago
quote:
Wasted processing time. You're just doing nothing waiting for 16 milliseconds to pass by.


Actually, it's not wasted time. The delay function _frees_ the processor, allowing whatever tasks you are running in the background (like a separate audio thread) better access to the processor. Not wasted at all.
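
As a rough illustration of that point, here is a minimal sketch of a capped loop that sleeps off the leftover frame time instead of spinning (assuming SDL; the 16 ms target and the UpdateGame/DrawFrame placeholders are made up for the example, not code from this thread):

#include <SDL.h>

// Sketch only: run logic and rendering, then hand the rest of the ~16 ms
// frame back to the OS with SDL_Delay so background threads can run.
// Assumes SDL has already been initialised with SDL_Init().
void cappedLoop(bool &running)
{
    const Uint32 FRAME_MS = 16;                 // roughly 60 frames per second

    while (running)
    {
        Uint32 frameStart = SDL_GetTicks();

        // UpdateGame();                        // placeholder: game logic
        // DrawFrame();                         // placeholder: rendering

        Uint32 elapsed = SDL_GetTicks() - frameStart;
        if (elapsed < FRAME_MS)
            SDL_Delay(FRAME_MS - elapsed);      // yields the CPU, no busy-wait
    }
}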

quote:
Also, as mentioned, on a slow system capable of less than 60 fps this code would not degrade gracefully.


Wow, can you tell just by looking at it how graceful it will be? Actually, I upped the resolution and put a few thousand asteroids on the screen to see how well the game played when the processor couldn't keep up. Amazingly, the gameplay became only a bit choppy but remained responsive, as user input is still checked and accounted for each frame. Also, the speed of gameplay did not change. You should try it out before you bash it.

quote:
Rough Example.


I said it was my first game and my timing code does work, however "rough" it may be.

While Main_Loop
{
    While Last_Time + Interval <= Current_Time
    {
        Last_Time = Last_Time + Interval
        Do_AI()
    }
    DrawFrame
}
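
For reference, that pseudocode might translate into SDL along these lines (a sketch only; Do_AI(), DrawFrame() and the 16 ms interval are placeholders rather than code from this thread):

#include <SDL.h>

// Sketch of the fixed-timestep loop above: logic advances in fixed 16 ms
// steps, and rendering happens once per pass no matter how many steps ran.
// Assumes SDL has already been initialised with SDL_Init().
void fixedStepLoop(bool &running)
{
    const Uint32 INTERVAL = 16;                 // one logic step, ~60 per second
    Uint32 lastTime = SDL_GetTicks();

    while (running)
    {
        Uint32 currentTime = SDL_GetTicks();
        while (lastTime + INTERVAL <= currentTime)
        {
            lastTime += INTERVAL;
            // Do_AI();                         // placeholder: one logic step
        }
        // DrawFrame();                         // placeholder: render a frame
    }
}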


That's a good idea. Actually, from the discussion on this forum I've decided to rewrite the game using time-based movement. I agree it would probably be better in the long run. It will make collision detection more difficult, along with all the other logic that counted frames. I guess I spent a lot of time with old consoles and console emulators and kind of assumed a fixed frame rate was standard.


steveth45 = new gamecoder;
I would not pass the time since the last frame. The problem is the timer's tick resolution. If you are getting 500 fps and the timer only has 1000 ticks per second, you are passing either 1 or 2 ticks as the "time since last interval". You can see how there is a 100% difference between 1 and 2, and 50% between 2 and 3. That little difference shows up as herky-jerky movement.

Pass the current time. Not all objects will use time the same way. If an object needs the interval time, it can calculate that itself. Or you can keep TIME and INTERVAL_TIME as global variables so objects can use either one and you only calculate them once. Whatever you want.
What do you mean by pass the current time? You mean something like this:

//main game loop
while(!done)
{
  Uint32 time = SDL_GetTicks();

  //do my whole game here
  //Player.Update(time);
  //Enemy.Update(time);
}

Player/Enemy::Update(Uint32 time)
{
  //now what? what do i do with the current time?
}



You see, I'm a little confused about what to do with the time once my object has it. Also, why not call SDL_GetTicks() from inside of Update()? Then the objects wouldn't need to receive the time; they could find out for themselves. OR, do you purposely only call SDL_GetTicks() ONCE, so there is only ONE time for all your objects? Because if you called SDL_GetTicks() from each Update(), like this:

player.Update();
enemy.Update();


the player Update and the enemy Update would have DIFFERENT times, maybe only different by a few milliseconds, but still different. Would this throw things out of sync? Is that why you only take the time once, so all the objects have the same time to work with? But like I said, could you please explain what to do once the object has the time? In my above example, I could calculate movement if I had the time that passed since the last frame, but how do I calculate movement if I have just the current clock ticks? Thanks for any help!!!


FTA, my 2D futuristic action MMORPG
Here is a timer class:

class CTimer
{
protected:
  double  m_Frequency;
  __int64 m_StartClock;

  float m_FrameTime;
  float m_FrameStart;
  float m_FrameEnd;

public:
  float  GetFrameTime() { return m_FrameTime; }
  double GetTime();

  void Init();
  void Update();
};


#include <windows.h>
#include "timer.h"

double CTimer::GetTime()
{
  __int64 EndClock;

  QueryPerformanceCounter((LARGE_INTEGER*)&EndClock);

  return (double)(EndClock - m_StartClock) * m_Frequency;
}

void CTimer::Init()
{
  __int64 rate;

  // Get the performance frequency
  QueryPerformanceFrequency((LARGE_INTEGER*)&rate);

  // Invert it so we can multiply instead of divide
  m_Frequency = 1.0 / (double)rate;

  // Get the start time
  QueryPerformanceCounter((LARGE_INTEGER*)&m_StartClock);

  m_FrameTime  = 0.0f;
  m_FrameStart = (float)GetTime();
  m_FrameEnd   = 0.0f;
}

void CTimer::Update()
{
  // Cap the frame rate to the timer frequency
  // The timer frequency is normally always faster than the frame rate
  do {
    m_FrameEnd = (float)GetTime(); // Get the end frame time
  } while (m_FrameEnd == m_FrameStart);

  m_FrameTime  = m_FrameEnd - m_FrameStart; // Get the elapsed time
  m_FrameStart = m_FrameEnd; // Set the end time to be the next start time
}


Here is how you can use it:

CTimer timer;
timer.Init();

// game loop
{
  // update the timer
  timer.Update();

  // update sprite animation at 1 frame per second
  sprite.updateAnimation(1.0, timer.GetFrameTime());
}

QueryPerformanceFrequency(&frame_delay);
frame_delay.QuadPart /= 85; // set at 85 fps
QueryPerformanceCounter(&ticks);
QueryPerformanceCounter(&last_frame);

//------------------------GAME LOOP-------------------------------
do
{
    do
    {
        while (PeekMessage(&Msg, NULL, 0, 0, PM_REMOVE))
        {
            TranslateMessage(&Msg);
            DispatchMessage(&Msg);
            switch (Msg.message)
            {
            case WM_LBUTTONDOWN:
                break;
            case WM_QUIT:
                game_over = true;
            }
        }
        QueryPerformanceCounter(&ticks);
    } while (ticks.QuadPart < last_frame.QuadPart + frame_delay.QuadPart);

    QueryPerformanceCounter(&last_frame);

    Inputs(Game);
    UpdateWorld(Game);
    HitDetection(Game);
    Refresh(Game);
} while (!game_over);



and I have constant deltas for 85 fps, as one would expect. To support a constant framerate determined at startup, I could easily give those deltas a coefficient. Now, if I'm going to be locking the framerate to the refresh rate in this manner, is it worth worrying about rare instances of poor performance? (This is a 2D game anyway.)

[edited by - unliterate on May 27, 2004 9:49:29 PM]
Well, you only want to call SDL_GetTicks() ONCE. Otherwise some of your objects will move more or less than other objects. You want them all to move the same amount each frame, so give everybody the same time. Calling it only once also cuts down on function calls, especially if you have thousands of objects that need the time. By making the time a global, you can also cut down on passing it around every loop.
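
As a rough sketch of that setup (the names here are illustrative, assuming SDL): the clock is sampled exactly once at the top of the frame, and both the absolute time and the frame-to-frame interval are kept where every object can read them.

#include <SDL.h>

// Sketch only: one SDL_GetTicks() call per frame, shared by every object.
// Assumes SDL has already been initialised with SDL_Init().
Uint32 TIME = 0;            // absolute time this frame, in milliseconds
Uint32 INTERVAL_TIME = 0;   // time elapsed since the previous frame

void advanceClock()         // call once at the top of every frame
{
    Uint32 now = SDL_GetTicks();
    INTERVAL_TIME = now - TIME;   // the delta is computed once for everybody
    TIME = now;                   // also set TIME = SDL_GetTicks() once at
                                  // startup so the first interval isn't huge
}

struct Mover                // an object that happens to want the interval
{
    float x = 0.0f;
    float pixelsPerSecond = 100.0f;

    void update()
    {
        // objects that only need the absolute time can read TIME instead
        x += pixelsPerSecond * (INTERVAL_TIME / 1000.0f);
    }
};
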
lei (I know you weren't addressing me, but) would that apply if the framerate is locked? That doesn't seem to account for the one problem applicable to the way I'm doing things, which would be certain game function calls taking way longer than expected (because of something going on in the game, like lots of explosions or something). Also, by 'time' do you mean the difference between the current time and the last time check?
I don't want to sound like my way is the best way, but the way *I* do it is just to let the game loop run as fast as it can and update everything as fast as I can. It doesn't matter if the game is bogged down at 10 FPS or running like the Devil at 500 FPS; objects still all move at the same rate. Instead of saying an object should move X pixels per frame, you make it X pixels per second.

I pass the current absolute time to my objects. Different objects handle that time differently. Some completely ignore it. Some keep track of the last update and calculate the difference. Some only care about the absolute time (if I want something to happen at exactly 10 seconds into the game or something), some need the difference (moving objects), some need the time since they were born (particles), some need the time relative to their whole lifespan (conditional animations), etc.

Now to (not) answer your question, I'm not sure how to get the framerate to be nice while doing other CPU-intensive activities like AI, besides using threads. I haven't started AI functions for my project. I'm having too much fun designing special FX :-) But when the screen gets really bogged down, the framerate slows to a crawl, yet objects still move at the same pace.
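
To make that concrete, here is a small sketch of two objects that receive the same absolute time but use it differently (the class and member names are made up for illustration; the time is assumed to be milliseconds from SDL_GetTicks()):

#include <SDL.h>

// A moving object: remembers its last update and works from the difference.
struct Ship
{
    float  x = 0.0f;
    float  pixelsPerSecond = 120.0f;
    Uint32 lastUpdate = 0;

    void update(Uint32 now)
    {
        if (lastUpdate != 0)
        {
            float dt = (now - lastUpdate) / 1000.0f;   // seconds since last call
            x += pixelsPerSecond * dt;
        }
        lastUpdate = now;
    }
};

// A particle: only cares about how long it has been alive.
struct Particle
{
    Uint32 birthTime;
    Uint32 lifespanMs = 2000;

    explicit Particle(Uint32 now) : birthTime(now) {}

    bool update(Uint32 now)
    {
        Uint32 age = now - birthTime;
        return age < lifespanMs;    // false once its lifespan is over
    }
};

Both objects get the same "now" that was sampled once per frame; what they derive from it is up to them.
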
Haven't read everybody's responses, but this is how I do it, and it works fine for me.

1. Set a time at which the game objects move at "normal" speed. This is like the standard time.
2. Get the time spent to render a single frame. Compare this time with the standard time.
3. Adjust all movements of the objects according to the comparison.

example:
// the game objects move at normal speed when it's 60 fps
float standard_time = 60.0f / 1000.0f;

// this will adjust all movements of the objects
float fps_factor = 1.0f;

void render()
{
   float start_time = SDL_GetTicks();

   // do stuff
   objectA.x += objectA.speed * fps_factor;
   objectB.x += objectB.speed * fps_factor;

   float end_time = SDL_GetTicks();
   float difference = end_time - start_time;   // elapsed frame time in ms
   fps_factor = difference * standard_time;
}


[edited by - alnite on May 27, 2004 12:34:51 AM]
quote: Original post by leiavoia
I don't want to sound like my way is the best way, but the way *I* do it is just to let the game loop run as fast as it can and update everything as fast as I can. It doesn't matter if the game is bogged down at 10 FPS or running like the Devil at 500 FPS; objects still all move at the same rate. Instead of saying an object should move X pixels per frame, you make it X pixels per second.

I pass the current absolute time to my objects. Different objects handle that time differently. Some completely ignore it. Some keep track of the last update and calculate the difference. Some only care about the absolute time (if I want something to happen at exactly 10 seconds into the game or something), some need the difference (moving objects), some need the time since they were born (particles), some need the time relative to their whole lifespan (conditional animations), etc.

Now to (not) answer your question, I'm not sure how to get the framerate to be nice while doing other CPU-intensive activities like AI, besides using threads. I haven't started AI functions for my project. I'm having too much fun designing special FX :-) But when the screen gets really bogged down, the framerate slows to a crawl, yet objects still move at the same pace.




Thanks, but I have another question: what is a good way of determining what velocity to use while moving an object? Like this (inside class Player):

xPos += (xVel * LastFramesTime);

yPos += (yVel * LastFramesTime);

What is a good way to figure out what an object's velocity should be if it's moving? Do I just guess and check, or is there something that would help me figure this out? Also, to the guys talking about QueryPerformanceCounter: I'm assuming this is Windows-only? I'm trying to keep my game cross-platform...
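
For what it's worth, one common convention (a sketch only, not necessarily what anyone in this thread uses) is to express velocities in pixels per second and convert the frame time from milliseconds to seconds before multiplying; the actual numbers then come down to tuning, e.g. starting from "cross the screen in a few seconds" and adjusting by feel:

#include <SDL.h>

// Sketch only: velocity in pixels per second, frame time passed in milliseconds.
struct Player
{
    float xPos = 0.0f, yPos = 0.0f;
    float xVel = 200.0f;    // 200 px/s crosses a 640-px screen in about 3 seconds
    float yVel = 0.0f;

    void update(Uint32 lastFrameMs)
    {
        float dt = lastFrameMs / 1000.0f;   // convert milliseconds to seconds
        xPos += xVel * dt;
        yPos += yVel * dt;
    }
};

(And SDL_GetTicks() itself is part of SDL, so it is available on every platform SDL supports, unlike the Windows-only QueryPerformanceCounter.)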

Thanks for your help....

Also, what else should use the time besides my player/enemies/bullets? I'm sure my animations will need the time, but stuff like taking input won't use it at all? Just wondering if there's anything else I should use the time with specifically, besides obvious stuff like movement...

[edited by - graveyard filla on May 28, 2004 1:26:10 AM]

[edited by - graveyard filla on May 28, 2004 1:26:52 AM]
FTA, my 2D futuristic action MMORPG

This topic is closed to new replies.
