frame rate independent movement

Started by
2 comments, last by scarypajamas 15 years, 4 months ago
I'm having some trouble making a frame rate independent timer. The problem is that no matter how I configure the code, the timer doesn't return the same result from OS to OS. I've created a start() function (below) to get the current time at the top of the loop. Then I take the object I'm moving and multiply its velocity by the result of get_ticks. Except it isn't working. I was originally dividing the result of SDL_GetTicks - startTicks by 1000 (to convert the milliseconds to seconds), except when I tried it on my Mac it ran too slow. I then adjusted the formula to divide by 3000, but then it was too fast on my PC! Is there any way to figure out what to divide by? 1000 isn't working and I can't think of what I'm doing wrong.

int startTicks;

void CTimer::start()
{
    //current clock time   
    startTicks = SDL_GetTicks();
}

float CTimer::get_ticks()
{
	//I've had to adjust this equation from machine to machine to get it to run the same
	#ifdef WIN32
		return (SDL_GetTicks() - startTicks) / 3000.f;
	#else
		return (SDL_GetTicks() - startTicks) / 10000.f;
	#endif
}


int main() //what my loop looks like
{
	CTimer Timer;

	while(1) //main loop
	{
		Timer.start();

		DRAWSTUFF();
	}
}

Well, the docs say SDL_GetTicks returns the number of milliseconds since library initialization (and also that the value wraps if run for more than ~50 days, but I guess that isn't part of your debug process xD). So if it isn't doing this for you, you should tell the developers, and then either wait for them to respond or do the (probably) fairly trivial task of writing your own high-frequency timer (using QueryPerformanceCounter on Win32 and whatever the equivalent is for other platforms you want to support).
Bear in mind it also returns Uint32 NOT int (but this isn't your problem I guess).
SDL_GetTicks() works just fine. The problem is you're resetting your timer each frame. This means your timer value will always be (roughly) zero.
Thank you for your suggestions. I've already re-written my timer code using both Mac- and PC-specific timers.

I'm posting my timer class below to save anyone else the trouble of getting the same headache I did.

#ifdef __APPLE__
	#include <CoreServices/CoreServices.h>
#endif

//The timer
class CTimer
{
	private:
		//The clock time when the timer started
		#ifdef WIN32
			float time, frameTime, lastTime;
			__int64 timerStart, ticksPerSecond;
		#elif defined(__APPLE__)
			UnsignedWide startTicks, currentTicks;
			double frameTime;
		#endif

	public:
		CTimer();

		void start();
		void update();
		float get_ticks();   //gets the timer's time
};


#ifdef WIN32
	#include "mmsystem.h"
#elif defined(__APPLE__)
	#include <CoreServices/CoreServices.h>

	double ConvertMicrosecondsToDouble(UnsignedWidePtr microsecondsValue)
	{
		double twoPower32 = 4294967296.0;
		double upperHalf = (double)microsecondsValue->hi;
		double lowerHalf = (double)microsecondsValue->lo;
		return (upperHalf * twoPower32) + lowerHalf;
	}
#endif

//frame rate independent movement
CTimer::CTimer()
{
}

void CTimer::start()
{
	//get the time
	#ifdef WIN32
		QueryPerformanceFrequency( (LARGE_INTEGER *)&ticksPerSecond );
		QueryPerformanceCounter( (LARGE_INTEGER *)&timerStart );
		lastTime = 0.0f; //elapsed time is measured relative to timerStart
	#elif defined(__APPLE__)
		Microseconds(&startTicks);
	#endif
}

void CTimer::update()
{
	#ifdef WIN32
		__int64 now;
		QueryPerformanceCounter( (LARGE_INTEGER *)&now );
		time = float(double(now - timerStart) / double(ticksPerSecond));
		frameTime = time - lastTime;
		lastTime = time;
	#elif defined(__APPLE__)
		Microseconds(&currentTicks);
		frameTime = ConvertMicrosecondsToDouble(&currentTicks) - ConvertMicrosecondsToDouble(&startTicks);
		frameTime *= 0.000001; //microseconds -> seconds
		Microseconds(&startTicks);
	#endif
}

float CTimer::get_ticks()
{
	#ifdef WIN32
		return frameTime;
	#elif defined(__APPLE__)
		return (float)frameTime;
	#endif
}

This topic is closed to new replies.
