Timer problem

7 comments, last by blizzard999 18 years, 7 months ago
Hi, everybody! :) I have the following question: why does glutGet(GLUT_ELAPSED_TIME), called in the main loop of my game, return the same value for a while and then jump ahead by 16 milliseconds to a new value? My game runs at around 2000 fps. I want to implement frame-rate independent animation, but this problem prevents that, because for stretches of time the function tells my program that no time has passed.
You probably need a precision timer; under Win32 you can get one with QueryPerformanceCounter() (its resolution is well under 1 ms...I don't remember exactly how much, but it's very precise).
I tried to use QueryPerformanceCounter, but it writes its value into some LARGE_INTEGER type which I can't convert to int or long int, and the type doesn't define any arithmetic operators, so I can't subtract one time value from another.
LARGE_INTEGER is actually a union with a couple of members; Microsoft's documentation defines it like this:

typedef union _LARGE_INTEGER {
	struct {
		DWORD LowPart;
		LONG  HighPart;
	};
	struct {
		DWORD LowPart;
		LONG  HighPart;
	} u;
	LONGLONG QuadPart;
} LARGE_INTEGER, *PLARGE_INTEGER;


So if you want to use the value stored in a LARGE_INTEGER, either access the full 64-bit value directly through QuadPart, or use the two 32-bit halves, LowPart and HighPart.
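For example, here is a minimal sketch of taking a time difference with QuadPart (assuming Windows and <windows.h>; ElapsedSeconds is just a made-up name):

#include <windows.h>

// Minimal sketch: QuadPart is a plain 64-bit integer, so you can subtract
// one counter reading from another and divide by the counter frequency.
double ElapsedSeconds()
{
	static LARGE_INTEGER frequency;		// zero-initialized; filled on first call
	static LARGE_INTEGER start;

	if (frequency.QuadPart == 0)		// first call: cache frequency and start time
	{
		QueryPerformanceFrequency(&frequency);
		QueryPerformanceCounter(&start);
	}

	LARGE_INTEGER now;
	QueryPerformanceCounter(&now);
	return (double)(now.QuadPart - start.QuadPart) / (double)frequency.QuadPart;
}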
Something like the following would do very well:
#include <windows.h>
#include <mmsystem.h>	// timeGetTime(); link with winmm.lib
#include <string.h>		// memset()

// Timer state used by the two functions below (the struct the NeHe base code uses)
typedef struct
{
	__int64       frequency;					// timer frequency
	float         resolution;					// timer resolution
	unsigned long mm_timer_start;				// multimedia timer start value
	unsigned long mm_timer_elapsed;				// multimedia timer elapsed time
	bool          performance_timer;			// using the performance timer?
	__int64       performance_timer_start;		// performance timer start value
	__int64       performance_timer_elapsed;	// performance timer elapsed time
} TIMER;

TIMER timer;

// Initiate Timer Function :: nehe.gamedev.net
void TimerInit(void)
{
	memset(&timer, 0, sizeof(timer));
	// If no performance counter is available, fall back to the multimedia timer
	if (!QueryPerformanceFrequency((LARGE_INTEGER *) &timer.frequency))
	{
		timer.performance_timer	= FALSE;
		timer.mm_timer_start	= timeGetTime();
		timer.resolution		= 1.0f/1000.0f;
		timer.frequency			= 1000;
		timer.mm_timer_elapsed	= timer.mm_timer_start;
	}
	else
	{
		QueryPerformanceCounter((LARGE_INTEGER *) &timer.performance_timer_start);
		timer.performance_timer	= TRUE;
		timer.resolution		= (float)(((double)1.0f)/((double)timer.frequency));
		timer.performance_timer_elapsed	= timer.performance_timer_start;
	}
}

// Very Precise Get Time Function (returns milliseconds) :: nehe.gamedev.net
float TimerGetTime()
{
	__int64 time;
	if (timer.performance_timer)
	{
		QueryPerformanceCounter((LARGE_INTEGER *) &time);
		return ((float)(time - timer.performance_timer_start) * timer.resolution) * 1000.0f;
	}
	else
	{
		return ((float)(timeGetTime() - timer.mm_timer_start) * timer.resolution) * 1000.0f;
	}
}
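And to tie it back to frame-rate independent animation, here's a rough sketch of how you might use it (g_position and g_speed are made-up names; call TimerInit() once at startup):

// Rough sketch of a frame-rate independent update using the functions above.
// TimerGetTime() returns milliseconds, so divide by 1000 to get seconds.
float g_lastTime = 0.0f;	// time of the previous frame, in ms
float g_position = 0.0f;	// whatever you are animating
float g_speed    = 5.0f;	// units per second

void UpdateFrame()
{
	float now     = TimerGetTime();
	float deltaMs = now - g_lastTime;	// time elapsed since the last frame
	g_lastTime    = now;

	g_position += g_speed * (deltaMs / 1000.0f);	// advance by elapsed time, not per frame
}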
Thank you, guys. It was very helpful. But still, why is glutGet's timer so coarse? I am just trying to make my game platform independent, and it seems it's not that simple. Eh...
Quote:Original post by Tamior
Thank you, guys. It was very helpful. But still, why is glutGet's timer so coarse? I am just trying to make my game platform independent, and it seems it's not that simple. Eh...


You can simply write your own time wrapper.

#ifdef _WINDOWS_	// or whatever symbol you have defined
	use QueryPerformanceCounter()
#else
	use the glut timer
#endif
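For instance, a rough sketch of such a wrapper returning milliseconds (GetTimeMs is a made-up name; this assumes GLUT is your fallback timer on non-Windows platforms):

#ifdef _WIN32
#include <windows.h>

// Windows: high-resolution counter, converted to milliseconds.
double GetTimeMs()
{
	static LARGE_INTEGER frequency;		// zero-initialized; filled on first call
	if (frequency.QuadPart == 0)
		QueryPerformanceFrequency(&frequency);

	LARGE_INTEGER now;
	QueryPerformanceCounter(&now);
	return 1000.0 * (double)now.QuadPart / (double)frequency.QuadPart;
}
#else
#include <GL/glut.h>

// Everywhere else: fall back to GLUT's millisecond timer.
double GetTimeMs()
{
	return (double)glutGet(GLUT_ELAPSED_TIME);
}
#endif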


QueryPerformanceCounter is unreliable and shouldn't be used in an app that you want other people to use.
Have you looked at SDL? It's platform independent and better maintained than GLUT, which hasn't been updated this century. Its timer seems to be accurate to 1 ms, and if you're getting over 1000 fps you can always average the time over a few frames (which you'll do anyway for an fps counter, as you don't want the numbers jumping around); see the sketch below.
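Something like this is what I mean by averaging (just a sketch; SDL_GetTicks() assumed if you go the SDL route, but any millisecond timer works the same way):

#include <SDL.h>

// Sketch: average the frame time over the last N frames, so a 1 ms timer
// doesn't make the numbers jump around when you're running at 1000+ fps.
const int NUM_SAMPLES = 16;
Uint32 g_samples[NUM_SAMPLES] = { 0 };
int    g_sampleIndex          = 0;
Uint32 g_lastTicks            = 0;

float AverageFrameTimeMs()
{
	Uint32 now = SDL_GetTicks();
	g_samples[g_sampleIndex] = now - g_lastTicks;	// delta for this frame, in ms
	g_lastTicks   = now;
	g_sampleIndex = (g_sampleIndex + 1) % NUM_SAMPLES;

	Uint32 sum = 0;
	for (int i = 0; i < NUM_SAMPLES; ++i)
		sum += g_samples[i];
	return (float)sum / (float)NUM_SAMPLES;	// smoothed frame time; fps = 1000 / this
}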
Quote:Original post by zedzeek
QueryPerformanceCounter is unreliable and shouldn't be used in an app that you want other people to use.


Why?

This topic is closed to new replies.
