# Timer Functions


## Recommended Posts

Hello everybody. I am currently working on a 2D side-scrolling game using C++ and OpenGL. I would like to know what I can use to get the time in milliseconds. Whenever I make a call to the timer function, it should always start from zero and then progress. Please advise on this - I always want my timer to start from 0 and then increment.

##### Share on other sites
On Windows you can call GetTickCount() to get the time in milliseconds. Typically at the start of the application you cache the current value of GetTickCount(), so that for each successive frame you have a last point in time to refer to. Then you do your logic, update the last point in time to the current one, rinse and repeat.

If you want a higher-resolution timer, you can use the QueryPerformanceCounter and QueryPerformanceFrequency functions.

##### Share on other sites
Please do note that GetTickCount() will have a resolution of around 15 ms (or a different value on older Windows versions). If that is enough for you, then by all means go with it, as it is the fastest and easiest way to get hold of "time".

The resolution issue applies in principle to timeGetTime() as well, but you can use timeBeginPeriod() to give it true millisecond resolution (and as far as I can tell, more or less perfect accuracy). timeGetTime() is slightly slower than GetTickCount(), but not painfully slow. If you need 1 ms resolution, use this.

QueryPerformanceCounter() is best avoided unless you only target Windows Vista or Windows 7 on the newest line of processors, since it will use RDTSC in all other cases. Under Vista/7, with recent processors, admittedly, QueryPerformanceCounter() is OK, as it uses reliable hardware timers. But you usually don't know what OS/hardware your program will run on.
RDTSC does not count time, but cycles, which is unreliable, and possibly non-monotonic. On multi-core machines, you may get significantly different results depending on what core your code runs on, so you may appear to jump back in time. On speed-stepped machines (all laptops and the vast majority of desktop machines during the last couple of years), there is no real relation between the counter and time, either. A CPU might run at 2.2 GHz this second, and 0.55 GHz the next second.

EDIT: interesting article on the subject

##### Share on other sites
What OS/platform are you developing on?

##### Share on other sites
Quote:
Original post by anupgupta: a call to the timer function it should always start sequentially i.e from zero and then progress...

I thought it'd be fun to try the 'clock' function:

```cpp
#include <ctime>
#include <iostream>

//
// Returns the number of milliseconds elapsed since this function was last called
//
unsigned int GetTimeElapsed(bool reset)
{
	static clock_t START_CLOCK = clock();
	clock_t stop_clock = clock();
	unsigned int time_elapsed = ((stop_clock - START_CLOCK) * 1000) / CLOCKS_PER_SEC;
	if(reset)
	{
		START_CLOCK = stop_clock;
	}
	return time_elapsed;
}

//
// Example usage
//
int main()
{
	// Call the function once to get it started - should return 0
	unsigned int ms = GetTimeElapsed(true);
	// ...
	// Get the amount of time elapsed
	ms = GetTimeElapsed(true);
	std::cout << "Time elapsed: " << ms << " ms\n";
	return 0;
}
```

EDIT: Fixed

##### Share on other sites
DXUT (the DirectX application framework) includes a timer class which, AFAIK, you can just rip out wholesale and use for yourself if you're interested.

##### Share on other sites
Also consider timeGetTime() or QueryPerformanceCounter().

timeGetTime returns the system time, in milliseconds: `DWORD timeGetTime(void);`. On Windows NT/2000 the default precision of the timeGetTime function can be five milliseconds or more, depending on the machine.

// "i always want my timer to start from 0 and then increment"

DWORD tick = 0;
DWORD timeZeroOffset = timeGetTime();
// ...
DWORD timeN = timeGetTime();

timeN -= timeZeroOffset;

tick++;

##### Share on other sites
I did an experiment a long time ago trying to find out which was the best timer. The winner by FAR was the timing function implemented in OpenMP, omp_get_wtime(). Highest resolution (around 50 ns!) and consistently accurate. Furthermore, it worked across multiple cores inherently, because it's a part of OpenMP.
