# Still don't get timeGetTime()


## Recommended Posts

I still don't know how to use timeGetTime(). What does it do, and how do you use it?

##### Share on other sites
timeGetTime() returns the number of milliseconds elapsed since Windows was last started.

Perhaps you could be more specific regarding what you are confused about?

-fel

##### Share on other sites
You can use it to find the elapsed time.

For example, if you want to measure how long some code takes to execute.
```cpp
DWORD FirstTime = timeGetTime();

// ... some code ...

DWORD ElapsedTime = timeGetTime() - FirstTime;
```

##### Share on other sites
I do not recommend using timeGetTime(). It doesn't actually have millisecond precision: testing shows it is only updated every 10 milliseconds or so, meaning it will return the same value for about 10 milliseconds and then suddenly jump ahead to the new value. That's due to timer resolution issues.

If you need precise timing, like for showing framerate or doing time-based movement/animation, use QueryPerformanceFrequency() and QueryPerformanceCounter() in Windows, or gettimeofday() in Linux.

##### Share on other sites
The precision can be improved by calling timeBeginPeriod( 1 ) at the start of your program and timeEndPeriod( 1 ) at program shutdown; this works well enough for me, even when synchronizing timers over a network. However, I do believe that QueryPerformanceCounter() still produces better results (though nothing particularly notable in my experience).

Hope this helps,
Jackson Allan

##### Share on other sites
What I do is use timeGetTime to work out when more than a second has passed, then I calculate the clock speed of the CPU using RDTSC and a previously read value. During a frame I can use RDTSC and its previous value to calculate how many clock ticks have passed between frames, and use that value and my previously calculated clock speed to figure out the time passed in milliseconds.

This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).

##### Share on other sites
What's wrong with GetTickCount() all of a sudden?

##### Share on other sites
Correct me if I'm wrong (I'm sure someone will :) ), but GetTickCount is only updated at the resolution returned by GetSystemTimeAdjustment, which on my PC (Win2K, Athlon XP 2000+) reports a time increment of 100144; that value is in 100-nanosecond units, so it works out to about 10 milliseconds of resolution.

I should also have mentioned that on my computer the return for timeAdjustDisabled is true, so the resolution isn't great and, to add to the confusion, the system could adjust the time-of-day clock to make sure it's accurate. So it's not just low-resolution; it's also potentially prone to jumping around (much like timeGetTime) if the system needs to fix it up.

##### Share on other sites
Quote:
 Original post by Gooberius
What I do is use timeGetTime to work out when more than a second has passed, then I calculate the clock speed of the CPU using RDTSC and a previously read value. During a frame I can use RDTSC and its previous value to calculate how many clock ticks have passed between frames, and use that value and my previously calculated clock speed to figure out the time passed in milliseconds.

This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).

QueryPerformanceCounter() finds the number of clock cycles elapsed, and QueryPerformanceFrequency() finds the number of ticks per second (2.4GHz for me). Your way is basically an inferior hack due to not knowing about the above functions.

##### Share on other sites
The thing I don't get about it is what it returns. Say there was a variable, e.g.:

```cpp
float time = timeGetTime();
```

What would `time` equal?

##### Share on other sites
It would equal some number between 0 and 2^32 - 1, which as fel said is the number of milliseconds since the last boot.

##### Share on other sites
I think I get it; I got it from the guy who posted the example code. I needed to know it for framerate. Thanks!

##### Share on other sites
So to get the number of seconds, do:
```cpp
float Seconds = float(timeGetTime()) * 0.001f;
```

##### Share on other sites
Quote:
 Original post by CGameProgrammer
QueryPerformanceCounter() finds the number of clock cycles elapsed, and QueryPerformanceFrequency() finds the number of ticks per second (2.4GHz for me). Your way is basically an inferior hack due to not knowing about the above functions.

Not quite an inferior hack (I think ;). What bothers me about QueryPerformanceCounter and QueryPerformanceFrequency is that the documentation for QPF says the value returned won't change while the system is running (see the QPF page on MSDN). Now, I know that some laptops *will* change their clock speed according to battery level and such, so either QPF/QPC are based on a different clock (in which case that's all well and good), or they don't quite support variable-clock-speed systems properly. I was under the impression that QPC was basically just RDTSC behind the scenes and that there was/is no other very high-resolution timer currently in PC systems, hence my confusion.

I haven't looked into this personally because my method works just fine and dandy for me, but if anyone can shed light on this I'd be interested to hear about it.

##### Share on other sites
QueryPerformanceFrequency does indeed use the rdtsc instruction. My hack comment was about you using timeGetTime() instead of QueryPerformanceCounter(), which is far more precise (it's perfectly precise).

##### Share on other sites
Whoops, that post was by me.

##### Share on other sites
I just redid my whole timing system using QueryPerformanceCounter and QueryPerformanceFrequency and ran it alongside the system clock for about half an hour to test the accuracy. I think it lost about a quarter of a second in all that time, which is probably due to my code putting certain things outside the containment of my StartClock() and EndClock() statements. Compared to my old millisecond thing, which would lose between 2 and 7 seconds every minute and whose resolution led to very changeable frame deltas, this is awesome.

Use it now.

##### Share on other sites
Quote:
 Original post by Anonymous Poster
QueryPerformanceFrequency does indeed use the rdtsc instruction. My hack comment was about you using timeGetTime() instead of QueryPerformanceCounter(), which is far more precise (it's perfectly precise).

Ok, QPC is basically RDTSC. QPF returns the clock speed of the CPU. But the MSDN page for QPF states that the value returned won't change while the system is running. I know for a fact that some systems (notably laptops running on battery, but still worth considering for some of us) *do* change their clock speed on the fly, so QPF should return a different value when necessary. My "hack" is basically fixing the broken QPF function. I don't see what the problem is. Admittedly, using timeGetTime for accurate timing is madness, but just using it to see when >1000 milliseconds have passed should be accurate enough most of the time.