Still don't get timeGetTime()
timeGetTime() returns the number of milliseconds elapsed since Windows was last started.
Perhaps you could be more specific regarding what you are confused about?
-fel
You can use it to find elapsed time; for example, to measure how long some code takes to execute:
DWORD FirstTime = timeGetTime();
// ... some code ...
DWORD ElapsedTime = timeGetTime() - FirstTime;
I do not recommend using timeGetTime(). It does not actually have millisecond precision: in my testing it appears to be updated only every 10 milliseconds or so, so it returns the same value for about 10 milliseconds and then suddenly jumps ahead to the new value. That's due to timer resolution issues.
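You can see the jumps with a quick test loop like this (my illustration, not from the thread; link against winmm.lib):

#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main()
{
    DWORD start = timeGetTime();
    while (timeGetTime() - start < 50)   // busy-wait for roughly 50 ms
        printf("%lu\n", timeGetTime());  // the same value repeats, then jumps
    return 0;
}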
If you need precise timing, like for showing framerate or doing time-based movement/animation, use QueryPerformanceFrequency() and QueryPerformanceCounter() in Windows, or gettimeofday() in Linux.
If you want more information on those, let me know.
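For reference, here is a minimal sketch of the Windows pair (my example, with error checking omitted; Sleep() merely stands in for the code being measured):

#include <windows.h>
#include <stdio.h>

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);
    Sleep(100);                         // stand-in for the code being measured
    QueryPerformanceCounter(&stop);
    double ms = (stop.QuadPart - start.QuadPart) * 1000.0 / (double)freq.QuadPart;
    printf("elapsed: %f ms\n", ms);
    return 0;
}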
The precision can be improved by calling timeBeginPeriod( 1 ) at the start of your program and timeEndPeriod( 1 ) at program shutdown; this works well enough for me even when synchronizing timers over a network. However, I do believe that QueryPerformanceCounter() still produces better results (though nothing particularly notable in my experience).
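A minimal sketch of that pattern (the 1 ms request is per the post; the API requires every timeBeginPeriod to be paired with a matching timeEndPeriod):

#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main()
{
    timeBeginPeriod(1);                 // request 1 ms timer resolution
    DWORD start = timeGetTime();
    Sleep(5);                           // with 1 ms resolution this is close to 5 ms
    printf("elapsed: %lu ms\n", timeGetTime() - start);
    timeEndPeriod(1);                   // restore the previous resolution
    return 0;
}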
Hope this helps,
Jackson Allan
What I do is use timeGetTime to work out when more than a second has passed, then I calculate the clock speed of the CPU using RDTSC and a previously read value. During a frame I can use RDTSC and its previous value to calculate how many clock ticks have passed between frames, and use that value and my previously calculated clock speed to figure out the time passed in milliseconds.
This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).
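A rough sketch of that calibration scheme, using MSVC's __rdtsc() intrinsic (all names here are mine and this shows only the core idea, not Gooberius's actual code):

#include <windows.h>
#include <intrin.h>
#pragma comment(lib, "winmm.lib")

static unsigned __int64 g_calibTsc;    // TSC value at last calibration
static DWORD            g_calibMs;     // timeGetTime() at last calibration
static double           g_ticksPerMs;  // estimated CPU ticks per millisecond

void InitTimer()
{
    g_calibMs  = timeGetTime();
    g_calibTsc = __rdtsc();
}

// Call once per frame; recalibrates the clock speed roughly once per second,
// so it also tracks laptops that vary their clock speed.
void Recalibrate()
{
    DWORD nowMs = timeGetTime();
    if (nowMs - g_calibMs >= 1000)
    {
        unsigned __int64 nowTsc = __rdtsc();
        g_ticksPerMs = (double)(nowTsc - g_calibTsc) / (double)(nowMs - g_calibMs);
        g_calibTsc = nowTsc;
        g_calibMs  = nowMs;
    }
}

// Milliseconds since the previous call, computed from the TSC delta.
double FrameDeltaMs(unsigned __int64 &prevTsc)
{
    unsigned __int64 nowTsc = __rdtsc();
    double ms = (g_ticksPerMs > 0.0) ? (double)(nowTsc - prevTsc) / g_ticksPerMs : 0.0;
    prevTsc = nowTsc;
    return ms;
}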
Correct me if I'm wrong (I'm sure someone will :) ), but GetTickCount is only updated at the resolution returned by GetSystemTimeAdjustment, which on my PC (Win2K, Athlon XP 2000+) reports an update interval of 100144 units of 100 nanoseconds, which works out to about 10 milliseconds of resolution.
[edit]
I should also have mentioned that on my computer the value returned for timeAdjustDisabled is true, so the resolution isn't great and, to add to the confusion, the system can adjust the time-of-day clock to keep it accurate. So it's not just low resolution; it's also potentially prone to jumping around (much like timeGetTime) if the system needs to fix it up.
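To check the numbers on your own machine, something like this works (my sketch; note that the interval comes back in 100-nanosecond units):

#include <windows.h>
#include <stdio.h>

int main()
{
    DWORD adjustment, interval;
    BOOL  adjustDisabled;
    if (GetSystemTimeAdjustment(&adjustment, &interval, &adjustDisabled))
    {
        // the interval is reported in 100-nanosecond units
        printf("clock interval: %lu x 100 ns (%.4f ms), adjustment disabled: %d\n",
               interval, interval / 10000.0, adjustDisabled);
    }
    return 0;
}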
Quote: Original post by Gooberius
What I do is use timeGetTime to work out when more than a second has passed, then I calculate the clock speed of the CPU using RDTSC and a previously read value. During a frame I can use RDTSC and its previous value to calculate how many clock ticks have passed between frames, and use that value and my previously calculated clock speed to figure out the time passed in milliseconds.
This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).
QueryPerformanceCounter() returns the elapsed count of a high-resolution counter, and QueryPerformanceFrequency() returns how many of those counts occur per second (2.4GHz for me, i.e. the CPU clock). Your way is basically an inferior hack born of not knowing about those two functions.
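So instead of RDTSC plus calibration, a per-frame delta can come straight from those two calls; a minimal sketch (the helper name is mine):

#include <windows.h>

// Returns seconds since the previous call; illustrative helper, not a real API.
double FrameDeltaSeconds()
{
    static LARGE_INTEGER freq = { 0 };
    static LARGE_INTEGER prev = { 0 };
    if (freq.QuadPart == 0)             // first call: cache frequency and baseline
    {
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&prev);
        return 0.0;
    }
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    double dt = (double)(now.QuadPart - prev.QuadPart) / (double)freq.QuadPart;
    prev = now;
    return dt;
}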