dustydoodoo

Still don't get timeGetTime()


Recommended Posts

timeGetTime() returns the system time, in milliseconds, since Windows was last started.

Perhaps you could be more specific regarding what you are confused about?

-fel

You can use it to find the elapsed time.

For example, if you want to measure how long some code takes to execute.

#include <windows.h>  //timeGetTime() lives here; link with winmm.lib

DWORD FirstTime = timeGetTime();

//some code you want to time

DWORD ElapsedTime = timeGetTime() - FirstTime;  //elapsed milliseconds




I do not recommend using timeGetTime() for fine-grained timing. It returns a value in milliseconds, but by default it doesn't actually have millisecond resolution: testing shows it is typically updated only every 10 milliseconds or so, meaning it'll return the same value for about 10 milliseconds and then suddenly jump ahead to the new value. That's a timer resolution issue.

If you need precise timing, like for showing framerate or doing time-based movement/animation, use QueryPerformanceFrequency() and QueryPerformanceCounter() in Windows, or gettimeofday() in Linux.

If you want more information on those, let me know.

The precision can be improved by calling timeBeginPeriod( 1 ) at the start of your program and timeEndPeriod( 1 ) at program shutdown – that works well enough for me even when synchronizing timers over a network. However, I do believe that QueryPerformanceCounter() still produces better results (though nothing particularly notable in my experience).

Hope this helps,
Jackson Allan
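For reference, here's how those calls drop in around a timed section (a sketch only – the helper name measure_sleep_ms() is mine, and a stub fallback is included since timeBeginPeriod() is Windows-only and needs winmm.lib):

```c
/* Sketch of the resolution fix described above: request 1 ms timer
   resolution while timing a short sleep. */
#ifdef _WIN32
#include <windows.h>   /* timeBeginPeriod/timeEndPeriod; link winmm.lib */

unsigned long measure_sleep_ms(void)
{
    timeBeginPeriod(1);                   /* request 1 ms resolution  */
    DWORD t0 = timeGetTime();
    Sleep(20);
    DWORD elapsed = timeGetTime() - t0;   /* now accurate to ~1 ms    */
    timeEndPeriod(1);                     /* always undo the request  */
    return (unsigned long)elapsed;
}
#else
/* Hypothetical stand-in so the sketch compiles off Windows. */
unsigned long measure_sleep_ms(void) { return 20; }
#endif
```

Note that every timeBeginPeriod() call must be matched by a timeEndPeriod() with the same argument, since the request affects the whole system.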

What I do is use timeGetTime to work out when more than a second has passed, then calculate the CPU's clock speed from RDTSC and a previously read value. During a frame I use RDTSC and its previous value to work out how many clock ticks have passed between frames, then combine that with the calculated clock speed to get the time passed in milliseconds.

This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).
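A rough sketch of that calibration scheme (my own code, not the poster's – it assumes an x86 compiler with the __rdtsc intrinsic, and uses the standard time() as the coarse clock instead of timeGetTime() so it stays portable):

```c
/* Calibrate the TSC once against a coarse clock, then time frames
   with cheap TSC deltas, as the post above describes. */
#include <stdint.h>
#include <time.h>

#ifdef _MSC_VER
#include <intrin.h>        /* __rdtsc on MSVC      */
#else
#include <x86intrin.h>     /* __rdtsc on GCC/Clang */
#endif

/* Spin until the wall-clock second changes, then count TSC ticks over
   one full second; returns ticks per millisecond. */
double calibrate_tsc_ticks_per_ms(void)
{
    time_t t = time(NULL);
    while (time(NULL) == t) { }        /* sync to a second boundary */
    uint64_t start = __rdtsc();
    t = time(NULL);
    while (time(NULL) == t) { }        /* spin for one full second  */
    return (double)(__rdtsc() - start) / 1000.0;
}

/* Usage per frame:
     uint64_t frame_start = __rdtsc();
     ... render ...
     double ms = (__rdtsc() - frame_start) / ticks_per_ms;
   On laptops the clock speed can change, so recalibrate periodically,
   as the post above suggests. */
```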

Correct me if I'm wrong (I'm sure someone will :) ), but GetTickCount is only updated at the resolution returned by GetSystemTimeAdjustment, which on my PC (Win2K, Athlon XP 2000+) reports 100144 units of 100 nanoseconds between updates – about 10 milliseconds of resolution.

[edit]

I should also have mentioned that on my computer the value returned for timeAdjustDisabled is TRUE, so the resolution isn't great and, to add to the confusion, the system could step the time-of-day clock to keep it accurate. So it's not just low resolution; it's also potentially prone to jumping around (much like timeGetTime) if the system needs to fix it up.
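You can check this on your own machine with something like the following (a sketch – GetSystemTimeAdjustment is Windows-only, so a hypothetical stub mirroring the value reported above is included for other platforms):

```c
/* The increment is reported in units of 100 nanoseconds, which is why
   ~100144 units works out to roughly 10 ms. */
#ifdef _WIN32
#include <windows.h>

double clock_increment_ms(void)
{
    DWORD adjustment = 0, increment = 0;
    BOOL adjustmentDisabled = FALSE;
    if (!GetSystemTimeAdjustment(&adjustment, &increment,
                                 &adjustmentDisabled))
        return -1.0;                   /* call failed */
    return increment / 10000.0;        /* 100 ns units -> milliseconds */
}
#else
/* Hypothetical stand-in matching the figure quoted in the post. */
double clock_increment_ms(void) { return 10.0144; }
#endif
```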

Quote:
Original post by Gooberius
What I do is use timeGetTime to work out when more than a second has passed, then I calculate the clock speed of the CPU using RDTSC and a previously read value. During a frame I can use RDTSC and its previous value to calculate how many clock ticks have passed between frames, and use that value and my previously calculated clock speed to figure out the time passed in milliseconds.

This doesn't require millisecond accuracy from timeGetTime, but it does give you a nice accurate timer and it allows your app to respond reasonably quickly should it be running on a laptop with a variable clock speed (some systems turn the clock speed down to save battery life).

QueryPerformanceCounter() returns the current value of a high-resolution tick counter, and QueryPerformanceFrequency() returns its number of ticks per second (2.4GHz for me). Your way is basically an inferior hack that reimplements what those functions already provide.

The thing I don't get about it is: what exactly does it return? Say there was a variable, e.g.:


float time = timeGetTime();

what would time equal?
