
Windows + precision timer


This is not a question but a function I've 'discovered'. To get the time I have always used timeGetTime() or clock() because they work well for my purposes. Today, while debugging a program, I discovered that there is a function

VOID GetSystemTimeAsFileTime(LPFILETIME lpSystemTimeAsFileTime);

that fills a structure with two 32-bit integers (in practice, a 64-bit integer). The LSB of this integer corresponds to 100 nanoseconds! In practice, you divide this value by 10,000 to get milliseconds. (This function is also used by clock().) I don't know how useful this is, but it would be nice to have a timer with 0.0001 ms precision.
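For reference, a minimal sketch of reading that 64-bit value and converting it to milliseconds (the function name GetSystemTimeMillis is just an illustrative choice):

#include <windows.h>

ULONGLONG GetSystemTimeMillis(void)
{
    // FILETIME holds the time in 100 ns units since January 1, 1601.
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);

    // Combine the two 32-bit halves into one 64-bit integer.
    ULARGE_INTEGER t;
    t.LowPart  = ft.dwLowDateTime;
    t.HighPart = ft.dwHighDateTime;

    // 10,000 * 100 ns = 1 ms.
    return t.QuadPart / 10000;
}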

It's not that good. I used that function to implement a microsecond clock for Boost, but it turns out its granularity is much bigger than 100 ns - something like 50 ms IIRC (it looks like it's dependent upon the RTC).

QueryPerformanceCounter is good to about 5 us, and there's the Pentium rdtsc instruction, which gives you raw CPU ticks.
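A minimal sketch of reading the timestamp counter, assuming MSVC's __rdtsc intrinsic from <intrin.h> (older compilers need inline rdtsc assembly instead):

#include <intrin.h>

unsigned __int64 GetCpuTicks(void)
{
    // Returns the raw CPU cycle count; divide by the CPU clock
    // frequency to convert to seconds.
    return __rdtsc();
}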

Thanks!

So to get the time in ms with QueryPerformanceCounter() I should do something like this (right?)


#include <windows.h>

LONG GetMilliseconds(void)
{
    // Current value of the high-resolution counter.
    LARGE_INTEGER ticks;
    QueryPerformanceCounter(&ticks);

    // Counter frequency, in ticks per second.
    LARGE_INTEGER ticks_per_second;
    QueryPerformanceFrequency(&ticks_per_second);

    // Multiply before dividing to keep millisecond precision.
    LONGLONG milliseconds = 1000 * ticks.QuadPart / ticks_per_second.QuadPart;

    return (LONG)milliseconds;
}




Right. But you only need to query the frequency once. Do the call in an initialization function and save off the frequency for use in subsequent calls to GetMilliseconds().
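For example, a minimal sketch of that caching (InitTimer and the global variable name are illustrative):

#include <windows.h>

static LONGLONG g_ticks_per_second = 0;

void InitTimer(void)
{
    // The counter frequency is fixed at boot, so query it once.
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);
    g_ticks_per_second = freq.QuadPart;
}

LONG GetMilliseconds(void)
{
    LARGE_INTEGER ticks;
    QueryPerformanceCounter(&ticks);
    return (LONG)(1000 * ticks.QuadPart / g_ticks_per_second);
}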

clock() is implemented with GetSystemTimeAsFileTime. GSTAFT's tick rate is 10 ms on the x86 HAL / 15 ms on the SMP HAL, and is reported by GetSystemTimeAdjustment.
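A minimal sketch of querying that tick rate (the printf formatting is just illustrative):

#include <windows.h>
#include <stdio.h>

void PrintTickRate(void)
{
    DWORD adjustment, increment;
    BOOL  disabled;

    // increment is the amount added to the system time per clock
    // interrupt, in 100 ns units (e.g. ~100000 = 10 ms).
    if (GetSystemTimeAdjustment(&adjustment, &increment, &disabled))
        printf("tick rate: %.4f ms\n", increment / 10000.0);
}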

Thanks for the info!

I implemented a GetTimeMillis() function that uses clock() when _WINDOWS_ is not defined, and QueryPerformanceCounter() otherwise; in case a performance timer is not supported, I fall back to timeGetTime().
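A minimal sketch of that fallback scheme (using the compiler-defined _WIN32 in place of _WINDOWS_, with illustrative internal names):

#include <time.h>
#ifdef _WIN32
#include <windows.h>
#include <mmsystem.h>   // timeGetTime(); link with winmm.lib
#endif

long GetTimeMillis(void)
{
#ifdef _WIN32
    // Cache the frequency; QueryPerformanceFrequency returns FALSE
    // if no high-resolution counter is available.
    static LARGE_INTEGER freq = { 0 };
    if (freq.QuadPart == 0 && !QueryPerformanceFrequency(&freq))
        freq.QuadPart = -1;

    if (freq.QuadPart > 0)
    {
        LARGE_INTEGER ticks;
        QueryPerformanceCounter(&ticks);
        return (long)(1000 * ticks.QuadPart / freq.QuadPart);
    }
    return (long)timeGetTime();                 // ~1 ms resolution
#else
    return (long)(clock() * 1000 / CLOCKS_PER_SEC);
#endif
}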
