What's the difference between the two Timers?

Started by pluto_knight
1 comment, last by pluto_knight 21 years, 1 month ago
I've found two timers on MSDN. After reading their remarks, I wonder about the real meaning of "The timeGetTime function has less overhead than timeGetSystemTime". What's more, I want to ask: how many values does timeGetTime return? If one, why does it say "You should always use the difference between two timeGetTime return values in computations"? Thanks a lot in advance.
DWORD Start = GetTickCount();

//Do work

DWORD MilliSecsUsed = GetTickCount() - Start;

If you need more accurate timers, do a search for "QueryPerformanceCounter". I think you'll find some examples either here or at www.flipcode.com on high-precision timers.
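
In case it helps, here is a minimal sketch of the QueryPerformanceCounter approach (my own addition, not from the thread); the counter frequency is hardware-dependent, so you divide the tick difference by QueryPerformanceFrequency to get seconds:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);   // ticks per second on this machine

    QueryPerformanceCounter(&start);
    Sleep(100);                          // stand-in for the work being timed
    QueryPerformanceCounter(&end);

    double seconds = (double)(end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    printf("elapsed: %f seconds\n", seconds);
    return 0;
}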
quote:Original post by pluto_knight
I've found two timers on MSDN. After reading their remarks, I wonder about
the real meaning of "The timeGetTime function has less overhead than timeGetSystemTime".

It takes less time to execute timeGetTime than timeGetSystemTime. (Are you sure that isn't just GetSystemTime? It sounds like a Win32 API.) This is probably because of the level of precision, dictated by the hardware that is accessed to obtain the current timestamp.
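
For what it's worth, here is a rough sketch of the two calls side by side (mine, assuming the winmm multimedia timer API); timeGetSystemTime has to fill in an MMTIME structure, whereas timeGetTime just returns the millisecond count:

#include <windows.h>
#include <mmsystem.h>   // both functions live here; link with winmm.lib

void compare_calls(void)
{
    DWORD ms_direct = timeGetTime();           // milliseconds since Windows started

    MMTIME mmt;
    mmt.wType = TIME_MS;                       // request milliseconds
    timeGetSystemTime(&mmt, sizeof(MMTIME));
    DWORD ms_struct = mmt.u.ms;                // same information, via the struct

    // Both report elapsed milliseconds; the extra struct handling is part of
    // why MSDN describes timeGetTime as having less overhead.
    (void)ms_direct;
    (void)ms_struct;
}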

quote:What's more, I want to ask: how many values does timeGetTime return? If one, why does it say "You should always use the difference between two timeGetTime return values in computations"?

timeGetTime only returns one value, so to use the difference between two return values you need to call the function twice - once at the start of the interval you wish to measure and once at the end:
DWORD t0 = timeGetTime();   // takes no parameters; returns elapsed milliseconds as a DWORD
// ... interval to measure ...
DWORD t1 = timeGetTime();
DWORD t_diff = t1 - t0;     // use t_diff
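
To make that concrete, here is a small self-contained sketch (my addition, not from the thread) that measures a piece of work with timeGetTime; winmm.lib must be linked:

#include <windows.h>
#include <mmsystem.h>   // timeGetTime is part of the multimedia timer API
#include <stdio.h>

int main(void)
{
    DWORD t0 = timeGetTime();        // first sample, in milliseconds

    Sleep(250);                      // stand-in for the work being measured

    DWORD t1 = timeGetTime();        // second sample
    printf("elapsed: %lu ms\n", (unsigned long)(t1 - t0));
    return 0;
}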

This topic is closed to new replies.
