
Daaark

Timing under Windows


I've been looking around for articles on timing under Windows, and I've found lots of good info on setting timers up using the QueryXXX API functions, but they are all dependent on the hardware having a high-resolution timer. What am I supposed to do if the hardware doesn't support the high-resolution timer? None of the articles I've seen give any info on what to do when these functions fail. Am I supposed to fall back to using GetTickCount()? (bleah!) Or is this not even an issue anymore, and should all machines support the timer?
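
For reference, here is a minimal sketch of that GetTickCount() fallback, assuming hypothetical InitTimer()/GetTimeSeconds() helpers (the names are made up for illustration, not from any article):

#include <windows.h>

// Sketch only: prefer the high-resolution counter, fall back to
// GetTickCount() when the hardware doesn't support one.
static LARGE_INTEGER g_freq = {0};
static bool g_hasHighResTimer = false;

void InitTimer()
{
    // QueryPerformanceFrequency returns zero when no
    // high-resolution counter is available.
    g_hasHighResTimer = (QueryPerformanceFrequency(&g_freq) != 0);
}

double GetTimeSeconds()
{
    if (g_hasHighResTimer)
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (double)now.QuadPart / (double)g_freq.QuadPart;
    }
    // Fallback: millisecond granularity at best, wraps after ~49.7 days.
    return GetTickCount() / 1000.0;
}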

I've only used GetTickCount and the QueryPerformance... functions.

There are also GetSystemTimeAsFileTime, GetThreadTimes, GetProcessTimes, and timeGetTime, if you're interested in digging them out of the docs.

Hardware that causes QueryPerformance... to fail is pretty dang rare; I doubt it will even be an issue for you.

You may want to check out the May 2003 issue of Windows Developer magazine; it's got a long, good article on this exact subject.

The same story as with every Win32 function, lol.

Quoting MSDN:

Windows NT/2000: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine.

Windows 95: The default precision of the timeGetTime function is 1 millisecond.


I guess I will use the Query functions and just abort if they fail.


If you need to use timeGetTime() and want the best precision, I recommend doing the following before using it:


  
#include <windows.h>
#include <mmsystem.h>   // timeGetDevCaps/timeBeginPeriod; link with winmm.lib

TIMECAPS caps;
bool yep = false;

// Ask the timer driver for the resolutions it supports,
// then request the finest one it offers.
if (timeGetDevCaps(&caps, sizeof(caps)) == TIMERR_NOERROR)
{
    yep = true;
    timeBeginPeriod(caps.wPeriodMin);
}

// ...and when you're done with the timer

if (yep)
    timeEndPeriod(caps.wPeriodMin);
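
(Note that timeBeginPeriod raises the timer resolution system-wide, so every call must be paired with a matching timeEndPeriod at the same value, as in the snippet above.)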

Ok, I've put together a class that can get the frequency, poll the timer, and return the elapsed time since the last poll. How do I calculate FPS with this?

The OpenGL FAQ pages say you do fps = 1.0 / ElapsedTime, but this returns numbers like 0.00007x in 640x480 and 0.00005x in 1024x768. Not what I wanted.

How can I tell when a second has passed using the value returned by the QueryPerformanceCounter function? I know this is a stupid question, but I have never had to do this before. I've always had a library where I could make functions run once a second or whenever needed.

Look at QueryPerformanceFrequency. That gives the number of ticks per second that the timer counts, which is why dividing 1.0 by a raw elapsed-tick count gave you those tiny numbers. So take the difference between two QueryPerformanceCounter values and divide it by the value you got from QueryPerformanceFrequency. That will tell you the number of seconds (as a fractional amount) that have passed since the last query. Divide 1.0 by that value and you have your fps.
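
As a sketch of that arithmetic (the variable names are illustrative, not from anyone's actual class):

#include <windows.h>

LARGE_INTEGER freq, prev, now;
QueryPerformanceFrequency(&freq);  // ticks per second
QueryPerformanceCounter(&prev);

// ... render one frame ...

QueryPerformanceCounter(&now);
// elapsed ticks / ticks-per-second = fractional seconds
double elapsedSeconds = (double)(now.QuadPart - prev.QuadPart) / (double)freq.QuadPart;
double fps = 1.0 / elapsedSeconds;  // frames per second for this frame
prev = now;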

Guest Anonymous Poster
Games are multimedia applications. According to MS, timeGetTime gives accurate enough measurements for multimedia applications. I never experienced any trouble while using it.

A lot of errors in updates are not due to inaccurate timing; they come from bad animation paths. Measuring time at a higher resolution does not make the result more accurate when compared against the actual time that passed. It's still totally impossible to run an accurate real-time system on a Windows platform.

/Fredrik

Ah,

fps = 1.0 / ((FLOAT)ctimer.ElapsedTime / (FLOAT)ctimer.ClockFrequency);

Thanks, that works great. I'm getting ~92.xxx on my amateur unoptimized crap!
