Nairou

Current state of cross-platform timing


I've done a lot of Googling on this topic, but most of the discussions I've found have been years old. I'm hoping someone has more up-to-date information regarding what is available for doing high-precision (as far as games go) timing.

I primarily have two questions:


1. Is there a cross-platform timing library that provides sufficient resolution for frame time measurement?

I've been looking at boost::posix_time::ptime, which at first glance appears to have the necessary resolution, but the documentation says the win32 implementation uses ftime and therefore lacks sufficient resolution. Is this actually still true? Using Boost is appealing since I'm already using the library. I'd like to avoid pulling in another all-encompassing library (such as SDL) just for its timing functionality.

The only alternative I can find is to implement timing myself on each platform, using QueryPerformanceCounter on win32. Which leads to my other question...


2. Does QueryPerformanceCounter still have problems leaping forward in time on modern versions of Windows? Everything I've read on this seems fairly old, and I can't tell if this issue is only on older versions of Windows (95/98/2000) or if it affects newer versions as well. Having to monitor and try to compensate for the leaping sounds like it could be a pain.

Quote:
Original post by Nairou
2. Does QueryPerformanceCounter still have problems leaping forward in time on modern versions of Windows? Everything I've read on this seems fairly old, and I can't tell if this issue is only on older versions of Windows (95/98/2000) or if it affects newer versions as well. Having to monitor and try to compensate for the leaping sounds like it could be a pain.


Yes, I can testify to seeing weird things on recent laptops when using QPC for frame timing. Personally, I currently reserve QPC for code profiling only, and measure frame time with timeGetTime(). If the default accuracy (10 ms, I think) is not enough, you can use timeBeginPeriod() to set your own custom period.

Quote:
Original post by AgentC
Yes, I can testify to seeing weird things on recent laptops when using QPC for frame timing. Personally, I currently reserve QPC for code profiling only, and measure frame time with timeGetTime(). If the default accuracy (10 ms, I think) is not enough, you can use timeBeginPeriod() to set your own custom period.

I've heard about timeGetTime() being more stable, but I was concerned about its accuracy with regard to measuring frame time for games. 10 ms is definitely too coarse, but even at 1 ms (assuming you can reliably get it that low?) I wonder if you might eventually run into a problem with fast computers cycling the game loop in only a handful of milliseconds, leaving little resolution left over to determine how much time actually passed. Do you not run into accuracy problems?

Most games run at 30 fps with a fixed time step, so it's not really a big deal.

If you need more, use the highest resolution available, then test across a range of hardware. Or use an existing, real-world-tested engine; they have either solved this problem, or it isn't a problem at all.

The other common alternative is 60 fps, i.e. 16.6 ms per frame.

That makes sense. So 1 ms accuracy should be fine, so long as the frame rate is limited (e.g. by vsync).

I'm concerned about the effect that timeBeginPeriod/timeEndPeriod would have on the system when you crank up the resolution to maximum. From the MSDN article:
Quote:
This function affects a global Windows setting. Windows uses the lowest value (that is, highest resolution) requested by any process. Setting a higher resolution can improve the accuracy of time-out intervals in wait functions. However, it can also reduce overall system performance, because the thread scheduler switches tasks more often.

Since a game will want the resolution turned up all the time, for the duration of the game, what effect would this have on the rest of the system?

It also mentions that QPC isn't affected by this. Bah, I wish QPC were more reliable...

Quote:
Original post by Nairou
I wonder if you might eventually run into a problem with fast computers cycling the game loop in only a handful of ms, and not having much resolution left over to determine how much time actually passed. Do you not run into accuracy problems?


Yeah, I can easily get over 1000 fps with a simple enough scene, in which case the timer returns zero time passed (I use a 5 ms period, which lets me measure up to 200 fps). In that case, the logic state stays the same, and several frames with the same graphics get drawn until the timer actually returns non-zero. I don't think that's a problem.

Using a low period value can cause system-wide extra overhead due to the interrupt triggering more often. On modern (> 1GHz) machines it probably doesn't have much effect, though, and personally I don't worry about it.

Commercial engines use QPC too. I've played several games on my old hardware where the bug caused animations to jump backward and forward, with the effect getting worse the longer the computer was switched on (due to drift between the counter values on each CPU core).

I'm pretty sure there are patches around now, so you can do your users a favour and install the fix when they install the game. I think you have to be running both Windows XP and a dual core CPU (not quad), with a specific mainboard chipset. Otherwise the problem is fixed already.

In any case, your game should be able to handle negative timesteps without crashing or interpreting them as unsigned (which would cause an enormous jump forward).

timeGetTime() has a default resolution of 1 ms on all versions of Windows since 95, excluding NT/2000.

timeGetTime() is perfectly fine; if you've ever played Quake II or Quake III, they both use timeGetTime(), and did you notice any system-wide problems?

Just make sure that you use timeBeginPeriod() at the start of your game (in other words, don't rely on the default period being the same in future versions of Windows) and timeEndPeriod() when done. Otherwise the only real problem you'll encounter is integer wraparound after about 49.7 days (2^32 ms).
