Frank Force

Expert question: How to get a better time delta?


Using a "fixed" timestep is not about expecting your timer to fire ever N milliseconds on the nose. Even when running fixed-step simulation you need to measure the actual wall clock time that elapses and use that to advance your simulation. Otherwise you will get clock drift and indeterminacy over time.

 

So yeah, you're basically right. I'm not sure what the question is, though, unless you're just looking to make sure you understand the situation correctly (which it seems you do).
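
For reference, the shape of the loop I mean looks something like this (a minimal sketch with illustrative names, not from any particular engine):

#include <chrono>

// Stand-ins for whatever the engine actually provides.
void UpdateSimulation(double dtSeconds);
void Render();
bool StillRunning();

void RunGameLoop()
{
    using Clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;  // fixed simulation step, in seconds
    double accumulator = 0.0;
    auto previous = Clock::now();

    while (StillRunning())
    {
        const auto now = Clock::now();
        // Measure the wall-clock time that actually elapsed...
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // ...and consume it in fixed-size steps, so the simulation
        // neither drifts from real time nor depends on the frame rate.
        while (accumulator >= dt)
        {
            UpdateSimulation(dt);
            accumulator -= dt;
        }

        Render();
    }
}

The accumulator is what keeps the simulation locked to real time even though the individual steps are fixed.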


How do you measure the actual wall clock time that elapses? I think that's what I'm already doing, and it's exactly what I found to be wrong. So there are two different deltas here: the wall clock delta and the fixed time step delta. The problem I'm talking about is that the measured wall clock delta is never an exact multiple of the vsync interval, even though it should be in order for the rendered frames to be interpolated properly. Can you see how that would be an issue?


I don't see why you need some theoretical magic time value for anything to work correctly. Run your simulation as close to a fixed rate as you can, and you're done.

 

You will never get 100% precisely perfect timings on a consumer OS, they simply aren't designed to give you that guarantee. (Look at process timeslice quantization rules for example.) Design your code accordingly.


You say it's not possible to get perfect timing, but you also say I don't really need that for things to work correctly. I agree that a game engine will still work mostly correctly without the stuff I'm talking about; every game engine I've ever seen has certainly worked fine without it. But I don't understand why people are willing to settle for mostly correct. If we could get perfect timing, that would be better, right? I mean, it's not theoretical or magical: monitors display images at a very specific vsync interval.

 

The whole point of what I'm trying to talk about here is that you can get perfect timing that is 100% precise, and this is how you do it. The only sacrifice is a small amount of unavoidable latency, which is a small price to pay for perfect timing and smoothness of motion. With triple buffering it doesn't really matter what the OS does or how much fluctuation there is between deltas. What about my plan makes you think it won't yield perfect timing?
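
To make it concrete, the core of what I'm doing is something like this (a simplified sketch of the delta-buffering idea, not my exact engine code, and it assumes a 60 Hz display):

#include <cmath>

// Snap each measured delta to the nearest whole number of vsync
// intervals and carry the leftover error forward to later frames.
double GetSmoothedDelta(double measuredDelta)
{
    const double vsyncInterval = 1.0 / 60.0; // assumes a 60 Hz display
    static double timeBuffer = 0.0;          // carried measurement error

    // Add this frame's raw measurement to the buffer.
    timeBuffer += measuredDelta;

    // Frames are only ever shown on vsync boundaries, so round to
    // the nearest whole number of intervals (possibly 0, or 2+).
    const double intervals = std::floor(timeBuffer / vsyncInterval + 0.5);
    const double smoothedDelta = intervals * vsyncInterval;

    // Keep the residual; it averages out to zero over time
    // instead of appearing as per-frame jitter.
    timeBuffer -= smoothedDelta;
    return smoothedDelta;
}

Whatever error gets rounded away on one frame is paid back on a later frame, so no time is ever lost; the simulation just always advances by a whole number of display intervals.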

I've never thought of / tried this, but it does make sense in a vsync'ed system to only use deltas that are a multiple of the refresh interval...

When running at 60 Hz, what kind of deltas were you measuring before implementing this system, and what size correction deltas are you applying?
How is your timer itself implemented? Do you use 64bit absolute time as much as possible over 32bit deltas?

Last I checked in my game, when vsync'ed my deltas were 16.66667 ms, which seems correct, but I'm still keen to check out your correction when I get a chance and see if it helps.
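
For what it's worth, by "64bit absolute time" I mean something like the following (a Windows-specific sketch using QueryPerformanceCounter; other platforms have equivalents):

#include <windows.h>

// Keep absolute time in 64-bit ticks and only convert the
// difference to floating point, so precision doesn't degrade
// as the absolute time value grows.
double MeasureDeltaSeconds()
{
    static LARGE_INTEGER frequency = [] {
        LARGE_INTEGER f;
        QueryPerformanceFrequency(&f);
        return f;
    }();
    static LARGE_INTEGER previous = [] {
        LARGE_INTEGER t;
        QueryPerformanceCounter(&t);
        return t;
    }();

    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    const LONGLONG ticks = now.QuadPart - previous.QuadPart; // 64-bit subtraction
    previous = now;
    return static_cast<double>(ticks) / static_cast<double>(frequency.QuadPart);
}

Subtracting the 64-bit tick counts first and only then converting to double keeps the delta precise even after the program has been running for hours.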


My point is that your corrected timing code is fine.

 

You will never get guaranteed time deltas without a hard-realtime OS. The kernel is free to schedule some other process for a quantum, which means you don't get a chance to even run when your timestep "should" be firing. This is why you get timing variances like you describe, and in a multitasking OS, it's actually a very good thing.

 

This compensation is actually relatively common in my experience; basically you're just eating the time variance by amortizing it across the subsequent frame(s).


Hodgman - 

 

My deltas don't vary much, generally between 16 and 17 ms or so, but they can be much smaller or larger. Also, when triple buffering, multiple updates must sometimes happen during the same vsync interval; otherwise you aren't really buffering anything, right? My engine uses a double for the timer, but that shouldn't affect this issue. Changing to a fixed-point solution or increasing the timer accuracy won't get rid of the fluctuation. I am curious about why your measured deltas don't vary; have you tried logging them to a file?

 

ApochPiQ - 

 

Thanks, that helps a little. I like to think of it as buffering the time delta. I figured this was a common thing that I must simply have never come across or seen mentioned anywhere, because it seems pretty damn important. That's why I'm asking if I'm overthinking things or maybe just using the wrong terminology.


I am curious about why your measured deltas don't vary; have you tried logging them to a file?

Yeah, I was mistaken. When synced to a display at 60 Hz, the delta log looks like:
...
0.016761
0.016591
0.016652
0.016698
0.016710
0.016666
...
Which over time seems to average out to the correct value, but yes, there is jitter.
[edit]They actually don't average out to 1/60; they're generally higher than 1/60.[/edit]
 
I am running my physics on a fixed time step of 60 Hz (with some internal parts of the physics taking 10 sub-steps for each physics tick), so it seems that occasionally I should actually be doing no physics updates for a frame, followed by two the next frame. Using a minimum delta of 1/60th (and the correction buffer) might smooth this out. Thanks. Edited by Hodgman
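
To spell out the no-tick/two-tick case, a simplified sketch (StepPhysics is a stand-in for my actual physics update, and the names are mine):

// The smoothed delta is a whole number of 1/60 s ticks, so some
// frames run zero physics ticks and some run two; each tick is
// internally split into 10 sub-steps.
void StepPhysics(double dtSeconds); // stand-in for the real physics update

void AdvancePhysics(double smoothedDelta)
{
    const double tick = 1.0 / 60.0;
    const int subSteps = 10;

    const int ticks = static_cast<int>(smoothedDelta / tick + 0.5);
    for (int i = 0; i < ticks; ++i)        // 0, 1, or 2 ticks per frame
        for (int s = 0; s < subSteps; ++s) // internal sub-stepping
            StepPhysics(tick / subSteps);
}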


Interesting idea. I can see how this would help make things extra smooth. I'll have to try it out some time and see what the difference is like (I'm on OS X currently, so I can't try your demo).

 

Question: is there a way to support monitors with different refresh rates? Most monitors these days are 60 Hz, but there are some displays and systems that aren't running at 60 Hz but at something "odd" (for example, some projectors run at 48 or 72 Hz, since most films are 24 fps). If I'm understanding it right, you can't use this method and "properly" support monitors with different refresh rates (then again, the traditional fixed time step doesn't handle that either).
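
One way I could imagine handling it (my speculation, not something from the original post): query the display's actual refresh rate and derive the quantization interval from that. With SDL2, for example, it might look like:

#include <SDL.h>

// Derive the quantization interval from the display's real refresh
// rate instead of hard-coding 60 Hz. Assumes SDL's video subsystem
// has already been initialized; falls back to 60 Hz if unknown.
double QueryVsyncIntervalSeconds()
{
    SDL_DisplayMode mode;
    if (SDL_GetCurrentDisplayMode(0, &mode) == 0 && mode.refresh_rate > 0)
        return 1.0 / mode.refresh_rate;
    return 1.0 / 60.0; // refresh rate unknown; assume 60 Hz
}

Then the smoothing code would snap deltas to multiples of whatever interval this returns, rather than a hard-coded 1/60.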

 

I'm going to have to think about this one some more...
