Expert question: How to get a better time delta?

29 comments, last by Frank Force 11 years ago

Hi,

I'm working on a game engine and something occurred to me about how accurately we measure the delta between frames. I normally use a timer to check the delta, but even when running vsynced the measured delta is never exactly equal to the vsync interval. On average it equals the vsync interval, but the measured delta time can fluctuate quite a bit. This means that even a fancy fixed time step with interpolation is still going to be wrong, because the delta represented between frames is never equal to the vsync interval, even though the frames themselves are always shown at exactly the refresh rate. This causes a tiny bit of jitter that maybe most people don't notice, but I do, and it was really starting to bug me. Clearly something must be done to correct the discrepancy, right?
Please understand that using a fixed time step with interpolation is not going to fix this issue! What interpolation fixes is temporal aliasing; this is more like temporal fluctuation. The solution I worked out corrects the time delta in advance so it is always in phase with the vsync interval. My game engine runs incredibly smoothly with this enabled, so I already know that it works.
My question is: am I crazy, or is this kind of important for every game engine to have? Is there some other, simpler method I'm unaware of that people use to deal with this? I tried posting in a few other places but no one seems interested, or maybe they just don't understand what I'm talking about. One guy was very insulting and basically called me a noob... I've been working in the game industry at major studios for over a decade. If just one person can understand what I'm talking about and/or explain why I'm wrong, that would be totally awesome. Here's a link to my blog post with more info and code...
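
Roughly, the idea is to snap each measured delta to the nearest whole number of vsync intervals and carry the leftover error forward so the game clock never drifts from wall time. Here is a minimal sketch of that idea (not necessarily the blog post's exact code); the function name is made up and a 60 Hz display is assumed:

[code]
#include <cmath>

// Sketch only: snap the measured frame delta to the nearest whole number of
// vsync intervals and carry the residual error forward. Assumes a 60 Hz display.
double SnapDeltaToVsync(double measuredDelta)
{
    static double s_error = 0.0;                // carried-over timing error
    const double kVsyncInterval = 1.0 / 60.0;   // assumed display interval

    // Snap to the nearest whole number of intervals (0, 1, 2, ...).
    double snapped = kVsyncInterval * std::round(measuredDelta / kVsyncInterval);

    // Track the residual so the reported time never drifts from wall time...
    s_error += measuredDelta - snapped;

    // ...and pay it back a whole interval at a time when it grows too large.
    if (s_error >  kVsyncInterval) { snapped += kVsyncInterval; s_error -= kVsyncInterval; }
    if (s_error < -kVsyncInterval) { snapped -= kVsyncInterval; s_error += kVsyncInterval; }

    return snapped;
}
[/code]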


Using a "fixed" timestep is not about expecting your timer to fire ever N milliseconds on the nose. Even when running fixed-step simulation you need to measure the actual wall clock time that elapses and use that to advance your simulation. Otherwise you will get clock drift and indeterminacy over time.

So yeah, you're basically right. I'm not sure what the question is, though, unless you're just looking to make sure you understand the situation correctly (which it seems you do).
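
For reference, the usual shape of that loop, sketched here in C++ with placeholder Update()/Render() functions and an assumed 60 Hz step:

[code]
#include <chrono>

void Update(double dt);      // placeholder: advance the simulation by dt
void Render(double alpha);   // placeholder: draw, interpolating by alpha

void GameLoop()
{
    using Clock = std::chrono::steady_clock;
    const double kStep = 1.0 / 60.0;   // fixed simulation step
    double accumulator = 0.0;
    auto previous = Clock::now();

    for (;;)
    {
        // Measure the wall-clock time that actually elapsed...
        auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // ...and spend it in fixed-size steps so the simulation can't drift.
        while (accumulator >= kStep)
        {
            Update(kStep);
            accumulator -= kStep;
        }

        // Render using the leftover fraction of a step for interpolation.
        Render(accumulator / kStep);
    }
}
[/code]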

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

How do you measure the actual wall-clock time that elapses? I think that's what I'm already doing, and it's what I found to be wrong. So there are two different deltas here: the wall-clock delta and the fixed time step delta. The problem I'm talking about is that the measured wall-clock delta is never an exact multiple of the vsync interval, even though it should be in order for the rendered frames to be interpolated properly. Can you see how that would be an issue?

I don't see why you need some theoretical magic time value for anything to work correctly. Run your simulation as close to a fixed rate as you can, and you're done.

You will never get 100% precisely perfect timings on a consumer OS; they simply aren't designed to give you that guarantee. (Look at process timeslice quantization rules, for example.) Design your code accordingly.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

You say it's not possible to get perfect timing, but you also say you don't really need that for things to work correctly. I agree that a game engine will still work mostly correctly without the stuff I'm talking about; every game engine I've ever seen has certainly worked fine without it. But I don't understand why people are willing to settle for mostly correct. If we could get perfect timing, that would be better, right? I mean, it's not theoretical or magical; monitors have a very specific vsync interval that images are displayed at.

The whole point of what I'm trying to talk about here is that you can get perfect timing that is 100% precise, and this is how you do it. The only sacrifice is a small amount of unavoidable latency, which is a small price to pay for perfect timing and smoothness of motion. With triple buffering it doesn't really matter what the OS does or how much fluctuation there is between deltas. What about my plan makes you think it won't yield perfect timing?

I've never thought of / tried this, but it does make sense in a vsync'ed system to only use deltas that are a multiple of the refresh interval...

When running at 60 Hz, what kind of deltas were you measuring before implementing this system, and what size correction deltas are you applying?
How is your timer itself implemented? Do you use 64-bit absolute time as much as possible over 32-bit deltas?

Last I checked in my game, when vsync'ed my deltas were 16.66667 ms, which seems correct, but I'm still keen to check out your correction when I get a chance and see if it helps.

My point is that your corrected timing code is fine.

You will never get guaranteed time deltas without a hard-realtime OS. The kernel is free to schedule some other process for a quantum, which means you don't get a chance to even run when your timestep "should" be firing. This is why you get timing variances like you describe, and in a multitasking OS, it's actually a very good thing.

This compensation is actually relatively common in my experience; basically you're just eating the time variance by amortizing it across the subsequent frame(s).
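
A rough sketch of that kind of amortization, with made-up names and an arbitrary eight-frame payback rate: report a steady delta and bleed the accumulated discrepancy back in gradually.

[code]
// Sketch only: report a steady delta and spread the measured variance over
// later frames. The 1/8 payback rate is arbitrary, chosen for illustration.
double AmortizedDelta(double measuredDelta, double expectedDelta)
{
    static double s_debt = 0.0;   // time measured but not yet reported

    s_debt += measuredDelta - expectedDelta;

    // Pay back (or borrow) a fraction of the outstanding debt each frame so a
    // single long frame doesn't turn into one visible hitch.
    double correction = s_debt / 8.0;
    s_debt -= correction;

    return expectedDelta + correction;
}
[/code]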

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

Hodgman -

My deltas don't vary much, generally between 16 and 17 ms or so, but they can be much smaller or larger. Also, when triple buffering, multiple updates must sometimes happen during the same vsync interval, otherwise you aren't really buffering anything, right? My engine uses a double for the timer, but that shouldn't affect this issue. Changing to a fixed-point solution or increasing the timer accuracy won't get rid of the fluctuation. I am curious why your measured deltas don't vary; have you tried logging them to a file?

ApochPiQ -

Thanks, that helps a little. I like to think of it as buffering the time delta. I figured this was a common thing that I must have never come across or seen mentioned anywhere, because it seems pretty damn important. That's why I'm asking if I am overthinking things or maybe just using the wrong terminology.

I am curious why your measured deltas don't vary; have you tried logging them to a file?

Yeah I was mistaken, when synced to display at 60Hz, the delta log looks like:
...
0.016761
0.016591
0.016652
0.016698
0.016710
0.016666
...
Which over time seems to average out to the correct value, but yes, there is jitter.
[edit]They actually don't average out to 1/60 - they're generally higher than 1/60[/edit]

I am running my physics on a fixed time step of 60 Hz (with some internal parts of the physics taking 10 sub-steps for each physics tick), so it seems that occasionally I should actually be doing no physics updates for a frame, followed by two the next frame. Using a minimum delta of 1/60th (and the correction buffer) might smooth this out. Thanks.
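
Something like the following, where StepPhysics() is just a placeholder for whatever the internals do:

[code]
void StepPhysics(double dt);   // placeholder for the internal integrator

// The accumulator may run zero physics ticks on one rendered frame and two on
// the next; each 1/60 s tick is split into 10 internal sub-steps.
void AdvancePhysics(double& accumulator)
{
    const double kTick = 1.0 / 60.0;   // fixed physics tick
    const int kSubSteps = 10;          // internal sub-steps per tick

    while (accumulator >= kTick)       // may execute 0, 1, or 2+ times a frame
    {
        for (int i = 0; i < kSubSteps; ++i)
            StepPhysics(kTick / kSubSteps);
        accumulator -= kTick;
    }
}
[/code]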

Interesting idea. I can see how this would help make things extra smooth. I'll have to try it out some time and see what the difference is like (I'm on OS X currently so I can't try your demo).

Question: is there a way to support monitors with different refresh rates? Most monitors these days are 60 Hz, but some displays and systems run at something "odd" (for example, some projectors are 48 or 72 Hz, since most movies are 24 fps). If I'm understanding it right, you can't use this method and "properly" support monitors with arbitrary refresh rates (then again, the traditional fixed time step doesn't handle them either).

I'm going to have to think about this one some more...
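
One way to attack the refresh-rate question, sketched here with SDL2 purely as an example API (any platform-specific query would do): ask the display for its current refresh rate and snap to that interval instead of hard-coding 1/60.

[code]
#include <SDL.h>

// Sketch only, assuming SDL2's video subsystem has already been initialised
// with SDL_Init(SDL_INIT_VIDEO). refresh_rate can be reported as 0 if the
// platform doesn't know it, so fall back to 60 Hz in that case.
double QueryVsyncIntervalSeconds()
{
    SDL_DisplayMode mode;
    if (SDL_GetCurrentDisplayMode(0, &mode) == 0 && mode.refresh_rate > 0)
        return 1.0 / mode.refresh_rate;

    return 1.0 / 60.0;   // fall back to the common case
}
[/code]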

[ I was ninja'd 71 times before I stopped counting a long time ago ] [ f.k.a. MikeTacular ] [ My Blog ] [ SWFer: Gaplessly looped MP3s in your Flash games ]

