
Frank Force

Member Since 05 Jan 2009

#5049258 Expert question: How to get a better time delta?

Posted by Frank Force on 02 April 2013 - 12:20 PM

Hodgman - I'm not surprised your game already felt smooth, because using a fixed time step equal to the vsync interval is the ideal case for avoiding jitter.  It's interesting that you tried forcing it to produce duplicate frames and still couldn't really detect it visually.  It does depend on the type of game; for some games it is less obvious.  The most obvious case is a 2D scrolling game, where pops like this are very apparent; in 3D FPS-type games it seems less obvious.  I also think that many of even the very best games have this jitter issue, so we are trained not to notice it.


I think the whole point of triple buffering is to allow the CPU to fall behind a bit, smoothing out pops at the cost of some latency.  So even if you have just one new back buffer ready when the vsync happens, it will flip it; why wouldn't it?  On the next vsync you will just need to do two updates to catch up.  This is how pops are smoothed out even after a very long frame.  Triple buffering plus vsync is a way of actually decoupling the vsync from the game update.  Even though you get one update per vsync on average, it's really more about keeping the back buffers updated than about waiting for the vsync.
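To make that concrete, here's a toy model in Python (purely illustrative, not real engine code): one frame of extra latency means an interval where the game finishes nothing can go unnoticed, because a completed frame was already waiting in the extra back buffer.

```python
# Toy model of triple buffering + vsync: completed frames queue up in
# the back buffers, and each vsync flips the oldest one. With one
# frame pre-rendered (the latency we pay), a stalled interval is
# absorbed instead of producing a duplicate frame on screen.
from collections import deque

queue = deque([1])   # one frame pre-rendered: the latency cost
produced = 1
displayed = []

# frames the game finishes during each vsync interval:
# steady, steady, a long frame (nothing finishes), then catch-up
finished_per_vsync = [1, 1, 0, 2, 1]

for finished in finished_per_vsync:
    for _ in range(finished):
        produced += 1
        queue.append(produced)
    # at the vsync, flip the oldest completed back buffer (or repeat)
    displayed.append(queue.popleft() if queue else displayed[-1])

print(displayed)  # [1, 2, 3, 4, 5] -- no duplicates despite the stall
```

Without the pre-rendered frame, the stalled interval would force the display to repeat a frame, which is exactly the pop this scheme trades latency to avoid.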


Icebone - Wow, that's a really cool idea!  Keep in mind, though, that due to rounding, some gaps in the pattern may actually be correct.  For example, if something is moving at 1.1 pixels per vsync, you would expect it to jump an extra pixel every 10 vsyncs.  The math to make things move an integer number of pixels with a fixed time step and interpolation is a bit complicated.  What I do to visualize jitter is have a wrapping image scroll continuously across the screen; when running at 60 fps it should move extremely smoothly.  I will need to think more about your idea, though; I like how it is visualized over time.
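Here's a quick sanity check of that rounding claim in plain Python (the speed is just an illustrative number): at 1.1 pixels per vsync, the rounded pixel positions step by 1 pixel on most frames but by 2 pixels once every 10 vsyncs, and that gap is correct, not jitter.

```python
# An object moving 1.1 pixels per vsync, snapped to integer pixels:
# the per-vsync step is usually 1 pixel, with an extra pixel every
# 10th vsync to make up the fractional motion.
SPEED = 1.1  # pixels per vsync

positions = [round(i * SPEED) for i in range(12)]
steps = [b - a for a, b in zip(positions, positions[1:])]

print(positions)  # [0, 1, 2, 3, 4, 6, 7, 8, 9, 10, 11, 12]
print(steps)      # ten steps of 1 pixel and a single step of 2
```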

#5048801 Expert question: How to get a better time delta?

Posted by Frank Force on 31 March 2013 - 11:49 PM

Hodgman - Mine seems to fluctuate a bit more, but that seems about right.  I think I know why it averages higher than 1/60: any frames that get skipped will raise the average above the refresh interval.  Are you doing any kind of interpolation?  The correction buffer will ensure the delta is never less than 1/refreshRate.  You don't want to clamp the minimum delta directly, because even though the measured delta can be less than 1/60, that time still needs to be accounted for.
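A quick back-of-the-envelope check of why skipped frames raise the average (the numbers are illustrative): if 1 frame in 60 misses vsync and takes two intervals instead of one, the average measured delta comes out to 61/3600 s (about 16.94 ms) instead of 1/60 s (about 16.67 ms).

```python
# Why skipped frames push the average delta above the refresh
# interval: 59 normal frames plus 1 frame that took two vsync
# intervals averages to more than one interval per frame.
VSYNC = 1.0 / 60.0

deltas = [VSYNC] * 59 + [2.0 * VSYNC]  # one skipped frame per second
average = sum(deltas) / len(deltas)

print(average > VSYNC)  # True: ~16.94 ms vs ~16.67 ms
```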


Cornstalks - With a fixed time step you need interpolation to correct for the difference between your monitor's refresh rate and the fixed time step; that is a separate issue.  This method should work fine regardless of the monitor's refresh rate.
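For anyone following along, here's a rough Python sketch of the standard fixed-time-step-with-interpolation loop I keep referring to (the names are mine, not from any particular engine): the simulation advances in fixed steps, and the renderer blends the previous and current states by the leftover fraction of a step.

```python
# Fixed time step with interpolation: run whole simulation steps out
# of an accumulator, then draw an interpolated state using the
# leftover fraction of a step. This corrects temporal aliasing
# between the step size and the refresh rate.
FIXED_STEP = 1.0 / 60.0

def run_frame(state, prev_state, accumulator, measured_delta, update):
    """Advance the simulation and return what the renderer should draw."""
    accumulator += measured_delta
    while accumulator >= FIXED_STEP:
        prev_state = state
        state = update(state, FIXED_STEP)
        accumulator -= FIXED_STEP
    alpha = accumulator / FIXED_STEP               # leftover fraction
    drawn = prev_state + (state - prev_state) * alpha  # blended state
    return state, prev_state, accumulator, drawn
```

For example, a measured delta of 1.5 fixed steps starting from rest runs one simulation step and draws halfway between the two states.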

#5048774 Expert question: How to get a better time delta?

Posted by Frank Force on 31 March 2013 - 08:13 PM

Hodgman - 


My deltas don't vary much, generally between 16 and 17 ms or so, but they can be much smaller or larger.  Also, with triple buffering, multiple updates must sometimes happen during the same vsync interval; otherwise you aren't really buffering anything, right?  My engine uses a double for the timer, but that shouldn't affect this issue: changing to a fixed-point solution or increasing the timer's accuracy won't get rid of the fluctuation.  I am curious why your measured deltas don't vary; have you tried logging them to a file?


ApochPiQ - 


Thanks, that helps a little.  I like to think of it as buffering the time delta.  I figured this was a common technique that I must never have come across or seen mentioned anywhere, because it seems pretty damn important.  That's why I'm asking whether I am overthinking things or maybe just using the wrong terminology.

#5048696 Expert question: How to get a better time delta?

Posted by Frank Force on 31 March 2013 - 03:29 PM

You say it's not possible to get perfect timing, but you also say that a game doesn't really need that to work correctly.  I agree that a game engine will still work mostly correctly without the stuff I'm talking about; every game engine I've ever seen has certainly worked fine without it.  But I don't understand why people are willing to settle for mostly correct; if we could get perfect timing, that would be better, right?  I mean, it's not theoretical or magical: monitors display images at a very specific vsync interval.


The whole point of what I'm trying to say here is that you can get perfect timing that is 100% precise, and this is how you do it.  The only sacrifice is a small amount of unavoidable latency, which is a small price to pay for perfect timing and smoothness of motion.  With triple buffering it doesn't really matter what the OS does or how much the deltas fluctuate.  What about my plan makes you think it won't yield perfect timing?

#5048674 Expert question: How to get a better time delta?

Posted by Frank Force on 31 March 2013 - 02:40 PM


I'm working on a game engine, and something occurred to me about how accurately we measure the delta between frames.  I normally use a timer to check the delta, but even when running vsynced, the measured delta is never exactly equal to the vsync interval.  On average it equals the vsync interval, but the measured delta can fluctuate quite a bit.  This means that even a fancy fixed time step with interpolation is still going to be wrong, because the delta being represented between frames will never equal the vsync interval, even though the frames themselves are always shown at exactly the refresh rate.  This causes a tiny bit of jitter that maybe most people don't notice, but I do, and it was really starting to bug me.  Clearly something must be done to correct the discrepancy, right?
Please understand that using a fixed time step with interpolation is not going to fix this issue!  What interpolation fixes is temporal aliasing; this is more like temporal fluctuation.  The solution I worked out corrects the time delta in advance so it is always in phase with the vsync interval.  My game engine runs incredibly smoothly with this enabled, so I already know that it works.
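To give a rough idea of what "correcting the delta in advance" means, here's a hedged Python sketch (the names and exact form are mine for illustration, not the code from my engine): snap each measured delta to a whole number of vsync intervals and carry the rounding error forward, so the deltas fed to the game stay in phase with the display while no measured time is lost.

```python
# Snap each measured delta to the nearest positive whole number of
# vsync intervals, banking the difference so the total elapsed time
# is preserved. The game then sees deltas that are exact multiples
# of the display interval instead of noisy timer readings.
VSYNC = 1.0 / 60.0

class VsyncDeltaSmoother:
    def __init__(self):
        self.error = 0.0  # time owed to (or by) future frames

    def smooth(self, measured_delta):
        target = measured_delta + self.error
        intervals = max(1, round(target / VSYNC))  # at least one interval
        corrected = intervals * VSYNC
        self.error = target - corrected            # carry the remainder
        return corrected
```

Feeding it noisy deltas around 16-17 ms yields exactly one vsync interval per frame, and a 34 ms hitch comes out as exactly two intervals rather than some in-between value.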
My question is: am I crazy, or is this kind of important for every game engine to have?  Is there some simpler method I'm unaware of that people use to deal with this?  I tried posting in a few other places, but no one seems interested, or maybe they just don't understand what I'm talking about.  One guy was very insulting and basically called me a noob... I've been working in the game industry at major studios for over a decade.  If just one person can understand what I'm talking about and/or explain why I'm wrong, that would be totally awesome.  Here's a link to my blog post with more info and code...