How to fix this timestep once and for all?

Started by blueshogun96
12 comments, last by Kylotan 6 years, 7 months ago

One of the biggest reasons I haven't released my game is this annoying timestep issue.  To be frank, this game was poorly planned, poorly coded, and originally written as a small tech demo and mini-game.  Now it has evolved into a fully featured (and very messy code base of a) game.  If you thought Lugaru was bad, Looptil is far worse!  The problem is that the delta is not consistent: sometimes enemies don't spawn fast enough because the delta isn't steady even at 60fps, which is a big reason why the game is broken.


static uint64_t last_time = 0;
uint64_t current_time = time_get_time(); //( get_current_time() * 1000.0f );

int fps_limit = 60;
float frame_time = float( current_time - last_time );

if( last_time != 0 )
	This->m_delta_speed = frame_time / ( 1000.0f / 60.0f );

And this is my timing function:


#include <cstdint>

#ifdef _WINRT
	#include <windows.h>
#elif __ANDROID__
	#include <sys/time.h>
#else
	#include <chrono>
#endif

uint64_t time_get_time()
{
#ifdef _WINRT
	return GetTickCount64();
#elif __ANDROID__	/* TODO: Fix std::chrono for Android NDK */
	timeval tv;
	gettimeofday( &tv, NULL );

	/* widen before multiplying so a 32-bit time_t can't overflow */
	uint64_t ms = (uint64_t) tv.tv_sec * 1000;
	ms += tv.tv_usec / 1000;
	return ms;
#else
	std::chrono::system_clock::time_point now = std::chrono::system_clock::now();
	std::chrono::system_clock::duration tp = now.time_since_epoch();
	std::chrono::milliseconds ms = std::chrono::duration_cast<std::chrono::milliseconds>( tp );
	return (uint64_t) ms.count();
#endif
}

Now I know some of you will cringe when you see GetTickCount64(), but that's the only function that gives me reliable results on Windows 10 (UWP) ports, so that's staying. 

One more thing to note: my game has a badly written game loop.  It uses a switch statement, followed by draw_game_mode() and update_game_mode(), so I kinda screwed myself there.  I tried changing it, but it broke the game completely, so I left it in its messy state.  Is it possible to simply have a proper delta calculation function, something like the sketch below?  The current one adjusts itself based on the current frame time.  That may not be the best idea, but it was something I whipped up because I needed the game to run okay when it drops to 30fps without running at half speed.  It works in general, but it's inaccurate and causes problems.
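
Something like this is what I have in mind (a hypothetical sketch, not code from the game; the clamp value is arbitrary):

float calc_delta()
{
	static uint64_t last_time = time_get_time();
	uint64_t current_time = time_get_time();

	float frame_time = float( current_time - last_time );
	last_time = current_time;

	/* clamp spikes from alt-tab/suspend to a single 60fps frame */
	if( frame_time > 100.0f )
		frame_time = 1000.0f / 60.0f;

	return frame_time / ( 1000.0f / 60.0f );	/* 1.0 at 60fps, 2.0 at 30fps */
}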

Any ideas?  Thanks.

Shogun

EDIT: Feel free to ask anything in case I missed a vital detail.  My lunch break is ending and it's time for me to go.  Thanks.


Methinks you need to decouple your graphics rendering from the rest of your game update.  If that sounds like gibberish, fear not.  Possibly the best-ever article written on fixing your timestep is available here:  https://gafferongames.com/post/fix_your_timestep/
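
The heart of that article's pattern, in sketch form (reusing your time_get_time(); the dt parameter on update_game_mode is my assumption, since yours presumably takes none):

const float dt = 1.0f / 60.0f;			/* fixed simulation step, in seconds */
float accumulator = 0.0f;
uint64_t prev = time_get_time();

while( running )
{
	uint64_t now = time_get_time();
	accumulator += ( now - prev ) / 1000.0f;	/* frame time, ms -> seconds */
	prev = now;

	if( accumulator > 0.25f )		/* clamp after a stall to avoid a spiral of death */
		accumulator = 0.25f;

	while( accumulator >= dt )
	{
		update_game_mode( dt );		/* simulation always advances by a fixed step */
		accumulator -= dt;
	}

	draw_game_mode();			/* render as often as the machine allows */
}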

Although I see that link shared a lot, it actually made my timing issues worse for this particular game.  In the future, I'll be sure to follow that guide to avoid headaches.

Also, I fixed the problem.  Instead of using frame times, I used my game's actual frame rate divided by 1000.  Now it works perfectly (so far).  L. Spiro is going to kill me if he reads this, but I just want this game to work!

Thanks.

Shogun

Frame rate = Frames Per Second by most usual definitions. Or, Frames Per 1000 Milliseconds, if you will.

If your framerate is N frames per second, then it is also true that your framerate is N/1000 frames per millisecond. Frame time is the reciprocal: milliseconds per frame, which you divide by 1000 to get seconds per frame.

It sounds like you just have a units/order-of-magnitude mixup in the original code, and your 1000 is adjusting for it.
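
Concretely, with the variables from your snippet (purely an illustration of the units):

float frame_time_ms  = float( current_time - last_time );	// ~16.67 at 60fps
float frame_time_sec = frame_time_ms / 1000.0f;			// ~0.0167 seconds per frame
float fps            = 1000.0f / frame_time_ms;			// ~60 frames per second
float delta_speed    = frame_time_ms / ( 1000.0f / 60.0f );	// 1.0 at 60fps, 2.0 at 30fps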

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

On 7/30/2017 at 11:43 PM, blueshogun96 said:

Although I see that link shared a lot, it actually made my timing issues worse for this particular game.  In the future, I'll be sure to follow that guide to avoid headaches.

Also, I fixed the problem.  Instead of using frame times, I used my game's actual frame rate divided by 1000.  Now it works perfectly (so far).  L. Spiro is going to kill me if he reads this, but I just want this game to work!

Thanks.

Shogun

You.

Dirty.

RAT!!

You didn’t make the game work, you just hid the problem under a rug.  It will work differently on various devices so I am not sure how this helps you release anything.

You don’t show the whole game loop.  What is This->m_delta_speed?
Are you accumulating time from 0 = launch of game?  Why the conversion to float?


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

On 30/07/2017 at 6:40 AM, blueshogun96 said:

static uint64_t last_time = 0;
uint64_t current_time = time_get_time(); //( get_current_time() * 1000.0f );

int fps_limit = 60;
float frame_time = float( current_time - last_time );

if( last_time != 0 )
	This->m_delta_speed = frame_time / ( 1000.0f / 60.0f );

Is that the real algorithm? You never update last_time in that algorithm.

On 30/07/2017 at 6:40 AM, blueshogun96 said:

Now I know some of you will cringe when you see GetTickCount64(), but that's the only function that gives me reliable results on Windows 10 (UWP) ports, so that's staying.

Every Win10/UWP device supports the QueryPerformanceCounter/Frequency API - the normal way to do precision timing.

Milliseconds are shitty for game timing. If your loop frequency is 60Hz, then a millisecond timer is accurate to +/- 6%... What's worse is that GetTickCount64 says that it typically has 10-16ms accuracy (+/-60% to 96% error) :(
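
Raw QPC usage is only a few lines, if it helps (a sketch; error checking omitted):

#include <windows.h>

double seconds_now()
{
	static LARGE_INTEGER freq = { 0 };
	if( freq.QuadPart == 0 )
		QueryPerformanceFrequency( &freq );	/* ticks per second, fixed at boot */

	LARGE_INTEGER counter;
	QueryPerformanceCounter( &counter );		/* current tick count */
	return double( counter.QuadPart ) / double( freq.QuadPart );
}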

On 31/07/2017 at 4:43 PM, blueshogun96 said:

Also, I fixed the problem.  Instead of using frame times, I used my game's actual frame rate divided by 1000.  Now it works perfectly (so far).  L. Spiro is going to kill me if he reads this, but I just want this game to work!

Maybe you've got code that just happens to output a small value every frame, which is not actually a measurement of delta time, but happens to simply be some arbitrary number that's small enough to act as a plausible fixed timestep value.

e.g. if you simply hardcode an arbitrary delta, such as "m_delta_speed = 0.06f;" do you get similar results?

On 8/4/2017 at 4:41 AM, L. Spiro said:

You.

Dirty.

RAT!!

You didn’t make the game work, you just hid the problem under a rug.  It will work differently on various devices so I am not sure how this helps you release anything.

You don’t show the whole game loop.  What is This->m_delta_speed?
Are you accumulating time from 0 = launch of game?  Why the conversion to float?


L. Spiro

Yes, now I am finding the flaws as they surface.  Sometimes after coming out of the background or a suspended state, the FPS calculation spews a really high number and the game moves rapidly for a second before going back to normal.  That often gets the player killed.  So yes, I dun f@#%ed up even more.

The entire game loop is too large and a complete mess (I'll never code a game this way again).  The delta_speed variable is a percentage that is multiplied against each entity's speed value so that it moves at an adjusted rate based on the frame rate.  I am not accumulating time, as I didn't plan this thing ahead or even consider the need for time-based movement when I originally wrote it.  Then when primitive counts started reaching the millions, frame rates dropped, and I realized I dun screwed up.
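
To give an idea, usage is basically this pattern everywhere (simplified; the real entity struct is much bigger):

struct entity_t { float x, y, speed_x, speed_y; };

void move_entity( entity_t* e, float delta_speed )
{
	/* speeds are tuned in per-frame (60fps) units, so delta_speed
	   rescales them: ~1.0 at 60fps, ~2.0 at 30fps */
	e->x += e->speed_x * delta_speed;
	e->y += e->speed_y * delta_speed;
}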

On 8/4/2017 at 7:30 AM, Hodgman said:

Is that the real algorithm? You never update last_time in that algorithm.

Every Win10/UWP device supports the QueryPerformanceCounter/Frequency API - the normal way to do precision timing.

Milliseconds are shitty for game timing. If your loop frequency is 60Hz, then a millisecond timer is accurate to +/- 6%... What's worse is that GetTickCount64 says that it typically has 10-16ms accuracy (+/-60% to 96% error) :(

Maybe you've got code that just happens to output a small value every frame, which is not actually a measurement of delta time, but happens to simply be some arbitrary number that's small enough to act as a plausible fixed timestep value.

e.g. if you simply hardcode an arbitrary delta, such as "m_delta_speed = 0.06f;" do you get similar results?

last_time is updated further down in the loop; I forgot to include that part.

If millisecond timing is a bad design choice, then I will do away with it pronto.  I wasn't aware of the poor accuracy, and if the margin of error is that great, then I'll most definitely stop using it.  I wrote that half-arsed timing function out of laziness.  Speaking of high resolution timers, I'll need one that's portable to all three major OSes, which I did find here: http://roxlu.com/2014/047/high-resolution-timer-function-in-c-c--


/* ----------------------------------------------------------------------- */
/*
  Easy embeddable cross-platform high resolution timer function. For each 
  platform we select the high resolution timer. You can call the 'ns()' 
  function in your file after embedding this. 
*/
#include <stdint.h>
#if defined(__linux)
#  define HAVE_POSIX_TIMER
#  include <time.h>
#  ifdef CLOCK_MONOTONIC
#     define CLOCKID CLOCK_MONOTONIC
#  else
#     define CLOCKID CLOCK_REALTIME
#  endif
#elif defined(__APPLE__)
#  define HAVE_MACH_TIMER
#  include <mach/mach_time.h>
#elif defined(_WIN32)
#  define WIN32_LEAN_AND_MEAN
#  include <windows.h>
#endif
static uint64_t ns() {
  static uint64_t is_init = 0;
#if defined(__APPLE__)
    static mach_timebase_info_data_t info;
    if (0 == is_init) {
      mach_timebase_info(&info);
      is_init = 1;
    }
    uint64_t now;
    now = mach_absolute_time();
    now *= info.numer;
    now /= info.denom;
    return now;
#elif defined(__linux)
    static struct timespec linux_rate;
    if (0 == is_init) {
      clock_getres(CLOCKID, &linux_rate);
      is_init = 1;
    }
    uint64_t now;
    struct timespec spec;
    clock_gettime(CLOCKID, &spec);
    now = (uint64_t) spec.tv_sec * 1000000000ull + spec.tv_nsec;  /* integer math avoids double rounding */
    return now;
#elif defined(_WIN32)
    static LARGE_INTEGER win_frequency;
    if (0 == is_init) {
      QueryPerformanceFrequency(&win_frequency);
      is_init = 1;
    }
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    /* split into seconds and remainder so the ns conversion neither
       overflows 64 bits nor loses precision in a double */
    return (uint64_t) (now.QuadPart / win_frequency.QuadPart) * 1000000000ull
         + (uint64_t) (now.QuadPart % win_frequency.QuadPart) * 1000000000ull / win_frequency.QuadPart;
#endif
}
/* ----------------------------------------------------------------------- */

Since this game is cross-platform, it has to work on everything.  If nanoseconds are the way to go, then I'll use that instead.
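
Presumably my delta calculation would then become something like this (a sketch adapting my earlier code to ns()):

static uint64_t last_ns = 0;
uint64_t now_ns = ns();

if( last_ns != 0 )
{
	double frame_seconds = double( now_ns - last_ns ) / 1.0e9;	/* ns -> seconds */
	This->m_delta_speed  = float( frame_seconds * 60.0 );		/* ~1.0 at 60fps */
}
last_ns = now_ns;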

And yes, using the frame rate isn't really a reliable way to do this (it blew up in my face).  I found that using a fixed value will give me consistent results.  A fixed delta doesn't generate any issues for me. 

Shogun

As a side note, in case it helps you (on some future project, I guess): Bullet (the physics API) ticks at a constant rate, but it still allows for frame-specific updates. Of course it only interpolates those between two known states. So it has both predictable behavior and high-frame-rate butter-smooth goodness; apparently nobody noticed it's a tick late.
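
In sketch form the interpolation is just a lerp between the last two fixed ticks (alpha being the fraction of a tick elapsed at render time, e.g. accumulator / dt):

float interpolate( float prev, float curr, float alpha )
{
	/* alpha in [0,1): how far the render frame sits between two physics ticks */
	return prev + ( curr - prev ) * alpha;
}

/* e.g. render_x = interpolate( prev_state.x, curr_state.x, alpha ); */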

I tried something similar in a TD game I wrote years ago (I don't think you remember it): the implication is that you have to correct for inconsistencies, as an enemy spawned at 0.5 ticks still has to be evolved half a tick and cannot be backwards-interpolated at 0.3 ticks. Since the game wanted to be deterministic in nature, I couldn't give players the chance to get different patterns due to hardware power. Ew! Hopefully you don't need this detail!

Previously "Krohm"

On 10/08/2017 at 10:56 AM, blueshogun96 said:

  I found that using a fixed value will give me consistent results.  A fixed delta doesn't generate any issues for me. 

;(

Try it on a PC with a 144Hz monitor now...

'Easy' fix: run your physics with a 720 Hz timestep. Iterate it 5 times per frame for 144Hz monitors/vsync, 12 times for a 60Hz monitor, and 24 times for a 30Hz monitor. No interpolation needed!
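
In code that's just an inner loop (a sketch; refresh_hz detection is up to you, and update_physics is a stand-in name):

const int SIM_HZ = 720;
int substeps = SIM_HZ / refresh_hz;	/* 5 at 144Hz, 12 at 60Hz, 24 at 30Hz */

for( int i = 0; i < substeps; ++i )
	update_physics( 1.0f / SIM_HZ );	/* every monitor gets identical fixed steps */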

This topic is closed to new replies.
