Time-Based Movement: NPCs Spawn Too Slowly When Framerates Are High...


This game of mine was originally written as an experiment, using frame-based movement locked at 60fps. Now that the game has actually gained some favour and evolved greatly since then, I changed its timing system completely, since users could take advantage of intense scenes driving the framerate down. Everything that uses my frame-based movement code works just as intended, except for one thing: NPCs spawn way too slowly when framerates are significantly above 60fps! This doesn't make any sense, especially since everything else works.

The spawning timer is really simple: decrease the timer until it reaches 0, then reset it. When I was using frame-based movement, I would simply decrement the timer by 1 each frame. The timer is typically set to 30 or 20, which means 2 or 3 NPCs spawn every second at 60fps. For time-based updates, I instead subtract the delta time between frames. This doesn't work, and the higher my framerate goes, the slower the spawning gets.

This is the code I'm using:

Delta speed calculation.


    static uint64_t last_time = 0;
    uint64_t current_time = time_get_time(); //( get_current_time() * 1000.0f );
    int fps_limit = 60;

    //if( ( current_time - last_time ) < ( 1000 / fps_limit ) )
    //    return;

    // m_delta_speed is the elapsed time expressed in 60fps frame units,
    // i.e. 1.0 == one 16.67ms frame.
    if( last_time != 0 )
        This->m_delta_speed = float( current_time - last_time ) / ( 1000.0f / 60.0f );

    last_time = current_time;

NPC spawning code


    if( !m_gameover )
    {
        update_user();

        // Count down in frame units; spawn and reset when the timer expires.
        m_spawn_timer -= m_delta_speed;
        if( m_spawn_timer < 0 )
        {
            add_green_square();

            m_spawn_speed = ( This->get_difficulty() ? 30.0f : 20.0f );
            m_spawn_timer = m_spawn_speed;
        }
    }

Once again, the timing code for calculating the delta time works fine for everything else (or at least it appears to), but the code above, used to spawn enemies, doesn't work properly above 60fps. Any ideas? Thanks.

Shogun.

last_time = current_time; should be last_time += (1000.0f/60.0f);

Edit: scratch that, I didn't realize you had that logic commented out. It's actually in your spawn timer:

m_spawn_timer = m_spawn_speed; should be m_spawn_timer+=m_spawn_speed;


Edit 2:

To explain: you're not carrying over whatever negative value the spawn timer was left at. Let's say m_spawn_speed = 2 and m_delta_speed is 3, then in the next frame m_delta_speed is 1. With your current logic, m_spawn_timer ends up at 2 after the first frame and 1 after the second, when it really should be 0, so you've lost time toward a spawning event; over many spawns those discarded fractions add up to missed spawns. By incrementing by the period instead of resetting, you don't lose any accumulated time.
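A minimal sketch of that carry-over reset, with the thread's members passed as plain parameters and the actual spawn left as a comment:

    void tick_spawner( float& spawn_timer, float spawn_speed, float delta_speed )
    {
        spawn_timer -= delta_speed;
        // Loop, in case one slow frame covers more than one spawn interval.
        while( spawn_timer <= 0.0f )
        {
            // ...spawn here, e.g. add_green_square()...
            // spawn_timer is zero or negative at this point; += carries the
            // remainder into the next interval instead of discarding it.
            spawn_timer += spawn_speed;
        }
    }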
Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.


uint64_t current_time = time_get_time(); //( get_current_time() * 1000.0f );

Hmmm... I would suggest checking the timer resolution. If it is too low, then you will skip a fraction of each frame.

Some simple check code


    static float frame_time = 0.0f;
    static uint64_t last_time = 0;

    uint64_t current_time = time_get_time(); //( get_current_time() * 1000.0f );
    int fps_limit = 60;

    //if( ( current_time - last_time ) < ( 1000 / fps_limit ) )
    //    return;

    if( last_time != 0 )
        This->m_delta_speed = float( current_time - last_time ) / ( 1000.0f / 60.0f );

    frame_time += This->m_delta_speed;

    // check for a new second?
    uint64_t last_second = last_time / 1000;
    uint64_t current_second = current_time / 1000;
    if( last_second != current_second )
    {
        printf( "frame_time: %f\n", frame_time );  // should be ~60.0
        frame_time = 0.0f;
    }

    last_time = current_time;

Not sure if that will fix your current problem, but anyway, why are you using float for time?

You are cutting a 64-bit number down to 24 bits of precision.

https://randomascii.wordpress.com/2012/02/13/dont-store-that-in-a-float/
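A minimal sketch illustrating the point (the timestamp value is made up, but any millisecond count since the Unix epoch is far beyond 2^24, the largest range of integers a float's 24-bit significand can represent exactly):

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        uint64_t ms = 1455321600000ULL;   // ~Feb 2016 in ms since the epoch
        float    f  = float( ms );        // round-trip through a float
        printf( "%llu -> %.0f (error: %lld ms)\n",
                (unsigned long long)ms, f,
                (long long)( (int64_t)ms - (int64_t)f ) );
        // The error is tens of thousands of milliseconds; only deltas small
        // enough to fit in 24 bits survive the conversion intact.
        return 0;
    }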

First, as Ashaman73 says, your timer may be an issue. time_get_time only has a default resolution of +/-5ms. You may want to consider using QueryPerformanceCounter to track time.

Second, as wintertime says, it's probably best to use ints to keep it simple. Odds are you'll start out with some sort of integer time value anyway (signed, unsigned, long, whatever).

I see no problem with:

counter = turns_between_spawns

counter -= time_delta

if counter <= 0, spawn and reset counter

If you're running at close to 60 fps, a "turn" is only about 15ms, so the accumulation error mentioned by slicer4ever can probably be ignored. If you want to handle it, reset with counter = turns_between_spawns + counter instead of counter = turns_between_spawns. Note that when you're resetting, counter will always be zero or negative; it will be negative if there is accumulation error. So you want to ADD counter to turns_between_spawns, not subtract it. I suspect any accumulation error is caused by 15ms turn times vs time_get_time's default +/-5ms error; i.e. it's just not accurate enough, and that's the most likely cause of your problems. Sounds like you really need QueryPerformanceCounter: with turn times of 15ms, you just can't work with inaccuracies of 5ms.
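A minimal sketch of such a wrapper, returning the same millisecond units as the thread's time_get_time() (the function name here is made up):

    #ifdef _WIN32
    #include <windows.h>
    #include <stdint.h>

    uint64_t time_get_time_hires()
    {
        static LARGE_INTEGER freq = { 0 };
        if( freq.QuadPart == 0 )
            QueryPerformanceFrequency( &freq );    // ticks per second

        LARGE_INTEGER now;
        QueryPerformanceCounter( &now );
        return (uint64_t)( now.QuadPart * 1000 / freq.QuadPart );  // -> ms
    }
    #endif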

Other things you may want to consider:

Fix your timestep, or use a framerate limiter, to get a fixed rate for update. Either of those will eliminate the effect of variable render times on update rate, giving you a constant turn time.

Fix-your-timestep runs update at a fixed rate and renders as fast as possible, tweening between current and previous states as defined by update. A cap on the renderDT fed into update() for processing in "turn time" sized chunks must be used to avoid dropping frames and to ensure graceful degradation under heavy graphics loads.
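A minimal fix-your-timestep sketch along those lines (after Glenn Fiedler's "Fix Your Timestep!" article; update(), render(), and the 250ms cap are illustrative):

    void update();                          // advance simulation one fixed step
    void render( float tween );             // draw, blending prev/current state

    const float STEP_MS = 1000.0f / 60.0f;  // one fixed "turn"
    static float accumulator = 0.0f;

    void tick( float render_dt_ms )
    {
        if( render_dt_ms > 250.0f )         // cap renderDT so one huge frame
            render_dt_ms = 250.0f;          // degrades gracefully

        accumulator += render_dt_ms;
        while( accumulator >= STEP_MS )     // consume time in fixed chunks
        {
            update();
            accumulator -= STEP_MS;
        }
        render( accumulator / STEP_MS );    // tween factor in [0, 1)
    }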

A framerate limiter caps the speed of the main game loop at some FPS you choose, thus ensuring a constant rate for update and a constant turn time. So, for example, you might run your game at 60 fps max, or maybe 50 fps so it never slows down even when the graphics get really hot and heavy.
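And a minimal framerate limiter sketch (sleep_ms() is a hypothetical platform wrapper, e.g. Sleep() on Windows or nanosleep() elsewhere; time_get_time() is the thread's ms timer):

    void update();
    void render();
    void sleep_ms( uint64_t ms );           // hypothetical platform wrapper
    uint64_t time_get_time();               // the thread's ms timer

    const uint64_t FRAME_MS = 1000 / 60;    // 16ms (truncated; ~62.5fps cap)

    void run_frame()
    {
        uint64_t start = time_get_time();
        update();
        render();
        uint64_t elapsed = time_get_time() - start;
        if( elapsed < FRAME_MS )
            sleep_ms( FRAME_MS - elapsed ); // burn off the leftover time
    }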

Remember, movies only run at 24fps, and games are playable down to a steady 15fps (for when you have really long render and update times, i.e. a big game that does a lot and draws a lot). Higher render fps gets you smoother animation; higher input fps gets you more responsive controls; higher update fps gets you more responsive AI. The minimums seem to be 15fps for render and input, and something like 2-5 fps for update. If render drops below 15 fps, the player doesn't get visual feedback fast enough to continue playing. If input drops below 15 fps, controls become noticeably unresponsive. Update at less than 2-5 fps can cause noticeable lag in AI response time, especially in games with fast, intense combat.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

m_spawn_timer = m_spawn_speed; should be m_spawn_timer += m_spawn_speed; ... by incrementing by the frequency instead of resetting it, you don't lose any accumulation time.

Tried it, didn't work :(


Hmmm... I would suggest checking the timer resolution. If it is too low, then you will skip a fraction of each frame.

I'm assuming this is it. I forgot to include the source for my timer function; the results are generally the same on Windows and Mac OS X.


    #ifdef _WIN32
    int gettimeofday( struct timeval * tp, void * tzp )
    {
        // Note: some broken versions only have 8 trailing zeros; the correct epoch has 9 trailing zeros
        static const uint64_t EPOCH = ((uint64_t) 116444736000000000ULL);

        SYSTEMTIME  system_time;
        FILETIME    file_time;
        uint64_t    time;

        GetSystemTime( &system_time );
        SystemTimeToFileTime( &system_time, &file_time );
        time =  ((uint64_t)file_time.dwLowDateTime )      ;
        time += ((uint64_t)file_time.dwHighDateTime) << 32;

        tp->tv_sec  = (long) ((time - EPOCH) / 10000000L);
        tp->tv_usec = (long) (system_time.wMilliseconds * 1000);
        return 0;
    }
    #endif

    uint64_t time_get_time()
    {
        uint64_t ms = 0;
        timeval tv;

        gettimeofday( &tv, NULL );

        // Cast before multiplying so a 32-bit long doesn't overflow.
        ms = (uint64_t)tv.tv_sec * 1000;
        ms += tv.tv_usec / 1000;

        return ms;
    }
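For what it's worth, GetSystemTime typically only updates at the system timer interval (~15.6ms by default), which lines up with the resolution concern above. A portable alternative sketch, assuming C++11 <chrono> is available (the function name is made up):

    #include <chrono>
    #include <cstdint>

    uint64_t time_get_time_chrono()
    {
        using namespace std::chrono;
        // steady_clock is monotonic: it never jumps when the wall clock is adjusted.
        return (uint64_t)duration_cast<milliseconds>(
                   steady_clock::now().time_since_epoch() ).count();
    }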

Why are you using float for time? You are cutting a 64-bit number down to 24 bits of precision.

Wait, since when does a float only have 24 bits of precision? This is news to me; I thought it was 32-bit.

time_get_time only has a default resolution of +/-5ms. You may want to consider using QueryPerformanceCounter to track time. ... Fix your timestep, or use a framerate limiter, to get a fixed rate for update.

Very insightful. The reason I don't use QueryPerformanceCounter directly is that this game isn't exclusive to Windows. I tried it, and it made my problem even worse. In fact, I'm considering not releasing this game for Windows at all, since it's more mobile and console focused.

I've tried a number of solutions, and either the same thing keeps happening or it just goes too fast. So what I ultimately plan on doing is locking this game at 60fps, going back to frame-based movement, and optimizing my poor excuse for game code. This game doesn't really need to run at 500fps; the only time I'd want it running over 60fps is on the Surface Hub, which runs at a constant 120fps. Although it's not a gaming device, somebody at Microsoft wants to see my game on that thing, and I'm happy to oblige. I think I should probably focus more on efficiency, since such a simple game suffers 300fps drops in simple scenes. The rendering routines in this game are terrible, and the old linked-list code I was too lazy to remove isn't helping (cache misses galore). If I do a sequel to this game, I will definitely make different design choices.

Shogun.

Fix your timestep; your graphics frame rate should have absolutely nothing to do with your simulation update rate.

Consider that with almost no effort you can adjust your monitor frequency. Many monitors support refresh rates as low as 30 Hz; some support rates as high as 200 Hz. If the player takes further steps to disable vsync, they may reach radically higher framerates, potentially into the thousands per second.

If the two are related, I can easily exploit your game: I might reduce my screen's refresh rate to artificially slow the game down, or raise it to get that many more updates and speed it up.

This topic is closed to new replies.
