Timing, wasting time, recycling time [solved]



I was going to post in the Game Programming forum, but I would really like some smart slap-in-the-face replies. No offense to the posters there, but they seem too forgiving. I'm having some strange issues with my Windows timer. This is basically how I've always had it set up:
if( threaded_timer != game_timer )
{
    game_timer = threaded_timer;    // latch the threaded value once per frame
    frame_time = game_timer - last_game_timer;

    UpdateAndRender( frame_time );

    last_game_timer = game_timer;
}
The threaded_timer value is updated outside of my game's thread, so I only read the value once at the beginning of each frame. The if( threaded_timer != game_timer ) line is there to prevent game frames from being updated when no time has passed. I've had issues in the past when trying to update my game with zero time, and realized it was pretty pointless anyway: I can't move or animate anything if no time passes, which makes it pointless to render the screen.

However, I did some checking, and no more than 15 seconds into runtime, my game has already thrown out more than 50,000,000 frames because of the timer not changing. When I tried removing the if( threaded_timer != game_timer ) line to let my game update even though no time passed, it reduced the number of no-time frames to fewer than 2,000 in 15 seconds. It also doubled the frame rate. What gives? I average 5 ms per frame to update and render my game, so I don't understand how even one frame could manage to go by with zero time.

Anyway, I'm wondering what alternatives there are. I don't want to update my game with zero time, but I also want to avoid waiting for the timer to change. Would it be really dumb to use time that doesn't exist yet (like 1 millisecond) and then subtract that used-up time from a 2+ ms frame later on? Would that cause a lot of jerking and jumping around?

I use the timeSetEvent methods to set up my timer with timeGetDevCaps().wPeriodMin resolution. I appreciate any info.

[Edited by - Kest on July 22, 2006 11:58:35 AM]

As you are performing updates to the threaded_timer variable in another thread, have you marked this variable as volatile? The compiler may be making assumptions about the value not changing as you never modify the threaded_timer variable in the game loop.

Quote:
Original post by jamessharpe
As you are performing updates to the threaded_timer variable in another thread, have you marked this variable as volatile? The compiler may be making assumptions about the value not changing as you never modify the threaded_timer variable in the game loop.

Unfortunately, it didn't seem to have any effect. But I didn't realize my variable should have been using that keyword, so thanks a lot.

edit:

Also, changing some code around to:

while( threaded_timer == game_timer ) {}

game_timer = threaded_timer;
frame_time = game_timer - last_game_timer;

UpdateAndRender( frame_time );

last_game_timer = game_timer;

Had no effect whatsoever. Just thought I would add this as extra information.

Ugh. It was just Windows.

I'm really sorry for wasting everyone's time. Apparently, Windows was just getting extremely cranky from not being restarted in three weeks. Everything seems to be working fine after a restart.

Assuming "threaded_timer" means it's being updated in a separate thread, that doesn't sound entirely reliable and will probably cause impossible-to-reproduce bugs.

Hopefully the time is stored in a double or an int64 (else you'll overflow or run out of precision when people enjoy your game so much they play it continuously for three weeks). But that means it's unlikely to be updated atomically: the game thread might read the first 32 bits of threaded_timer, then the timer thread updates the value, then the game thread reads the last 32 bits of the new value and gets all confused and does something crazy, particularly when running on a multiprocessor/multicore system.
Also, the timer's resolution will be limited by the OS's thread scheduling period, particularly when running on a single processor. That could be in the region of 100ms, which is not very good; it'd be more precise and more robust to just use a function like QueryPerformanceCounter inside the game thread.

Quote:
Original post by Excors
Assuming "threaded_timer" means it's being updated in a separate thread, that doesn't sound entirely reliable and will probably cause impossible-to-reproduce bugs.

I'll cover the ones you mentioned, but feel free to add any others.

Quote:
 Hopefully the time is stored in a double or an int64 (else you'll overflow or run out of precision when people enjoy your game so much they play it continuously for three weeks)

32-bits. And it's actually more than seven weeks. Overflow is not very likely to happen, but it's totally safe if it does. Note that only frame_time is sent to the game updating routines. I left this next bit of code out because it was irrelevant; it should be included right above the "frame_time = .." line:

if( last_game_timer > game_timer )
    last_game_timer = game_timer - 1;

The most that will happen on a loop around is a small dump off of whatever that frame's time actually was.
Quote:
 but that means it's unlikely to be updated atomically, so the game thread might read the first 32 bits of threaded_timer and then the timer thread updates the value and then the game thread reads the last 32 bits of the new value and gets all confused and does something crazy, particularly when running on a multiprocessor/multicore system.

That sounds pretty intense. I'm glad I don't need to worry about it.

Quote:
 Also, the timer's resolution will be limited by the OS's thread scheduling period, particularly when running on a single processor, which could be in the region of 100ms, which is not very good - it'd be more precise and more robust to just use a function like QueryPerformanceCounter inside the game thread.

Surely you're exaggerating? I'm pretty sure that's incorrect. This timer is a hardcore multimedia library timer. It's made for exactly this purpose: music playing, sound syncing, and video games. I've been using it since way back in my 32 MB, 233 MHz P1 laptop days, and I have never seen anything other than 1 ms intervals. What horrible OS thread-scheduler design would result in 100 ms delays?

Quote:
Original post by Kest
32-bits. And it's actually more than seven weeks.

Ah, if it's only millisecond resolution then that shouldn't be a problem.

Quote:
 What horrible OS thread schedule design would result in 100 ms delays?

Apparently "[NT 4] Server threads have quantums of 120ms", though it's typically less in non-server Win2K/XP setups. But I guess a properly-designed multimedia timer can find ways to make sure it's updated frequently enough, regardless of what else is tying up the CPU, so it's fine if it's as accurate as necessary and isn't doing anything unpleasant to the hardware that hurts overall performance (like altering timeBeginPeriod and causing more time to be wasted inside the scheduler) [smile]
