jimbogd

please help with this game loop timing code

jimbogd    122
This is my game loop. I'm attempting to run the game at a constant update speed independent of frame rate, and that part works fine. My problem is where I try to update the "frames per second" and "updates per second" variables. I seem to be missing something, as the "updates per second" should read 50 Hz, but instead it reads more like the "frames per second" (on my PC that's about 76 Hz). Can anyone spot what I'm doing wrong? Many thanks.

    const int GAME_UPDATES_PER_SECOND = 50; // desired game speed in Hz
    const float PIC_SPEED_PIXELS_PER_UPDATE = 4.0f;
    const int TICKS_TO_WAIT_FOR_NEXT_GAME_UPDATE = 1000 / GAME_UPDATES_PER_SECOND;
    const int MIN_FPS_LIMIT = 5;
    const int MAX_FRAMESKIP = GAME_UPDATES_PER_SECOND / MIN_FPS_LIMIT;

    float fInterpolation;
    float fDelta = 1.0f; // not needed in this game loop implementation
    int iNextUpdateTick = SDL_GetTicks();
    int iCurrentFPSTick;
    int iPreviousFPSTick = iNextUpdateTick;
    int iElapsedFPSTicks;
    int iElapsedUpdateTicks;
    int iPreviousUpdateTick = iPreviousFPSTick;
    int iCurrentUpdateTick;
    int iLoops;

    g_fFrameRate = 0.0f;
    g_fUpdateRate = 0.0f;

    while( !g_bQuit && !g_bSpace ) {

        // reset update loops counter
        iLoops = 0;

        // perform update (maybe more than once on slow hardware)
        while( SDL_GetTicks() > iNextUpdateTick && iLoops < MAX_FRAMESKIP ) {

            // update game state
            update( PIC_SPEED_PIXELS_PER_UPDATE, fDelta );

            // work out the update rate
            iCurrentUpdateTick = SDL_GetTicks();
            iElapsedUpdateTicks = iCurrentUpdateTick - iPreviousUpdateTick;
            g_fUpdateRate = 1000.0f / static_cast<float>( iElapsedUpdateTicks );
            iPreviousUpdateTick = iCurrentUpdateTick;

            // is it time for another update?
            iNextUpdateTick += TICKS_TO_WAIT_FOR_NEXT_GAME_UPDATE;
            iLoops++;
        }

        // get the interpolation value for the rendering
        fInterpolation = static_cast<float>( SDL_GetTicks() + TICKS_TO_WAIT_FOR_NEXT_GAME_UPDATE - iNextUpdateTick )
                       / static_cast<float>( TICKS_TO_WAIT_FOR_NEXT_GAME_UPDATE );

        // render as fast as possible with interpolation
        render( fInterpolation );

        // work out the frame rate from the elapsed ticks
        iCurrentFPSTick = SDL_GetTicks();
        iElapsedFPSTicks = iCurrentFPSTick - iPreviousFPSTick;
        g_fFrameRate = 1000.0f / static_cast<float>( iElapsedFPSTicks );
        iPreviousFPSTick = iCurrentFPSTick;
    }

Thanks for any help, jimbogd

apollodude217    122
I'm not sure about the line:

g_fUpdateRate = 1000.0f / static_cast<float>( iElapsedUpdateTicks );

I think you want a regular-ol', C-style unary cast, eh? The value is decided at runtime, so it should NOT be cast statically (at compile time), eh? That will give you garbage.

Someone please correct me if I'm wrong.

nobodynews    3126
Quote:
Original post by apollodude217
I'm not sure about the line:

g_fUpdateRate = 1000.0f / static_cast<float>( iElapsedUpdateTicks );

I think you want a regular-ol', C-style unary cast, eh? The value is decided at runtime, so it should NOT be cast statically (at compile time), eh? That will give you garbage.

Someone please correct me if I'm wrong.


Well, I'm pretty sure that this isn't correct. static_cast is, for built-in types, identical to C-style casting. The value being cast changes at run time, but how it is converted does not: the compiler picks the conversion instructions at compile time, and those same instructions then operate on whatever value is there when the program runs.

I will say that the cast is probably unnecessary, as iElapsedUpdateTicks would be converted implicitly from an int to a float anyway.
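
To illustrate (just a sketch; the variable is made up): for built-in types all of these compile to the same int-to-float conversion, so none of them gives garbage at run time:

    int iTicks = 13;                         // in practice, a value known only at run time
    float a = static_cast<float>( iTicks );  // C++-style cast
    float b = (float)iTicks;                 // C-style cast, identical result
    float c = iTicks;                        // implicit conversion, identical result
    // a, b and c all hold 13.0f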

edit: I've looked through it and I can't see anything that looks suspicious in this code. This is probably a silly question, but in your drawing code you aren't printing the frames-per-second value for both the update rate and the frame rate, are you?

jimbogd    122
Quote:
Original post by nobodynews
edit: I've looked through it and I can't see anything that looks suspicious in this code. This is probably a silly question, but in your drawing code you aren't printing the frames-per-second value for both the update rate and the frame rate, are you?


Hi, thanks for looking at the code. I've double-checked and I'm definitely not printing the frames-per-second value for both readings (I wish I was)!

I've been over this code for 2 days now and I still can't see anything wrong with it. :(

cheers

jimbogd

Is it generally just updating the logic once? If it were doing the loop more than once, your value would be wrong, as it's only going to time how long one pass through the logic takes (which is presumably a lot quicker than 1/50th of a second).

jimbogd    122
I'm not sure I see what you mean. I want it to time how long it takes between game updates (not frames per second). I'm doing this by comparing the current tick count to the last time the update() function was run. It's in the inner while() loop because update() may run more than once per frame on very slow rendering hardware (to keep the game speed the same independent of rendering speed).

jimbogd

I'm only really talking about the case when your rendering is going slow enough that the inner loop has to run several times.

Say your overall framerate was around 25fps, due to slow rendering. Each time you got to the inner loop, you'd run through it twice in order for the logic to catch up. In this case, you'd come out of the loop with g_fUpdateRate being quite a high number, because you've just run the logic twice in quick succession.

The first run through the inner loop would calculate a low value for the update rate, but you're then throwing that away on the second update, and effectively just working out the time that update() takes to run (iElapsedUpdateTicks would come out as however many ticks it takes to run update() a second time).
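
For instance (numbers made up for illustration): if that second back-to-back call finishes about 2 ticks after the first, iElapsedUpdateTicks is 2 and g_fUpdateRate reads 1000 / 2 = 500 Hz for that pass, even though the game really is averaging 50 updates per second.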

You might find that if you average the value over a few frames, it comes out to the right figure. Putting the update rate into a small cyclic array each time you calculate it, then averaging the values when you display it generally works quite well.
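
Something like this would do it; just a minimal sketch, and the names and window size are made up rather than taken from your code:

    // minimal sketch of the cyclic-array averaging idea; the names and
    // the window size (RATE_SAMPLES) are illustrative, not from the
    // original code
    const int RATE_SAMPLES = 16;                 // samples to average over
    float g_fRateSamples[ RATE_SAMPLES ] = { 0.0f };
    int g_iRateIndex = 0;

    // call this where g_fUpdateRate is currently assigned
    void recordUpdateRate( float fInstantRate )
    {
        g_fRateSamples[ g_iRateIndex ] = fInstantRate;
        g_iRateIndex = ( g_iRateIndex + 1 ) % RATE_SAMPLES;
    }

    // call this when you display the rate
    float averagedUpdateRate()
    {
        float fSum = 0.0f;
        for( int i = 0; i < RATE_SAMPLES; ++i )
            fSum += g_fRateSamples[ i ];
        return fSum / static_cast<float>( RATE_SAMPLES );
    }

The average takes RATE_SAMPLES updates to settle (the buffer starts zeroed), but after that the spikes from back-to-back updates smooth out.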

jimbogd    122
Hi

If anyone is interested, I've uploaded the code here:
http://www.bu22.com/files/timetest.zip

It's a comparison of various different game loop types (with a running example of each, using some simple animation). Press space to switch between game loops. Might be of interest to some people! :)

If anyone can spot the problem with the update rate calculation (see my initial post above), I'd be grateful.

cheers

jimbogd

Spoonbender    1258
Quote:
Original post by apollodude217
I think you want a regular-ol', C-style unary cast, eh? The value is decided at runtime, so it should NOT be cast statically (at compile time), eh? That will give you garbage.

Someone please correct me if I'm wrong.

Consider yourself corrected. [wink]
static_cast simply means that the code for the cast is generated at compile time. (You're casting between two static, well-known types.)
The opposite is dynamic_cast, which is used for polymorphic types, where you may not know at compile time exactly which type you're dealing with.

So static_cast is simply, well, a regular cast. [grin]
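
A quick illustration with made-up types:

    struct Base { virtual ~Base() {} };       // polymorphic: has a virtual function
    struct Derived : Base {};

    Base* pBase = new Derived;
    Derived* pS = static_cast<Derived*>( pBase );  // no run-time check: trusts you
    Derived* pD = dynamic_cast<Derived*>( pBase ); // checked at run time; yields 0
                                                   // if pBase didn't really point
                                                   // at a Derived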
