Confused About Frames Per Second

Started by
28 comments, last by frob 7 years, 11 months ago

I am confused, however, because it typically ends up being > 1000 FPS. If it drops below 400, I start to notice lag. If it were to run at 60 FPS it would be cripplingly slow. But 60 FPS is what most games run at, isn't it? That is my understanding, anyway. Why don't the numbers I'm seeing match up with my understanding? Is this method incorrect?

It's your own code. You should be able to describe what is happening better than "lag". You should be able to see in your code what is happening.

I didn't see an answer to this question: (Please excuse me if I missed it.)

1. I assume your objects are all moving based on deltaTime? If not, you will need to make that change, because otherwise things will move at different speeds on different computers.

To be clear, where in your code do you determine how far something should move in one frame? Does the time factor into it? You're almost certainly doing it wrong if frame rate breaks your game in any way other than how it looks.

I honestly am not totally understanding the logic of that code. I would instead just use the deltaTime and a formula to get the FPS rather than actually counting frames. It should just be a simple division: get the deltaTime in either seconds (likely 0.xxxx) or milliseconds, then divide 1 second (or 1000 milliseconds) by the deltaTime. That gives you how many "deltaTime"s fit in a second, which happens to be frames per second. You may want to add some smoothing and/or averaging on top of that, but it gets you started.

Also, I would pay attention, maybe even more attention, to the deltaTime itself. Some people find the numbers easier to understand that way, because it directly tells you how many milliseconds the frames are taking, so it is more direct.
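
For illustration, a minimal sketch of that division in C++, assuming deltaTime holds the last frame's duration in seconds and smoothedFps is a value kept between frames (both names, and the 0.05 blend factor, are just placeholders):

// Instantaneous frame rate from the last frame's duration (deltaTime in seconds).
double fps = 1.0 / deltaTime;                      // e.g. 1.0 / 0.001 = 1000 FPS

// Optional exponential smoothing so the on-screen number doesn't flicker every frame.
smoothedFps = (0.95 * smoothedFps) + (0.05 * fps);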

Thanks, this is good advice; however, I don't think it really clears up my confusion. Even after making this change I'm still getting the same results. My deltaTimes are typically ~0.001 seconds, so when I divide 1.0 by ~0.001 I'm still getting around 1000 FPS. Why is this so far off from the 60 FPS I hear is the norm, and why, when it drops to anything lower than 400 FPS, is it drastically lagging?

1,000 FPS isn't unusual; don't think you have done it wrong just because you are getting significantly more FPS than 60. If you have AAA-quality graphics going on and you are getting 1,000 FPS, then chances are you are doing something wrong, but if your scene is very basic then you will probably have a high FPS.

That makes sense, because it's just a 2D game with very few entities in the scene at the moment, but I guess what concerns me most is the lag I get when the FPS is showing as being ~300. That seems like it should still be plenty to run smoothly.

> It's your own code. You should be able to describe what is happening better than "lag". You should be able to see in your code what is happening.

> To be clear, where in your code do you determine how far something should move in one frame? Does the time factor into it?

Well, as far as looking at what's going on in my code, that's actually a whole other issue entirely. 90% of the time it works exactly like I expect it to, but every now and then there's a dip in FPS, sometimes for a second or two, sometimes a little longer (maybe 10 seconds?). I still haven't been able to determine what causes this, but the way I see it, that's irrelevant to this issue; the FPS is still high enough that it should run smoothly. I'm not really sure how else to describe it. It feels jittery, I guess? Movement doesn't feel fluid. Which sort of makes sense, right? If deltaTime is higher, the objects' positions are going to change by larger distances at less frequent intervals, making it seem like they're jumping around. Or am I misunderstanding this?

I am taking deltaTime into consideration in the update step. Here is the code where that is done:


void DynamicsController::update(double time)
{
	// Acceleration from the net force currently applied (a = F / m).
	accelerationX_ = (forceX_ / mass_);
	accelerationY_ = (forceY_ / mass_);

	// Integrate acceleration into velocity, scaled by the frame's delta time.
	velocityX_ += (accelerationX_ * time);
	velocityY_ += (accelerationY_ * time);

	// Apply linear damping (note: applied once per call, not scaled by time).
	velocityX_ += (velocityX_ * linearDampX_);
	velocityY_ += (velocityY_ * linearDampY_);

	// Integrate velocity plus direct movement input into position.
	positionX_ += ((velocityX_ + movementX_) * time);
	positionY_ += ((velocityY_ + movementY_) * time);

	// Commit the floating-point position to the integer screen position.
	position_->setX((int)floor(positionX_));
	position_->setY((int)floor(positionY_));
}

> I guess what concerns me most is the lag I get when the FPS is showing as being ~300. That seems like it should still be plenty to run smoothly.

You're assuming that your frame times are consistent from frame to frame. But if you have intermittent extreme frame times mixed in among mostly short frame times, your game will be noticeably laggy but your FPS counter will still show a high frame rate.

Say for example that, for whatever reason, your normal frame time is 1 ms but every 30th frame takes 70 ms - the game would have an obvious stutter, but your frame counter would still show about 300 FPS.
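
As a standalone sanity check of that arithmetic (a throwaway sketch, not code from the project being discussed), you can average those frame times yourself:

#include <cstdio>

int main()
{
	// 29 fast frames plus one 70 ms spike, as in the example above.
	double totalSeconds = 0.0;
	for (int frame = 0; frame < 30; ++frame)
		totalSeconds += (frame == 29) ? 0.070 : 0.001;

	// 30 frames over ~0.099 seconds still reports roughly 303 FPS,
	// even though the 70 ms frame is a visible hitch.
	std::printf("average FPS: %.0f\n", 30.0 / totalSeconds);
	return 0;
}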

So I think this should clarify what I'm talking about.

You can download the executable here: http://www.mediafire.com/download/7c9ub4dw3vweam7/Demo.zip

When it's running, pressing the "~" key toggles debug mode. If debug mode is active, a sleep call I added causes the frame rate to drop to right around 60 FPS. If you run and jump around you should see what I'm talking about. To me, it's a very noticeable change in how it feels. What I don't understand is, if most games run at 60 FPS, why does mine feel so off? It seems like it should still run smoothly either way.

Oh yeah, the controls are arrow keys for movement, and spacebar for jump.

> What I don't understand is, if most games run at 60 FPS, why does mine feel so off? It seems like it should still run smoothly either way.

Not all frames-per-seconds are equal.

Create another trio of timers, updated every second:

Min ms / Avg ms / Max ms

That is, the shortest frame time in milliseconds, the average frame time, and the longest frame time.

60FPS means 16.6 milliseconds per frame, but with those three numbers you can see quite a lot more.

60FPS can give 16.4 / 16.6 / 16.6

60FPS can give 12.8 / 16.6 / 38.3

60FPS can give 1.4 / 16.6 / 143.8

All are at 60 frames per second. The first is rock solid. The second has at least one small hiccup, a frame taking over two frames' worth of time with others making up for it. The last one, however, indicates some fairly erratic behavior is going on.

Even better, have several sets of timers. One for the full time including the page swap, another one for just your render time. And perhaps another for your simulation times. And perhaps another for your physics times. You'll probably want those timers in microseconds or even nanoseconds rather than milliseconds.

Before long, you'll have enough profiling data that you can start to make informed decisions about the performance of your game.
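
A rough sketch of one such trio of counters, reset once per second; frameMs and the reporting line are assumed names, not anything from the project in this thread:

// Reset these at the start of each one-second reporting window.
double minMs = 1e9, maxMs = 0.0, totalMs = 0.0;
int frames = 0;

// Every frame, with frameMs being the measured frame time in milliseconds:
if (frameMs < minMs) minMs = frameMs;
if (frameMs > maxMs) maxMs = frameMs;
totalMs += frameMs;
++frames;

// Once per second, report and reset:
double avgMs = totalMs / frames;
// printf("%.1f / %.1f / %.1f ms (min / avg / max)\n", minMs, avgMs, maxMs);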

That's a good idea, thanks. I'll add those in and see what I can learn.

Just wanted to update: I think I have an idea of what's going on. I'm using a fixed timestep, as described in this article, like so:


gameTimer_->tick();

// Add this frame's elapsed time to the accumulator...
timeAccumulator_ += gameTimer_->getDeltaTime();

// ...and consume it in fixed-size simulation steps.
while (timeAccumulator_ >= fixedTimeStep_)
{
	entityManager_->update(fixedTimeStep_);

	timeAccumulator_ -= fixedTimeStep_;
}

My fixed timestep is 0.01, but my deltaTimes are usually closer to 0.002, so while this time accumulates there are a few frames where no updates happen. I think maybe I should be disregarding those no-update frames when calculating the frames per second, because what's the purpose of counting a frame where nothing happens, right?

Edit: I made a change so now I'm only counting frames where an update occurred, and my FPS is now displaying as 100, which makes sense because 1 / .01 = 100. When it's slowing down it's at about 20. This makes more sense, so I think this was the problem. Thanks to everyone who contributed to this thread; I don't think I would have figured this out had you guys not pointed me in the right direction! I feel so much better now.

>> I made a change so now I'm only counting frames where an update occurred and my FPS is now displaying as 100,

Then you are counting updates per second, not renders per second. Which do you want?

You are tweening based on the accumulator when you render, right? That's part of the "fix your timestep" algo too, not just "consume ET in DT-sized chunks".
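
For reference, the tweening step from that approach usually looks something like the sketch below; previousPositionX_/previousPositionY_ (the state saved before the last fixed update) and the final draw call are assumed names, not from the code posted above:

// After the while loop, whatever remains in timeAccumulator_ is the fraction
// of a fixed step that hasn't been simulated yet.
double alpha = timeAccumulator_ / fixedTimeStep_;   // in the range [0, 1)

// Render a blend of the previous and current simulated states so movement
// looks smooth even though updates happen in fixed-size chunks.
double renderX = (positionX_ * alpha) + (previousPositionX_ * (1.0 - alpha));
double renderY = (positionY_ * alpha) + (previousPositionY_ * (1.0 - alpha));

// Then draw the entity at (renderX, renderY) instead of (positionX_, positionY_).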

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php
