Dammit, I knew I should have waited until I got home to post the question! I think my pseudo-code might have thrown you off track and caused more confusion than it was worth. Here is the real code:
Stopwatch sw = new Stopwatch();
sw.Start();
long previousElapsedMilliseconds = sw.ElapsedMilliseconds;

// FPS vars
static int framesPerSecond = 0;
static int frameCount = 0;   // how many frames
static long frameMillis = 0; // how many milliseconds spent in frames

while (isGameRunning)
{
    // calc difference in time since last update
    long currentElapsedMilliseconds = sw.ElapsedMilliseconds;
    long deltaMillis = currentElapsedMilliseconds - previousElapsedMilliseconds;
    previousElapsedMilliseconds = currentElapsedMilliseconds;

    // do all updates and draw
    UpdateAndRender(deltaMillis);

    // calc FPS
    frameCount++;
    frameMillis += deltaMillis;
    if (frameMillis >= 1000)
    {
        // 1 second elapsed - time to display FPS, and restart counting
        framesPerSecond = Convert.ToInt32(frameCount * 1000.0 / frameMillis);
        frameMillis -= 1000;
        frameCount = 0;
    }

    displayFPS(framesPerSecond);
}
Now for my comments based on the responses so far (thanks, by the way!):
First: I don't know why you see what you're seeing. Likely has to do with using integers and how you determine "milliseconds."
Can you check my updated code for any glaring errors? I've been staring at it forever and I don't see anything that would randomly increase the FPS by a factor of 7ish.
Second: why do you care? Your updates should take place with fixed intervals (for various reasons, you should fix your timestep ) and you should be controlling your framerate. Anything over 60 fps (actual rendering frequency, not loop delta-time) isn't needed visually, likely isn't compatible with some monitors, can be hard on hardware (GPU and cooling fans, in particular), and is a waste of energy.
There's something extremely weird going on here; any decent programmer would care and want to know what and why. Fixed timesteps are an alternative, sure, but this game isn't using them currently. I also know that my eyes cannot detect 650 FPS, and that my current code might be unnecessarily burdening the video card... however, none of that really has to do with the issue. I still want to solve it. I might make some changes to all of the above later, but first I want to know what's happening.
Third: you should be determining delta-times in microseconds, not milliseconds. If you're really interested in fine differences in time, you shouldn't be using integers to total up floating point values such as 8.33 (1/120), 1.5 (1/650) and 1.3 (1/750). The integer sum = 8 + 1 + 1 = 10. The floating point sum = 11.13. That's just over 3 possible time intervals and you already have a 10% error.
I've never heard of people using microseconds instead of milliseconds. I'll look into that, but I'm not really convinced that milliseconds aren't good enough. You're definitely right about not using integers to add up float values. However, now that I've posted my real code, you can see that the millisecond time values are indeed longs, not floats, so there are no rounding/truncation errors going on.
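Just to illustrate the point I'm agreeing with, here's a quick sketch of the truncation effect, using the hypothetical per-frame durations from the example above (not values from my game):

```csharp
using System;

class TruncationDemo
{
    static void Main()
    {
        // Hypothetical per-frame durations in milliseconds (from the example above).
        double[] frameTimesMs = { 8.33, 1.5, 1.3 };

        long integerSum = 0;   // totals whole milliseconds only
        double floatSum = 0.0; // keeps the fractional parts

        foreach (double t in frameTimesMs)
        {
            integerSum += (long)t; // truncates: 8 + 1 + 1
            floatSum += t;
        }

        Console.WriteLine($"integer sum = {integerSum} ms");  // 10
        Console.WriteLine($"float sum   = {floatSum:F2} ms"); // 11.13
    }
}
```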
Also, about using microseconds: I don't see how lack of accuracy is my current problem. Let's say that my current code determines that one frame takes 12 milliseconds. Maybe if I replaced my Stopwatch with a microsecond timer, I would find out that the frame was in fact 12,183 microseconds (i.e. 12.183 milliseconds). So what? If you follow the logic and math, you can see that microseconds would not make much of a difference. Maybe my FPS calculation would be more accurate by 1-5 FPS. It does not explain why it varies between 90 FPS and 650 FPS between executions.
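If I do end up trying microseconds, my understanding is that Stopwatch can already provide them through ElapsedTicks and Frequency, no separate timer needed. A quick sketch of the conversion (this is just my reading of the docs, not code from my game):

```csharp
using System;
using System.Diagnostics;

class MicrosecondTimerDemo
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();

        // ... UpdateAndRender(...) would go here ...

        // Stopwatch.Frequency is the number of ticks per second, so this
        // converts the raw tick count to whole microseconds. Fine for
        // per-frame deltas; the multiplication could overflow on a
        // stopwatch left running for many days.
        long elapsedMicros = sw.ElapsedTicks * 1000000L / Stopwatch.Frequency;
        Console.WriteLine($"elapsed: {elapsedMicros} microseconds");
    }
}
```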
I think your measurements are off.
Your code is:
if (frameTime > 1000) {
    print "FPS IS " + 1000 * frameCount / frameTime;
    frameCount = 0;
    frameTime -= 1000;
}
I'm not very good at math, but there are several things that seem odd to me. First, what if your frameTime is greater than two seconds, but you only subtract one second's worth? That means, from then on, your framerate would appear doubled.
It should be "frameTime %= 1000" (i.e. get the remainder of a division by 1000, which is basically the same as saying, "keep subtracting until it's below 1000").
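Here's a tiny sketch of the difference (the 5400 ms stall value is made up purely for illustration):

```csharp
using System;

class LeftoverDemo
{
    static void Main()
    {
        // Pretend the game stalled and 5400 ms piled up in the frame-time bucket.
        long subtractBucket = 5400;
        long moduloBucket = 5400;

        subtractBucket -= 1000; // leaves 4400 ms of stale time behind
        moduloBucket %= 1000;   // leaves only the 400 ms remainder

        Console.WriteLine($"after -= 1000: {subtractBucket} ms left over"); // 4400
        Console.WriteLine($"after %= 1000: {moduloBucket} ms left over");   // 400
    }
}
```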
My guess is that sometimes your program happens to take a few extra seconds to start up, or lags for a second, or is minimized for a second (perhaps behind your IDE after starting up), and gains 5000 or so milliseconds of frame time on the first frame... but you never get rid of the extra time, so it sticks around and accidentally inflates your measurements.
I see your point, but the framerate wouldn't be doubled "from then on" - it would only be doubled for one or two iterations through the loop. So only for a few milliseconds, i.e., not even noticeable.
Also, 'FPS' isn't the best measurement to use, because it doesn't scale linearly. One extra FPS when you're running at 100 FPS is not the same gain as one FPS when you are running at 10 FPS. You ought to measure your average frametime, not just the number of frames per second.
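For example, a minimal sketch of an average-frame-time counter (the one-second reporting window and the names are arbitrary choices, not code from the question):

```csharp
using System;

class FrameTimeCounter
{
    static long frameMillisAccum = 0;
    static int framesInWindow = 0;
    static double averageFrameMillis = 0.0;

    // Call once per frame with that frame's delta in milliseconds.
    static void OnFrame(long deltaMillis)
    {
        frameMillisAccum += deltaMillis;
        framesInWindow++;
        if (frameMillisAccum >= 1000) // report roughly once per second
        {
            averageFrameMillis = (double)frameMillisAccum / framesInWindow;
            frameMillisAccum = 0;
            framesInWindow = 0;
        }
    }

    static void Main()
    {
        // Four 250 ms frames fill exactly one window.
        for (int i = 0; i < 4; i++) OnFrame(250);
        Console.WriteLine($"average frame time: {averageFrameMillis} ms"); // 250
    }
}
```

Frame time makes the non-linearity obvious: at 100 FPS a frame takes 10 ms, so one extra FPS shaves off only about 0.1 ms, while at 10 FPS a frame takes 100 ms and one extra FPS shaves off about 9 ms.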
Yeah, I know - I'm not going to put too much stock in the FPS of my game; I'm more interested in how it looks and feels. But now that I have this very strange problem happening, I just have to get to the bottom of it. Saying "oh well, I didn't really need FPS anyway" is not really my style :) I gotta figure this out :)
Thanks all!