A weird problem regarding FPS and elapsed time

Not sure if this is the correct forum to post this question; however, I encountered an FPS problem after running my game for several days. My game displays the FPS and the elapsed time at 1-second intervals (the elapsed time shown is the frame elapsed time at the moment that 1-second interval is up). Initially, as expected, the elapsed time is random but on the order of 1/FPS. However, after a few days of continuous running, the displayed elapsed time becomes fixed at 0.015625 s, while the FPS remains fairly constant at around 90~100. Has anyone encountered such a problem?
I just did a simple test.

I closed the program and started a debug version of the same program. The display is still the same, with 120+ FPS but an elapsed time of 0.015625 s.

Now this confirms that the problem is not 'internal' to one process, since after closing the long-running game and starting another one (the debug build of it), the problem still persists. It must have something to do with CPU or GPU throttling, or some other weird stuff.

With the debug build, I was able to print out a debug trace using DebugView; the output looks like this:

00004291 26.65011978 [1736] 0.000000
00004292 26.66650009 [1736] 0.015625
00004293 26.66662025 [1736] 0.000000
00004294 26.67474937 [1736] 0.015625
00004295 26.68291664 [1736] 0.000000
00004296 26.69121933 [1736] 0.015625
00004297 26.69942474 [1736] 0.000000
00004298 26.71272469 [1736] 0.015625
00004299 26.71585846 [1736] 0.000000
00004300 26.72405052 [1736] 0.015625
00004301 26.73227501 [1736] 0.015625
00004302 26.74053764 [1736] 0.000000
00004303 26.74873543 [1736] 0.015625
00004304 26.76029587 [1736] 0.000000
00004305 26.76518250 [1736] 0.015625
00004306 26.77340889 [1736] 0.000000
00004307 26.78157997 [1736] 0.015625
00004308 26.78972244 [1736] 0.000000
00004309 26.79789734 [1736] 0.015625
00004310 26.80766296 [1736] 0.000000
00004311 26.81431198 [1736] 0.015625
00004312 26.82254982 [1736] 0.000000
00004313 26.83062935 [1736] 0.015625

where the last column is the elapsed time for each frame. It alternates between 0 and 0.015625, so no wonder I see a constant 0.015625 as the elapsed-time display while the FPS is still 120+.

Does anyone have any clue as to what is happening here?
Note that 0.015625 = 2**-6 (only one bit set), which points to a floating-point precision error.
So it looks like you are storing the absolute time in a float, i.e.

float startTime = (float)GetTickCount();  // absolute tick count forced into a float

...

float frameTime = (float)GetTickCount() - startTime;  // the cast has already rounded away the low bits


A float only has 23 bits of mantissa precision, which will not suffice in this case: once the absolute time is large enough, the gap between adjacent representable float values grows past the frame time itself.
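
To make that concrete, here is a minimal sketch (mine, not from the thread) that simulates a timer storing absolute seconds in a float after about 1.5 days of uptime (131072 s = 2**17), where adjacent float values are exactly 2**-6 = 0.015625 s apart:

#include <cstdio>

int main()
{
    // After 131072 s (= 2**17, about 1.5 days) the spacing between
    // adjacent float values is 2**(17-23) = 2**-6 = 0.015625 s.
    double trueTime = 131072.0;          // exact time, kept in a double
    float  prev     = (float)trueTime;   // quantized time, as the buggy code stores it
    int    frames   = 0;
    float  total    = 0.0f;

    while (trueTime < 131073.0)          // simulate one second at ~120 fps
    {
        trueTime += 1.0 / 120.0;         // real frame time: ~8.3 ms
        float now   = (float)trueTime;   // rounds to a multiple of 0.015625
        float delta = now - prev;        // comes out as 0 or 0.015625, nothing else
        prev  = now;
        total += delta;
        ++frames;
    }
    // The deltas telescope when summed, so the 1-second total (and hence
    // the fps counter) still comes out right even though every individual
    // delta is quantized garbage -- exactly the symptom reported above.
    printf("frames=%d total=%f\n", frames, total);
    return 0;
}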
Converting the difference between the tick counts instead will avoid this problem:

DWORD startTime = GetTickCount();  // keep the absolute time as an integer

...

float frameTime = (float)(GetTickCount() - startTime);  // subtract first, then convert; the difference is small enough for a float

This will also handle the wraparound (when the tick count goes from 2**32-1 back to 0) after ~50 days.
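
To illustrate the wraparound point, a quick sketch of my own (using uint32_t in place of DWORD so it compiles anywhere): unsigned subtraction is modulo 2**32, so the difference comes out right even across the wrap:

#include <cstdint>
#include <cstdio>

int main()
{
    // Tick values straddling the 2**32-1 -> 0 wrap (~49.7 days of uptime).
    uint32_t start = 0xFFFFFF00u;             // 256 ms before the wrap
    uint32_t now   = 0x00000100u;             // 256 ms after the wrap
    uint32_t span  = now - start;             // modulo-2**32 arithmetic: 0x200 = 512 ms
    float frameTime = (float)span / 1000.0f;  // span is small, so the float conversion is safe
    printf("span=%u ms, frameTime=%f s\n", span, frameTime);
    return 0;
}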
floats are evil.
:wq!
Emptyhead, thank you so much!

That is exactly what happened; I never realised I was being this *stupid*...

