Irregular Framerates

Started by Novative. 15 comments, last by christian h 17 years, 6 months ago.
Hello out there,

I am experiencing quite a disturbing effect in my OpenGL application. Every now and then (usually around every 20 frames or so), my frame time doubles for one frame, and the very next frame finishes in around 1 ms. As all my animation is time based, the motion appears quite jerky, even though the app is running at a moderate frame rate (~30 fps on an ATI 9600).

In plain numbers, this is an excerpt from what I log:

    ...
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.236902
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 40.363510
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.301156
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 45.085339
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.304229
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 48.914038
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.300597
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 89.839630   // <-- DOUBLED !
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.343619
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 1.100978    // BLAZING FAST ALL OF A SUDDEN !
    Sun Oct 08 17:28:46 2006 DEBUG Update Time 0.240813
    Sun Oct 08 17:28:46 2006 DEBUG Render Time 43.929606
    ...

Could this be connected with the vertical sync? I am using the wglSwapIntervalEXT extension, though...

Does anybody have an idea where to start looking, or what I could do about it? I might well interpolate between the update times, but that seems more like a workaround than a real solution.

Thanks in advance for any suggestions,
Christian

[Edited by - Novative on October 8, 2006 4:12:41 PM]
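(For reference, a minimal sketch of how vsync can be toggled for such a test via the WGL_EXT_swap_control extension; this is illustrative and not from the original post. It assumes an OpenGL context is already current, and SetVSync is a made-up helper name.)

    #include <windows.h>

    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    // interval 0 = vsync off, 1 = wait for one vertical blank per swap.
    void SetVSync(int interval)
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
        if (wglSwapIntervalEXT)
            wglSwapIntervalEXT(interval);
    }

If the spike pattern changes when the swap interval changes, the swap/vsync path is at least involved in the problem.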
The obvious suggestion would be to ask your profiler where all the time is going.

Alternatively, it could be that the time is spent outside the game. Yes, that happens if other processes need the CPU (though an extra 40 ms, if it is milliseconds you're measuring, sounds a bit excessive for that).
Thanks for the suggestions :) Unfortunately it does not seem to be the code as such that gives rise to this problem. What really stuns me is that the following frame completes in about 1 ms, even though it runs through the same code that otherwise needs about 40 ms on average. How can this be possible?

The effect is repeatable (with varying time between the peaks), so I am tempted to rule out the influence of other processes... which leaves me rather puzzled about this annoying behaviour.

Thanks anyway :)

Does the problem occur when your logging is disabled? Or at least its output...
Unfortunately that is not it, which is why this problem troubles me so much. I can tell by the jerky movements that the problem still persists when I have logging turned off.

Maybe I will try a different machine soon, just to rule out some weird influences.

Is there any logical explanation why my code can finish rendering in 1 ms? It seems as if two frames get batched up (thus taking roughly twice the time) and the OpenGL calls of the succeeding frame are ignored/delayed (hence the heavy time saving). But this is beyond my knowledge of OpenGL (Windows?) synchronization.

P.S.: The code is not multi-threaded; it's a plain update/render loop in double-buffered mode under WinXP.
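(One way to test this batching hypothesis; a sketch, not from the original thread. OpenGL drivers may queue commands and return from draw calls immediately, so a frame can appear to finish in 1 ms if its commands merely land in the driver's buffer, while a later frame pays for draining it. Calling glFinish() before stopping the timer forces the queue to drain, so the measurement reflects the work actually submitted that frame. RenderScene() and hdc are hypothetical placeholders for the app's own draw code and device context.)

    #include <windows.h>
    #include <GL/gl.h>

    void RenderScene();   // assumed: the app's normal GL draw calls

    double MeasureFrameMs(HDC hdc)
    {
        LARGE_INTEGER freq, start, stop;
        QueryPerformanceFrequency(&freq);   // ticks per second

        QueryPerformanceCounter(&start);
        RenderScene();
        glFinish();        // block until the GPU has executed everything queued
        SwapBuffers(hdc);
        QueryPerformanceCounter(&stop);

        return 1000.0 * double(stop.QuadPart - start.QuadPart)
                      / double(freq.QuadPart);
    }

If the spikes disappear (and the average render time rises) with glFinish() in place, the 1 ms frames were never really 1 ms; the driver was just buffering.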

Any chance of posting some code? Just to make sure you haven't overlooked anything that someone else might spot as causing the issue.
In addition to posting the code, maybe upload a stripped-down version of the app (simple enough to duplicate the issue) for others to try.

If the problem is happening across multiple computers, focus on your code. If other computers are not affected, then the issue would seem to be with another process running in the background sucking up your resources (spyware?).

Also be sure to check for any Sleep(rand()%100); calls that might be mysteriously placed throughout your code [grin].

- Dan
I had this problem once too, and I found that I had memory leaks. The framerate would drop very low for a moment every few seconds. It was something to do with how the OS (Mac OS X in my case) was handling the page file; run your program, look in the task manager, note down the amount of memory it's using (virtual and physical), and come back an hour later to look again.
My application was leaking one struct of about 24 bytes per frame, and managed to use up all 768 MB of my physical memory within four hours of execution. Look out for memory leaks when using malloc()!
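(A quick way to watch for this kind of leak; a sketch, and dbg_malloc/dbg_free are made-up wrapper names, not from the thread. The idea is to route allocations through counting wrappers and log the outstanding count once in a while: if it grows without bound while the scene stays the same, something is leaking.)

    #include <cstdlib>
    #include <cstdio>

    static long g_outstanding = 0;   // allocations not yet freed

    void* dbg_malloc(size_t n) { ++g_outstanding; return malloc(n); }
    void  dbg_free(void* p)    { if (p) --g_outstanding; free(p); }

    // Call e.g. once per second from the main loop.
    void dbg_report() { printf("outstanding allocations: %ld\n", g_outstanding); }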
Don't thank me, thank the moon's gravitational pull! Post in My Journal and help me to not procrastinate!
[edit]Seeing as your problem is to do with the time rendering is taking, my suggestions are probably wrong. However, how are you timing how long it takes to render?[/edit]

I had a similar problem when I was using the performance timer incorrectly.
My guess is you're losing precision in your timer code, which could have the effect of returning times that are shorter or longer than the actual elapsed time for one frame.
E.g. once I corrected my timer code and compared it with the old version, I found results like this (n.b. values are from memory and are illustrative):
    actual | returned
    33ms   | 16ms
    31ms   | 30ms
    31ms   | 30ms
    31ms   | 80ms
    32ms   | 16ms
    36ms   | 60ms

This caused the game to animate very jerkily: appearing smooth for a few frames, then jumping, then stalling, then going smooth again...

A "quick fix" is to average the timer over the last few frames, but that's a real hack... Perhaps you could post your timer code?
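(For reference, a minimal sketch of a QueryPerformanceCounter-based frame timer; this class is illustrative, not the poster's actual code. The usual precision pitfall is doing the tick-to-milliseconds conversion in 32-bit or single-precision arithmetic; keeping the full 64-bit tick counts and dividing in double avoids it.)

    #include <windows.h>

    class FrameTimer
    {
        LARGE_INTEGER m_freq, m_last;
    public:
        FrameTimer()
        {
            QueryPerformanceFrequency(&m_freq);   // ticks per second
            QueryPerformanceCounter(&m_last);
        }

        // Milliseconds elapsed since the previous call.
        double TickMs()
        {
            LARGE_INTEGER now;
            QueryPerformanceCounter(&now);
            double ms = 1000.0 * double(now.QuadPart - m_last.QuadPart)
                               / double(m_freq.QuadPart);
            m_last = now;
            return ms;
        }
    };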
Always question authority......unless you're on GameDev.net, then it will hurt your rating very badly so just shut the fuck up.
Thanks for all the valuable input :)

I checked my timer implementation (thanks @PhilMorton for the hint :)) against a simple GetTickCount() difference (I am using QueryPerformanceCounter otherwise) around my render code.

The timings differ within their natural resolution, but the overall picture is the same (the second value being the GetTickCount() difference):

Mon Oct 09 07:34:21 2006 DEBUG Render Time 34.136461 / 31
Mon Oct 09 07:34:21 2006 DEBUG Render Time 74.885775 / 78
Mon Oct 09 07:34:21 2006 DEBUG Render Time 0.895365 / 0 // !!!
Mon Oct 09 07:34:21 2006 DEBUG Render Time 34.011306 / 47
Mon Oct 09 07:34:21 2006 DEBUG Render Time 75.542283 / 63
Mon Oct 09 07:34:21 2006 DEBUG Render Time 0.910730 / 0 // !!!!

I also checked for memory leaks (thanks @speciesUnknown), but at least there I seem to have done my homework ;)

Tried it on my laptop just now; it's the same. I put together a little binary package that should contain everything needed to run my little sample application. Please note, however, that the source is not thoroughly tested and I cannot promise it will run on every hardware configuration (unfortunately, for now FBO and VBO support is hardcoded ...).

I would be really grateful if someone might take a look at it.

All you need to do is start the program, let it run for a few seconds (use the arrow keys for basic orbiting if you feel the vast sample scene is a little too boring to look at ;)) and then check the log file. I added a little marker that shows up at the end of the line whenever the actual render time is more than 1.5 times the previous one. This falsely marks a few spots where the rendering time is higher due to a changed viewing distance, but most of the time it hits the spots I am looking for.
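(The marker logic described above amounts to something like this sketch; the names are illustrative, not from the actual source.)

    // Flag frames whose render time exceeds 1.5x the previous frame's.
    bool IsSpike(double renderMs, double prevRenderMs)
    {
        return prevRenderMs > 0.0 && renderMs > 1.5 * prevRenderMs;
    }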

Thanks for the support, and thanks in advance to everyone who risks having a look. Here is the link to the binary:

Sample App [1.2 MB]

Please report, if any files are missing.
