
Mysterious thing slows down application


I'm making a game in OpenGL and C++. I have two PCs, each dual-booting two OSes. Here's the game's FPS on each configuration:

- Athlon 1700XP with GeForce 3, ArchLinux: 30 fps
- Athlon 1700XP with GeForce 3, Windows: 30 fps
- Intel E6600 with GeForce 7600, ArchLinux: 80 fps
- Intel E6600 with GeForce 7600, Windows: 11 fps, sometimes suddenly jumping to 60 fps (!)

60 fps is the maximum on Windows due to vsync.

If I profile the application (with Dev-C++'s profiler, which is by the way about 1000x slower than gprof on Linux: gprof takes no time at all to write its text file, while Dev-C++ takes five minutes to produce the same output, in a window you can't even copy text from), it doesn't show what's causing the slowness. It shows the same hot functions as profiling on the other systems (mostly the physics). The only hint is that the graphics seem to be responsible: disabling them while keeping the physics running brings it up to 60 fps.

I'm quite sure the application itself isn't the problem: on Linux the same code runs much faster, and on Windows it sometimes suddenly jumps to maximum speed too. It's as if some random external factor keeps it at 11 fps and occasionally decides to let it run at 60 fps anyway. What could be causing it?
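One way to rule vsync in or out is to toggle it through the WGL_EXT_swap_control extension; a minimal sketch (Windows-specific, assuming a current OpenGL context and that the extension is available):

```cpp
// Minimal sketch: toggle vsync on Windows via WGL_EXT_swap_control.
// A real program should check the extension string first.
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool setVSync(bool enabled)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false; // extension not available
    return wglSwapIntervalEXT(enabled ? 1 : 0) != FALSE;
}
```

If the 11 fps figure is unchanged with vsync off, the swap interval itself isn't the culprit.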

What's the CPU load?

If the CPU is pegged at maximum, the cause should be trivial to find.

If it's well under 100% (or 100/n % on an n-core machine), then the problem lies either in misuse of a timer, or in resource contention due to locks, a livelock, or something similar.

It could also be heavy misuse of the rendering pipeline, meaning the CPU spends most of its time waiting for the graphics card (and vice versa), rather than constantly streaming data.
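A crude way to check for that is to time the buffer swap separately from the rest of the frame; a sketch (SDL 1.2-era calls, with hypothetical updatePhysics/renderScene standing in for the game's own functions):

```cpp
#include <SDL/SDL.h>
#include <GL/gl.h>
#include <cstdio>

// Hypothetical stand-ins for the game's own update/draw functions.
void updatePhysics() {}
void renderScene()  {}

void timedFrame()
{
    Uint32 frameStart = SDL_GetTicks();

    updatePhysics();
    renderScene();

    Uint32 beforeSwap = SDL_GetTicks();
    glFinish();               // force all queued GL commands to complete
    SDL_GL_SwapBuffers();     // SDL 1.2-style swap
    Uint32 afterSwap = SDL_GetTicks();

    // If swap+finish dominates, the CPU is mostly blocked on the GPU/driver.
    printf("work: %u ms, swap+finish: %u ms\n",
           beforeSwap - frameStart, afterSwap - beforeSwap);
}
```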

Random spikes, however, sound very much like a timing problem to me: either a rounding or an accuracy issue.

Then there's a whole host of other possibilities, including driver/OS conflicts or bugs, hardware issues, conflicts with other applications, IO congestion, and so on.

It's a dual-core CPU and the load when running the game is around 26%, so it uses about half of one core.

Also, I wait 5 milliseconds every frame to free up some CPU time. However, that is NOT the cause of the 11 fps problem: I already tried disabling the 5 millisecond wait as well as the handling of any SDL events, with the same result.
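For reference, here's a small standalone test of what SDL_Delay(5) actually costs on a given system; on Windows the scheduler can stretch it well past 5 ms (a sketch, not code from my game):

```cpp
#include <SDL/SDL.h>
#include <cstdio>

// Measure how long SDL_Delay(5) really takes: the OS scheduler quantum
// can stretch it past 5 ms, which eats into the frame budget.
int main(int argc, char* argv[])
{
    SDL_Init(SDL_INIT_TIMER);
    for (int i = 0; i < 10; ++i)
    {
        Uint32 before = SDL_GetTicks();
        SDL_Delay(5);
        printf("requested 5 ms, got %u ms\n", SDL_GetTicks() - before);
    }
    SDL_Quit();
    return 0;
}
```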

AFAIK there's no multithreading: the sound code uses it, but there's currently no sound in the game, nor is the sound system initialized, so there should be only one thread.

On the old Athlon CPU the problem doesn't occur.

Just pondering...
- Does it have anything to do with the graphics driver or a vendor utility throttling the GPU when it reaches a certain temperature?

- A background app (Windows Update, for example) running?

- Or some non-preemptible task that is stuck polling some device state, e.g. a DVD drive struggling with hard-to-read removable media, or a wireless adapter with a bad signal trying to reconnect repeatedly.

Quote:
Original post by Hodgman
Just guessing here:
Perhaps your 7600 drivers are just buggy? Try installing newer/older graphics drivers.


I installed the newest drivers and, interestingly, it's different now but still not good, so it must be something in my game.

Now it starts at 20 fps and climbs to 50 fps five seconds later, always the same pattern. It should be able to hit 60 fps without problems, I'm quite sure, since the same code gets 80 fps or more on Linux.

Quote:
Original post by Antheus
Random spikes, however, sound very much like a timing problem to me: either a rounding or an accuracy issue.


Could you describe this more? What does a rounding or accuracy issue mean in the context of timing?

Just checking: are you using GLUT and printing text using GLUT API calls?
I've found that, for some unknown reason, GLUT is slower at startup when printing text; after 5-7 seconds it's back to normal again.

Try disabling all text output and check if that's the problem!
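For clarity, by GLUT text printing I mean calls along these lines (a sketch using GLUT's built-in bitmap fonts):

```cpp
#include <GL/glut.h>

// Sketch: draw a string with a GLUT built-in bitmap font at the current
// raster position. These are the calls that seemed slow at startup.
void drawString(float x, float y, const char* text)
{
    glRasterPos2f(x, y);
    for (const char* c = text; *c != '\0'; ++c)
        glutBitmapCharacter(GLUT_BITMAP_HELVETICA_18, *c);
}
```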

/Robert

Quote:
Original post by Rasmadrak
Just checking: are you using GLUT and printing text using GLUT API calls?
I've found that, for some unknown reason, GLUT is slower at startup when printing text; after 5-7 seconds it's back to normal again.

Try disabling all text output and check if that's the problem!

/Robert


In fact I'm not using GLUT at all. I must admit, though, that I'm drawing the letters from a bitmap font in a quite inefficient way: each letter is drawn as its own textured quad.

So the text rendering is noticeably slow, but certainly not slower than the drawing of the planets or the stars. Disabling the text has no different effect than disabling the sky or the planets.
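Concretely, each character ends up as roughly the following immediate-mode call (a sketch; the 16x16 glyph layout is just an illustration, not my exact code):

```cpp
#include <GL/gl.h>

// Sketch: draw one character as a textured quad from a 16x16-cell bitmap
// font texture (the font texture is assumed to be bound already).
void drawChar(char c, float x, float y, float size)
{
    const float cell = 1.0f / 16.0f;        // each glyph is 1/16 of the texture
    unsigned char uc = (unsigned char)c;
    float u = (uc % 16) * cell;
    float v = (uc / 16) * cell;

    glBegin(GL_QUADS);
        glTexCoord2f(u,        v       ); glVertex2f(x,        y       );
        glTexCoord2f(u + cell, v       ); glVertex2f(x + size, y       );
        glTexCoord2f(u + cell, v + cell); glVertex2f(x + size, y + size);
        glTexCoord2f(u,        v + cell); glVertex2f(x,        y + size);
    glEnd();
}
```

Batching all letters into a single glBegin/glEnd (or a vertex array) would be the obvious improvement, but as said, the text isn't the bottleneck here.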

Perhaps have a look at the following two topics at opengl.org:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Board=3&Number=164638&Searchpage=1&Main=34615&Words=affinity&topic=0&Search=true#Post164638
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Board=3&Number=143982&Searchpage=1&Main=31652&Words=affinity&topic=0&Search=true#Post143982
There's a registry key you could try disabling; see if it makes any difference.
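Those threads revolve around dual-core/affinity problems of that era; besides the registry key, a workaround that comes up in that context is pinning the game to a single core (a sketch, and only an assumption that it applies here):

```cpp
#include <windows.h>

// Workaround sketch: pin the current thread to the first CPU core so that
// timer reads (and some buggy drivers) always see the same core.
void pinToFirstCore()
{
    SetThreadAffinityMask(GetCurrentThread(), 1); // bitmask: core 0 only
}
```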

Quote:
Original post by Lode

If I profile the application (with Dev-C++'s profiler, which is by the way about 1000x slower than gprof on Linux: gprof takes no time at all to write its text file, while Dev-C++ takes five minutes to produce the same output, in a window you can't even copy text from), it doesn't show what's causing the slowness.


There's an easy solution to this problem: don't use Dev-C++. [smile]

Quote:
Original post by Lode

Could you describe this more? What does a rounding or accuracy issue mean in the context of timing?


An inaccurate timer gives you an inaccurate FPS measurement. Games tend not to vary wildly in performance from frame to frame unless the scene drastically changes, or something else in the system changes. The simplest timer available on Windows, GetTickCount, is notoriously inaccurate. QueryPerformanceFrequency/QueryPerformanceCounter tends to be the API of choice for most games on Windows, although you need to take some care when using them.
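A minimal sketch of a QPC-based timer (Windows-only; the "care" mostly amounts to reading the frequency once and being aware of multi-core quirks):

```cpp
#include <windows.h>

// Minimal high-resolution timer sketch using QueryPerformanceCounter.
// Returns seconds elapsed since the first call.
double elapsedSeconds()
{
    static LARGE_INTEGER frequency = { 0 };
    static LARGE_INTEGER start;
    if (frequency.QuadPart == 0)
    {
        QueryPerformanceFrequency(&frequency); // ticks per second
        QueryPerformanceCounter(&start);
    }
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return (double)(now.QuadPart - start.QuadPart) / (double)frequency.QuadPart;
}
```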

Ah!
The FPS timer can't be the problem then: I use SDL's GetTicks, but I also take the average of the last 50 frames, so if the FPS changes you see it settle to the new value. So if it says 11 fps, it really means it has been that slow for 50 frames straight, and the values I posted stay stable for minutes.

Also, I'm not just reading this FPS off a number: at around 11 fps the slowness is noticeable simply from how fast the screen redraws.
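Concretely, the averaging looks roughly like this (a sketch; my real code differs in details):

```cpp
#include <SDL/SDL.h>

// Sketch: FPS as an average over the last 50 frame times, so the displayed
// value settles instead of jittering from frame to frame. Call once per
// frame; the first readings are off until the sample window fills up.
const int NUM_SAMPLES = 50;
Uint32 frameTimes[NUM_SAMPLES] = { 0 };
int frameIndex = 0;
Uint32 lastTicks = 0;

double updateFPS()
{
    Uint32 now = SDL_GetTicks();
    frameTimes[frameIndex] = now - lastTicks;   // duration of the last frame
    lastTicks = now;
    frameIndex = (frameIndex + 1) % NUM_SAMPLES;

    Uint32 total = 0;
    for (int i = 0; i < NUM_SAMPLES; ++i)
        total += frameTimes[i];
    return total > 0 ? 1000.0 * NUM_SAMPLES / total : 0.0;
}
```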


Quote:
Original post by MJP
SDL_GetTicks actually uses QPC/QPF on Windows, so that's definitely not your problem then. [smile]

Except if one of the many known bugs with QPC/QPF occurs ;)
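A defensive trick for that: sanity-check each QPC delta against the coarse GetTickCount and fall back when they disagree wildly (a sketch, not a complete fix):

```cpp
#include <windows.h>

// Defensive sketch: reject a QueryPerformanceCounter delta when it disagrees
// badly with GetTickCount (guards against the known leaps on some chipsets).
double safeDeltaSeconds(LARGE_INTEGER prev, LARGE_INTEGER now,
                        DWORD prevMs, DWORD nowMs, LARGE_INTEGER freq)
{
    double qpcDelta  = (double)(now.QuadPart - prev.QuadPart)
                       / (double)freq.QuadPart;
    double tickDelta = (nowMs - prevMs) / 1000.0;

    // If QPC jumped more than ~200 ms away from the coarse timer, trust the
    // coarse timer for this frame instead.
    if (qpcDelta < 0.0 || qpcDelta - tickDelta > 0.2 || tickDelta - qpcDelta > 0.2)
        return tickDelta;
    return qpcDelta;
}
```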
