My game has an issue with micro stuttering. Every second or two the game "jumps" a little, as if a few frames are missed. There is no tearing of the screen, just a small pause and then the jump forward. The issue occurs whether the character is moving or not, scrolling or not (just more noticeable when scrolling), etc. Without fail, every second or two, the game will just jerk/jump/stutter.
The issue is similar to this post, though VSync does not fix the problem.
60 Frames displayed in 1 second
Longest logic time: 1ms
Longest render time: 3ms
Longest frame time: 18ms
This is a sample debug output - every second it displays the number of frames and how long each part took.
The logic handles the events, collisions, etc.
The render time calculates how long it takes to draw all the objects on the screen.
783 Frames displayed in 1 second
Longest logic time: 1ms
Longest render time: 3ms
Longest frame time: 4ms
The two outputs above show VSync enabled and disabled, respectively. I have a very high frame rate and the game never has a spike in processing: the rendering is a steady 3-5ms and the logic is always 1ms. The longest frame never exceeds 5ms unless VSync is enabled, in which case each one is 16-18ms.
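For context, the per-second instrumentation that produces the output above can be sketched roughly as follows. This is a minimal reconstruction using std::chrono rather than SDL's timer, and all names here are illustrative, not the engine's actual code:

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

// Tracks the longest logic/render/frame time seen during the current
// second, and prints a summary (like the debug output above) once a
// full second has elapsed.
struct FrameStats {
    int frames = 0;
    double longestLogicMs  = 0;
    double longestRenderMs = 0;
    double longestFrameMs  = 0;
    Clock::time_point secondStart = Clock::now();

    // Call once per frame with that frame's measured times.
    // Returns true when a one-second summary was just printed.
    bool record(double logicMs, double renderMs, double frameMs) {
        ++frames;
        longestLogicMs  = std::max(longestLogicMs,  logicMs);
        longestRenderMs = std::max(longestRenderMs, renderMs);
        longestFrameMs  = std::max(longestFrameMs,  frameMs);

        if (Clock::now() - secondStart >= std::chrono::seconds(1)) {
            std::printf("%d Frames displayed in 1 second\n", frames);
            std::printf("Longest logic time: %.0fms\n",  longestLogicMs);
            std::printf("Longest render time: %.0fms\n", longestRenderMs);
            std::printf("Longest frame time: %.0fms\n",  longestFrameMs);
            *this = FrameStats{};   // reset counters for the next second
            return true;
        }
        return false;
    }
};
```

Note that instrumentation like this reports the longest frame per second, so a single long frame (the visible hitch) shows up in "Longest frame time" even when the average stays low.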
I started out with SDL 1.2, everything ran fine. Decided to implement OpenGL for more control and better frame rate, then the stuttering began. I thought it may be an issue with SDL so I upgraded to SDL2, still no change in the stutter.
The code I use to start SDL, init GL, and load PNGs into textures I have rewritten two or three times each. Anything that displays to the screen I have rewritten at least twice.
I have taken all the code responsible for setting up OpenGL and SDL, loading an image from a PNG into a GLuint, and displaying it on the screen, and pulled it out into a standalone example. I have posted it on GitHub here:
This code takes a background tile, places it in the top left corner, and moves it to the bottom left corner. During the image's journey from corner to corner you should be able to see the stutter occur a couple of times. Even this very basic example exhibits the same stuttering.
If you have any questions or need additional info, please just ask.
I really, really appreciate you guys' help with this! It's the last hurdle to getting my engine working!
The current state is then set via a function that is called each loop, using a flag, "nextState". When nextState is not null, a switch is entered where the next state is created as follows:
switch( nextState )
{
    case STATE_TITLE:
        currentState = new TitleState();
        break;

    case STATE_INTRO:
        currentState = new IntroState();
        break;
}
then nextState is set to null again, rinse and repeat.
Everything that happens in the game happens from within this state: the rendering is done through the current state, and the same goes for the logic, event handling, etc.
The character, the world map, the level data, EVERYTHING exists inside of these states.
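As a rough sketch of the pattern being described (the class and member names here are guesses for illustration, not the actual engine code):

```cpp
// Minimal sketch of the state pattern described above: every piece of
// per-frame work is delegated to whichever state is current.
struct GameState {
    virtual ~GameState() {}
    virtual void handleEvents() = 0;
    virtual void logic()        = 0;
    virtual void render()       = 0;
};

// Example concrete state; the real TitleState would own the character,
// level data, etc.
struct TitleState : GameState {
    int framesRendered = 0;                  // stand-in for real drawing
    void handleEvents() override { /* poll input */ }
    void logic()        override { /* update world */ }
    void render()       override { ++framesRendered; }
};

// Each pass through the main loop delegates everything to the current
// state, so swapping the currentState pointer swaps the whole game's
// behavior.
void runOneFrame(GameState* currentState) {
    currentState->handleEvents();
    currentState->logic();
    currentState->render();
}
```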
My question is this - does this mean that my whole game will operate from the heap, even though my objects like Character and Level are declared as plain (non-pointer) members?
That would lead me to believe that the Character and Level are on the stack, but since their owning object is on the heap (currentState->character, currentState->level), are they too on the heap? If so, would it be prudent to try and redesign this so that all the states are on the stack?
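For what it's worth, the rule in question can be demonstrated directly: a by-value member is stored inside its enclosing object, so it lives on the heap exactly when the enclosing object does. A minimal sketch, with stand-in types rather than the real Character/Level classes:

```cpp
#include <cstdint>

// Illustrative stand-ins for the real classes.
struct Character { int hp = 100; };

struct Level {
    Character character;   // by-value member: stored inside Level itself
};

// Checks that the member's address lies within the enclosing object's
// bytes. Because it does, the member is on the heap iff the Level is.
bool memberIsInsideObject(const Level& lvl) {
    auto begin = reinterpret_cast<std::uintptr_t>(&lvl);
    auto end   = begin + sizeof(Level);
    auto m     = reinterpret_cast<std::uintptr_t>(&lvl.character);
    return m >= begin && m < end;
}
```

The check holds whether the Level is a local variable or comes from new, which is the whole point: the declaration style of the member doesn't decide its storage, the owner's allocation does.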
I am using Microsoft Visual C++ 2010 Express, and I may have a configuration issue. My question is: why is my debug build so slow compared to my release build? There are a lot of different settings for Release vs Debug, but both are out-of-the-box configurations; I haven't changed anything.
I have just built a quadtree to use for collision detection in my game, using NetBeans 7.0 and 7.1 in a Linux (Ubuntu 11.04) environment. The tree can insert 1,200,000 objects per second on my older, mediocre laptop. The code uses nothing OS-specific, and the only library outside the standard library is Boost 1.47, whose BOOST_FOREACH macro I use for vector element iteration.
The code consists of recursive calls, and in its simplest test run (just inserting; not clearing or removing objects, or querying neighbors) each call only loops over the nodes, does a bounds check, and either inserts the object into a vector or recurses deeper into the tree. My initial incarnation of the quadtree involved template metaprogramming, void* vectors, etc., but I was programming on my desktop and it didn't seem fast enough. Now I'm kicking myself, because I think it could have gone much, much faster if I had only had the correct configuration.
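A timing harness for the kind of measurement quoted above (insertions per second) might look roughly like this. The Quadtree here is a trivial stub standing in for the real tree; only the measurement pattern is the point:

```cpp
#include <chrono>
#include <vector>

// Stub standing in for the real quadtree; insert() is the operation
// being benchmarked.
struct Quadtree {
    std::vector<int> objects;
    void insert(int id) { objects.push_back(id); }
};

// Inserts n objects and returns the measured insertion rate
// (objects per second).
double insertionsPerSecond(int n) {
    Quadtree tree;
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < n; ++i)
        tree.insert(i);
    auto t1 = std::chrono::steady_clock::now();
    double secs = std::chrono::duration<double>(t1 - t0).count();
    return secs > 0 ? n / secs : 0;
}
```

A harness like this is also a quick way to compare Debug and Release builds of the same code on the same machine, since it isolates the insertion loop from the rest of the game.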
When I decided to run it on my new i3 laptop and my powerful i7 desktop, I expected amazing performance, even a ten-fold increase.
I created a new, empty project in VC++, added the source and header files, and added the Boost directory to my include path. This is an out-of-the-box installation of VC++. I hit Build, all was fine, and then it ran. VERY SLOWLY. At barely a tenth the speed, on a machine that is vastly more capable than my Core 2 Duo laptop from 2006. I was amazed, perplexed, and worried all at once.
I went to my new laptop: same problem. I tried installing NetBeans there (it wouldn't work; it needs MinGW).
On my new laptop, I switched from Debug mode to Release and built it (no optimization flags). Then I cd'd into the Release directory and ran the executable. The program hauled ass, churning through 1,200,000 object insertions and 120 tree clears (100,000 objects per frame at 120 fps) about twice as quickly as my old laptop. I turned on the /O2 flag and it went even faster.
I went back and ran Debug mode (with F5, and without debugging via Ctrl+F5) and it was still slow. I cd'd into the Debug directory and ran Test.exe: STILL slow as can be.
In short: Release mode hauls ass, as it should, yet Debug mode crawls. How can I fix this, short of copying all my Release config options over to Debug?
Thank you all for your help. I am a frequent visitor (though first time poster) of gamedev forums, and I know this is the place to go for all my game programming needs, as gamedev has helped me solve a good number of debacles!