Stutter / Micro Stutter Even w/ VSync

My game has an issue with micro stuttering. Every second or two the game "jumps" a little, as if a few frames are missed. There is no tearing of the screen, just a small pause and then the jump forward. The issue occurs whether the character is moving or not, scrolling or not (just more noticeable when scrolling), etc. Without fail, every second or two, the game will just jerk/jump/stutter.
The issue is similar to this post (the thread linked further down), though VSync does not fix the problem.
----------------------------------------
60 Frames displayed in 1 second
Longest logic time: 1ms
Longest render time: 3ms
Longest frame time: 18ms
----------------------------------------
This is sample debug output: every second it prints the number of frames and the longest time each part took.
The logic handles the events, collisions, etc.
The render time measures how long it takes to draw all the objects on the screen.
----------------------------------------
783 Frames displayed in 1 second
Longest logic time: 1ms
Longest render time: 3ms
Longest frame time: 4ms
----------------------------------------
The two outputs above show VSync enabled and disabled, respectively. I have a very high frame rate and the game never has a processing spike. Rendering is a steady 3-5ms and logic is always 1ms. The longest frame never exceeds 5ms unless VSync is enabled, in which case each frame is 16-18ms.
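Roughly, the stats above are gathered like this (a simplified sketch, not the exact engine code; running, updateLogic, renderScene, and window stand in for my own names):

#include <SDL2/SDL.h>
#include <algorithm>
#include <cstdio>

Uint32 statsStart = SDL_GetTicks();
int frames = 0;
Uint32 longestLogic = 0, longestRender = 0, longestFrame = 0;

while (running) {
    Uint32 frameStart = SDL_GetTicks();

    Uint32 t0 = SDL_GetTicks();
    updateLogic();                              // events, collisions, etc.
    longestLogic = std::max(longestLogic, SDL_GetTicks() - t0);

    t0 = SDL_GetTicks();
    renderScene();                              // draw all objects
    longestRender = std::max(longestRender, SDL_GetTicks() - t0);

    SDL_GL_SwapWindow(window);                  // VSync blocks here if enabled
    longestFrame = std::max(longestFrame, SDL_GetTicks() - frameStart);
    ++frames;

    if (SDL_GetTicks() - statsStart >= 1000) {  // report once per second
        printf("%u Frames displayed in 1 second\n", frames);
        printf("Longest logic time: %ums\n", longestLogic);
        printf("Longest render time: %ums\n", longestRender);
        printf("Longest frame time: %ums\n", longestFrame);
        frames = 0;
        longestLogic = longestRender = longestFrame = 0;
        statsStart = SDL_GetTicks();
    }
}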
I started out with SDL 1.2 and everything ran fine. I decided to implement OpenGL for more control and a better frame rate, and then the stuttering began. I thought it might be an issue with SDL, so I upgraded to SDL2; the stutter was unchanged.
The code I use to start SDL, init GL, and load PNGs into textures I have rewritten 2-3 times each. Anything that displays to the screen I have rewritten at least twice.
I have taken all the code responsible for setting up OpenGL and SDL, loading a PNG into a GLuint texture, and displaying it on the screen, and yanked it out into a standalone program. I have posted it on GitHub here: https://github.com/martinisshaken/Sample-SDL2-OpenGL-Program
This code takes a background tile, sticks it in the top left corner, and moves it to the bottom left corner. During the image's journey from corner to corner, you should see the stutter occur a couple of times. Even this very basic example has the same stuttering problem.

If you have any questions or need additional info, please just ask.

I really, really appreciate you guys' help with this! It's the last hurdle to getting my engine working!

--------------------------------------------------------------------------
Systems:
Laptop with 2nd generation intel integrated graphics
Laptop with 1st generation intel integrated graphics
Desktop with i7 920 and 6870 radeon card
OS:
Linux - Ubuntu 13.10, 12.04
Windows 7 (mingw, but have also tried vc++ and the issue persists)
Each box has both a Ubuntu and a Windows 7 installation. Libraries and build environments are all synced.
All drivers are up to date, and all other games that use OpenGL work just fine.

http://www.gamedev.net/topic/541407-sdl-stuttering-problem/

Found a thread with the exact same issue. That poster's code was riddled with SDL_GetTicks() calls to cap the framerate, and he doesn't use frame-independent movement.

My code uses deltas to move the character and environment, and I have tried the framerate both capped by VSync and uncapped. Neither is responsible for the problem.
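By deltas I mean frame-independent movement along these lines (a sketch; the function and speed constant are illustrative, not my actual code):

// dtSeconds: seconds elapsed since the previous frame, measured each frame
void moveSprite(float& xPosition, float& yPosition, float dtSeconds)
{
    const float speedPxPerSec = 60.0f;        // illustrative speed
    xPosition += speedPxPerSec * dtSeconds;   // scale movement by elapsed time
    yPosition += speedPxPerSec * dtSeconds;   // so speed is frame-rate independent
}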

I just tore out all the code for blitting an image to the screen and compiled it independently of the rest of my code. I set up a single image (a 320 x 320 background tile) and moved it across the screen at 1 px per frame @ 60 fps.

EXACT same problem.

The code literally loads an image, draws it with the above function, and just moves it 1 px at a time... and it STILL stutters. It doesn't get any simpler, and I don't understand it.

Maybe the problem is with your timer source. Try running your code on a single CPU core (SetProcessAffinityMask on Windows) - if it still stutters, then it must be that the timer you're using is having sync problems across core switches. As you said, other games run fine, so the problem is probably not with OpenGL - it must be somewhere in your time-keeping code.
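On Windows that would look something like this (sketch):

#include <windows.h>

// Restrict the whole process to CPU 0 so every timer read
// comes from the same core's counter
SetProcessAffinityMask(GetCurrentProcess(), 1);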

Also, try to make sure that what you're seeing isn't tearing. If you enable VSync, you can eliminate that possibility.

And with VSync, you said that your "frame time" is somewhere between 16 and 18 ms? On a 60Hz monitor, any frame that takes longer than 1000/60 = 16.6 ms will be dropped or delayed, so I would also look into what is causing that.

In your second post, it's not clear what you mean by "1px at a time" - are you still using your timer in this case, or just drawing the image continuously? If you're just drawing continuously (no timer delays in between) then with VSync enabled, you shouldn't be getting any stuttering.

Also, are you loading the image every time you draw it, or just once?

If you try all this and it still doesn't work, then the problem must be external to your program - try to find out what other (background) programs are causing CPU spikes every 1-2 seconds.

Can you post your program's main loop (i.e where your timing functions run and where you draw the frame from)?

Also - can you try putting a glFinish before your SwapBuffers call and see if that resolves anything?
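That is, something like this (sketch, assuming SDL2's swap call):

glFinish();                 // block until the GPU has finished all queued work
SDL_GL_SwapWindow(window);  // then present the completed frame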


Thank you for your post and for the help!

I too thought it might be a timer issue, which is why I ripped all the timers out of my sample program. The sample program simply takes the image and, each frame, moves its xPosition and yPosition by +1. Since it's capped by VSync at 60fps, it moves 60 pixels per second across the screen. There are no timers capping the framerate; it relies solely on VSync.
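In other words, the sample's loop boils down to roughly this (a sketch; drawImage stands in for my blitting function):

while (running) {
    xPosition += 1;               // one pixel per frame, no delta timing
    yPosition += 1;
    drawImage(texture, xPosition, yPosition);
    SDL_GL_SwapWindow(window);    // VSync alone paces this at 60 fps
}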

As for the 16-18ms, it usually shows 18ms as the amount of time per frame. I don't understand why though, as the only thing controlling this is VSync (and my monitor is 60hz refresh rate). So 16.6ms would seem right to me, but each frame seems to just take 18ms. And this is with no timers, no frame limiting beyond VSync.

And when I do add timers to delay, pause, nanosleep, etc., the problem is amplified.

Also: the image is loaded only once; I just tried setting the processor affinity to run on the first core only and the problem persisted; and lastly, I reformatted yesterday with a fresh install of 13.10, killed all other running processes, and it still stutters.
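For reference, pinning to the first core on Linux can be done like this (a sketch of one way to do it, not necessarily the exact code I used):

#define _GNU_SOURCE
#include <sched.h>

// Pin the calling process (pid 0) to CPU 0 only
cpu_set_t set;
CPU_ZERO(&set);
CPU_SET(0, &set);
sched_setaffinity(0, sizeof(set), &set);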

https://github.com/martinisshaken/Sample-SDL2-OpenGL-Program

Here is the code I took out that just scrolls the background image across the screen. You may need to tweak the SConstruct's paths for it to build, as I made it for my systems' environments.

Thank you!

Edit:

I have run the program with high-precision timers using std::chrono from C++0x, and I get the following output during the jitter:

Frame Time: 16.6016ms
Frame Time: 19.5782ms
Frame Time: 13.6319ms
Frame Time: 16.6807ms
Frame Time: 16.5073ms
The above is an extreme example, but there are definitely moments where the frame time goes above 16.66 ms.


I have in fact tried glFinish(), as well as glFlush(), and neither has had any impact =/

Also, the GitHub link to all the code is posted above. Thank you!


How are you even measuring those 16-18 ms? I saw no timer calls in the code. If you use something with a granularity of only 1 ms, a measurement that starts 1 µs before the timer ticks over can appear a whole millisecond longer, and the same can happen at the end.


Just edited the above post to show the amount of time the frames are taking, and yes, my timer granularity was not sufficient.

Below is the timer code I have added. I also used this_thread::sleep_for(nanoseconds(.....)) to sleep


#include <chrono>
#include <iostream>

std::chrono::time_point<std::chrono::system_clock> start, end;

start = std::chrono::system_clock::now();
     // Do work
end = std::chrono::system_clock::now();

// Duration in seconds as a double; multiply by 1000 for milliseconds
std::chrono::duration<double> elapsed_seconds = end - start;

double timer = elapsed_seconds.count() * 1000.0;
std::cout << "Frame Time: " << timer << "ms\n";


The frame-time is going higher than 16.66ms on occasion, and this is with just VSync turned on.

When I disable VSync and manually force the thread to sleep (using the aforementioned this_thread::sleep_for) for 16ms, 16.66ms, or 16.66666666ms, I get the same jitter problem. Below is my sleep code:


#include <chrono>
#include <iostream>
#include <thread>

if(timer < 16.66666666)
{
   // Sleep off the remainder of the 60Hz frame; convert ms to ns
   double t = (16.66666666 - timer) * 1000000.0;
   std::cout << "Sleeping for: " << (16.66666666 - timer) << " ms" << std::endl;
   std::this_thread::sleep_for(std::chrono::nanoseconds((long long)t));
}

https://github.com/martinisshaken/Sample-SDL2-OpenGL-Program

Here is the link to the source code, updated to include the frame timers.

Again though, whether I have VSync enabled or not, whether it's 800fps or 60fps, and whether or not I implement timers to try to control the pacing, ALL of these show the same stuttering problem.

If you skip the timer, use VSync, and lock the frame time you feed into your simulation to always be exactly 16.666666666667 ms, does it still stutter?
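That is, something like this (a sketch; updateLogic and renderScene stand in for your own functions):

// Fixed-timestep simulation: the clock is never read; the simulation
// always advances by exactly one 60Hz frame, and VSync does the pacing
const double FIXED_DT = 1.0 / 60.0;   // 16.666...ms per step

while (running) {
    updateLogic(FIXED_DT);
    renderScene();
    SDL_GL_SwapWindow(window);        // blocks until the next vblank
}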

