inconsistent frametime

6 comments, last by L. Spiro 11 years, 10 months ago
Hello all,
I have a scene with nothing but 4 instances of a 26K-vertex model, plus two post passes: one deferred shading pass and one anti-aliasing pass. The resolution is 1280x720, and my frame time follows this pattern:
0.006
0.005
0.005
0.006
0.005
0.012
0.001
0.005
...
This has been bothering me for a long time. Does anyone know why this happens?
You gave hardly enough information for anyone to make a guess.

I assume you are debugging with Microsoft® Visual Studio®, in which case this behavior is sometimes normal.
If you print things to the debug console you will get a little stutter, and even if you don’t you may get some stutter every second or so.
Run the application outside of the debugger and see if there is still stutter. If not, move on.

If there is, consider what else you could share for people to make a better guess.


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

Thanks for the comment.
Yes, it's under Visual Studio 2010 with DX11, and it still stutters outside the debugger.
Even when there are no polygons on screen, or if I reduce the number of render targets I'm using, the frame time still stutters, but in a smaller range, like:
0.002
0.002
0.005
0.001
0.002
...
It's interesting that every longer frame is followed by a shorter frame, and together the pair averages out to the normal frame time.
Update:
I'm debugging on my laptop, which has two graphics cards, and I was running the app on the NVIDIA card.
I tried it on the Intel card, and it runs without significant stuttering, although the frame time is longer. It's something like:
0.009
0.009
0.010
0.008
...

So is it safe to say it's the graphics card?
Since you ran it outside the debugger and replied that it still stutters, my following question may seem silly, but I want it to be explicitly stated for the record: Can you actually visually see this stutter, or are you just trusting the numbers?

Also, you appear to be measuring time in milliseconds. What function(s) are you calling to measure time between frames?


L. Spiro


No, I cannot feel the stutter when dragging the camera around.
I am calling timeGetTime() to get the value, and that's the value I'm using to update all my systems.

OK, since you haven't posted the code:

1) Profile the code with code-profiling software.
2) Run just a blank screen and profile that too.
3) If it's fluctuating by about 1%, call it good. Unbounded FPS fluctuates.
I don’t think the above will be necessary.
There is no stutter. Stop using timeGetTime(). As your own timings clearly show, it is not accurate: it is not updated frequently enough to give you precise frame deltas.

You should be able to easily tell by how low your numbers are that whole milliseconds these days are no longer enough to keep track of time. Games these days need to be using QueryPerformanceCounter() and its companion QueryPerformanceFrequency().

While QueryPerformanceCounter() itself may have its own problems on multi-core systems, in practice these problems never manifest and are largely only possible on a small set of hardware.
In either case, there is nothing better for Windows® in regards to game timing.


L. Spiro


This topic is closed to new replies.
