[DX11] How to accurately measure frame time when vsync is enabled?


I have implemented dynamic resolution rendering in my engine, which requires a measurement of the time it takes to render each frame.

When vsync is disabled (i.e. sync interval == 0), I can just measure the time between successive calls to Present() with CPU timers, and everything works fine.
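For reference, a minimal sketch of what that CPU-side measurement might look like (the names are placeholders, not my engine's actual code):

#include <chrono>
#include <dxgi.h>

using Clock = std::chrono::steady_clock;

static Clock::time_point g_lastPresent = Clock::now();
static double g_cpuFrameMs = 0.0;

void PresentAndMeasure(IDXGISwapChain* swapChain, UINT syncInterval)
{
    swapChain->Present(syncInterval, 0);

    // Wall-clock time since the previous Present() call.
    const Clock::time_point now = Clock::now();
    g_cpuFrameMs = std::chrono::duration<double, std::milli>(now - g_lastPresent).count();
    g_lastPresent = now;
}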

However, when vsync is enabled (sync interval == 1), this no longer works: the measurement always comes out to ~16 ms (the monitor's refresh interval).

Are there any tricks for accurately measuring how long the GPU actually takes to execute the frame's draw commands when vsync is on, so that I can use it to drive my dynamic resolution scale?


Unfortunately you can't really measure how long the GPU takes using CPU timers, especially with vsync on: Present() (or a later D3D call) blocks until the next vertical blank, so the CPU timer just reports the refresh interval. CPU timers do tell you how long the CPU spends processing a frame (and you should definitely measure that!), but for GPU performance you need to take the measurement on the GPU itself.
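In D3D11 the usual mechanism for this is timestamp queries: bracket the frame's GPU work with two D3D11_QUERY_TIMESTAMP queries inside a D3D11_QUERY_TIMESTAMP_DISJOINT query, then convert the tick delta using the reported frequency. The thread doesn't include code, so here is a minimal sketch; error handling is omitted, and in a real engine you'd buffer several of these so you only read results that are a few frames old:

#include <d3d11.h>

struct GpuTimer
{
    ID3D11Query* disjoint = nullptr;   // provides the tick frequency and a validity flag
    ID3D11Query* start    = nullptr;
    ID3D11Query* end      = nullptr;

    void Init(ID3D11Device* device)
    {
        D3D11_QUERY_DESC desc = {};
        desc.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
        device->CreateQuery(&desc, &disjoint);
        desc.Query = D3D11_QUERY_TIMESTAMP;
        device->CreateQuery(&desc, &start);
        device->CreateQuery(&desc, &end);
    }

    void BeginFrame(ID3D11DeviceContext* ctx)
    {
        ctx->Begin(disjoint);
        ctx->End(start);               // timestamp queries only use End()
    }

    void EndFrame(ID3D11DeviceContext* ctx)
    {
        ctx->End(end);
        ctx->End(disjoint);
    }

    // Poll a few frames later; returns false if the data isn't ready or is invalid.
    bool GetMilliseconds(ID3D11DeviceContext* ctx, double* outMs)
    {
        D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
        UINT64 t0 = 0, t1 = 0;
        if (ctx->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK) return false;
        if (ctx->GetData(start, &t0, sizeof(t0), 0) != S_OK)    return false;
        if (ctx->GetData(end, &t1, sizeof(t1), 0) != S_OK)      return false;
        if (dj.Disjoint) return false; // GPU clock changed mid-frame; discard this sample

        *outMs = double(t1 - t0) / double(dj.Frequency) * 1000.0;
        return true;
    }
};

Reading GetData() in the same frame you issued the queries would force a CPU/GPU sync, so keep a small ring of these timers (two or three frames deep) and read back the oldest one each frame.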

I recommend having two timers in your engine/game and displaying/logging both: one for CPU frame time and one for GPU frame time. That immediately tells you whether you're CPU-bound or GPU-bound, and it gives you the right information to make dynamic resolution decisions. With your current setup you would drop the resolution when something like a physics pile-up eats a big chunk of CPU time, even though the GPU might be humming along.
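As an illustration, a controller driven by those two numbers might look roughly like this; the 60 Hz target, thresholds, step sizes, and clamps are arbitrary placeholder values, not anything from the thread:

#include <algorithm>

void UpdateDynamicResolution(double cpuFrameMs, double gpuFrameMs, float* resolutionScale)
{
    const double targetMs = 1000.0 / 60.0;     // assumed 60 Hz frame budget

    if (gpuFrameMs > targetMs * 1.05 && gpuFrameMs >= cpuFrameMs)
    {
        // GPU-bound and over budget: lowering resolution will actually help.
        *resolutionScale = std::max(0.5f, *resolutionScale - 0.05f);
    }
    else if (gpuFrameMs < targetMs * 0.85)
    {
        // Plenty of GPU headroom: creep back up toward native resolution.
        *resolutionScale = std::min(1.0f, *resolutionScale + 0.01f);
    }
    // If cpuFrameMs is the larger number, the frame is CPU-bound and changing
    // resolution won't buy anything, so leave the scale alone.
}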

If you use vsync, you basically want to push the resolution up until the present-to-present time suddenly jumps to the next multiple of the refresh interval (i.e. you miss a vblank). If you can't make the resolution high enough to miss a frame, then great; your game runs at peak frame rate on that system!
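A rough sketch of that check: under vsync the present-to-present time quantizes to multiples of the refresh interval, so a missed vblank on a 60 Hz display shows up as a jump from ~16.7 ms to ~33 ms. The 1.5x threshold and 60 Hz default below are arbitrary:

bool MissedVsyncFrame(double presentToPresentMs, double refreshIntervalMs = 1000.0 / 60.0)
{
    // Anything well past one refresh interval means we slipped to the next vblank.
    return presentToPresentMs > refreshIntervalMs * 1.5;
}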

