Switching PresentInterval after window creation

Is there some way to switch the present interval dynamically after the window has already been created? I cannot find a way to do that in DirectX9. I can only find how to set it at startup. I know this can be done in OpenGL on Windows, and even in DirectX9 on the Xbox 360. So is there something similar in DirectX9 on PC?

Searching the forum, I found this old topic, which suggests I need to reset the device to switch VSync. Does that take significant time, or can it be done at any point between two frames without causing a frame drop because of the reset?

I know that quite a few games these days have VSync on by default and then dynamically turn it off for a short while if the framerate drops too far. When the framerate is back to normal, VSync is turned on again. How is this done on PC in DirectX9? Or is this only done on consoles?

My dev blog
Ronimo Games (my game dev company)
Awesomenauts (2D MOBA for Steam/PS4/PS3/360)
Swords & Soldiers (2D RTS for Wii/PS3/Steam/mobile)

Swords & Soldiers 2 (WiiU)
Proun (abstract racing game for PC/iOS/3DS)
Cello Fortress (live performance game controlled by cello)

To change VSYNC in DX9 you need to reset the device, which is really slow since you have to release all of your non-managed resources first. So you can't change it frame-to-frame to get the "soft VSYNC" effect that you're referring to. In DX10/DX11 the sync interval is a parameter that you pass to Present, so it is possible to change it each frame. However even with that it's still difficult to do a soft VSYNC, since you don't have the same amount of low-level timing info and control that you do on consoles. I'm not sure if any PC games have actually shipped with it, but if they did, they were surely using DX10 or DX11. (The one notable exception is RAGE, which had Nvidia add a soft VSYNC option to the driver for them.)
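For reference, in DX10/DX11 this just means picking a different first argument to Present each frame. A minimal sketch (gpuFrameMs and vsyncBudgetMs are made-up names for values you'd measure and tune yourself):

[code]
// DX10/DX11: the sync interval is a Present parameter, so a "soft vsync"
// policy can flip it per frame instead of fixing it at device creation.
UINT syncInterval = (gpuFrameMs > vsyncBudgetMs) ? 0 : 1; // 0 = immediate, 1 = wait for vblank
swapChain->Present(syncInterval, 0);                      // IDXGISwapChain::Present(SyncInterval, Flags)
[/code]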
On Vista and Win7 D3D9Ex has a WaitForVBlank method, which I guess does the same as the DXGI version, though I haven't tried it.
You could also create a DirectDraw device that you use only for VSync. Check out WaitForVerticalBlank, http://msdn.microsof...y/aa911354.aspx or GetVerticalBlankStatus.
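Both of those boil down to a single blocking call. A rough sketch, assuming hwnd is your window and with error handling omitted:

[code]
// Option A: D3D9Ex on Vista/Win7 - block until the next vertical blank.
d3d9ExDevice->WaitForVBlank(0); // 0 = swap chain index

// Option B: a DirectDraw 7 object created purely for vsync waits.
IDirectDraw7* dd = NULL;
DirectDrawCreateEx(NULL, (void**)&dd, IID_IDirectDraw7, NULL);
dd->SetCooperativeLevel(hwnd, DDSCL_NORMAL);
dd->WaitForVerticalBlank(DDWAITVB_BLOCKBEGIN, NULL); // block until the vblank begins
[/code]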
Okay, sounds like my best option then is to just not support dynamic vsync switching in my engine and move the vsync option from the in-game settings menu to the pre-game launcher.

Kind of funny how some features are impossible in one API and no problem in another. Same OS, same hardware, but DirectX 9 can't do it and OpenGL can. Guess DX10/11 do have some useful improvements after all.

Thanks for the help, folks!


However even with that it's still difficult to do a soft VSYNC, since you don't have the same amount of low-level timing info and control that you do on consoles.
Does DX11 have a GPU timestamp read-back API, and a requirement for GPUs to support it? DX9 is lacking this, and GL has it via an extension (not sure which GPUs do and don't support the extension though).
Being able to stamp your frames to get a value on GPU processing time would be a great base-level API requirement (like on consoles). Even if read-back is delayed, you can use a rolling average to get yourself out of trouble a few frames after vsync starts being consistently harmful.
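The rolling-average part is cheap; for instance an exponential moving average (coefficients here are illustrative only):

[code]
// Fold each newly read-back GPU frame time into a rolling average; if the
// average stays above the vsync budget for a while, drop the sync interval to 0.
avgGpuMs = 0.9 * avgGpuMs + 0.1 * latestGpuMs;
[/code]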

Doesn't DirectX have some equivalent of OpenGL's fences? Fences are not exactly what you describe, but they do give some nice information on where the videocard is at the moment.


There are timestamp queries, which are actually available in DX9 as well.
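A minimal sketch of issuing and reading them back in D3D9 (error handling omitted; device is an existing IDirect3DDevice9*):

[code]
IDirect3DQuery9 *disjoint, *freq, *tsBegin, *tsEnd;
device->CreateQuery(D3DQUERYTYPE_TIMESTAMPDISJOINT, &disjoint);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMPFREQ, &freq);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMP, &tsBegin);
device->CreateQuery(D3DQUERYTYPE_TIMESTAMP, &tsEnd);

// Bracket the frame's work. Timestamp queries only take D3DISSUE_END.
disjoint->Issue(D3DISSUE_BEGIN);
tsBegin->Issue(D3DISSUE_END);
// ... draw calls ...
tsEnd->Issue(D3DISSUE_END);
freq->Issue(D3DISSUE_END);
disjoint->Issue(D3DISSUE_END);

// Read back some frames later; GetData returns S_FALSE while still pending.
UINT64 t0, t1, ticksPerSecond; BOOL wasDisjoint;
if (disjoint->GetData(&wasDisjoint, sizeof(wasDisjoint), D3DGETDATA_FLUSH) == S_OK && !wasDisjoint &&
    tsBegin->GetData(&t0, sizeof(t0), D3DGETDATA_FLUSH) == S_OK &&
    tsEnd->GetData(&t1, sizeof(t1), D3DGETDATA_FLUSH) == S_OK &&
    freq->GetData(&ticksPerSecond, sizeof(ticksPerSecond), D3DGETDATA_FLUSH) == S_OK)
{
    double gpuMs = double(t1 - t0) / double(ticksPerSecond) * 1000.0;
}
[/code]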

You've just blown my mind!
I swear that last time I looked at the local version of this page (inside the DirectX SDK's installed documentation), there was no timestamp query.
I was still under the impression that the only method of timing GPU usage under DX9 was the non-real-time, CPU-blocking, flush & finish method described here.

On consoles, I basically use a ring-buffer of timestamp queries to detect bad performance; the main check uses the deltas to compute a rolling average of GPU frame time, to see whether GPU performance is consistently bad. It seems I can implement this on DX9 as well?
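A sketch of how that ring buffer might sit on top of the DX9 queries shown above (FrameTimestamps, kLatency, and OnFrameEnd are made-up names for illustration):

[code]
struct FrameTimestamps
{
    IDirect3DQuery9 *begin, *end, *freq, *disjoint; // created as shown above
};

const unsigned kLatency = 4;      // frames a query set stays in flight
FrameTimestamps ring[kLatency];
double avgGpuMs = 0.0;

void OnFrameEnd(unsigned frame)
{
    // Poll the oldest slot without forcing a flush (flags = 0); after
    // kLatency frames its results are almost always ready.
    FrameTimestamps& q = ring[frame % kLatency];
    UINT64 t0, t1, ticksPerSecond; BOOL wasDisjoint;
    if (q.disjoint->GetData(&wasDisjoint, sizeof(wasDisjoint), 0) == S_OK && !wasDisjoint &&
        q.begin->GetData(&t0, sizeof(t0), 0) == S_OK &&
        q.end->GetData(&t1, sizeof(t1), 0) == S_OK &&
        q.freq->GetData(&ticksPerSecond, sizeof(ticksPerSecond), 0) == S_OK)
    {
        avgGpuMs = 0.9 * avgGpuMs + 0.1 * (double(t1 - t0) / double(ticksPerSecond) * 1000.0);
    }
    // Re-issue this slot's queries for the new frame before drawing continues.
}
[/code]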

Yeah, I had thought they were new for DX10, but someone else pointed out to me that DX9 has them as well. In my experience the query works pretty much the way you'd expect. Which of course means it has all of the usual latency problems with queries, as well as the "just what exactly am I measuring?" problem you have when reading GPU timestamps.
