AshleysBrain

iTunes improves VSync quality?!


I've noticed something really weird with my little DirectX 9 (C++) game. If there isn't "enough" going on in the scene, it VSyncs badly and hits about 53 FPS, which looks very choppy on my 75 Hz monitor. If I open iTunes - not actually playing anything, just having it sit there - the framerate jumps back up to 75 FPS and it looks fine. What's going on?! Can't I make it run at top speed normally, without having to open another application? Does VSync not work properly if the GPU isn't busy enough?

Ha :)

I have also noticed this odd phenomenon with a different app (Ad-Aware); I recently posted about it. This may or may not pertain to your situation.

http://www.gamedev.net/community/forums/topic.asp?topic_id=433911

Some applications change the frequency of ticks returned from various timing methods to get higher accuracy. That might be the case here. If it is, framerate most likely isn't dropping, but rather the timing methods are returning invalid results, causing the FPS calcs to be off.
That would be worth looking into as well.
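A quick way to test that theory, assuming the FPS counter is based on timeGetTime: request a higher timer resolution yourself at startup (which is presumably what iTunes/WMP do as a side effect) and see whether the reported framerate changes without anything else running. A rough, untested sketch:

```cpp
// Rough sketch: bump the timer resolution for the lifetime of the game and
// see whether a timeGetTime-based FPS counter still reports ~53 FPS.
// Requires winmm.lib.
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

void GameMain()              // hypothetical wrapper around your existing loop
{
    timeBeginPeriod(1);      // ask for 1 ms resolution from timeGetTime/Sleep

    // ... existing init / render / present loop goes here ...

    timeEndPeriod(1);        // must be paired with timeBeginPeriod
}
```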

Quote:
Original post by sirob
Some applications change the frequency of ticks returned from various timing methods to get higher accuracy. That might be the case here. If it is, framerate most likely isn't dropping, but rather the timing methods are returning invalid results, causing the FPS calcs to be off.
That would be worth looking into as well.
That is pretty much the exact answer I got from MS when I reported this exact same characteristic - but with Windows Media Player instead of iTunes...

For reference:
Quote:
MSDN Documentation:
Windows NT/2000: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and QueryPerformanceFrequency functions to measure short time intervals at a high resolution.
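In practice that means basing the frame timer on QueryPerformanceCounter rather than timeGetTime, so another application changing the timer period can't skew the numbers. A rough sketch (not from the docs, just my reading of them):

```cpp
// Rough sketch: high-resolution frame timing with QueryPerformanceCounter,
// which is not affected by another app calling timeBeginPeriod.
#include <windows.h>

LARGE_INTEGER g_freq, g_prev;

void InitTimer()
{
    QueryPerformanceFrequency(&g_freq);   // counts per second
    QueryPerformanceCounter(&g_prev);
}

double FrameSeconds()                      // call once per frame
{
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    double dt = (double)(now.QuadPart - g_prev.QuadPart) / (double)g_freq.QuadPart;
    g_prev = now;
    return dt;                             // e.g. 1.0 / dt gives the FPS estimate
}
```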


hth
Jack

I've noticed this too. Does anyone have a solution? I don't want to open WMP every time I want to run my applications. Where should D3DPRESENT_INTERVAL_ONE go? I can't find any reference to it in my code.
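For reference, and assuming a typical D3D9 setup: D3DPRESENT_INTERVAL_ONE goes in the D3DPRESENT_PARAMETERS structure passed to CreateDevice. A minimal sketch (the IDirect3D9 pointer and window handle are assumed to already exist):

```cpp
// Minimal sketch: request vsync through the presentation parameters
// when creating the D3D9 device.
#include <d3d9.h>

IDirect3DDevice9* CreateVSyncedDevice(IDirect3D9* d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;
    pp.hDeviceWindow        = hWnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // wait for vblank on each Present

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;   // NULL if creation failed
}
```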

I'm still having this problem. D3DPRESENT_INTERVAL_ONE didn't work for me; I'm not sure I'm using it right. Can someone explain the following paragraph to me:

A second solution is to use DirectX 9's Asynchronous Query functionality (analogous to using fences in OpenGL). At the end of your frame, insert a D3DQUERYTYPE_EVENT query into your rendering stream. You can then poll whether the GPU has reached this event yet by using GetData. As in 1) you can thus ensure (i.e., busy wait w/ the CPU) that the CPU never gets more than 2 frames ahead of the GPU, while the GPU is never idled. Similarly it is conceivable to insert multiple queries per frame to get finer control over lag.

I'm not sure what it's asking me to do. Also, does anybody know what is causing the CPU to out-perform the GPU? Is it accepting keyboard input?

Thanks
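
For what it's worth, my reading of that quoted paragraph is roughly the following (an untested sketch, so treat it as an interpretation rather than working code):

```cpp
// Rough sketch of the quoted technique: issue a D3DQUERYTYPE_EVENT after
// presenting each frame, and before starting a new frame make sure the GPU
// has finished the frame from two frames ago, so the CPU never runs more
// than two frames ahead of the GPU.
#include <d3d9.h>
#include <deque>

std::deque<IDirect3DQuery9*> g_pending;   // oldest query at the front
const size_t kMaxFramesAhead = 2;         // hypothetical limit from the quote

void EndOfFrame(IDirect3DDevice9* device)
{
    IDirect3DQuery9* query = NULL;
    if (SUCCEEDED(device->CreateQuery(D3DQUERYTYPE_EVENT, &query)))
    {
        query->Issue(D3DISSUE_END);       // marks this point in the command stream
        g_pending.push_back(query);
    }

    // If the CPU has queued too many frames, busy-wait until the oldest
    // event has been reached by the GPU.
    while (g_pending.size() > kMaxFramesAhead)
    {
        IDirect3DQuery9* oldest = g_pending.front();
        while (oldest->GetData(NULL, 0, D3DGETDATA_FLUSH) == S_FALSE)
        {
            // spin: the GPU hasn't reached that point yet
        }
        oldest->Release();
        g_pending.pop_front();
    }
}
```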
