Pentium M CPU and high-performance timer

Started by
28 comments, last by Krysole 18 years, 9 months ago
Microsoft uses QPC in all of its DirectX samples, and I imagine those samples are run on a variety of systems.

What's so bad about QPC? Anyone able to phrase it in "timer layman's terms"?
Quote:Original post by zedzeek
Quote:Original post by Daggett
I found http://www.mvps.org/directx/articles/selecting_timer_functions.htm helpful.

Personally, I'd use the performance counter by default, but put in a fallback to timeGetTime if the user wants to use that instead.

Aye? You want to stick an option in the menu (choose timing function)? :)

QueryPerformanceCounter should NOT be used in an app that you intend for other people to use. Perhaps someone should email the guy who wrote that page and point that out (I find it very surprising he never mentioned the problems with QueryPerformanceCounter, even though the article is from 2002).


Why? You query the speed, so what could go wrong (unless it isn't there)?

Quote:Original post by Daniel Miller
[...]Why? You query the speed, so what could go wrong (unless it isn't there)?
See my post here
"Walk not the trodden path, for it has borne it's burden." -John, Flying Monk
I'm forced to agree with people that timeGetTime() offers enough precision for gaming purposes. I can understand wanting excruciatingly detailed timers for ballistics simulations and nuclear detonation scenarios, but you can model a perfectly fine door getting blown off its hinges and spaceships blowing up ad nauseam with a clock that's accurate to within a 20th of a second (i.e. ±50 ms), and usually much closer.

If it gives you problems later then perhaps you can swap out your timer source for one that fits a little better, but don't get hung up ahead of time on which timer you're going to use; start with what's easily available. Also, if you're designing the whole thing with forethought, it shouldn't be too difficult to swap out a timing function in the future if you suddenly realize that the old one isn't going to work.
Quote:Original post by Omaha
I'm forced to agree with people that timeGetTime() offers enough precision for gaming purposes. I can understand wanting excruciatingly detailed timers for ballistics simulations and nuclear detonation scenarios, but you can model a perfectly fine door getting blown off its hinges and spaceships blowing up ad nauseam with a clock that's accurate to within a 20th of a second (i.e. ±50 ms), and usually much closer.

If it gives you problems later then perhaps you can swap out your timer source for one that fits a little better, but don't get hung up ahead of time on which timer you're going to use; start with what's easily available. Also, if you're designing the whole thing with forethought, it shouldn't be too difficult to swap out a timing function in the future if you suddenly realize that the old one isn't going to work.


I don't see how anything can be done accurately with a timer that updates every 20th of a second... if your game renders at 60 frames per second, you want an "update" at least every render frame, or more often. With a timer that coarse, most frames would see a delta T of 0 seconds, so none of your objects would move, and then every three frames or so you would get a 50 ms tick, the timer would say 50 ms had passed, and everything would jump at once...

Unless you suggest some system where you poll the timer five times a second and then divide by the average number of frames rendered in that interval, to get a "pseudo" higher precision so objects can move a little every frame.
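
To make that concrete, here is a small self-contained toy (not from the thread) that simulates sampling a clock with 50 ms granularity at 60 fps; the loop and numbers are purely illustrative.

// Toy illustration: a clock that only advances in 50 ms steps, sampled at 60 fps.
// Most frames see a delta of zero, then one frame sees the whole 50 ms at once.
#include <cstdio>

int main()
{
    const double frameMs    = 1000.0 / 60.0;   // ~16.7 ms per rendered frame
    double       trueTimeMs = 0.0;
    long         lastTickMs = 0;

    for (int frame = 0; frame < 12; ++frame)
    {
        trueTimeMs += frameMs;
        long tickMs  = (long)(trueTimeMs / 50.0) * 50;  // what a 50 ms timer reports
        long deltaMs = tickMs - lastTickMs;
        lastTickMs = tickMs;
        std::printf("frame %2d: delta = %ld ms\n", frame, deltaMs);
        // With delta == 0 nothing moves; roughly every third frame everything jumps 50 ms.
    }
    return 0;
}
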
Quote:Original post by LEET_developer
[...]I don't see how anything can be done accurately with a timer that updates every 20th of a second...[...]
I agree that such a timer would be difficult to use. It is a great thing, then, that we have access to timeGetTime which, with the help of timeBeginPeriod, has 1ms accuracy.
"Walk not the trodden path, for it has borne it's burden." -John, Flying Monk
Quote:Original post by Extrarius
Quote:Original post by LEET_developer
[...]I don't see how anything can be done accurately with a timer that updates every 20th of a second...[...]
I agree that such a timer would be difficult to use. It is a great thing, then, that we have access to timeGetTime which, with the help of timeBeginPeriod, has 1ms accuracy.


I haven't seen much of this timeBeginPeriod() function... what exactly does it return? That function might be exactly what I'm looking for, and if it will give me 1 ms accuracy I will be happy.
Quote:Original post by zedzeek
Quote:Original post by Daggett
I found http://www.mvps.org/directx/articles/selecting_timer_functions.htm helpful.

Personally, I'd use the performance counter by default, but put in a fallback to timeGetTime if the user wants to use that instead.

Aye? You want to stick an option in the menu (choose timing function)? :)

Sure, why not? It doesn't have to be in the menu, just stick this in some cfg file:
UseGetTime=false
The user can change that if their CPU has a variable clock rate, which seems like a simple solution. :)
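
For what it's worth, a minimal sketch of that switch (my own rough code, not from the thread) could look like this; the bool is assumed to come from reading the cfg file, which isn't shown, and error handling is left out.

// Rough sketch: default to QueryPerformanceCounter, fall back to timeGetTime()
// when the UseGetTime flag from the cfg file is set. Needs winmm.lib for
// timeGetTime(); loading the config itself is assumed to happen elsewhere.
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

double GetTimeSeconds(bool useGetTime)
{
    if (useGetTime)
        return timeGetTime() / 1000.0;        // millisecond-resolution timer

    static LARGE_INTEGER freq = { 0 };
    if (freq.QuadPart == 0)
        QueryPerformanceFrequency(&freq);     // counts per second, queried once

    LARGE_INTEGER counter;
    QueryPerformanceCounter(&counter);
    return (double)counter.QuadPart / (double)freq.QuadPart;
}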

Quote:Original post by LEET_developer
I haven't seen much of this timeBeginPeriod() function... what exactly does it return? That function might be exactly what I'm looking for, and if it will give me 1 ms accuracy I will be happy.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/multimed/htm/_win32_timebeginperiod.asp
It says to call timeBeginPeriod and timeEndPeriod immediately before and after you use the timer... but then it says that timeBeginPeriod applies per application/driver instance. So would there be no adverse effects from calling timeBeginPeriod at the start of your program and timeEndPeriod at the end?
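
For reference, the pattern being asked about would look roughly like the sketch below (my own code, not from the MSDN page): raise the resolution once at startup and restore it at shutdown. Whether that's a good idea is addressed in the next reply.

// Sketch: request 1 ms resolution for the lifetime of the program so that
// timeGetTime() updates roughly every millisecond. Needs winmm.lib.
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

int main()
{
    timeBeginPeriod(1);                      // returns TIMERR_NOERROR on success

    DWORD last = timeGetTime();
    for (int frame = 0; frame < 1000; ++frame)
    {
        DWORD now = timeGetTime();
        float dt = (now - last) / 1000.0f;   // delta time in seconds, ~1 ms steps
        last = now;
        // update(dt); render();             // hypothetical game loop calls
    }

    timeEndPeriod(1);                        // every timeBeginPeriod needs a matching timeEndPeriod
    return 0;
}
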
Using timeBeginPeriod(1) will force the system's interrupt timer to fire every 1 ms. That has been reported to cause noticeable slowdown (an interrupt every millisecond isn't that bad by itself, but the general consensus of most articles/threads I've read is that it isn't worth it). Just leave it at 10 ms, or do what most people do and use QPC.

QPC is being actively updated, so it's the API most likely to keep working reliably for game timing in ten years, versus the somewhat buggy platform APIs and the TSC. It seems to work on almost all machines, and those it doesn't work on probably don't play many games. :P It's a safe bet as far as I know, even if it may not be the historically "correct" choice (who cares about semantics; it works, and it's the only API that queries the high-resolution clock on your PC anyway. Newer PCs are starting to ship with even better timers, QPC will take advantage of them, and they're apparently more stable and more suitable for gaming). There is an article here on timers with plenty of links; that and some Googling is where I got this information (I read the references at the bottom of that article too, lots of stuff, good fun).
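
Along the same lines, checking up front whether the high-resolution counter exists at all is cheap; a small sketch (my own code), leaving the fallback decision to the caller:

// Sketch: QueryPerformanceFrequency returns FALSE (and a zero frequency) on
// hardware with no high-resolution counter, so a game can decide at startup
// whether to use QPC or drop back to timeGetTime().
#include <windows.h>

bool HighResTimerAvailable(LARGE_INTEGER* countsPerSecond)
{
    if (!QueryPerformanceFrequency(countsPerSecond))
        return false;                        // no performance counter on this system
    return countsPerSecond->QuadPart != 0;
}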

