
What about the RDTSC timer? Is that a viable option?

BlueGrass
I've always used GetTickCount(), despite it being limited to 1 ms resolution (if I remember correctly)... I'm surprised it hasn't been mentioned here... I suppose someone's found a problem with it, just like with every other timing function, it seems.
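For what it's worth, a minimal sketch of that approach might look like this (the Sleep call just stands in for a frame's worth of real work):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <iostream>
    using namespace std;

    // Minimal sketch of per-frame timing with GetTickCount().
    int main()
    {
        DWORD lastTick = GetTickCount();
        for (int frame = 0; frame < 100; ++frame)
        {
            Sleep(16);                              // pretend to render a frame
            DWORD thisTick  = GetTickCount();
            DWORD elapsedMs = thisTick - lastTick;  // milliseconds since last frame
            lastTick = thisTick;
            cout << "frame " << frame << " took " << elapsedMs << " ms" << endl;
        }
        return 0;
    }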
I've had some problems with that QueryPerformanceCounter - how exactly should you count FPS with it?

/MindWipe
"To some its a six-pack, to me it's a support group."
You know the frequency of the timer (using QueryPerformanceFrequency). This is the number of ticks per second.

So calculate the number of seconds elapsed and put it in a float: (new tick count - old tick count) / frequency. Then divide one by this, and that's the number of frames per second you're getting.

Easy. But be careful not to divide by zero.
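Something along these lines, for example (just a sketch; error checking on QueryPerformanceFrequency is omitted, and the Sleep call stands in for rendering one frame):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <iostream>
    using namespace std;

    // Sketch of measuring FPS with the high-resolution counter.
    int main()
    {
        LARGE_INTEGER frequency, oldCount, newCount;
        QueryPerformanceFrequency(&frequency);   // ticks per second
        QueryPerformanceCounter(&oldCount);

        Sleep(16);                               // "render" one frame

        QueryPerformanceCounter(&newCount);
        float elapsedSeconds = (float)(newCount.QuadPart - oldCount.QuadPart)
                             / (float)frequency.QuadPart;
        if (elapsedSeconds > 0.0f)               // don't divide by zero
            cout << "FPS: " << 1.0f / elapsedSeconds << endl;
        return 0;
    }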

Helpful links:
How To Ask Questions The Smart Way | Google can help with your question | Search MSDN for help with standard C or Windows functions
quote:
How many times do I have to correct people on this? There is no problem if you use the difference between two calls to timeGetTime! It even tells you that in the help files.
That's two people today with that misconception (and the third so far) that I've corrected, and I'm sure I'll see plenty more in future.


Wrong. You should not use the difference from a direct call to timeGetTime. Why? What happens 1 ms before and after wrap-around? You'll get a difference of around 4,000,000,000 that you can't easily bound. Instead, mod the value from timeGetTime and use that to calculate elapsed time:

Time = timeGetTime() % 65536;

Also, you can bound timeGetTime to the length of your animations to make updating easy.

Oh, and there seems to be a misunderstanding of what resolution means. When MS states that timeGetTime has a resolution of 1 ms, it only means that it can report readings as little as 1 ms apart, never less. It doesn't mean that you can read the time every single ms. In fact, per MS, typical testing shows that the difference between readings is usually 10 ms (it's in the MSDN - do the research):

quote:
SUMMARY
When timing code to identify performance bottlenecks, you want to use the highest resolution timer the system has to offer. This article describes how to use the QueryPerformanceCounter function to time application code.

MORE INFORMATION
Several timers of differing accuracy are offered by the operating system:


Function            Units          Resolution
--------------------------------------------------
Now, Time, Timer    seconds        1 second
GetTickCount        milliseconds   approx. 10 ms
timeGetTime         milliseconds   approx. 10 ms


quote:
A resolution of 1 millisecond would seem more than adequate for any real-time game. However, the problems of latency associated with timeSetEvent apply to timeGetTime as well. With Time Waster running in the background, I have recorded delays of up to 100 milliseconds before timeGetTime reports a 1-millisecond "tick." Then, on a subsequent call, the missing time is made up. The result is a stutter that can affect timed events. For instance, if your game is updating the world 30 times per second, a single delay of even 40 milliseconds can cause the update routine to skip a beat and then take two quick beats to catch up. If the stutter happens to correspond to a screen update, the animation will not be smooth.


quote:
There are two types of timer events: single and periodic. A single timer event occurs once, after a specified number of milliseconds. A periodic timer event occurs every time a specified number of milliseconds elapses. The interval between periodic events is called an event delay. Periodic timer events with an event delay of 10 milliseconds or less consume a significant portion of CPU resources.
Note The multimedia timer runs in its own thread.
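To illustrate the "skip a beat, then two quick beats to catch up" behaviour the article describes, here is a rough sketch of a fixed 30-updates-per-second loop (UpdateWorld and RunGameLoop are just placeholders):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime(); link with winmm.lib

    // Rough sketch of a fixed 30-updates-per-second loop. If timeGetTime()
    // stalls for 40 ms and then catches up, the inner while loop calls
    // UpdateWorld() twice in a row - the "two quick beats" described above.
    void UpdateWorld() { /* placeholder for the real world update */ }

    void RunGameLoop()
    {
        const DWORD updateMs = 1000 / 30;   // ~33 ms per world update
        DWORD nextUpdate = timeGetTime();
        for (;;)
        {
            // Comparing the signed difference keeps this correct across wrap-around.
            while ((LONG)(timeGetTime() - nextUpdate) >= 0)
            {
                UpdateWorld();
                nextUpdate += updateMs;
            }
            // render a frame here
        }
    }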




[edited by - Jim Adams on October 3, 2002 2:06:19 PM]
quote:Original post by Jim Adams
Wrong. You should not use the difference from a direct call to timeGetTime. Why? What happens 1 ms before and after wrap-around? You'll get a difference of around 4,000,000,000 that you can't easily bound.

No you won't.


    #include <iostream>
    #include <cstdlib>
    using namespace std;

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>

    int main()
    {
        DWORD oldTime = 0xFFFFFFFF;
        DWORD newTime = 2; // or "oldTime + 3" if you prefer

        cout << newTime - oldTime << endl;

        system("PAUSE");
        return 0;
    }

Tell me the output. Thank you for calling it an asinine statement, however. Much appreciated [since edited out, I believe].

[rest of the stuff conceded...]

[edited by - Alimonster on October 3, 2002 2:14:56 PM]
Oops - you are right on the output - I forgot about the whole bit-thang. But the rest of the message still stands (about latency and resolution).

You still don't need to calculate the difference if you do things correctly. To expand on my example, let's say you have an animation that's 1000 milliseconds long. To properly calculate the timing of the animation:

DWORD AnimTime = timeGetTime() % 1000;

The above would never fail, and without having to calculate the difference between calls. Besides, if a call to timeGetTime ever did return a bad value, then the difference between calls would contain that bad value too. That's what I was calling asinine.
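As a rough usage sketch (assuming, purely for illustration, a looping 10-frame animation spread evenly over those 1000 milliseconds):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime(); link with winmm.lib

    // Rough usage sketch: pick the current frame of a looping 10-frame,
    // 1000-millisecond animation straight from the modded time.
    int GetLoopingFrame()
    {
        const DWORD animLength = 1000;   // animation length in ms
        const DWORD numFrames  = 10;
        DWORD AnimTime = timeGetTime() % animLength;
        return (int)(AnimTime * numFrames / animLength);   // 0..9
    }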


[edited by - Jim Adams on October 3, 2002 3:58:18 PM]
quote:DWORD AnimTime = timeGetTime() % 1000;

Would that not cause some problems? If timeGetTime gives a value of $FFFFFFFF then $FFFFFFFF % 1000 gives 295, but the next (0 % 1000) would be 0, not 296. It seems that animations could be started at different values rather than 0 depending on what the modulo 1000 returned. Would this cause some strange results such as animations starting half-way through their allocated time (though I don't know how you use AnimTime in calculations)? Would you have to offset the initial value returned?

I'm asking out of genuine curiosity btw because I'm still fiddling with timing for my game.

My current method involves having a "total time" and "time passed" for an object. Each frame, I get the time difference in milliseconds and add that to the object's "time passed". I then calculate its position in between x1,y1 and x2,y2 based on how much time has passed out of its total time. How would I modify this to work with the above AnimTime % 1000? [This is only for small animations on a help screen btw, not for every game object.]
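In code, that currently looks roughly like this sketch (the struct and names are made up purely for illustration):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>   // for DWORD

    // Sketch of the "time passed out of total time" method described above.
    struct MovingObject
    {
        float x1, y1;       // start position
        float x2, y2;       // end position
        DWORD totalTime;    // total time for the move, in ms
        DWORD timePassed;   // time accumulated so far, in ms
    };

    void UpdateObject(MovingObject &obj, DWORD elapsedMs, float &outX, float &outY)
    {
        obj.timePassed += elapsedMs;
        if (obj.timePassed > obj.totalTime)
            obj.timePassed = obj.totalTime;   // clamp once the move has finished

        float t = (float)obj.timePassed / (float)obj.totalTime;
        outX = obj.x1 + (obj.x2 - obj.x1) * t;
        outY = obj.y1 + (obj.y2 - obj.y1) * t;
    }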
You are right on the wrap-around with the modded time - I forgot to also add in the animation's start time to account for non-looping animations:

static DWORD StartTime = timeGetTime();
DWORD ThisTime = (timeGetTime() - StartTime) % 1000;

That should always give you the proper time of the animation.
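Wrapped up as a frame lookup, that might look something like this (the 1000 ms length and 10 frames are just example figures):

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime(); link with winmm.lib

    // Sketch of the corrected version in use: a 1000 ms, 10-frame animation
    // whose timing is anchored to the moment it started.
    int GetAnimFrame()
    {
        static DWORD StartTime = timeGetTime();                // set on first call
        DWORD ThisTime = (timeGetTime() - StartTime) % 1000;   // 0..999 ms into the loop
        return (int)(ThisTime * 10 / 1000);                    // frame 0..9
    }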

This topic is closed to new replies.
