GetTickCount() overhead

12 comments, last by Gage64 16 years, 3 months ago
I'm curious how much overhead GetTickCount() has. I'm using it quite a lot in my code now, and I'm wondering if I should store the result in a global and update it as I see fit?

Stefan
What is "quite a lot", and are you positive that you are calling it as rarely as possible?
GetTickCount is lighter to call (and much less accurate) than timeGetTime, which itself is lighter than QueryPerformanceCounter - those are the three most commonly used timers.
(RDTSC is the lightest - a single instruction - but cannot be trusted on dual-core CPUs unless you set the process affinity mask to a single core, and not at all with mobile CPUs either.)
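For reference, here is a minimal Win32 sketch showing the three calls side by side (assuming MSVC; timeGetTime() needs winmm.lib and timeBeginPeriod(1) to reach millisecond resolution):

// Minimal Win32 sketch of the three common timers (assumes MSVC).
#include <windows.h>
#include <mmsystem.h>
#include <cstdio>
#pragma comment(lib, "winmm.lib")   // needed for timeGetTime/timeBeginPeriod

int main()
{
    // GetTickCount: milliseconds, but only ~10-16 ms granularity.
    DWORD ticksMs = GetTickCount();

    // timeGetTime: milliseconds, ~1 ms granularity after timeBeginPeriod(1).
    timeBeginPeriod(1);
    DWORD mmTimeMs = timeGetTime();
    timeEndPeriod(1);

    // QueryPerformanceCounter: highest resolution, convert via the frequency.
    LARGE_INTEGER freq, counter;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&counter);
    double qpcSeconds = static_cast<double>(counter.QuadPart) / freq.QuadPart;

    std::printf("GetTickCount=%lu ms, timeGetTime=%lu ms, QPC=%.6f s\n",
                ticksMs, mmTimeMs, qpcSeconds);
    return 0;
}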
I imagine that it would be quite fast. GetTickCount() is just a function call and a memory read, I assume.
NextWar: The Quest for Earth available now for Windows Phone 7.
You really should have a central component that handles time for your application. Other components (such as physics) should simply accept a frame time parameter and not directly access the system clock.
Quit screwin' around! - Brock Samson
Yes, as coderx75 says, store the current tick/time somewhere and use that; ideally you want updates to happen in a predictable, deterministic way.
Quote: GetTickCount() is just a function call and a memory read, I assume.

Pretty much - though the function call is indirect (DLL) and a "divide" (multiply/shift) plus synchronization are involved. Still, at about 10 clocks, it's as good as a global variable and faster even than RDTSC.

Quote: You really should have a central component that handles time for your application. Other components (such as physics) should simply accept a frame time parameter and not directly access the system clock.

Good advice. It's more convenient to pass around time as a double (seconds since startup), and having your own component insulates the rest of your code from changes (e.g. needing a higher-resolution timer later).
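To illustrate, a rough sketch of such a central component (the class and member names are made up), built on QueryPerformanceCounter and handing out seconds as doubles:

// Rough sketch of a central clock component (hypothetical names),
// built on QueryPerformanceCounter and exposing seconds as doubles.
#include <windows.h>

class GameClock
{
public:
    GameClock()
    {
        QueryPerformanceFrequency(&frequency);
        QueryPerformanceCounter(&startCount);
        lastCount = startCount;
    }

    // Call exactly once per frame; returns the frame delta in seconds.
    double tick()
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        double delta = double(now.QuadPart - lastCount.QuadPart) / frequency.QuadPart;
        lastCount = now;
        return delta;
    }

    // Seconds since the clock was created.
    double timeSinceStart() const
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return double(now.QuadPart - startCount.QuadPart) / frequency.QuadPart;
    }

private:
    LARGE_INTEGER frequency;
    LARGE_INTEGER startCount;
    LARGE_INTEGER lastCount;
};

Everything else then receives the delta returned by tick() as a parameter instead of reading the clock itself.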
E8 17 00 42 CE DC D2 DC E4 EA C4 40 CA DA C2 D8 CC 40 CA D0 E8 40E0 CA CA 96 5B B0 16 50 D7 D4 02 B2 02 86 E2 CD 21 58 48 79 F2 C3
Slightly unrelated question - in a typical game loop, is it OK to call GetTickCount()/timeGetTime()/whatever only once per frame, or will that not be precise enough in certain situations?

I was also worried once about the overhead of GetTickCount(), so I implemented something like this:

class Timer {
public:
    Timer() { reset(); }

    void reset() {
        time = 0.0f;
    }

    // Accumulate the frame delta (in seconds).
    void update(float timeDelta) {
        time += timeDelta;
    }

    // Returns true (and resets) once the given period has elapsed.
    bool elapsed(float period) {
        if (time > period) {
            reset();
            return true;
        }
        return false;
    }

private:
    float time;
};


Then each frame I would call update() with the time difference between frames in seconds (which I calculated once in the main loop using GetTickCount()). I suppose using this kind of timer might also help to implement a pause state for a game.
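For example, the per-frame usage I had in mind looks roughly like this (a sketch; running, fireShot(), updateGame() and render() are placeholders):

// Sketch of the main loop described above (the game functions are placeholders).
DWORD lastTicks = GetTickCount();
Timer fireTimer;                 // e.g. fire a shot every half second

while (running)
{
    DWORD nowTicks = GetTickCount();
    float dt = (nowTicks - lastTicks) / 1000.0f;   // milliseconds -> seconds
    lastTicks = nowTicks;

    fireTimer.update(dt);
    if (fireTimer.elapsed(0.5f))
        fireShot();

    updateGame(dt);              // every subsystem sees the same dt
    render();
}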

At the time it worked just fine, but I'm wondering whether there are any problems with this approach in a more complicated game (I used the class in a Tetris game).
In a more complicated game, you should get the value of GetTickCount() once per frame and store that somewhere. The rationale is that frame times can potentially be longer than the resolution of GetTickCount() - which is about 15ms on modern systems.

That means that if your game is running at 30 FPS, you have about 33ms per frame. Because of this, GetTickCount() at the beginning of your frame will probably return a different value than if you called it at the end of the frame. Depending on how you do your timing, this can cause unwanted behaviours.
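You can see that granularity with a quick test like this (a sketch; it just busy-waits and prints how far apart successive tick values land, typically ~15-16ms):

// Quick check of GetTickCount() granularity: print how much the value
// jumps each time it changes (typically ~15-16 ms on modern systems).
#include <windows.h>
#include <cstdio>

int main()
{
    DWORD last = GetTickCount();
    for (int changes = 0; changes < 10; )
    {
        DWORD now = GetTickCount();
        if (now != last)
        {
            std::printf("tick advanced by %lu ms\n", now - last);
            last = now;
            ++changes;
        }
    }
    return 0;
}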
NextWar: The Quest for Earth available now for Windows Phone 7.
Quote:
Original post by Sc4Freak
In a more complicated game, you should get the value of GetTickCount() once per-frame, and store that somewhere.


Was this directed at me or at the OP? Because that's exactly what I'm doing.

Quote: The rationale is that frame times can potentially be longer than the resolution of GetTickCount() - which is about 15ms on modern systems.

That means that if your game is running at 30 FPS, you have about 33ms per frame. Because of this, GetTickCount() at the beginning of your frame will probably return a different value than if you called it at the end of the frame. Depending on how you do your timing, this can cause unwanted behaviours.


I don't understand this. It sounds like you're saying that you should call it more than once per frame, otherwise it might not be accurate enough. Could you please explain this a bit more?
You generally want everything in a given frame to behave as if it happened instantaneously - so all at the same time step. Consider that a frame is basically a snapshot at a given time. If your time step is constantly changing throughout the frame then different parts of the frame will have different update times. This could cause some very odd effects (object motion may jitter as it gets updated at different times within a frame for instance). To prevent this you should calculate a frame time at the start of the frame and use that for everything in the frame.
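In code, that boils down to something like this (a sketch; GameClock is the hypothetical wrapper sketched earlier in the thread, and the update functions are placeholders):

// Sample the clock exactly once per frame and hand the same delta to
// every subsystem, instead of letting each one read the clock itself.
void runFrame(GameClock& clock)
{
    double dt = clock.tick();   // one snapshot for the whole frame

    updatePhysics(dt);          // placeholders for your subsystems
    updateAI(dt);
    updateAnimation(dt);
    renderScene();
}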

