GetTickCount() overhead

Veslefrikk:

I'm curious about how much overhead GetTickCount() has. I'm using it quite a lot in my code now, and I'm wondering whether I should store the result in a global and update it as I see fit?

Stefan

What is "quite a lot", and are you positive that you're calling it as rarely as possible?
GetTickCount is cheaper to call (and much less accurate) than timeGetTime, which in turn is cheaper than QueryPerformanceCounter; those are the three most commonly used timers.
(RDTSC is the cheapest of all - a single instruction - but it cannot be trusted on dual-core CPUs unless you set the process affinity mask to a single CPU, and not at all on mobile CPUs, whose clock frequency changes on the fly.)
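A minimal sketch of that affinity workaround (SetProcessAffinityMask is the Win32 call; the function name pinToFirstCpu is just illustrative):

#include <windows.h>

// Pin the process to a single logical CPU so every RDTSC reading comes
// from the same core's time-stamp counter.
void pinToFirstCpu()
{
    SetProcessAffinityMask(GetCurrentProcess(), 1); // CPU 0 only
}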

Share this post


Link to post
Share on other sites
You really should have a central component that handles time for your application. Other components (such as physics) should simply accept a frame time parameter and not directly access the system clock.
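As an illustration, the interface might look something like this (a sketch only - the subsystem names are made up, not from this thread):

// Subsystems never touch the system clock; they are handed the frame
// time (in seconds) by whoever owns the central time component.
struct Physics   { void update(double dt) { /* integrate by dt */ } };
struct Animation { void update(double dt) { /* advance by dt   */ } };

void runFrame(Physics& physics, Animation& animation, double frameTime)
{
    physics.update(frameTime);
    animation.update(frameTime);
}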

Yes, as coderx75 says, store the current tick/time somewhere and use that; ideally you want things to happen deterministically.

Quote:
GetTickCount() is just a function call and a memory read, I assume.

Pretty much - though the function call is indirect (through a DLL import) and a divide (implemented as a multiply and shift) plus synchronization are involved. Still, at about 10 clocks it's as good as a global variable, and even faster than RDTSC.

Quote:
You really should have a central component that handles time for your application. Other components (such as physics) should simply accept a frame time parameter and not directly access the system clock.

Good advice. It's convenient to pass time around as a double (seconds since startup), and having your own component insulates the rest of your code from changes (e.g. if you later need a higher-resolution timer).
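A sketch of what such a component might look like, assuming the timeGetTime backend mentioned above (the GameClock name is illustrative):

#include <windows.h>
#pragma comment(lib, "winmm.lib") // timeGetTime lives in winmm

// Central time source: converts raw milliseconds into seconds since
// startup. Swapping in QueryPerformanceCounter later only changes this
// class, not the code that calls it.
class GameClock {
public:
    GameClock() : start(timeGetTime()) {}

    // Seconds elapsed since the clock was created.
    double now() const { return (timeGetTime() - start) / 1000.0; }

private:
    DWORD start;
};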

Slightly unrelated question - in a typical game loop, is it OK to call GetTickCount()/timeGetTime()/whatever only once per frame, or will that not be precise enough in certain situations?

I was also worried once about the overhead of GetTickCount(), so I implemented something like this:


class Timer {
public:
    Timer() { reset(); }

    // Start counting from zero again.
    void reset() {
        time = 0.0f;
    }

    // Accumulate the frame time (in seconds).
    void update(float timeDelta) {
        time += timeDelta;
    }

    // Returns true once the accumulated time exceeds 'period', then
    // restarts from zero (note: any overshoot past the period is dropped).
    bool elapsed(float period) {
        if (time > period) {
            reset();
            return true;
        }
        return false;
    }

private:
    float time;
};


Then each frame I would call update() with the time difference between frames in seconds (which I calculated once in the main loop using GetTickCount()). I suppose using this kind of timer might also help to implement a pause state for a game.

At the time it worked just fine, but are there any problems with this approach in a more complicated game? (I used the class in a Tetris game.)
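For reference, a sketch of how the class might be driven from the main loop (running and spawnTimer are illustrative names, not from the original code):

// GetTickCount() is read once per frame; the delta, converted to
// seconds, drives the Timer.
Timer spawnTimer;
DWORD lastTicks = GetTickCount();

while (running) {
    DWORD nowTicks = GetTickCount();
    float delta = (nowTicks - lastTicks) / 1000.0f; // milliseconds -> seconds
    lastTicks = nowTicks;

    spawnTimer.update(delta);
    if (spawnTimer.elapsed(2.0f)) {
        // fires roughly every two seconds of accumulated frame time
    }

    // ... update and render using the same delta ...
}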

In a more complicated game, you should read GetTickCount() once per frame and store that value somewhere. The rationale is that frame times can potentially be longer than the resolution of GetTickCount() - which is about 15 ms on modern systems.

That means that if your game runs at 30 fps, you have 33 ms per frame. Because of this, a GetTickCount() call at the beginning of the frame will probably return a different value than a call at the end of the frame. Depending on how you do your timing, this can cause unwanted behaviour.

Quote:

Original post by Sc4Freak
In a more complicated game, you should read GetTickCount() once per frame and store that value somewhere.


Was this directed at me or at the OP? Because that's exactly what I'm doing.

Quote:
The rationale is that frame times can potentially be longer than the resolution of GetTickCount() - which is about 15 ms on modern systems.

That means that if your game runs at 30 fps, you have 33 ms per frame. Because of this, a GetTickCount() call at the beginning of the frame will probably return a different value than a call at the end of the frame. Depending on how you do your timing, this can cause unwanted behaviour.


I don't understand this. It sounds like you say that you should call it more than once per-frame, otherwise it might not be accurate enough. Could you please explain this a bit more?

You generally want everything in a given frame to behave as if it happened instantaneously - that is, all at the same time step. A frame is basically a snapshot at a given time. If your time step is constantly changing throughout the frame, then different parts of the frame will have different update times, which can cause some very odd effects (object motion may jitter as it gets updated at different points within a frame, for instance). To prevent this, calculate a frame time at the start of the frame and use that for everything in the frame.
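A sketch of that pattern, reusing the GameClock and subsystem sketches from earlier in the thread (still illustrative, not a definitive implementation):

// One clock read per frame; every system sees the same time step, so the
// whole frame behaves like a single snapshot in time.
void gameLoop(GameClock& clock, Physics& physics, Animation& animation,
              bool& running)
{
    double previous = clock.now();

    while (running) {
        double current = clock.now();        // the only clock read this frame
        double frameTime = current - previous;
        previous = current;

        physics.update(frameTime);           // everyone gets the same step
        animation.update(frameTime);
    }
}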
