[help] GetTickCount() problem

4 comments, last by novousmihi 14 years, 9 months ago
I've never had this problem before and it makes no sense. When I call GetTickCount() it returns 0. Always.

// General includes...
#include <windows.h>   // Sleep(), GetTickCount()
#include <stdio.h>     // printf()

int main(void)
{
    Sleep(5000);
    // GetTickCount() returns a DWORD, so print it as an unsigned long.
    printf("NrOfMilliseconds: %lu\n", (unsigned long)GetTickCount());
    return 0;
}


It should print the number of milliseconds since the computer was started, but it prints 0. Any suggestions as to what could be wrong? I'm trying to code a StopWatch class, and it isn't working when GetTickCount() returns 0 all by itself :/ And no, it's not that I'm doing anything like:

int diffTick = GetTickCount() - startTick;
// diffTick == 0 because the computer is too fast and a high-resolution
// timer is needed, a problem I've seen many people have.


My problem is that a call to GetTickCount() all by itself returns 0. Always. Any suggestions?
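(For reference, the high-resolution timer mentioned in the comment above would be QueryPerformanceCounter() on Windows; this is just a rough sketch of the idea, not the code actually in question:)

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;

    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);

    Sleep(5000);                        // stand-in for the work being timed

    QueryPerformanceCounter(&end);

    // Convert the tick difference to milliseconds using the frequency.
    double elapsedMs = (double)(end.QuadPart - start.QuadPart)
                       * 1000.0 / (double)freq.QuadPart;
    printf("Elapsed: %.3f ms\n", elapsedMs);
    return 0;
}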
Are you including windows.h in your includes? That's where GetTickCount() is declared.

If so, try timeGetTime(); I think it should do what you want.
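Something like this minimal sketch (assuming MSVC; timeGetTime() lives in the multimedia library, so you have to link winmm.lib):

#include <windows.h>
#include <mmsystem.h>   // timeGetTime(), timeBeginPeriod()
#include <stdio.h>
#pragma comment(lib, "winmm.lib")   // MSVC-style way to link winmm

int main(void)
{
    timeBeginPeriod(1);   // request ~1 ms timer resolution
    Sleep(5000);
    printf("NrOfMilliseconds: %lu\n", (unsigned long)timeGetTime());
    timeEndPeriod(1);     // undo the resolution request
    return 0;
}

The timeBeginPeriod(1) call matters on some systems; without it timeGetTime() can be as coarse as GetTickCount().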

[edit]

On a side note: has your machine been restarted in the last 50 days?

If not, the following applies:
Quote:
The elapsed time is stored as a DWORD value. Therefore, the time will wrap around to zero if the system is run continuously for 49.7 days. To avoid this problem, use GetTickCount64. Otherwise, check for an overflow condition when comparing times.
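On Vista or later, GetTickCount64() makes the wrap a non-issue in practice; a minimal sketch (note that %llu needs a C99-aware runtime, older MSVC used %I64u):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    // ULONGLONG tick count: would take roughly 585 million years to wrap.
    ULONGLONG ms = GetTickCount64();
    printf("NrOfMilliseconds: %llu\n", ms);
    return 0;
}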
But doesn't that mean that when it hits the maximum, it just resets to zero and continues like nothing happened?

like, if 10 is the maximum it would wrap around like this:

6,7,8,9,10,0,1,2,3,4,
5,6,7,8,9,10,0,1,2,3,
4,5,...

and not GetTickCount()==0 when it hits its maximum?

I put my computer into hibernation every day, but I seldom do a restart.
So I'll try that, but any other suggestions as to what might be wrong?
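Thinking about it some more, unsigned DWORD subtraction should give the right difference even across the wrap; a quick sketch along those lines (the tick values are made up for illustration):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    // Pretend startTick was taken just before the DWORD wrapped
    // and nowTick just after it.
    DWORD startTick = 0xFFFFFFF0;   // 16 ticks before the wrap
    DWORD nowTick   = 0x00000010;   // 16 ticks after the wrap

    // Unsigned subtraction is modulo 2^32, so the difference
    // is still the true elapsed count: 32 ticks.
    DWORD elapsed = nowTick - startTick;
    printf("Elapsed ticks: %lu\n", (unsigned long)elapsed);   // prints 32
    return 0;
}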
Don't use GetTickCount(). It has low resolution, and will not give you accurate enough results. Use timeGetTime() instead.

I was using GetTickCount() in my game to measure the delta time between frames, and as a result the game used to stutter and jerk. I thought I was getting low frame rates, and I spent ages trying to work out what was wrong with my frame-rate timing code and optimising it. When I switched to timeGetTime(), the game ran as smooth as I could ever wish for.
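The per-frame measurement I mean looks roughly like this sketch (Sleep() stands in for rendering a frame; winmm.lib must be linked as above):

#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    timeBeginPeriod(1);                      // ~1 ms resolution
    DWORD lastTick = timeGetTime();

    for (int frame = 0; frame < 5; ++frame)
    {
        Sleep(16);                           // stand-in for one frame of work
        DWORD nowTick = timeGetTime();
        DWORD deltaMs = nowTick - lastTick;  // wrap-safe unsigned difference
        lastTick = nowTick;
        printf("Frame %d: %lu ms\n", frame, (unsigned long)deltaMs);
    }

    timeEndPeriod(1);
    return 0;
}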
Quote: Original post by shaolinspin
Don't use GetTickCount(). It has low resolution, and will not give you accurate enough results. Use timeGetTime() instead.

I was using GetTickCount() in my game to measure the delta time between frames, and as a result the game used to stutter and jerk. I thought I was getting low frame rates, and I spent ages trying to work out what was wrong with my frame-rate timing code and optimising it. When I switched to timeGetTime(), the game ran as smooth as I could ever wish for.


Alright, you've all convinced me to switch to timeGetTime() ;D

I'll let you know if I have the same problem there. If I do, I'll restart my computer and try again, but if that doesn't work and it too returns 0, then I'm out of ideas.
Yay! It works!

Thanks a lot, Kalten and Shaolinspin!

This topic is closed to new replies.
