Millisecond Timers


Hello. I'm working on a project and doing frame-rate independence, which works perfectly fine except for one problem. I'm currently using a standard time.h timer that only ticks at about 18.2 times per second, which is fine when my program is running at 18.2 fps or less. However, programs that run at 18.2 fps or less are undesirable. So what I need is for you to tell me how, or point me to a source where I can get, a millisecond timer. Basically all I need is a function that is called every frame and updates a global float with the number of seconds the program has been running. And BTW, if it matters, I'm using Dev-C++ 4. Thanks for your help.

All modern chips support it. I've heard that the 486 does not. In any case, if your game is only running at 20 fps on a modern machine, it won't run on a 486 to begin with. And you can always add GetTickCount support as a fallback - QueryPerformanceCounter returns 0 if the hardware timer is unavailable.

Just for info, here are the performances of various timing mechanisms, profiled on a 733MHz machine:

Timer Profile (RDTSC):
Frequency: 733373724Hz
Resolution: 1.36356e-09s
Error: 0.000227604s
Speed: 1.38006e-07s

Timer Profile (PerformanceCounter):
Frequency: 3579545Hz
Resolution: 2.79365e-07s
Error: 0.000274057s
Speed: 1.44208e-06s

Timer Profile (TimeGetTime):
Frequency: 1000Hz
Resolution: 0.001s
Error: 0.01s
Speed: 1.82705e-07s

Timer Profile (GetTickCount):
Frequency: 1000Hz
Resolution: 0.001s
Error: 0.01s
Speed: 5.39175e-08s

Timer Profile (Clock):
Frequency: 1000Hz
Resolution: 0.001s
Error: 0s
Speed: 3.71584e-06s

Frequency and resolution are pretty obvious. Error is the average error when timing for a second and will probably be a bit off (it's difficult to calculate error when you don't have a fully accurate timer to test against). Speed is the time taken to execute the timer mechanism.

Enigma

Guest Anonymous Poster
Beware: NT (and that includes XP) often does not have 1 millisecond resolution with timeGetTime!
1 ms resolution is only on Win9x/ME.
At startup you should call timeGetDevCaps and then timeBeginPeriod with the minimum resolution it returns.

I would use QueryPerformanceFrequency and QueryPerformanceCounter, and only if there is no performance counter would I fall back to timeGetTime, as explained.

Don't use timeGetTime(). Ever. It returns the time since the system was started up - if the system has been up for longer than 3 hours (IIRC) then it overflows the DWORD it returns, and disaster ensues.

quote:
Original post by Sark
Don't use timeGetTime(). Ever. It returns the time since the system was started up - if the system has been up for longer than 3 hours (IIRC) then it overflows the DWORD it returns, and disaster ensues.


Nope. It doesn't matter if you take the difference between two values. I wrote this simple little example in Delphi:

procedure Test;
var
  NewTime: DWORD;
  OldTime: DWORD;
  Difference: DWORD;
begin
  NewTime := 2;
  OldTime := $FFFFFFFF; // $ = 0x, means hex

  Difference := NewTime - OldTime; // no problem here: wraps modulo 2^32

  // format is like printf in C
  ShowMessage(Format('%u', [Difference]));
end;

Try to guess the output. Yep, it was 3. It behaved properly.

[edited by - Alimonster on September 11, 2002 7:10:34 PM]

I think it's supposed to be every 49 days (4294967295 milliseconds) that the DWORD overflows, not after 3 hours. Put it this way: I've been using timeGetTime and it's never overflowed on me (and I'm sure I've managed more than 3 hours uptime at some point).

Let me stop this before I have to hurt someone... and trust me, I'm willing to do so.

Straight from my win32.hlp file, in the section on timeGetTime:

quote:
Note that the value returned by the timeGetTime function is a DWORD value. The return value wraps around to 0 every 2^32 milliseconds, which is about 49.71 days. This can cause problems in code that directly uses the timeGetTime return value in computations, particularly where the value is used to control code execution. You should always use the difference between two timeGetTime return values in computations.


Repeat: You should always use the difference between two timeGetTime return values in computations.

Use the difference and you're totally fine. Also, note that GetTickCount returns a DWORD so would have the same issue - that's why you'll never see anything other than (NewTime - OldTime) in a tutorial on timing. Everyone uses relative timing and nobody gets hurt - right?

quote:
Original post by Alimonster
Use the difference and you're totally fine.
That depends. If you have a frame rate of < 1/50 days (0.00000023283064365386962890625 fps) it can be dangerous. Although you probably have something else to worry about then.

Hmm... OK, well, it looks like I'm wrong. I just vaguely remember having problems with it when I first started with timing (my computer's always on), but it could have been something else. I don't know.

link.

If you are using QueryPerformanceCounter()
you should cross-check it against timeGetTime().

EDIT by ZE: fixed link, deleted redundant post.

[edited by - zealouselixir on September 11, 2002 10:47:26 PM]
