#### Archived

This topic is now archived and is closed to further replies.

# timeGetTime scale?


## Recommended Posts

Hello, I was just wondering what time scale the timeGetTime() function uses, i.e. how many timeGetTime() units does it take to make a second? Thanks, -Jesse

##### Share on other sites
quote:
The timeGetTime function retrieves the system time, in milliseconds. The system time is the time elapsed since Windows was started.

##### Share on other sites
Right, so `float Time = float(timeGetTime()) * 0.001f` is what you'd want for precise time measurements. Or doubles.

~CGameProgrammer( );

##### Share on other sites
Actually, you should process time as integer deltas rather than convert it to floating point, to avoid floating-point precision errors (it's also much faster).

In other words, just store all your times as DWORDs.

--
Eric

##### Share on other sites
Okay, if it's in milliseconds, wouldn't I divide it by 1,000? Like: `DWORD time = timeGetTime()/1000`... so if it returned 5000 milliseconds, it'd be 5 seconds? Or am I seriously confused?
-Jesse

##### Share on other sites
You are correct.

---
Make it work.
Make it fast.

"Commmmpuuuuterrrr.." --Scotty Star Trek IV:The Voyage Home

##### Share on other sites
Yes, you are correct, but if you are doing game programming and using timeGetTime() to keep a framerate-independent time scale, you will not want to work in seconds (you don't want your frame to change every 5 seconds; you want it to change every xxx milliseconds).

Maybe I am just tired and talking crap, but hell

Cray

##### Share on other sites
But integer division is quite slow, so you might as well convert to floating point and multiply (by 0.001, which is 1/1000); the conversion plus the multiply is probably about as fast as an integer divide. But that's just my opinion.

~CGameProgrammer( );

##### Share on other sites
Just some suggestions you might find helpful (or not):

Use the milliseconds unmultiplied and undivided; only divide when you output something.
By the way, use timeBeginPeriod() to set a higher timer resolution.
The default is 5 ms or more on NT/2000, but 1 ms is possible on nearly every system.

If you don't need high resolution, use GetTickCount() instead.
It's much faster.

bye,
-- foobar

Edited by - foobar on January 30, 2002 5:03:33 PM
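A minimal Windows-only sketch of the advice above (it assumes the multimedia timer API from winmm.lib and will not compile on other platforms; no test output is claimed since actual timings vary by machine):

```cpp
// Windows-only: link against winmm.lib.
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

int main() {
    // Request 1 ms timer resolution; timeGetTime() is coarser by default
    // on NT/2000 (5 ms or more, per the post above).
    timeBeginPeriod(1);

    DWORD start = timeGetTime();     // milliseconds since Windows started
    // ... timing-sensitive work goes here ...
    DWORD elapsedMs = timeGetTime() - start;

    timeEndPeriod(1);                // always pair with timeBeginPeriod()

    // If coarse granularity is acceptable, GetTickCount() is a cheaper call.
    DWORD coarseMs = GetTickCount();

    (void)elapsedMs; (void)coarseMs;
    return 0;
}
```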

##### Share on other sites
foobar, what do you mean by resolution?
-Senses
