SIGMA

timeGetTime scale?


Hello, I was just wondering what the timeGetTime() function returns for its time scale, i.e. how many timeGetTime() units does it take to make a second? Thanks, -jesse

quote:
The timeGetTime function retrieves the system time, in milliseconds. The system time is the time elapsed since Windows was started.
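So one unit is one millisecond. A quick sketch of reading it, assuming you link against winmm.lib (which is where timeGetTime() lives):

    #include <windows.h>
    #include <mmsystem.h>
    #include <stdio.h>

    int main()
    {
        DWORD start = timeGetTime();              // milliseconds since Windows started

        Sleep(250);                               // stand-in for the work being timed

        DWORD elapsed = timeGetTime() - start;    // elapsed milliseconds
        printf("elapsed: %lu ms (%.3f s)\n", (unsigned long)elapsed, elapsed / 1000.0);
        return 0;
    }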


Actually, you should process time as integer deltas rather than converting it to floating point, to avoid floating-point precision errors (it's also much faster).

In other words, just store all your times as DWORDs.
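Something like this, as a rough sketch; a nice side effect is that the unsigned subtraction still gives the right delta even if timeGetTime() wraps around:

    #include <windows.h>
    #include <mmsystem.h>

    void RunLoop()
    {
        DWORD lastTime = timeGetTime();

        for (;;)
        {
            DWORD now     = timeGetTime();
            DWORD elapsed = now - lastTime;    // unsigned math stays correct even
            lastTime      = now;               // when the timer wraps past 2^32 ms

            // advance the game by 'elapsed' milliseconds here

            if (elapsed == 0)
                Sleep(1);                      // don't spin when no time has passed
        }
    }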


--
Eric

Okay, if it's in milliseconds, wouldn't I divide it by 1,000? Like: DWORD time = timeGetTime()/1000... so if it returned 5000 milliseconds, it'd be 5 seconds? Or am I seriously confused?
-Jesse

Yes, you are correct, but if you are doing game programming and using timeGetTime() to drive a frame-rate-independent time scale, you will not want to deal in seconds (you don't want your frame to change every 5 seconds, you want it to change every xxx milliseconds).
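A rough sketch of what that might look like (playerX, SPEED and Frame() are just made-up names for the example):

    #include <windows.h>

    float playerX     = 0.0f;     // made-up position
    const float SPEED = 0.1f;     // units per millisecond

    void Frame(DWORD elapsedMs)   // millisecond delta from the main loop
    {
        playerX += SPEED * elapsedMs;    // covers the same distance per real second,
    }                                    // however many frames actually ran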

Maybe I am just tired and talking crap, but hell

Cray

But integer division is fairly slow, so you might as well convert to floating point and multiply (by 0.001, which is 1/1000); the conversion plus the multiply is probably about the same cost as an integer divide. But that's just my opinion.
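For example (ToSeconds() is just a made-up helper name):

    #include <windows.h>

    // Convert a millisecond delta to seconds with a multiply instead of a divide.
    float ToSeconds(DWORD elapsedMs)
    {
        return elapsedMs * 0.001f;    // * (1/1000) rather than / 1000
    }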

~CGameProgrammer( );

Just some suggestions you might find helpful or not:

Use the milliseconds unmultiplied and undivided; only divide when you output something.
By the way: use timeBeginPeriod() to set a higher resolution. The default is 5 ms or more on NT/2000, but 1 ms is possible on nearly every system.

If you don't need high resolution, use GetTickCount() instead.
It's much faster.
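A rough sketch of the timeBeginPeriod() part (pair it with timeEndPeriod() when you are done; both live in winmm.lib):

    #include <windows.h>
    #include <mmsystem.h>

    void GameMain()
    {
        timeBeginPeriod(1);    // request 1 ms multimedia timer resolution

        // ... run the game loop, timing frames with timeGetTime() ...

        timeEndPeriod(1);      // restore the previous resolution when done
    }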

bye,
-- foobar


Edited by - foobar on January 30, 2002 5:03:33 PM

