

This topic is now archived and is closed to further replies.


Time Functions


Recommended Posts

I'm a bit confused about using GetTickCount() to synchronize time. The textbook I have basically calls GetTickCount() and stores the result in a global, then defines some methods that do something like:

DWORD Wait_Clock(DWORD count)
{
    while ((GetTickCount() - start_clock_count) < count);
    ...
}

So basically, you call Wait_Clock until a certain time has passed and then process a frame. What I don't understand is: what happens if it takes a long time to process a frame, and count is greater than GetTickCount() - start_clock_count? Do you just have to accept that, drop a frame, or watch the game get really slow?

How accurate is GetTickCount(), and how long does it take to compute? Could it conceivably take long enough in the above while loop to make the game run at different rates on different machines?

Right now, I'm using Sleep(##) calls to adjust time, but the game runs too fast on my 1.4GHz machine and too slow on my 233MHz machine. I'd like the game to run at the same speed on most machines. A little advice would be greatly appreciated. Thanks.

The problem with your approach is that it forces a constant framerate regardless of the computer the game is running on. Because of this, in order to get the objects in your game to move at the same speed on the 233MHz and the 1.4GHz machines, the game must always run at a framerate that the 233MHz machine can handle.

This is what I do, which allows the faster computer to get as high a framerate as possible while still running the game at the same speed as the 233:

At the end of the CPU-intensive work for each frame, note how long it took:
int t = GetTickCount() - time_at_start_of_frame;

Then convert this to a scalar to apply to object speeds:
float speed_scalar = (float)t / 1000.0f;

Then you just need to make sure that all of your speeds are in units/sec. For example, say an object in your game should move 5 units every second; then at the end of the frame go:

object.x += 5.0f * speed_scalar;
This way, the distance an object travels each frame is dependent on the framerate, so the game runs at the same speed regardless of the framerate/speed of the computer.

Hope this helps.

Original post by InfestedFurby
This way, the distance an object travels each frame is dependant on the framerate..

Just a slight correction so that R67en doesn't get confused:

InfestedFurby meant to say that the distance an object travels each frame is NOT dependent on the frame rate, hence the term "Frame Rate Independent Movement".


