Members - Reputation: 133
Posted 14 August 2001 - 02:12 AM
DWORD start_time = GetTickCount();
This works for me.
Then to lock the frame rate you would put
DWORD start_time = GetTickCount(); at the beginning of the game loop, and at the end of the loop
while (GetTickCount() - start_time < 33 );
33 is the minimum number of milliseconds you want each frame to take. Since 1000 / 30 is about 33, this example caps the loop at roughly 30 fps.
I don't believe that you need to include any extra files to use the timer, but here are my includes anyway.
I hope this has helped
Edited by - GameDev135 on August 14, 2001 12:34:05 PM
Members - Reputation: 122
Posted 14 August 2001 - 02:27 AM
"- To begin with, said the Cat, a dog's not mad. You grant that?
- I suppose so, said Alice.
- Well, then, - the Cat went on - you see, a dog growls when it's angry, and wags its tail when it's pleased. Now I growl when I'm pleased, and wag my tail when I'm angry. Therefore I'm mad."
Posted 14 August 2001 - 08:52 AM
It operates differently depending on the OS you run it on.
On Win9x, timeGetTime() returns a much more precise number (1 millisecond resolution) than on WinNT/Win2k (5 milliseconds). To fix this you need to call a function to set the timer's resolution.
timeBeginPeriod(1); //Sets timer res to 1 millisecond
When you're done with the timer (basically in the uninit of your prog), call
timeEndPeriod(1); //Restores the timer resolution
Make sure that the value you pass to timeEndPeriod() is the same as what you passed to timeBeginPeriod().
you might want to do this, so both calls stay in sync:
#define TIMERES 1 //no trailing semicolon, or the macro breaks inside a call
DWORD a = timeGetTime();
cout << "This computer was turned on " << a << " ms ago!";
Some more notes about timeGetTime()
"The return value wraps around to 0 every 2^32 milliseconds, which is about 49.71 days." -MSDN Library
The required files are the header Mmsystem.h and the library Winmm.lib.