Milliseconds without mmsystem.h

Recommended Posts

Is there a way to measure milliseconds accurately without using mmsystem.h? I'm trying to limit FPS to around 30, which I believe is seamless to the human eye.

Hi!

You could also use the high-resolution performance counter to limit your frame rate. Look for the following functions:

QueryPerformanceCounter
QueryPerformanceFrequency
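The pattern is: read the counter before and after the work, then divide the tick delta by the frequency. A minimal sketch (mine, not hunta's) using `std::chrono::steady_clock`, which on Windows is typically implemented on top of these very calls; the raw QPC version in the comment has the same shape:

```cpp
#include <chrono>

// Monotonic high-resolution clock; on MSVC this wraps QueryPerformanceCounter.
using Clock = std::chrono::steady_clock;

// Milliseconds elapsed between two time points.
double elapsed_ms(Clock::time_point start, Clock::time_point end) {
    return std::chrono::duration<double, std::milli>(end - start).count();
}

// Raw Win32 equivalent (windows.h):
//   LARGE_INTEGER f, a, b;
//   QueryPerformanceFrequency(&f);
//   QueryPerformanceCounter(&a);  /* ... work ... */  QueryPerformanceCounter(&b);
//   double ms = 1000.0 * (b.QuadPart - a.QuadPart) / f.QuadPart;
```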

Bye, hunta

Quote:
Original post by Mastadex
Is there a way to measure milliseconds accurately without using mmsystem.h? I'm trying to limit FPS to around 30, which I believe is seamless to the human eye.


This method uses assembly and is the most precise one, but it isn't reliable on laptops with variable clock speeds (just try Freedom Fighter, it has a little problem with that ;-)).
Look up RDTSC (Read Time Stamp Counter): estimate the processor speed via RDTSC, then calculate how much time has passed since the last frame from successive RDTSC reads.
Check whether the CPU supports it (everyone who tries your game should, because every CPU since the Pentium has it) via CPUID: after CPUID with EAX=1, the TSC feature flag is bit 4 of EDX (check the official Intel docs).
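A hedged sketch of the idea (mine, not the poster's; assumes an x86 target with GCC/Clang's `<x86intrin.h>` — MSVC users would include `<intrin.h>` instead). It counts raw cycles; converting cycles to milliseconds needs a frequency estimate, which is exactly what variable-speed laptops break:

```cpp
#include <cstdint>
#if defined(__x86_64__) || defined(__i386__)
#include <x86intrin.h>  // __rdtsc (on MSVC: <intrin.h>)

// Reads the time-stamp counter before and after `work`; the delta is the
// number of elapsed CPU cycles. Dividing by the CPU frequency (in kHz)
// would give milliseconds -- but only if that frequency never changes.
uint64_t cycles_spent(void (*work)()) {
    uint64_t start = __rdtsc();
    work();
    uint64_t end = __rdtsc();
    return end - start;
}
#endif
```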

Quote:
Original post by Mastadex
Is there a way to measure milliseconds accurately without using mmsystem.h? I'm trying to limit FPS to around 30, which I believe is seamless to the human eye.


On a side note: due to the way our screens work, 30 fps is not totally seamless to the human eye. This is why games try to target 60 fps. Of course, in most cases 30 fps is still fine :)

Regards,

Quote:
Original post by Emmanuel Deloget
Quote:
Original post by Mastadex
Is there a way to measure milliseconds accurately without using mmsystem.h? I'm trying to limit FPS to around 30, which I believe is seamless to the human eye.


On a side note: due to the way our screens work, 30 fps is not totally seamless to the human eye. This is why games try to target 60 fps. Of course, in most cases 30 fps is still fine :)

Regards,


That depends on the game... An RTS doesn't need more than 30 fps because its units don't move fast across the screen (well, not in most RTSes), but an FPS needs more than 30 fps (more than 50 fps, in my opinion).

However, the problem is how to get milliseconds from the system.
If you use Windows, go for timeGetTime().
Assembler solutions are not always compatible (one works on a Pentium, another on an AMD...); using an API function makes your code compatible with different CPUs and different versions of the same OS.
The <time.h> functions are too coarse to work in practice...
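To illustrate why `<time.h>` falls short for frame timing (my example, not blizzard999's): on POSIX systems `clock()` counts CPU time rather than wall time, so it barely advances while a frame loop sleeps; on Windows it tracks wall time but with a granularity of roughly 10-16 ms on the CRTs of this era. A sketch demonstrating the CPU-time pitfall:

```cpp
#include <chrono>
#include <ctime>
#include <thread>

// Sleeps for `sleep_ms` of wall-clock time and reports how far the
// <time.h> clock() advanced, in milliseconds. On POSIX, clock() measures
// CPU time, so it barely moves during the sleep -- useless for pacing
// a frame loop that spends most of each frame waiting.
double clock_ms_during_sleep(int sleep_ms) {
    std::clock_t c0 = std::clock();
    std::this_thread::sleep_for(std::chrono::milliseconds(sleep_ms));
    return 1000.0 * (std::clock() - c0) / CLOCKS_PER_SEC;
}
```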

Quote:
Original post by blizzard999
However, the problem is how to get milliseconds from the system.
If you use Windows, go for timeGetTime().
Assembler solutions are not always compatible (one works on a Pentium, another on an AMD...); using an API function makes your code compatible with different CPUs and different versions of the same OS.
The <time.h> functions are too coarse to work in practice...


That would be nice, but it goes against his original post: timeGetTime() comes from the very header he wants to avoid.

From its documentation on MSDN:

Windows NT/2000/XP: Included in Windows NT 3.1 and later.
Windows 95/98/Me: Included in Windows 95 and later.
Header: Declared in Mmsystem.h; include Windows.h.
Library: Use Winmm.lib.

Cheers
Chris
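Putting the thread together, here is a hedged sketch (mine, not Chris's) of a 30 fps limiter that avoids mmsystem.h entirely, shown with `std::chrono`; `GetTickCount()` from windows.h alone would also do, though its ~10-16 ms granularity is marginal for 33 ms frames:

```cpp
#include <chrono>
#include <thread>

// Runs `work` at roughly `fps` iterations per second for `frame_count`
// frames by sleeping until each frame's deadline. steady_clock and
// sleep_until need nothing beyond the standard headers above.
template <typename Work>
void run_frames(int frame_count, int fps, Work work) {
    using clock = std::chrono::steady_clock;
    const auto frame_time = std::chrono::nanoseconds(1000000000LL / fps);
    auto next = clock::now() + frame_time;
    for (int i = 0; i < frame_count; ++i) {
        work();                              // update + render
        std::this_thread::sleep_until(next); // wait out the rest of the frame
        next += frame_time;                  // fixed deadline, so drift doesn't accumulate
    }
}
```

Advancing a fixed deadline (`next += frame_time`) rather than sleeping a fixed duration keeps the average rate at the target even when individual frames run long.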
