
Moe

WM_TIMER fast enough for input?


Recommended Posts

I am planning to use DirectInput to get input in my game (this is not a DX-related question). I would like to poll for input 30 times a second. I noticed that the DirectInput sample that came with the SDK uses the WM_TIMER message, with the timer set to fire that many times per second, and polls the input in the handler. Is this practical in terms of accuracy? I realize the WM_TIMER message is low priority, but can it be relied on to within 1/30th of a second, or should I write my own timing functions?
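For reference, the pattern that sample uses looks roughly like this. This is only a sketch from memory, not the SDK code itself, and PollDirectInput() is a hypothetical stand-in for the actual polling call:

#include <windows.h>

#define IDT_INPUT_TIMER 1

/* Hypothetical stand-in for the DirectInput polling code. */
void PollDirectInput(void);

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg) {
    case WM_CREATE:
        /* Ask for a timer event roughly 30 times a second (33 ms).
           WM_TIMER is low priority and its granularity is on the
           order of 10-15 ms, so this is a request, not a guarantee. */
        SetTimer(hWnd, IDT_INPUT_TIMER, 1000 / 30, NULL);
        return 0;
    case WM_TIMER:
        if (wParam == IDT_INPUT_TIMER)
            PollDirectInput();
        return 0;
    case WM_DESTROY:
        KillTimer(hWnd, IDT_INPUT_TIMER);
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}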

Forget about using WM_TIMER.

1) Never rely on WM messages when timing is required.
2) If the FPS drops below 30, the app will miss some WM_TIMER calls. If the FPS is over 30, sprites will move too fast and WM_TIMER won't keep up.
3) Better to use an MMTimer with a callback function, or better still, get the DInput state in your UpdateFrame function. That way every frame will catch the input, giving a fast reaction time. The sprites then need to be timed by ftime() every frame to keep everything at a constant speed no matter what the FPS is (see the sketch below). This is how *most* games work, including mine: sometimes my game runs at 700 FPS (without vsync) and everything moves at the same speed as it would at 10 FPS...
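Roughly, in code (a sketch only; timeGetTime() stands in for ftime() here, and UpdateFrame, ReadDInputState and spriteX are illustrative names):

#include <windows.h>
#include <mmsystem.h>   /* timeGetTime(); link with winmm.lib */

void  ReadDInputState(void);   /* hypothetical: poll the DInput device here */
float spriteX = 0.0f;

/* Called once per rendered frame: poll input, then move sprites by the
   elapsed time so that speed is identical at 10 FPS or 700 FPS. */
void UpdateFrame(void)
{
    static DWORD lastTime = 0;
    DWORD now = timeGetTime();

    if (lastTime == 0)             /* first frame: no elapsed time yet */
        lastTime = now;

    float dt = (now - lastTime) / 1000.0f;   /* seconds since last frame */
    lastTime = now;

    ReadDInputState();             /* every frame catches the input */

    const float speed = 120.0f;    /* pixels per second */
    spriteX += speed * dt;         /* frame-rate independent motion */
}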

So using it for anything that needs an accuracy of about 1/30th of a second is out of the question?

I am not using it for anything else, just for timing the input. And pulling the timing out entirely is WAY out of the question. I did that in a previous program and it just wasn't workable: I would barely tap one of the arrow keys, for as short a time as possible, and the object on the screen would fly across soooo fast.

By MMTimer do you mean something like timeGetTime() or QueryPerformanceFrequency()-style timers? I was trying to work out how to do this. I am not sure, but here is how I would do it:
1. get the current time
2. store that time in another variable (e.g. next_frame)
3. get the current time again and check whether more than 1/30th of a second has passed since the time first checked. If it has, return OK or something like that, and then set next_frame to the current time again.
4. repeat step 3, etc.

Am I anywhere close to a solution?
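For what it's worth, steps 1-4 above might look like this as a sketch, using timeGetTime() (the names here are made up for illustration):

#include <windows.h>
#include <mmsystem.h>   /* timeGetTime(); link with winmm.lib */

#define FRAME_MS (1000 / 30)    /* ~33 ms per logic step */

DWORD next_frame;               /* steps 1-2: remember the starting time */

void InitFrameTimer(void)
{
    next_frame = timeGetTime() + FRAME_MS;
}

/* Steps 3-4: returns TRUE once 1/30th of a second has passed, then
   schedules the next step. Advancing next_frame by FRAME_MS, rather
   than resetting it to "now", keeps the long-run rate at 30 Hz even
   when an individual check comes in late. */
BOOL FrameElapsed(void)
{
    if (timeGetTime() >= next_frame) {
        next_frame += FRAME_MS;
        return TRUE;
    }
    return FALSE;
}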

Here's an example of the high-res timer that I more-than-half-way stole from NeHe:
  
#include <windows.h>
#include <mmsystem.h>   /* timeGetTime(); link with winmm.lib */
#include <string.h>     /* memset */

typedef struct _Timer {
    __int64       frequency;                /* counter ticks per second */
    float         resolution;               /* seconds per tick */
    unsigned long mm_timer_start;           /* fallback: timeGetTime() at init */
    unsigned long mm_timer_elapsed;
    bool          performance_timer;        /* true if QueryPerformanceCounter is available */
    __int64       performance_timer_start;  /* performance counter value at init */
    __int64       performance_timer_elapsed;
} Timer;

Timer BestResTimer;

void InitStateChanging(void)
{
    memset(&BestResTimer, 0, sizeof(BestResTimer));

    if (!QueryPerformanceFrequency((LARGE_INTEGER *) &BestResTimer.frequency)) {
        /* No high-resolution counter: fall back to the multimedia
           timer, which counts in milliseconds. */
        BestResTimer.performance_timer = false;
        BestResTimer.mm_timer_start    = timeGetTime();
        BestResTimer.resolution        = 1.0f / 1000.0f;
        BestResTimer.frequency         = 1000;
        BestResTimer.mm_timer_elapsed  = BestResTimer.mm_timer_start;
    } else {
        /* High-resolution counter available: record its start value and
           convert its frequency into seconds per tick. */
        QueryPerformanceCounter((LARGE_INTEGER *) &BestResTimer.performance_timer_start);
        BestResTimer.performance_timer         = true;
        BestResTimer.resolution                = (float) (1.0 / (double) BestResTimer.frequency);
        BestResTimer.performance_timer_elapsed = BestResTimer.performance_timer_start;
    }
}

/* Milliseconds elapsed since InitStateChanging(), using whichever timer
   was selected at init. */
float GetCurrentState(void)
{
    __int64 time;

    if (BestResTimer.performance_timer) {
        QueryPerformanceCounter((LARGE_INTEGER *) &time);
        return ((float) (time - BestResTimer.performance_timer_start) * BestResTimer.resolution) * 1000.0f;
    } else {
        return ((float) (timeGetTime() - BestResTimer.mm_timer_start) * BestResTimer.resolution) * 1000.0f;
    }
}




http://www.gdarchive.net/druidgames/

I think what I should probably do is write an application-wide timer that can be checked for 1/30th of a second. That way I can base all my animation, input, etc. on it. I think timeGetTime() will be accurate enough to use, and that way I won't have to mess around with all the PerformanceCounter stuff, which will make my program a bit simpler. How does this sound?
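One caveat with timeGetTime(): its default resolution is not guaranteed to be 1 ms on every Windows version (MSDN notes it can be 5 ms or more on NT-based systems), though timeBeginPeriod() can tighten it. A minimal sketch of such an application-wide timer, with illustrative names:

#include <windows.h>
#include <mmsystem.h>   /* timeBeginPeriod/timeGetTime; link with winmm.lib */

static DWORD g_startMs;          /* application-wide clock base */

void InitAppTimer(void)
{
    timeBeginPeriod(1);          /* request 1 ms timer resolution */
    g_startMs = timeGetTime();
}

DWORD AppTimeMs(void)            /* milliseconds since InitAppTimer() */
{
    return timeGetTime() - g_startMs;
}

void ShutdownAppTimer(void)
{
    timeEndPeriod(1);            /* every timeBeginPeriod() needs a matching call */
}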

Never cross the thin line between bravery and stupidity.

Well, with the code I stole above, it is relatively simple to get the number of milliseconds passed. Just call InitStateChanging() once at startup, and call GetCurrentState() each time you need to know the current tick count.
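For instance (a sketch of how the two calls fit into a game loop; GameIsRunning and UpdateAndRender are illustrative names, not part of the code above):

BOOL GameIsRunning(void);               /* hypothetical loop condition */
void UpdateAndRender(float deltaMs);    /* hypothetical per-frame work */

void RunGame(void)
{
    InitStateChanging();                /* once at startup */

    float last = GetCurrentState();     /* milliseconds since init */
    while (GameIsRunning()) {
        float now     = GetCurrentState();
        float deltaMs = now - last;     /* how long the last frame took */
        last = now;
        UpdateAndRender(deltaMs);       /* scale movement by deltaMs */
    }
}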

Either way, your method should work out, if I understand what you meant.

Wow, I hadn't noticed how badly the source tags messed up my code's formatting.

"Finger to spiritual emptiness underlying everything." -- How a C manual referred to a "pointer to void." --Things People Said
http://www.gdarchive.net/druidgames/

