what can cause spikey input ET's?

Norman Barrows    7179

 

I recently timed the render, input, and update passes for Caveman, and was surprised to find a fair amount of variation in the time for input, even with no user input. All the code does is process Windows messages for mouse input and poll async key states. Even when not moving the mouse or pressing any keys, times could vary from 50 to over 1000 ticks. I didn't have mins and maxes set up, so I don't know what the actual low and high values were; these are roughly what they appeared to be as the times flashed by on the screen.
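Since the post mentions not having mins and maxes set up, one cheap way to get them is a running min/max/average updated once per frame. A minimal sketch in C++ (the `TickStats` name and layout are mine, not from Caveman's code):

```cpp
#include <algorithm>
#include <climits>

// Running stats for one timed section; call add() once per frame
// with that frame's elapsed-tick sample, then print on exit.
struct TickStats {
    long long mn = LLONG_MAX;   // lowest sample seen so far
    long long mx = LLONG_MIN;   // highest sample seen so far
    long long sum = 0;          // running total, for the average
    long long count = 0;        // number of samples

    void add(long long ticks) {
        mn = std::min(mn, ticks);
        mx = std::max(mx, ticks);
        sum += ticks;
        ++count;
    }
    double avg() const { return count ? double(sum) / double(count) : 0.0; }
};
```

Feeding it the per-frame input times would show whether the 50-to-1000 spread is steady jitter or occasional spikes.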

 

 

Pink Horror    2459

You haven't given much to go on here. Also, maybe this is a dumb question and this term is used all the time when talking about game performance in circles outside my own, but what's a "tick"? Do you have an actual time measurement for this? My vague recollection of previous games' performance is based on percentages of a frame and actual time.

 

Share this post


Link to post
Share on other sites
L. Spiro    25638

and poll async key states

If you are not getting all input from WM_* messages then you are doing it wrong.

As for the rest, you need to specify what a tick is and how you are timing things. If you aren’t using QueryPerformanceCounter(), you are doing it wrong.

But to answer broadly, “What can cause jittery input times?”, it could be your timing method (timeGetTime() and friends), or it could be that your input is not running on its own thread, thus allowing it to be halted by your game logic or rendering.
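For scale: a QPC tick only has meaning relative to QueryPerformanceFrequency(), so the first step in interpreting "50 to over 1000 ticks" is the conversion. A sketch of just the arithmetic (on Windows the tick delta and frequency would come from QueryPerformanceCounter() and QueryPerformanceFrequency(); the 10 MHz figure in the comment is a hypothetical frequency, not a measured one):

```cpp
// Convert a QueryPerformanceCounter tick delta to microseconds,
// given the counts-per-second value from QueryPerformanceFrequency.
double ticks_to_us(long long ticks, long long freq) {
    return 1.0e6 * double(ticks) / double(freq);
}

// On a machine whose counter runs at a hypothetical 10 MHz,
// 50 ticks is 5 us and 1000 ticks is 100 us -- both far below
// a 60 Hz frame's ~16.7 ms budget.
```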


L. Spiro

Norman Barrows    7179

what's a "tick"?
 
// timers
DWORD Ztimer[10];

LARGE_INTEGER Ztimer_freq, Ztimer2[10];

// called from init_generic_game_library:
QueryPerformanceFrequency(&Ztimer_freq);

// start timer a
void Zstarttimer(int a)
{
    Ztimer[a] = GetTickCount();
    QueryPerformanceCounter(&Ztimer2[a]);
}

// returns elapsed time of timer a, in milliseconds
int Zelapsedtime(int a)
{
    LARGE_INTEGER c, f;
    float g, h;
    QueryPerformanceCounter(&c);
    f.QuadPart = c.QuadPart - Ztimer2[a].QuadPart;
    g = (float)f.QuadPart;
    h = (float)Ztimer_freq.QuadPart;
    g *= 1000.0f;
    g /= h;
    return (int)g;
}

// returns raw elapsed QPC ticks of timer a
int Zelapsedticks(int a)
{
    LARGE_INTEGER c, f;
    QueryPerformanceCounter(&c);
    f.QuadPart = c.QuadPart - Ztimer2[a].QuadPart;
    return (int)f.QuadPart;
}
 


So, if I remember the QueryPerformanceCounter documentation correctly, those should be raw timer ticks, whose rate can vary from PC to PC.

 

Originally I used Zelapsedtime, and got something like 25 ms for render and zero ms for input and update, so I switched to ticks, as I was interested in the ratio of input time to render time.

_the_phantom_    11250
You do realise that 50 to 1000 is basically nothing here, right?

You are effectively wondering why things vary by a tiny fraction of time on a non-real-time OS; heck, depending on your CPU the counter might not even be 100% reliable anyway.

Norman Barrows    7179


If you are not getting all input from WM_* messages then you are doing it wrong.

 

I'm only interested in the current state of a key when I process input, not all key presses and releases since the last input processing, so GetAsyncKeyState seems adequate. Is there some disadvantage I'm unaware of?

 

You see, the keyboard state at the time input is processed represents the human player's "move" for a "turn". If you process all keystrokes, and not just the current state, you're essentially giving the player more than one move per turn.
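The state-versus-events distinction here can be sketched without the Win32 calls. In this sketch, `is_key_down` stands in for `GetAsyncKeyState(vk) & 0x8000`, and the key set and struct are illustrative, not Caveman's actual bindings:

```cpp
#include <functional>

// One "move": the state of the movement keys, sampled at the single
// moment input is processed -- not the history of presses since the
// previous turn.
struct MoveSnapshot {
    bool forward, back, left, right;
};

// is_key_down stands in for (GetAsyncKeyState(vk) & 0x8000) != 0.
MoveSnapshot sample_move(const std::function<bool(int)>& is_key_down) {
    return MoveSnapshot{
        is_key_down('W'), is_key_down('S'),
        is_key_down('A'), is_key_down('D'),
    };
}
```

The known tradeoff of pure polling is that a key pressed and released entirely between two samples is never seen, whereas the WM_KEYDOWN/WM_KEYUP message stream would record it.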

Hodgman    51342

Assuming a tick is 1 ns, that's 50 ns to 1 µs? Doing all the input polling and OS message babysitting in a microsecond is not bad... 0.006% of a 60Hz frame and all...

As phantom mentioned, you're going to run into a lot of noise at that scale. Things like whether a particular bit of data was in L3 vs L2 cache is going to have a huge (relative) impact.

 

What is Ztimer_freq? around 1000000000?

Why not just make Zelapsedtime return a double, so you can multiply by 1000 to see ms, 1000 again to see µs, 1000 again to see ns, etc...
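That suggestion as a sketch: do the subtraction in integer ticks and convert only at the end, returning a double so the caller chooses the unit. The start/now/frequency values are parameters here just to keep the arithmetic visible; in the original code they would come from Ztimer2[a] and Ztimer_freq:

```cpp
// Elapsed time in seconds; multiply by 1e3 for ms, 1e6 for us,
// 1e9 for ns. start_ticks/now_ticks would be QueryPerformanceCounter
// readings and freq the QueryPerformanceFrequency value.
double elapsed_seconds(long long start_ticks, long long now_ticks,
                       long long freq) {
    return double(now_ticks - start_ticks) / double(freq);
}
```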


Norman Barrows    7179


it could be that your input is not running on its own thread, thus allowing it to be halted by your game logic or rendering.

 

I put separate timers on render, input, and update. They occur sequentially in that order in the game loop (single thread).

 

Since process_input just processes Windows messages for the mouse and calls GetAsyncKeyState, when there's no human input the variation would seem to be in the Windows message and keyboard routines.

Norman Barrows    7179


You do realise that 50 to 1000 is basically nothing here, right?

You are effectively wondering why things vary by a tiny fraction of time on a non-real-time OS; heck, depending on your CPU the counter might not even be 100% reliable anyway.

 

While the amount of time involved is negligible, I did find the variation curious. As you say, it can easily be chalked up to the OS and/or hardware.

