what can cause spiky input ETs (elapsed times)?

Started by Norman Barrows
7 comments, last by Norman Barrows 9 years, 2 months ago


i recently timed the render, input and update for caveman, and was surprised to find a fair amount of variation in the time for input, even with no user input. all the code does is process windows messages for mouse input and poll async key states. even when not moving the mouse or pressing any keys, times could vary from 50 to over 1000 ticks. i didn't have mins and maxes set up, so i don't know what the actual low and high values were; these are roughly what they appeared to be as the times flashed by on the screen.
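for what it's worth, a minimal sketch of tracking the actual lows and highs, assuming the Zstarttimer()/Zelapsedticks() helpers posted further down in this thread (process_input is whatever the game's input routine is called, and the timer index is arbitrary):

#include <limits.h>   // for INT_MAX

// per-run min/max tracking of the input elapsed time, in raw ticks
int input_et_min = INT_MAX;
int input_et_max = 0;

void time_input(void)
{
    Zstarttimer(1);               // timer index 1 chosen arbitrarily here
    process_input();              // the game's input routine
    int et = Zelapsedticks(1);
    if (et < input_et_min) input_et_min = et;
    if (et > input_et_max) input_et_max = et;
}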

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php


You haven't given much to go on here. Also, maybe this is a dumb question and this term is used all the time when talking game performance in circles outside of my own, but what's a "tick"? Do you have an actual time measurement for this? My vague recollection of previous games' performance is based on percentages of a frame and actual time.

and poll async key states

If you are not getting all input from WM_* messages then you are doing it wrong.

As for the rest, you need to specify what a tick is and how you are timing things. If you aren’t using QueryPerformanceCounter(), you are doing it wrong.

But to answer broadly, “What can cause jittery input times?”, it could be your timing method (timeGetTime() and friends), or it could be that your input is not running on its own thread, thus allowing it to be halted by your game logic or rendering.
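As a rough sketch of the message-based approach (the names here are illustrative, not from Caveman), a window procedure can maintain a current-key-state table purely from WM_* messages:

#include <windows.h>

// current key state, maintained purely from WM_* messages
static bool g_keyDown[256];    // indexed by virtual-key code

LRESULT CALLBACK WndProc(HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
    switch (uMsg)
    {
    case WM_KEYDOWN: g_keyDown[wParam & 0xFF] = true;  return 0;
    case WM_KEYUP:   g_keyDown[wParam & 0xFF] = false; return 0;
    case WM_DESTROY: PostQuitMessage(0);                return 0;
    }
    return DefWindowProc(hWnd, uMsg, wParam, lParam);
}

// the game then reads g_keyDown[VK_UP] etc. once per frame or turn,
// which gives the same "current state" snapshot that polling would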


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid


what's a "tick"?

 
// timers
DWORD Ztimer[10];
LARGE_INTEGER Ztimer_freq, Ztimer2[10];

// called from init_generic_game_library:
QueryPerformanceFrequency(&Ztimer_freq);

// start timer a
void Zstarttimer(int a)
{
    Ztimer[a] = GetTickCount();
    QueryPerformanceCounter(&Ztimer2[a]);
}

// returns elapsed time of timer a, in milliseconds
int Zelapsedtime(int a)
{
    LARGE_INTEGER c, f;
    float g, h;
    QueryPerformanceCounter(&c);
    f.QuadPart = c.QuadPart - Ztimer2[a].QuadPart;   // elapsed counter ticks
    g = (float)f.QuadPart;
    h = (float)Ztimer_freq.QuadPart;                 // ticks per second
    g *= 1000.0f;
    g /= h;                                          // ticks -> milliseconds
    return (int)g;
}

// returns elapsed time of timer a, in raw performance-counter ticks
int Zelapsedticks(int a)
{
    LARGE_INTEGER c, f;
    QueryPerformanceCounter(&c);
    f.QuadPart = c.QuadPart - Ztimer2[a].QuadPart;
    return (int)f.QuadPart;
}

so, if i remember my QueryPerformanceCounter definition correctly, that should be raw timer ticks - and the tick rate can vary from PC to PC.

originally i used Zelapsedtime, and got something like 25ms for render and zero ms for input and update. so i switched to ticks, as i was interested in the ratio of input to render time.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

You do realise that 50 to 1000 is basically nothing here, right?

You are effectively wondering why things vary on a non-real-time OS by a tiny fraction of time; heck, depending on your CPU the counter might not even be 100% reliable anyway.


If you are not getting all input from WM_* messages then you are doing it wrong.

i'm only interested in the current state of a key when i process input, not all key presses and releases since last input processing. so getasynckeystate seems adequate. is there some disadvantage i'm unaware of?

you see, the keyboard state at the time input is processed represents the human player's "move" for a "turn". if you process all keystrokes, and not just the current state, you're essentially giving the player more than one move per turn.
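for example, something along these lines (the struct and key bindings are just illustrative):

#include <windows.h>

// one snapshot of the keyboard = one "move" for this "turn"
struct player_move
{
    bool forward, back, left, right;
};

player_move read_move_for_turn(void)
{
    player_move m;
    // the high bit of GetAsyncKeyState() means "key is down right now";
    // presses and releases that happened since the last poll are ignored
    m.forward = (GetAsyncKeyState('W')      & 0x8000) != 0;
    m.back    = (GetAsyncKeyState('S')      & 0x8000) != 0;
    m.left    = (GetAsyncKeyState(VK_LEFT)  & 0x8000) != 0;
    m.right   = (GetAsyncKeyState(VK_RIGHT) & 0x8000) != 0;
    return m;
}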

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

Assuming a tick is 1ns, that's 50ns to 1µs? Doing all the input polling and OS message babysitting in a microsecond is not bad... 0.006% of a 60Hz frame and all...

As phantom mentioned, you're going to run into a lot of noise at that scale. Things like whether a particular bit of data was in L3 vs L2 cache is going to have a huge (relative) impact.

What is Ztimer_freq? around 1000000000?

Why not just make Zelapsedtime return a double, so you can multiply by 1000 to see ms, 1000 again to see µs, 1000 again to see ns, etc...
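Something like this, keeping the existing Ztimer2/Ztimer_freq globals (the function name is just illustrative):

// sketch: elapsed time of timer a as a double, in seconds;
// multiply by 1000 for ms, by 1000 again for microseconds, etc.
double Zelapsedtime_sec(int a)
{
    LARGE_INTEGER c;
    QueryPerformanceCounter(&c);
    double ticks = (double)(c.QuadPart - Ztimer2[a].QuadPart);
    double freq  = (double)Ztimer_freq.QuadPart;   // ticks per second
    return ticks / freq;
}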


it could be that your input is not running on its own thread, thus allowing it to be halted by your game logic or rendering.

i put separate timers on render, input, and update. they occur sequentially in that order in the game loop - single thread.

since process_input just processes windows messages for mouse and calls getasynckeystate when there's no human input, it would seem the variation is in the windows message and keyboard routines.
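in other words, the loop looks roughly like this (render/update/running names are illustrative, and the timer indices are arbitrary):

int render_et, input_et, update_et;

// single thread, sequential phases in the order render, input, update
while (running)
{
    Zstarttimer(0);  render();         render_et = Zelapsedticks(0);
    Zstarttimer(1);  process_input();  input_et  = Zelapsedticks(1);
    Zstarttimer(2);  update();         update_et = Zelapsedticks(2);
}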

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php


You do realise that 50 to 1000 is basically nothing here, right?

You are effectively wondering why things vary on a non-real-time OS by a tiny fraction of time; heck, depending on your CPU the counter might not even be 100% reliable anyway.

while the amount of time involved is negligible, i did find the variation curious. as you say, it can easily be chalked up to the OS and/or hardware.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

This topic is closed to new replies.
