Handling packet variance

So, I'm using client-side prediction. One issue I was having was that a movement packet would often reach the server 1-3 gamestates early or late. So, at server time 50, it'll see a move command from a client aimed at server time 52 or server time 48. I'm kind of torn as to how to handle this. If I add some extra fake lag, maybe 3 or 4 ticks' worth, I can keep most of the guesses in the future (so 50-54 instead of 48-52), but that still leaves the question of what to do with them when they arrive. Right now, I'm holding them in a queue and not evaluating them until their destined tick, but that seems a little odd to me. Is there a better way?
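For context, the queue looks roughly like this; the type and function names are just placeholders from my own code:

// Minimal sketch of the "hold commands until their tick" approach described above.
#include <cstdint>
#include <map>
#include <vector>

struct PlayerCommand {
    uint32_t targetTick;    // server tick the client aimed this command at
    float    moveX, moveY;  // whatever the input state happens to be
};

class CommandBuffer {
public:
    // Called when a command packet arrives, possibly a few ticks early or late.
    void enqueue(const PlayerCommand& cmd) {
        pending[cmd.targetTick].push_back(cmd);
    }

    // Called once per server tick; returns the commands destined for this tick
    // and drops anything that is now stale.
    std::vector<PlayerCommand> takeForTick(uint32_t serverTick) {
        std::vector<PlayerCommand> out;
        auto it = pending.find(serverTick);
        if (it != pending.end())
            out = std::move(it->second);
        // Erase everything at or before this tick; late commands are discarded here.
        pending.erase(pending.begin(), pending.upper_bound(serverTick));
        return out;
    }

private:
    std::map<uint32_t, std::vector<PlayerCommand>> pending; // keyed by target tick
};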
To remove jitter, you have to add buffering, which means holding data in the de-jitter buffer to delay it.

You can also keep a running tab on how late packets are, maintaining an estimated clock difference plus a margin based on the standard deviation of that jitter. You would then discard packets that are later than the mean jitter plus one standard deviation, and buffer the rest for that long. If you use moving averages for these calculations, you can adapt to changing network conditions.
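A sketch of that bookkeeping, assuming commands carry the tick they were aimed at; the smoothing constants are arbitrary, and it approximates the standard deviation with a moving mean absolute deviation:

#include <algorithm>
#include <cmath>

// Tracks how early or late command packets arrive relative to the tick they
// were aimed at, using exponential moving averages so the estimate adapts to
// changing network conditions.
class JitterEstimator {
public:
    // latenessTicks = server tick at receipt - target tick (positive means late).
    void sample(int latenessTicks) {
        double x = static_cast<double>(latenessTicks);
        meanLateness  = (1.0 - alpha) * meanLateness + alpha * x;
        meanDeviation = (1.0 - alpha) * meanDeviation + alpha * std::abs(x - meanLateness);
    }

    // Discard packets that are later than the mean plus the jitter margin...
    bool shouldDiscard(int latenessTicks) const {
        return latenessTicks > meanLateness + marginFactor * meanDeviation;
    }

    // ...and buffer everything else for roughly this many ticks.
    int bufferTicks() const {
        double ticks = meanLateness + marginFactor * meanDeviation;
        return static_cast<int>(std::ceil(std::max(0.0, ticks)));
    }

private:
    double meanLateness  = 0.0;
    double meanDeviation = 0.0;
    static constexpr double alpha        = 0.1; // moving-average smoothing factor
    static constexpr double marginFactor = 1.0; // how many "deviations" of margin
};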
enum Bool { True, False, FileNotFound };
Alright, just making sure that adding buffering was the sane thing to do.

My client, as it stands now, sends control updates to the server more or less constantly (though eventually I'll want it to stop sending updates when the client isn't pressing any buttons). Each of those messages is stamped with the server time the client expects it to arrive at, which is also the time the client is predicting its object at.

When the server gets one of those messages, it records how far off that timestamp was, and the next update it sends back to the client includes the offset of the most recently received message.

On which side would it be better to work out the average, do you suppose? Should the server keep a simple or exponential average of the incoming offsets and send the client an absolute adjustment, or should it just tell the client exactly how far off the most recent packet was and let the client do the averaging?
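For reference, the two messages currently look something like this (the field names are just illustrative):

#include <cstdint>

// Client -> server: control state, stamped with the server tick the client
// predicts it will arrive at.
struct ControlUpdate {
    uint32_t predictedServerTick; // the tick the client expects the server to be at
    uint8_t  buttons;             // packed input state
    float    aimX, aimY;
};

// Server -> client: regular state update, carrying how far off the most
// recently received ControlUpdate was (in ticks; positive means it arrived late).
struct StateUpdate {
    uint32_t serverTick;
    int32_t  lastCommandOffset;   // server tick at receipt minus predictedServerTick
    // ... object state follows ...
};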
I just implemented one of the two real quick. I let the server take care of averaging the last few incoming command packets. It then sends a predictionDelta variable back to the client, which accepts the new value as absolute without doing any further averaging of its own. Seems to work pretty well. Thanks.
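In case it's useful to anyone later, the server side boils down to something like this; the names and the window size are just my own choices:

#include <cstddef>
#include <cstdint>
#include <deque>
#include <numeric>

// Averages the timing error of the last few command packets and produces the
// predictionDelta that gets sent back to the client.
class PredictionDeltaTracker {
public:
    // offsetTicks = server tick at receipt - predicted server tick in the command.
    void onCommandReceived(int32_t offsetTicks) {
        recentOffsets.push_back(offsetTicks);
        if (recentOffsets.size() > windowSize)
            recentOffsets.pop_front();
    }

    // Included in the next state update; the client applies it directly to its
    // clock estimate instead of averaging again on its own.
    int32_t predictionDelta() const {
        if (recentOffsets.empty())
            return 0;
        int64_t sum = std::accumulate(recentOffsets.begin(), recentOffsets.end(), int64_t{0});
        return static_cast<int32_t>(sum / static_cast<int64_t>(recentOffsets.size()));
    }

private:
    std::deque<int32_t> recentOffsets;
    static constexpr std::size_t windowSize = 8; // "last few" packets; arbitrary
};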
