Ozymandias42

Handling packet variance


So, I'm using client-side prediction. One issue I was having was that often a movement packet would get to the server 1-3 gamestates early or late. So, at server time 50, it'll see a move command from a client aimed at server time 52 or server time 48. I'm kind of torn as to how to handle this. If I add some extra fake lag, maybe 3 or 4 ticks' worth, I can keep most of the guesses in the future (so 50-54 instead of 48-52). Then the question is just how to handle them. Right now, I'm holding them in a queue, not evaluating them until their destined moment, but that seems a little odd to me. Is there a better way to handle this?
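The queue-until-destined-tick approach can be sketched as follows. This is a minimal illustration, not code from the post; the names (`CommandBuffer`, `enqueue`, `drain`) and the deferral policy for late commands are assumptions.

```python
from collections import defaultdict

class CommandBuffer:
    """Holds client commands until the server tick they are stamped for."""

    def __init__(self):
        self.pending = defaultdict(list)  # target tick -> list of commands

    def enqueue(self, target_tick, command, current_tick):
        # A command stamped for a tick we've already simulated is late;
        # one simple policy is to run it on the next tick instead.
        self.pending[max(target_tick, current_tick + 1)].append(command)

    def drain(self, tick):
        # Return (and forget) every command destined for this tick.
        return self.pending.pop(tick, [])

buf = CommandBuffer()
buf.enqueue(52, "move_left", current_tick=50)   # early: held until tick 52
buf.enqueue(48, "move_right", current_tick=50)  # late: deferred to tick 51
print(buf.drain(51))  # → ['move_right']
print(buf.drain(52))  # → ['move_left']
```

The simulation loop would call `drain(tick)` once per tick and apply whatever comes out, so early packets simply wait in the buffer instead of being guessed at.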

To remove jitter, you have to add buffering, which means holding data in a de-jitter buffer to delay it.

You can also keep a running tab on how late packets are, and keep an estimated clock difference, plus a margin based on the standard deviation of that jitter. You would then discard packets that are later than mean jitter + standard deviation, and buffer the rest for that long. If you use moving averages for the calculations, you can adapt to changing network conditions.
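The moving-average scheme described above might look something like this. It is a sketch under assumed names (`JitterEstimator`, `alpha`); the post does not prescribe a particular smoothing constant, and it uses a smoothed absolute deviation in place of a true standard deviation, which is a common cheap substitute.

```python
class JitterEstimator:
    """Tracks the mean clock offset of incoming packets and its deviation,
    so packets later than mean + deviation can be discarded and the rest
    buffered for about that long."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha          # smoothing factor for the moving averages
        self.mean_offset = 0.0      # estimated clock difference, in ticks
        self.deviation = 0.0        # smoothed absolute deviation of the offset

    def observe(self, offset):
        # offset = arrival_tick - stamped_tick; positive means the packet is late.
        error = offset - self.mean_offset
        self.mean_offset += self.alpha * error
        self.deviation += self.alpha * (abs(error) - self.deviation)

    def should_discard(self, offset):
        # Drop packets later than the estimated jitter plus its margin.
        return offset > self.mean_offset + self.deviation

    def buffer_delay(self):
        # Hold accepted packets roughly this long to absorb the jitter.
        return self.mean_offset + self.deviation
```

Because both averages are exponential, old samples decay away on their own, which gives the adaptation to changing network conditions mentioned above.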

Alright, just making sure that adding buffering was the sane thing to do.

My client, as it stands now, sends control updates to the server more or less constantly (though eventually I'll want it to stop sending updates while the client isn't pressing any buttons). Each message is stamped with the server timestamp the client expects it to arrive at (that is, the tick the client is predicting its own object at).

When the server receives one, it records how far off it was, and the next time it sends an update back to the client, it includes that offset.

On which side would it be better to work out the average, do you suppose? Should the server keep a simple or exponential average of the incoming packets and send the client an absolute adjustment, or should the server just tell the client exactly how far off the most recent one was and let it do some sort of averaging?
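The exchange being described can be sketched as below. Every name here (`Client`, `Server`, `stamp`, `on_server_report`, `offset_for`) is hypothetical; the sketch shows the variant where the client accepts the server's reported offset directly, before any averaging is layered on.

```python
class Client:
    def __init__(self):
        self.tick_offset = 0  # estimate of (server tick - local tick)

    def stamp(self, local_tick):
        # Predicted server tick at which this command will arrive.
        return local_tick + self.tick_offset

    def on_server_report(self, delta):
        # Accept the server's correction as absolute.
        self.tick_offset += delta

class Server:
    def offset_for(self, stamped_tick, server_tick):
        # Positive delta: the client should stamp further into the future.
        return server_tick - stamped_tick

client, server = Client(), Server()
stamped = client.stamp(50)                  # client predicts tick 50
delta = server.offset_for(stamped, 52)      # it actually landed on tick 52
client.on_server_report(delta)
print(client.stamp(50))                     # → 52: future stamps now lead by 2
```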

I just implemented one of the two real quick: I let the server take care of averaging the last few incoming command packets. It then sends a predictionDelta variable back to the client, which accepts the new value as absolute without doing any further averaging of its own. Seems to work pretty well. Thanks.
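A possible shape for that solution is sketched below. Only the name `predictionDelta` comes from the post; the exponential averaging, the smoothing constant, and the rounding to whole ticks are assumptions about how the server side might work.

```python
class ServerSide:
    """Smooths the offsets of recent command packets and reports the
    result to the client as a single predictionDelta value."""

    def __init__(self, alpha=0.25):
        self.alpha = alpha
        self.prediction_delta = 0.0

    def on_command(self, stamped_tick, server_tick):
        offset = server_tick - stamped_tick
        # Exponential moving average smooths out one-off spikes.
        self.prediction_delta += self.alpha * (offset - self.prediction_delta)

    def make_update(self):
        # Whole-tick correction included in the next snapshot to the client,
        # which applies it as-is rather than re-averaging.
        return round(self.prediction_delta)
```

Doing the averaging server-side keeps the client logic trivial (apply the delta verbatim), at the cost of the correction lagging one round trip behind the measurements.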
