Where to put the jitter buffer

Started by
3 comments, last by hplus0603 11 years, 3 months ago

So, I have been working on my networking code more and more, and to be honest it's starting to look pretty nice :). But enough talk; my question is straightforward: when implementing a jitter buffer, where in the packet delivery chain is it common to put it? I see a few different options:

  1. Apply the jitter buffer to the entire packet receive mechanism, buffering each packet as it comes in and handing it out at the proper interval to the rest of the code.
  2. Since some data might be time critical, put the jitter buffer "after" the packet receive mechanism and move it into the specific parts of the application that need the buffering - for example, rendering of positions.

Also, is it common to use a jitter buffer on both ends of the connection (server and client), or do you usually just run it on one (the client, I would guess)?
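For reference, option 1 might look roughly like the sketch below (Python for illustration; the class name, the sequence-number scheme, and the tick/delay values are all assumptions, not from any particular engine). Each packet's release time is derived from its sequence number, so packets play out on a steady schedule even when they arrive jittered or out of order:

```python
import heapq

class JitterBuffer:
    """Hold incoming packets and release them on a fixed schedule,
    so downstream code sees a steady stream despite jittery arrivals."""

    def __init__(self, tick_interval, playout_delay):
        self.tick_interval = tick_interval  # sender's send interval, e.g. 0.05 s
        self.playout_delay = playout_delay  # extra delay to absorb jitter, e.g. 0.1 s
        self.base_time = None  # local release time mapped to the first seq seen
        self.base_seq = None
        self.heap = []         # min-heap of (release_time, seq, packet)

    def push(self, seq, packet, now):
        if self.base_time is None:
            self.base_time = now + self.playout_delay
            self.base_seq = seq
        # Schedule by sequence number, not arrival time, so jitter is absorbed.
        release = self.base_time + (seq - self.base_seq) * self.tick_interval
        heapq.heappush(self.heap, (release, seq, packet))

    def pop_ready(self, now):
        out = []
        while self.heap and self.heap[0][0] <= now:
            out.append(heapq.heappop(self.heap)[2])
        return out
```

A packet pushed with seq 1 that arrives 20 ms late is still released at its scheduled slot, one tick after seq 0, as long as it beats the playout deadline.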

The sender typically just sends as soon as possible. Additional latency there just adds latency, without any real help against jitter.

You may decide that "time critical" packets are delivered on receipt, rather than de-jittered. The question is then: What will the experience be when you actually have jitter, and those "time critical" packets are delivered with some delay (say, exactly the delay of your de-jitter buffer.) If the game is still a fine experience, then why not de-jitter everything all the time, if the game is still fun and the code simpler? There may be cases where instant delivery on receipt actually on average makes the game better; in that case, that might be a fine optimization to implement.
enum Bool { True, False, FileNotFound };
The sender typically just sends as soon as possible. Additional latency there just adds latency, without any real help against jitter.

Ah yes, maybe I was not clear - assuming a client/server model, is it common to de-jitter things sent from the server to the client and also things sent from the clients to the server? Or is it usually applied in just one direction? I would assume both.

You may decide that "time critical" packets are delivered on receipt, rather than de-jittered. The question is then: What will the experience be when you actually have jitter, and those "time critical" packets are delivered with some delay (say, exactly the delay of your de-jitter buffer.) If the game is still a fine experience, then why not de-jitter everything all the time, if the game is still fun and the code simpler? There may be cases where instant delivery on receipt actually on average makes the game better; in that case, that might be a fine optimization to implement.

As always, a very good point, and I'm leaning towards putting the entire packet stream through a jitter buffer, as it just makes the code easier and cleaner (like you say). I also worry about untangling what *can* be delivered instantly from what has to be de-jittered; it seems like it would lead to a slew of possible de-sync issues and weird behavior.

Are you sure you need this? I would try not to add any more latency than the network already has, because that could make the game feel even more laggy than a few packets arriving a bit later. Maybe you could just let it through as fast as possible, and when the client receives corrections from the server, apply them gradually over a bit of time so the player sees no "teleporting".
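The gradual-correction idea above can be sketched in a few lines (Python for illustration; the function name and the blend factor are made up, and a real game would likely blend over a fixed time window rather than per-frame):

```python
def apply_correction(predicted, server_pos, alpha=0.1):
    """Move the locally predicted position a fraction of the way toward the
    server's authoritative position each frame, instead of snapping to it,
    so the player never sees a visible teleport. alpha is illustrative."""
    return tuple(p + (s - p) * alpha for p, s in zip(predicted, server_pos))
```

Called once per frame, this converges exponentially toward the server position; small errors disappear in a few frames while large corrections are still smoothed out.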

It has been my experience that a game with rock-solid, predictable latency and no jitter is perceived as less "laggy" than a game where things sometimes work and sometimes don't.

For example, I think the Halo series uses 100-millisecond network ticks (or perhaps 67 milliseconds, it's been a while), which means they pack 6 simulation ticks into each outgoing packet. This adds latency, yet the series is considered a very well-playing FPS :-)
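The tick-batching described above might be serialized roughly like this (Python for illustration; the wire format, field widths, and rates here are assumptions for the sketch, not Halo's actual protocol):

```python
import struct

SIM_RATE = 60    # simulation ticks per second (assumed)
NET_RATE = 10    # packets per second, i.e. a 100 ms network tick

def pack_ticks(first_tick, states):
    """Bundle the per-tick payloads accumulated since the last send into one
    packet: tick number of the first state, a count, then length-prefixed
    blobs. At 60 Hz sim and 10 Hz net, that's 6 states per packet."""
    payload = struct.pack("<IB", first_tick, len(states))
    for s in states:
        payload += struct.pack("<H", len(s)) + s
    return payload

def unpack_ticks(payload):
    """Inverse of pack_ticks: recover the first tick number and the blobs."""
    first_tick, count = struct.unpack_from("<IB", payload, 0)
    offset = 5
    states = []
    for _ in range(count):
        (n,) = struct.unpack_from("<H", payload, offset)
        offset += 2
        states.append(payload[offset:offset + n])
        offset += n
    return first_tick, states
```

The receiver then feeds the unpacked ticks into its playout logic one simulation step at a time, which is what makes the latency fixed and predictable rather than jittery.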
enum Bool { True, False, FileNotFound };

