Hi there! I'm having some trouble dealing with jitter in my networked game. I've created a simplified example that should explain the problem:
Client A updates its position, P, each frame. For simplicity, say that P is incremented by one each frame. The game runs at 60 frames per second, and every three frames (20 updates per second) A sends this position to Client B.
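In code, A's side looks roughly like this (a minimal sketch; `sendToB` is just a hypothetical stand-in for my actual send call):

```cpp
// Sketch of A's loop, assuming a fixed 60 Hz update.
void sendToB(int value);  // hypothetical network send

int P = 0;
int frame = 0;

void clientAFrame() {
    P += 1;              // P increments by one every frame
    frame += 1;
    if (frame % 3 == 0)  // every third frame -> 20 updates per second
        sendToB(P);
}
```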
B receives the position and uses this value. However, since we expect fewer than 60 updates per second from A, B predicts the increment to P between updates by adding one to it each frame. This works just fine - most of the time.
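B's per-frame logic is roughly this (again a sketch; `tryReceiveFromA` is a hypothetical non-blocking receive that returns true and fills in the value when a packet is waiting):

```cpp
#include <cstdio>

bool tryReceiveFromA(int* value);  // hypothetical non-blocking receive

int P = 0;

void clientBFrame() {
    int received;
    if (tryReceiveFromA(&received))
        P = received;  // snap to A's authoritative value
    else
        P += 1;        // predict: A adds one per frame
    std::printf("P: %d\n", P);
}
```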
What I'm seeing is jittery movement on B's end, caused by packets arriving at an uneven pace. For example, here's a printout of P on the frames where an update is due or arrives:
P: 6 <- Got update from A, value = 6. All is well since we expect one update every three frames.
P: 9 <- Got update from A, value = 9.
P: 12 <- Missed an update! Prediction carries P to 12 anyway.
P: 12 <- Got update from A, value = 12. It arrived one frame late, so P stands still for a frame and we get jitter.
I guess I can't expect packets to arrive exactly on time, even though I'm running the test on a single computer with all clients at 60 FPS. But what can I do about it? The obvious answer is smoothing (roughly what I sketch below), but I'd need a lot of it to cover up the jitter - the difference is often more than a frame or two. Are these results to be expected, or would you suspect that something is wrong here?
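For reference, by "smoothing" I mean something like moving the displayed value only part of the way toward the predicted target each frame instead of snapping (the `blend` factor here is arbitrary and just illustrative):

```cpp
// Rough sketch of the smoothing I have in mind: keep extrapolating a
// target as before, but blend the displayed value toward it gradually.
double displayedP = 0.0;

void smoothTowards(double targetP) {
    const double blend = 0.1;  // illustrative; higher = snappier, lower = smoother
    displayedP += (targetP - displayedP) * blend;
}
```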