#3 Inferiarum

Posted 09 December 2012 - 04:15 AM

You have to estimate the RTT separately (e.g. include the time stamp of the latest input packet from the client in the server packet). If the server time is 300 and it takes 100 ms for the update to reach the client, then the client receives an update stamped 300 that is already 100 ms old. It adds the RTT to the packet time stamp and ends up 100 ms ahead of the current server time (if the estimate is correct). If the actual client time differs from this target time, you adjust it slightly.
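
In C++ this might look roughly like the following (a sketch only; the struct, member names, and the 0.1 gain are made up for illustration):

// "Explicit RTT" variant: on every state update the client computes a target
// time (packet's server time stamp + estimated RTT) and nudges its clock
// toward it instead of snapping to it.
struct ClientClock {
    double time = 0.0;         // current client time in seconds
    double rttEstimate = 0.1;  // estimated round-trip time in seconds
    double gain = 0.1;         // how strongly to correct per update

    // Advance the clock by the local frame time.
    void advance(double dt) { time += dt; }

    // Called whenever a game state update stamped with serverTimeStamp arrives.
    void onServerUpdate(double serverTimeStamp) {
        double target = serverTimeStamp + rttEstimate;  // where the client should be
        time += gain * (target - time);                 // adjust slightly, not all at once
    }
};

Nudging with a small gain instead of snapping keeps the client time from jumping visibly when the latency estimate fluctuates.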

OK, so here is how I do it (slightly different from what I described above, since I do not use the RTT explicitly):

On the client we have the internal clock CLK and an offset OS, such that ideally the client time is CT = CLK + OS = ST + RTT/2.

The client sends an input packet with its current time stamp CT1.
The server receives the input, updates the simulation, and sends out the new game state, including the current server time ST and the stamp CT1 of the latest input packet it processed.
Ideally CT1 and ST should be the same.
When the client receives the game state update (which carries both CT1 and ST), it updates
OS = OS + alpha*(ST - CT1)
with some small alpha > 0, and the new client time is
CT2 = CLK + OS

You can still keep an estimate of the RTT if you need it somewhere:
RTT = (1-alpha)*RTT + alpha*(CT2 - CT1)
(CT2 - CT1 is the client-side time between sending an input and seeing it echoed in a state update.)
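
Putting the whole scheme into code, it might look something like this (again just a sketch; ClientSync, the member names, the 0.05 gain, and the numbers in main are all made up for illustration):

#include <cstdio>

// Offset-based scheme: CT = CLK + OS, the offset is nudged whenever a state
// update arrives, and an RTT estimate is kept as an exponential moving average.
struct ClientSync {
    double offset = 0.0;   // OS
    double alpha  = 0.05;  // small smoothing gain
    double rtt    = 0.0;   // optional RTT estimate

    // Client time CT derived from the raw local clock reading CLK.
    double clientTime(double clk) const { return clk + offset; }

    // clk:         current local clock reading CLK
    // serverTime:  current server time ST carried by the state update
    // echoedStamp: time stamp CT1 of the latest input the server processed
    void onStateUpdate(double clk, double serverTime, double echoedStamp) {
        offset += alpha * (serverTime - echoedStamp);             // OS = OS + alpha*(ST - CT1)
        double ct2 = clientTime(clk);                              // CT2 = CLK + OS
        rtt = (1.0 - alpha) * rtt + alpha * (ct2 - echoedStamp);   // EMA of the round trip
    }
};

int main() {
    ClientSync sync;
    // Example numbers: the echoed stamp lags the server time by 50 ms, so the
    // client clock is running behind and the offset grows slightly; the echo
    // arrives about 200 ms of client time after the input was sent.
    sync.onStateUpdate(/*clk=*/10.300, /*serverTime=*/10.150, /*echoedStamp=*/10.100);
    std::printf("offset=%.4f s  rtt=%.4f s\n", sync.offset, sync.rtt);
    return 0;
}

Because the correction only needs ST and the echoed CT1, no explicit RTT estimate is involved in the clock adjustment itself; the moving average in the last line is only there if you want the RTT for something else.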
