Command Frames and Tick Synchronization


Hello, 

This is actually my first post after being a lurker for quite some time here. 

For the last couple of days I have been trying to get my head around Overwatch's multiplayer architecture after watching the GDC 2017 talks by Timothy Ford ("Overwatch Gameplay Architecture and Netcode") and Dan Reed ("Networking Scripted Weapons and Abilities in Overwatch"). (I hope someone here has Vault Access :D)

As usual with such complex systems, Overwatch seems to combine different approaches to hide latency. Among others, the ones I am most interested in are command frames and state synchronization. I think I understand both concepts, but I have problems putting them together.

1) Ford talks about the client being in the future by ~ half the RTT plus one command frame. This ensures that all commands sent from the client to the server tend to arrive when the server is actually at the tick referenced in the command. If that's correct, my assumption would be the following: 

  • Server is at tick 1000. 
  • RTT is ~ 10 ticks -> around 10*16.6 ms. 
  • Client simulates tick 1005 and sends command associated with tick 1005. 

<5 ticks later> 

  • Server is at tick 1005 and received a client command (maybe it's already queued in a buffer). 
  • Server applies the command and sends the state for tick 1006 (State_N + Input_N = State_N+1).
  • RTT might still be 10 ticks. 
  • Client simulates tick 1010.

<5 ticks later> 

  • Server is at tick 1010...
  • Client receives the state the server sent at tick 1005 and checks its internal buffers for prediction. 

Does that really apply? Does the client really simulate half the RTT in the future? 
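For reference, here's a rough sketch of the target-tick calculation I have in mind (the names are made up; the numbers are just the example values above):

// Sketch only: how far ahead of the server I think the client should simulate.
// Assumes a fixed 60 Hz tick (16.6 ms per tick).
const float TickDurationMs = 1000f / 60f;

int EstimateClientTargetTick(int lastKnownServerTick, float rttMs) {
  // Half the RTT covers the trip from client to server,
  // plus one command frame of headroom (as Ford describes it).
  int halfRttTicks = (int)Math.Ceiling((rttMs * 0.5f) / TickDurationMs);
  return lastKnownServerTick + halfRttTicks + 1;
}

// Example above: server tick 1000, RTT ~166 ms (~10 ticks)
// -> 1000 + 5 + 1 = 1006, i.e. roughly the 1005 used in the timeline.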
 

2) How do I handle ticks at the start of the game? My network layer requires a timestamp to work, and I'd use ticks in the main game loop. Do I need something like a grace period until the client can calculate how many ticks it needs to move into the future (by calling simulation.Tick(1/60f) that many times)?

3) If I run the simulation at 60 Hz and the network layer at, say, 20 Hz, do I send 60 inputs from client to server, or 20? 

I know this is somewhat similar to other questions in this forum, but I feel like this particular talk has never been discussed here. 

Cheers, 
poettlr

Quote
Does that really apply? Does the client really simulate half the RTT in the future?

Yes, this frequently happens, especially in action games.

Quote
How do I handle ticks at the start of the game?

Typically, the first connection message will tell you what tick the server was at, and you can snap your tick delta to that value plus some guess to get within the ballpark. A few more packets will then get the tick offset "dialed in" using whatever normal mechanism you use to synchronize tick/time offsets. If all of the servers are on a synchronized clock, this happens during login, so your tick is already synchronized when you start listing available games to join. If you use user-hosted servers, then you need to synchronize when you first get introduced to the user-hosted server, which can typically happen while you're loading the game level and doing other pre-game / lobby activities.
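For instance, a minimal sketch of that initial snap, assuming a hypothetical handshake message that carries the server's tick and an RTT estimate (names are made up):

// Sketch only; names are hypothetical.
ulong clientTick;

void OnFirstServerPacket(uint serverTickInPacket, float rttEstimateMs) {
  // Snap into the ballpark: server tick + half the RTT (in ticks) + one tick of headroom.
  uint halfRttTicks = (uint)(rttEstimateMs * 0.5f / 16.6f);
  clientTick = serverTickInPacket + halfRttTicks + 1;
  // Subsequent packets refine this with the normal tick-offset adjustment.
}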

Quote
Do I have 60 Inputs that I send from Client to Server or 20?

Typically 60.

Note that, with modern systems running at 120 Hz or 144 Hz, and Virtual Reality benefiting from up to 120 Hz update rate, you may want to re-think the "60 Hz is the base rate" common wisdom, and run your loop faster. USB has a latency of a single millisecond if you have really good mice/keyboards/gamepads.
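If your network layer only sends at 20 Hz, a common way to reconcile that with 60 inputs per second is to batch them: each packet carries the (typically three) input ticks generated since the last send, often plus any not-yet-acknowledged ones for redundancy. A sketch, with made-up names:

// Sketch only; InputCommand, SendToServer and the ack handling are placeholders.
// Requires System.Collections.Generic.
struct InputCommand { public ulong tick; public byte buttons; }

List<InputCommand> pendingInputs = new List<InputCommand>();

void OnSimulationTick(InputCommand cmd) {   // 60 times per second
  pendingInputs.Add(cmd);
}

void OnNetworkSend() {                      // 20 times per second
  SendToServer(pendingInputs);              // all inputs the server hasn't acked yet
  // Remove entries from pendingInputs once the server acknowledges them.
}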

 

enum Bool { True, False, FileNotFound };

Wow that's super helpful! 
I actually use dedicated servers but they are managed via a REST API. 

So for now, my last big problem is client/server time synchronization; after searching through the forum I have some plans for how I want to approach it. I plan to include the server time and server tick (at the time of sending the packet) in all state-sync packets, or to send the server time/tick at a given interval (say 5 Hz or less). Since my network layer already provides me with RTT, I think I have all the variables in place to do that. 
Probably something involving a Stopwatch.ElapsedMilliseconds since the last received packet on the client, or just using the RTT, since the game should run on a fixed loop anyway. 

Is there a list of possible frame/time sync algorithms or a paper to read into? 
 

Typically, you'll want to include the "current client tick" in each client-to-server message.

The server would record its tick as soon as it receives the message.

Then, when the server sends a message to the client, it will include the following information:

The last tick you sent was A. I received it at tick B. When I send you this message, it's now tick C.

The client can then use this information, plus the tick D at which it received the server update, to calculate an appropriate offset.
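In tick terms, that boils down to the classic NTP-style offset estimate. A sketch, with A/B/C/D as defined above:

// A = client tick stamped on the outgoing message
// B = server tick when that message arrived
// C = server tick when the server sent its reply
// D = client tick when the reply arrived
int EstimateTickOffset(int a, int b, int c, int d) {
  // (b - a) is the apparent offset on the way up, (c - d) on the way down;
  // averaging the two cancels the symmetric part of the latency.
  return ((b - a) + (c - d)) / 2;
}
// Positive = the server's tick is ahead of yours; the client then steers its own
// tick so that commands arrive just ahead of when the server needs them.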

Alternatively, the client can just include its tick, and the server just includes "you were X ticks behind/ahead in the last message you sent" in the response. Beware overlapping adjustments that will make the client offset gyrate wildly, though. You can typically get pretty good results with a simple algorithm:

  • Each time you're more than 32 ticks off, adjust by one-quarter the amount in the appropriate direction.
  • Each time you're more than 15 ticks off but less than 33, adjust by one-eighth the amount in the appropriate direction.
  • Each time you're late by 0 to 15 ticks, adjust backward by one tick.
  • Each time you're early by 1 to 7 ticks, leave it be. (Or some other value for "7" for how much jitter compensation you're prepared to accept.)
  • Each time you're early by 8 to 15 ticks, adjust forward by one tick.

This will converge somewhat slowly, but is tolerant to "overshoot" where the server will keep sending you "you're late" messages for a bit after you make an adjustment, because of the latency and overlapping messages involved. There are of course other more stateful mechanisms, using generation counters or packet serial numbers or such, in the algorithm/protocol.
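In code, that step schedule might look roughly like this (a sketch; the sign convention for the offset is an assumption, so map it to however your server reports "behind/ahead"):

// offset: ticks of error reported by the server; here, positive is assumed to mean
// "your commands arrived late", negative "they arrived early". Adjust signs to taste.
int ComputeTickAdjustment(int offset) {
  int magnitude = Math.Abs(offset);
  if (magnitude > 32) return Math.Sign(offset) * (magnitude / 4); // way off: quarter of the error
  if (magnitude > 15) return Math.Sign(offset) * (magnitude / 8); // off: eighth of the error
  if (offset >= 0)    return 1;   // late by 0..15 ticks: move one tick
  if (magnitude >= 8) return -1;  // early by 8..15 ticks: move one tick back
  return 0;                       // early by 1..7 ticks: within the jitter budget, leave it
}
// Applied once per server report, e.g. clientTick += adjustment (mind the signedness).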

enum Bool { True, False, FileNotFound };

 

8 hours ago, hplus0603 said:

Alternatively, the client can just include its tick, and the server just includes "you were X ticks behind/ahead in the last message you sent" in the response. Beware overlapping adjustments that will make the client offset gyrate wildly, though.

I just tried that in a simple Unity update loop, with two different flavors. 

a) Advance the simulation N times if needed, but don't advance (instead, adjust the tick backwards) if we are too far in the future.


ulong clientTick;

//not pretty
void Update() {
  // Negative offset = we are behind the server, positive = we are too far ahead.
  int offset = GetLastServerOffset();
  int ticksToSimulate = 0;
  if(offset < -32) {
    ticksToSimulate = (int) (Math.Abs(offset) * 1f/4f);
  } else if(offset < -15) {
    ticksToSimulate = (int) (Math.Abs(offset) * 1f/8f);
  } else if(offset < 0) {
    ticksToSimulate = 2;
  } else if(offset > 32) {
    clientTick -= (ulong) (offset * 1f/4f); //?
  } else if(offset > 15) {
    clientTick -= (ulong) (offset * 1f/8f); //?
  } else if(offset > 8) {
    clientTick -= 1; //?
  } else {
    //offset >= 0 && offset <= 8
    ticksToSimulate = 1;
  }
  if(ticksToSimulate > 0) {
    SimulateFor(ticksToSimulate); //runs that many fixed 16.6 ms simulation steps
  }
}

That... well, it did not work as expected. Setting clientTick back into the past is also probably not a good idea? 

b) Based on the offset, adjust the time between ticks without touching the actual tick delta for the simulation. If the client is too far behind, Simulate() is called more often but still uses a fixed delta of 16.6 ms internally. 


ulong clientTick;
float tickRate = 1/60f;
float adjustedTickRate = 1/60f;
double lastTickTime = 0;

//not pretty //Note: hacky Unity Update running at 300+ fps
void Update() {
  if(lastTickTime + adjustedTickRate <= Time.time) {
    // Negative offset = behind the server -> tick faster; positive = ahead -> tick slower.
    int offset = GetLastServerOffset();
    if(offset < -32) {
      adjustedTickRate = tickRate * 0.75f;
    } else if(offset < -15) {
      adjustedTickRate = tickRate * 0.875f;
    } else if(offset < 0) {
      adjustedTickRate = tickRate * 0.9375f;
    } else if(offset > 32) {
      adjustedTickRate = tickRate * 1.25f;
    } else if(offset > 15) {
      adjustedTickRate = tickRate * 1.125f;
    } else if(offset > 8) {
      adjustedTickRate = tickRate * 1.0625f;
    } else {
      adjustedTickRate = tickRate;
    }

    Simulate(); //one fixed 16.6 ms step, regardless of adjustedTickRate

    lastTickTime = Time.time;
  }
}

Do you have any pointers on how to adjust the client tick in a better way? Is this roughly what you had in mind? 
Both versions seem to need a fail-safe that snaps clientTick to roughly the server tick if it is too far off, and the second version (b) takes a very long time to adjust the client tick to approximately the server tick. Maybe mix both versions? 

Another thing bugging me: should I take the RTT into account and recalculate the offset on the client side before using it for adjustments? In a "hey, I already did that adjustment in the last frame, I should be fine" kind of way.

Cheers

Quote

Like a "hey I already did that adjustment in the last frame, I should be fine"-kinda way.

Yes, that's the "danger of adjusting too much because of overlapping outstanding messages" problem I talked about. There are multiple ways to implement that.

Another way would be to send your current tick and your current offset to the server, and the server would then send a recommended offset back. That would let you adjust without the risk of oscillation.

Or you can go all science on the problem and look at the NTP protocol (or PTP) although those protocols solve a slightly different problem. For games, all that matters is that events happen in the same order, and inputs arrive at the server ahead of the time they're needed, which means that clock skew in one direction is much worse than clock skew in the other direction.

enum Bool { True, False, FileNotFound };
On 5/8/2018 at 6:58 AM, poettlr said:

So for now my last big problem will be client/server time synchronization

If it's of any help, I just implemented this recently: https://pastebin.com/EaAK9Fce

After connecting, the client starts sending timestamped messages to the server periodically, which include its own clock value and a guess of the server's clock value. The server responds by sending the client's clock value back to them, along with the server's actual clock value, and the error between that and their guess.

The client can compare their own clock at the time of receipt to their clock value included in the packet to measure the RTT. They can use the server's clock value to compute a delta between the client/server clocks (allowing them to convert times on one clock to times on the other), and can use the "guess error" value to tell how well calibrated their last attempt was.

I keep collecting a lot of these deltas/errors, and then use some statistics to make an educated guess. Each time the client gets a new response, they push the computed delta and error values into some vectors. They then calculate the standard deviation and the median of each vector, and then calculate a 32% truncated mean (the average, but ignoring any values that are further than 1 std dev from the median)... They then report these means as the "correct" delta value to use when trying to convert between client/server clocks.

I also report a made-up "confidence" score, which is weighted by how many samples it's based on (more timing packets = more confidence), how many passed the filter (fewer outliers / tighter distribution = more confidence), and the mean error value reported by the server (server reports that our guesses are fairly close = more confidence). Once my confidence score reaches some arbitrary threshold, I stop sending these timing packets and assume that the clock is synchronized.

If, at any point during gameplay, the client realizes that a packet has arrived at the client before it was sent from the server, then obviously our clock synchronization was wrong (the alternative of faster-than-light internet is less likely...) so the age of such packets is immediately subtracted from the clock sync delta to get it back to a plausible level. So far this situation has only popped up on LAN where the pings can drop extremely low :)
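In rough C#, the filtering step is something like this (a simplified sketch of the idea, not the exact pastebin code):

// Requires System, System.Collections.Generic, System.Linq.
// Keep only samples within one standard deviation of the median, then average them
// (the ~32% truncated mean described above).
double TruncatedMeanDelta(List<double> deltas) {
  var sorted = new List<double>(deltas);
  sorted.Sort();
  double median = sorted[sorted.Count / 2];

  double mean = deltas.Average();
  double stdDev = Math.Sqrt(deltas.Average(d => (d - mean) * (d - mean)));

  var kept = deltas.Where(d => Math.Abs(d - median) <= stdDev).ToList();
  return kept.Count > 0 ? kept.Average() : median;
}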

Quote
Once my confidence score reaches some arbitrary threshold, I stop sending these timing packets and assume that the clock is synchronized.

Thanks for sharing!

You might want to keep the timing information in your general packet headers, to be able to adjust to changing network conditions (both lower and higher pings, as well as changing variance) on the fly, without having to see an "impossible" event. I imagine you can actually do this even for the initial set-up -- if timing is part of the packet headers, you don't need to explicitly send separate timing messages.
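For example, a hypothetical header layout (field names made up) that carries enough to do the tick math on every packet:

// Sketch only: timing fields piggybacked on every payload packet.
struct PacketHeader {
  public ushort sequence;        // serial number of this packet
  public uint   senderTick;      // sender's tick when this packet was sent
  public ushort ackSequence;     // newest packet seen from the other side
  public uint   tickOnReceipt;   // our tick when that acked packet arrived
}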

 

enum Bool { True, False, FileNotFound };
10 hours ago, hplus0603 said:

You might want to keep the timing information in your general packet headers, to be able to adjust to changing network conditions

Yeah I guess that if you're sending ~1KB sized packets, a few bytes of timer info is a small overhead. If my syncing code works, the goal is to synchronise the two clocks globally, regardless of ping, etc. i.e. If you photograph the client/server screens simultaneously, you would hopefully see the exact same clock value. So if my code works initially, varying network conditions don't affect it :)

However, my code only seems to achieve sync within a few dozen milliseconds, not complete accuracy... which is enough error for time-travelling events to be noticed on LAN and to require corrections to the clock sync later :)

In practice, you'll want sync to be such that the packet from the client arrives at the server slightly ahead of when it's needed. The "slightly" value should typically be about three standard deviations of your jitter, so 99.7% of all packets arrive ahead of when they're needed.
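A sketch of turning that into a lead time, assuming you track the mean one-way latency and the jitter's standard deviation (names made up):

// Sketch only; meanOneWayMs and jitterStdDevMs come from your own measurements.
int TargetLeadTicks(float meanOneWayMs, float jitterStdDevMs) {
  const float tickMs = 1000f / 60f;
  // Mean travel time plus three standard deviations of jitter means
  // roughly 99.7% of command packets arrive before the server needs them.
  float leadMs = meanOneWayMs + 3f * jitterStdDevMs;
  return (int)Math.Ceiling(leadMs / tickMs) + 1; // plus one command frame of headroom
}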

enum Bool { True, False, FileNotFound };
