
Command Frames and Tick Synchronization


Ok, wow, now I'm confused, sorry. 

I don't have a function like time-to-tick. 

As soon as my server starts, it ticks and increments the local tick number every 16.6 ms. 
When a client starts, it executes the following loop:

public class GameLoop : MonoBehaviour { 
  private const float TickRate = 60f;
  private const float NetRate = 1f / TickRate;
  private const float SlowRate = 1f / (TickRate - 3f);
  private const float FastRate = 1f / (TickRate + 3f);
  private const int MinimumJitterBuffer = 2;
  private float netTimer;
  private float adjustedRate = NetRate;
  private int addedJitterBuffer;
  private int tick;
  private int offset;

  public void Update() {
    netTimer += UnityEngine.Time.deltaTime;
    while (netTimer >= adjustedRate) {
      netTimer -= adjustedRate;
      int lastServerTick = Client.Instance.NetworkInfo.LastPacketServerTick;
      float rttMillis = Client.Instance.NetworkInfo.RttMillis;
      bool loss = Client.Instance.NetworkInfo.HadLoss;
      AdjustTickRate(lastServerTick, loss, rttMillis);
      //Note: regardless of adjustedRate we always use a simulation delta time of 16.6ms.
      //Therefore, if adjustedRate is faster, more simulation steps happen per second.
      tickerFunction.Invoke(systemHelper.GetTimeSinceStartup(), tick++);
    }
  }

  private void AdjustTickRate(int lastKnownServerTick, bool loss, float rttMillis) {
    //Convert RTT from milliseconds to ticks; NetRate * 1000 is the tick length in ms.
    int rttAsTick = Mathf.CeilToInt(rttMillis / (NetRate * 1000f));
    if (loss && addedJitterBuffer < rttAsTick + 10) 
      addedJitterBuffer += 2;
    //The last received offset from the server. An offset of 0 means the server
    //received the client message associated with a tick at the frame it was needed.
    //Note: the server should always run behind the client.
    offset = Client.Instance.NetworkInfo.ClientOffset; 
    //For a new connection the server tick will be higher. The client has to snap.
    if (offset < -30 || offset > 30) {
      tick = lastKnownServerTick + rttAsTick + MinimumJitterBuffer + addedJitterBuffer; 
      adjustedRate = NetRate; 
    } else if (offset < -2 - MinimumJitterBuffer - addedJitterBuffer) {
      //The client runs too far in the future and should be dialed back a little.
      adjustedRate = SlowRate; 
    } else if (offset > -1 - MinimumJitterBuffer - addedJitterBuffer) {
      //The client falls behind the server and should run faster.
      adjustedRate = FastRate;  
    } else { 
      adjustedRate = NetRate;
    }
    if (addedJitterBuffer > 0)
      addedJitterBuffer--; //decay the extra jitter buffer over time
  }
}
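As a sanity check on the RTT-to-tick conversion used above, here is a standalone sketch (TickMath and RttToTicks are illustrative names, not from my code):

```csharp
using System;

public static class TickMath {
    // Convert an RTT in milliseconds to whole ticks, given the tick rate in Hz.
    // 1000 / tickRate is the length of one tick in milliseconds.
    public static int RttToTicks(float rttMillis, float tickRate) {
        return (int)Math.Ceiling(rttMillis / (1000f / tickRate));
    }
}
```

At 60 Hz, a 166 ms RTT comes out as ceil(166 / 16.6) = 10 ticks, which matches the numbers used later in this thread.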

The "TickFunction" invoked there is basically the following: 

public void TickFunction(float time, int tick) {
  //runs one fixed 16.6ms simulation step for the given tick (body omitted)
}
Both the messages from my server and the messages from my client include the current tick: server messages carry the server tick plus the client's offset, and client messages carry the tick the message should be executed at (which has to be a greater tick number than the server tick). Is that what you mean? 
The minimum information my server has to send to the client (at the moment) is the current server tick, the client-to-server offset, and a flag indicating the offset was definitely too small (I know I can remove the flag, but I also send it when the offset is 0). The offset is simply calculated via int offset = (int) (tick - inputCollector.GetLastReceivedClientTick(id));

The overall problem I observe is that the client cannot guarantee delivery, so my server has to work with duplicated input (I send at least the last three inputs my client has sampled).

Again, I'm really sorry that I can't follow you; it might be an issue with me not fully understanding all the multiplayer nomenclature.

That being said, I really appreciate all the help I can get, thanks! 

Edited by poettlr


To provide smooth animation to the user, you need to establish a clock base, and then calculate "what should the current tick be" based on the local clock time! The messages between client and server determine what the relationship between "current time" and "current tick" is, but you should always calculate "current tick" as something like:

double baseTime = secondsAsDouble();
int baseTick = 0;

int current_tick() {
  double now = secondsAsDouble();
  int tick = (int)floor((now - baseTime) * 60 + baseTick);
  return tick;
}
The main role of server/client tick sync is to adjust the value of "baseTick".
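A minimal sketch of that idea, assuming the server reports an offset in ticks (ClockSync, Adjust, and the snap threshold are illustrative names and values, not from this thread):

```csharp
using System;

public static class ClockSync {
    public static double baseTime; // local clock (seconds) at which baseTick was set
    public static int baseTick;

    // "What should the current tick be" at 60 Hz, derived from the local clock.
    public static int CurrentTick(double nowSeconds) {
        return (int)Math.Floor((nowSeconds - baseTime) * 60.0) + baseTick;
    }

    // serverOffsetTicks > 0 means the client is behind where the server wants it.
    public static void Adjust(int serverOffsetTicks, double nowSeconds) {
        if (serverOffsetTicks > 30 || serverOffsetTicks < -30) {
            // Way out of sync (e.g. at start-up): snap the clock base.
            baseTick = CurrentTick(nowSeconds) + serverOffsetTicks;
            baseTime = nowSeconds;
        } else if (serverOffsetTicks != 0) {
            // Small drift: nudge the base by one tick at a time.
            baseTick += Math.Sign(serverOffsetTicks);
        }
    }
}
```

Note that the tick rate never changes here; only the mapping from local time to tick number is adjusted.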


But that would mean that I simulate ticks twice or not at all. That would not really help with my order of events, would it?

I was fairly confident in my system but now I have no idea if it would work. 

To be honest, all I want to achieve is a 60 fps simulation where the client runs ahead of the server. 
I am so confused now.

Let me try to describe my current problem.

The server has the authority and uses a simple fixed game loop to ensure my simulation runs at 60 Hertz. 

double newTime = stopwatch.ElapsedMilliseconds / 1000.0;
double frameTime = newTime - currentTime;
currentTime = newTime;
accumulator += frameTime;
while (accumulator >= dt) {
  accumulator -= dt;
  t += dt;
  Simulation(t);
}

Each time I call Simulation() I 
* Receive Messages from the Client
* Do calculations for my simulation using a fixed delta time of 16.6ms
* Send Messages to the Client
* Increment a tick number. 

Therefore my server Simulation function looks roughly like this: 

uint tick = 0;
void Simulation(double time) {
  gameSimulation.Simulate(time, tick);
  tick++;
}

This function is executed regardless of the number of clients, and as soon as the server has started, tick starts to increment. 

So far so good. 

On my Unity client, I can't guarantee a smooth execution at 60 Hertz because Update and FixedUpdate are not controlled by my call structure, but for now I can use the Update function (which Unity calls at approximately 120 Hertz).

Basically it's the same as the server, and I use the Update function from my previous post: 

private const float TickRate = 1/60f; 
private float timer = 0f;

public void Update() {
  timer += UnityEngine.Time.deltaTime;
  while (timer >= TickRate) {
    timer -= TickRate;
    Client_Simulation(UnityEngine.Time.time);
  }
}

The Client_Simulation function is basically the same as the one on the server: 

uint tick = 0;
void Simulation(double time) {
  gameSimulation.Simulate(time, tick);
  tick++;
}

But from what I understand, somewhere in this function I have to handle the client tick in a very specific way, different from what I have been doing. 
Honestly, I'm lost. Can you please point me in the direction I have to go to achieve "a client tick that is a bit ahead"? 

Help me, hplus0603. You're my only hope. 



The server Receive() and Send() functions should use 'tick' not 'time' as their time keepers. Same thing for client -- you send and receive based on tick number, not time.

You should probably Receive() even if it's not "time" yet -- the idle loop for a server is typically waiting on either of "some data coming in" or "it being time for the next simulation step."

The client typically does the same thing. Something like:

Wait until it's time for the next tick (if the machine is fast) while receiving server updates.
Simulate a tick until caught up with time.
Render a frame.

With vsync, you will typically end up blocking on "render a frame" and not having to sleep much if any between the simulations.



The thing with time in the Send and Receive functions is that my network library uses the time to calculate RTT and bandwidth and to check for keep-alive messages. To be honest, Receive is actually called Tick(double time) in my case. When calling the Tick function, all received messages (from the network thread) are pulled and fed to the simulation. It's basically the housekeeping. 
That's how the underlying network layer forwards my packets:

while (transport.HasNextPacket()) {
  Datagram packet = transport.GetNextPacket();
  ProcessPacket(packet.payload, packet.payloadSize, packet.sender);
}

So basically I receive on a thread via my UDP socket context: 

public class UdpSocketContext : ISocketContext {
  private readonly Socket internalSocket;
  private Thread socketThread;
  //some variables omitted

  public UdpSocketContext(AddressFamily addressFamily) {
    internalSocket = new Socket(addressFamily, SocketType.Dgram, ProtocolType.Udp);
  }

  public void Bind(EndPoint endpoint) {
    internalSocket.Bind(endpoint);
    socketThread = new Thread(RunSocket);
    socketThread.Start();
  }

  private void RunSocket() {
    while (true) {
      try {
        //blocking receive; packets are queued for the main thread (omitted)
      }
      catch (Exception e) {
        if (e is SocketException) {
          var socketException = e as SocketException;
          if (socketException.SocketErrorCode == SocketError.ConnectionReset) continue;
        }
      }
    }
  }
}

So, all that being said: my tick number is mostly application agnostic, and if I have messages that need to run at a specific tick, I include the tick number in that message.

A simple example of a message with a tick number is my Input Message or the minimum GamePlayStateMessage: 

public class InputMessage : Message {
  public int clientTick;
  public Input input;

  public InputMessage(Input input, int clientTick): base(MessageType.Input) {
    this.input = input;
    this.clientTick = clientTick;
  }
  //Serialization omitted
}

public class SimpleGamePlayStateMessage : Message { 
  public int clientToServerOffset;

  public SimpleGamePlayStateMessage(int clientToServerOffset): base(MessageType.ServerStatus) {
    this.clientToServerOffset = clientToServerOffset;
  }
  //Serialization omitted
}

My Entity Component System uses a fixed delta time each tick (regardless of the actual time needed to simulate). 

void Simulate(double time, int tick) {
  ecs.ReplaceGlobalTime(time, tick, 1/60f);
}

public void ReplaceGlobalTime(double time, int tick, float delta) {
  //stores time, tick, and the fixed delta for all systems (body omitted)
}

So with all that, if I start my client and server (using no time sync at all) I had the following scenario: 

  1. Booting up the server 
  2. The server starts to run and increment its tick number. 
  3. Some seconds later, I boot up a client.
  4. The client starts to run and increment its tick number effectively being behind by seconds * 60 ticks.
  5. The client sends Input Messages to the server.
  6. The server receives but ignores the Input Messages because they are too old. 

Currently my old code from ...

... works pretty well. But it seems there is a major flaw with network jitter (as I said). You then pointed out the animation flow issue, I guess because I alter the tick rate of the whole simulation: sometimes it executes 60 steps of 16.6 ms per second, sometimes 64, and sometimes 56, all based on the current adjusted rate but still executed in a timeframe of roughly one second. 

So what you suggest, all in all (even though I don't know if I understand it correctly), is to find a way of altering the executed tick without altering the tick rate? How would I achieve that without losing a certain number of ticks? Or do I really execute all the ticks in between? 

For example, if my server is at tick 1000 and receives a message from the client that is actually for tick 995, my client needs to run at least 5 extra ticks to be ahead of the server again (to be safe, probably 6 ticks).
So the server response for tick 1000 would be clientToServerOffset = 5 (the 5 ticks the client lags behind). 
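To make the catch-up idea concrete, here is a sketch of how many fixed steps the client could run on a given frame without touching the tick rate (the names and the safety margin are mine, not from my actual code):

```csharp
using System;

public static class CatchUp {
    // clientToServerOffset > 0 means the client's inputs arrived that many
    // ticks late at the server; run that many extra fixed 16.6ms steps
    // (plus a safety margin) so the client gets ahead of the server again.
    public static int StepsThisFrame(int clientToServerOffset, int safetyTicks) {
        int extra = clientToServerOffset > 0 ? clientToServerOffset + safetyTicks : 0;
        return 1 + extra; // always run the regular step for this frame
    }
}
```

With the example above (offset 5 and one safety tick), the client would simulate 1 + 5 + 1 = 7 fixed steps on that frame, each with the full 16.6 ms delta.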

If that's correct, ok, but do I then save the history of all those simulation steps as before? Do I send an input message for each tick?
Wouldn't it have the same effect on the animation? 

So many questions :O, sorry




do I really execute all ticks in between?

When you jump, yes, you'll typically either "freeze" for a little bit, or "simulate faster" for a render frame or two. The user experience will jump. That should happen approximately once, during start-up, so that shouldn't matter.

Typically, you will send the input state at the beginning of each simulated tick, just the same way your simulation/physics gets to see it when it simulates. If you send network packets at 30 Hz, but simulate at 240 Hz, that means you send input for 8 ticks per network packet, plus whatever backlog you repeat if you want packet drop redundancy.
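A sketch of that packet-drop redundancy, where each outgoing packet repeats the most recent inputs (TickInput and the helper are illustrative, not a real API):

```csharp
using System;
using System.Collections.Generic;

public struct TickInput {
    public int tick;     // the tick this input applies to
    public byte buttons; // packed button state
}

public static class InputRedundancy {
    // Take the last 'redundancy' entries of the input history, so one lost
    // datagram does not lose a tick of input at the server.
    public static List<TickInput> BuildPacket(List<TickInput> history, int redundancy) {
        int count = Math.Min(redundancy, history.Count);
        return history.GetRange(history.Count - count, count);
    }
}
```

The server then applies only the entries it has not already seen, keyed by tick number.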


Ok so, I can keep the offset, and whenever I notice I'm running too slow on the client, I execute two ticks instead of one. Is that what you mean? Or should I alter the tick rate for a couple of milliseconds like I do at the moment? 
Also, when I notice the client-to-server offset is too high (meaning the client is way too far in the future), do I freeze and not advance my simulation at all? 

At the moment I have a 60 Hz network tick rate and a 60 Hz simulation tick rate.



whenever I notice I'm running too slow on the client I execute two ticks instead of one

Yup! And the main reason for "running slow" on the client would be that the latency to the server increased. This will typically happen if the user is on some kind of mobile internet and moving around, so it's not super common.

when I notice the client to server offset is too high (meaning the client is way too far in the future) I freeze and do not advance my simulation at all?

Yes! And when you're too far out of sync, you may need to "snap" the current time offset, and then figure out how to recover the correct new state of objects being simulated. The "really large snap" case will typically only happen during game start-up, or when a computer is having trouble like hibernate/sleep, suddenly paging to disk, or other extreme timing events.


Hey guys, 
so I was able to solve clock sync in a way I'm happy with, and I'm now facing a new problem. 

Due to client prediction, the client runs ahead by approx half the RTT. Therefore, if the RTT is 166 ms (10 ticks), the client would be ahead approx 5 ticks relative to the server. 
At the exact same real-time the client would process tick 20 and the server would process tick 15. 

Due to interpolation, remote entities are behind by RTT + interpolation delay. 

Therefore, at the exact same real-time my client would process tick 20, the server would process tick 15, and my remote entities are at tick 9 (RTT + 1 tick of interpolation delay, which is very low?). 

At the given server tick X, my local client would be at tick ~(X + RTT / 2) and the remote entities on my local client would be in tick ~(X - RTT / 2 + interpolation delay) since that is the most recent server state I could have received. 
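Plugging the numbers from above into that relationship (pure arithmetic, just restating the example):

```csharp
int serverTick = 15;
int rttTicks = 10;          // 166 ms RTT at 60 Hz
int interpolationDelay = 1; // one tick of interpolation delay

// The locally predicted entity runs ahead by ~RTT/2.
int localClientTick = serverTick + rttTicks / 2;

// Remote entities are interpolated ~RTT/2 + delay behind the server.
int remoteEntityTick = serverTick - rttTicks / 2 - interpolationDelay;
```

This reproduces the gap described above: the local client at tick 20 collides against remote state from tick 9, roughly RTT plus the interpolation delay in the past.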

a) Is that correct? 

Now, I want my local entity to be able to collide with my remote entities and still be able to predict them (as well as possible). 

b) Does my local client just collide with the remote state I received? If so, my predicted state could be wrong as soon as it arrives at the server, because remote entities could have moved without me knowing (since they run ahead of the server on their machines). When the server receives my input, it checks collision against the current server state, which is not equal to the state my client used for its predictions. 
My local client used old but valid state, while the server uses the most recent state (which is ~(RTT/2) ticks ahead of the one my local client predicted against) and broadcasts my current position to all other clients. 

c) Should there happen something else? If so what? 


PS: I hope all those questions and answers also help other people. 

