Synchronizing server and client time

33 comments, last by FredrikHolmstr 12 years, 8 months ago
I know this has been debated a lot on the forums here, and I have searched and probably read 25+ threads, so I've done my research. I have a couple of fairly simple questions that I would like answered if possible.

Question 1: The standard way of synchronizing time between client and server is something like this:
  1. Client sends timestamp to server
  2. Server responds with the client's timestamp + its own timestamp
  3. Client does something like this when it gets the server response: localTime = serverTime + ((currentTime - timestampSentToServer) / 2)
  4. This gives the client a timer that is a reasonable approximation of the server's; steps 1-3 can be repeated one or more times to create a more accurate estimate

Is my reasoning correct here?
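
For concreteness, here's a rough Python sketch of the exchange I mean (the message format and the names are made up for illustration):

import time

# --- Client side ---

def send_sync_request(send_message):
    # Step 1: record and send the client's current timestamp.
    t_sent = time.monotonic()
    send_message({"type": "sync", "client_time": t_sent})
    return t_sent

def handle_sync_response(msg):
    # Step 3: estimate the server clock from the reply, assuming the
    # one-way trip took roughly half of the measured round trip.
    now = time.monotonic()
    rtt = now - msg["client_time"]
    estimated_server_time = msg["server_time"] + rtt / 2.0
    return estimated_server_time, rtt

# --- Server side ---

def handle_sync_request(msg, send_message, server_time):
    # Step 2: echo the client's timestamp together with the server's own.
    send_message({"type": "sync_reply",
                  "client_time": msg["client_time"],
                  "server_time": server_time()})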


Question 2: You have two options: either you stick to your guns with the initial clock sync (as described in question 1) for the entire game, or you update it at set intervals (every packet, every 20 seconds, whatever) and adjust accordingly. Is there a preferred approach here (stick to your guns, or update as you go along), and what are the benefits of one or the other? And if you update the timer as you go along, how do you deal with times where the client might end up way off (either in the past or the future)?
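
To make the "update as you go along" option concrete, this is the kind of thing I have in mind - a rough sketch with invented names, where each new estimate only nudges the clock instead of snapping it:

class SyncedClock:
    # offset = estimated server time minus local time
    def __init__(self, initial_offset):
        self.offset = initial_offset

    def on_new_estimate(self, measured_offset, blend=0.1):
        # Blend a fraction of the way toward the new measurement so a
        # single bad sample (way in the past or future) can't yank the clock.
        self.offset += (measured_offset - self.offset) * blend

    def server_time(self, local_time):
        return local_time + self.offset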


Question 3: I think this is my most important question (as I think I've got 1 and 2 grokked in my head). The two previous questions talked about syncing using time, but the concept of time isn't really important, is it? What you're really looking for is how many ticks behind the client is, isn't it? Both the server and the client run at the same tick rate, 15ms (66.66Hz), and they both hold a tick counter that keeps track of the current tick they're on. Wouldn't this scheme be possible:

  1. The client connects
  2. Server sends its current tick count to the client
  3. Client adopts the server's tick count, incrementing it for each tick that passes

Now when the client sends a command to the server, it attaches its tick count to that message, so the server knows when in time (according to the client) the command was generated and things like lag compensation can be employed. I suppose my question is pretty simple: is there a reason the time/clock is used instead of just what really matters to the game, the number of ticks passed?
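
In code, the tick-based scheme I'm imagining would look roughly like this (a sketch, names invented):

TICKS_PER_SECOND = 66.66   # both sides simulate at ~15 ms per tick

class ClientSimulation:
    def __init__(self, server_tick_at_connect):
        # Steps 2/3: adopt the server's tick count on connect...
        self.tick = server_tick_at_connect

    def step(self):
        # ...and increment it once per fixed simulation step.
        self.tick += 1

    def make_command(self, action):
        # Attach the current tick so the server knows when, in ticks,
        # this command was generated on the client.
        return {"action": action, "tick": self.tick}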


Thanks again for all the help this forum has given me!
Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.
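For example, if a nominally fixed 15 ms tick actually averages 15.2 ms because of scheduling jitter, a pure tick counter falls behind real time by about 0.2 ms per tick, roughly 0.8 seconds after only a minute at ~66 Hz, whereas reading a clock each frame always reflects the time that actually elapsed.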

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]




Of course, it's so obvious! Thanks man :)
So, the "best solution" in my eyes seems to be this:

  1. Every world state the server sends (every 45ms, 22.22Hz) has the server's current game time attached
  2. The client sets its local gametime clock like this: gametime = serverGameTime + (averageRoundtrip / 2)
  3. Every tick in which the client doesn't get an update from the server (every 15ms, 66.66Hz) it increments its local gametime by 15ms (the tick time)
  4. The server also keeps track of the last gametime that was sent to each client
  5. When the client issues a command such as "FORWARD" it attaches its local, estimated gametime: lastReceivedServerTime + (averageRoundtrip / 2) + (currentTime - timeOfLastServerUpdate)
  6. When the server receives a command from a client it verifies that the attached gametime falls within a valid range, say +/- (100 + averageRoundTrip / 2) ms of the current time; if it falls outside, the command is discarded, otherwise it gets snapped to the nearest tick boundary and put in the queue of commands to be processed.

Is this scheme plausible? Good? bad?
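
Roughly how I picture steps 5 and 6 in code (a sketch; the names and constants are mine):

TICK = 0.015   # 15 ms tick

# Step 5, client side: estimate the current server game time for a command.
def estimate_game_time(last_server_time, average_rtt, local_now, time_of_last_update):
    return last_server_time + average_rtt / 2.0 + (local_now - time_of_last_update)

# Step 6, server side: validate the attached time, then snap it to a tick.
def accept_command(command_time, server_time, average_rtt):
    window = 0.100 + average_rtt / 2.0
    if abs(command_time - server_time) > window:
        return None                           # outside the valid range: discard
    return round(command_time / TICK) * TICK  # snap to the nearest tick boundary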
OK, I think I've got something that works reasonably well in code now; one last itching question:

How do you deal with the fact that sometimes your client will jump ahead of the server? Occasionally, maybe once every 10 updates or so, the client time will be further into the future than the server time I received. How do I deal with this? Do I adjust the multiplier on the RTT, say take 0.3 of the RTT instead of 0.5 when adding? Do I skew the whole time back/forward with some small multiplier like 0.000001? The most logical thing to me would be to add less than half the RTT, since that's what's giving me the wrong offset (probably server->client is faster than client->server), for example doing RTT/2 - (abs(negativeDiff)/2).

And what should I do with commands that are issued while the client is in "future" time? Should I snap them back to the correct time, or send them with the future timestamp?
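
What I'm currently leaning toward, as a sketch (the names and the 5 ms step are mine): skew the clock toward each new measurement in small, bounded steps instead of snapping, and clamp command timestamps so they never claim a time ahead of anything the server has actually told me:

MAX_ADJUST = 0.005   # never move the clock more than 5 ms per correction

def adjust_offset(current_offset, measured_offset):
    # Skew toward the measurement in small steps so the local game time
    # never visibly jumps forward or backward.
    delta = measured_offset - current_offset
    delta = max(-MAX_ADJUST, min(MAX_ADJUST, delta))
    return current_offset + delta

def stamp_command(estimated_server_time, last_received_server_time):
    # If my estimate has drifted ahead, never send a timestamp beyond
    # the newest server time I've actually seen.
    return min(estimated_server_time, last_received_server_time)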
OK, this is becoming a lot, but I figured I might as well ask while it's in my head: why are we "required" to add the RTT/2 to the server time we get back from the server on the client? Can't we just keep the server time we get from the server and increase it as we go along, only syncing it once in the beginning, considering that all the updates we get from the server will also be RTT/2 behind? Even if our RTT changes considerably - say we start with a 100ms RTT and later swing down to 15ms or up to 500ms - the time will still be in "sync" (as in, it will be behind by as much as it has always been, even though updates will take longer to send/receive)? Maybe I'm confusing myself.
This is becoming a monologue, but I hope someone has the time to look at my questions. My main problem now is dealing with negative numbers, which seem to happen quite a lot for me. Here's a printout of the log I create when I update the client's gametime with server gametime + (rtt/2), showing how much the previous time was off by (negative: we're ahead, positive: we're behind):



diff: 0.007507324
diff: 0.007507324
diff: -0.01499939
diff: -0.007507324
diff: 0.02252197
diff: -0.007507324
diff: 0.007507324
diff: -0.02250671
diff: 0.007522583
diff: 0.02249146
diff: -0.02999878
diff: 0.02250671
diff: -0.01498413
diff: 0.007492065
diff: -0.02249146
diff: 0.02249146
diff: -0.007492065
diff: 0.01501465
diff: -0.007507324


As you can see, negative times are very common for me; however, I have been getting pretty good offsets, only about a 5-25ms diff. Is this the way it's supposed to be, with a lot of negative "in the future" values, or am I doing something wrong?
Your distribution looks to be about 50%, so I'd say it's pretty much to be expected. Any form of round-trip compensation is going to introduce some inaccuracy, but seeing that inaccuracy evenly mixed between too-fast and too-slow is more or less what you want.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]




Thanks! So last one: What do I actually DO (code/algorithm wise) when I think I'm in the future? Do I snap to 0.0 offset, or do I keep going like nothing happened - since I know I'm never actually IN the future?
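
To be concrete, the two options as I see them, as a sketch (names are mine):

def reconcile(local_game_time, estimated_server_time, snap_when_ahead):
    # estimated_server_time = server time from the update + averageRtt / 2
    if local_game_time > estimated_server_time and not snap_when_ahead:
        # Keep going as if nothing happened, treating the "future" reading
        # as measurement noise from the RTT/2 guess.
        return local_game_time
    # Otherwise (and in the normal case) adopt the new estimate.
    return estimated_server_time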

Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.


I was wondering something... Assuming we are talking about time variance, not hardware saturation, how does the time variance matter exactly? Can't you simply do a (tickCount * tickDuration) formula to determine the time of a given frame and ignore the variance, which gets reset every frame? It's not as if the variance would stack up and make the game drift.
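
Concretely, the computation I mean is just this (sketch):

TICK_DURATION = 0.015   # nominal 15 ms tick

def frame_time(tick_count):
    # Derive the frame's time purely from how many ticks have run,
    # ignoring how long each individual tick actually took.
    return tick_count * TICK_DURATION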

