Synchronizing server and client time


34 replies to this topic

#1 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 03:26 AM

I know this has been debated a lot on the forums here, and I have searched and probably read 25+ threads, so I've done my research. I have a couple of pretty simple questions that I would like answered if possible.

Question 1: The standard way of synchronizing time between client and server is something like this:
  • Client sends a timestamp to the server
  • Server responds with the client's timestamp plus its own timestamp
  • When the client gets the server's response, it does something like: localTime = serverTime + ((currentTime - timestampSentToServer) / 2)
  • This gives the client a timer that is approximately in sync with the server's; the last step can be repeated one or many times to get a more accurate estimate
Is my reasoning correct here?
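The steps above can be sketched as follows (a minimal sketch; the function and variable names are my own, and it assumes the one-way latency is exactly half the measured round trip):

```python
def estimate_server_time(send_timestamp, server_timestamp, receive_timestamp):
    """Estimate the server clock at the moment the reply arrived.

    send_timestamp: local clock when the request went out.
    server_timestamp: the server's clock as stamped in its reply.
    receive_timestamp: local clock when the reply came back.
    """
    rtt = receive_timestamp - send_timestamp
    # Assume the reply spent half the round trip in flight.
    return server_timestamp + rtt / 2.0

# Made-up numbers: request sent at local t=100.0, reply received at
# t=100.1, server stamped 500.0 when it replied.
estimated = estimate_server_time(100.0, 500.0, 100.1)
```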


Question 2: You have two options: either you stick to your guns with the initial clock sync (as described in question 1) for the entire game, or you update it at set intervals (every packet, every 20 seconds, whatever) and adjust accordingly. Is there a preferred approach here (stick to your guns, or update as you go along), and what are the benefits of one or the other? And if you update the timer as you go along, how do you deal with times where the client might end up way off (either in the past or the future)?


Question 3: I think this is my most important question (as I think I've got 1 and 2 grokked in my head): the two previous questions talked about syncing using time, but the concept of time isn't really important, is it? What you're really looking for is how many ticks behind the client is, isn't it? Both the server and the client run at the same tick rate, 15 ms (66.66 Hz), and they both hold a tick counter that keeps track of the current tick they're on. Wouldn't this scheme be possible:

  • The client connects
  • Server sends its current tick count to the client
  • Client adopts the server's tick count, incrementing it for each tick that passes
Now when the client sends a command to the server, it attaches its tick count to that message so the server knows when in time (according to the client) this command was generated, so things like lag compensation, etc. can be employed. I suppose my question is pretty simple: is there a reason the time/clock is used instead of just what really matters to the game: the number of ticks passed?


Thanks again for all the help this forum has given me!


#2 ApochPiQ   Moderators   -  Reputation: 16414


Posted 25 August 2011 - 03:41 AM

Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.

#3 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 04:39 AM

Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.


Of course, it's so obvious! Thanks man :)

#4 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 07:01 AM

So, the "best solution" in my eyes seems to be this:

  • With every world state (every 45 ms, 22.22 Hz) that is sent from the server, the current server game time is attached
  • The client sets its local gametime clock like this: gametime = server game time + (average roundtrip / 2)
  • Every tick the client doesn't get an update from the server (every 15 ms, 66.66 Hz), it increments its local gametime by 15 ms (the tick time)
  • The server also keeps track of the last gametime that was sent to each client
  • When the client issues a command such as "FORWARD" it attaches its local, estimated gametime: lastReceivedServerTime + (averageRoundtrip/2) + (currentTime - timeSinceLastServerUpdate)
  • When the server receives a command from a client it verifies that the attached gametime falls within a valid range, say +/- (100 + averageRoundtrip/2) ms of the current time; if it falls outside, the command is discarded, otherwise it gets snapped to the nearest tick size and put in the queue of commands to be processed.
Is this scheme plausible? Good? bad?
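The server-side check in the last bullet could look something like this (an illustrative sketch, not a definitive implementation; the 15 ms tick and the +/- (100 ms + rtt/2) window come from the scheme above, and all names are my own):

```python
TICK = 0.015  # 15 ms tick, matching the rates discussed above

def validate_command_time(attached_time, server_time, avg_rtt):
    """Accept a command only if its attached gametime is plausible,
    then snap it to the nearest tick. Returns the snapped time, or
    None if the command should be discarded."""
    window = 0.100 + avg_rtt / 2.0
    if abs(attached_time - server_time) > window:
        return None  # timestamp implausibly far from server time
    # snap to the nearest tick boundary
    return round(attached_time / TICK) * TICK
```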

#5 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 07:39 AM

Ok, I think I've got something that works reasonably well in code now; one last itching question:

How do you deal with the fact that sometimes your client will jump ahead of the server? Occasionally, maybe once every 10 updates or so, the client time will be further into the future than the server time I received. How do I deal with this? Do I adjust the multiplier of the RTT, say taking 0.3 instead of 0.5 of the RTT before adding it? Do I skew the whole time back/forward with some small multiplier like 0.000001? The most logical thing to me would be to take less than half of the RTT, since that's what's giving me the wrong offset (probably server->client is faster than client->server), for example doing RTT/2 - (abs(negativeDiff)/2).

What should I do with commands that are issued while the client is in "future" time? Should I snap them to the correct time, or send them with the future timestamp?

#6 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 07:59 AM

Ok, this is becoming a lot, but I figured I might as well ask while it's in my head: why are we "required" to add the RTT/2 time to the server time we get back from the server on the client? Can't we just keep the server time we get from the server and increase it as we go along, syncing it just once in the beginning, considering all the updates we get from the server will also be RTT/2 behind? Even if our RTT changes considerably, say we start with a 100 ms RTT but swing down to 15 ms or up to 500 ms, the time will still be in "sync" (as in it will be behind by as much as it always has been, even though updates will take longer to send/receive)? Maybe I'm confusing myself.

#7 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 10:37 AM

This is becoming a monologue here, but I hope someone has the time to look at my questions. My main problem now is dealing with negative numbers, which seem to happen quite a lot for me. Here's a printout of the log I create when I update the client's gametime with server gametime + (rtt/2), showing how much the previous time was off by (negative: we're ahead, positive: we're behind):


diff: 0.007507324
diff: 0.007507324
diff: -0.01499939
diff: -0.007507324
diff: 0.02252197
diff: -0.007507324
diff: 0.007507324
diff: -0.02250671
diff: 0.007522583
diff: 0.02249146
diff: -0.02999878
diff: 0.02250671
diff: -0.01498413
diff: 0.007492065
diff: -0.02249146
diff: 0.02249146
diff: -0.007492065
diff: 0.01501465
diff: -0.007507324

As you can see, negative times are very common for me; however, I have been getting pretty good offsets, only about 5-25 ms diff. Is this the way it's supposed to be, with a lot of negative "in the future" values, or am I doing something wrong?

#8 ApochPiQ   Moderators   -  Reputation: 16414


Posted 25 August 2011 - 10:47 AM

Your distribution looks to be about 50%, so I'd say it's pretty much to be expected. Any form of round-trip compensation is going to introduce some inaccuracy, but seeing that inaccuracy evenly mixed between too-fast and too-slow is more or less what you want.

#9 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 10:54 AM

Your distribution looks to be about 50%, so I'd say it's pretty much to be expected. Any form of round-trip compensation is going to introduce some inaccuracy, but seeing that inaccuracy evenly mixed between too-fast and too-slow is more or less what you want.


Thanks! So last one: What do I actually DO (code/algorithm wise) when I think I'm in the future? Do I snap to 0.0 offset, or do I keep going like nothing happened - since I know I'm never actually IN the future?

#10 Shift53   Members   -  Reputation: 123


Posted 25 August 2011 - 11:10 AM

Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.


I was wondering something... Assuming we are talking about time variance, not hardware saturation, how does the time variance matter exactly? Can't you simply do a (TickCount * Frequency) formula to determine the time of a given frame and ignore the variance, which gets reset every frame? It's not as if the variance would stack up and make the game drift.

#11 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 11:14 AM


Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.


I was wondering something... Assuming we are talking about time variance, not hardware saturation, how does the time variance matter exactly? Can't you simply do a (TickCount * Frequency) formula to determine the time of a given frame and ignore the variance, which gets reset every frame? It's not as if the variance would stack up and make the game drift.


I was thinking the exact same thing in this instance, since the _only_ reason we sync time is to make sure we don't perform X+1 before we perform X+2, and the variance gets cancelled out every frame anyway. But maybe there are more things at play here :)

#12 hplus0603   Moderators   -  Reputation: 5725


Posted 25 August 2011 - 11:57 AM

I was thinking the exact same thing in this instance, since the _only_ reason we sync time is to make sure we don't perform X+1 before we perform X+2, and the variance gets cancelled out every frame anyway. But maybe there are more things at play here :)


I think you should use your time estimation algorithm for each packet you send and receive. Then you should update your approximation of the client/server clock offset using a leaky integrator -- when you have a new estimate of the offset N, calculate the actual offset O you will be using from the old estimate oO as O = oO * 0.95 + N * 0.05. (You may want to adjust those factors -- 0.99 and 0.01 may actually work just as well and be more stable!)
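The leaky integrator above, as a sketch (names are my own; alpha=0.05 corresponds to the 0.95/0.05 split, and 0.01 is the more stable alternative mentioned):

```python
def update_offset(old_offset, new_estimate, alpha=0.05):
    """Blend each fresh offset estimate into the running value.

    old_offset: the current smoothed clock offset (oO in the post).
    new_estimate: the offset measured from the latest packet (N).
    alpha: how much weight the new sample gets; smaller is smoother.
    """
    return old_offset * (1.0 - alpha) + new_estimate * alpha
```

The effect is a low-pass filter: a single outlier packet only nudges the offset by alpha times its error instead of yanking the clock around.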

You will be unlikely to be "ahead" of the server, because you will have an actual server timestep number in the packet you received.

Also, I prefer to ONLY use time step numbers in the protocol, rather than times. Time doesn't matter as much as time steps evolved per unit of time, and unit of time is what you measure using the local PC clock. (Note that offset and rate between PC clock and time steps may still be measured in fractional steps, even though only whole steps make it on the wire).
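A sketch of the parenthetical point: keep the offset fractional locally, but put only whole step numbers on the wire (all names here are my own, not hplus0603's):

```python
import math

STEP_SECONDS = 0.015  # 66.66 Hz simulation, as in the thread

def clock_to_step(local_seconds, offset_steps):
    """Convert the local PC clock to a simulation step number.

    offset_steps is the estimated (possibly fractional) offset between
    local-clock steps and server steps; only the floored whole step
    number would be sent in a packet.
    """
    fractional_step = local_seconds / STEP_SECONDS + offset_steps
    return math.floor(fractional_step)  # whole steps only on the wire
```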



enum Bool { True, False, FileNotFound };

#13 fholm   Members   -  Reputation: 262


Posted 25 August 2011 - 12:12 PM

O = oO * 0.95 + N * 0.05.


So the algorithm should be like this: gameTime = (oldGameTime * 0.95) + ((serverTimeStamp + (rtt/2)) * 0.05);
I suppose my question is whether N in your algorithm should include the rtt/2 or just the timestamp from the server?


Also, I prefer to ONLY use time step numbers in the protocol, rather than times. Time doesn't matter as much as time steps evolved per unit of time, and unit of time is what you measure using the local PC clock. (Note that offset and rate between PC clock and time steps may still be measured in fractional steps, even though only whole steps make it on the wire).


By timestep numbers do you mean "timestep X", "timestep X+1", "timestep X+2" -- basically the tick/step count? In my mind this last paragraph of yours conflicts with what your first two plus the formula said, but I assume I'm missing something. When you send a command to the server do you send gameTime/stepTime, or just your gameTime?

Again, thanks for all your help and your help in my other threads, very appreciated!




#14 fholm   Members   -  Reputation: 262


Posted 26 August 2011 - 03:08 AM

I've got a very nice implementation using the method/algorithm that hplus0603 described, using a leaky integrator with 0.99/0.01, which provides (as was clearly stated) very stable results and a smooth curve estimation without any jerkiness, etc. I have one last question: when my new estimate is a tiny bit behind the previous estimate (always behind the server), how do I handle this? Do I just ignore it and keep on trucking and let the server figure out the order of things? For example, early on I can have jumps like this:

old: 5.665365
new: 5.654861

You can clearly see that the new estimate is about 11 ms before the old one. Does this matter, or can I just continue on as I was? What I'm asking is: do I need to do anything special when my new estimate is "before" (in time) my old one?

Here's the implementation for reference:

public static void SetClientTime(float serverTime, float rtt) {
    if (gameTime == 0.0f) {
        gameTime = serverTime + (rtt * 0.5f);
    } else {
        gameTime = (gameTime * 0.99f) + ((serverTime + (rtt * 0.5f)) * 0.01f);
    }
}

The reason I'm asking is because, in my mind this would be confusing:

  • I have "FORWARD" pressed down and I send this to the server with the current timestep (estimatedGameTime/timeStepSize), call it X
  • At the start of the next simulation step I have received an updated gametime, and now my estimated gametime is 11 ms behind what I assumed it to be in the previous simulation step
  • I send a new "FORWARD" command (since I still have it pressed down) to the server, but now when I send the current timestep (estimatedGameTime/timeStepSize) the timestep ends up at X-1
  • Server receives the command with timestep X and applies it
  • Server receives the command with timestep X-1 but notices the timestep is behind what was applied just before it
How would this be sorted out?

#15 smasherprog   Members   -  Reputation: 432


Posted 26 August 2011 - 07:39 AM

I wouldn't have the client send a timestamp to the server. The client should be attempting to maintain synchronization with the server, not both ways (otherwise cheating can occur). The server should process requests as they come in, so that it isn't trying to go back and change an estimate of a player's position (this is where cheating can occur). This also means that if a single player is laggy, everyone else won't notice the problem; only the laggy player will. Also, on the first press of the forward key, the client should send the request to move forward. If you continue to hold the forward key down and do nothing else, there should be no new requests sent to the server. In other words, think of the client as continuing its last action until a change occurs. This will dramatically decrease the complexity of your program and the amount of data being sent back and forth. The server then sends its normal updates with a timestamp and some sort of position, speed, and direction information.

Hope that helps
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

#16 fholm   Members   -  Reputation: 262


Posted 26 August 2011 - 08:26 AM

I wouldn't have the client send a timestamp to the server. The client should be attempting to maintain synchronization with the server, not both ways (otherwise cheating can occur). The server should process requests as they come in, so that it isn't trying to go back and change an estimate of a player's position (this is where cheating can occur). This also means that if a single player is laggy, everyone else won't notice the problem; only the laggy player will. Also, on the first press of the forward key, the client should send the request to move forward. If you continue to hold the forward key down and do nothing else, there should be no new requests sent to the server. In other words, think of the client as continuing its last action until a change occurs. This will dramatically decrease the complexity of your program and the amount of data being sent back and forth. The server then sends its normal updates with a timestamp and some sort of position, speed, and direction information.

Hope that helps



Ok, but if there is no need to send the time to the server, what is the need to sync the clock to the server? What is it actually used for? I imagined that I would need the clock time for when the client presses "FIRE", attaching the timestamp so the server knows "when" in time the client fired, but I realized this is as simple as doing receivedTime - (RTT*0.5) on the server to get a decent approximation.

So what is the synced time on the client actually *used* for?

#17 smasherprog   Members   -  Reputation: 432


Posted 26 August 2011 - 08:40 AM

The server sends timestamps to the client, and the client attempts to sync with the server's time.
Here is how it should go (there might be a better way):

Player A sends the server a request to fire a gun (no timestamp).
The server receives the request and does its necessary checks to ensure the request is valid. If everything is valid, the server then sends out "player A shot a gun" and attaches a timestamp to it.
All players receive the gun-fire event with the timestamp the server sent (which was when the server actually received the command, not when the player sent the request).

This actually will lead to a less jerky simulation, because instead of having to play an event that occurred possibly 500 ms ago (which would be possible for laggy players), the event plays as having occurred your roundtrip/2 ms ago.
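The flow above, sketched minimally (the validation checks are elided, and the function and field names are purely illustrative):

```python
def handle_fire_request(server_clock, player_id, broadcast):
    """Server-authoritative event flow: the request carries no client
    timestamp; the server stamps the event with its OWN clock at the
    moment it accepts the request, then broadcasts it to everyone."""
    # ... validity checks (ammo, line of sight, cooldown) would go here ...
    event = {"type": "fire", "player": player_id, "time": server_clock}
    broadcast(event)
    return event
```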

In other words, you don't want the entire server at the mercy of laggy players (where events are received at a delay of 200 ms in the case of a laggy connection). The server should continue blindly, and if a request (command, as you put it) is received, the server should treat it as happening then. Think of how your simulation would run if the server was running commands based on a laggy player. Imagine getting a few laggy players together and having them run around each other trying to fight, or collide. All the information would be running 200 ms in the past. The server would be freaking out trying to decide what to send out, because the information is all interlaced and it would receive commands in the past, which could be very bad and lead to very jerky simulations.

Basically, the server would have to hold all information being sent out to sync with the laggiest player; otherwise, many correction events can occur, with a player telling the server when he or she moved or shot a gun. Player A (who has a bad ping) might tell the server that he moved forward 500 ms ago, but player B (who has a good ping) told the server that he shot at player A 30 ms ago. So the server receives player B's packet with the info and decides that player A should be dead. But then 470 ms later, the server receives a packet from player A saying that he was moving for the past 500 ms, which means that player B never actually hit player A.

The synced time on the client is used when the server sends the client events that have occurred. The client will always be running in the past, just behind the server by roundtrip/2 ms. Our job as programmers is to try to guess the events that occur between the updates we receive. If we guess wrong, we have to correct it. In most cases a wrong guess is a position change, in which case you can slowly adjust a player's position over time instead of jerking the player into the correct position.
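The "slowly adjust" in the last sentence is often done with a simple blend toward the authoritative position each frame; a sketch (the blend factor is an assumption, not a value from the thread):

```python
def corrected_position(predicted, authoritative, blend=0.1):
    """Pull the locally predicted position a fraction of the way toward
    the server's authoritative one, instead of snapping. Called once per
    frame, this converges smoothly over several frames."""
    return predicted + (authoritative - predicted) * blend
```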
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

#18 fholm   Members   -  Reputation: 262


Posted 26 August 2011 - 08:46 AM

The server sends timestamps to the client, and the client attempts to sync with the server's time.
Here is how it should go (there might be a better way):

Player A sends the server a request to fire a gun (no timestamp).
The server receives the request and does its necessary checks to ensure the request is valid. If everything is valid, the server then sends out "player A shot a gun" and attaches a timestamp to it.
All players receive the gun-fire event with the timestamp the server sent (which was when the server actually received the command, not when the player sent the request).

This actually will lead to a less jerky simulation, because instead of having to play an event that occurred possibly 500 ms ago (which would be possible for laggy players), the event plays as having occurred your roundtrip/2 ms ago.

In other words, you don't want the entire server at the mercy of laggy players (where events are received at a delay of 200 ms in the case of a laggy connection). The server should continue blindly, and if a request (command, as you put it) is received, the server should treat it as happening then. Think of how your simulation would run if the server was running commands based on a laggy player. Imagine getting a few laggy players together and having them run around each other trying to fight, or collide. All the information would be running 200 ms in the past. The server would be freaking out trying to decide what to send out, because the information is all interlaced and it would receive commands in the past, which could be very bad and lead to very jerky simulations.


Thank you so much, exactly what I needed! much love to you for this explanation! This is what made it "click" for me :)




#19 smasherprog   Members   -  Reputation: 432


Posted 26 August 2011 - 08:51 AM

Cool. I actually continued to add more to the post above. I have this bad habit of thinking of things after I make a post then adding to the post. I think I did that like 3 times for this one :P
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

#20 fholm   Members   -  Reputation: 262


Posted 26 August 2011 - 08:53 AM

Cool. I actually continued to add more to the post above. I have this bad habit of thinking of things after I make a post then adding to the post. I think I did that like 3 times for this one :P


I do the same thing, ha :) I just read the rest of it, and I have one follow-up question:

So I get that the client time (which is somewhat in sync with the server) is used when we receive events from the server, but how is it used? To sort the events in the right order? Or is it just used to make better predictions (possibly correcting our earlier predictions)?



