Synchronizing server and client time


[quote name='ApochPiQ' timestamp='1314265316' post='4853573']
Especially when dealing with PCs, it is not possible to guarantee that one machine will elapse time at the same fixed rate as another. Even on a single machine you won't get perfect n-Hz cycles, there will be variances between frames/ticks. The purpose of using a time value instead of a tick count is to help absorb these variances.


I was wondering something... Assuming we are talking about time variance, not hardware saturation, how does the time variance matter exactly? Can't you simply do a (TickCount * Frequency) formula to determine the time of a given frame and ignore the variance, which gets reset every frame? It's not as if the variance would stack up and make the game drift.
[/quote]

I was thinking the exact same thing in this instance, since the _only_ reason we sync time is to make sure we don't perform step X+2 before we perform step X+1, and the variance gets cancelled out every frame anyway. But maybe there are more things at play here :)

[quote]
I was thinking the exact same thing in this instance, since the _only_ reason we sync time is to make sure we don't perform step X+2 before we perform step X+1, and the variance gets cancelled out every frame anyway. But maybe there are more things at play here :)
[/quote]


I think you should use your time estimation algorithm for each packet you send and receive. Then update your approximation of the client/server clock offset using a leaky integrator -- when you have a new estimate of the offset N, calculate the actual offset O you will be using from the old estimate oO as O = oO * 0.95 + N * 0.05. (You may want to adjust those factors -- 0.99 and 0.01 may actually work just as well and be more stable!)
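A minimal sketch of that update, assuming a hypothetical clockOffset field and that rawOffsetEstimate is whatever offset you computed from the packet (all names here are mine, not from the post):

// Rough sketch of the leaky integrator described above.
static double clockOffset = 0.0;   // running estimate of (server clock - local clock)
static bool hasOffset = false;

// Call once per received packet, with the raw offset estimated from that packet.
static void UpdateClockOffset(double rawOffsetEstimate) {
    if (!hasOffset) {
        clockOffset = rawOffsetEstimate;   // first sample: take it as-is
        hasOffset = true;
    } else {
        // O = oO * 0.95 + N * 0.05; 0.99/0.01 converges slower but is more stable.
        clockOffset = clockOffset * 0.95 + rawOffsetEstimate * 0.05;
    }
}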

You are unlikely to be "ahead" of the server, because you will have an actual server timestep number in the packet you received.

Also, I prefer to ONLY use time step numbers in the protocol, rather than times. Time doesn't matter as much as time steps evolved per unit of time, and the unit of time is what you measure using the local PC clock. (Note that the offset and rate between the PC clock and time steps may still be measured in fractional steps, even though only whole steps make it onto the wire.)
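My reading of that, as a rough sketch (timeStepSize and the function names below are my own assumptions, not code from the post): keep the estimate in fractional steps locally and only round down to a whole step when writing a packet.

const float timeStepSize = 1.0f / 60.0f;   // assumed 60 simulation steps per second

// Fractional estimate of the server's current step, from the local clock
// plus the smoothed offset (both in seconds here).
static float EstimatedServerSteps(float localTimeSeconds, float smoothedOffsetSeconds) {
    return (localTimeSeconds + smoothedOffsetSeconds) / timeStepSize;
}

// Only whole step numbers go into a packet.
static int StepNumberForWire(float estimatedSteps) {
    return (int)estimatedSteps;   // truncate to a whole step
}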


enum Bool { True, False, FileNotFound };


So the algorithm should be like this: gameTime = (oldGameTime * 0.95) + ((serverTimeStamp + (rtt/2)) * 0.05);
I suppose my question is whether N in your algorithm should include the rtt/2, or just be the timestamp from the server?


[quote name='hplus0603']
Also, I prefer to ONLY use time step numbers in the protocol, rather than times. Time doesn't matter as much as time steps evolved per unit of time, and the unit of time is what you measure using the local PC clock. (Note that the offset and rate between the PC clock and time steps may still be measured in fractional steps, even though only whole steps make it onto the wire.)
[/quote]

By timestep numbers do you mean "timestep X", "timestep X+1", "timestep X+2", so basically the tick/step count? In my mind this last paragraph of yours conflicts with your first two paragraphs and the formula, but I assume I'm missing something. When you send a command to the server, do you send gameTime/stepTime, or do you just send your gameTime?

Again, thanks for all your help here and in my other threads, much appreciated!


I got a very nice implementation working using the method/algorithm that hplus0603 described, using a leaky integrator with 0.99/0.01, which provides (as was clearly stated) very stable results and a smooth curve estimation without any jerkiness, etc. I have one last question: when my new estimate is a tiny bit behind the previous estimate (always behind the server), how do I handle this? Do I just ignore it and keep on trucking and let the server figure out the order of things? For example, early on I can have jumps like this:

old: 5.665365
new: 5.654861


You can clearly see that the new estimate is roughly 11 ms before the old one. Does this matter, or can I just continue on as I was? What I'm asking is: do I need to do anything special when my new estimate is "before" (in time) my old one?

Here's the implementation for reference:

public static void SetClientTime(float serverTime, float rtt) {
    if (gameTime == 0.0f) {
        // First sample: take the server time plus the one-way latency directly.
        gameTime = serverTime + (rtt * 0.5f);
    } else {
        // Leaky integrator: drift slowly toward the new estimate.
        gameTime = (gameTime * 0.99f) + ((serverTime + (rtt * 0.5f)) * 0.01f);
    }
}
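For context, here is how I'd expect this to be driven (the handler name and timeStepSize are my own assumptions, not from the thread): call SetClientTime for every server packet that carries a time, then derive the whole step number to stamp on outgoing commands.

const float timeStepSize = 1.0f / 60.0f;   // assumed fixed simulation step length

void OnServerPacket(float serverTime, float measuredRtt) {
    SetClientTime(serverTime, measuredRtt);            // update the smoothed clock estimate
    int currentStep = (int)(gameTime / timeStepSize);  // whole step for outgoing commands
    // ... attach currentStep to any input commands sent this frame
}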


The reason I'm asking is that, in my mind, this would be confusing:

  1. I have "FORWARD" pressed down and I send this to the server with the current timestep (estimatedGameTime/timeStepSize), call it X
  2. At the start of the next simulation step I have received an updated gametime, and now my estimated gametime is 11 ms behind what I assumed it to be in the previous simulation step
  3. I send a new "FORWARD" command (since I still have it pressed down) to the server, but now when I send the current timestep (estimatedGameTime/timeStepSize), the timestep ends up at X-1
  4. Server receives the command with timestep X and applies that
  5. Server receives the command with timestep X-1 but notices the timestep is behind what was applied just before it

How would this be sorted out?
I wouldn't have the client send a timestamp to the server. The client should be attempting to maintain synchronization with the server, not both ways (otherwise cheating can occur). The server should process the requests as they come in, so that the server isn't trying to go back and change an estimate of a player's position (this is where cheating can occur). This also means that if a single player is laggy, then everyone else won't notice the problem -- only the laggy player will.

Also, on the first press of the forward key, the client should send the request to move forward. If you continue to hold the forward key down and do nothing else, there should be no new requests sent to the server. In other words, think of the client as continuing its last action until a change occurs. This will dramatically decrease the complexity of your program and the amount of data being sent back and forth. The server then sends its normal updates with a timestamp and some sort of position, speed, and direction information.
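A rough sketch of that "send only on change" idea, with my own assumed input representation and a placeholder SendToServer routine (none of this is code from the thread):

struct InputState {
    public bool forward;
    public bool fire;
}

InputState lastSentInput;

void SendInputIfChanged(InputState current) {
    // Only transmit when the input differs from what was last sent;
    // no client timestamp is attached -- the server stamps it on arrival.
    if (current.forward != lastSentInput.forward || current.fire != lastSentInput.fire) {
        SendToServer(current);   // assumed network send routine
        lastSentInput = current;
    }
}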

Hope that helps
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

[quote]
I wouldn't have the client send a timestamp to the server. The client should be attempting to maintain synchronization with the server, not both ways (otherwise cheating can occur). The server should process the requests as they come in, so that the server isn't trying to go back and change an estimate of a player's position (this is where cheating can occur). This also means that if a single player is laggy, then everyone else won't notice the problem -- only the laggy player will. Also, on the first press of the forward key, the client should send the request to move forward. If you continue to hold the forward key down and do nothing else, there should be no new requests sent to the server. In other words, think of the client as continuing its last action until a change occurs. This will dramatically decrease the complexity of your program and the amount of data being sent back and forth. The server then sends its normal updates with a timestamp and some sort of position, speed, and direction information.

Hope that helps
[/quote]



Ok, but if there is no need to send the time to the server, what is the need to sync the clock with the server? What is it actually used for? I imagined that I would need the clock time for when the client presses "FIRE", and then attach the timestamp to it so the server knows "when" in time the client fired, but I realized this is as simple as doing "receivedTime - (RTT*0.5)" on the server to get a decent approximation.

So what is the synced time on the client actually *used* for?
The server sends timestamps to the client, and the client attempts to sync with the server's time.
Here is how it should go (there might be a better way):

Player A sends the server a request to fire a gun (no timestamp).
The server receives the request and does its necessary checks to ensure the request is valid. If everything is valid, the server then sends out "player A shot a gun" and attaches a timestamp to it.
All players receive the gun-fire event with the timestamp the server sent (which was when the server actually received the command, not when the player sent the request).

This actually leads to a less jerky simulation, because instead of having to play an event that occurred possibly 500 ms ago (which would be possible for laggy players), the event plays at an offset of only your round trip / 2.

In other words, you don't want the entire server at the mercy of laggy players (where events are received at a delay of 200 ms in the case of a laggy connection). The server should continue blindly, and if a request (command, as you put it) is received, the server should treat it as happening then. Think of how your simulation would run if the server was running commands based on a laggy player. Imagine getting a few laggy players together and having them run around each other trying to fight, or collide. All the information would be running 200 ms in the past. The server would be freaking out trying to decide what to send out, because the information is all interlaced, and it would receive commands in the past, which could be very bad and lead to very jerky simulations.

Basically, the server would have to hold all information being sent out to sync with the laggiest player; otherwise, there would be many correction events, with a player telling the server when he or she moved or shot a gun. Player A (who has a bad ping) might tell the server that he moved forward 500 ms ago, but player B (who has a good ping) told the server that he shot at player A 30 ms ago. So the server receives player B's packet with the info and the server decides that player A should be dead. But then 470 ms later, the server receives a packet from player A saying that he was moving for the past 500 ms, which means that player B never actually hit player A.

The synced time on the client is used when the server sends the client events that have occurred. The client will always be running in the past, just behind the server by roughly roundtrip/2 ms. Our job as programmers is to try and guess the events that occur between the updates that we receive. If we guess wrong, we have to correct it. In most cases a wrong guess is a position change, in which case you can slowly adjust a player's position over time instead of jerking the player into the correct position.
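A condensed sketch of that server-side flow, with hypothetical helper names (IsFireRequestValid, CurrentServerTime, BroadcastToAllPlayers) standing in for whatever your server actually provides:

// Server side: stamp events with the server's own clock on receipt.
class FireEvent {
    public int shooterId;
    public float timestamp;   // server receive time, not client send time
}

void OnFireRequest(int shooterId) {
    if (!IsFireRequestValid(shooterId))   // e.g. ammo, cooldown, range checks
        return;

    BroadcastToAllPlayers(new FireEvent {
        shooterId = shooterId,
        timestamp = CurrentServerTime()   // stamped with the server's clock
    });
}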
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

[quote]
The server sends timestamps to the client, and the client attempts to sync with the server's time.
Here is how it should go (there might be a better way):

Player A sends the server a request to fire a gun (no timestamp).
The server receives the request and does its necessary checks to ensure the request is valid. If everything is valid, the server then sends out "player A shot a gun" and attaches a timestamp to it.
All players receive the gun-fire event with the timestamp the server sent (which was when the server actually received the command, not when the player sent the request).

This actually leads to a less jerky simulation, because instead of having to play an event that occurred possibly 500 ms ago (which would be possible for laggy players), the event plays at an offset of only your round trip / 2.

In other words, you don't want the entire server at the mercy of laggy players (where events are received at a delay of 200 ms in the case of a laggy connection). The server should continue blindly, and if a request (command, as you put it) is received, the server should treat it as happening then. Think of how your simulation would run if the server was running commands based on a laggy player. Imagine getting a few laggy players together and having them run around each other trying to fight, or collide. All the information would be running 200 ms in the past. The server would be freaking out trying to decide what to send out, because the information is all interlaced, and it would receive commands in the past, which could be very bad and lead to very jerky simulations.
[/quote]


Thank you so much, exactly what I needed! Much love to you for this explanation, this is what made it "click" for me :)


Cool. I actually continued to add more to the post above. I have this bad habit of thinking of things after I make a post and then adding to it. I think I did that like 3 times for this one :P
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.

[quote]
Cool. I actually continued to add more to the post above. I have this bad habit of thinking of things after I make a post and then adding to it. I think I did that like 3 times for this one :P
[/quote]


I do the same thing, ha :) Just read the rest of it; I have one follow-up question though:

So I get that the client time (which is roughly in sync with the server) is used when we receive events from the server, but how is it used? To sort the events in the right order? Or is it just used to make better predictions (and possibly correct our earlier predictions)?
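For what it's worth, one common pattern (a sketch of the general idea rather than anything posted in this thread) is to use the synced clock to place the timestamped updates on a shared timeline, render slightly behind the estimated server time, and interpolate between the two updates that straddle that render time. Snapshot, the buffer, and interpolationDelay below are my own assumptions.

// Sketch: interpolate between timestamped server updates using the synced clock.
// Assumes the buffer is sorted by timestamp; uses System.Collections.Generic for List<>.
class Snapshot {
    public float timestamp;   // server timestamp carried by the update
    public float position;    // 1D position to keep the sketch simple
}

const float interpolationDelay = 0.1f;   // render ~100 ms behind the estimated server time

float SamplePosition(List<Snapshot> buffer, float syncedGameTime) {
    float renderTime = syncedGameTime - interpolationDelay;

    // Find the two snapshots that straddle renderTime and blend between them.
    for (int i = 0; i < buffer.Count - 1; i++) {
        Snapshot a = buffer[i];
        Snapshot b = buffer[i + 1];
        if (a.timestamp <= renderTime && renderTime <= b.timestamp) {
            float t = (renderTime - a.timestamp) / (b.timestamp - a.timestamp);
            return a.position + (b.position - a.position) * t;
        }
    }
    // No bracketing pair yet: fall back to the newest known state.
    return buffer[buffer.Count - 1].position;
}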
