how to adjust client predicted time?


I'm making a 2D multiplayer platformer. I do client-side prediction for my own player and interpolation for other players.

I understand the client always tries to stay ahead of the server, so an input for tick T should arrive close to when the server simulates tick T.

For every input packet, the server tells the client how far off the input was. If the input was too early, the client needs to slow down. If the input was late, the client needs to speed up.

What I don't understand is how to slow down or speed up.

For example, if the server tells me it received my input for tick 10 at tick 7 (3 ticks early), then I need to slow down by 3 ticks' worth of time. How should I go about doing this?


The way I dealt with this is to maintain a server-side estimate of the client's RTT, and use that for clock estimation on the server (e.g. tick_cli = tick_srv - tick_rate * RTT / 2).

This meant I didn't explicitly deal with the client clock for any timestamping information, and used an incrementing event number to ensure events were inserted into the event queue on time.

In short, I didn't manage the clock on the client, as I saw complications - you suddenly lose the guarantee of contiguous time, which I am not comfortable with.

The only reason for this clock variation is to ensure that the server itself receives events when it expects them, which it checks by comparing the tick values between server and client. Provided you can guarantee a fixed tick rate, you aren't actually moving forward or back in time; it is just the network conditions changing. This is why I approached it by building a server-side event queue of N events, where N might be the uncertainty in the client-server time multiplied by the tick rate. This simply ensures the server has enough buffered events that it never "runs out" of events to process to move the client forward in time. The RTT estimate that the server maintains is then used to resolve things like "was I shot, according to lag compensation?".
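
A minimal sketch of that server-side clock estimation, assuming the server keeps a smoothed per-connection RTT (the struct and names here are illustrative, not from the post):

    struct ClientConnection {
        double rtt_seconds = 0.0;  // smoothed RTT estimate for this client

        // Smooth raw samples with an exponential moving average so a
        // single latency spike doesn't swing the clock estimate.
        void OnRttSample(double sample_seconds) {
            const double alpha = 0.1;  // smoothing factor (illustrative)
            rtt_seconds += alpha * (sample_seconds - rtt_seconds);
        }

        // tick_cli = tick_srv - tick_rate * RTT / 2
        long EstimateClientTick(long server_tick, double tick_rate) const {
            return server_tick - static_cast<long>(tick_rate * rtt_seconds / 2.0);
        }
    };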

Otherwise, you could likely use some kind of sliding-window smoothing.

You don't need to know the RTT for this.
Simply keep an offset between your local computer clock and "virtual game start time."
The "current game time" equals "current clock minus game start time."

If you are to "speed up" then move the game start time backwards (subtract from it.)
If you are to "slow down" then move the game start time forwards (add to it.)
Note that you want some interval/window where you don't adjust at all. For example, if the data arrives less than 200 milliseconds ahead of when it's used, don't adjust.
(Another rule is to use the number of packets: if a packet arrives and there are 0 or 1 unprocessed packets in the incoming queue, you're good; if there are more, you need to slow down.)
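
A minimal sketch of this offset-nudging scheme; the 10% nudge factor and the symmetric dead zone are illustrative choices layered on the rules above:

    #include <chrono>
    #include <cmath>

    using Clock = std::chrono::steady_clock;

    Clock::time_point game_start_time;  // "virtual game start time"

    // Current game time = current clock minus game start time.
    double CurrentGameTimeSeconds() {
        return std::chrono::duration<double>(Clock::now() - game_start_time).count();
    }

    // error_seconds > 0: inputs arrive too early (slow down);
    // error_seconds < 0: inputs arrive too late (speed up).
    void AdjustGameClock(double error_seconds) {
        const double dead_zone = 0.2;  // e.g. a 200 ms no-adjust window
        if (std::fabs(error_seconds) < dead_zone)
            return;
        // Slowing down = moving game start time forwards (adding);
        // speeding up = moving it backwards (subtracting). Nudge by a
        // fraction of the error per report so the correction is gradual.
        const double nudge_seconds = 0.1 * error_seconds;
        game_start_time += std::chrono::duration_cast<Clock::duration>(
            std::chrono::duration<double>(nudge_seconds));
    }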

You don't need to know the RTT for this. [...]

I think we're looking at this slightly differently. The reason I mention using the RTT is to approximate the time difference. In practice, however, the RTT varies with network conditions and will at times deviate by several ticks.

I consider there to be several problems:

  1. As the network conditions evolve, the client will move further into the future or past relative to the server.
  2. The server needs at least (1+N) network tick intervals' worth of events to account for changing latency in the connection (N > 0)
  3. The time at which events were generated is useful so that the client can move forward/back in time

Point 3 can be resolved by sending the current "server tick" that corresponds to the world-state tick at the time the events were generated. This does trust the client, but that is true of any method (RTT-based or clock nudging), and just requires some sanity checks on the server.

Points 1 and 2 are related, and the server will occasionally need to stop and recover in the event that it runs out of inputs. One of the finer points is differentiating between latency variations and network tick intervals, which are intentional delays. In the case of clock nudging, it is important that the accepted lead time is significantly larger than the network tick interval.
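
A sketch of sizing such a server-side input queue against that uncertainty (the names, and the one event of headroom, are illustrative):

    #include <cstddef>
    #include <deque>

    struct Input { long tick = 0; unsigned buttons = 0; };

    struct InputQueue {
        std::deque<Input> pending;  // inputs ordered by tick

        // N = uncertainty in client-server time * tick rate, as above.
        static std::size_t TargetDepth(double uncertainty_seconds, double tick_rate) {
            return static_cast<std::size_t>(uncertainty_seconds * tick_rate) + 1;
        }

        // Consume the input for this tick if it has already arrived.
        bool PopForTick(long tick, Input* out) {
            if (pending.empty() || pending.front().tick > tick)
                return false;
            *out = pending.front();
            pending.pop_front();
            return true;
        }
    };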

I never felt comfortable with how clock adjustment is usually discussed. It seems vulnerable to panicking when network conditions change sharply enough, and it will (obviously) vary with the network. I prefer to have the client running in the future, but with no explicit notion of "aiming to arrive on time", given that this really is just an estimate. Having the clock vary according to the network wasn't something I particularly liked. I realise that I've just moved the "problem" around, of course.

This probably makes no sense. It's late.

the server will occasionally need to stop and recover in the event that it runs out of inputs


For a lockstep-simulated game (like an RTS), this makes sense. The server proceeds to step N exactly when it has inputs from all clients for step N.

For an FPS game, the server really just needs to keep going, and if one or more players haven't provided input for step N, then they proceed as if they had not given any input (which will likely generate corrections if it happens when the player is trying to steer.)
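
A sketch of that keep-going policy; the Player type and its members are hypothetical:

    #include <optional>

    struct Input { unsigned buttons = 0; };

    struct Player {
        std::optional<Input> TakeInput(long tick);  // this tick's input, if it arrived
        void Apply(const Input& in, long tick);     // advance this player one step
    };

    void StepPlayer(Player& p, long tick) {
        // If no input arrived in time, step the player as if no input
        // was given; a correction follows if the prediction diverged.
        const Input in = p.TakeInput(tick).value_or(Input{});
        p.Apply(in, tick);
    }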

If you have the automatic "send so it arrives in time" adjustment, and drive everything else based on that, you don't need the RTT; it "falls out" of the math. Your main option is then to figure out how far behind server updates you display remote entities: at known positions but late, or at extrapolated positions that are more up-to-date guesses.
You can estimate the end-to-end RTT simply by comparing game times when you process a received server packet. If you already have a game-time-offset variable, the RTT is "my currently estimated server time" minus "the game time stamped on the latest server entity update." Note that your currently estimated server time includes the transmit delay, because it's designed to make you send the input data so it arrives in time at the server.
This value will always be a lot bigger than the raw physical network ping time, so gamers will hate it if you show it to them.
Thus, having the server always respond with a packet as soon as a packet is received (in some real-time thread), and measuring the round-trip time of that (using a real-time receiving thread on the client, too), will show what gamers want to think of as their "ping."
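
A sketch of the end-to-end estimate, assuming both values are expressed in game seconds (the names are illustrative):

    // estimated_server_time: the client's adjusted clock, which runs
    // ahead by the upstream transmit delay; update_game_time: the
    // timestamp on the latest entity update received from the server.
    double EndToEndRttSeconds(double estimated_server_time,
                              double update_game_time) {
        // The update is behind by the downstream delay and our clock is
        // ahead by the upstream delay (plus buffering), so the gap
        // spans the whole round trip and then some.
        return estimated_server_time - update_game_time;
    }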

Thus, having the server always respond with a packet as soon as a packet is received (in some real-time thread), and measuring the round-trip time of that (using a real-time receiving thread on the client, too), will show what gamers want to think of as their "ping."

You could also send ping requests from the client and store the receive time on the server. When the server sends its next 'game' packet, it can piggyback the delta time (how long it held the ping), saving the (significant) bandwidth overhead of a dedicated pong packet. On the client side, measure the time between request and response and subtract the delta time.
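
A sketch of that piggy-backed ping; the message fields and names are hypothetical:

    #include <chrono>
    #include <cstdint>
    #include <unordered_map>

    using Clock = std::chrono::steady_clock;

    // Server: remember when each ping request arrived.
    std::unordered_map<uint32_t, Clock::time_point> ping_arrival;

    void OnPingRequest(uint32_t ping_id) {
        ping_arrival[ping_id] = Clock::now();
    }

    // When the next regular game packet goes out, attach the ping id and
    // how long the server held it, instead of sending a dedicated pong.
    double HeldSeconds(uint32_t ping_id) {
        return std::chrono::duration<double>(Clock::now() - ping_arrival[ping_id]).count();
    }

    // Client: RTT = (response time - request time) - time held on server.
    double RttSeconds(Clock::time_point request_sent, double held_seconds) {
        const double total =
            std::chrono::duration<double>(Clock::now() - request_sent).count();
        return total - held_seconds;
    }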

For example, if the server tells me it received my input for tick 10 at tick 7 (3 ticks early), then I need to slow down by 3 ticks' worth of time. How should I go about doing this?

You could tick slightly slower. E.g. if you tick at 25 Hz, that's 40 ms per tick. Increasing that to 41 ms slows you down by 2.5%, which is probably not very noticeable to the player, but it also won't close the gap very fast: each tick recovers 1 ms, so a 3-tick (120 ms) gap takes 120 ticks, roughly 4.9 s, to close at 25 Hz.
There's a bit of a trade-off between closing the gap faster and how noticeable the slowdown is to the player. This doesn't have to be a static value, though; you can adjust the slowdown based on how large the gap is.
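
A sketch of such an adjustable tick length; the proportional scaling and the clamp are illustrative choices, not from the post above:

    // ticks_early > 0 means inputs arrive early (stretch the tick);
    // ticks_early < 0 means they arrive late (shrink it).
    double NextTickMs(int ticks_early) {
        const double base_ms = 40.0;           // 25 Hz nominal
        // Scale the adjustment with the gap, but clamp it so the speed
        // change stays hard to notice (here at most 2.5%).
        double adjust_ms = 0.25 * ticks_early; // 3 ticks early -> +0.75 ms
        if (adjust_ms > 1.0)  adjust_ms = 1.0;
        if (adjust_ms < -1.0) adjust_ms = -1.0;
        return base_ms + adjust_ms;
    }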

You could tick slightly slower. [...]

Thanks, this approach sounds good.
