
### #1 Angus Hollands

Posted 31 July 2012 - 10:14 AM

Hi everyone!
I've been working on an authoritative server design for a few months, and I've hit a snag.
After initially working with a tick-based approach, I found that the simulations (prediction and extrapolation) break down when the client's tick rate differs from that of the server.
The server always runs at a constant frame rate, typically 60 fps. However, the client's frame rate can vary, so how can I account for that?

• Time is not an integer, so it cannot index ticks directly unless you multiply it by the tick rate and round.
• With a tick-based logic loop, time does not progress at a perfectly constant rate, so two simulations won't derive their states at exactly the same moments, even if they run at the same tick rate and are started at exactly the same time.
Because of these issues, while I can see how to modify my system to use time, I can't see how to access the game state that was determined for a given time. This is my current idea for a solution:
• As the server runs at 60 ticks per second, derive the current server "tick" from the elapsed time and the tick rate (e.g. tick = time × tick_rate). This must be rounded down to a whole number, so I'd take the floor of the result.
• This tick index is then consistent across machines, regardless of each machine's local frame rate.
• When comparing predictions against server states, interpolate between client predictions (if the client tick rate is 15 Hz, one client tick spans every 4 server ticks).
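The tick-derivation step above could be sketched roughly like this (Python chosen just for illustration; the 60 Hz tick rate and the 1-D state are assumptions taken from the post, and the function names are hypothetical):

```python
import math

TICK_RATE = 60  # server ticks per second (assumed from the post)

def tick_from_time(elapsed_seconds: float) -> int:
    """Map a wall-clock timestamp to a discrete server tick.

    Multiplying by the tick rate (not dividing) converts seconds to
    ticks; floor keeps us on the last fully simulated tick.
    """
    return math.floor(elapsed_seconds * TICK_RATE)

def interpolate_state(state_a: float, state_b: float, alpha: float) -> float:
    """Linear blend between two scalar states; alpha in [0, 1]."""
    return state_a + (state_b - state_a) * alpha

# Example: a client sample at t = 0.05 s falls on server tick 3;
# the leftover fraction says how far we are toward tick 4.
t = 0.05
tick = tick_from_time(t)           # server tick 3
alpha = t * TICK_RATE - tick       # fractional progress toward tick 4
```

The same `alpha` could then drive `interpolate_state` between the two surrounding client predictions when comparing against a server state.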
Some final notes:
• I'm using a game engine that runs all game elements (rendering, animation, logic) sequentially in the same game loop, so all of them are affected by a varying frame rate.
• The physics system is locked to the tick rate, so I'd have to scale the velocity values by hand, using the tick delta as a scalar against the velocity vector.
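One common way to handle the variable-frame-rate problem described in these notes is a fixed-timestep accumulator loop: render as often as the client manages, but always advance physics in whole fixed steps. A minimal sketch, assuming a 60 Hz step and hypothetical `simulate`/`render` callbacks (this is a generic pattern, not the engine's actual API):

```python
import time

TICK_DT = 1.0 / 60.0  # fixed physics timestep (assumed 60 Hz)

def run_loop(simulate, render, should_quit, clock=time.perf_counter):
    """Fixed-timestep loop: frame rate may vary, physics stays at 60 Hz."""
    accumulator = 0.0
    previous = clock()
    while not should_quit():
        now = clock()
        accumulator += now - previous
        previous = now
        # Consume elapsed wall-clock time in whole fixed steps,
        # so the simulation is deterministic regardless of frame rate.
        while accumulator >= TICK_DT:
            simulate(TICK_DT)
            accumulator -= TICK_DT
        # The leftover fraction of a tick lets the renderer interpolate.
        render(accumulator / TICK_DT)
```

Because `simulate` always receives the same `dt`, the velocities no longer need to be rescaled by hand each frame; only the renderer deals with the fractional remainder.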
Does this sound like a viable solution? I'd assume that the tick rate of the client is capped at a maximum of 60.
