I've been working on an authoritative server design for a few months, and I've hit a snag.
I initially used a tick-based approach, but the simulations (prediction and extrapolation) break down when the client's tick rate differs from the server's.
The server will always run at a constant rate, typically 60 ticks per second. The clients' frame rates may vary, though, so how can I account for that?
I've read about using time instead of ticks; however, here are my concerns:
- Time is not an integer, unless you multiply it up and round.
- With a tick-based logic loop, time does not progress at a constant rate, so states won't be derived at exactly the same times, even if the simulations run at the same tick rate and start at exactly the same moment.
- As the server runs at 60 ticks per second, I could derive the server's current "tick" from the elapsed time and the tick rate (e.g. tick = time * tick_rate, or equivalently time / tick_interval, where tick_interval = 1 / tick_rate). I'd need an integer, so I'd take the floor of the result.
- This tick can then serve as a constant-rate index into the simulation.
- When comparing predictions against server states, interpolate between client predictions (if the client tick rate is 15 Hz, one client tick spans every 4 server ticks).
- I'm using a game engine that runs all game elements (rendering, animation, logic) sequentially in the game loop, so all of them are affected by a varying frame rate.
- The physics system is constrained to the tick rate, so I'd have to scale the velocity vector by hand, using the fixed tick interval as the scalar.
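The time-to-tick conversion above can be sketched like this (the function and constant names are my own, not from any particular engine). Note the conversion is a multiplication by the tick rate, not a division:

```python
import math

TICK_RATE = 60  # server ticks per second

def tick_from_time(elapsed_seconds):
    """Map elapsed time to an integer tick index.

    With tick_rate in ticks-per-second the conversion is
    time * tick_rate (equivalently time / tick_interval, where
    tick_interval = 1 / tick_rate). Flooring avoids reporting a
    tick that has not fully elapsed yet.
    """
    return math.floor(elapsed_seconds * TICK_RATE)

# 0.999 s at 60 Hz is still within tick 59; tick 60 starts at t = 1.0 s
assert tick_from_time(0.999) == 59
assert tick_from_time(1.0) == 60
```

Flooring (rather than rounding to nearest) matters here: rounding up would occasionally report a tick the simulation hasn't actually simulated yet.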
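One common way to address the last two points, keeping physics at a constant tick rate while the render frame rate varies, is a fixed-timestep accumulator. This is a generic sketch under my own naming, not tied to your engine:

```python
TICK_RATE = 60
DT = 1.0 / TICK_RATE  # fixed simulation step, in seconds

class Simulation:
    def __init__(self):
        self.accumulator = 0.0
        self.tick = 0
        self.position = 0.0
        self.velocity = 2.0  # units per second

    def step(self):
        # Velocity is expressed per-second and scaled by the fixed DT,
        # so one tick always advances the world by the same amount,
        # regardless of the render frame rate.
        self.position += self.velocity * DT
        self.tick += 1

    def advance(self, frame_dt):
        """Called once per (variable-length) render frame."""
        self.accumulator += frame_dt
        while self.accumulator >= DT:
            self.step()
            self.accumulator -= DT

sim = Simulation()
sim.advance(1.0 / 30)  # one slow 30 fps frame runs two 60 Hz ticks
assert sim.tick == 2
```

The logic loop consumes whole fixed steps out of the accumulated frame time, so logic and physics stay locked to 60 Hz even when rendering runs faster or slower; any leftover fraction in the accumulator can also drive render-side interpolation.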
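The comparison step, lining up a server state against client predictions made at a lower tick rate, could look like this. It's a minimal sketch assuming linear interpolation over a scalar position; the names and the `client_states` mapping are illustrative, not from any real API:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def client_state_at_server_tick(server_tick, ticks_per_client_tick, client_states):
    """Estimate the client's state at an arbitrary server tick.

    client_states maps client tick -> position (a plain float here;
    a real game would interpolate full state vectors the same way).
    With a 15 Hz client and a 60 Hz server, ticks_per_client_tick is 4.
    """
    exact = server_tick / ticks_per_client_tick
    before = int(exact)        # client tick at or before this server tick
    after = before + 1
    t = exact - before         # fraction of the way to the next client tick
    if t == 0 or after not in client_states:
        return client_states[before]
    return lerp(client_states[before], client_states[after], t)

# Client ticks 0..2 at 15 Hz; server tick 6 falls halfway between
# client ticks 1 and 2, so the position is interpolated to 15.0.
states = {0: 0.0, 1: 10.0, 2: 20.0}
assert client_state_at_server_tick(6, 4, states) == 15.0
```

This gives the client an estimated state at every server tick to compare against authoritative updates, even though it only actually simulated every fourth one.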
Edited by Angus Hollands, 31 July 2012 - 10:14 AM.