I'm sorry to make yet another thread about this, but I've been confused reading the existing threads about processing inputs.
What I've been doing until now: clients send inputs, and the server calls process(input) as soon as it receives them, which can mean processing several net frames per server frame. It works well for my simple physics and it's deterministic, but it's obviously vulnerable to speedhacks.
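For context, a minimal sketch of the shape of that (all the names are mine, just to illustrate):

```cpp
#include <unordered_map>

struct Input { int buttons = 0; };
struct Player { void Process(const Input&) { /* advance simulation */ } };

std::unordered_map<int, Player> players; // client id -> player

// Every packet is simulated the moment it arrives, so a client that
// sends packets faster than the tick rate gets simulated faster:
// that's the speedhack hole.
void OnInputReceived(int clientId, const Input& input)
{
    players[clientId].Process(input);
}
```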
So I was looking at another way; here's what I'm doing now:
- The client has its own frame ID and sends its inputs tagged with that frame ID to the server.
- The server caches each incoming message, keyed by client frame ID.
- The server initializes its per-client frame counter from the frame ID of the first message it receives from that client.
- From there, the server increments each client's frame counter once per server step.
- Each step, the server pulls that client's input from the buffer; if the frame isn't there, it just skips it.
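Roughly, here's a minimal sketch of the server side (all the names are mine, and I've left the dejitter delay out):

```cpp
#include <cstdint>
#include <unordered_map>

struct Input { int buttons = 0; };
struct Player { void Process(const Input&) { /* advance simulation */ } };

// Per-client buffer of inputs, keyed by the client's frame ID.
struct ClientInputBuffer
{
    std::unordered_map<uint32_t, Input> pending;
    uint32_t nextFrameId = 0;
    bool anchored = false;

    void OnInputReceived(uint32_t clientFrameId, const Input& input)
    {
        // The first message anchors the stream: consume from here on.
        if (!anchored) {
            nextFrameId = clientFrameId;
            anchored = true;
        }
        pending[clientFrameId] = input;
    }

    // Called once per server step: consume exactly one client frame,
    // so the client can never be simulated faster than the tick rate.
    void Step(Player& player)
    {
        if (!anchored)
            return;
        auto it = pending.find(nextFrameId);
        if (it != pending.end()) {
            player.Process(it->second);
            pending.erase(it);
        }
        // If the frame hasn't arrived yet, it's simply skipped.
        ++nextFrameId;
    }
};
```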
With a dejitter buffer on top of that it works well, but I'm not convinced this is how I should do it. Here are my questions:
- Here the effective latency is fixed by whenever the first message happens to arrive. How should I initialize my timestamps properly?
- What happens if the client suddenly lags and the server ends up skipping all its frames because they arrive too late? Do you adjust the buffer delay in real time?
- In my game the server sends the map and a player-creation event all at once. I'm using a fixed timestep, but when the client receives and processes the map, that takes around one second, and in that same frame it sends its first inputs. On the next frame the fixed loop does its job and tries to catch up on the lag introduced by the previous frame, so I end up sending X events to compensate for a stall the server isn't aware of, which introduces a big amount of lag in the game (see the sketch after this list).
- Still related to time: what happens if the client or the server falls out of the fixed loop, i.e. when the number of catch-up steps is clamped to keep it from entering the spiral of death?
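To make the last two questions concrete, this is the kind of loop I mean: a standard "fix your timestep" accumulator with a cap (names are mine, just a sketch). The cap is what prevents the spiral of death, but it's also what makes the client silently drop time the server doesn't know about:

```cpp
#include <algorithm>

struct FixedLoop
{
    static constexpr double kDt = 1.0 / 60.0;      // fixed simulation step
    static constexpr double kMaxBacklog = 5 * kDt; // repay at most 5 steps

    double accumulator = 0.0;

    template <typename StepFn>
    void Update(double frameDt, StepFn simulateStep)
    {
        accumulator += frameDt;
        // After a long stall (e.g. loading the map), this drops the
        // excess time instead of simulating a burst of catch-up frames,
        // each of which would send an input the server isn't expecting.
        accumulator = std::min(accumulator, kMaxBacklog);
        while (accumulator >= kDt) {
            simulateStep(kDt); // samples & sends this frame's input
            accumulator -= kDt;
        }
    }
};
```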