In my current netcode implementation I followed Gabriel Gambetta's articles to implement client-side prediction and input handling.
For each game loop update I capture the input into a packet like this:
1 bit - Go west
1 bit - Go east
1 bit - Go north
1 bit - Go south
4 bits - padding
4 bytes - deltatime (float) <- how long I simulated this command on the client
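To make the layout concrete, the packing looks roughly like this (just a sketch for this post; `InputPacket` and the function names are made up, and it assumes both ends share the same float endianness):

```cpp
#include <cstdint>
#include <cstring>

// One input command as described above: 4 direction bits + 4 bits padding + a float deltatime.
struct InputPacket {
    uint8_t buttons;   // bit 0 = west, bit 1 = east, bit 2 = north, bit 3 = south; bits 4-7 unused
    float   deltaTime; // how long this command was simulated on the client, in seconds
};

// Serialize into 5 bytes on the wire: 1 flag byte + 4 float bytes.
inline void WriteInputPacket(const InputPacket& in, uint8_t out[5]) {
    out[0] = in.buttons & 0x0F;             // keep only the 4 direction bits
    std::memcpy(out + 1, &in.deltaTime, 4); // raw float bytes
}

inline InputPacket ReadInputPacket(const uint8_t in[5]) {
    InputPacket p;
    p.buttons = in[0] & 0x0F;
    std::memcpy(&p.deltaTime, in + 1, 4);
    return p;
}
```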
Why do I include the deltatime with each input command? Because the server needs to know how long to simulate that input for my entity.
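In other words, applying one command on the server boils down to something like this (sketch reusing the `InputPacket` struct above; `Entity`, `float2` and `kMoveSpeed` are invented for illustration):

```cpp
struct float2 { float x, y; };
struct Entity { float2 position; };

constexpr float kMoveSpeed = 5.0f; // world units per second (made-up constant)

// Apply one input command to the entity for the client-reported duration.
void ApplyInput(Entity& e, const InputPacket& cmd) {
    float2 dir{0.0f, 0.0f};
    if (cmd.buttons & 0x1) dir.x -= 1.0f; // west
    if (cmd.buttons & 0x2) dir.x += 1.0f; // east
    if (cmd.buttons & 0x4) dir.y += 1.0f; // north
    if (cmd.buttons & 0x8) dir.y -= 1.0f; // south
    e.position.x += dir.x * kMoveSpeed * cmd.deltaTime;
    e.position.y += dir.y * kMoveSpeed * cmd.deltaTime;
}
```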
The problem with this implementation is that I have to constantly send input commands even when my keys haven't changed.
For example, say my computer runs fast at roughly 200 FPS; that is about 5 ms per frame.
That means I send 200 packets per second to the server. With 10+ players the server has to process 2,000+ input packets per second, and that is assuming everyone runs their client at 200 FPS.
I should constrain input capture to a fixed interval, say 60 times per second. That saves a bit, but it still feels like I can do better.
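What I mean by that is something like the following (a sketch; `SampleKeyboard` and `SendToServer` are placeholders for whatever the client actually uses, and `InputPacket` is the struct from the first sketch):

```cpp
constexpr float kInputInterval = 1.0f / 60.0f; // capture/send input 60 times per second

uint8_t SampleKeyboard();                 // returns the 4 direction bits (placeholder)
void    SendToServer(const InputPacket&); // sends one serialized command (placeholder)

// Called once per render frame; emits at most ~60 input commands per second
// no matter how fast the client renders.
void UpdateInput(float frameDeltaTime) {
    static float accumulator = 0.0f;
    accumulator += frameDeltaTime;
    while (accumulator >= kInputInterval) {
        SendToServer(InputPacket{ SampleKeyboard(), kInputInterval }); // deltatime is now a fixed 1/60 s
        accumulator -= kInputInterval;
    }
}
```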
I have read that I should only send input deltas: only when my input state changes do I send a new packet with the new command. But then I can't rely on the deltatime that I send along with the input state to the server.
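To make the issue concrete, the delta-only version would look roughly like this (again a sketch using the placeholder functions above; the part I can't resolve is that the duration of the current state is only known once it ends):

```cpp
// Send a command only when the input state changes. The problem: at send time I can only
// report how long the *previous* state was held; the new state's duration is unknown
// until the next change, so the server can't simulate the new state from a deltatime.
void UpdateInputDeltaOnly(float frameDeltaTime) {
    static uint8_t lastButtons = 0;
    static float   heldTime    = 0.0f;

    heldTime += frameDeltaTime;
    uint8_t buttons = SampleKeyboard();

    if (buttons != lastButtons) {
        SendToServer(InputPacket{ lastButtons, heldTime }); // duration of the state that just ended
        lastButtons = buttons;
        heldTime    = 0.0f;
    }
}
```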
It bugs me that neither Gabriel Gambetta nor Valve talks about "ticks" as a time unit or uses them with the simulation. https://developer.valvesoftware.com/wiki/Latency_Compensating_Methods_in_Client/Server_In-game_Protocol_Design_and_Optimization.
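To be clear, by "ticks" I mean counting fixed simulation steps instead of shipping a float time, roughly (the 60 ticks/second rate is just an example):

```cpp
#include <cstdint>

constexpr float kTickInterval = 1.0f / 60.0f; // fixed simulation step length

inline float    TicksToSeconds(uint32_t ticks)  { return ticks * kTickInterval; }
inline uint32_t SecondsToTicks(float seconds)   { return static_cast<uint32_t>(seconds / kTickInterval + 0.5f); }
```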
If you know of an article I should read, or have any input on this, I would be very happy.