Assume my game runs at 60Hz (~16.66 ms per frame) on both the server and the client, and updates from the server to the client are sent at 20Hz (every 50 ms). Since the game is running at 60Hz, there are two ways of deciding when to send these updates. I can either just do this:
if ((currentTime - lastSendTime) >= 0.05) {
    SendUpdate();
    lastSendTime = currentTime;
}
And just check this every frame for each client. But to me a cleaner solution would be to specify the client send rate as "every nth frame" instead, so if I want to send at 20Hz while running a 60Hz simulation I would just do this:
if (frameNumber % 3 == 0) {
    SendUpdate();
}
It just feels a lot cleaner: you don't need to keep a separate timer or deal with small inaccuracies in the server's local clock, and the sends stay nicely synchronized with the server's game state updates. The reason I'm asking is that in the few examples of "professional" networking code you can find on the web, I see *everyone* using the first method of a separate send timer for each client, so I'm thinking I must be missing something here?
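For reference, here is a minimal sketch of one common variant of the timer-based approach (this is my own illustration, not taken from any particular engine; SendUpdate is just a stub and the 60Hz fixed-timestep loop is an assumption). It advances lastSendTime by the send interval instead of snapping it to currentTime, which is the usual way to keep the floating-point remainder from accumulating into drift:

#include <cstdio>

// Assumed stub for the real per-client send.
static void SendUpdate() { std::printf("send\n"); }

int main()
{
    const double frameDt      = 1.0 / 60.0; // simulation tick length (assumed fixed timestep)
    const double sendInterval = 1.0 / 20.0; // 20 Hz send rate
    double currentTime  = 0.0;
    double lastSendTime = 0.0;

    for (int frame = 0; frame < 60; ++frame) { // simulate roughly one second
        currentTime += frameDt;
        // Advance lastSendTime by the interval rather than resetting it to
        // currentTime, so rounding error in the tick length does not pile up
        // and lower the effective send rate. The while loop also lets the
        // sender catch up after an unusually long frame.
        while (currentTime - lastSendTime >= sendInterval) {
            SendUpdate();
            lastSendTime += sendInterval;
        }
    }
    return 0;
}

With the frameNumber % 3 version none of this bookkeeping is needed, which is exactly the appeal I'm describing above.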