I know that in order to apply a "smoothing out" algorithm (client-side prediction, extrapolation), one has to take the latency factor into account. But I'm puzzled as to who should calculate it and how it is generally measured.
1. Who gets the burden?
-Does the server do the calculation and tell the client over time?
-Does the client do it (since it's the client that needs the smoothing)?
-Do the server and client each keep their own ping time?
2. How would one calculate it?
Right now I simply use the simplistic ping approach (e.g., send a "ping" message and the server quickly throws back a reply).
However, it was too simplistic, and the ping time generally changes over time (the samples are pretty random). Hence the second question. But if we use the average ping time, it sometimes won't be correct, right? Plus, if I have to use an average ping time, that means I already need several ping results at hand. What if the game has just begun? Should I spend time measuring the ping first?
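One common answer to the "ping samples are random" problem is to not use a plain average at all, but an exponentially weighted moving average, the same idea TCP uses for its smoothed RTT (SRTT). A minimal sketch (class and names are my own, not from any particular engine):

```python
# Hypothetical EWMA ping estimator. A smoothing factor close to 1 means a new
# sample barely moves the estimate, so one-off spikes are mostly absorbed.
# The first sample seeds the estimate, so it works right after the game starts.
class PingEstimator:
    def __init__(self, alpha=0.875):
        self.alpha = alpha      # smoothing factor (TCP's classic value is 7/8)
        self.srtt = None        # smoothed round-trip time in ms

    def add_sample(self, rtt_ms):
        if self.srtt is None:
            self.srtt = float(rtt_ms)   # no history yet: use the sample as-is
        else:
            self.srtt = self.alpha * self.srtt + (1 - self.alpha) * rtt_ms
        return self.srtt

est = PingEstimator()
for sample in [100, 120, 90, 300, 95]:   # the 300 ms sample is a one-off spike
    est.add_sample(sample)
print(round(est.srtt, 1))                # stays near ~100-125 despite the spike
```

Because new samples keep folding in, the estimate tracks a connection that slowly gets better or worse, without needing a warm-up window of stored results.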
3. How is it generally used?
-The client sends a message that he just pressed the "move forward" button. To hide latency, he immediately moves his avatar forward locally.
-By the time the message reaches the server, however, it's already late, perhaps by 100 ms. So the server should simulate as if the player pressed the button 100 ms ago, i.e., the server must "extrapolate" by 100 ms. CMIIW.
-The client stops pressing the forward key, so he sends a "stop pressing forward" message to the server.
-The server gets the message 100 ms later. Oops, that means our player has moved 100 ms too far, so his position must be corrected by 100 ms
(pos = pos + vel * -0.1). The server then sends a position update message to the player.
-The client receives the message, again 100 ms late. But since this is a corrective position, the client simply sticks his current position to the server's corrective one. He might also use interpolation (so it looks like sliding over rather than snapping) to alleviate graphical jerkiness.
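The steps above can be sketched as a toy 1-D walkthrough (all the numbers are made up; this is just the bookkeeping, not a real netcode loop):

```python
# Client predicts locally, server rewinds by the measured one-way latency,
# client reconciles to the server's corrective position.
LATENCY = 0.1   # assumed one-way latency in seconds
VEL = 5.0       # units per second while "forward" is held

# Client presses forward at t=0 and releases at t=1, predicting locally:
client_pos = VEL * 1.0          # 5.0 units after one second of input

# The "stop" message arrives at the server at t=1.1, so the server has
# integrated the movement 0.1 s too long...
server_pos = VEL * 1.1          # 5.5 units, too far
# ...and applies exactly the pos = pos + vel * -0.1 correction from above:
server_pos += VEL * -LATENCY    # back to 5.0

# Server sends the corrective position; the client snaps (or lerps) to it:
client_pos = server_pos
print(client_pos)               # client and server now agree at 5.0
```

So yes, the general shape in point 3 is right: the server's correction is just undoing the extra movement it simulated during the message's flight time.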
So am I right on number 3?
4. Bonus question. I recently had a chat with some gamedevs over IRC; we were discussing Minecraft. I threw out a question regarding its networking, and some fellow said that the send rate is very low (one guy even said that it's only 5 TIMES A SECOND!! WTF!!). That means it sends every 200 ms. Now I wonder about its update rate. Since Minecraft is kinda "physics"-ish, I bet they use at least a 30 fps update rate. But since I can't look at their sources, I can't confirm my curiosity. Do any of you guys know about this? Thanks a lot.
Those are my questions; I hope you guys can help this confused guy. Anyway, thanks for taking the time to read these blocks of text.
1. This depends on where and why you need it. For smoothing out latency in an FPS you would probably calculate latency on the server, since it needs to know where each client was when one pressed the trigger to see who got shot.
2. There are a few ways. The easiest is just to measure how long the server or client takes to respond to a simple short message.
3. There are quite a few things you could use the latency for. Movement prediction and correction is perhaps the most common use.
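For the correction part, one widely used trick (a sketch of the general technique, not any specific engine's code) is to close only a fraction of the position error each frame instead of snapping, so the fix is spread over several frames and looks like sliding:

```python
# Hypothetical smooth-correction helper: each call moves the local position
# a fixed fraction of the way toward the server's authoritative position,
# so the remaining error decays geometrically instead of snapping at once.
def smooth_correct(local_pos, server_pos, factor=0.2):
    # factor=0.2 closes 20% of the remaining gap per frame
    return local_pos + (server_pos - local_pos) * factor

pos = 0.0          # local (predicted) position
target = 1.0       # server's corrective position
for _ in range(10):
    pos = smooth_correct(pos, target)
print(round(pos, 3))   # after 10 frames: 1 - 0.8**10, about 0.893
```

Tuning the factor trades responsiveness against smoothness: a high factor converges fast but looks jerky, a low one glides but lags behind the server longer.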
4. Minecraft surprisingly doesn't have that much physics in it. And as far as I know its network update rate depends on the server's tick rate, so with a lot of mods it can be as low as once every 2-10 seconds. The tick rate is capped at 20, meaning it would update 20 times a second at most. But the update rate is not that important in a game like Minecraft; a 5 Hz rate would be more than enough. For example, Battlefield 3 and 4 update 10 times a second, and those are twitch shooters where life and death depend on a few milliseconds. This is exactly where latency corrections and such come into play to smooth things out.