poettlr

Member

About poettlr

  • Rank
    Member

Personal Information

  • Role
    DevOps
    Programmer
  • Interests
    Design
    Education
    Programming


  1. Hey guys, I was able to solve clock sync in a way I'm happy with, and I'm now facing a new problem.

Due to client prediction, the client runs ahead of the server by approximately half the RTT. So if the RTT is 166 ms, or 10 ticks, the client is roughly 5 ticks ahead of the server: at the exact same real-time instant the client processes tick 20 while the server processes tick 15.

Due to interpolation, remote entities are behind by RTT + interpolation delay. So at that same instant my client processes tick 20, the server processes tick 15, and my remote entities are rendered at tick 9 (RTT plus 1 tick of interpolation delay, which is probably very low).

In general: at a given server tick X, my local client is at roughly tick (X + RTT/2), and the remote entities on my local client are at roughly tick (X - RTT/2 - interpolation delay), since that is the most recent server state I could have received.

a) Is that correct?

Now, I want my local entity to be able to collide with my remote entities and still predict them as well as possible.

b) Does my local client just collide with the remote state I received? If so, my predicted state could already be wrong by the time it arrives at the server, because remote entities could have moved without me knowing (since they run ahead of the server on their machines). When the server receives my input, it checks collision against the current server state. That state is not the one my client used for its prediction: my client used old but valid state, while the server uses the most recent state (which is RTT/2 ticks ahead of the one my client predicted against) and broadcasts my resulting position to all other clients.

c) Should something else happen entirely? If so, what?

Cheers,

PS: I hope all these questions and answers also help other people.
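PPS: To make the numbers above concrete, here is a rough sketch of how I picture the three timelines. This is just an illustration, not my real code; all names here are placeholders.

using System;

// Placeholder sketch: how the predicted local tick and the interpolated remote tick
// relate to the current server tick.
public static class TickTimelineSketch
{
    const float TickMillis = 1000f / 60f; // 16.6 ms per tick

    public static void Relate(int serverTick, float rttMillis, int interpolationDelayTicks)
    {
        int halfRttTicks = (int)Math.Ceiling(rttMillis / TickMillis) / 2;

        // Local, predicted entity: runs ahead so my inputs arrive at the server just in time.
        int predictedLocalTick = serverTick + halfRttTicks;                          // ~X + RTT/2

        // Remote entities: rendered from the newest snapshot, further delayed for interpolation.
        int remoteRenderTick = serverTick - halfRttTicks - interpolationDelayTicks;  // ~X - RTT/2 - delay

        Console.WriteLine($"server {serverTick}, predicted local {predictedLocalTick}, remote {remoteRenderTick}");
    }
}

With serverTick = 15, rttMillis = 166 and one tick of interpolation delay this reproduces the 20 / 15 / 9 example above.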
  2. Ok, so I can keep the offset, and whenever I notice I'm running too slow on the client I execute two ticks instead of one. Is that what you mean? Or should I alter the tick rate for a couple of milliseconds, like I do at the moment? Also, when I notice the client-to-server offset is too high (meaning the client is way too far in the future), do I freeze and not advance my simulation at all? At the moment I have a 60 Hz network tick rate and a 60 Hz simulation tick rate. Thanks
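Edit: A rough sketch of what I mean by "two ticks instead of one". The thresholds and names are made up for illustration; the point is that the simulation delta time never changes, only the number of fixed ticks executed per frame does.

// Offset convention here: positive = client is further in the future than it needs to be.
public class CatchUpLoopSketch
{
    const int FreezeThreshold = 4;    // way too far ahead of the server -> stall for a frame
    const int CatchUpThreshold = -2;  // behind where it should be -> run an extra tick
    const float FixedDt = 1f / 60f;

    int tick;

    public void FrameUpdate(int clientToServerOffset)
    {
        int ticksThisFrame = 1;
        if (clientToServerOffset > FreezeThreshold) ticksThisFrame = 0;       // freeze
        else if (clientToServerOffset < CatchUpThreshold) ticksThisFrame = 2; // catch up

        for (int i = 0; i < ticksThisFrame; i++)
        {
            Simulate(FixedDt, tick);
            tick++;
        }
    }

    void Simulate(float dt, int simulationTick) { /* fixed-dt simulation step */ }
}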
  3. Oh, the thing with time in the Send and Receive functions is that my network library uses the time to calculate RTT and bandwidth and to check for keep-alive messages. To be honest, Receive is actually called Tick(double time) in my case. When the Tick function is called, all received messages (from the network thread) are pulled and fed to the simulation. It's basically the housekeeping.

This is how the underlying network layer forwards my packets:

while (transport.HasNextPacket())
{
    Datagram packet = transport.GetNextPacket();
    ProcessPacket(packet.payload, packet.payloadSize, packet.sender);
    packet.Release();
}

So basically I receive in a thread via my UDP socket context:

public class UdpSocketContext : ISocketContext
{
    private readonly Socket internalSocket;
    private Thread socketThread;
    // some variables omitted

    public UdpSocketContext(AddressFamily addressFamily)
    {
        internalSocket = new Socket(addressFamily, SocketType.Dgram, ProtocolType.Udp);
    }

    public void Bind(EndPoint endpoint)
    {
        internalSocket.Bind(endpoint);
        socketThread = new Thread(RunSocket);
        socketThread.Start();
    }

    private void RunSocket()
    {
        while (true)
        {
            try
            {
                datagramQueue.ReadFrom(internalSocket);
            }
            catch (Exception e)
            {
                if (e is SocketException)
                {
                    var socketException = e as SocketException;
                    if (socketException.SocketErrorCode == SocketError.ConnectionReset)
                        continue;
                }
                return;
            }
        }
    }
}

That all being said, my tick number is mostly application agnostic, and if I have messages that need to run at a specific tick I include the tick number in that message. A simple example of a message with a tick number is my input message, or the minimal gameplay state message:

public class InputMessage : Message
{
    public int clientTick;
    public Input input;

    public InputMessage(Input input, int clientTick) : base(MessageType.Input)
    {
        this.input = input;
        this.clientTick = clientTick;
    }
    // serialization omitted
}

public class SimpleGamePlayStateMessage : Message
{
    public int clientToServerOffset;

    public SimpleGamePlayStateMessage(int clientToServerOffset) : base(MessageType.ServerStatus)
    {
        this.clientToServerOffset = clientToServerOffset;
    }
    // serialization omitted
}

My entity component system uses a fixed delta time each tick, regardless of the actual time needed to simulate:

void Simulate(double time, int tick)
{
    ecs.ReplaceGlobalTime(time, tick, 1 / 60f);
}

// ...
public void ReplaceGlobalTime(double time, int tick, float delta)
{
    // ...
}

With all of that, if I start my client and server using no time sync at all, I get the following scenario:

* I boot up the server. The server starts to run and increments its tick number.
* Some seconds later I boot up a client. The client starts to run and increments its tick number, effectively being behind by seconds * 60 ticks.
* The client sends input messages to the server.
* The server receives the input messages but ignores them because they are too old.

Currently my old code from ... works pretty well, but it seems to have a major flaw with network jitter (as I said). You have since pointed out the animation flow issue (I guess because I alter the tick rate of the whole simulation, so within a timeframe of roughly one second it sometimes executes 60 * 16.6 ms of simulation, sometimes 64 * 16.6 ms, and sometimes 56 * 16.6 ms, depending on the currently adjusted rate).

So what you suggest, all in all (even though I don't know if I understand it correctly), is to find a way of altering the executed tick without altering the tick rate? How would I achieve that without losing a certain number of ticks?
Or do I really execute all the ticks in between? For example, if my server is at tick 1000 and receives a message from the client that is actually for tick 995, my client needs to run at least 5 extra ticks to be ahead of the server again (to be safe, probably 6). So the server response for tick 1000 would be clientToServerOffset = 5 (the 5 ticks the client lags behind). If that's correct, fine, but do I then save the history of all those catch-up simulation steps as before? Do I send an input message for each of those ticks? Wouldn't it have the same effect on the animation? So many questions :O, sorry
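Edit: To make the "execute all ticks in between" option concrete, this is roughly what I imagine. All names are made up; the idea is that every caught-up tick is a normal tick, so it gets its own history entry and its own input message.

using System.Collections.Generic;

public class TickCatchUpSketch
{
    public struct Input { public float x, y; }
    public struct SimState { public int tick; }

    const float FixedDt = 1f / 60f;
    int tick;
    readonly Dictionary<int, SimState> history = new Dictionary<int, SimState>();

    public void CatchUp(int targetTick)
    {
        while (tick < targetTick)
        {
            Input input = SampleInput();        // or reuse the last sampled input while catching up
            SimState state = Simulate(input, FixedDt);

            history[tick] = state;              // same history/rollback path as a normal tick
            SendInput(tick, input);             // one input message per simulated tick

            tick++;
        }
    }

    Input SampleInput() { return new Input(); }
    SimState Simulate(Input input, float dt) { return new SimState { tick = tick }; }
    void SendInput(int forTick, Input input) { /* hand off to the network layer */ }
}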
  4. But that would mean that I simulate some ticks twice or not at all. That would not really help with my ordering of events, would it? I was fairly confident in my system, but now I have no idea whether it would work. To be honest, all I want to achieve is a 60 fps simulation where the client runs ahead of the server. I am so confused now. Let me try to describe my current problem.

The server has the authority and uses a simple fixed game loop to ensure my simulation runs at 60 Hz:

// ...
double newTime = stopwatch.ElapsedMilliseconds / 1000f;
double frameTime = newTime - currentTime;
currentTime = newTime;
accumulator += frameTime;

while (accumulator >= dt)
{
    Simulation(t);
    accumulator -= dt;
    t += dt;
}
// ...

Each time I call Simulation() I

* receive messages from the client,
* do the calculations for my simulation using a fixed delta time of 16.6 ms,
* send messages to the client,
* increment a tick number.

Therefore my server Simulation function looks roughly like this:

uint tick = 0;

void Simulation(double time)
{
    server.Receive(time);
    gameSimulation.Simulate(time, tick);
    server.Send(time);
    tick++;
}

This function is executed regardless of the number of clients, and tick starts to increment as soon as the server has started. So far so good.

On my Unity client I can't guarantee a smooth execution at 60 Hz, because Update and FixedUpdate are not controlled by my call structure. But for now I can use the Update function (which is called at approximately 120 Hz by Unity). Basically it's the same as on the server, and I use the Update function from my previous post:

private const float TickRate = 1 / 60f;
private float timer = 0f;
private double t = 0;

public void Update()
{
    timer += UnityEngine.Time.deltaTime;
    while (timer >= TickRate)
    {
        timer -= TickRate;
        Client_Simulation(t);
        t += TickRate;
    }
}

The Client_Simulation function is basically the same as the one on the server:

uint tick = 0;

void Client_Simulation(double time)
{
    Client.Instance.Receive(time);
    gameSimulation.Simulate(time, tick);
    Client.Instance.Send(time);
    tick++;
}

But from what I understand, somewhere in this function I have to handle the client tick in a very specific way that differs from what I have been doing. Honestly, I'm lost. Can you please point me in the direction I have to go to achieve "a client tick that is a bit ahead"? Help me, hplus0603. You're my only hope.
  5. Ok, wow, now I'm confused, sorry. I don't have a function like time-to-tick. As soon as my server starts, it ticks and increments its local tick number every 16.6 ms. When a client starts, it executes the following:

public class GameLoop : MonoBehaviour
{
    private const float TickRate = 60f;
    private const float NetRate = 1 / TickRate;
    private const float SlowRate = 1 / (TickRate - 3f);
    private const float FastRate = 1 / (TickRate + 3f);
    private const int MinimumJitterBuffer = 2;

    private float netTimer;
    private float adjustedRate = NetRate;
    private int addedJitterBuffer;
    private int tick;
    private int offset;

    public void Update()
    {
        netTimer += UnityEngine.Time.deltaTime;
        while (netTimer >= adjustedRate)
        {
            netTimer -= adjustedRate;

            int lastServerTick = Client.Instance.NetworkInfo.LastPacketServerTick;
            float rttMillis = Client.Instance.NetworkInfo.RttMillis;
            bool loss = Client.Instance.NetworkInfo.HadLoss;

            AdjustTickRate(lastServerTick, loss, rttMillis);

            // Note: regardless of the adjusted rate we always use a simulation delta time of 16.6 ms.
            // Therefore, if the adjusted rate is faster, more simulation steps happen per second.
            tickFunction.Invoke(systemHelper.GetTimeSinceStartup(), tick++);
            Client.Instance.NetworkInfo.SetClientTick(tick);
        }
    }

    private void AdjustTickRate(int lastKnownServerTick, bool loss, float rttMillis)
    {
        int rttAsTick = Mathf.CeilToInt(rttMillis / (NetRate * 1000f));
        if (loss && addedJitterBuffer < rttAsTick + 10)
            addedJitterBuffer += 2;

        // The last received offset from the server. An offset of 0 means the server received the
        // client message associated with a tick exactly at the frame it is needed.
        // Note: the server should always run behind the client.
        offset = Client.Instance.NetworkInfo.ClientOffset;

        // For a new connection the server tick will be higher. The client has to snap.
        if (offset < -30 || offset > 30)
        {
            tick = lastKnownServerTick + rttAsTick + MinimumJitterBuffer + addedJitterBuffer;
            adjustedRate = NetRate;
            return;
        }

        if (offset < -2 - MinimumJitterBuffer - addedJitterBuffer)
        {
            // The client runs too far in the future and should be dialed back a little.
            adjustedRate = SlowRate;
        }
        else if (offset > -1 - MinimumJitterBuffer - addedJitterBuffer)
        {
            // The client falls behind the server and should be running faster.
            adjustedRate = FastRate;
        }
        else
        {
            adjustedRate = NetRate;
        }

        if (addedJitterBuffer > 0)
            addedJitterBuffer--;
    }
}

The "tickFunction" itself is basically the following:

public void TickFunction(float time, int tick)
{
    Client.Instance.Receive(time);
    // Simulate
    Client.Instance.Send(time);
}

Both messages from my server and messages from my client include the current tick: server messages carry the server tick plus the client's offset, and client messages carry the tick at which the message should be executed (which has to be a greater tick number than the server's current tick). Is that what you mean?

The minimum information my server has to send to the client (at the moment) is the current server tick, the client-to-server offset, and a flag indicating that the offset was definitely too small (I know I can remove the flag, but I also send it when the offset is 0). The offset is simply calculated via

int offset = (int)(tick - inputCollector.GetLastReceivedClientTick(id));

The overall problem I observe is that the client is not able to guarantee delivery, so my server has to duplicate input (I send at least the last three inputs my client has sampled).
Again, I'm really sorry that I can't follow you; it might be an issue with me not fully understanding all the multiplayer nomenclature. That being said, I really appreciate all the help I can get. Thanks!
  6. Could you elaborate on this?
  7. Oh yeah, I missed a return there. My plan was to tick faster on the client to fill the server-side buffer faster, but this could actually have no effect, to be honest. I just don't want the situation where the server has no input from a client. After thinking about it, though: if I lose tick 105 on the client and my RTT is 10 ticks, it would take 10 ticks until I actually notice the loss on the client. So yeah, I could actually remove that altogether. Any other suggestions?
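Edit: For context, the input duplication I mention elsewhere in the thread looks roughly like this (simplified sketch, names made up): every input packet carries the last few inputs, so a single lost packet rarely leaves a hole in the server's buffer.

using System.Collections.Generic;

public class RedundantInputSketch
{
    public struct Input { public int tick; public float x, y; }

    const int Redundancy = 3;
    readonly Queue<Input> recent = new Queue<Input>();

    public Input[] BuildPacket(Input newest)
    {
        recent.Enqueue(newest);
        while (recent.Count > Redundancy)
            recent.Dequeue();

        return recent.ToArray(); // the server simply ignores ticks it already has buffered
    }
}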
  8. So, after not getting this to work as expected, I stripped out everything not related to the matter. The goals follow the Overwatch networking model:

* Run the client ahead of the server.
* If a packet is lost, tell the client to tick slightly faster until it's fine again.

So what's happening on my server: each tick the server

* grabs the stored input for this tick from the buffer; if it does not exist, we set a flag,
* integrates the simulation,
* sends a state message to the client.

public struct ServerStateMessage
{
    public int serverTickAtTimeOfSending;
    public bool lossDetected;
}

So every time the server does not have input to process, the client will know.

Each tick the client

* reads the ServerStateMessages (duh),
* calculates the estimated tick it needs to be at for its packets to arrive early enough,
* samples input,
* integrates the simulation,
* sends its input and current tick.

public struct InputMessage
{
    public int clientTickAtTimeOfSending;
    public Input input;
}

That describes the basic loop both of them run. So what exactly does the client do to be "ahead" of the server?

private const int MinimumTickAhead = 1;
private int addedTickAhead = 0;
private const float NetDt = 1 / 60f;
private const float NetDtInMs = NetDt * 1000f;
private int tick = 0;

private void Tick()
{
    ServerStateMessage lastState = // some function to retrieve the last received state.
    int rttAsTick = Math.Max(Mathf.CeilToInt(Client.Instance.NetworkInfo.RttMillis / NetDtInMs), 1);

    if (lastState.lossDetected && addedTickAhead < rttAsTick + 1) // not sure about the second part of that condition
        addedTickAhead += 2;

    int estimatedTickToBeAhead = lastState.serverTickAtTimeOfSending + rttAsTick + MinimumTickAhead + addedTickAhead;
    int diff = estimatedTickToBeAhead - tick;

    if (diff < -60 || diff > 60) // 1 second off
        tick = estimatedTickToBeAhead; // I assume that will only happen at the beginning of the game for now.

    if (diff < -2)
    {
        // Local tick is ahead of the estimate. Tick slower.
        simulationTickRate = 1 / (60f - 3f);
    }
    else if (diff > 2)
    {
        // Local tick is behind the estimate. Tick faster.
        simulationTickRate = 1 / (60f + 3f);
    }
    else
    {
        // Local tick is near the estimate. Tick normally.
        simulationTickRate = NetDt;
    }

    if (addedTickAhead > 0)
        addedTickAhead--;

    // Simulation
    // SendInput
    tick++;
}

So far this works pretty well. Do any of you see a major flaw? Unfortunately this is the first time I'm doing something like this. Cheers
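Edit: For completeness, here is a rough sketch of the server side of that loop as I described it in the bullets above. This is not my exact code; the buffer and helper names are made up.

using System.Collections.Generic;

public class ServerTickSketch
{
    public struct Input { public int tick; public float x, y; }
    public struct ServerStateMessage { public int serverTickAtTimeOfSending; public bool lossDetected; }

    readonly Dictionary<int, Input> inputBuffer = new Dictionary<int, Input>();
    int serverTick;

    public ServerStateMessage Tick()
    {
        // Pull the buffered input for this tick; flag a miss if it never arrived.
        bool loss = !inputBuffer.TryGetValue(serverTick, out Input input);
        if (loss)
            input = default(Input);            // a real build would reuse the last known input instead

        Simulate(input, 1f / 60f);             // fixed-dt world step
        inputBuffer.Remove(serverTick);

        var msg = new ServerStateMessage { serverTickAtTimeOfSending = serverTick, lossDetected = loss };
        serverTick++;
        return msg;                            // sent to the client every tick
    }

    void Simulate(Input input, float dt) { /* integrate the simulation */ }
}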
  9. I tried both running it on the same machine and running it on two different machines in the same local network. I use UDP; to be more specific, it is based on netcode.io + reliable.io by Glenn Fiedler. I can give you more details, but that is basically it. For simulating different network conditions (loss, lag and so on) I use the Windows tool clumsy (https://jagt.github.io/clumsy/), which works very well for that.

I just ran the simulation again (on a different machine than before, but running client and server on it) with the following results:

RTT = 16.63 ms, loss = 0%:
The offset the server sends to the client is 1. The server receives client packets ~1 tick ahead (see log output):

[Info - 2018-05-15 18:20:34Z] Client is ahead (Srvtick-17869 || Clttick-17870)
[Info - 2018-05-15 18:20:34Z] Client is ahead (Srvtick-17870 || Clttick-17871)
[Info - 2018-05-15 18:20:34Z] Client is ahead (Srvtick-17871 || Clttick-17872)

This output is generated when the server receives a packet; the client tick is the tick included in that packet. Note: on the client I am actually 5 ticks ahead when sending the message, which seems odd, because 5 ticks are about 80 ms. Even if I lose 1 tick (because I send and receive packets at the beginning of the frame on both client and server), it should not be 5 ticks, should it?

RTT = 126 ms (clumsy adding 50 ms of lag on inbound and outbound packets), loss = 0%:
The offset the server sends to the client is still ~1-2. The server receives client packets ~2 ticks ahead based on my log output. On the client, the last received server tick is ~13 ticks behind, so the client now runs 13 ticks in the future, which is nearly twice the RTT.

Is there any other information I can provide?

Edit: I should note that the "Client runs ahead of server ~ 2 ticks" in my previous post was the offset the server sent to the client at the given time.
  10. Hey, I have tried out the offset approach for a while now and got pretty good results in a low-latency environment, but I am unhappy with what I get at higher latency.

To clarify: currently my server and client both tick the network and the simulation at 60 Hz. Each tick the client sends a message with its current tick:

public struct Input
{
    public ulong tick;
    public Vector2 stick;
}

On the server the input message is used in the following way:

public void OnPlayerInput(Input input)
{
    lastClientTick = input.tick;
    inputQueue.Enqueue(input, lastClientTick);
}

Each tick the server sends a message to the client containing the offset:

public struct TimeInfo
{
    public ulong serverTick;
    public double serverTime;
    public int clientToServerOffset;
}

This message is filled in the following way:

public void Simulate(ulong currentServerTick, double currentServerTime)
{
    TimeInfo t = new TimeInfo();
    t.serverTick = currentServerTick;
    t.serverTime = currentServerTime;
    t.clientToServerOffset = (int)(lastClientTick - currentServerTick + 1);
    // send t to the client.
}

The client uses the TimeInfo to adjust its tick rate:

ulong clientTick;
const float tickRate = 1 / 60f;
float adjustedRate = tickRate;

private void AdjustTickRate(TimeInfo t)
{
    int offset = t.clientToServerOffset;
    if (offset > 180 || offset < -180)
    {
        clientTick = t.serverTick;
        return;
    }

    if (offset < -32)      { adjustedRate = tickRate * 0.75f; }
    else if (offset < -15) { adjustedRate = tickRate * 0.875f; }
    else if (offset < 1)   { adjustedRate = tickRate * 0.9375f; }
    else if (offset > 32)  { adjustedRate = tickRate * 1.25f; }
    else if (offset > 15)  { adjustedRate = tickRate * 1.125f; }
    else if (offset > 8)   { adjustedRate = tickRate * 1.0625f; }
    else                   { adjustedRate = tickRate; }
}

So if the offset is either too high or too low, I snap to the server tick (assuming I just connected). Otherwise I adjust my local tick rate to send/receive and simulate faster or slower than 60 Hz, while still using a delta time of 16.6 ms for the simulation.

Results:

* RTT ~20 ms, loss 0%: client runs ahead of the server by ~4-5 ticks.
* RTT ~100 ms, loss 0%: client runs ahead of the server by ~2 ticks.
* RTT ~200 ms, loss 0%: client runs ahead of the server by ~2 ticks.

The results seem ok-ish but also wrong to me. For an RTT of 20 ms I expected the client to run ahead by something like 2 or 3 ticks based on my calculation, and for an RTT of 100 ms I'd expect a range of 5-6 ticks. So I think something is off... I hope you guys have an idea, because I am clearly doing something wrong here.

Thanks, cheers,
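Edit: One thing I still want to try (just an idea of mine, not something from the talk): the per-packet offset jitters quite a bit, so smoothing it over a small window before feeding it into AdjustTickRate might give steadier numbers. A rough sketch, with made-up names:

using System.Collections.Generic;

public class OffsetFilterSketch
{
    const int Window = 30;                       // ~half a second of samples at 60 Hz
    readonly Queue<int> samples = new Queue<int>();

    public int Push(int clientToServerOffset)
    {
        samples.Enqueue(clientToServerOffset);
        while (samples.Count > Window)
            samples.Dequeue();

        int sum = 0;
        foreach (int s in samples) sum += s;
        return sum / samples.Count;              // feed this smoothed value into AdjustTickRate
    }
}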
  11. Thanks for your replies; I was on a small vacation, hence my late response. Big thanks for the synchronization code insights. I will sweep over the code, and as soon as I have a confident version of my own I will share it here. Cheers
  12. I just tried that, in a simple Unity Update loop, with two different flavors.

a) Advance the simulation N times if needed, but don't advance (and instead adjust the tick backwards) if we are too far in the future:

ulong clientTick;

// not pretty
void Update()
{
    int offset = GetLastServerOffset();
    int ticksToSimulate = 0;

    if (offset < -32)      { ticksToSimulate = (int)(Math.Abs(offset) * 1f / 4f); }
    else if (offset < -15) { ticksToSimulate = (int)(Math.Abs(offset) * 1f / 8f); }
    else if (offset < 0)   { ticksToSimulate = 2; }
    else if (offset > 32)  { clientTick -= (ulong)(offset / 4); } // ?
    else if (offset > 15)  { clientTick -= (ulong)(offset / 8); } // ?
    else if (offset > 8)   { clientTick -= 1; }                   // ?
    else                   { ticksToSimulate = 1; }               // offset >= 0 && offset <= 8

    if (ticksToSimulate > 0)
        SimulateFor(ticksToSimulate);
}

That... well, it did not work as expected. Setting the clientTick back to the past is probably not a good idea either?

b) Based on the offset, adjust the time between ticks without touching the actual tick delta used by the simulation. If the client is too far behind, Simulate() is called more often, but it still uses a fixed delta of 16 ms internally:

ulong clientTick;
float tickRate = 1 / 60f;
float adjustedTickRate = 1 / 60f;
double lastTickTime = 0;

// not pretty
// Note: hacky Unity Update running at 300+ fps
void Update()
{
    if (lastTickTime + adjustedTickRate <= Time.time)
    {
        int offset = GetLastServerOffset();

        if (offset < -32)      { adjustedTickRate = tickRate * 0.75f; }
        else if (offset < -15) { adjustedTickRate = tickRate * 0.875f; }
        else if (offset < 0)   { adjustedTickRate = tickRate * 0.9375f; }
        else if (offset > 32)  { adjustedTickRate = tickRate * 1.25f; }
        else if (offset > 15)  { adjustedTickRate = tickRate * 1.125f; }
        else if (offset > 8)   { adjustedTickRate = tickRate * 1.0625f; }
        else                   { adjustedTickRate = tickRate; }

        Simulate();
        lastTickTime = Time.time;
    }
}

Do you have any pointers on how to adjust the client tick in a better way? Did you have something like this in mind? Both versions seem to need a fail-safe that snaps the clientTick close to the server tick when it is too far off, and the second version (b) takes a very long time to adjust the client tick to roughly the server tick. Maybe mix both versions?

Another thing bugging my mind: should I take the RTT into account and recalculate the offset on the client side before using it for adjustments? In a "hey, I already did that adjustment in the last frame, I should be fine" kind of way.

Cheers
  13. Wow, that's super helpful! I actually use dedicated servers, but they are managed via a REST API. So for now my last big problem is client/server time synchronization; after searching through the forum I have some plans for how I want to approach it.

I plan to include the server time and server tick (at the time of sending the packet) in all state sync packets, or to send the server time/tick at a given interval (be it 5 Hz or less). Since my network layer already provides me with the RTT, I think I have all the variables in place to do that. Probably something involving stopwatch.ElapsedMilliseconds since the last received packet on the client, or just using the RTT, since the game should run on a fixed loop anyway.

Is there a list of possible frame/time sync algorithms or a paper to read into?
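Edit: To sketch my current plan (placeholder names, nothing final): estimate the server's current tick from the last tick it sent, the time elapsed since that packet arrived, and half the RTT.

using System.Diagnostics;

public class ServerTickEstimatorSketch
{
    const float TickMillis = 1000f / 60f;

    int lastServerTick;
    readonly Stopwatch sinceLastPacket = new Stopwatch();

    public void OnServerPacket(int serverTick)
    {
        lastServerTick = serverTick;
        sinceLastPacket.Restart();
    }

    public int EstimateServerTickNow(float rttMillis)
    {
        // Time since the packet left the server ≈ elapsed local time + half the RTT.
        float elapsed = (float)sinceLastPacket.Elapsed.TotalMilliseconds;
        return lastServerTick + (int)((elapsed + rttMillis * 0.5f) / TickMillis);
    }
}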
  14. Hello,

This is actually my first post after being a lurker here for quite some time. For the last couple of days I have been trying to get my head around the concept of Overwatch's multiplayer architecture after watching the GDC 2017 talks by Timothy Ford ("Overwatch Gameplay Architecture and Netcode") and Dan Reed ("Networking Scripted Weapons and Abilities in Overwatch"). (I hope someone here has Vault access :D)

As usual with such complex systems, Overwatch seems to combine different approaches to hide latency. Among others, the ones I am most interested in are command frames and state synchronization. I think I understand both concepts, but I have problems putting them together.

1) Ford talks about the client being in the future by roughly half the RTT plus one command frame. This ensures that all commands sent from the client to the server tend to arrive when the server is actually at the tick referenced in the command. If that's correct, my assumption would be the following:

* Server is at tick 1000. RTT is ~10 ticks, i.e. around 10 * 16.6 ms.
* Client simulates tick 1005 and sends the command associated with tick 1005.
* <5 ticks later> Server is at tick 1005 and has received the client command (maybe it is already queued in a buffer). The server applies the command and sends the state for tick 1006 (State_N + Input_N = State_N+1). The RTT might still be 10 ticks. The client simulates tick 1010.
* <5 ticks later> Server is at tick 1010. The client receives the state for tick 1005 and checks its internal buffers for its prediction.

Does that really apply? Does the client really simulate half an RTT in the future?

2) How do I handle ticks at the start of the game? My network layer requires a timestamp to work, and I'd use ticks in the main game loop. Do I have something like a grace period until the client can calculate the number of ticks it needs to move into the future (by calling simulation.Tick(1/60f) that many times)?

3) If I run the simulation at 60 Hz and the network layer at, say, 20 Hz, do I send 60 inputs from the client to the server, or 20?

I know this is somewhat similar to other questions in this forum, but I feel like this particular talk has never been discussed here.

Cheers,
poettlr
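PS: Regarding question 1, this is how I currently read the "half the RTT plus one command frame" rule. It is just my interpretation of the talk, and the names below are made up:

public static class CommandFrameMath
{
    // Target tick for the client = server tick + half the RTT (in ticks) + one command frame of slack.
    public static int TargetClientTick(int serverTick, float rttMillis,
                                       int commandFrameSlack = 1, float tickMillis = 1000f / 60f)
    {
        int halfRttTicks = (int)System.Math.Ceiling(rttMillis / tickMillis / 2f);
        return serverTick + halfRttTicks + commandFrameSlack;
    }
}

With serverTick = 1000 and an RTT of about 166 ms (~10 ticks) this returns 1006; the walk-through above lands on 1005 because it leaves out the extra command frame of slack.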