NetworkDev19

  1. NetworkDev19

    Shooting & Catching Edges In Prediction

    Oh yeah, I wasn't going for the full determinism route, changing compiler flags and all that; I was just looking for the next best thing. That's why I was thinking fixed point is the only real other option, but when the other person mentioned rounding I thought maybe there was something else.
  2. NetworkDev19

    Shooting & Catching Edges In Prediction

    Well, rounding alone doesn't work deterministically, right? If I round 5.1999 to the 2nd decimal place, one machine may say it's 5.20 and another machine may say it's 5.19, because the underlying values can differ by a hair right at the rounding boundary. At that point you have to start using fixed point arithmetic. Sadly, I don't think I can use fixed point stuff because I'm already using Unity math libs and raycasting for a lot of my movement calculations.
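    To make that concrete, here is a minimal standalone C# illustration; the two values are hand-picked to straddle a rounding boundary, whereas in practice two machines would differ by only an ulp or two, but the effect is the same:

    ```csharp
    using System;

    class RoundingBoundary
    {
        static void Main()
        {
            // Two "equal" positions that drifted apart by a tiny float error,
            // landing on opposite sides of the x.xx5 rounding boundary.
            double onClient = 5.19499999999;   // a hair below 5.195
            double onServer = 5.19500000001;   // a hair above 5.195

            // Rounding to 2 decimals now disagrees: 5.19 vs 5.2.
            Console.WriteLine(Math.Round(onClient, 2, MidpointRounding.AwayFromZero));
            Console.WriteLine(Math.Round(onServer, 2, MidpointRounding.AwayFromZero));
        }
    }
    ```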
  3. NetworkDev19

    Shooting & Catching Edges In Prediction

    So like fixed-point positions (e.g. round to the nearest nth decimal)? It was something I considered, but I wasn't sure if applying rounding would lead to some oddities in movement, like getting stuck in objects for example. An additional layer of validation would be necessary after the rounding, I think.
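    A minimal sketch of that kind of position quantization in Unity C# (the grid size is an arbitrary assumption; a power-of-two step keeps the multiply/divide exact in binary floats, and as noted above the snapped position would still need validating against geometry):

    ```csharp
    using UnityEngine;

    public static class PositionQuantizer
    {
        // Assumption: snap to a 1/256-unit grid; tune per game.
        const float GridStep = 1f / 256f;

        public static Vector3 Snap(Vector3 p)
        {
            return new Vector3(
                Mathf.Round(p.x / GridStep) * GridStep,
                Mathf.Round(p.y / GridStep) * GridStep,
                Mathf.Round(p.z / GridStep) * GridStep);
        }
    }
    ```

    Note this narrows divergence but doesn't eliminate it: as discussed above, two values sitting right on a grid boundary can still snap to different cells.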
  4. NetworkDev19

    Shooting & Catching Edges In Prediction

    Supposedly they 'predict' rockets too, according to their GDC architecture talk. They said it was hard, but they made it possible. They also make a big deal about specially handling players who activate dodge-type abilities like Reaper's shadow form; thankfully I don't have that kind of thing. You're definitely right about it being fun, and at some point players will have an acceptable threshold of believability, especially with moving targets. But in competitive games this kind of stuff comes under scrutiny pretty often and can generate a lot of negative buzz. I'm thinking of ways to be as accurate as possible without full determinism, but you can only get so far, I guess. Networking is something alright 😛
  5. NetworkDev19

    Shooting & Catching Edges In Prediction

    Oh yeah, I know how deep determinism can get for sure, with all the floating point precision issues. But funny enough, the Overwatch devs say that they ignore fractional differences in their simulation when correcting positions. So I would expect my problem would be big for them, too.
  6. NetworkDev19

    Shooting & Catching Edges In Prediction

    Yeah, I was doing the hit indicator after the server gets the result, but was doing the hit FX predicted, like Overwatch does. Probably best just to not do it at all until the server says so. I wonder how Overwatch gets their stuff to be so deterministic that this doesn't happen to them; at least, they seem to claim they can't get it to happen very often. But this case is super easy to repro: just light a target up with gunfire and eventually you'll "nick" the hitbox, which could be fractionally different on the server.
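    A minimal sketch of the fully server-authoritative variant, where cosmetic feedback (tracers, muzzle flash) still plays instantly but blood/hit FX wait for the verdict; every name here is invented for illustration:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    public class DeferredHitFx : MonoBehaviour
    {
        // Shots fired locally that the server hasn't judged yet, by shot id.
        readonly HashSet<int> pendingShots = new HashSet<int>();
        int nextShotId;

        public int OnFiredLocally()
        {
            // Play tracer/muzzle flash here immediately; they look fine
            // even if the shot later turns out to be a miss.
            int id = nextShotId++;
            pendingShots.Add(id);
            return id;                        // echoed back by the server
        }

        public void OnServerVerdict(int shotId, bool hit, Vector3 hitPoint)
        {
            if (!pendingShots.Remove(shotId)) return;
            if (hit) SpawnBloodFx(hitPoint);  // only ever shown post-confirmation
        }

        void SpawnBloodFx(Vector3 p) { /* instantiate impact/blood FX */ }
    }
    ```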
  7. NetworkDev19

    Fighting Game Over Network

    I just figured this out myself. Client view interpolation is the amount of time the client has interpolated between 2 world states it has received. Think of it as the client running the game world in the past: it knows about 2 world states and is waiting for a 3rd. It interpolates between the 2 world states while it waits for the 3rd; otherwise players would visually "teleport" around. The server needs to know which 2 world states the player is interpolating between and *how far they've interpolated between them* in order to accurately recreate the state the player was seeing when they issued the shoot command. Or at least as close an estimate as possible. Hope that helps.
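    A rough sketch of that idea in Unity C# (struct and field names are illustrative, not from any particular engine): the client stamps each shoot command with the two snapshot ticks and its blend fraction, and the server lerps the same two historical positions to rebuild what the shooter saw.

    ```csharp
    using UnityEngine;

    public struct ShootCommand
    {
        public int   FromTick;   // older snapshot being interpolated from
        public int   ToTick;     // newer snapshot being interpolated toward
        public float Fraction;   // 0..1: how far between them the view was
    }

    public static class ViewInterpolation
    {
        // Used on the client when rendering, and again on the server when
        // rewinding targets for lag compensation, so both see the same pose.
        public static Vector3 Blend(Vector3 fromPos, Vector3 toPos, float fraction)
        {
            return Vector3.Lerp(fromPos, toPos, Mathf.Clamp01(fraction));
        }
    }
    ```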
  8. NetworkDev19

    Shooting & Catching Edges In Prediction

    I have a situation where the client is shooting at a target player, and is predicting hits just on the edge(s) of the player. This target player is standing still, so no interpolation is involved. The client predicts with a hitscan weapon that he did in fact hit this target just on the edge. The server reproduces this shot with the exact same origin and direction, but does not get the same result: it believes the bullet just barely missed.

    The client and server positions of the target are nearly identical, which I believe is the issue. It's part of determinism (or the lack thereof, I guess). The server's position is exact where the client's position is ever so slightly off due to floating point differences. So that hair of a difference on the client is enough for the raycast to say "yup, hit him" and the server to say "nope, missed".

    I have 2 ideas for maybe how to fix this, but I'm curious what people think/prefer. One idea is to prevent showing blood/damage/hit FX unless the server recognizes the shot, rather than predicting it. The client will just have to live with blood being laggy if they have high latency. But then keen-eyed players will see the discrepancy, I believe, thinking they got a shot on the target in their game state but the server "unfairly" denied it.

    Another is to somehow enlarge the raycast on the server: give it a larger-than-client-side radius to "compensate" for such a discrepancy (see the sketch below). The danger of this is that laggy clients who think they got shots off, even with lag compensation turned on to account for interpolation like the good ol' Valve method, may end up with falsely valid shots because the radius is large. Perhaps this is something I should tweak based on ping?
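    A minimal Unity C# sketch of that second idea, falling back to a thin SphereCast on the server when the exact Raycast misses (the tolerance radius is an arbitrary assumption to tune, perhaps per-ping):

    ```csharp
    using UnityEngine;

    public static class ServerHitscan
    {
        // Assumption: a few millimeters of slack, just enough to absorb
        // client/server float drift without handing out free hits.
        const float EdgeTolerance = 0.005f;

        public static bool TestShot(Vector3 origin, Vector3 direction,
                                    float maxRange, int hitMask, out RaycastHit hit)
        {
            // Exact test first: if the plain ray hits, trust it as-is.
            if (Physics.Raycast(origin, direction, out hit, maxRange, hitMask))
                return true;

            // Retry as a thin sphere cast so a shot that grazed the hitbox
            // on the client by a float-error margin still registers.
            return Physics.SphereCast(origin, EdgeTolerance, direction,
                                      out hit, maxRange, hitMask);
        }
    }
    ```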
  9. NetworkDev19

    Understanding Lag Compensation & Timestamps

    I *think* I figured out why it was oscillating so much. At a tickrate of 60, my server was ticking sporadically because performance is not refined yet. I started counting how many ticks occur in an Update() loop in Unity, and it would bounce between 1 and 4 frequently. This means the game client has to continually change its guess. I dropped the tickrate to 30 (my server is currently running around 40 FPS anyway), and now I get a much more stable guess at the server's tick rate; the delta stays closer to 2 after the game client connects and syncs up a bit. I'll have to ramp the tickrate back up when I get better performance.
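    For context, this burstiness is the normal behavior of a fixed-timestep accumulator: a slow frame banks several tick intervals and then runs them all at once, which is exactly what makes the client's estimate jump. A generic sketch (not the FPS sample's actual code):

    ```csharp
    using UnityEngine;

    public class ServerTickLoop : MonoBehaviour
    {
        const float TickRate = 30f;                 // ticks per second
        const float TickInterval = 1f / TickRate;   // seconds per tick

        float accumulator;
        public int Tick { get; private set; }

        void Update()
        {
            accumulator += Time.deltaTime;

            // After a 100ms frame at 30Hz, three intervals have banked up,
            // so three ticks run back-to-back in this one Update().
            while (accumulator >= TickInterval)
            {
                accumulator -= TickInterval;
                Simulate(Tick++);
            }
        }

        void Simulate(int tick) { /* advance the world one fixed step */ }
    }
    ```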
  10. NetworkDev19

    Understanding Lag Compensation & Timestamps

    At the moment, I have the server sending the client: the server's current tick (e.g. tick 100), and the latest tick for commands it has received from that client (e.g. tick 105). Using this information, the client assumes the server has 5 commands buffered for that client (ticks 100-105). This is the delta. I have a setting that says "if the delta is > X, tick slightly slower" and "if the delta is < Y, tick slightly faster" (see the sketch below). This appears to somewhat work, but it has a tendency to bounce around a lot when I set it to 2. I would expect the delta to stay close to 2 commands buffered, but it oscillates between 0 and upwards of 3-4 rather rapidly, and it never settles on 2 for long at any latency I throw at it, from 0 to 200ms. The Unity FPS sample does something pretty much identical, but it stays close to 2-3 when set to 2. I'm wondering if maybe my tick timestep is wrong somehow. You suggested sending the adjustment value "Z" to the server; I presume you mean the # of ticks the client is guessing it should be ahead. Can you expand on that and how it would help? I don't mind the extra 4 bytes if it keeps my command buffer rock solid. Thank you again, by the way.
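    A minimal sketch of that delta-driven adjustment, scaling the client's tick interval instead of hard-stepping the clock (the thresholds and the 5% nudge are arbitrary assumptions):

    ```csharp
    public static class TickRateControl
    {
        const int   TargetBuffer = 2;       // desired commands buffered ahead
        const float BaseInterval = 1f / 60f;
        const float Nudge = 0.05f;          // assumption: 5% speed change per step

        // delta = newest command tick the server holds - server's current tick
        public static float NextTickInterval(int delta)
        {
            if (delta > TargetBuffer)       // over-buffered: tick slightly slower
                return BaseInterval * (1f + Nudge);
            if (delta < TargetBuffer)       // starving the server: tick slightly faster
                return BaseInterval * (1f - Nudge);
            return BaseInterval;            // on target
        }
    }
    ```

    One thing worth noting: a hard faster/slower toggle like this tends to oscillate around the target; damping the nudge as the delta approaches the target is a common smoothing tweak.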
  11. NetworkDev19

    Understanding Lag Compensation & Timestamps

    So you think I should ditch the Unity FPS sample's direction and go for something where:

    1) On login, the client sends a local timestamp (UTC ticks or something).
    2) The server sees this, computes the delta with its own timestamp, and sends the client the result.
    3) The client will now use its local timestamp + delta to send future commands to the server.

    Am I understanding you correctly?
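    A bare-bones sketch of that exchange (message and field names invented for illustration; a real version would sample it several times and account for the one-way trip, since the raw delta silently includes it):

    ```csharp
    using System;

    // 1) Client -> server on login.
    public struct TimeSyncRequest
    {
        public long ClientUtcTicks;    // DateTime.UtcNow.Ticks at send time
    }

    // 2) Server -> client reply.
    public struct TimeSyncResponse
    {
        public long ClockDeltaTicks;   // server clock minus client clock
    }

    public static class TimeSync
    {
        public static TimeSyncResponse HandleOnServer(TimeSyncRequest req)
        {
            return new TimeSyncResponse
            {
                ClockDeltaTicks = DateTime.UtcNow.Ticks - req.ClientUtcTicks
            };
        }

        // 3) Client stamps future commands with its clock shifted into server time.
        public static long ServerTimeNowTicks(long clockDeltaTicks)
        {
            return DateTime.UtcNow.Ticks + clockDeltaTicks;
        }
    }
    ```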
  12. NetworkDev19

    Understanding Lag Compensation & Timestamps

    Hello again. I'm just posting to run down ideas out loud, and I'm curious if anyone has suggestions for improving this, or problems with the design. I believe it's what's in the Unity FPS sample.

    The client waits for 2 world snapshots from the server (it uses delta encoding, so it has to do this anyway). In each world snapshot, the server includes its current game time tick. The game client initializes its own game time tick to:

        serverTick + ((timeSinceLastSnapshotInMS + rtt) / 1000) * 60 + 2

    where 60 is the tickrate of the server and 2 is the extra buffer of ticks.

    The game client keeps doing this over time: each time it ticks, it runs this calculation, and if its current tick number is too far behind or too far ahead of the result, it simply hard-sets its tick number to that result (see the sketch below).

    This seems to work great for a while. But eventually the client drifts or hiccups, ends up too far ahead of or behind the server, and that results in a missed command. It happens maybe every 15-20 seconds. It's annoying, and I'm beginning to think I should ditch the design and go for something else.
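    A sketch of that calculation and the hard-reset rule as described (the constants mirror the post; the snap threshold is an assumption):

    ```csharp
    using UnityEngine;

    public class ClientClock : MonoBehaviour
    {
        const int ServerTickRate = 60;
        const int BufferTicks    = 2;   // extra headroom from the post
        const int SnapThreshold  = 4;   // assumption: max drift before a hard reset

        public int ClientTick { get; private set; }

        // Run once per client tick with the freshest network measurements.
        public void Advance(int lastSnapshotServerTick,
                            float timeSinceLastSnapshotMs, float rttMs)
        {
            // serverTick + ((timeSinceSnapshot + rtt) / 1000) * tickRate + buffer
            int targetTick = lastSnapshotServerTick
                + Mathf.RoundToInt((timeSinceLastSnapshotMs + rttMs) / 1000f * ServerTickRate)
                + BufferTicks;

            if (Mathf.Abs(targetTick - ClientTick) > SnapThreshold)
                ClientTick = targetTick;   // hard reset: where a just-sent command gets orphaned
            else
                ClientTick++;              // normal case: advance one tick
        }
    }
    ```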
  13. NetworkDev19

    Understanding Lag Compensation & Timestamps

    I'm counting the simulation ticks in a FixedUpdate loop (Unity). So I'm not using timestamps, but rather ticks that are supposed to occur every 16ms, for example. The client waits until it receives the server's tick, then begins counting upward. It monitors the server's tick, so if the client slows down for some reason or is running too fast, it resets back to the server's tick (+ 1/2 RTT + a 1-3 tick buffer). The problem I'm experiencing is that when this occurs and the clock resets, I'll often lose the 1 input that went out just before the reset. This is the implementation I got from the Unity FPS sample and the Overwatch GDC architecture talk. I'm definitely open to using different clocks; I'm just afraid that if I rely on a particular clock across different devices/CPUs/architectures (it's multiplatform), I may end up seeing issues.
  14. NetworkDev19

    Understanding Lag Compensation & Timestamps

    Thank you yet again! I actually went and got something working. I poked around the Unity multiplayer FPS sample and saw what they were doing with this; I hadn't understood it before, and now it makes sense! Now my client is running ahead of the server in its clock by at least 1 buffered "frame"/tick.

    However, sometimes the client drifts. There's a formula the Unity FPS sample uses for this (the same calculation I describe in another reply above). When the client drifts, it ends up 1 tick behind the server, then resets itself to be 1 tick ahead, which is good. However, the command sent for that tick from the client is always lost. The reason it's lost is that when the server sees a command from a tick in the past, I disregard it.

    Perhaps I shouldn't ignore inputs that are roughly -1 from the current server tick (see the sketch below)? Example: I send move for tick 2 and jump for tick 3. The server is already at tick 3. It sees both inputs, but I was previously dropping the move input (2 < 3). So should I go ahead and perform both the move and the jump on tick 3?
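    A sketch of that "accept slightly late inputs" idea; the 1-tick grace window is exactly the assumption being debated here, and Command is a placeholder type:

    ```csharp
    using System.Collections.Generic;

    public struct Command { /* move/jump payload */ }

    public class ServerInputBuffer
    {
        const int LateGraceTicks = 1;   // accept commands up to 1 tick old

        readonly SortedDictionary<int, List<Command>> buffer =
            new SortedDictionary<int, List<Command>>();

        public void Receive(int commandTick, Command cmd, int serverTick)
        {
            if (commandTick < serverTick - LateGraceTicks)
                return;   // genuinely stale: drop it

            // Pull a slightly-late command forward onto the current tick, so
            // move-for-tick-2 and jump-for-tick-3 both run on tick 3.
            int executeTick = commandTick < serverTick ? serverTick : commandTick;

            if (!buffer.TryGetValue(executeTick, out var list))
                buffer[executeTick] = list = new List<Command>();
            list.Add(cmd);
        }

        public IReadOnlyList<Command> TakeFor(int serverTick)
        {
            if (!buffer.TryGetValue(serverTick, out var list))
                return System.Array.Empty<Command>();
            buffer.Remove(serverTick);
            return list;
        }
    }
    ```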
  15. NetworkDev19

    Understanding Lag Compensation & Timestamps

    Thank you! However, how does RTT translate to ticks of the server? Say I have 100ms ping when I start, I connect to the server, and the server sends me a message saying "I am at tick 120". What tick should the client start at to guarantee it stays ahead? And yes, sorry, I meant Valve's specific lag compensation, in hindsight.
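    Working that example through with the formula quoted in the replies above (the +2 buffer is the usual assumption): at 60 ticks/second a tick is ~16.7ms, so the full 100ms RTT covers both the message's trip to the client and the first command's trip back.

    ```csharp
    // Rough starting-tick arithmetic for the example in this post.
    int   serverTickInMessage = 120;
    float rttMs               = 100f;
    float tickRate            = 60f;
    int   bufferTicks         = 2;     // assumption: extra safety margin

    int rttTicks  = (int)System.Math.Ceiling(rttMs / 1000f * tickRate);   // 6 ticks
    int startTick = serverTickInMessage + rttTicks + bufferTicks;         // 120 + 6 + 2 = 128
    ```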