
Substance12

Member
  • Content Count

    3
  • Joined

  • Last visited

Community Reputation

0 Neutral

About Substance12

  • Rank
    Newbie

Personal Information

  • Role
    Creative Director
  • Interests
    Art
    Design
    Programming
    QA


  1. Substance12

    Accounting for lost packets?

    That was all extremely insightful, thank you! Some of it I already knew about, but the rest seem like good practices that I'll make sure to keep in mind.

    My biggest concern is with inputs and ticks. I'm using the LOVE2D framework, and its update loop is either locked through vsync (usually 60 Hz) or unbounded if vsync is disabled. I believe it checks for inputs more often than 60 times per second, so by the time a new tick happens there has certainly been more than one input. This is why pressing many keys repeatedly gives me a huge desync on the server side, and the buffer with a 100 ms delay seemed to be my solution. One fix I've considered that avoids delaying the packets is limiting how often the client samples input: for example, 30 times a second (the same as the tickrate), which shouldn't introduce any significant input lag and would guarantee exactly one input sent per tick. Is this an acceptable solution?

    About the tickrate = framerate equivalency: I don't plan to implement variable tickrates, as it's a simple enough game that it won't need anything over 60 Hz, but I'll remember that if I ever have to.

    I'm never sending snapshots from client to server, although I'm thinking of implementing P2P between specific clients to send some data, since my game relies on duos of players cooperating with each other, and I'm going to see whether P2P connections for some types of data are better than routing everything through the server first.

    The rotating snapshots concept is interesting; I've never heard of it. Would I just simulate the other players on the client with their inputs and correct them every time the server sends a snapshot? And as for the last part, that makes a lot of sense, thank you!
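The fixed-rate sampling idea above (decoupling input sampling from the frame loop with a time accumulator) can be sketched roughly as follows. This is a minimal illustration only, not the poster's code: the project is in LOVE2D/Lua, and names like `InputSampler` and `poll_input` are assumptions.

```python
TICK_RATE = 30
TICK_DT = 1.0 / TICK_RATE  # ~33 ms between input samples

class InputSampler:
    def __init__(self):
        self.accumulator = 0.0
        self.sequence = 0
        self.outgoing = []  # inputs queued for the next network send

    def update(self, dt, poll_input):
        """Call once per frame with the frame's delta time. Samples input
        at most once per tick interval, so each tick maps to exactly one
        input regardless of how fast the frame loop runs."""
        self.accumulator += dt
        while self.accumulator >= TICK_DT:
            self.accumulator -= TICK_DT
            self.sequence += 1
            self.outgoing.append((self.sequence, poll_input()))

# Simulate one second of a 60 Hz update loop:
sampler = InputSampler()
for frame in range(60):
    sampler.update(1.0 / 60.0, lambda: {"left": False, "right": True})
assert len(sampler.outgoing) == 30  # one sampled input per 30 Hz tick
```

Because the accumulator carries leftover time between frames, this stays correct even when the frame rate is uncapped or uneven.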
  2. Substance12

    Accounting for lost packets?

    I will make sure to keep this in mind. I know that to "fix" the state from the server, there needs to be a maximum difference that is considered acceptable, and if the states differ by more than that, the client corrects itself. Is that right?

    1. The reason I added the 100 ms delay is that it plays the inputs back in the correct order and at the right times, with the precise delay between each key press, albeit 100 ms late. It is the only solution I've found to a major problem: when the client presses a lot of keys repeatedly, the server's representation of the character used to desync horribly. This is an issue I've never seen acknowledged in any article, which makes me wonder if my implementation is wrong.
    2. I've read about this, but I'm not too sure about it. I know the Source Engine does this by delaying inputs 100 ms from rendering, but wouldn't that cause very noticeable input lag? It only seems workable if the client represents the character by interpolating between snapshots, and that's not the way I'm doing it.
    3. This makes sense; however, I feel it's good that, in case one of the packets is lost, the server has something to fall back on.
    4. 33 ms*, my bad.
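The 100 ms delay described in point 1 is essentially a de-jitter (playout) buffer: each input is scheduled at its clock-adjusted client timestamp plus a fixed delay, which preserves the original spacing between key presses even when packets arrive bunched up. A minimal Python sketch under that assumption; `PLAYOUT_DELAY`, `JitterBuffer`, and the timestamps are illustrative, not the poster's actual implementation:

```python
import heapq

PLAYOUT_DELAY = 0.100  # fixed 100 ms de-jitter delay, as in the post

class JitterBuffer:
    """Holds timestamped inputs and releases each one PLAYOUT_DELAY after
    its (clock-adjusted) client timestamp, preserving inter-input spacing."""
    def __init__(self):
        self.heap = []  # (play_time, seq, input_state), ordered by play time

    def push(self, client_time, seq, input_state):
        heapq.heappush(self.heap, (client_time + PLAYOUT_DELAY, seq, input_state))

    def pop_due(self, server_time):
        """Return all buffered inputs whose scheduled play time has arrived."""
        due = []
        while self.heap and self.heap[0][0] <= server_time:
            due.append(heapq.heappop(self.heap))
        return due

buf = JitterBuffer()
buf.push(1.000, 1, "jump")  # two inputs 33 ms apart on the client's clock
buf.push(1.033, 2, "left")
assert buf.pop_due(1.050) == []                    # nothing due before the delay
assert [s for _, s, _ in buf.pop_due(1.101)] == [1]
assert [s for _, s, _ in buf.pop_due(1.140)] == [2]  # original spacing preserved
```

The trade-off is exactly the one raised in point 2: a fixed playout delay buys correct ordering and spacing at the cost of that much extra latency before inputs take effect on the server.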
  3. Substance12

    Accounting for lost packets?

    I'm trying to make a 2D multiplayer game. So far I've implemented client-side prediction, server reconciliation, and a method that keeps the player objects almost perfectly in sync, with at most a 1 pixel difference. It works like this: the client samples input, position, and other variables 60 times per second, stores them as a snapshot, and stores the input in a packet buffer. Every 50 milliseconds (30 tickrate), the queued input packets are sent to the server with a timestamp and sequence number. When the server receives an input packet, it adds half of the round-trip time to the timestamp, plus an extra 100 milliseconds, and stores the sequence number as the "last acknowledged packet". The inputs are stored in another buffer and applied according to the server timer (albeit 100 ms late). The server also stores snapshots, 30 times a second, and sends a packet with all the positions and relevant variables, as well as the "last acknowledged packet". When the client receives that packet, it reapplies all the snapshots from the last acknowledged packet onwards (server reconciliation).

    This seems to work pretty well, but I'm concerned about packet loss and any desynchronization that might happen. I've tried faking packet loss, and it appears that when the character strays too far from the true server position, it doesn't "correct" itself and the client remains in the wrong position. When I play games like Overwatch, I sometimes get terrible lag spikes: everything freezes for a while, characters begin flying to completely random positions until my connection gets better, and then everything "jumps" to a correct state. I'm not sure how to account for situations like these.

    Also, the 100 ms delay seems necessary, as it makes sure the timestamped inputs are applied in the correct order and properly spaced between each other, but I'm not sure if it's a good idea. Is there anything more I might be forgetting? What am I doing wrong?
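The predict-then-reconcile loop described above (apply inputs locally right away, then on each server snapshot discard acknowledged inputs and replay the unacknowledged ones on top of the authoritative state) can be sketched roughly like this in Python. All names, the 1-D movement model, and the constants are assumptions for illustration only:

```python
MOVE_SPEED = 100.0   # assumed movement speed in pixels per second
TICK_DT = 1.0 / 30.0  # one 30 Hz tick

def apply_input(pos, inp):
    """Advance a 1-D position by one tick's worth of movement."""
    dx = (inp.get("right", 0) - inp.get("left", 0)) * MOVE_SPEED * TICK_DT
    return pos + dx

class PredictedPlayer:
    def __init__(self):
        self.pos = 0.0
        self.pending = []  # [(seq, input)] not yet acknowledged by the server

    def local_input(self, seq, inp):
        self.pending.append((seq, inp))
        self.pos = apply_input(self.pos, inp)  # client-side prediction

    def on_snapshot(self, server_pos, last_acked_seq):
        # Drop every input the server has already simulated...
        self.pending = [(s, i) for s, i in self.pending if s > last_acked_seq]
        # ...then rebuild the predicted position from the authoritative one
        # by replaying only the still-unacknowledged inputs (reconciliation).
        self.pos = server_pos
        for _, inp in self.pending:
            self.pos = apply_input(self.pos, inp)

p = PredictedPlayer()
for seq in range(1, 4):
    p.local_input(seq, {"right": 1})
# Server snapshot acknowledging inputs 1-2, carrying the authoritative position:
server_pos = apply_input(apply_input(0.0, {"right": 1}), {"right": 1})
p.on_snapshot(server_pos, last_acked_seq=2)
assert len(p.pending) == 1  # only input 3 is replayed on top of the server state
```

One property of this scheme worth noting: because the predicted position is rebuilt from the server state on every snapshot, a large error after packet loss corrects itself as soon as the next snapshot arrives, rather than persisting the way the post describes.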