triggdev

World loop/engine design with apache MINA


I'm currently writing a small-scale 2D MMO, something like Zelda or Pokémon on the Game Boy. I'm wondering how I should update the world and when I should send delta updates to connected clients.

 

I'm using MINA, so each client is connected to a character in the world. When a client sends a command (like a movement change), the server checks whether the command is "legal" (velocity can't exceed X, etc.) and then adds the movement to a queue for the server loop to process. Basically, I have asynchronous clients sending packets that all come together and become serialized in the game loop: the command that is received first gets processed first.
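That hand-off from asynchronous network threads to a single-threaded game loop can be sketched roughly as follows. This is a minimal illustration, assuming a hypothetical MoveCommand type and MAX_VELOCITY limit rather than the project's actual classes:

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Sketch of the pattern described above: MINA I/O threads enqueue
// validated commands; the single-threaded game loop drains them in
// arrival order. MoveCommand and MAX_VELOCITY are illustrative names.
public class CommandQueue {
    static final double MAX_VELOCITY = 5.0;
    private final Queue<MoveCommand> pending = new ConcurrentLinkedQueue<>();

    // Called from a MINA IoHandler thread when a packet arrives.
    public boolean submit(MoveCommand cmd) {
        if (Math.abs(cmd.vx) > MAX_VELOCITY || Math.abs(cmd.vy) > MAX_VELOCITY) {
            return false; // reject illegal input instead of queueing it
        }
        return pending.offer(cmd);
    }

    // Called once per tick from the game loop thread.
    public int drain() {
        int processed = 0;
        MoveCommand cmd;
        while ((cmd = pending.poll()) != null) {
            // apply(cmd) would update the world here
            processed++;
        }
        return processed;
    }

    public static class MoveCommand {
        final double vx, vy;
        public MoveCommand(double vx, double vy) { this.vx = vx; this.vy = vy; }
    }
}
```

ConcurrentLinkedQueue keeps arrival order and is safe to offer to from multiple I/O threads, which matches the "first received, first processed" behavior described.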

 

The only commands currently sent from the client are velocity changes (usually when you start moving or change direction) and position updates when the character stops moving voluntarily (i.e., not due to a collision). Collisions are detected client-side as well as server-side, but the server has the final say in what happens. The client does not send a collision update to the server, because the server already knows where the character should be based on the last velocity update. For example: if player A walks straight into a wall, A's client will stop A's movement, and the server will also detect the collision and send an update to all other clients in the area telling them that A stopped at location (x, y).

 

Now, onto HOW the server knows. First, the server knows where each player is and knows the entire map, obviously.

The client runs a game loop fixed at 60 fps, so each frame takes about 16.6 ms to process. Velocity is "added" to position, scaled by the delta time between frames. I want to write a server engine loop that runs in exactly the same way and corrects the client if need be.
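A common way to run a server loop in lockstep with a 16.6 ms step is a fixed-timestep accumulator. Here is a rough sketch under that assumption; the names are illustrative:

```java
// Sketch of a fixed-timestep loop at 60 Hz using an accumulator, so the
// simulation always advances in exact 1/60 s steps regardless of how
// unevenly the thread is scheduled.
public class FixedTimestepLoop {
    static final double STEP = 1.0 / 60.0; // ~16.6 ms per simulation tick

    // Given elapsed time plus any leftover from the last frame,
    // how many whole simulation steps should run now?
    public static int stepsToRun(double accumulatorSeconds) {
        return (int) (accumulatorSeconds / STEP);
    }

    // Time that remains banked for the next frame.
    public static double leftover(double accumulatorSeconds) {
        return accumulatorSeconds % STEP;
    }

    public static void main(String[] args) throws InterruptedException {
        double accumulator = 0.0;
        long last = System.nanoTime();
        for (int frame = 0; frame < 3; frame++) { // bounded for this sketch
            long now = System.nanoTime();
            accumulator += (now - last) / 1e9;
            last = now;
            int steps = stepsToRun(accumulator);
            for (int i = 0; i < steps; i++) {
                // world.update(STEP); // e.g. position += velocity * STEP
            }
            accumulator = leftover(accumulator);
            Thread.sleep(16);
        }
    }
}
```

The accumulator means the server never scales physics by a variable delta; it just runs zero or more exact 1/60 s steps per wakeup, which keeps server and client simulations comparable.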

 

In this loop I also want to send delta updates to the client, but I don't want to send them every 16.6 ms. I would rather keep the updates at a 50 ms tick (20 Hz).

 

Should I forget about updating the server at a 16.6 ms tick, since it wouldn't matter given that clients only get updated every 50 ms? Or use 16.6 ms for more accurate collision detection and corrections, while sending delta updates at 50 ms and correcting a client if a collision is detected?

 

Also, combat will be turn-based, so a fast tick rate isn't really necessary.

Edited by triggdev

FPS games need the server to run at 60 Hz like the client.
Turn based games do not.
You have to make a decision about how correct you want your game to be.
If by-the-tick movement is important to you, then make the server simulate the game and correct clients that diverge.
If only "important" events (like battles, pick-ups, etc) matter to you, then don't worry about the precise location of clients, and only make the server the arbiter of important events.
There's no "right" or "wrong" here -- it's up to what you think is important to your game. (With an eye towards cost -- servers for FPS games are more costly than servers for importance-only games.)


Thank you for the response!
 

I guess I worded my question wrong. Should I stick to ONE tick rate for everything on the server? Or would simulating player movement at 60 Hz but only updating clients at 20 Hz be beneficial? I'm guessing it wouldn't matter that the simulation runs at 60 Hz if clients are only updated at 20 Hz.

 

To clarify (hopefully... sorry, I'm bad at explaining things): I want to send player position/velocity at 20 Hz and check for collisions, battles, item pickups, etc. at 60 Hz. So the important stuff gets processed by the server much quicker than mere player locations. "Processing the important stuff" includes sending an update to all clients that an item was just picked up, so they can delete it from the map. All clients that can see the item, of course.


Typically, there will be one (fixed) tick rate for physics simulation, and then either a separate, fixed tick rate for networking, or some kind of adaptive network send rate.

 

So, yes, different rates for "simulation" versus "networking" is common. Typically, you will time-stamp each event that goes into a packet with the tick number for it. (Which may be "when it happened in the past" or "when it should happen in the future.")

 

Note that collision, pickups, and other gameplay events should happen at the physics tick rate. The network tick rate is simply there to batch up multiple small updates into one bigger packet, to cut down on packet header overhead. This will have the effect of adding some effective latency to the "round trip time" from "command" to "everybody has seen this command," but it's generally worth it. 50 ms network updates versus 17 ms physics ticks only add an extra 33 ms of effective round trip time, and saves 66% of network packet header overhead!
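The batching described here can be sketched like this; NetworkBatcher, Event, and the string payloads are illustrative names, not a prescribed design:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the batching described above: events are stamped with the
// physics tick they belong to, accumulated over the 50 ms network
// window, then flushed as one packet with a single header.
public class NetworkBatcher {
    public static final int PHYS_TICKS_PER_NET_TICK = 3; // 60 Hz / 20 Hz

    private final List<Event> buffer = new ArrayList<>();

    // Called whenever the simulation produces an event worth sending.
    public void record(int physicsTick, String payload) {
        buffer.add(new Event(physicsTick, payload));
    }

    // Called every third physics tick: drain the buffer into one packet.
    public List<Event> flush() {
        List<Event> packet = new ArrayList<>(buffer);
        buffer.clear();
        return packet;
    }

    public static final class Event {
        public final int tick;       // which simulation step this happened on
        public final String payload; // placeholder for real event data
        Event(int tick, String payload) { this.tick = tick; this.payload = payload; }
    }
}
```

Because each event carries its own tick number, the receiver can still apply the three physics ticks' worth of changes in the right order even though they arrived in one packet.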


Thank you, that clears up a lot! But also brought up new questions

 

 

Typically, you will time-stamp each event that goes into a packet with the tick number for it.

Interesting... 50 ms for network and 16.6 ms for physics would mean there are 3 physics ticks per network update. When the client receives the update, will it "run" each physics tick per client tick? I'm guessing the clientside fps would need to be factored in but I doubt it will ever drop below 60.
 

 

This will have the effect of adding some effective latency to the "round trip time" from "command" to "everybody has seen this command," but it's generally worth it.

I'm not sure what you mean by "command" to "everybody has seen this command". Are the clients supposed to be sending packets saying that they have received an update? Or are you referring to how the server will "hold" a command until the 50ms mark and then send the update packet and therefore it has been seen?


I'm guessing the clientside fps would need to be factored in but I doubt it will ever drop below 60.

 

You should separate your graphics and your physics simulation rates.

 

are you referring to how the server will "hold" a command until the 50ms mark and then send the update packet and therefore it has been seen?

 

Yes, and the same for commands client-to-server (typically.)

 

Also, clients will send input commands to the server, and the server will then repeat those out to all other clients, so the other clients can simulate the original clients actions.

 

You will also need some amount of de-jitter buffer, where commands are queued for a particular tick, that may be in the future or past of where you are now. When the command is for the "past" it means the packet arrived "too late" and you need to increase your estimate of latency / jitter. When commands arrive for "way" into the future, it means your estimate is wrong the other way.
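A minimal de-jitter buffer along these lines might look like the following sketch; the String commands and the late counter are placeholders for whatever command representation the game actually uses:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the de-jitter buffer described above: incoming commands are
// filed under the tick they target; each simulation step pulls only the
// commands scheduled for that tick. Commands that arrive for a tick that
// was already simulated count as "late", signalling that the latency
// estimate is too low. All names are illustrative.
public class DejitterBuffer {
    private final Map<Integer, List<String>> byTick = new HashMap<>();
    private int currentTick = 0;
    private int lateCount = 0;

    public boolean schedule(int targetTick, String command) {
        if (targetTick < currentTick) {
            lateCount++; // arrived too late; caller should grow its estimate
            return false;
        }
        byTick.computeIfAbsent(targetTick, k -> new java.util.ArrayList<>())
              .add(command);
        return true;
    }

    // Advance one tick and return the commands due now (possibly none).
    public List<String> step() {
        List<String> due = byTick.remove(currentTick);
        currentTick++;
        return due == null ? List.of() : due;
    }

    public int lateCount() { return lateCount; }
}
```

A real implementation would also watch for commands landing far in the future (the estimate being wrong the other way) and shrink the buffer accordingly.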


Hmm, I'm not quite understanding what I would do with an estimate of latency, or how I would even come up with one. Won't all updates received by the client be in the past? And if a client has, say, 100 ms latency to the server, won't that mean the client's simulated world will be 100 ms behind the server world regardless? And isn't that how most, if not all, multiplayer servers work?

 

Thanks for all the help!


Personally I think that trying to synchronise remote physics in a game like this is a waste of time. It's not possible to guarantee perfect synchronization since it'll typically take so long for data to make the trip from one end to the other that you're a frame or two behind, so all you can hope for is 'close enough', at which point you have to ask why you'd bother chasing 60Hz physics and not just 10Hz or 1Hz.

 

In these situations, for an MMO game (NOT an FPS or similar), and especially those with no real-time combat, I would simply have the client perform local physics and let it tell the server both position and velocity. The server can perform broad validity checks but doesn't usually need to actually simulate it, providing position and velocity are within reasonable parameters. It can broadcast these values to other clients, who typically strike some compromise between the values their local simulation just calculated and the new values provided by the server by interpolating between one and the other. The locations a client sees for other players will be lagging some small degree behind what that other player sees on their screen but this doesn't matter in any meaningful way. (Again, the caveat being that this is an MMO with no mechanics based on strict real-time physics resolution.)
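The compromise mentioned here is usually a simple interpolation toward the server's value each frame, rather than snapping. A tiny sketch; the 0.1 smoothing factor below is illustrative, not a recommendation:

```java
// Sketch of smoothing a remote player's position: each frame, move the
// locally simulated value some fraction of the way toward the last
// server-reported value, so corrections are gradual instead of visible
// teleports. Names and the smoothing factor are illustrative.
public class RemotePlayerSmoothing {
    // Linear interpolation: t = 0 keeps the local value,
    // t = 1 snaps straight to the server value.
    public static double lerp(double local, double server, double t) {
        return local + (server - local) * t;
    }

    // Apply the same blend to an (x, y) position.
    public static double[] smooth(double[] localPos, double[] serverPos, double t) {
        return new double[] {
            lerp(localPos[0], serverPos[0], t),
            lerp(localPos[1], serverPos[1], t),
        };
    }
}
```

Called every frame with a small t (e.g. 0.1), this converges on the server's position within a handful of frames while hiding the individual corrections.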


I agree that synchronization would be too much effort for not much reward. It's perfectly fine that the clients will be slightly behind. Right now I'm trying to reduce the amount of packet overhead as well as have a world simulation on the server running at 60hz. It doesn't necessarily need to perform physics as it will mainly be used to perform NPC and Monster AI (like patrolling an area, aggro, etc), although implementing collision detection on the server side to reduce packets coming into the server would be pretty easy at this point. The way I'm doing this is to hold world ticks until the 50ms mark and then send those ticks. Each tick is attached with an int(ms) that tells me how long that tick took to process. Most of the time it will be 17ms, 17ms, 17ms. But it could also be 25 and 25 or 20 and 30 with just 2 ticks if the server can't keep up with a 17ms tick time.

 

Let's say the client receives a world update with 3 ticks marked at 17 ms each. The client will then take its own delta time from the renderer and call a method like processTick(stage, deltaTime) (deltaTime = 17 ms in this case). The client then groups up the first N ticks whose ms sum is closest to deltaTime. In this case it would be just the first one, but if the client's delta time were 30 ms for some reason, it would process ticks 1 and 2 in the same render() call, because their ms values add up to 34 ms. This would prevent server ticks from building up on the client without disregarding any tick update. Do you see any problem with this?

 

Regardless, thanks for the help! I've been learning a lot in the past few days.

 

EDIT: Not sure what I'm on about with "Most of the time the first tick will be 17ms, second 34ms, and third 51ms." and worldUpdateTickTime, it's early and I'm hungry.

 

EDIT2: I finished writing the world loop and network "pulse" and it's working flawlessly. Now I just need to work on reducing the amount of network packets sent and also lerping with client corrections. Thanks for the help!

Edited by triggdev


Right now I'm trying to reduce the amount of packet overhead as well as have a world simulation on the server running at 60hz. It doesn't necessarily need to perform physics as it will mainly be used to perform NPC and Monster AI (like patrolling an area, aggro, etc)


If you're not doing physics, you don't need to be doing 60 Hz. You don't even need 30 Hz. Human reaction times are closer to 4 Hz (http://www.humanbenchmark.com/tests/reactiontime/statistics), so why should your NPCs be thinking much faster than that?

I like to handle AI at MMO scale by performing expensive decision-making rarely (less than 10 Hz, maybe even 1 Hz), leaving only the low-level semi-physics stuff like movement and steering running more often; and even that doesn't need to be 60 Hz, given that things are generally moving slowly.
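That two-rate scheme could be sketched as follows; the 1 Hz decision period and the counters are illustrative, not prescriptive:

```java
// Sketch of two-rate AI: expensive decisions (target selection,
// pathfinding, aggro checks) run on a slow countdown, while cheap
// steering toward the current goal runs every simulation tick.
public class NpcBrain {
    static final int DECISION_PERIOD_TICKS = 60; // ~1 Hz at a 60 Hz sim
    private int ticksUntilDecision = 0;
    int decisionsMade = 0; // counters exposed for illustration
    int steerSteps = 0;

    // Called once per simulation tick.
    public void update() {
        if (ticksUntilDecision == 0) {
            decisionsMade++; // e.g. pick a new patrol waypoint
            ticksUntilDecision = DECISION_PERIOD_TICKS;
        }
        ticksUntilDecision--;
        steerSteps++; // cheap per-tick movement toward the current waypoint
    }
}
```

Staggering the initial countdown per NPC (so they don't all decide on the same tick) is the usual next step at MMO scale, to spread the expensive work evenly across frames.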

although implementing collision detection on the server side to reduce packets coming into the server would be pretty easy at this point.


I don't see how collision detection on the server reduces packets. You're going to want collision detection at both ends, but on the client it's to make gameplay feel smoother, and on the server it's to protect against bugs, lag, and hacking.

The way I'm doing this is to hold world ticks until the 50ms mark and then send those ticks. Each tick is attached with an int(ms) that tells me how long that tick took to process. Most of the time it will be 17ms, 17ms, 17ms. But it could also be 25 and 25 or 20 and 30 with just 2 ticks if the server can't keep up with a 17ms tick time.


I don't think that is a useful thing to do. I would just have the server tell the client where things are, and how fast they're moving, and let it handle the rest.
 

[...] This would prevent server ticks from building up on the client without disregarding any tick update. Do you see any problem with this?


Yes, you're making a lot of work for yourself by trying to synchronise things that don't need synchronizing, and at a rate far beyond what is necessary.

