Server receives messages from clients, adds messages to a queue

A TCP server application handles messages from connected users. With me so far? These messages could be chat, combat requests, movement requests, banking requests, and all the other stuff that makes up an MMORPG. A user sends messages to the server, and the server receives them. Instead of handling these messages immediately and dishing out a response, the messages are added to a queue to be processed periodically by another thread. Is this how it is normally done?

ChatQueue, LoginQueue, CombatQueue, MovementQueue, BankingQueue: these are queues of messages from the users. Then you'd have maybe one thread for each that goes in and handles its queue. For a ChatQueue you would handle it every 500 ms, for example; for a LoginQueue every second; for a CombatQueue every 5 ms; for a MovementQueue every 5 ms; for a BankingQueue (for users who want to use the game's bank, for example) every second. Does this make sense, or am I way off here?
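In rough code, the layout I have in mind looks something like this (all names and numbers here are just placeholders to illustrate the per-type queues and intervals; this is a sketch, not working server code):

#include <queue>
#include <mutex>
#include <string>

struct Message {
    int         senderId;
    std::string payload;      // chat text, combat request, etc.
};

struct TypedQueue {
    std::queue<Message> pending;
    std::mutex          lock;        // network thread pushes, a worker pops
    unsigned            intervalMs;  // how often that worker drains it
};

TypedQueue chatQueue    { {}, {}, 500  };   // chat handled every 500 ms
TypedQueue loginQueue   { {}, {}, 1000 };   // logins every second
TypedQueue combatQueue  { {}, {}, 5    };   // combat every 5 ms
TypedQueue movementQueue{ {}, {}, 5    };   // movement every 5 ms
TypedQueue bankingQueue { {}, {}, 1000 };   // banking every second

// The network thread would push each incoming message onto the right queue;
// one worker per queue wakes up every intervalMs and drains it.
void Enqueue(TypedQueue& q, const Message& m)
{
    std::lock_guard<std::mutex> guard(q.lock);
    q.pending.push(m);
}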
You don't need a thread for each one; just a process that schedules processing.
That will simplify the interlocks protecting data that overlaps between those different functions.

You might have separate threads for file access and your network processing (and maybe AI...) as well as the main server processing.
Actually, having lots of threads isn't that good, and in your scenario there's also the problem of object locking. So:

Have one thread for all network comms and have it block on incoming messages. When you receive one, process it immediately, maybe passing it to the main server thread. If you don't process messages immediately, you can get the situation where several processing threads wake up at the same time, all with messages to process; one locks the world data because it needs it, the others wait, and most probably it lags.

If messages are processed as they are received, you physically can't have more than one message at a time, and then it's only a matter of performance whether you process it before the next one arrives. Otherwise you end up processing messages with a little lag.

I've just recalled the idea: if you have to sync messages with the world (and you do), have a queue of relevant user *actions*, filled up by the network thread, and process all of them when the world ticks. That's the solution.
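A minimal sketch of that idea, with invented names and modern C++ threading primitives (the post above doesn't prescribe any particular API): the network thread pushes decoded actions into one locked queue, and the world tick swaps the whole batch out and processes it in arrival order.

#include <deque>
#include <mutex>

struct Action {
    int clientId;
    int type;       // move, chat, attack, ...
    // decoded parameters would go here
};

class ActionQueue {
public:
    // Called from the network thread as soon as a message is decoded.
    void Push(const Action& a)
    {
        std::lock_guard<std::mutex> guard(mutex_);
        pending_.push_back(a);
    }

    // Called once per world tick from the simulation thread.  Swapping
    // under the lock keeps the critical section tiny.
    std::deque<Action> Drain()
    {
        std::deque<Action> batch;
        {
            std::lock_guard<std::mutex> guard(mutex_);
            batch.swap(pending_);
        }
        return batch;
    }

private:
    std::mutex         mutex_;
    std::deque<Action> pending_;
};

// World tick: apply everything that arrived since the last tick, in order.
void WorldTick(ActionQueue& queue)
{
    for (const Action& a : queue.Drain()) {
        // apply the action to the world here
        (void)a;
    }
}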
Last time I tried to do something that was meant to scale to a huge number of users [before it got absolutely smashed flat by my brain churning out ideas faster than I could implement them, and I became grossly dissatisfied with what I was implementing as I started to see what it could be, and thus should be, and thus nothing less was worth doing :P], I did it with two major stages along the road from a message being received to being processed.

A message received from a client would first be run against a table that checked value sizes, applicable min and max ranges, and general sanity [you can't send a chat string 80 million characters long, and you can't choose to adjust your speed to -1x10^37, stuff like that]. The checks were very superficial and were just meant to enforce plausible upper and lower bounds based on message type. The message was then put into a queue to be fed to the message processor, and all the messages from that set of network units were sent to that queue. For large numbers of users, I had room for about a 250-user chunk per message cluster, each in its own thread with its own queue. The central message processor would read from the queues in order and do the finer checking [stuff like collisions, trying to pick up objects that aren't within reach, exceeding maximum speeds for specific units, the precise stuff].

It was looking pretty nice for a while there! [And text-based tests showed it scaled very well.] Each thread that dealt with the network interface had its own outgoing and incoming message queues, and the whole thing seemed to stay reasonably well synced.
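Just to make that first stage concrete, the sanity-check table might look something like this sketch (every limit and name here is made up for illustration; the real checks depend entirely on your message set):

#include <cstddef>

enum MsgType { MSG_CHAT = 0, MSG_SET_SPEED, MSG_PICKUP, MSG_TYPE_COUNT };

struct Bounds {
    std::size_t minPayload;   // bytes
    std::size_t maxPayload;   // bytes -- no 80-million-character chat strings
    double      minValue;     // plausible lower bound for the numeric field
    double      maxValue;     // plausible upper bound -- no speed of -1e37
};

static const Bounds kBounds[MSG_TYPE_COUNT] = {
    /* MSG_CHAT      */ { 1, 512,   0.0,     0.0 },
    /* MSG_SET_SPEED */ { 8,   8,   0.0,    20.0 },
    /* MSG_PICKUP    */ { 4,   4,   0.0, 65535.0 },
};

// Superficial, per-type plausibility check only; the fine checks (collision,
// reach, per-unit speed caps) happen later in the central message processor.
bool PlausibilityCheck(int type, std::size_t payloadBytes, double value)
{
    if (type < 0 || type >= MSG_TYPE_COUNT)
        return false;
    const Bounds& b = kBounds[type];
    return payloadBytes >= b.minPayload && payloadBytes <= b.maxPayload
        && value >= b.minValue && value <= b.maxValue;
}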

Reasonably well synced being the biggest problem with the system, and why it wasn't described as 'perfectly synced'. As with all things, networking introduces time lags, and dealing with the threads the way I did [since I didn't include a time stamp that mattered to the actual game data itself, just one to assure that the client wasn't spamming the server with hundreds more messages than it should ever be able to] meant that when two people sent something at the same time, or even if person A sent something just before person B, whose packet got processed first depended on where it sat in the sequence, not on who sent it first. This is a problem that your proposed solution would express drastically, especially with the separation of things like a 'combat queue' and a 'movement queue'. Commands sent by a single player to do two actions in quick succession may be executed in an order different from the one intended, under a system that breaks the message ordering down by message type. [My method broke them down into whatever order my server discovered them, which is bad, but at least actions sent by an individual player were assured to be processed in the order they were sent.]

Some problems you might want to consider with your system:
Attacking, then moving out of range, and having the 'move' command processed first, then the attack command.
Sending a chat message and then disconnecting entirely: what happens to the chat message if the disconnect is processed first?

Stuff like this will cause problems, but if you can manage it and keep the potential issues in mind, then of course you can [at least attempt to] compensate for the possible shortfalls of the system by including checks for these sorts of special cases.
The pattern we're suggesting is well known: it's typically called an "event queue" or "action queue," and is usually implemented as a priority queue with the time at which each event should happen as the "priority".

So, if you want to do something once a second, the code would look something like:

void InitProgram()
{
    /* make sure the event fires the first time */
    AddToEventQueue(&DoSomething, TimeNow());
}

void DoSomething()
{
    stuff();
    AddToEventQueue(&DoSomething, TimeNow() + 1.0);
}


These kinds of systems tend to behave reasonably gracefully under load; they will just start doing things less often than indicated by the timeout time (because by the time the queue gets around to running the task, it's already after the time it was queued for, so TimeNow() returns the current, later time).

Also, the implementation of these queues typically uses some C++ abstract base class coupled with template magic of some sort to make them easier to use and manage -- the raw C-style example was just to show how to create a recurring task.
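In the same spirit, here's a rough sketch of the queue machinery itself, keyed by absolute time the way the AddToEventQueue()/TimeNow() calls above suggest. It uses std::function instead of the abstract-base-class-plus-template approach, and all of it is illustrative rather than taken from any particular engine:

#include <functional>
#include <queue>
#include <chrono>
#include <thread>

struct Event {
    double time;                    // absolute time, in seconds
    std::function<void()> run;
    // Earliest scheduled time = highest priority, so invert the comparison.
    bool operator<(const Event& other) const { return time > other.time; }
};

static std::priority_queue<Event> g_events;

double TimeNow()
{
    using namespace std::chrono;
    return duration<double>(steady_clock::now().time_since_epoch()).count();
}

void AddToEventQueue(std::function<void()> fn, double when)
{
    g_events.push(Event{ when, std::move(fn) });
}

// Main loop: run every event whose scheduled time has passed.  Under load,
// events simply fire late (and recurring events re-schedule relative to the
// current, later TimeNow()), which is the graceful degradation described
// above.
void RunEventLoop()
{
    for (;;) {
        while (!g_events.empty() && g_events.top().time <= TimeNow()) {
            Event e = g_events.top();
            g_events.pop();
            e.run();
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}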
enum Bool { True, False, FileNotFound };
