bilsa

Threads in server?


Hi there! I've currently built an async server with .NET. My question is not so much about the networking, but about how many threads would be optimal. Currently I'm using threads in the following cases:

1. The .NET thread pool is used for asynchronous read/write/connect/disconnect network data transfers. This thread pool may use up to 25 threads.

2. A custom thread pool for generating events in the server. This pool uses 5 threads for raising events such as OnPacketReceived(...), OnClientConnected(...), etc. My thinking was that the networking code should not be interrupted by firing an event.

3. The actual server handling instance receives events from the server. Handling the events does not take much time; most time is consumed by OnPacketReceived()/OnPacketSent() event handling. In this case the packets are translated into a Request and put into a PriorityQueue.

4. A RequestHandler uses 10 worker threads to process the requests in the priority queue.

OK, so that makes a total of up to 40 threads. In a game, I would say that is way too many threads... I was wondering if I'm exaggerating the thread usage? Am I completely off on the design? I'd appreciate any tips and suggestions.

Quote:
Original post by bilsa
1. The .NET thread pool is used for asynchronous read/write/connect/disconnect network data transfers. This thread pool may use up to 25 threads.

2. A custom thread pool for generating events in the server. This pool uses 5 threads for raising events such as OnPacketReceived(...), OnClientConnected(...), etc.
My thinking was that the networking code should not be interrupted by firing an event.

Why a custom thread pool?
First, .NET's default thread pool is likely to be more robust and better-performing than any you create. And second, the entire point of a thread pool is to be able to share threads smoothly between tasks. So why create a *different* thread pool each time you need one? Reuse the same one, and if necessary, allow it to use more threads.
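As a rough sketch of what that reuse could look like (the dispatcher class and counter are invented for illustration), each event is simply queued onto the shared pool instead of a dedicated one:

```csharp
using System;
using System.Threading;

class EventDispatcher
{
    // Visible for the demo only: counts packets that have been handled.
    public static int HandledCount;

    // Hand each event to the shared .NET thread pool instead of a
    // dedicated custom pool; the runtime balances the load across tasks.
    public static void RaisePacketReceived(byte[] packet)
    {
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            byte[] data = (byte[])state;
            // ... invoke the real OnPacketReceived handlers on 'data' here ...
            Interlocked.Increment(ref HandledCount);
        }, packet);
    }
}
```

If the default limit ever becomes a bottleneck, the pool size can be raised with ThreadPool.SetMaxThreads rather than by building a second pool.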

Quote:
4. A RequestHandler uses 10 worker threads to process the requests in the priority queue.

Why? Do you expect to have to handle 10 such requests simultaneously? And would there be a problem in handling, say, 5 of them, and leaving the next 5 in the queue until the first ones have finished? Or for that matter, handling one at a time?

And moreover, in a priority queue, I'd expect requests to be handled sequentially; that is, the topmost request (which has the highest priority) should finish before starting on the next... Or what? Obviously I don't know what you're using the queue for, but it seems a bit contradictory to me.

Well, my guess would be something like this:
Thread pools: No big deal. The threads that aren't in use aren't active, so they won't take up resources. I wonder a bit about why you chose 25/5 threads specifically for your thread pools, but that doesn't sound like a problem to me.

Other than that, use as many threads as makes sense, and no more than that. On a multi-core system, you obviously need a couple of threads just to be able to utilize the CPU fully. But beyond that, don't make threads you don't need. Spamming threads won't magically increase performance. (Taking your request handler as an example: if you receive 10 requests and assign a thread to each, these 10 threads will be competing for CPU time, meaning they'll *all* be slowed down. If you took them one at a time, each could get a full core to itself and finish faster (in 1/10th of the time), and so the next request could start after only a tenth of the total time has elapsed, and so on. So the first 9 requests would finish *faster* with a single thread than they would if you gave each request its own thread. And the last request would finish at roughly the same time in either case.)
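To make the one-at-a-time idea concrete, here is a minimal sketch (class and member names are made up) of a queue drained by a single consumer thread, so requests are processed sequentially no matter how many producers enqueue them:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Producers enqueue from any thread; one consumer thread calls Dequeue
// in a loop and processes requests strictly one at a time.
class RequestQueue
{
    private readonly Queue<string> queue = new Queue<string>();
    private readonly object gate = new object();

    public void Enqueue(string request)
    {
        lock (gate)
        {
            queue.Enqueue(request);
            Monitor.Pulse(gate); // wake the consumer if it is waiting
        }
    }

    // Blocks until a request is available, then hands it out.
    public string Dequeue()
    {
        lock (gate)
        {
            while (queue.Count == 0)
                Monitor.Wait(gate);
            return queue.Dequeue();
        }
    }
}
```

This sketch is plain FIFO for brevity; swapping the inner Queue for a priority-ordered container would give the "loose priority" behavior discussed later without changing the threading.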

Hmm, OK, first off: it's a server for an MMOG.

OK, so my thinking was that the priority queue should not be used as an absolute sequence of requests.

For example, when a player is in combat, his requests should have higher priority than those of a player in a town. But my thinking was that this shouldn't mean that this player's request MUST be finished before continuing with the other requests; it's just a loose indication of which requests should be handled soonest. Ideally I would like to have one thread for each player and have their requests handled simultaneously o.O... Is my thinking stupid?

OK, and my custom thread pool would be used ONLY to handle events, because I was thinking that if some server handlers are heavy... they would block the network receiving/sending.

For example: if I have 10000 connected players and I receive 25 simultaneous/almost-simultaneous incoming packets, that would mean all the threads would be used to receive data and fire OnPacketReceived(...). The handler that is invoked could potentially take a long time, which would block the network send/receive?

Let's just say, for the sake of argument, that the server handler that is invoked must wait to synchronize with some other player, which could take some time. Wouldn't it be possible to lock up all the networking threads in that case?

Of course I can't make as good a thread pool as the default .NET one, but it works well enough to suit the mentioned scenario.

I think I have to think about what you said and try to have it "sink in", as the multithreading concept is fairly new to me :O

Quote:
Original post by bilsa
For example, when a player is in combat, his requests should have higher priority than those of a player in a town. But my thinking was that this shouldn't mean that this player's request MUST be finished before continuing with the other requests; it's just a loose indication of which requests should be handled soonest. Ideally I would like to have one thread for each player and have their requests handled simultaneously o.O... Is my thinking stupid?

Well, as I said above, "simultaneously" typically only means that *both* are slowed down to 50% of the execution speed they'd otherwise get. In other words, instead of request 1 finishing after x milliseconds, and request 2 finishing after 2x milliseconds (because it had to wait for request 1 to finish), you've ensured that *both* will finish after 2x milliseconds (because they run in parallel and have to share the CPU).
I'm not sure if I'd call that an advantage. [wink]
Of course it's not always that simple.
- If you have multiple CPU's/cores, each request might get a core to itself, and then both will be able to finish after x ms. But even then, you won't gain anything from having a large number of threads. (You might have 4 cores, so you can run 4 threads in parallel. If you try to run 8, they'll all be slowed to roughly 50% of their original speed)
- If the requests involve both a decent amount of work for the CPU as well as some blocking operations (I/O typically), then it might be an advantage to have multiple requests in flight, so that while some requests are waiting for I/O, others can be processed by the CPU.

So it's a tradeoff, as always, and it depends on what work the threads are to perform. It might make sense to have a couple of threads for this, but I'd stick with, well, a couple... The real test is when you benchmark it under load. Which leads to probably the best answer: "Make sure it's not too difficult to change the number, and then wait and see if it becomes a problem."
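One cheap way to keep that number easy to change is to read it from configuration with a safe default. A sketch (the "RequestWorkers" appSettings key is hypothetical):

```csharp
using System.Configuration;

static class ServerSettings
{
    // "RequestWorkers" is a hypothetical appSettings key; fall back to a
    // conservative default so the count can be tuned without recompiling.
    public static int GetWorkerCount()
    {
        string setting = ConfigurationManager.AppSettings["RequestWorkers"];
        int workers;
        if (!int.TryParse(setting, out workers) || workers < 1)
            workers = 2;
        return workers;
    }
}
```

With the count read at startup, re-benchmarking with a different number of workers is a config edit and a restart, not a rebuild.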

Quote:

OK, and my custom thread pool would be used ONLY to handle events, because I was thinking that if some server handlers are heavy... they would block the network receiving/sending.

A general rule of thumb is that a thread pool should only be used for tasks that don't take too long. If they take long to complete, give them a dedicated thread until they finish.
But the big question is, *are* any server handlers so heavy that it'll become a problem?

Quote:
For example. If I have 10000 connected players

You do know that virtually all MMOGs currently have at most 3000 or so players online per server? (EVE Online is the only one I'm aware of to have broken 10k players.)
In other words, start by aiming for, say, 500 players, because you're not Blizzard, and you don't have $10 million to spend on developing the server architecture... Do you?

Quote:
and I receive 25 simultaneous/almost-simultaneous incoming packets, that would mean all the threads would be used to receive data and fire OnPacketReceived(...). The handler that is invoked could potentially take a long time, which would block the network send/receive?

Yep, then these 25 requests would tie up your 25 threads. And they would be tied up for roughly 25 times as long as they would if you read one at a time, because they have to share the CPU with 24 other requests. So in all that time, you're completely unable to read incoming packets *anyway*.
If you are worried about the receive buffer filling up, you could just have a worker thread listening to that buffer and moving data to your own, bigger buffer (or a queue) as it arrives. Then you'll never fill up your buffer, and you have a dedicated thread for snatching data from the receive buffer as soon as it arrives. And since that thread has nothing else to do, you can be reasonably certain it won't be blocked long enough to cause problems.
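A sketch of that dedicated reader (class, member, and stream types are placeholders): its only job is to copy bytes off the socket into a shared queue, so slow handlers can never stall it:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

class Receiver
{
    private readonly Queue<byte[]> inbox = new Queue<byte[]>();
    private readonly object gate = new object();

    // Runs on its own dedicated thread; it does nothing but drain the
    // receive buffer into our (bigger) queue as fast as data arrives.
    public void ReceiveLoop(Stream stream)
    {
        byte[] buffer = new byte[4096];
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            byte[] chunk = new byte[read];
            Array.Copy(buffer, 0, chunk, 0, read);
            lock (gate)
            {
                inbox.Enqueue(chunk);
                Monitor.Pulse(gate); // wake a waiting handler thread
            }
        }
    }

    // Called by handler threads; blocks until a chunk is available.
    public byte[] Take()
    {
        lock (gate)
        {
            while (inbox.Count == 0)
                Monitor.Wait(gate);
            return inbox.Dequeue();
        }
    }
}
```

Because ReceiveLoop holds the lock only long enough to enqueue, even a handler that blocks for seconds inside Take's caller cannot delay the reader.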

Quote:

Let's just say, for the sake of argument, that the server handler that is invoked must wait to synchronize with some other player, which could take some time. Wouldn't it be possible to lock up all the networking threads in that case?

That sounds like an operation that could take a while to complete. Either free the thread while you wait (just store what information you need, and then return. When you next receive data from the player, a new thread from the thread pool can be assigned to continue where you left off), or if you don't want to release the thread, don't use a thread pool. A thread pool is only for tasks that you know won't take too long. Anything that might block for hundreds of milliseconds (such as waiting for network data) is long-running enough that it'd be better to give it a dedicated thread.
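A sketch of the "store what you need and return" idea, using the player-synchronization example (all names are invented; a real server would key on its own session types):

```csharp
using System.Collections.Generic;

class TradeCoordinator
{
    // Pending offers keyed by the partner we are waiting on. No thread
    // blocks here; the state simply sits in the dictionary.
    private readonly Dictionary<int, byte[]> pendingOffers = new Dictionary<int, byte[]>();
    private readonly object gate = new object();

    // Called from a pool thread when player A offers a trade to B.
    // Stores the offer and returns immediately, freeing the thread.
    public void BeginTrade(int partnerId, byte[] offer)
    {
        lock (gate) { pendingOffers[partnerId] = offer; }
    }

    // Called later, from whichever pool thread handles B's reply.
    // Hands back the stored offer so processing continues where it left off.
    public byte[] ResumeTrade(int partnerId)
    {
        lock (gate)
        {
            byte[] offer;
            if (pendingOffers.TryGetValue(partnerId, out offer))
                pendingOffers.Remove(partnerId);
            return offer; // null if nothing was pending
        }
    }
}
```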

Quote:
I think I have to think about what you said and try to have it "sink in", as the multithreading concept is fairly new to me :O

Don't bother trying to multithread everything. Don't try to assign 25 threads to handling incoming network data.
First, you don't know if you're going to need it, and second, it's going to be a pain to keep it free of any major bugs.
Instead of having 50 threads all doing the same thing (and competing over the same resources), it might be better to have 1 thread for each of 50 (or 5) isolated components of your server. (Isolated in the sense that they share as little state as possible. For example, one thread might put data in a queue and another read from it, and apart from that they have no common data they can both access. That's easy enough to make thread-safe, and it means you know what each thread is doing, so you can actually tell in advance whether or not it might block. If a thread's only job is to receive network data and put it into a queue of some sort, then you can be reasonably sure it won't block and your receive buffer won't overflow. But if the thread were meant to take a packet, parse it, call whatever handlers are needed, and wait for them to complete, then you have absolutely no clue how long it'll take, or when (or if) it'll return and be ready to read the next packet.)

Other than that, you just need to keep in mind that in the typical case, spawning more threads does not mean your work units will be finished any faster. On the other hand, it means that the otherwise fast units will take much longer because they have to share CPU time with all the others.

The number of threads in a thread pool should typically equal the number of physical CPUs on the machine, plus the number of expected simultaneously-blocked threads. I.e., if you expect to receive 100 connections a second, and each will block on a disk request for 20 milliseconds, you'll get an average of 2 blocked threads. If the server is dual-core, then add two more threads, so your thread pool should have a total of four threads.
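Spelling out that arithmetic (the class name is made up; the expected-blocked term is just arrival rate times block time, i.e. Little's law):

```csharp
static class PoolSizing
{
    // Rule of thumb: pool size = physical CPUs + expected number of
    // simultaneously-blocked threads (arrival rate * block time).
    public static int Recommended(int physicalCores, double arrivalsPerSecond, double blockSeconds)
    {
        int expectedBlocked = (int)(arrivalsPerSecond * blockSeconds);
        return physicalCores + expectedBlocked;
    }
}
```

With the numbers from the example, Recommended(2, 100, 0.020) gives 2 cores + 100 * 0.02 = 2 blocked threads, i.e. a pool of 4.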

Also, you should never issue synchronous requests from one thread pool request to another thread pool request in the same pool -- that way lies madness, and deadlock.

Thanks for the great replies guys! Special kudos to Spoonbender ;)

Btw, I'm guessing that the .NET runtime is using the 25 threads in the default thread pool for stuff other than my async network I/O calls, so maybe I was a bit unclear on that.

Anyway, I really enjoyed reading your replies, and as you suggested I'll make sure to have the possibility to cut down on the threads easily. (Though I'm not sure how much wiser I have become :P )

Thanks!



Threads waste resources. The minimum possible should be used, each with an independent chokepoint (i.e. network, file tasks, main game engine).

Possibly, if you have multiple disks that can execute requests independently, you could have a thread for each. With one disk you can only do one read/write at a time, and all the rest have to wait anyway. Incoming packets get dumped into event queues in a linear fashion, and likewise outbound packets go through a single chokepoint (possibly, if you have multiple NICs, you could have one thread each, but it might not gain you anything).

The main operations done by the game engine should have their many small tasks (like one per connection proxy) broken up into 'fibers' or 'micro threads' that don't call on the OS to switch (you do that yourself with FSMs and queues). These tasks interact with the other support threads (network/file), calling those services and waiting (switching to another fiber) until their request is ready.

The above would be for a single CPU. For multiple cores you add additional threads to divide up the game engine operations (and possibly dedicate a CPU to I/O when it is heavily used). There is often data dependency, which requires either linear processing or wasteful interlocks. Dividing up the game engine tasks for data independence is not always easy (when you want to keep all CPUs as busy as possible). Often things like AI and pathfinding, which are decoupled from the game turn/frame cycle, are run on separate threads because they can be preempted and then resumed (run as background filler between more time-driven game processing). Multiple cores allow more flexibility in fitting together the mix of the different processing that needs to get done.
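A bare-bones version of such a cooperative scheduler (interface and names invented for illustration): each micro task does one small slice of work per call and yields simply by returning, so no OS context switch is involved:

```csharp
using System.Collections.Generic;

// Each "micro thread" is a state machine; Step() runs one small slice
// of work and returns false when the task has finished.
interface IMicroTask
{
    bool Step();
}

class CooperativeScheduler
{
    private readonly List<IMicroTask> tasks = new List<IMicroTask>();

    public void Add(IMicroTask task) { tasks.Add(task); }
    public int Count { get { return tasks.Count; } }

    // One round of the scheduler: give every task a slice, then drop
    // the finished ones. The caller decides how often rounds run.
    public void RunRound()
    {
        tasks.RemoveAll(delegate(IMicroTask t) { return !t.Step(); });
    }
}
```

A task waiting on a network or file request would just return true without doing work until its result is flagged ready, which is the "switch to another fiber" behavior described above.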

Share on other sites
