So I have my little multiplayer TCP game socket server running on a dedicated server on the web somewhere, and I'm having lag issues as the player count goes up: with 10 players I'm seeing 300 ms in-game ping times, which makes the game unplayable.
Ping time is defined as the round trip time for an in-game message.
I'm struggling to diagnose the problem. The server-side game loop itself doesn't slow down at all with 10 players, so I'm thinking it must be something to do with the socket server.
I'm running a C# server in Mono on a Linux box. The game thread is separate from the socket-server thread; messages are passed from the socket server into a queue on the game thread, and all of them are guaranteed to be processed every frame.
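To make the hand-off concrete, here's a minimal sketch of that pattern (the class and method names are my own illustration, not the actual code): the socket thread enqueues raw messages, and the game thread drains everything queued once per frame.

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical sketch of the socket-thread -> game-thread hand-off.
class MessagePump
{
    private readonly ConcurrentQueue<byte[]> _inbox = new ConcurrentQueue<byte[]>();

    // Called from the socket-server thread (e.g. inside a receive callback).
    public void Post(byte[] message) => _inbox.Enqueue(message);

    // Called once per frame on the game thread; processes every queued message.
    public int DrainFrame(Action<byte[]> handler)
    {
        int processed = 0;
        byte[] msg;
        while (_inbox.TryDequeue(out msg))
        {
            handler(msg);
            processed++;
        }
        return processed;
    }
}
```

`ConcurrentQueue<T>` keeps the enqueue side lock-free, so the socket thread never blocks on the game thread.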
* I've got `NoDelay` (TCP_NODELAY) set on every socket connection made
* The messages received fit fine into the allotted buffer size I have
* I'm only doing around 100 sends per second on the server with 10 players
* I'm using the asynchronous socket functions BeginReceive(), BeginSend(), BeginAccept() etc...
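For reference, here's roughly what that setup looks like; this is an assumed reconstruction, not the actual server code, and the class name and buffer size are made up. `NoDelay` disables Nagle's algorithm so small game messages go out immediately, and the `BeginReceive` callback re-arms itself so the socket keeps reading asynchronously.

```csharp
using System;
using System.Net.Sockets;

// Illustrative sketch of one client's async receive path.
class ClientConnection
{
    private readonly Socket _socket;
    private readonly byte[] _buffer = new byte[4096]; // assumed buffer size

    public ClientConnection(Socket socket)
    {
        _socket = socket;
        _socket.NoDelay = true; // TCP_NODELAY: don't batch small packets
        _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None,
                             OnReceive, null);
    }

    private void OnReceive(IAsyncResult ar)
    {
        int bytes = _socket.EndReceive(ar);
        if (bytes > 0)
        {
            // ... hand the bytes off to the game thread's queue here ...
            _socket.BeginReceive(_buffer, 0, _buffer.Length, SocketFlags.None,
                                 OnReceive, null); // re-arm for the next read
        }
    }
}
```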
Any advice about where to look would be greatly appreciated!
It turned out that by putting an 'infinite' loop inside the BeginAccept() callback I was essentially hijacking the I/O thread pool's threads, depleting the pool to the point where no threads were left to run the BeginReceive() callbacks, and that was what caused the lag.
I fixed this by returning from the callbacks as quickly as possible, so the threads could go back to the I/O completion thread pool.
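The fixed accept pattern looks roughly like this (a minimal sketch under my own naming, not the original code): the callback accepts the pending connection, immediately re-arms `BeginAccept`, and then returns, so the pool thread is released instead of being held in a loop.

```csharp
using System;
using System.Net;
using System.Net.Sockets;

// Sketch of the corrected accept path: no looping or blocking in the callback.
class AcceptLoop
{
    private readonly Socket _listener;

    public AcceptLoop(int port)
    {
        _listener = new Socket(AddressFamily.InterNetwork,
                               SocketType.Stream, ProtocolType.Tcp);
        _listener.Bind(new IPEndPoint(IPAddress.Loopback, port));
        _listener.Listen(100);
        _listener.BeginAccept(OnAccept, null); // arm the first accept
    }

    public int LocalPort => ((IPEndPoint)_listener.LocalEndPoint).Port;

    private void OnAccept(IAsyncResult ar)
    {
        Socket client = _listener.EndAccept(ar);
        _listener.BeginAccept(OnAccept, null); // re-arm; do NOT loop here
        // ... hand `client` off to the receive path, then return at once,
        // releasing this thread back to the I/O completion pool.
    }
}
```

The key difference from the broken version is that each callback does a bounded amount of work: one `EndAccept`, one `BeginAccept`, one hand-off, then return.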