Multiplayer Setup - Lobby
Members - Reputation: 772
Posted 02 February 2013 - 01:58 PM
- Requests to join a multiplayer game.
- Sends game data to HOST.
- Accepts game data from HOST.
- Accepts incoming message for matchmaking.
- Stores list of players ready to play.
- Pairs players up at random (at first, then by other factors later).
- Accepts incoming data from client and sends data out to all clients playing the game.
Here are the steps for how it would work:
1. Client requests to play online game.
2. Server accepts request, stores IP:Port of client.
3. Server waits for another request.
4. Server accepts another request.
5. Server matches 2 requests.
6. Server chooses 1 client to be host.
7. Server sends client connection data to each client and removes both clients from matchmaking list.
8. Both clients send game data to the host, which is 1 of the clients.
9. Game continues till end.
10. If no disconnects, host reports game stats to stat server for online leaderboards/etc.
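The pairing part of the steps above (1-7) could be sketched roughly like this. This is a minimal in-memory simulation, not a real socket server, and all the names are made up:

```python
import random

class Matchmaker:
    """Sketch of the matchmaking flow: queue join requests, pair them off."""

    def __init__(self):
        self.waiting = []  # steps 2-4: clients waiting, stored as (ip, port)

    def request_join(self, ip, port):
        """Step 1-2: accept a join request and store the client's IP:port."""
        self.waiting.append((ip, port))
        return self.try_match()

    def try_match(self):
        """Steps 5-7: pair two waiting clients, pick one as host at random,
        and remove both from the matchmaking list."""
        if len(self.waiting) < 2:
            return None  # step 3: keep waiting for another request
        a = self.waiting.pop(0)
        b = self.waiting.pop(0)
        host, guest = random.sample([a, b], 2)  # step 6: random host choice
        # Step 7: a real server would now send each client the other's
        # connection data so they can connect directly to each other.
        return {"host": host, "guest": guest}
```

With this, the first `request_join` returns `None` (still waiting), and the second returns the matched pair with one side chosen as host.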
Q1: Is this how a basic matchmaking server and client/host server works?
Q2: This example shows how a user is the HOST. If I wanted a server to be the HOST, would I need to run, say, 500 server applications as a service to host 500 games? Is that how it's done?
Members - Reputation: 239
Posted 02 February 2013 - 03:18 PM
If you're looking for a lobby system, you probably want to keep some representation of the lobby while people are waiting, not just pair everyone up once you've found enough people to start the game.
Then again, it all depends on what you're looking for. In Left 4 Dead you usually end up in a half-filled lobby when finding a random game, whereas other titles skip the lobby altogether.
But to answer your questions:
Q1: Your approach above should suffice, but in the end it will change depending on how you want the game join flows, lobbies, etc. to work. The easiest approach is to have players choose between joining or hosting up front; the downside is that it ties players up waiting until the game has filled up enough to start.
Q2: This also depends a lot on your requirements. It's usually most straightforward to set up all game instances up front, in the various game mode, map, and mutator combinations. However, that approach doesn't adapt well to changing popularity.
More complex approaches entail having a system which monitors the server utilization of the various setups, spinning up new instances if you have less than X percent capacity left for any one setup. You can even make it take resource costs into consideration, so it can run more smaller games or fewer larger ones on the same hardware.
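The utilization-monitor idea could look something like this; the function name, threshold, and data shape are all illustrative assumptions:

```python
def instances_to_spawn(setups, spare_threshold=0.25):
    """Decide which setups need another instance spun up.

    `setups` maps a setup name (mode/map combo) to (used_slots, total_slots).
    If less than `spare_threshold` of that setup's capacity is free,
    request one more instance for it.
    """
    to_spawn = {}
    for name, (used, total) in setups.items():
        spare = (total - used) / total if total else 0.0
        if spare < spare_threshold:
            to_spawn[name] = 1  # a real system might scale this count
    return to_spawn
```

For example, a setup with 9 of 10 slots used (10% spare) would get a new instance, while one with 3 of 10 used would not.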
Another approach is to spin up "empty" server instances which can be reset to any player-requested configuration on demand. That way you don't have to maintain the various combinations and can leave it up to the players to choose when they reset the server. It's harder to manage CPU utilization, though, if players happen to create all the smaller games on one machine or if another gets all the large games.
Members - Reputation: 107
Posted 02 February 2013 - 04:17 PM
Just my 2 cents here.
The algorithm you have seems pretty solid to me. I'm not sure if you're using Winsock or something else, or whether you're using UDP or TCP. If you're using UDP, you might want to add a check every few seconds to see if the clients are still responding, and/or a test to see how many packets are being lost (make sure it's within an acceptable range for your game). A quick ping would confirm there wasn't a disconnect, so that invalidated players could be taken out of the saved-players list. Although TCP is more reliable for data packets, disconnects can still happen, so you could do the same with a TCP connection. In the master server I would store all the players, but I would also check every 60 seconds or so to see if each ip:port still responds to pings. I would add this because after a while your list could fill up with data for players whose connections dropped; also, if your matchmaking ends up having some sort of user login/validation logic, the server will think a player is already connected when they try to log back in after a disconnection.
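The 60-second sweep described above could be sketched like this (the function and its arguments are my own names; in a real server the timestamps would come from actual ping replies):

```python
import time

def prune_stale_players(players, timeout=60.0, now=None):
    """Drop player entries that haven't answered a ping recently.

    `players` maps (ip, port) -> timestamp of the last ping reply seen.
    Entries older than `timeout` seconds are removed in place, and the
    removed addresses are returned so the server can also log them out.
    """
    now = time.time() if now is None else now
    stale = [addr for addr, last in players.items() if now - last > timeout]
    for addr in stale:
        del players[addr]
    return stale
```

Returning the removed addresses matters for the login/validation point: the server can clear those sessions so a dropped player can log back in.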
Q2: I'm not sure if you meant that you would code 500 host applications separate from the master server executable. I wouldn't launch 500 applications at the start; that would use resources that aren't needed until a game is actually matched. Also, in that case I think you would end up having to set up a separate port for each application so that you could have 500 different listeners, which would force you to code the client to test every possible port to find an empty game server. (However, if you had a single master server, clients could connect to it first and be assigned a port/host application to play on, possibly only launching new instances of the application as needed. Again, I'm not sure what your design plan was.) This is definitely doable.
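The "master server assigns ports" variant could look something like this sketch. It only records the allocation; the class and port range are made up, and a real implementation would actually launch the game-server process (e.g. via `subprocess`) and tell both clients which port to use:

```python
import itertools

class MasterServer:
    """Allocates a fresh port per match instead of pre-starting 500 listeners."""

    def __init__(self, first_port=20000):
        self._ports = itertools.count(first_port)  # next free port number
        self.games = {}  # port -> (player_a, player_b)

    def start_game(self, player_a, player_b):
        """Allocate a port for a new game and record who plays on it.
        A real server would spawn the game instance here on demand."""
        port = next(self._ports)
        self.games[port] = (player_a, player_b)
        return port
```

This way clients never have to scan ports themselves: they ask the master, and it hands back the port of their game.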
Instead, I would code a single master server application as a multithreaded application, with each hosted game running as a thread inside it. I'd create a linked list of game threads (or whatever structure you prefer). Every time two players were matched for a new game, I would create a new node and spawn a new thread, passing in the players' data. Since you're limited by hardware, I would probably code a maximum number of threads (available games). This way, games can be tracked by the master server application and added to or removed from the list as needed. It also gives you the ability to check in on games and test that they haven't disconnected; if one has, you can reclaim the game node and log the players out. After a while your list could fill up with dead games, so you may want to code a process that walks the list every once in a while to make sure things are going OK. Maybe later you could even add some tests to make sure players aren't cheating.
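A bare-bones sketch of that one-process, thread-per-game design (structure and names are my own, and the game loop is stubbed out):

```python
import threading

MAX_GAMES = 500  # cap on concurrent games, since hardware is limited

class GameSession(threading.Thread):
    """One thread per matched game."""

    def __init__(self, player_a, player_b):
        super().__init__(daemon=True)
        self.players = (player_a, player_b)

    def run(self):
        # Real game loop goes here: relay data between the two players
        # until the game ends or someone disconnects.
        pass

class GameRegistry:
    """The master server's list of running games."""

    def __init__(self):
        self.sessions = []

    def start_game(self, player_a, player_b):
        if len(self.sessions) >= MAX_GAMES:
            return None  # server full, reject or queue the match
        session = GameSession(player_a, player_b)
        self.sessions.append(session)
        session.start()
        return session

    def reap_finished(self):
        """The periodic walk described above: drop threads that have
        ended so dead games don't pile up. Returns the live count."""
        self.sessions = [s for s in self.sessions if s.is_alive()]
        return len(self.sessions)
```

A background timer in the master could call `reap_finished()` every minute or so, which doubles as the place to log disconnected players out.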
What I said above only works if you plan to have a single computer act as your server. If your plan is to have servers on several machines, then you would need to do something different and have at least a server/host executable on each machine.