How to efficiently handle game objects in a client server model on the host side?


Hi all, I'm working on a client-server networking model for my game built in Unity, but I haven't found a resolution to my issue.

Basically what happens is the server is authoritative and has the complete game state of all game objects. It runs all the logic and updates on the game objects. Each client also has its "own" copy of the game state, but it doesn't run any of the updates that the server does. It only listens to server messages and then synchronizes its game state accordingly. So if the server stops sending messages, the client game state appears frozen because there are no updates.
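Roughly, the client side ends up looking something like this (the StateUpdate message and the integer id scheme here are simplified placeholders, not the real message format):

```csharp
using System.Collections.Generic;
using UnityEngine;

// A hypothetical per-object state message sent by the server.
public struct StateUpdate
{
    public int ObjectId;
    public Vector3 Position;
    public Quaternion Rotation;
}

public class ClientGameState : MonoBehaviour
{
    // The client's "own" copy of the game state, keyed by network id.
    private readonly Dictionary<int, GameObject> objects = new Dictionary<int, GameObject>();

    // Called for every message received from the server. The client never
    // simulates on its own, so if messages stop arriving, the state freezes.
    public void ApplyUpdate(StateUpdate update)
    {
        GameObject go;
        if (objects.TryGetValue(update.ObjectId, out go))
        {
            go.transform.position = update.Position;
            go.transform.rotation = update.Rotation;
        }
    }
}
```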

I had this implemented and everything worked as expected, until I hit a performance problem on an old iPod Touch device. The issue is that when one of the players is the host, he pretty much has to allocate twice as many objects: one set for client processing and the other for host processing. The server code still sends messages to the client, and the client processes them exactly the same way as remote clients do, even though the client exists on the same device. This turned out to be expensive performance-wise on an old device, but it seems desirable because it's a clean separation of ownership between client and server code in the same application.

So the question is, in a client-server model, is this how it's commonly done in professional games, like Halo? It seems like a waste of memory to have twice as many objects, and yet it seems like the clean way to do things. Granted, you don't need to allocate all the graphical stuff for the server-side set of game objects, but they could still take significant resources.

My current workaround is, when you are hosting a game, to allocate just one set; the client and server share the game objects and update them, with checks in place so they don't step on each other's toes. This workaround could actually be the ideal solution, but I just need to design my code better around it.

Thoughts and feedback would be greatly appreciated. Thanks.


I came up with a very simple way of getting around this issue in my engine when I realized I would have two maps loaded (one for server-side calculations and one for client rendering) on the client that was hosting the server. I decided to treat all map resources (maps, models, textures, entities, etc.) as a set of data separate from both the client and the server. If the client is connecting to a remote server, it uses the "global" data set to maintain its copy of the server's state. If a server is running as dedicated, it uses this "global" data set to update the game state. If a client is hosting a multiplayer game or running a single-player game (i.e. connecting to a local server), the server-side code modifies the "global" data set and the client-side code reads from it.

Edit: I noticed you came up with the same idea after I wrote this. I recommend that you go for it.
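In skeleton form, what I described above looks something like this (the class names are just illustrative; the point is one shared data set with the server as the only writer):

```csharp
using System.Collections.Generic;
using UnityEngine;

// The shared simulation state, owned by neither side.
public class Entity
{
    public int Id;
    public Vector3 Position;
    // ...any other simulation state both sides care about
}

public class GlobalGameData
{
    public readonly Dictionary<int, Entity> Entities = new Dictionary<int, Entity>();
}

// Server-side code: the only writer of the global set.
public class LocalServer
{
    private readonly GlobalGameData data;
    public LocalServer(GlobalGameData data) { this.data = data; }

    public void Tick(float dt)
    {
        foreach (Entity entity in data.Entities.Values)
        {
            // run game logic, physics, etc., mutating entity state
        }
    }
}

// Client-side code: reads the same set, never writes to it.
public class LocalClient
{
    private readonly GlobalGameData data;
    public LocalClient(GlobalGameData data) { this.data = data; }

    public void Render()
    {
        foreach (Entity entity in data.Entities.Values)
        {
            // draw the entity at entity.Position
        }
    }
}
```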

Ok cool. Sounds like a good plan to move forward with. I just need to rework some of my code now so it fits this design better.

I had this implemented and everything worked as expected, until I hit a performance problem on an old iPod Touch device. [...] So the question is, in a client-server model, is this how it's commonly done in professional games, like Halo?

Nobody ever expected to be able to host a Halo server on an iPod Touch. You're talking about quite old and slow hardware that wasn't even designed for games. So you do need to have reasonable expectations.

So yes, this is pretty much how real games do it. It's always been the case that whoever hosts the game is expected to have the more powerful machine.

The issue is that when one of the players is the host, he pretty much has to allocate twice as many objects: one set for client processing and the other for host processing. The server code still sends messages to the client, and the client processes them exactly the same way as remote clients do, even though the client exists on the same device.

You can make the messaging more efficient by just passing the messages by reference and ignoring all the serialisation and networking. You can't do much about having twice the objects, unfortunately, if you want clean code. I suppose you could hack around it by sharing the objects across both versions of the game, but in my experience the client objects and server objects are usually quite different, both in terms of the data they store and in how they act (e.g. interpolated positions for client objects). If you can get away with using one set of objects, though, feel free to continue.

So yes, this is pretty much how real games do it. It's always been the case that whoever hosts the game is expected to have the more powerful machine.

I was thinking along these lines as well. An old iPod Touch is weak; it's not my fault. Once I move onto even "decent" hardware, I shouldn't have any performance problems with this issue. But then I thought: my game is simple now, so the objects don't need much memory. What if I moved to more powerful hardware and created a more complicated game, where each object required more memory and there were more objects? So is it wrong to assume that, moving on to better hardware, you will never deal with this problem again? Then I was thinking, what if I worked on Uncharted for Naughty Dog? You know they literally try to maximize every ounce of resource they have, and I concluded they are probably not doubling up the data.

You can make the messaging more efficient by just passing the messages by reference and ignoring all the serialisation and networking

Message serialization speed actually was a performance problem when the client and host were the same. I fixed it back then, and am still using the same fix, which is pretty much what you describe: I don't actually serialize the messages, but keep the object references intact. I have a concept of a network pipe versus a local pipe; the network pipe routes messages via serialization, while the local pipe doesn't do any serialization in its implementation.
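In skeleton form, the two pipes look something like this (the type names here are illustrative, not my actual classes):

```csharp
using System;

public class GameMessage { /* fields shared by all messages */ }

public interface IMessageHandler
{
    void Handle(GameMessage message);
}

public interface IMessagePipe
{
    void Send(GameMessage message);
}

// Remote case: serialize the message and push the bytes onto the wire.
public class NetworkPipe : IMessagePipe
{
    private readonly Func<GameMessage, byte[]> serialize;
    private readonly Action<byte[]> sendBytes;

    public NetworkPipe(Func<GameMessage, byte[]> serialize, Action<byte[]> sendBytes)
    {
        this.serialize = serialize;
        this.sendBytes = sendBytes;
    }

    public void Send(GameMessage message)
    {
        sendBytes(serialize(message)); // serialization cost paid only here
    }
}

// Local case: hand the same object reference straight to the local
// client's handler, so no serialization cost is paid in-process.
public class LocalPipe : IMessagePipe
{
    private readonly IMessageHandler handler;

    public LocalPipe(IMessageHandler handler) { this.handler = handler; }

    public void Send(GameMessage message)
    {
        handler.Handle(message); // no serialization at all
    }
}
```

The server code only ever talks to IMessagePipe, so it doesn't know or care whether the receiver is remote or on the same device.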

client objects and server objects are usually quite different, both in terms of the data they store and in how they act (e.g. interpolated positions for client objects).

I haven't thought about this so much, because I'm using Unity as my game framework. Unity uses the concept of prefabs, so I might have a character prefab made up of components like animation and physics, as well as any other components. I haven't thought through how to cleanly create a client and a server version of the prefab; it would be a maintenance nightmare. You wouldn't think that an animation component is needed on the server side of things, since it's all "visual", but I need it because animations can trigger events, and my game logic uses events a lot to run the actual game.
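For example, something like this is roughly how an animation-driven event feeds the game logic (the method and event names here are made up):

```csharp
using System;
using UnityEngine;

// Attached to the animated character. An AnimationEvent keyed on the attack
// clip calls OnAttackHit() by name when the clip reaches that frame, and the
// server's game logic subscribes to the C# event it raises.
public class AttackAnimationEvents : MonoBehaviour
{
    public event Action AttackHit;

    // Invoked by Unity via an AnimationEvent on the clip.
    public void OnAttackHit()
    {
        if (AttackHit != null)
            AttackHit();
    }
}
```

So stripping animation from a server-side prefab would silently break this kind of logic, which is why the "visual" component has to stay.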

Then I was thinking what if I worked on Uncharted for Naughty Dog?

If you were working on Uncharted for Naughty Dog then you would have another 20 programmers on your team and could afford to spend time on these little things. For a small or one-person team, you can't worry about every byte. If you try to replicate AAA approaches with an indie team then you won't finish your game.

Unity uses the concept of prefabs, so I might have a character prefab made up of components like animation and physics, as well as any other components. I haven't thought through how to cleanly create a client and a server version of the prefab; it would be a maintenance nightmare.

Generally you're not going to have every single thing be a prefab and nothing but the prefab. You might Instantiate a prefab and then add components to it depending on whether it's a server object or a client object. But it's up to you - if one object works well on both the client and server side then that's great.
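In sketch form, something like this (ServerCharacterLogic and ClientInterpolation are made-up placeholder components):

```csharp
using UnityEngine;

// Placeholder components for each side.
public class ServerCharacterLogic : MonoBehaviour { /* authoritative updates */ }
public class ClientInterpolation : MonoBehaviour { /* smoothed positions */ }

public class CharacterSpawner : MonoBehaviour
{
    public GameObject characterPrefab;

    public GameObject Spawn(bool isServer)
    {
        // One shared prefab; each side bolts on only what it needs.
        GameObject character = Instantiate(characterPrefab);

        if (isServer)
            character.AddComponent<ServerCharacterLogic>();
        else
            character.AddComponent<ClientInterpolation>();

        return character;
    }
}
```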
