Sanjokids

  1. I was thinking along these lines as well. The old iPod Touch is weak hardware; that's not my fault. Once I move on to even "decent" hardware, I shouldn't have any performance problem with this issue. But then I thought: my game is simple now, so the objects don't need much memory. What if I moved on to more powerful hardware and created a more complicated game, where each object required more memory and there were more objects? So is it wrong to assume that once you move to better hardware, you will never deal with this problem again? Then I wondered what would happen if I worked on Uncharted at Naughty Dog. They try to squeeze every ounce of resource they have, and I concluded they are probably not doubling up the data.

     Message serialization speed actually was a performance problem when the client and host were the same. I fixed it back then, and I'm still using the same fix, which is pretty much what you describe: I don't actually serialize the messages, but keep the object references intact. I have a concept of a network pipe versus a local pipe for routing messages; the network pipe serializes, and the local pipe does no serialization at all in its implementation (a minimal sketch of this split appears after these posts).

     I haven't thought about this much, because I'm using Unity as my game framework. It uses the concept of prefabs, so I might have a character prefab made up of animation and physics components as well as others. I haven't thought through how to cleanly create a client version and a server version of that prefab; it would be a maintenance nightmare. You wouldn't think an animation component is needed on the server side, since it's all "visual", but I need it because animations can trigger events, and my game logic uses events heavily to run the actual game.
  2. Ok cool. Sounds like a good plan to move forward with. I just need to rework some of my code now so it fits this design better.
  3. Hi all, I'm working on a client-server networking model for my game, built in Unity, but I haven't found a resolution to my issue.

     Basically, the server is authoritative and holds the complete game state of all game objects. It runs all the logic and updates on the game objects. The client also has its "own" copy of the game state, but it doesn't run any of the updates the server does. It only listens to server messages and then synchronizes its game state accordingly (sketched in the second example after these posts). So if the server happens to stop sending messages, the client's game state appears frozen, because there are no updates.

     I had this implemented and everything worked as expected, until I hit a performance problem on an old iPod Touch. The issue is that when one of the players is the host, that device pretty much has to allocate twice as many objects: one set for client processing and another for host processing. The server code still sends messages to the client, and the client processes them exactly the same way remote clients do, even though the client lives on the same device. This turned out to be expensive on an old device, but it seems desirable because it's a clean separation of ownership between the client and server code in the same application.

     So the question is: in a client-server model, is this how it's commonly done in professional games like Halo? It seems like a waste of memory to have twice as many objects, and yet it seems like the clean way to do things. Granted, you don't need to allocate all the graphical resources for the server-side set of game objects, but what remains can still be significant.

     My current workaround is that when you are hosting a game, only one set is allocated; the client and server share the game objects and update them, with checks in place so they don't step on each other's toes (see the role-check sketch after these posts). This workaround might actually be the ideal solution; I just need to design my code better around it.

     Thoughts and feedback would be greatly appreciated. Thanks.
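
A minimal sketch of the network-pipe-versus-local-pipe split from the first post, assuming hypothetical types (GameMessage, IMessagePipe, NetworkPipe, LocalPipe) and an illustrative serializer, since the original code isn't shown:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical message type; the real game's message format is not shown.
public class GameMessage
{
    public int ObjectId;
    public string Payload;
}

// Both pipes expose the same interface, so the server code doesn't care
// whether the receiving client is remote or on the same device.
public interface IMessagePipe
{
    void Send(GameMessage message);
}

// Remote peer: the message must be serialized into bytes for transport.
public class NetworkPipe : IMessagePipe
{
    private readonly Action<byte[]> _transport; // e.g. a socket send callback

    public NetworkPipe(Action<byte[]> transport) { _transport = transport; }

    public void Send(GameMessage message)
    {
        // Illustrative serialization; a real game would use its own format.
        byte[] bytes = System.Text.Encoding.UTF8.GetBytes(
            $"{message.ObjectId}|{message.Payload}");
        _transport(bytes);
    }
}

// Local peer (the host's own client): hand the object reference straight
// to the client's inbox, with no serialization at all -- the fix the post
// describes for the host-side serialization cost.
public class LocalPipe : IMessagePipe
{
    private readonly Queue<GameMessage> _clientInbox;

    public LocalPipe(Queue<GameMessage> clientInbox) { _clientInbox = clientInbox; }

    public void Send(GameMessage message)
    {
        _clientInbox.Enqueue(message); // reference stays intact
    }
}
```

The point of the shared interface is that the server picks a pipe per connection once, at join time, and everything downstream stays identical for local and remote clients.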
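A minimal sketch of the client side of the authoritative model described in the question, with hypothetical names (StateMessage, ClientWorld): the client runs no simulation of its own and only overwrites its copy with whatever the server last sent, which is why it freezes when messages stop:

```csharp
using System.Collections.Generic;

// Hypothetical per-object snapshot sent by the authoritative server.
public class StateMessage
{
    public int ObjectId;
    public float X, Y;
}

// The client's own copy of the game state. It runs none of the server's
// update logic; it only mirrors what the server reports.
public class ClientWorld
{
    private readonly Dictionary<int, StateMessage> _objects =
        new Dictionary<int, StateMessage>();

    public void ApplyServerMessage(StateMessage msg)
    {
        _objects[msg.ObjectId] = msg; // overwrite: the server is the authority
    }
}
```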
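And a sketch of the single-set workaround from the last paragraph, assuming a hypothetical NetworkRole enum: when the device is the host, one shared object is updated in place by the server logic, and the client-side sync path is guarded off so the two sides don't step on each other's toes:

```csharp
public enum NetworkRole { DedicatedClient, Host }

public class SharedGameObject
{
    public float Health;
    private readonly NetworkRole _role;

    public SharedGameObject(NetworkRole role) { _role = role; }

    // Server-side simulation: only runs on the host.
    public void ServerUpdate(float damage)
    {
        if (_role != NetworkRole.Host) return; // guard: clients never simulate
        Health -= damage;
    }

    // Client-side sync: a host skips this, because the same object was
    // already updated in place by ServerUpdate.
    public void ApplyServerState(float health)
    {
        if (_role == NetworkRole.Host) return; // guard: avoid double-applying
        Health = health;
    }
}
```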