Why does everyone develop MMORPGs in client/server mode?

The main challenge is cheating/security.

Another challenge is discovery: if I am far away from you, and thus we're not currently sending information to each other, and I then move close enough to you that we should "discover" each other, then how do your peer and my peer find each other? Doing this entirely peer-to-peer is not yet a solved problem; there's always some seeding server that starts up the process in the architectures I've seen.
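A minimal sketch of what that seed-server lookup might look like, assuming peers report coarse positions to a directory that hands back the addresses of peers now in range (the names, addresses, and radius here are invented for illustration, not taken from any real system):

// Hypothetical sketch of the "seeding server" idea: peers report their
// position to a directory, which tells them which other peers are in range.
#include <cmath>
#include <cstdio>
#include <map>
#include <string>
#include <vector>

struct PeerInfo {
    std::string address;   // where other peers can reach this one
    float x, y;            // last reported world position
};

// Directory kept by the seed server; peer id -> info.
std::map<int, PeerInfo> g_directory;

// Called when a peer sends a position update to the seed server.
// Returns the addresses of all peers within 'radius' of the sender,
// which the sender should then contact directly (P2P).
std::vector<std::string> UpdateAndDiscover(int peerId, const PeerInfo& me,
                                           float radius) {
    g_directory[peerId] = me;
    std::vector<std::string> nearby;
    for (const auto& [id, other] : g_directory) {
        if (id == peerId) continue;
        float dx = other.x - me.x, dy = other.y - me.y;
        if (std::sqrt(dx * dx + dy * dy) <= radius)
            nearby.push_back(other.address);
    }
    return nearby;
}

int main() {
    UpdateAndDiscover(1, {"10.0.0.1:4000", 0.f, 0.f}, 50.f);
    for (const auto& a : UpdateAndDiscover(2, {"10.0.0.2:4000", 30.f, 0.f}, 50.f))
        std::printf("peer 2 should connect to %s\n", a.c_str());
}

Note that the seed server is still a central component, which is exactly why this isn't "entirely peer-to-peer."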

A third challenge is efficiency: with client/server communication, you send only your own data to the server, and you receive a consolidated packet with the N closest entities from the server. Header overhead is small, and upload bandwidth use is small. In P2P, you have to send your data to each of the N nearby peers, which means your upstream is suddenly multiplied by N; further, you'll receive updates from the N closest peers as individual packets, instead of consolidated packets, for much less efficient usage of your network link. This is extra bad because most consumer broadband connections are asymmetric -- there are cable connections with 4.0 Mbit down, and 48 kbit up!
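To put rough numbers on that, here's a back-of-the-envelope comparison; the packet sizes and update rate are assumptions for illustration, not measurements:

// Back-of-the-envelope comparison (assumed packet sizes, 20 Hz update rate).
#include <cstdio>

int main() {
    const int headerBytes   = 28;   // UDP + IP header
    const int payloadBytes  = 24;   // one entity's state update (assumed)
    const int updatesPerSec = 20;
    const int N = 30;               // nearby entities/peers

    // Client/server: one packet up, one consolidated packet down.
    int csUp   = (headerBytes + payloadBytes) * updatesPerSec;
    int csDown = (headerBytes + N * payloadBytes) * updatesPerSec;

    // P2P: one packet up per peer, one packet down per peer.
    int p2pUp   = N * (headerBytes + payloadBytes) * updatesPerSec;
    int p2pDown = p2pUp;  // symmetric: everyone does the same

    std::printf("client/server: %5d B/s up, %5d B/s down\n", csUp, csDown);
    std::printf("peer-to-peer:  %5d B/s up, %5d B/s down\n", p2pUp, p2pDown);
    // ~31 kB/s upstream for P2P already exceeds a 48 kbit/s (6 kB/s) uplink.
}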

If you really want to look into P2P MMOGs, then I suggest you google for the "VAST" project; it's an academic research project and open source effort on the subject.
enum Bool { True, False, FileNotFound };
Hey everyone. This topic comes up often enough. Suffice it to say, this is a more difficult problem than the OP recognizes, but not as difficult as everyone else has been saying. It is not impossible to fix at least some of the problems mentioned here.

Possibly the easiest to fix is "having to send positions to everyone". Instead of considering your networked machines distributed clients, consider them distributed servers. Instead of each "client" holding their own info, they are responsible for some other set of information. For example, one client could handle a set of shops, another the NPCs, and another player positions within a zone. On top of all this put a server that knows who has what. The server just needs to distribute that info as needed, and then the clients can talk to each other to actually play.
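A minimal sketch of that coordinator idea, assuming the server keeps nothing but a table mapping data domains to the client that currently owns them (all names and addresses here are made up):

// Minimal sketch of the coordinator: the server only maps data domains to
// the client currently serving them.
#include <cstdio>
#include <map>
#include <string>

enum class Domain { Shops, NPCs, PlayerPositions };

std::map<Domain, std::string> g_owner;  // domain -> address of owning client

void Assign(Domain d, const std::string& clientAddr) { g_owner[d] = clientAddr; }

// A client asks the coordinator where to send shop queries, NPC updates, etc.
std::string Lookup(Domain d) {
    auto it = g_owner.find(d);
    return it != g_owner.end() ? it->second : "";  // empty: unassigned
}

int main() {
    Assign(Domain::Shops, "10.0.0.5:5000");
    Assign(Domain::NPCs,  "10.0.0.7:5000");
    std::printf("talk to %s about shops\n", Lookup(Domain::Shops).c_str());
}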

Security is a more difficult beast, but steps can be taken. Redundant data across clients is one possibility, where the server checks data veracity every once in a while or only checks when clients disagree with each other. By intelligently copying data across clients you make it so that a cheater has to control a large percentage of clients to cheat (at which point you could even instance a new world just for the cheaters).
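The disagreement check might boil down to a majority vote along these lines -- a deliberately naive sketch, since (as the replies below point out) real replica consistency is much harder:

// Sketch of the "check when replicas disagree" idea: the server asks every
// client holding a copy for its value and trusts the majority.
#include <cstdio>
#include <map>
#include <vector>

// Given the values reported by each replica, return the majority value,
// or -1 if there is no strict majority (then escalate: server recomputes).
int MajorityValue(const std::vector<int>& reports) {
    std::map<int, int> counts;
    for (int v : reports) ++counts[v];
    for (const auto& [value, count] : counts)
        if (count * 2 > (int)reports.size()) return value;
    return -1;
}

int main() {
    // Three clients hold the same gold total; one of them is cheating.
    std::vector<int> reports = {100, 100, 999999};
    std::printf("accepted value: %d\n", MajorityValue(reports));  // 100
}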

Of course, every possibility also raises questions of bandwidth. Creating data redundancy also means increasing the required bandwidth for each client. Doubling redundancy can also mean doubling bandwidth. You also have to deal with the normal connection issues like lag, except with someone else's computer playing the server.
Turing Machines are better than C++ any day ^_~
Sadly, it isn't as simple as assigning each client a small load of the world (shops, NPCs, etc.). On the other hand, if you "packed" your server objects, you could assign, let's say, a pack of 5 NPCs from the graveyard to 2 clients. This would give the server a backup-type system in case one player drops, bringing us to the question: what happens if the 2 clients drop from a power outage in their area and they were assigned the same pack? The NPCs in the graveyard would have no interaction with the game world? They would disappear until re-assigned?
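For what it's worth, the server-side bookkeeping for such a pack/backup scheme could look roughly like this (the addresses and the spare-pool idea are invented for illustration):

// Sketch of pack assignment with a backup: each pack of NPCs is served by a
// primary and a backup client; when one drops, the server refills the slots.
#include <cstdio>
#include <string>
#include <vector>

struct Pack {
    std::string name;
    std::string primary;  // client address, empty if none
    std::string backup;
};

// Called by the server when a client disconnects. Returns true if the pack
// still has someone serving it; false means the NPCs are offline until a
// replacement client is found (the "power outage" case above).
bool HandleDrop(Pack& p, const std::string& dropped,
                std::vector<std::string>& sparePool) {
    if (p.primary == dropped) { p.primary = p.backup; p.backup.clear(); }
    else if (p.backup == dropped) { p.backup.clear(); }
    // Refill from the spare pool where possible.
    if (p.primary.empty() && !sparePool.empty()) {
        p.primary = sparePool.back(); sparePool.pop_back();
    }
    if (p.backup.empty() && !sparePool.empty()) {
        p.backup = sparePool.back(); sparePool.pop_back();
    }
    return !p.primary.empty();
}

int main() {
    std::vector<std::string> spares = {"10.0.0.9:5000"};
    Pack graveyard{"graveyard NPCs", "10.0.0.3:5000", "10.0.0.4:5000"};
    HandleDrop(graveyard, "10.0.0.3:5000", spares);
    std::printf("primary is now %s, backup %s\n",
                graveyard.primary.c_str(), graveyard.backup.c_str());
}

If HandleDrop returns false -- both assigned clients gone and no spares left -- the pack really is offline until the server finds a replacement, which is exactly the failure case asked about above.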
Hello?
Before you say that redundancy is a solution, I encourage you to actually try implementing it, and trying to avoid world splits and keeping everything consistent. Let us all know when you've succeeded ;-) It's a really hard problem that's not yet been solved in this context.
enum Bool { True, False, FileNotFound };
I have also thought about this, and what I found out is that you have to redesign your game :-)

For example, you could be an island in an island world. Islands don't move, so P2P communication with nearby islands would be possible, and would take load off your server. Your world could consist of simulation objects, like a bullet going from A to B, and then maybe 3 peers could precalculate the result and compare the different results?
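For that comparison to work, the simulation has to be bit-for-bit deterministic across peers, which in practice means integer or fixed-point math. A toy sketch of the idea (the physics and constants are made up):

// Sketch of the "3 peers precalculate and compare" idea: each peer runs the
// same deterministic (integer-only) bullet simulation, and the results are
// compared; any disagreement flags a peer for checking.
#include <cstdint>
#include <cstdio>

// Deterministic fixed-point bullet step; identical on every peer as long as
// no floating point is involved.
uint32_t SimulateBullet(int32_t x, int32_t vx, int steps) {
    uint32_t hash = 2166136261u;          // FNV-1a over the trajectory
    for (int i = 0; i < steps; ++i) {
        x += vx;
        vx -= 1;                          // simple deceleration
        hash = (hash ^ (uint32_t)x) * 16777619u;
    }
    return hash;
}

int main() {
    uint32_t a = SimulateBullet(0, 100, 60);  // peer A
    uint32_t b = SimulateBullet(0, 100, 60);  // peer B
    uint32_t c = SimulateBullet(0, 100, 60);  // peer C
    std::printf(a == b && b == c ? "results agree\n" : "disagreement!\n");
}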

Slow changes could be a keyword here :-)

A free-roaming, open-world MMO with high-speed action isn't a good fit for P2P, as I see it.
-Anders-Oredsson-Norway-
There has been talk (in at least one game programming book I saw more than a year ago) about having some server functionality (data processing) farmed out to the client machines. To overcome the reliability and security issues, part of the system would do testing to establish 'trusted peers', and the data processing would be sufficiently mystified and encrypted to make it hard for any cheater to glean useful game info (and change it as well).

Batched tasks would be sent from centralized servers out to the processing peers and could change constantly, so that it would be hard to intercept related data for any length of time. Certain AI functions could be done this way (and the asymmetric connection isn't as prohibitive here, as usually the results sent back (uplink) are much smaller than the world situational data used as input (downlink)).

Mystification measures might include AI scripts/DLL code being resent on the fly to run -- possibly patched/recompiled daily to make reverse engineering in real time nearly impossible (because you don't get the code till it's about to be run). Data could have any recognizable game identifiers removed/encoded (basically a bunch of ordinals representing some small part of the map and unknown objects...) and decoded when reinserted into the central servers.
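The identifier-encoding part might look something like this -- purely illustrative, with the decode table living only on the central servers:

// Sketch of the "replace game identifiers with ordinals" idea: before a task
// is shipped to a peer, names are swapped for opaque numbers; the server
// keeps the only table that maps them back.
#include <cstdio>
#include <map>
#include <string>

struct OrdinalCodec {
    std::map<std::string, int> toOrdinal;
    std::map<int, std::string> toName;
    int next = 0;

    int Encode(const std::string& name) {
        auto it = toOrdinal.find(name);
        if (it != toOrdinal.end()) return it->second;
        toOrdinal[name] = next;
        toName[next] = name;
        return next++;
    }
    const std::string& Decode(int ordinal) { return toName.at(ordinal); }
};

int main() {
    OrdinalCodec codec;  // lives only on the central server
    int a = codec.Encode("dragon_boss_hp");  // the peer only ever sees '0'
    std::printf("peer sees %d; server decodes it back to %s\n",
                a, codec.Decode(a).c_str());
}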

Tasks can easily be reassigned if a client goes down and statistics would be kept to decide which 'peers' can be 'trusted'...
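The trust statistics could be as simple as a pass/fail ratio with a minimum sample count; a sketch with arbitrary thresholds:

// Sketch of the trust bookkeeping: count verified vs. failed results per
// peer and only keep assigning work above a threshold.
#include <cstdio>
#include <map>

struct TrustRecord { int verified = 0; int failed = 0; };
std::map<int, TrustRecord> g_trust;  // peer id -> record

void ReportResult(int peerId, bool passedVerification) {
    if (passedVerification) ++g_trust[peerId].verified;
    else ++g_trust[peerId].failed;
}

bool IsTrusted(int peerId, double minRatio = 0.95, int minSamples = 20) {
    const TrustRecord& r = g_trust[peerId];
    int total = r.verified + r.failed;
    return total >= minSamples && (double)r.verified / total >= minRatio;
}

int main() {
    for (int i = 0; i < 25; ++i) ReportResult(7, true);
    std::printf("peer 7 trusted: %s\n", IsTrusted(7) ? "yes" : "no");
}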
Interesting, but I imagine the processing involved to keep track of which tasks are to be sent to which clients, which are in progress, and so on would be as complicated as just doing the tasks, unless some really complex algorithms are being run.
Quote:have the data processing be sufficiently mystified and encrypted


That doesn't actually work, because no matter how encrypted or mystified something is, it can be reverse engineered, and the data snatched out of the memory as the CPU is working on it. A secure or trusted computing system must be built on algorithms, not voodoo.

Regarding farming out "batches of work" to clients: the reason we have servers is that games require low latency responses. Servers can have context for all questions they need to answer, and thus arrive at a low latency answer. If you "farm out" "batches" to peers, without those peers having sufficient context, your latencies will skyrocket, and the game will not be responsive.
enum Bool { True, False, FileNotFound };
Quote:Original post by hplus0603
Quote:have the data processing be sufficiently mystified and encrypted


That doesn't actually work, because no matter how encrypted or mystified something is, it can be reverse engineered, and the data snatched out of the memory as the CPU is working on it. A secure or trusted computing system must be built on algorithms, not voodoo.

Regarding farming out "batches of work" to clients: the reason we have servers is that games require low latency responses. Servers can have context for all questions they need to answer, and thus arrive at a low latency answer. If you "farm out" "batches" to peers, without those peers having sufficient context, your latencies will skyrocket, and the game will not be responsive.



As always, it's a matter of making it harder (from trivial to time-consuming -- especially destroying recognizable patterns that act as tags giving clues to related data). Combine that with having the task program's code mutate (a new, differently compiled build resent every day -- or even more frequently) to make it nearly impossible to reverse engineer fast enough (i.e., require it to be done manually) before it changes again.

More likely, having the tasks be randomly reassigned frequently will do more to make hacking worthless -- except for the sick people who simply want to disrupt the server's operations or arbitrarily pervert the data for whatever players are being processed.


As someone else mentioned, the distributed processing tasks taken by these 'trusted peers' have to be complicated enough (low transfer-overhead-to-work ratio) that it wouldn't be cheaper just to execute them on the primary servers.

The problem usually is that there often is a lot of current situational data which must be initially provided and kept up to date. Example: an A* pathfind on a map requires the map data, which may be quite large (if it was small, the overhead would make running it remotely more wasteful), and map updates (probably including dynamic objects) would be required.
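A quick back-of-the-envelope check shows why the transfer-to-work ratio matters (every number here is assumed):

// Rough payoff check for the map-transfer problem above (assumed numbers).
#include <cstdio>

int main() {
    const double mapBytes = 2.0e6;          // 2 MB of map data (assumed)
    const double peerDownlink = 4.0e6 / 8;  // 4 Mbit/s cable -> 500 kB/s
    const double transferSec = mapBytes / peerDownlink;   // ~4 s just to ship
    const double computeSecOnServer = 0.05; // one A* query's cost (assumed)
    std::printf("shipping the map: %.1f s; running it locally: %.2f s\n",
                transferSec, computeSecOnServer);
    // Offloading only pays off if the map is sent once and reused for many
    // queries, and if the ongoing updates stay small.
}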

More complicated tasks like running a 'zone' may be too large a task to be done at the same time as a running client interface. Or if they attempt to do it by breaking the world up into many small zones, then the inter-zone communications start becoming prohibitive -- more cross-boundary events, more zone-to-zone transitions that require protocols where internet-type lag gets ugly... Failure would be massive (detrimental to multiple players) in that zone locality until the task could be reassigned -- too risky for a business-run game.

Maybe running NPC/monster AI (with a smaller local world representation than the player's avatar gets?) -- to the server, this would be the equivalent of a second client connection (transmission of a map-data window, object update event stream, etc...). NPC AI is one of the things that needs significant improvement, and the computing requirements can be $ubstantial (and the failure path can fall back to the simpler default 'crappy AI'(tm) that we have now...).



One of the other advantages of such a 'peer' system might be that the communications bandwidth costs can be distributed (versus the usual large pipe into a central server), to go along with the lower cost of fewer company-run servers....

Quote:Original post by hplus0603
Quote:have the data processing be sufficiently mystified and encrypted


That doesn't actually work, because no matter how encrypted or mystified something is, it can be reverse engineered, and the data snatched out of the memory as the CPU is working on it. A secure or trusted computing system must be built on algorithms, not voodoo.

Regarding farming out "batches of work" to clients: the reason we have servers is that games require low latency responses. Servers can have context for all questions they need to answer, and thus arrive at a low latency answer. If you "farm out" "batches" to peers, without those peers having sufficient context, your latencies will skyrocket, and the game will not be responsive.



Not all tasks require 'twitch' response. Some AI tasks are ongoing and/or can make use of the 'thinking'-type delays seen frequently in games. Immediate responses like action animation sequencing and battle behaviors can still be done on the primary servers, but higher-level behaviors that don't require instant results are possible (some games today use round-robin pathfinding spread over many 'turns'). Another use would be optional optimizations where paths are redone continuously; if a better path is found it replaces the existing one, and if a peer proves unreliable it isn't too disruptive.
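The round-robin idea amounts to a search that can be paused and resumed under a per-tick node budget. A toy sketch using plain breadth-first search on a grid (the grid size and budget are invented):

// Sketch of "pathfinding spread over many turns": an incremental BFS that
// expands at most a fixed number of nodes per game tick, so one search can
// be paused and resumed instead of blocking a frame.
#include <cstdio>
#include <queue>
#include <vector>

struct IncrementalBFS {
    int w, h, goal;
    std::queue<int> frontier;
    std::vector<bool> visited;

    IncrementalBFS(int w, int h, int start, int goal)
        : w(w), h(h), goal(goal), visited(w * h, false) {
        frontier.push(start);
        visited[start] = true;
    }

    // Run up to 'budget' expansions; returns true once the goal is reached.
    bool Step(int budget) {
        while (budget-- > 0 && !frontier.empty()) {
            int cur = frontier.front(); frontier.pop();
            if (cur == goal) return true;
            int x = cur % w, y = cur / w;
            int  nbrs[4] = {cur - 1, cur + 1, cur - w, cur + w};
            bool ok[4]   = {x > 0, x < w - 1, y > 0, y < h - 1};
            for (int i = 0; i < 4; ++i)
                if (ok[i] && !visited[nbrs[i]]) {
                    visited[nbrs[i]] = true;
                    frontier.push(nbrs[i]);
                }
        }
        return false;
    }
};

int main() {
    IncrementalBFS search(64, 64, 0, 64 * 64 - 1);
    int ticks = 0;
    while (!search.Step(50)) ++ticks;   // 50 expansions per tick
    std::printf("path found after %d ticks\n", ticks);
}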

There are background tasks like economic simulation and incidental behaviors (guiding those pretty butterflies around the map....) that aren't high priority.
Quest generation could be a candidate -- running scripts to do validation/pattern matching/fitting can take quite a bit of time, and the results aren't usually needed instantly.