MMOG Implementation and databases

Started by
12 comments, last by _winterdyne_ 16 years, 11 months ago
Hi, what's the best / standard way of handling the actual implementation of a basic MMORPG? Our current plan is as follows:

Server game thread:
* The game thread runs the main game code. It has a circular command buffer that is filled by the comms thread with commands from clients. Each game iteration, the current command buffer is applied to the game state. Client updates are then placed into the server output command buffer. Every so often, the current game state is stored to the game database.

Server comms thread:
* The comms thread handles incoming commands from clients, converting them to game commands, which are then placed into the game's command buffer. Any outgoing commands in the comms output command buffer are sent to the affected clients.

If there is a better / more efficient approach, I would love to know. Also, what's the best approach to storing the current game state? Would files or an SQL database be best for speed? And would updates to the game state be best saved as they happen, or every so often?

Thanks for any advice,
Mat

[Edited by - mattius on May 16, 2007 10:12:24 AM]
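A minimal sketch of the circular command buffer described above, as a single-producer/single-consumer ring shared between the comms thread and the game thread. All names here are illustrative, not from the original plan:

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <optional>
#include <string>

struct Command {
    int clientId;
    std::string action; // e.g. "move", "attack"
};

template <std::size_t N>
class CommandRing {
    std::array<Command, N> buf_;
    std::atomic<std::size_t> head_{0}; // next slot to write (comms thread)
    std::atomic<std::size_t> tail_{0}; // next slot to read (game thread)
public:
    bool push(const Command& c) {            // called by the comms thread
        std::size_t h = head_.load(std::memory_order_relaxed);
        std::size_t next = (h + 1) % N;
        if (next == tail_.load(std::memory_order_acquire))
            return false;                    // buffer full: drop or retry
        buf_[h] = c;
        head_.store(next, std::memory_order_release);
        return true;
    }
    std::optional<Command> pop() {           // called by the game thread
        std::size_t t = tail_.load(std::memory_order_relaxed);
        if (t == head_.load(std::memory_order_acquire))
            return std::nullopt;             // buffer empty
        Command c = buf_[t];
        tail_.store((t + 1) % N, std::memory_order_release);
        return c;
    }
};
```

Each game iteration the game thread would `pop()` until empty and apply the commands; the ring only works for exactly one producer and one consumer, which matches the two-thread design above.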
First, how many players and how large of a persistent world? Running tons of players on 1 server may not work for more than say 64 people.... Also what kind of RPG? Tile based? 3D?

I think there are threads about zone mmo systems on this forum.
Quote:Original post by Sirisian
First, how many players and how large of a persistent world? Running tons of players on 1 server may not work for more than say 64 people.... Also what kind of RPG? Tile based? 3D?

I think there are threads about zone mmo systems on this forum.


Initially, we are looking at a few hundred, but with a view to 500-1000 per server. The only persistent part of the world will be resources, and players on different servers cannot communicate with each other.

It is also a 3D RPG (not fast-paced action, unless in an instance, which will have a limited number of players, 10-20), and the plan is to only send updated player information to players that are within x units of the player.
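The "within x units" filtering (interest management) could be sketched like this. It's a naive O(n²) scan with illustrative names; a real server would use a spatial grid or zone partitioning instead of scanning every player:

```cpp
#include <vector>

struct Player {
    int id;
    float x, z; // position on the ground plane
};

// Returns the ids of players within `radius` units of `self`
// (excluding self). Squared distances avoid a sqrt per pair.
std::vector<int> interestedIn(const Player& self,
                              const std::vector<Player>& all,
                              float radius) {
    std::vector<int> ids;
    float r2 = radius * radius;
    for (const Player& p : all) {
        if (p.id == self.id) continue;
        float dx = p.x - self.x, dz = p.z - self.z;
        if (dx * dx + dz * dz <= r2) ids.push_back(p.id);
    }
    return ids;
}
```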


Definitely serialise in your persistent player data from a database, IMHO. That's what databases are designed for :)

Instead of "every so often", it's probably worth only persisting significant events, such as "player gains exp" or "player buys/sells/loots item".
"He took a duck in the face at 250 knots."
You will want to checkpoint state every so often, just because players expect it.

Also make sure that a commit of the character to a database is part of any significant event (loot, trade, etc). Ideally, commit all involved parties in a single transaction, to avoid the possibility of duping attacks and other bad bugs.
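The single-transaction commit described above can be modelled in miniature: validate everything first, then apply all sides together, so no failure path can leave an item in both inventories. This is an in-memory sketch with made-up types, standing in for a real database transaction:

```cpp
#include <algorithm>
#include <vector>

struct Character {
    int id;
    std::vector<int> items; // item ids
};

// Move `item` from `seller` to `buyer`. Returns false (and changes
// nothing) if the seller doesn't actually own the item -- mirroring
// a DB transaction that either commits both parties or rolls back.
bool tradeItem(Character& seller, Character& buyer, int item) {
    auto it = std::find(seller.items.begin(), seller.items.end(), item);
    if (it == seller.items.end())
        return false;              // validation failed: no partial state
    seller.items.erase(it);        // both mutations happen together...
    buyer.items.push_back(item);   // ...never one without the other
    return true;
}
```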

Thus, do the union of the two suggestions :-)
enum Bool { True, False, FileNotFound };
I actually had this idea for creating an MMO based on a database backend. The awesome part about having a database backend would be that you could have as many users as you wanted on the same virtual server, as the real servers would connect to a backend database which would contain current locations etc. Each server would pull the information on what a character would expect to see from the database, based on his/her location. Your database would definitely need to be beefy. However, if beefy enough, it could house thousands of "characters", as each character would have its own connection back to the database, established when logging in. You could technically prove this by writing it as a backend and writing a Perl tester to see how fast response time is. Let me know if I could help, I'd love to actually see this in real time.
That's no different from how a typical MMO interfaces with its database right now. The key lies in the front-ends, because they need to both verify game rules, and provide the proper view of the world to each of the connected players.
enum Bool { True, False, FileNotFound };

You want to keep your runtime state in memory for speed; disk access is glacially slow. Rolling users' info in and out to disk when they log in/log out could be done with a DB (as can operations like bank boxes / container access), because the data is only relevant to the one player and a fraction-of-a-second delay to fetch a record from disk can be tolerated. Inventory (especially interactive inventory) you want in memory. If you serialize everything for a save/restore point frequently, it has to be assembled quickly in memory and then dumped to disk via a background thread, to minimize the impact on the game (no frequent multi-second freezes). Using a database for the constant game-mechanics access would be a lot slower (and thus your game slower) than custom native-code data manipulation.
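The snapshot-then-background-write approach described above might look like this sketch (illustrative names; a real server would also version the checkpoint files and handle write failures):

```cpp
#include <fstream>
#include <mutex>
#include <string>
#include <thread>
#include <utility>
#include <vector>

struct World {
    std::vector<std::string> players; // stand-in for real game state
};

// Copy the live state quickly under the lock, then hand the slow disk
// write to a background thread so the game loop never blocks on I/O.
// The caller joins (or detaches) the returned thread.
std::thread checkpoint(const World& live, std::mutex& worldMutex,
                       const std::string& path) {
    World snapshot;
    {
        std::lock_guard<std::mutex> lk(worldMutex);
        snapshot = live; // brief pause: in-memory copy only
    }
    return std::thread([snapshot = std::move(snapshot), path] {
        std::ofstream out(path);
        for (const auto& p : snapshot.players) out << p << '\n';
    });
}
```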
--------------------------------------------Ratings are Opinion, not Fact
Size, in this case, matters.

Some MMOGs can keep their full state in memory, including the complete database.

But many can't. The database can over time grow well past 10, 20 or more gigabytes (according to some commercial MMORPG authors' reports), which means that not only does it become unviable to keep it in memory, but accessing it also becomes incredibly costly.

Another reason why keeping too much state in memory is suboptimal is data locality. Granted, at MMO-server sizes that's a problem regardless, but if you decide to support only very limited numbers, some optimizations can be made: more memory can be statically allocated, there are fewer variables, and so on.

But no matter how you organize it, if you're going for any semi-professional server, you'll need to account for growth. Saying 640MB should be enough for everyone is a big fallacy.

If you're serious about designing something scalable, assume from the start that you'll have an ever-increasing number of users, every one of them with their inventory and other stats maxed out, and use that as an estimate for memory and database sizing.
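A toy worst-case sizing pass in that spirit, with entirely made-up placeholder figures (not real MMO numbers):

```cpp
#include <cstddef>

// Assume every slot of every player is filled and multiply it out.
constexpr std::size_t kPlayers        = 100000; // planned ceiling, not launch count
constexpr std::size_t kInventorySlots = 200;
constexpr std::size_t kBytesPerItem   = 64;     // id, stack count, enchant data...
constexpr std::size_t kBytesPerChar   = 4096;   // stats, quest flags, etc.

constexpr std::size_t worstCaseBytes =
    kPlayers * (kBytesPerChar + kInventorySlots * kBytesPerItem);
// ~1.7 GB of character data alone at these placeholder numbers --
// before world state, indexes, or the audit history a live DB accrues.
```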

There are several good articles in Game Programming Gems about organizing your data, templating, persisting, propagating and synchronizing your in-game objects from real-world games.
Quote:You want to keep your runtime state in memory for speed. Disk access is glacially slow. ... Use of a database for the constant game mechanics access would be a lot slower


Surprisingly, the Sun Game Server architecture does exactly this. Each time a request comes in, it hits the persistency database, de-serializes the object, runs the transaction code, and then re-serializes the object back to the database. I've asked them about it, and they actually do this -- there is no optimization by keeping objects "live" for a while. This obviously limits the throughput of the system as it scales, especially since it doesn't federate the data, from what I can see.

Not surprisingly, they recommend against putting physics in the data model :-)

Also, in such a model, you have to be careful, because objects seldom execute in solitude; most interesting transactions include more than one object, and thus you have to de-serialize and re-serialize multiple objects per individual transaction. At that point, you also have to avoid deadlock (which Sun breaks by killing one of the contenders and re-starting it later).
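An alternative to killing one contender, if you control the locking yourself, is a global lock order (e.g. by object id), which makes deadlock between two such multi-object transactions impossible. A sketch with illustrative names:

```cpp
#include <mutex>

struct GameObject {
    int id;
    std::mutex mtx;
    int gold = 0;
};

// Always acquire locks in ascending id order. Two transactions touching
// the same pair of objects then lock them in the same order and can
// never deadlock against each other.
void transferGold(GameObject& a, GameObject& b, int amount) {
    GameObject& first  = (a.id < b.id) ? a : b;
    GameObject& second = (a.id < b.id) ? b : a;
    std::lock_guard<std::mutex> l1(first.mtx);
    std::lock_guard<std::mutex> l2(second.mtx);
    a.gold -= amount;
    b.gold += amount;
}
```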

20 gigabytes is a small MMO database. There's game state, records, and a detailed data warehouse going five years back, and that is measured in terabytes. However, only a fraction of that is in the working set at any one time, and as long as you can keep your working set in memory, a cheap database such as MySQL will be reasonably fast (where reasonably fast means a few hundred transactions per second). Note that commits have to be flushed to disk, even if the data is kept in memory.
enum Bool { True, False, FileNotFound };

