Disadvantages of data-driven design
Let's say I have a bunch of data stored in an XML file which I load at the beginning of the game. The data is mostly comprised of various attributes for each type of entity (position, color, whatnot). Every time I spawn a new entity, I need to look up this data in order to initialize it with the proper values. Instead of loading the data in each time, I load it once and store it in a large table (let's say, an array) in memory. Now I have this huge table of data which is taking up space in memory instead of being intertwined with the rest of the code. The obvious way to reduce this extra overhead is to try to load the data on a need-to-know basis (i.e., per level), but if all of the entities play a part in the game at all times, then there is not much else I can do. Or is there?
Does this table in memory take that much space, as opposed to hardcoding it? How much space do you lose for each entity type? How many entity types do you have?
If your "overhead" is a kilobyte or two, it just isn't worth optimizing at all.
Hi,
Are you convinced that this data takes a lot of memory? I use a similar system for creating objects and I never noticed the object descriptors taking too much memory (a few hundred KB at most).
Of course, if you are concerned about the memory, you can use a system which loads the desired data only on demand. When you spawn an entity with some ID, you'll check your list in memory if you have already loaded that ID, and if not, check if it can be found on disk.
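To make the on-demand idea concrete, here is a minimal sketch (names and the disk-lookup stub are assumptions for illustration, not anyone's actual code): descriptors are cached by ID, and only fetched from disk the first time an entity of that type is spawned.

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <string>

// Per-type spawn data, as described in the thread.
struct EntityDesc {
    float x = 0, y = 0;
    std::string color;
};

class DescCache {
public:
    // Return the descriptor for `id`, loading it from disk on first use.
    const EntityDesc& get(const std::string& id) {
        auto it = cache_.find(id);
        if (it != cache_.end())
            return it->second;            // already in memory
        EntityDesc d = loadFromDisk(id);  // first use: hit the disk
        return cache_.emplace(id, d).first->second;
    }
    std::size_t size() const { return cache_.size(); }

private:
    // Stand-in for a real XML lookup; hypothetical values for illustration.
    EntityDesc loadFromDisk(const std::string& id) {
        return EntityDesc{0, 0, id == "orc" ? "green" : "grey"};
    }
    std::map<std::string, EntityDesc> cache_;
};
```

Whether this is worth the extra bookkeeping depends entirely on how large the descriptors actually are, as the replies below point out.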
Best regards!
Yeah, I just realized that it's a much smaller amount of overhead for my particular game than I thought, but I guess I was thinking of the larger scheme of things (for example, commercial games).
Quote:Original post by fd9_
Yeah, I just realized that it's a much smaller amount of overhead for my particular game than I thought, but I guess I was thinking of the larger scheme of things (for example, commercial games).
For most games, this data will be so much smaller than art assets (textures, meshes, sounds) that it will be negligible.
Quote:Original post by ToohrVyk
Quote:Original post by fd9_
Yeah, I just realized that it's a much smaller amount of overhead for my particular game than I thought, but I guess I was thinking of the larger scheme of things (for example, commercial games).
For most games, this data will be so much smaller than art assets (textures, meshes, sounds) that it will be negligible.
Seconded. To give you some numbers, the gameplay objects in an average Counter-Strike level take about 1% of a map's filesize. For de_dust2, it's 0.5% - roughly 10 KB, in a file that's 1.9 MB. The rest is mainly level geometry and lightmap data. For an average Counter-Strike: Source level, that ratio is probably closer to 0.1%, a few hundred KB per level. Of course, that's filesize, not memory, but it should give you a rough idea of what to expect.
Quote:Original post by fd9_
Now I have this huge table of data which is taking up space in memory instead of being intertwined with the rest of the code.
You present this as being a problem. But consider: if you didn't have the XML file, you would still need that data *somewhere* in order to make the correct decisions whenever you constructed an entity.
Quote:The obvious way to reduce this extra overhead is to try to load the data on a need-to-know basis (i.e., per level), but if all of the entities play a part in the game at all times, then there is not much else I can do. Or is there?
Not really. Don't put effort into this; it effectively turns into reinventing virtual memory (i.e. you're putting thought into the task of keeping things on disk vs. in memory, beyond the most obvious way of doing things).
But do consider if XML is really the best way to store your data in the file.
Quote:Original post by Zahlman
But do consider if XML is really the best way to store your data in the file.
Would you suggest something different?
Quote:Original post by fd9_
Quote:Original post by Zahlman
But do consider if XML is really the best way to store your data in the file.
Would you suggest something different?
Different file storage methods have their advantages and disadvantages. XML is easy for humans to read, but it is very inefficient at storing data and slower for computers to parse. Binary files are the exact opposite: they are almost impossible for humans to read, but they are very compact and quick for computers to read and write.
To get the best of both worlds, you could use binary files, and develop a tool so that you can easily read them and create them.
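As a rough sketch of the binary side of that trade-off (the `Record` layout here is a made-up example, and a real format would also need versioning and endianness handling), a fixed-layout record can be round-tripped with a straight memory copy:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// A fixed-layout, trivially copyable record: 12 bytes, no padding.
struct Record {
    int32_t id;
    float   x, y;
};

// Serialize the record to raw bytes.
std::vector<unsigned char> toBytes(const Record& r) {
    std::vector<unsigned char> buf(sizeof r);
    std::memcpy(buf.data(), &r, sizeof r);
    return buf;
}

// Deserialize raw bytes back into a record.
Record fromBytes(const std::vector<unsigned char>& buf) {
    Record r;
    std::memcpy(&r, buf.data(), sizeof r);
    return r;
}
```

A viewer/editor tool, as suggested above, would then just apply `fromBytes`/`toBytes` to present these records in a human-readable form.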
If you really need to squeeze out performance and keep readability,
you can have a cache, which can be a binary dump of the data in the XML file.
You just need to build the cache by reading the XML file and doing a binary serialization the first time it's accessed.
If the original file changes, or your data format changes, or it simply hasn't been used for ages, you just delete the cache.
I did a similar thing with huge amounts of data in CSV files.
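The "rebuild when the original file changes" check from the steps above can be sketched with a timestamp comparison (file names here are hypothetical; a real cache would also embed a format version number to catch the data-format-change case):

```cpp
#include <cassert>
#include <filesystem>

namespace fs = std::filesystem;

// The cache is stale if it doesn't exist yet, or if the source XML
// has been modified more recently than the binary dump.
bool cacheIsStale(const fs::path& xml, const fs::path& cache) {
    if (!fs::exists(cache))
        return true;  // never built: must serialize from the XML
    return fs::last_write_time(xml) > fs::last_write_time(cache);
}
```

On a stale result you parse the XML and rewrite the binary dump; otherwise you load the dump directly.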