Currently what I do is serialize the 2D Tile array I've made. The size I test on is 10240 × 512 tiles. I know I can optimize the file size further, but currently this gives me a file of roughly 305 MB (and I'm planning to have worlds much bigger than this one). My problem occurs when I try to deserialize the array: the game freezes, rapidly consumes more and more memory, and after a while (5–10 minutes) throws an OutOfMemoryException. Is this due to the huge size of the file, or are there other factors I should consider? My code for serializing and deserializing:
public void SerializeWorld(string fileName)
{
    using (Stream stream = File.Open(fileName, FileMode.Create))
    {
        // Write the whole 2D tile array as one object graph.
        BinaryFormatter bf = new BinaryFormatter();
        bf.Serialize(stream, world);
    }
}

public static void DeserializeWorld(string filename)
{
    using (Stream stream = File.Open(filename, FileMode.Open))
    {
        // Read the whole array back in one go; this is where the freeze happens.
        BinaryFormatter bf = new BinaryFormatter();
        world = (Tile[,])bf.Deserialize(stream);
    }
}
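For scale, here is a quick back-of-the-envelope calculation of what that file size works out to per tile, using the numbers above (treating "305 MB" as exact for the sake of the arithmetic):

```csharp
using System;

public static class SizeCheck
{
    // Rough per-tile cost of the serialized world, from the figures in the question.
    public static long TileCount() => 10240L * 512;               // tiles in the test world
    public static long FileBytes() => 305L * 1024 * 1024;         // ~305 MB on disk
    public static long BytesPerTile() => FileBytes() / TileCount();

    public static void Main()
    {
        Console.WriteLine(TileCount());    // 5242880
        Console.WriteLine(BytesPerTile()); // 61
    }
}
```

So the serializer is spending on the order of 60 bytes per tile, which suggests a lot of per-object overhead on top of the actual tile data.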
In the code above, world is the array where I keep my tiles. I've tested this in two ways: in both tests I create the world as usual and serialize it to a file right after creation; the difference is when I load it. First I tried loading the world while the game was running, to see whether it could be loaded without leaving the game session. When that failed, I tried simply loading the world when the game starts, but with the same result: the game freezes for 5–10 minutes, consumes memory, and ends with an OutOfMemoryException.
The way I believe I should do this is to divide the world into chunks and keep only 9 chunks loaded at any one time (normally it would be 3, but I need to consider up and down as well, so a 3 × 3 neighbourhood). This would also require keeping each chunk in its own file, giving me tons of smaller files. That is okay with me as long as it doesn't cause other issues. I would then load those 9 chunks into my in-memory world array and release the ones out of range. Currently I'm not quite sure how to do this efficiently.
Maybe keeping the world in small files would also fix the problem with deserialization?
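To make the chunk idea concrete, this is roughly what I have in mind — a minimal sketch, where ChunkSize, the one-file-per-chunk naming, and reducing each Tile to a ushort id are all my own assumptions, and BinaryReader/BinaryWriter replace BinaryFormatter:

```csharp
using System;
using System.IO;

// Sketch of per-chunk save/load. Each chunk is a fixed-size square of tiles
// stored in its own file named by its chunk coordinates (cx, cy).
public static class ChunkIO
{
    public const int ChunkSize = 128; // tiles per side; an assumed value

    public static string ChunkPath(string dir, int cx, int cy)
        => Path.Combine(dir, $"chunk_{cx}_{cy}.dat");

    // Write the chunk's tile ids as raw ushorts: 2 bytes per tile,
    // no per-object serialization overhead.
    public static void SaveChunk(string dir, int cx, int cy, ushort[,] tiles)
    {
        using (var bw = new BinaryWriter(File.Open(ChunkPath(dir, cx, cy), FileMode.Create)))
        {
            for (int x = 0; x < ChunkSize; x++)
                for (int y = 0; y < ChunkSize; y++)
                    bw.Write(tiles[x, y]);
        }
    }

    // Read one chunk back into a fresh array; the caller decides which
    // 9 chunks around the player to keep loaded and which to release.
    public static ushort[,] LoadChunk(string dir, int cx, int cy)
    {
        var tiles = new ushort[ChunkSize, ChunkSize];
        using (var br = new BinaryReader(File.Open(ChunkPath(dir, cx, cy), FileMode.Open)))
        {
            for (int x = 0; x < ChunkSize; x++)
                for (int y = 0; y < ChunkSize; y++)
                    tiles[x, y] = br.ReadUInt16();
        }
        return tiles;
    }
}
```

At 2 bytes per tile a 128 × 128 chunk file would be 32 KB, so loading or dropping a chunk as the player moves should be cheap — but I don't know if this is the right way to structure it.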
So to sum it up:
- Am I right to assume that the 305 MB file is the cause of the freeze and the out-of-memory exception when I deserialize?
- How should I correctly handle chunks?
- Will it be an issue to have multiple smaller files, where each file represents one chunk?
- Is there anything wrong with the way I serialize and deserialize right now, and should I consider other methods?
Thank you for reading.