So I've finally had some time to play around with this, and I've managed to get it to work... but only on very small worlds/file sizes. Now I've tried two ways of doing it, with exactly the same results. The first way is the simple way I described earlier; I did, however, mark a few objects as non-serializable to save additional space.
The other solution I've tried is the one posted here:
http://www.codeproje...ization-using-C.
I got the exact same result with this one when deserializing: the game stays frozen for 5-10 minutes and then throws an out-of-memory exception. I then tried reducing the world to 1024*512, which actually fixed it. I still had a load time of about 1.5 minutes and a save time of maybe 15 seconds, but it seems to work flawlessly after that.
Just prior to posting this I figured it might work by doing it the "simple" way. So I commented out all the code related to the CodeProject article, tried again, and got exactly the same result.
Just now I tried with a world size of 128*128. Here the save time is like .3 seconds, and the load time is maybe around 1 second. This is close enough to be acceptable, and it would probably work if I split the world into 128*128 chunks, but it feels wrong doing it like this, since I get the feeling I am doing something wrong.
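To illustrate the chunking idea: since each tile really only carries a handful of small values, a 128*128 chunk could in principle be written raw with BinaryWriter instead of going through BinaryFormatter at all, which avoids the per-object overhead entirely. This is only a sketch; the TileData struct and its fields below are simplified placeholders, not my actual Tile class.

```csharp
using System;
using System.IO;

// Hypothetical compact per-tile record (placeholder for the real Tile class).
public struct TileData
{
    public byte Type;
    public byte BackgroundType;
    public byte Orientation;
    public byte Variant;
}

public static class ChunkIO
{
    public const int ChunkSize = 128;

    // Writes one 128*128 chunk as raw bytes: four bytes per tile,
    // no type metadata and no per-object formatter overhead.
    public static void SaveChunk(BinaryWriter w, TileData[,] chunk)
    {
        for (int y = 0; y < ChunkSize; y++)
            for (int x = 0; x < ChunkSize; x++)
            {
                TileData t = chunk[x, y];
                w.Write(t.Type);
                w.Write(t.BackgroundType);
                w.Write(t.Orientation);
                w.Write(t.Variant);
            }
    }

    // Reads the tiles back in the exact order SaveChunk wrote them.
    public static TileData[,] LoadChunk(BinaryReader r)
    {
        var chunk = new TileData[ChunkSize, ChunkSize];
        for (int y = 0; y < ChunkSize; y++)
            for (int x = 0; x < ChunkSize; x++)
            {
                chunk[x, y].Type = r.ReadByte();
                chunk[x, y].BackgroundType = r.ReadByte();
                chunk[x, y].Orientation = r.ReadByte();
                chunk[x, y].Variant = r.ReadByte();
            }
        return chunk;
    }
}
```

With this layout a full chunk is exactly 128*128*4 = 65536 bytes on disk, so loading one chunk at a time stays cheap even for a big world.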
I did get a suggestion to try a custom serialization, but as far as I understand it, that's exactly what I am doing if I follow the CodeProject article?
I do have a few Tile classes that inherit from the base Tile class, to handle objects that change state and objects that take up more than 1 tile in the world grid, but I did remember to follow these rules (when using the CodeProject article) -
http://msdn.microsof...326(VS.80).aspx so I don't think that is the cause.
Below is the code I use for the CodeProject solution:
/// <summary>
/// Constructor for deserialization.
/// </summary>
public Tile(SerializationInfo info, StreamingContext ctxt)
{
    Type = (TileType)info.GetValue("TileType", typeof(TileType));
    BackgroundType = (TileType)info.GetValue("BackgroundType", typeof(TileType));
    Orientation = (TileOrientation)info.GetValue("Orientation", typeof(TileOrientation));
    _variant = (byte)info.GetValue("Variant", typeof(byte));

    // Lighting state is not serialized; rebuild sensible defaults instead.
    _lightCalculated = false;
    if (Type == TileType.None)
    {
        if (BackgroundType == TileType.None)
            _ambientLightLevel = Light.AmbientLight;
        else
            _ambientLightLevel = 0;
        _directLightLevel = 0;
        _flickerTimer = 0;
    }
    else if (Type == TileType.Torch)
    {
        _directLightLevel = 255;
        _ambientLightLevel = 0;
        _flickerTimer = 1;
    }
    else
    {
        _directLightLevel = 0;
        _ambientLightLevel = 0;
        _flickerTimer = 0;
    }
}

public virtual void GetObjectData(SerializationInfo info, StreamingContext ctxt)
{
    info.AddValue("TileType", Type);
    info.AddValue("BackgroundType", BackgroundType);
    info.AddValue("Orientation", Orientation);
    info.AddValue("Variant", _variant);
}
Notice that GetObjectData is virtual. The child classes override it, add their own values in the same manner, and then call the base method. I also remembered to make the Tile class implement ISerializable. I then serialize exactly the same way as before, using the methods I posted earlier.
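For reference, here's a minimal, self-contained sketch of that parent/child pattern. The AnimatedTile name and the _frame field are made up for illustration, and the base class is trimmed down to a single field; the point is just the override adding its own values and then calling up into the base, while the derived deserialization constructor chains to the base one.

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class BaseTile : ISerializable
{
    protected byte _variant;

    public byte Variant { get { return _variant; } }

    public BaseTile(byte variant) { _variant = variant; }

    // Deserialization constructor: reads back what GetObjectData wrote.
    public BaseTile(SerializationInfo info, StreamingContext ctxt)
    {
        _variant = (byte)info.GetValue("Variant", typeof(byte));
    }

    public virtual void GetObjectData(SerializationInfo info, StreamingContext ctxt)
    {
        info.AddValue("Variant", _variant);
    }
}

[Serializable]
public class AnimatedTile : BaseTile
{
    private int _frame; // hypothetical extra state on the child class

    public int Frame { get { return _frame; } }

    public AnimatedTile(byte variant, int frame) : base(variant) { _frame = frame; }

    // Chain to the base deserialization constructor, then read own values.
    public AnimatedTile(SerializationInfo info, StreamingContext ctxt)
        : base(info, ctxt)
    {
        _frame = (int)info.GetValue("Frame", typeof(int));
    }

    public override void GetObjectData(SerializationInfo info, StreamingContext ctxt)
    {
        info.AddValue("Frame", _frame);
        base.GetObjectData(info, ctxt); // base adds its values to the same info
    }
}
```

Both classes carry the [Serializable] attribute as well as ISerializable; the formatter requires the attribute, and the interface is what makes it call GetObjectData and the (SerializationInfo, StreamingContext) constructor instead of doing default field serialization.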
So to sum it up:
As I understand it, when a class implements ISerializable and has the GetObjectData method and the deserialization constructor, both should automatically be called when you serialize and deserialize. This seems very odd to me, but is this true?
And secondly, is this what is meant by custom serialization, or am I completely off?
I'm off to read up on all this, and I'll post here if I happen to answer some of my own questions.
Thanks for reading