Hm. I must admit I haven't programmed in a while, so this might be a really silly problem, but I can't seem to figure it out!
private static void GetIffTypes(BinaryReader Reader)
{
    List<IffType> Types = new List<IffType>();

    // Skip the 64-byte file header.
    Reader.BaseStream.Seek(64, SeekOrigin.Begin);

    while (Reader.BaseStream.Position < Reader.BaseStream.Length)
    {
        IffType T = new IffType();
        T.Offset = (int)Reader.BaseStream.Position;
        T.Type = new string(Reader.ReadChars(4));
        T.Size = ConvertFromCharArray(Reader.ReadChars(4));
        T.ID = ConvertFromCharArray(Reader.ReadChars(2));
        T.TypeNum = ConvertFromCharArray(Reader.ReadChars(2));
        T.Label = new string(Reader.ReadChars(64));

        // The chunk header is 4 + 4 + 2 + 2 + 64 = 76 bytes,
        // and Size is assumed to include the header.
        T.DataOffset = T.Offset + 76;
        T.DataSize = (T.Size - 76);
        T.Data = Reader.ReadBytes(T.DataSize);
        Types.Add(T);

        Console.WriteLine("Type: " + T.Type);
        Console.WriteLine("Size: " + T.Size.ToString());
        Console.WriteLine("ID: " + T.ID.ToString());
        Console.WriteLine("TypeNum: " + T.TypeNum.ToString());
        Console.WriteLine("Label: " + T.Label + "\n");
    }
}
In the above code, program execution crashes on the line:
T.Data = Reader.ReadBytes(T.DataSize);
because T.DataSize is (apparently) -76!
The value of T.Size as printed to the console comes out as 70397, but whenever I check it in the debugger it is 0, which means T.DataSize ends up as -76.
How is it possible that the internal value of T.Size differs from what the console says it is?!
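In case it helps, this is the kind of sanity check I'm thinking of trying next: reading the same four size bytes once with ReadBytes and once with ReadChars and comparing the results. (CompareReads and sizeOffset are just names I made up for this sketch; sizeOffset would be wherever the size field of the chunk in question sits.)

private static void CompareReads(BinaryReader Reader, long sizeOffset)
{
    // Read the raw bytes of the size field.
    Reader.BaseStream.Seek(sizeOffset, SeekOrigin.Begin);
    byte[] raw = Reader.ReadBytes(4);
    Console.WriteLine("Raw bytes: " + BitConverter.ToString(raw));

    // Read the same four bytes as chars, the way GetIffTypes does.
    Reader.BaseStream.Seek(sizeOffset, SeekOrigin.Begin);
    char[] chars = Reader.ReadChars(4);

    foreach (char c in chars)
        Console.WriteLine("Char value: 0x" + ((int)c).ToString("X2"));
}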
Here's the rest of the relevant code:
public class IffType
{
    public string Type = "";
    public int Offset = 0;
    public int Size = 0;
    public string Label = "";
    public int ID = 0;
    public int TypeNum = 0;
    public int DataOffset = 0;
    public int DataSize = 0;
    public byte[] Data;
}
// Interprets the chars as big-endian bytes: the last char becomes
// the least significant byte of the result.
private static int ConvertFromCharArray(char[] CharArray)
{
    int result = 0;

    for (int i = 0; i < CharArray.Length; ++i)
        result += (CharArray[(CharArray.Length - 1) - i] << (i * 8));

    return result;
}