Posted 26 June 2012 - 02:32 PM
Before thinking about compression I used to store files with non-unified indices (separate indices per attribute, like .obj). This kept the (uncompressed) file size down since no duplicate positions/normals/texcoords were stored. However, when building unified vertex arrays for OpenGL VBOs, I wouldn't check for duplicates, so I ended up with bloated, inefficient buffers.
Checking for these duplicates at run time seems wasteful to me, so when exporting to my run-time format I now deduplicate and store everything with a single unified index. This bloats the file a little, but reading it in is fast and trivial, and it's more efficient in terms of VBO size.
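For reference, here's roughly what my exporter does to build the unified index (a simplified sketch; the Vertex layout and the addUnique helper are made-up names for illustration, not my actual code):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <unordered_map>
#include <vector>

// Hypothetical unified vertex layout (all floats, no padding, so
// byte-wise comparison and hashing are safe).
struct Vertex {
    float px, py, pz;   // position
    float nx, ny, nz;   // normal
    float u, v;         // texcoord

    bool operator==(const Vertex& o) const {
        // Bit-exact comparison: fine here, since duplicates come
        // straight from the same exporter data.
        return std::memcmp(this, &o, sizeof(Vertex)) == 0;
    }
};

struct VertexHash {
    std::size_t operator()(const Vertex& v) const {
        // Hash the raw bytes, consistent with the memcmp equality above.
        const unsigned char* p = reinterpret_cast<const unsigned char*>(&v);
        std::size_t h = 0;
        for (std::size_t i = 0; i < sizeof(Vertex); ++i)
            h = h * 131 + p[i];
        return h;
    }
};

// Returns the unified index for v, appending it to 'vertices' only
// if an identical vertex hasn't been seen before.
uint32_t addUnique(const Vertex& v,
                   std::vector<Vertex>& vertices,
                   std::unordered_map<Vertex, uint32_t, VertexHash>& seen)
{
    auto it = seen.find(v);
    if (it != seen.end())
        return it->second;                       // duplicate: reuse old index
    uint32_t index = static_cast<uint32_t>(vertices.size());
    seen.emplace(v, index);
    vertices.push_back(v);                       // new unique vertex
    return index;
}
```

Each face corner from the .obj-style (position, normal, texcoord) triple gets run through addUnique, and the returned index goes straight into the unified index buffer.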
Today I saw the recent thread http://www.gamedev.net/topic/626987-3d-file-efficiency/ and realized I should start exploring some sort of file compression scheme.
So my question is: before I send my model data through some sort of compression algorithm, should I unify the indices and remove duplicates first, or leave the data as-is and let decompression deal with the duplicate values and index unification?
Thanks for any input!