Catafriggm

Compression, Anyone?


What are the good (and free, with source) compression algorithms out there for game data files? I know about Deflate (the ZLib algorithm) and BZip2. Are there any others that are good (or better than these)?

This page lists a few compression/archiving libraries available for free. There are only a couple more algorithms supported in these libraries than the ones you've mentioned, but it should be a starting point.

Guest Anonymous Poster
It depends on what you are trying to do. For resources such as textures or sounds, use existing, context-specific formats and libraries, e.g. PNG (libpng), JPEG, MPEG-4, Ogg Vorbis, etc. Resource-type-specific compression will generally do better for that type of resource, and easy-to-use libraries are generally available.
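For instance, here's roughly what loading a texture looks like with libpng's simplified read API (libpng 1.6+). This is just an illustrative sketch; the file name is a placeholder:

#include <png.h>      // libpng 1.6+ simplified read API
#include <cstdlib>
#include <cstdio>

int main() {
    png_image image = {};              // zero-initialise the control struct
    image.version = PNG_IMAGE_VERSION;

    // "texture.png" is a placeholder path for this sketch.
    if (!png_image_begin_read_from_file(&image, "texture.png"))
        return 1;

    image.format = PNG_FORMAT_RGBA;    // ask libpng to convert to 32-bit RGBA
    png_byte* pixels = (png_byte*)malloc(PNG_IMAGE_SIZE(image));

    if (!png_image_finish_read(&image, nullptr /*background*/, pixels,
                               0 /*row_stride: tightly packed*/, nullptr /*colormap*/)) {
        free(pixels);
        return 1;
    }

    printf("loaded %ux%u texture\n", image.width, image.height);
    free(pixels);
    return 0;
}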

For custom data formats (say, maps), tune your structures to cut out wasted space: use bit-fields and the smallest appropriate data types, none of those funky and nasty serializing C++ classes.
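To illustrate, here is a hypothetical map tile, first stored naively and then packed with bit-fields; the field names and ranges are made up for the example:

#include <cstdint>

// A naive map tile: typically 12 bytes per tile after padding.
struct TileLoose {
    int  terrain;     // 0-15 would suffice
    int  elevation;   // 0-255 would suffice
    bool passable;
    bool visited;
};

// The same data packed into 2 bytes with bit-fields and
// right-sized types: a 6x saving before any general-purpose
// compressor even runs.
struct TilePacked {
    uint16_t terrain   : 4;  // 16 terrain types
    uint16_t elevation : 8;  // 0-255
    uint16_t passable  : 1;
    uint16_t visited   : 1;
};

static_assert(sizeof(TilePacked) == 2, "expected 2-byte tile");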

Overall, though, you might want to archive everything into a single resource file. There are some pre-written solutions out there, and I know there's an article on GameDev about this kind of thing. Once your resources are individually compressed, compressing the archive as a whole won't make much difference anyway and will just increase access times.
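A sketch of what such a resource file's layout might look like; all names and sizes here are hypothetical, not from any particular article:

#include <cstdint>

// Minimal pack-file layout sketch (names hypothetical).
// The file is: [PackHeader][TocEntry * entryCount][blob data...].
// Each resource is stored pre-compressed (PNG, Ogg, ...), so the
// archive itself applies no further compression.

#pragma pack(push, 1)
struct PackHeader {
    char     magic[4];     // e.g. "PAK0"
    uint32_t entryCount;   // number of TOC entries that follow
};

struct TocEntry {
    char     name[56];     // resource name, NUL-padded
    uint64_t offset;       // byte offset of the blob in the file
    uint64_t size;         // blob size in bytes
};
#pragma pack(pop)

static_assert(sizeof(PackHeader) == 8,  "unexpected header size");
static_assert(sizeof(TocEntry)   == 72, "unexpected entry size");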

Quote:
Original post by Anonymous Poster
For custom data formats (say, maps), tune your structures to cut out wasted space: use bit-fields and the smallest appropriate data types, none of those funky and nasty serializing C++ classes.

I'm sure using a common compression algorithm will lead to good results too, without spending endless time micro-optimizing your data structures...

Quote:
Original post by maximAL
Quote:
Original post by Anonymous Poster
For custom data formats (say, maps), tune your structures to cut out wasted space: use bit-fields and the smallest appropriate data types, none of those funky and nasty serializing C++ classes.

I'm sure using a common compression algorithm will lead to good results too, without spending endless time micro-optimizing your data structures...

Ah, but never forget that the best gains from compression come from having less to compress in the first place. [smile]

The best compression of data happens when you know its structure or pattern. Knowing whether a lossy algorithm is acceptable also helps greatly. If Catafriggm could tell us exactly what types of data he wants to compress, then we can suggest the best overall answer.
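As a small example of exploiting known structure: if you know a buffer holds slowly-varying values (heightmap rows, timestamps, audio samples), delta-encoding it before handing it to a general-purpose compressor usually helps a lot. A sketch:

#include <cstdint>
#include <cstddef>

// Storing deltas instead of absolutes turns slowly-varying data
// into long runs of small numbers, which a general-purpose
// compressor then squeezes far better.
void deltaEncode(int16_t* data, size_t count) {
    int16_t previous = 0;
    for (size_t i = 0; i < count; ++i) {
        int16_t current = data[i];
        data[i] = static_cast<int16_t>(current - previous);
        previous = current;
    }
}

// Exact inverse of deltaEncode (wraparound arithmetic makes the
// round trip lossless even if differences overflow).
void deltaDecode(int16_t* data, size_t count) {
    int16_t previous = 0;
    for (size_t i = 0; i < count; ++i) {
        previous = static_cast<int16_t>(previous + data[i]);
        data[i] = previous;
    }
}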

The other thing is: how important are compressed size, decompression speed, and compression speed? Some people are quite happy with algorithms that are very fast and reasonably good. Others may prefer an algorithm that gives extremely good compression at the cost of being very, very slow to compress, decompress, or both.

From what I've read, LZO is very good speed-wise and quite satisfactory compression-wise.
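For reference, a minimal sketch of what using LZO's lzo1x_1 codec looks like (toy input, error handling trimmed):

#include <lzo/lzo1x.h>   // LZO / miniLZO
#include <vector>
#include <cstdio>

int main() {
    if (lzo_init() != LZO_E_OK) return 1;

    std::vector<unsigned char> src(64 * 1024, 'x');   // toy input
    // Worst-case output bound recommended by the LZO docs:
    std::vector<unsigned char> dst(src.size() + src.size() / 16 + 64 + 3);
    std::vector<unsigned char> wrk(LZO1X_1_MEM_COMPRESS);

    lzo_uint dstLen = dst.size();
    if (lzo1x_1_compress(src.data(), src.size(),
                         dst.data(), &dstLen, wrk.data()) != LZO_E_OK)
        return 1;
    printf("%zu -> %zu bytes\n", src.size(), (size_t)dstLen);

    std::vector<unsigned char> out(src.size());
    lzo_uint outLen = out.size();
    if (lzo1x_decompress_safe(dst.data(), dstLen,
                              out.data(), &outLen, nullptr) != LZO_E_OK)
        return 1;
    return 0;
}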

I'm looking for a good "default" algorithm for game data compression. It needs to be lossless; beyond that, decompression speed and compression ratio are the most important factors (slow compression speed is an annoyance, not an obstacle). Of course it's best to use lossy compression whenever possible, given the orders-of-magnitude difference in compression ratio (e.g. MP3, Ogg, JPEG, H.264), but this is just the default algorithm; it's more or less for "none of the above" files.
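For reference, the Deflate baseline I know is just zlib's one-shot API; a minimal sketch with toy input:

#include <zlib.h>
#include <vector>
#include <cstdio>

int main() {
    std::vector<Bytef> src(64 * 1024, 'x');           // toy input

    uLongf compLen = compressBound(src.size());       // worst-case output size
    std::vector<Bytef> comp(compLen);
    if (compress2(comp.data(), &compLen, src.data(), src.size(),
                  Z_BEST_COMPRESSION) != Z_OK)        // slow pack, fast unpack
        return 1;

    uLongf rawLen = src.size();                       // must record original size
    std::vector<Bytef> raw(rawLen);
    if (uncompress(raw.data(), &rawLen, comp.data(), compLen) != Z_OK)
        return 1;

    printf("%lu -> %lu -> %lu bytes\n",
           (unsigned long)src.size(), (unsigned long)compLen,
           (unsigned long)rawLen);
    return 0;
}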

I highly recommend an implementation of LZ77 for compressing general resources in games. It took me around five days to go from intent to a fully working, highly optimized implementation for my engine, but I had a year or so of prior experience with compression, so that helped somewhat.

Anyway, LZ77 is extremely fast to decompress. You're likely to find it actually speeds up load times, due to the decreased disk activity. This compression method was commonly used on early consoles such as the Mega Drive (Genesis), where there was limited processing power but every byte of cart space counted. It was sometimes used to decompress several kilobytes of data in one burst during gameplay without a noticeable drop in framerate (case in point: Sonic 3/Sonic and Knuckles), which for a console with a 7.6MHz processor is a big deal. It pays for this lightning-fast decompression with incredibly slow compression; one of the slowest you'll find. Thankfully, there are a lot of optimizations you can do to decrease this time; they reduced compression time by a factor of around 100 in my implementation. As for the compression ratio, you really have to go beyond the basic implementation to get the best results, but again, in my implementation, for a 115MB test file (an ISO) I got a compressed file of 43.6MB, compared to the 35.5MB zip file that WinRAR yielded on maximum compression.
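To show why decompression is so cheap: a toy LZ77 decoder (a made-up token format for illustration, not my actual implementation) is nothing more than literal copies and back-reference copies:

#include <cstdint>
#include <cstddef>
#include <vector>

// Each token is either a literal byte, or a (distance, length)
// back-reference into the bytes already written.
struct Token {
    uint16_t distance;  // 0 => literal
    uint16_t length;    // run length for a back-reference
    uint8_t  literal;   // used when distance == 0
};

// Assumes well-formed input (distance <= bytes written so far).
std::vector<uint8_t> decode(const std::vector<Token>& tokens) {
    std::vector<uint8_t> out;
    for (const Token& t : tokens) {
        if (t.distance == 0) {
            out.push_back(t.literal);          // plain literal
        } else {
            size_t from = out.size() - t.distance;
            for (uint16_t i = 0; i < t.length; ++i)
                out.push_back(out[from + i]);  // byte-by-byte copy, so
        }                                      // overlapping (RLE-like) runs work
    }
    return out;
}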

LZ77 is ideal for storing static resources in games IMO, because it offers the following:

-Lightning-fast decompression
-Reasonable compression ratios

The relatively slow compression process is of no consequence, as long as this compression algorithm is only used for static resources that the end user never has to recompress. The effect on developers can be minimized if your engine allows resources to be either compressed or uncompressed; content under development can then be saved uncompressed, while the compression step is saved for release or full test builds (see the sketch below). Compression time should only be a second or two for a few megs of data, however. Don't focus on getting the maximum compression possible: current maximum-compression schemes are not suitable for content that must be loaded quickly, and unless space is at a premium (which isn't really an issue nowadays), the kind of compression ratios you get from this method are perfectly adequate.
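A sketch of what that compressed-or-raw branch might look like in a loader; the format and names are hypothetical:

#include <cstdint>
#include <cstddef>
#include <vector>

// A one-byte flag in the resource header lets dev builds ship raw
// data while release builds store the same resource LZ77-compressed.
enum class Storage : uint8_t { Raw = 0, Lz77 = 1 };

// Implemented elsewhere (e.g. a byte-stream version of the toy
// decoder above); declared here so the sketch compiles.
std::vector<uint8_t> lz77Decompress(const uint8_t* data, size_t size);

std::vector<uint8_t> loadResource(const uint8_t* data, size_t size) {
    Storage s = static_cast<Storage>(data[0]);
    const uint8_t* payload = data + 1;
    if (s == Storage::Raw)                       // dev-build path: plain copy
        return std::vector<uint8_t>(payload, payload + (size - 1));
    return lz77Decompress(payload, size - 1);    // release-build path
}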

Anyway, I was going to paste a list of compression/decompression times and file sizes for my implementation of LZ77 vs. ZIP at best compression through WinRAR, but I seem to have misplaced the notes. From memory, that 115MB sample ISO took around 8 minutes to compress in my format, compared to 42 seconds in WinRAR, and around 1.5 seconds to decompress, compared to 3.6 seconds in WinRAR. I just re-tested the WinRAR figures, but the figures for my format are from memory, so they might not be accurate. I'll try to dig up the notes. I'd run the tests again, but I'm in the middle of making major additions to the project; it won't compile in its current state, and I don't want to mess around chopping parts out until it does.


I'm willing to post my implementation, as well as discuss general theory and optimization techniques, if you want to use this compression method.

