In-Memory Archive Library (Tarballs)?
This is actually a supplemental question to my last post.
I'm wondering if anyone knows of a cross-platform archiving library that can read/write files within an archive entirely in memory (i.e., without actually extracting everything to disk).
I'd like a library that supports the TAR format. I've found libtar, but I can't seem to find a word of documentation, and it doesn't appear to be cross-platform (though I'm not sure).
Any suggestions? Really what I need is some library to expose the files in an archive in memory so that I can connect a stream to those files to perform I/O operations on them.
Thanks.
That is somewhat futile, since you're using double the memory. For small files the difference will be negligible; for large files this approach will be considerably slower than writing to disk directly, especially if you exhaust physical memory.
This approach is used with files that are accessed directly - without parsing, just loading a chunk into memory - and by definition it isn't portable.
The whole purpose of writing files in a portable manner is to avoid platform dependency. If you then reinterpret them into your in-memory structures, those structures will vary between platforms, compilers, and even compiler settings.
Quote:Original post by Antheus
That is somewhat futile, since you're using double the memory.
What exactly is futile? Keeping the file data in memory? If it would be more robust and cross platform, then I don't have a problem extracting the files to disk; they're encrypted anyway.
It seems to me that you're saying I should just extract all the files and read them from there. Is this right?
Endianness seems to be the biggest problem at this point, but I plan on incorporating a system into the engine I'm using that allows code to query the endianness of the system. I'm using Boost.Iostreams to perform decompression and decryption, and I seem to remember reading a blurb about handling different endianness with Boost.Iostreams as well.
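For what it's worth, a runtime endianness query like the one described can be sketched in a few lines of portable C++. This is a hypothetical helper, not the engine's actual API:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical helper (not from any engine or library discussed here):
// detect the machine's byte order at runtime by inspecting how a
// multi-byte integer is laid out in memory.
bool is_little_endian() {
    const std::uint32_t value = 1;
    unsigned char bytes[sizeof(value)];
    std::memcpy(bytes, &value, sizeof(value));
    return bytes[0] == 1;  // low-order byte stored first => little-endian
}

// Byte-swap a 32-bit value, e.g. to convert archive data stored in a
// fixed byte order into the host's order.
std::uint32_t swap32(std::uint32_t v) {
    return (v >> 24) | ((v >> 8) & 0x0000FF00u)
         | ((v << 8) & 0x00FF0000u) | (v << 24);
}
```

The usual pattern is to pick one byte order for the file format and swap on load only when the host order differs.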
Quote:Original post by GenuineXP
Quote:Original post by Antheus
That is somewhat futile, since you're using double the memory.
What exactly is futile? Keeping the file data in memory? If it would be more robust and cross platform, then I don't have a problem extracting the files to disk; they're encrypted anyway.
Well, you are wasting a lot of RAM, which may or may not be a problem, depending on the size of your assets. Your encryption, however, is futile: anyone who wants your assets badly enough can read them from memory while your program is using them, or just break the XOR trick themselves (not hard to do, although it can be time-consuming). This is all beside the point, though; in general nobody steals assets (Limbo of the Lost being the exception to this rule), so time spent protecting them is time wasted.
Quote:It seems to me that you're saying I should just extract all the files and read them from there. Is this right?
Just use an existing archive format (ZIP/JAR/PAK is very popular and simple). You can find IOStreams filters to read ZIP-compressed data, and libzip or similar to read the file system inside the ZIP file.
I would even suggest going one step further, and using middleware like PhysFS to make your life even easier.
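To illustrate the stream-based approach these replies describe, here is a minimal sketch of an in-memory archive index. It is not libzip's or PhysFS's API; the flat name/offset/size layout is invented purely for illustration:

```cpp
#include <cstdint>
#include <map>
#include <sstream>
#include <string>

// Sketch only: keep a table of (name -> offset, size) entries over one
// contiguous blob, and hand callers an in-memory stream positioned at a
// file's data. A real implementation would build the table by parsing a
// TAR header chain or a ZIP central directory instead of add().
class MemoryArchive {
public:
    void add(const std::string& name, const std::string& data) {
        entries_[name] = Entry{blob_.size(), data.size()};
        blob_ += data;
    }

    // Return a stream over one file's bytes, so the caller can run I/O
    // (or Boost.Iostreams filters) on it without touching disk.
    std::istringstream open(const std::string& name) const {
        const Entry& e = entries_.at(name);
        return std::istringstream(blob_.substr(e.offset, e.size));
    }

private:
    struct Entry { std::size_t offset; std::size_t size; };
    std::map<std::string, Entry> entries_;
    std::string blob_;  // all file contents, back to back
};
```

This is the core idea behind the libraries mentioned: the archive's directory lives in memory, and each "file" is just a bounded view of the underlying bytes.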
Thanks, I'll look into PhysicsFS.
If there are already streams out there to pull data from an archive, then I don't think I have a problem. Now that I've thought about it, extracting the files to disk shouldn't be a problem.
My encryption scheme is very simple and probably very, very easy to break. I only want it to keep end-users from casually copying the files around. For most users, I think simple XOR encryption is enough to make them give up, and anyone capable of breaking it probably wouldn't care enough to waste their time. I certainly don't think anyone would bother ripping the assets from memory! The idea is to make it too much of a pain, not to make it extremely difficult or nearly impossible.
Thanks for the help.
Quote:Original post by niteice
libarchive is a good library to read tar files and a lot of others.
I wouldn't really recommend it - libarchive doesn't work on Windows, and none of the extra formats are even used anymore (uncompressed ZIP, ustar, etc.).
Quote:Original post by swiftcoder
You might like to take a look at libzip, which does most of what you are looking for.
To extend the use of zlib a bit further, I hereby present zziplib.
Quote:Original post by nife87
I hereby present zziplib.
Looks great, thanks. :-)
Personally, I'd like to develop the package system (which reads/writes these archived/encrypted files) independently of libraries like PhysicsFS. However, I also see the enormous value in using a powerful library like PhysicsFS, so I certainly won't forget about it. It really depends on how much flexibility I need in grabbing resources from disk.