
#1 BLM768

Posted 15 April 2013 - 12:43 AM

Original File:   100mb
CAR File:        100mb
ZIPped Original: 99mb
ZIPped CAR:      3 bytes

 

A 3-byte file only has enough entropy to encode 256^3 = 16,777,216 different combinations. Even if we assume that every one of those combinations decompresses to a 100MB file, those three bytes can only stand for a very tiny fraction of all possible 100MB files. How tiny? My calculator can represent numbers up to 1e999, and it overflowed when calculating the number of possible 100MB files. Since my calculator won't cut it, I started the calculation in GNU bc, an arbitrary-precision calculator. It's been running for about 5 minutes now and it still hasn't given me the answer. So even if the algorithm really can produce a result like this, it would be an extraordinarily rare special case, not something you could expect for an arbitrary 100MB file.
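
To put rough numbers on it, here's a quick Python sketch (assuming "100mb" means 100 MiB = 104,857,600 bytes); instead of asking for the full integer the way bc does, it only estimates how many decimal digits that integer has:

    import math

    BYTE_VALUES = 256                # each byte holds one of 256 possible values
    CAR_BYTES = 3                    # size of the "compressed" CAR file
    ORIG_BYTES = 100 * 1024 * 1024   # assumption: 100 MB taken as 100 MiB

    # Number of distinct 3-byte files: 256**3.
    print(f"distinct 3-byte files: {BYTE_VALUES ** CAR_BYTES:,}")   # 16,777,216

    # The number of distinct 100 MiB files is 256**104,857,600. Materializing
    # that integer is what makes bc crawl; its decimal digit count alone
    # shows how absurdly large it is.
    digits = math.floor(ORIG_BYTES * math.log10(BYTE_VALUES)) + 1
    print(f"distinct 100 MiB files: a {digits:,}-digit number")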

 

On top of that, I'm pretty sure that the header for a ZIP file alone is well more than 3 bytes.
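
That part is easy to check: per the ZIP spec, the end-of-central-directory record alone is 22 bytes, and every stored file adds a 30-byte local header plus a 46-byte central-directory entry before any actual data. A quick check with Python's zipfile module (exact sizes can vary a little between writers, but nothing comes anywhere near 3 bytes):

    import io
    import zipfile

    # An empty archive still contains the 22-byte end-of-central-directory record.
    empty = io.BytesIO()
    with zipfile.ZipFile(empty, "w"):
        pass
    print(f"empty ZIP: {empty.getbuffer().nbytes} bytes")          # 22

    # Storing a single 1-byte file adds the local header, the central-directory
    # entry, the filename (twice), and the byte of data itself.
    one = io.BytesIO()
    with zipfile.ZipFile(one, "w", zipfile.ZIP_STORED) as zf:
        zf.writestr("a", b"x")
    print(f"ZIP holding one 1-byte file: {one.getbuffer().nbytes} bytes")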

 

Edit:

 

After over 45 minutes, I just killed the job. Calculating it in a more sane manner (working with the exponent instead of the full integer), the fraction comes out to roughly 10^(-252,522,256), which is reeeealy tiny.
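
The "more sane manner" is just staying in logarithms, so the ~250-million-digit denominator never has to be written out; a sketch under the same 100 MiB assumption as above:

    import math

    CAR_BYTES = 3
    ORIG_BYTES = 100 * 1024 * 1024   # assumption: 100 MB taken as 100 MiB

    # Fraction of all possible 100 MiB files that a 3-byte file could ever
    # stand for: 256**CAR_BYTES / 256**ORIG_BYTES, expressed as a power of 10.
    log10_fraction = (CAR_BYTES - ORIG_BYTES) * math.log10(256)
    print(f"fraction ~ 10^{log10_fraction:,.0f}")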

