# Entropy and Information

## Recommended Posts

arithma    226
I was reading about entropy (information theory not the physics thingy) and so I tried doing something myself. So the entropy of a fair coin is: -(0.5*log(0.5)/log(2)+0.5*log(0.5)/log(2)) = 1 bit. which come from the equation: sum of -pi*log(pi). This tells us that with each event of a toss we get a bit in return. (Or that the toss of the coin is equivalent to a random bit). Imagine now having an unfair coin of one face having the probability of 0.11 (the other having 0.89 of course) The entropy of this coin is: -(0.11*log2(0.2)+0.89*log2(0.89))=0.4999 Tossing two of this coin after each other should give an entropy of near 1. However I can hardly see how two coins tossed could be mapped into a single bit.

##### Share on other sites
Brother Bob    10344
Say you have two outcomes, A and B, where the probability of A is larger than that of B. You could, for example, encode AA (two consecutive A outcomes) as 0, AB as 10, BA as 110 and BB as 111. That would encode two tosses (AA) using one bit (0).
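That pair code can be sketched like this (the `encode` helper is mine, not from the thread; it assumes an even-length toss string):

```python
# Fixed code for pairs of tosses: AA -> 0, AB -> 10, BA -> 110, BB -> 111
code = {"AA": "0", "AB": "10", "BA": "110", "BB": "111"}

def encode(tosses):
    # Encode two tosses at a time; `tosses` must have even length
    return "".join(code[tosses[i:i + 2]] for i in range(0, len(tosses), 2))

print(encode("AAAAAB"))  # -> "0010": two AA pairs cost one bit each, AB costs two
```

Note that frequent pairs (AA) get short codewords and rare pairs (BA, BB) get long ones, which is exactly why the average length ends up below one bit per toss.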

For optimal encoding like this, check out Huffman coding.

An entropy of 0.5 does not really mean you can encode 2 coin tosses in 1 bit. It means that, as the number of tosses approaches infinity, the average number of bits needed to encode the sequence cannot be less than 0.5 bits per toss. In the example above you can encode some sequences using a small number of bits, but some require more (BA and BB each take three bits); on average, the entropy states that you can use slightly less than one bit per outcome, and that is what you get if you weight the codeword lengths by the corresponding probabilities.
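Working out that weighted average for the pair code above, with p(A) = 0.89 and p(B) = 0.11 (a quick sketch, not from the thread):

```python
# Expected codeword length per toss for the pair code AA->0, AB->10, BA->110, BB->111
pA, pB = 0.89, 0.11
lengths = {"AA": 1, "AB": 2, "BA": 3, "BB": 3}
probs = {"AA": pA * pA, "AB": pA * pB, "BA": pB * pA, "BB": pB * pB}

avg_per_pair = sum(probs[s] * lengths[s] for s in lengths)
print(avg_per_pair / 2)  # ~0.659 bits per toss: below 1, above the 0.4999-bit entropy floor
```

Encoding longer blocks than pairs would push the average closer to the 0.4999-bit entropy bound, but never below it.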

##### Share on other sites
arithma    226
IT CLICKED!
Thanks a lot; at first I thought I wouldn't get it.
That's where the log2 comes from (i.e. encoding length).
And assuming I toss the coin a thousand times, I would need about 500 bits on average [with an optimal encoding like Huffman] to represent the sequence.
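A quick sanity check on that arithmetic (entropy per toss times number of tosses):

```python
import math

# Entropy of the 0.11/0.89 coin, in bits per toss
H = -(0.11 * math.log2(0.11) + 0.89 * math.log2(0.89))
print(round(1000 * H))  # -> 500 bits for 1000 tosses, on average
```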

##### Share on other sites
serg3d    100
Essentially you can think of entropy as the (weighted) log of the number of states the system could be in. The more complex a system is, the bigger its entropy.
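For the special case of N equally likely states, that weighted log collapses to exactly log2(N) (a small sketch; the `entropy` helper is mine):

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over N states has entropy log2(N)
N = 8
print(entropy([1 / N] * N), math.log2(N))  # both 3.0
```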