Entropy and Information

I was reading about entropy (information theory, not the physics thingy) and so I tried working something out myself. The entropy of a fair coin is -(0.5*log2(0.5) + 0.5*log2(0.5)) = 1 bit, which comes from the formula H = -sum over i of p_i*log2(p_i). This tells us that each toss gives us one bit in return (or that a toss of the coin is equivalent to one random bit).

Now imagine an unfair coin where one face has probability 0.11 (and the other 0.89, of course). The entropy of this coin is -(0.11*log2(0.11) + 0.89*log2(0.89)) = 0.4999. Tossing this coin twice should then give an entropy of nearly 1 bit. However, I can hardly see how two tosses of this coin could be mapped onto a single bit.
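For a quick sanity check, here is a minimal Python sketch of that formula (the entropy helper name is just for illustration):

from math import log2

def entropy(probs):
    # H = -sum(p * log2(p)) over all outcomes, skipping zero-probability ones
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin   -> 1.0 bit
print(entropy([0.11, 0.89]))  # unfair coin -> ~0.4999 bits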

Say you have two outcomes, A and B, where the probability of A is larger than that of B. You could, for example, encode AA (two consecutive A outcomes) as 0, AB as 10, BA as 110 and BB as 111. That would encode two tosses (AA) using one bit (0).

For optimal encoding like this, check out Huffman coding.

An entropy of 0.5 does not really mean you can encode 2 coin tosses in 1 bit. It means that, as the number of tosses approaches infinity, the average number of bits needed to encode the sequence cannot be less than 0.5 bits per toss. In the example above you can encode some sequences using a small number of bits, but some sequences require more (BA and BB take three bits); on average, though, you use less than one bit per toss (about 0.66 bits per toss for the 0.11/0.89 coin), which is what you get if you weight the code lengths by their probabilities.
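To make the averaging concrete, here is a small Python sketch (using the 0.11/0.89 probabilities from the first post; the names are just illustrative) that weights each code length by its probability:

# Prefix code for pairs of tosses, as described above (A is the 0.89 face)
code = {"AA": "0", "AB": "10", "BA": "110", "BB": "111"}
p = {"A": 0.89, "B": 0.11}

avg_bits_per_pair = sum(p[x] * p[y] * len(code[x + y]) for x in p for y in p)
print(avg_bits_per_pair)      # ~1.32 bits per pair
print(avg_bits_per_pair / 2)  # ~0.66 bits per toss: below 1, above the 0.5-bit entropy bound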

IT CLICKED!
Thanks a lot, at first I thought I wouldn't get it.
So that's where the log2 comes from (i.e. the encoding length).
And assuming I toss the coin a thousand times, I would need about 500 bits on average [with an optimal encoding like Huffman] to represent the sequence.
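One way to see the "on average" part is to Huffman-code longer and longer blocks of tosses; the cost per toss then creeps down towards the 0.5-bit entropy. A rough sketch (the block sizes and helper names are just illustrative):

import heapq
import itertools
from math import log2

P_HEADS = 0.11  # bias from the earlier posts

def block_probs(n):
    # Probability of every length-n sequence of H/T tosses.
    probs = {}
    for seq in itertools.product("HT", repeat=n):
        p = 1.0
        for s in seq:
            p *= P_HEADS if s == "H" else 1 - P_HEADS
        probs["".join(seq)] = p
    return probs

def huffman_lengths(probs):
    # Build a Huffman code and return the code length of each symbol.
    tiebreak = itertools.count()  # keeps heapq from ever comparing dicts
    heap = [(p, next(tiebreak), {sym: 0}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {sym: depth + 1 for sym, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

for n in (1, 2, 4, 8):
    probs = block_probs(n)
    lengths = huffman_lengths(probs)
    avg = sum(probs[sym] * lengths[sym] for sym in probs) / n
    print(f"blocks of {n}: {avg:.4f} bits per toss")

H = -(P_HEADS * log2(P_HEADS) + (1 - P_HEADS) * log2(1 - P_HEADS))
print(f"entropy bound: {H:.4f} bits per toss")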

Essentially, you can think of entropy as the (weighted) log of the number of states the system could be in. The more complex the system is, the bigger its entropy.
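For instance (a small sketch, with illustrative numbers): a system of N equally likely states has entropy exactly log2(N), and for a skewed distribution 2^H can be read as the "effective" number of states:

from math import log2

def entropy(probs):
    # Weighted average of log2(1/p): the expected number of yes/no questions per outcome.
    return sum(p * log2(1 / p) for p in probs if p > 0)

H8 = entropy([1 / 8] * 8)    # 8 equally likely states -> log2(8) = 3.0 bits
Hc = entropy([0.11, 0.89])   # the biased coin from above -> ~0.5 bits
print(H8, 2 ** H8)           # 3.0, 8.0 states
print(Hc, 2 ** Hc)           # ~0.5, ~1.41 "effective" states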
