# Name the physical constant


## Recommended Posts

I'm doing some research into Maxwell's Demon and I would really appreciate it if someone could tell me what the latent energy in one bit of information is. This would be the amount of energy released when one bit of information is destroyed. What is the name of this constant and how is it derived? Thanks for the help!

##### Share on other sites
Huh? Maxwell's Demon has to do with entropy and the second law of thermodynamics. What's this about "one bit of information is destroyed" and such?

##### Share on other sites
Quote:
Original post by Telamon
I'm doing some research into Maxwell's Demon and I would really appreciate it if someone could tell me what the latent energy in one bit of information is. This would be the amount of energy released when one bit of information is destroyed. What is the name of this constant and how is it derived?

It's not a constant. At a constant temperature T, the amount of energy released by erasing n bits is nkT · ln(2) joules, where k is Boltzmann's constant, ~1.381 × 10⁻²³ J/K.

If you meant the amount of entropy created (rather than energy released), this is a constant regardless of the temperature: nk · ln(2). This is called Landauer's principle. Not everyone agrees that Landauer's principle is correct, however.
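For a concrete sense of scale, here's a small Python sketch evaluating nkT·ln(2) at room temperature (the function name `landauer_energy` is mine, not from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_energy(n_bits, temperature):
    """Minimum energy (J) released when erasing n_bits at the given temperature."""
    return n_bits * k_B * temperature * math.log(2)

# Erasing one bit at room temperature (300 K):
print(landauer_energy(1, 300.0))  # ~2.87e-21 J -- twenty orders below everyday scales
```

This is why the effect is invisible in practice: real hardware dissipates many orders of magnitude more energy per bit than this theoretical floor.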

##### Share on other sites
Quote:
Original post by kSquared
Quote:
Original post by Telamon
I'm doing some research into Maxwell's Demon and I would really appreciate it if someone could tell me what the latent energy in one bit of information is. This would be the amount of energy released when one bit of information is destroyed. What is the name of this constant and how is it derived?

It's not a constant. At a constant temperature T, the amount of energy released by erasing n bits is nkT · ln(2) joules, where k is Boltzmann's constant, ~1.381 × 10⁻²³ J/K.

That reeks of massive bullshit. In fact, it's completely meaningless without defining a 'bit', but even with one it'd be meaningless, since the equation nowhere takes the properties of a bit into account.

On top of that, it just doesn't make sense.

##### Share on other sites
Quote:
Original post by Eelco
That reeks of massive bullshit. In fact, it's completely meaningless without defining a 'bit'

The term "bit" is well-defined, as a piece of information having two possible states. If you're talking about non-binary bits, you say so.
Quote:
but even with one it'd be meaningless, since the equation nowhere takes the properties of a bit into account.

I suggest you think about the ln(2) factor in the equation. Exercise for the reader: how could the equation be modified for a trinary bit? Is the energy released by destroying two binary bits the same as that of destroying a single quaternary bit? What energy is released when a trivial amount of information (only one possible state) is destroyed?
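Those exercise answers can be checked numerically. A sketch (the function `erase_energy` and its radix generalization kT·ln(r) per symbol are my framing of the exercise, not part of the original post):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def erase_energy(radix, temperature, count=1):
    """Minimum energy (J) released erasing `count` symbols with `radix` states each."""
    return count * k_B * temperature * math.log(radix)

T = 300.0
# Two binary bits vs. one quaternary "bit": 2 ln(2) == ln(4), so equal energy.
print(math.isclose(erase_energy(2, T, count=2), erase_energy(4, T)))  # True
# A "bit" with only one possible state: ln(1) == 0, so no energy at all.
print(erase_energy(1, T))  # 0.0
```

A trinary digit would follow the same pattern: kT·ln(3) per symbol erased.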
Quote:
On top of that, it just doesn't make sense.

Then perhaps you should learn about it.

##### Share on other sites
Quote:
Original post by Sneftel
What energy is released when a trivial amount of information (only one possible state) is destroyed?

Our current definition of entropy relies on the number of possible states that an object can be in. If a piece of information can be in only one state (as would be the case for a physical object at absolute zero), no energy can be released by destroying it.

In other words, information that only describes one possible state is not information at all. It would be as though I had shown you a deck of cards containing only one card, then "shuffled" the deck and asked you to guess what card I was holding; there is only one possible answer and it is always correct. This corresponds neatly with the fact that ln(1) == 0.
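The one-card-deck analogy lines up with the Shannon entropy formula; a minimal sketch (my illustration, not from the post):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: sum of p * log2(1/p), skipping zero-probability states."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

one_card_deck = [1.0]    # only one possible state: no information to destroy
fair_coin = [0.5, 0.5]   # two equiprobable states: exactly one bit
print(shannon_entropy(one_card_deck))  # 0.0
print(shannon_entropy(fair_coin))      # 1.0
```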

##### Share on other sites
It's important to note that a "thingy that can be in two states, either 1 or 0" doesn't necessarily hold one bit (it can hold less).
Also, what does it mean to destroy information? If we hold a and b, then compute the sum c = a + b, we can destroy c without destroying any information as long as we still have a and b: the computation c = a + b can be reversed. In other words, if we have a, b, and c defined as c = a + b, the phase space of (a, b, c) has the same size as the phase space of (a, b); c essentially holds no information. So it's possible to destroy c without increasing entropy outside the system (a, b, c).

Speaking of Landauer's principle, a somewhat simplified explanation: a 1-bit memory device holding random data (i.e. one that can be in two equiprobable states, 1 and 0) has entropy k·ln(2) (by definition). If you want to decrease the entropy of the memory device by setting it to 0, you must put that entropy somewhere else, i.e. dump it into the environment in the form of heat.
It's similar to how cooling something down in a freezer must increase entropy elsewhere.
Of course, there's the somewhat paradoxical point that the data a memory device holds is typically no more random than the data written over it...

Also, there are some misapplications claiming that traditional information processing must release heat, i.e. that program code, for example written in ordinary C++, theoretically cannot be executed without releasing heat to the environment roughly proportional to the number of operations.

But actually, there is no theoretical law forbidding you to discard intermediate results reversibly. I have even seen a paper with an example machine concept that, in a first pass (the solution pass), stores all intermediate results that would otherwise be "lost", and in a second pass reverses each and every operation, finally returning to the initial state. Note that by subdividing the task into parts, the memory consumption for intermediate results can be made much smaller than the number of "irreversible" operations (maybe O(log(n))?). I think such a machine could execute normal C++ transparently to the programmer. In other words, input data plus results plus intermediate results have exactly as many possible states as the input data alone, so if you discard the intermediate and/or final results of a computation, you don't change the entropy of the system and don't have to release heat (theoretically). So this principle doesn't set a theoretical limit on the speed of execution of computer programs written in traditional languages (e.g. C++) under a limited energy budget, only a limit on how much data you can pump into a computer that discards all of its input. It does set a limit on the speed of current processor designs, though that limit is far away now, and it is not clear whether silicon can get that far.
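The two-pass machine described above can be sketched as a toy in a few lines of Python. This is only an illustration with hand-picked invertible steps (real reversible computing does this at the gate level, e.g. with Toffoli gates):

```python
def reversible_computation(x):
    """Two-pass sketch: the solution pass records every intermediate value,
    then the reverse pass undoes each step, restoring the initial state so
    the intermediates are discarded reversibly rather than erased."""
    trace = []                      # intermediate results we will later "un-lose"
    y = x
    for step in (lambda v: v + 3, lambda v: v * 2):
        trace.append(y)             # record the state before each step
        y = step(y)
    answer = y                      # the only thing we keep
    # reverse pass: invert each operation in reverse order, checking the trace
    for inverse in (lambda v: v // 2, lambda v: v - 3):
        y = inverse(y)
        assert y == trace.pop()     # each step is undone exactly
    assert y == x and not trace     # initial state restored, nothing erased irreversibly
    return answer

print(reversible_computation(5))  # 16
```

After the reverse pass, the machine holds only the input and the answer, so no information was destroyed and (in principle) no Landauer heat had to be paid for the intermediates.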

[Edited by - Dmytry on August 19, 2005 5:50:02 AM]

##### Share on other sites
Quote:
Original post by kSquared
In other words, information that only describes one possible state is not information at all. It would be as though I had shown you a deck of cards containing only one card, then "shuffled" the deck and asked you to guess what card I was holding; there is only one possible answer and it is always correct. This corresponds neatly with the fact that ln(1) == 0.

Of course, if you're allowed to have more than one 'bit' that can only be in the lone state, then the number of 'bits' you have would be considered information :)

##### Share on other sites
Quote:
Original post by kSquared
Quote:
Original post by Sneftel
What energy is released when a trivial amount of information (only one possible state) is destroyed?

Our current definition of entropy relies on the number of possible states that an object can be in. If a piece of information can be in only one state (as would be the case for a physical object at absolute zero), no energy can be released by destroying it.

In other words, information that only describes one possible state is not information at all. It would be as though I had shown you a deck of cards containing only one card, then "shuffled" the deck and asked you to guess what card I was holding; there is only one possible answer and it is always correct. This corresponds neatly with the fact that ln(1) == 0.

Yeah... my questions were mostly rhetorical ones for Eelco. Similarly, the two-binary-bits vs. one-quaternary-bit question corresponds to 2·ln(2) = ln(4).

##### Share on other sites
Quote:
Original post by Sneftel
Quote:
Original post by Eelco
That reeks of massive bullshit. In fact, it's completely meaningless without defining a 'bit'

The term "bit" is well-defined, as a piece of information having two possible states.

Except that they don't seem to exist in real life, as far as I'm aware.

Quote:

Quote:
but even with one it'd be meaningless, since the equation nowhere takes the properties of a bit into account.

I suggest you think about the ln(2) factor in the equation. Exercise for the reader: how could the equation be modified for a trinary bit? Is the energy released by destroying two binary bits the same as that of destroying a single quaternary bit? What energy is released when a trivial amount of information (only one possible state) is destroyed?

So let's say I have a tank of air. If it's pressurized, it counts as a high bit; otherwise low. Now I have two: one tiny one and one huge one. According to that law, the amount of entropy created by opening the tanks would be equal.

Now I know where the fallacy lies: either of these tanks has a lot more possible states than just two. The point is, that's kind of the case with all things in real life, computer bits being no exception, considering they consist of a truckload of electrons that can each take infinitely many states. So how is this law useful in any way if a finite number of states doesn't exist, or at least, if it does, is beyond our measuring capabilities?

Quote:

Quote:
On top of that, it just doesn't make sense.

Then perhaps you should learn about it.

Perhaps; so far it hasn't been made clear to me.
