Does entropy measure extractable work?
Question
Entropy has two definitions, which come from two different branches of science: thermodynamics and information theory. Yet the two are thought to agree. Is that true?
Entropy, as seen from information theory, measures our ignorance about a system. Entropy, as seen from thermodynamics, measures the amount of extractable work. One is epistemological, the other ontological. Yet the two can agree, if entropy really measures the amount of extractable work given one's knowledge about the system.
But is this true? Does Shannon's entropy, computed on the probability distribution of a physical system, express the amount of work we can obtain from it? How do increases in our knowledge increase the amount of extractable work? Is Landauer's principle powerful enough to establish the connection?
Explanation / Answer
UPDATE: Below I am answering yes to the first question in the post (are the two kinds of entropy the same up to a constant). This has led to some confusion, as both Matt and John gave answers saying "the answer is no"; however, I believe they are referring to the title, "Does entropy measure extractable work?". Although the author uses the two interchangeably, the question itself contains a false premise: namely, that physical entropy is a measure of extractable work ("Entropy, as seen from thermodynamics, measures the amount of extractable work"). This is simply not true in the general case, though it can be true in certain special cases, as Matt's counterexample shows. A concrete instance is a ball placed anywhere on a level surface: if the ball is placed randomly, no more work can be extracted than if its location is known.
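In symbols (a minimal sketch of that counterexample, with an assumed ball mass $m$ resting at height $h = 0$ everywhere on the surface): the maximum work extractable from the ball's position is
$$W_{\max} = mgh = 0$$
for every position distribution $p(x)$, even though the Shannon entropy $H = -\sum_x p(x)\log_2 p(x)$ of that distribution can be made arbitrarily large. Positional ignorance here costs no extractable work at all.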
The answer is yes: the two kinds of entropy are the same up to a constant factor of $k_B \ln 2$ (which is also the origin of Landauer's principle). One way to see this is via Maxwell's demon, and in particular via the Szilard engine, an idealised heat engine that uses a single-particle gas. You introduce a partition, which divides the container into two regions, only one of which contains the particle. Now, if you know which side of the partition the particle is on, you can use the pressure difference to extract work; if you don't, you can't, since you don't know which way the partition will be pushed.
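To make this quantitative, here is the standard back-of-the-envelope calculation (a sketch, treating the particle as a one-particle ideal gas in equilibrium with a bath at temperature $T$, so $pV = k_B T$). Letting the known side expand isothermally from $V/2$ back to the full volume $V$ gives
$$W = \int_{V/2}^{V} \frac{k_B T}{V'}\,\mathrm{d}V' = k_B T \ln 2,$$
so one bit of knowledge about the particle's location is worth $k_B T \ln 2$ of work, and, by the same accounting, erasing that bit must dissipate at least $k_B T \ln 2$ of heat (Landauer's bound).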
Now the connection with information theory comes in when we measure which side of the partition the particle is on. The measurement writes a certain amount of entropy (and hence information) into the register that holds the result, and having this information decreases the entropy of the gas by the same amount. You can therefore go back and forth between information entropy and physical entropy.
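As a toy illustration of this bookkeeping, here is a short numerical sketch (not part of the original answer; it simply evaluates the standard Szilard-engine quantities at an assumed bath temperature of 300 K):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # assumed bath temperature, K

    # Shannon entropy (in bits) of the measurement record:
    # the particle is found on the left or right with probability 1/2 each.
    p = [0.5, 0.5]
    H_bits = -sum(q * math.log2(q) for q in p)  # = 1 bit

    # Thermodynamic entropy change of the gas once the side is known:
    # the accessible volume halves, so dS = k_B * ln(1/2) = -k_B ln 2.
    dS_gas = k_B * math.log(0.5)

    # Maximum work from letting the gas expand isothermally back to full volume.
    W_max = k_B * T * math.log(2)

    print(f"Measurement record entropy: {H_bits:.1f} bit")
    print(f"Gas entropy change on learning the side: {dS_gas:.3e} J/K")
    print(f"Conversion factor k_B ln 2: {k_B * math.log(2):.3e} J/K per bit")
    print(f"Extractable work at {T:.0f} K: {W_max:.3e} J")

The one bit gained by the register matches exactly the $k_B \ln 2$ of entropy removed from the gas, which is the constant conversion factor mentioned above.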
There is a fairly extensive literature on the matter, so instead of trying to give you a list, I'll point you to a review article on Maxwell's demon and information theory from a few years ago: arXiv:0707.3400.