- Path: sparky!uunet!biosci!buc.edu.au!kjm
- From: kjm@buc.edu.au (Kevin Moore)
- Newsgroups: bionet.info-theory
- Subject: Re: A Mathematical Fireside Chat at 300K about 0K
- Message-ID: <9211152208.AA19972@eureka.buc.edu.au>
- Date: 16 Nov 92 14:08:15 GMT
- Sender: daemon@net.bio.net
- Distribution: bionet
- Lines: 100
-
-
- In article <9211140822.AA01880@net.bio.net>
- burchard@horizon.math.utah.edu (Paul Burchard) writes:
-
- > The energy of the system will be distributed among its states in some way.
- > Let p(E) be the distribution (where E = energy per state). The simplest
- > measure that can be derived from this energy distribution is the
- > *average* energy per state---this we call the "absolute temperature" T
- > (working in suitable units).
-
- WRONG! Consider the Fermi-Dirac distribution to see the error in this.
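-
- (To spell out the standard formula behind my objection: the Fermi-Dirac
- occupation of a single-particle level of energy $E$ is
- $\langle n(E)\rangle = 1/(e^{(E-\mu)/kT} + 1)$, so the mean energy per
- state depends on the chemical potential $\mu$ as well as on $T$, and it
- stays nonzero even as $T \to 0$. It cannot double as a definition of T.)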
-
- > Now if all we know about a system is its temperature T, what can we
- > say about p(E)? Well, not forgetting our other assumption, that E>0,
- > we can use information theory to make the *least presumptive* guess
- > for p(E). We do this by maximizing the information-theoretic *entropy*
- > of the probability distribution, constrained by our two assumptions.
- > In this way we find that our best guess is the exponential distribution
- > p(E) = (1/T) e^{-E/T}.
-
- That's p(E) = (1/Z) e^{-E/kT} where Z is a normalisation constant,
- otherwise known as the partition function (or functional, in the
- non-equilibrium case).
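-
- (Spelling out the standard normalisation, to connect the two formulae: with
- discrete levels $E_i$, $Z = \sum_i e^{-E_i/kT}$ and $p(E_i) = e^{-E_i/kT}/Z$.
- For a continuous, uniformly weighted spectrum on $E > 0$ one gets
- $Z = \int_0^\infty e^{-E/kT}\,dE = kT$, which is where the $1/T$ prefactor
- above comes from, in units with $k = 1$.)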
-
- > An ideal system that actually conforms to this
- > generic energy distribution we will call a "heat bath".
- >
- > Note that T also shows up as the exponential decay rate of p(E). This
- > suggests an alternative definition of temperature, which allows the
- > possibility of "negative absolute temperature". We ignore this definition.
-
- Why? That T is the one appearing in dS = dQ/T, i.e. 1/T is the integrating
- factor that makes dQ exact. It therefore corresponds to the thermodynamic
- definition of temperature.
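-
- (Explicitly, and this is standard statistical mechanics: the thermodynamic
- temperature is defined by $1/T = (\partial S/\partial E)_{V,N}$, which, up to
- the factor $k$, is the decay rate $\beta = 1/kT$ of $p(E) \propto e^{-\beta E}$.
- If the spectrum is bounded above, as in a spin system, $S$ can decrease with
- $E$, so $\partial S/\partial E < 0$ and $T$ comes out negative; that is where
- "negative absolute temperature" lives.)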
-
- [deletia]
-
- > Our conclusion? Temperature implies an upper bound on (physical)
- > entropy, and under physically correct assumptions, this upper bound
- > goes to zero as the temperature goes to zero. Therefore,
- > zero temperature implies zero entropy.
-
- I'll go along with this.
-
- > But...you may be troubled by the following elements of the argument:
- > 1. Saying that E is always positive is tantamount to making
- > E an absolute, rather than relative, quantity.
- > This flies in the face of classical mechanics.
-
- E does not have to be positive for statistical mechanics to work. You
- simply get different values for Z with different zeroes of potential
- energy.
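-
- (Spelled out: shifting the zero, $E_i \to E_i + E_0$, multiplies every
- Boltzmann weight and $Z$ itself by the same factor $e^{-E_0/kT}$, so
- $p(E_i) = e^{-(E_i+E_0)/kT} / (e^{-E_0/kT} Z) = e^{-E_i/kT}/Z$ is unchanged,
- and with it the entropy. Only quantities like the mean energy move by $E_0$.)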
-
- > 2. In order to make sense of the limiting maximum entropy
- > as T goes to zero, we had to assume discreteness
- > of energy levels. Classically, it's trouble again.
- > In fact, I'm skeptical that we can make classical sense of entropy at
- > 0K. The entropy bound's divergence to minus infinity reflects the
- > infinite information difference between a continuous and a discrete
- > probability distribution. It might still be possible to renormalize
- > these calculations somehow, but that's a slippery business in itself.
-
- Yep. You use $-\int \rho\log\frac{\rho}{m}\, d\tau$, where $m$ is the Lebesgue
- measure on phase space $\tau$. The $m$ is important, because it makes the
- information entropy invariant under the choice of parametrisation of phase
- space. Klir, the fuzzy-set person, (unforgivably) sets up a straw man by
- leaving out the measure when criticising the maximum entropy procedure.
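-
- (To make the invariance explicit, a standard change-of-variables step: under
- a reparametrisation $x \to y(x)$ the density and the measure pick up the same
- Jacobian, $\rho'(y) = \rho(x)|\partial x/\partial y|$ and
- $m'(y) = m(x)|\partial x/\partial y|$, so the ratio $\rho/m$, and hence
- $-\int \rho\log\frac{\rho}{m}\, d\tau$, is unchanged. Drop the $m$ and the
- "entropy" $-\int\rho\log\rho\, d\tau$ shifts by the log of the Jacobian,
- which is precisely the straw man.)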
-
-
- > Quantum physics addresses both problems. First of all, it gives us
- > discrete energy levels, at least for finite systems. Actually, all
- > we really need is that there be a finite energy gap at E=0. This is
- > a very mild assumption quantum mechanically.
-
- > More substantially, QM typically gives us an absolute zero energy level.
- > This zero level *need not* be the ground state of the system; it is
- > sensible (as sensible as QFT gets :-) to ask if the ground state energy
- > vanishes. The most familiar example of this is the quantum harmonic
- > oscillator, where the ground state energy is hw/2 (h = Planck's h-bar;
- > w = angular frequency). Notice that if the ground state energy does not
- > vanish, then the system cannot get to 0K, purely by definition.
-
- As I said earlier, the average energy definition is a BAD one. In fact it
- only works for classical particles, with no exchange interaction. 0K means
- that all the particles are in the ground state ($\beta = 1/kT$ is infinite).
- The quantum mechanical reason for the third law is the energy-time
- uncertainty relation: in order to establish 0K, you would have to constrain
- the energy state for an infinite time.
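-
- (A standard free-fermion illustration of why the average energy fails even at
- absolute zero: at $T = 0$ an ideal Fermi gas fills every level up to the
- Fermi energy $E_F$, so with density of states $g(E) \propto \sqrt{E}$ in 3D
- the mean energy per particle is
- $\langle E \rangle = \int_0^{E_F} E g\, dE / \int_0^{E_F} g\, dE = \frac{3}{5} E_F$,
- which is finite while the temperature is exactly zero.)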
-
- > So now I'll go warm my hands by the resulting flamefest!!!
- > I vote for 0 entropy at 0K!!!
-
- > P.S. Please criticize my *definitions* first, before flaming
- > my conclusions!
-
- I did, and I hope you're satisfied. :-)
-
- Cheers,
- Kevin.
-
- kjm@eureka.buc.edu.au
-