- Path: sparky!uunet!biosci!agate!ames!saimiri.primate.wisc.edu!zaphod.mps.ohio-state.edu!darwin.sura.net!spool.mu.edu!umn.edu!news
- From: burchard@geom.umn.edu (Paul Burchard)
- Newsgroups: bionet.info-theory
- Subject: Re: A Mathematical Fireside Chat at 300K about 0K
- Message-ID: <1992Nov17.155936.12947@news2.cis.umn.edu>
- Date: 17 Nov 92 15:59:36 GMT
- References: <9211152208.AA19972@eureka.buc.edu.au>
- Sender: news@news2.cis.umn.edu (Usenet News Administration)
- Distribution: bionet
- Organization: University of Minnesota
- Lines: 104
- Nntp-Posting-Host: mobius.geom.umn.edu
-
- In article <9211152208.AA19972@eureka.buc.edu.au>
- kjm@buc.edu.au (Kevin Moore) writes:
- >
- > In article <9211140822.AA01880@net.bio.net>
- > burchard@horizon.math.utah.edu (Paul Burchard) writes:
- >
- >> Our conclusion? Temperature implies an upper bound on (physical)
- >> entropy, and under physically correct assumptions, this upper
- >> bound goes to zero as the temperature goes to zero. Therefore,
- >> zero temperature implies zero entropy.
- >
- > I'll go along with this.
-
- Well, that is the basic argument I wanted to make, so let's
- work backwards from there and see where I may have erred
- along the way.
-
- What seems to have gotten people the most upset was my talk
- of absolute energy scales. As Kevin points out, you can of
- course shift the entire theory to any energy "origin" and
- it will work fine. (For exponentials, translating the energy
- origin is the same as rescaling all the Boltzmann weights by a
- common factor, and that factor cancels in the normalization Z.)
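- To make that cancellation concrete, here's a small numeric sketch
- (Python; the three-level spacing and the shift of 5 are made-up
- numbers, not anything from the physics above):

```python
import math

def boltzmann_probs(energies, kT):
    """Normalized Boltzmann distribution p(E) = exp(-E/kT) / Z."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

levels = [0.0, 1.0, 2.0]                   # arbitrary level spacing
shifted = [E + 5.0 for E in levels]        # shift the energy "origin"
p1 = boltzmann_probs(levels, kT=1.0)
p2 = boltzmann_probs(shifted, kT=1.0)

# The shift multiplies every weight by exp(-5/kT), which cancels in Z,
# so the probabilities come out identical.
assert all(abs(a - b) < 1e-12 for a, b in zip(p1, p2))
```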
-
- So to make my point in a less inflammatory way, what I'm claiming
- is that we must be given an absolute energy minimum as prior
- information. Without this, there is no consistent way to
- define a heat bath, whose only property is temperature. For
- simplicity, we can then shift our energy scale to put that
- absolute minimum at zero. (I remain skeptical, however, that
- this absolute minimum would always be set to the ground state
- energy in the quantum case.)
-
- The other thing that proved to be kindling was my definition
- of temperature as average energy per state. Here the disagreement
- is more interesting:
-
- >> Note that T also shows up as the exponential decay rate of p(E).
- >> This suggests an alternative definition of temperature, which
- >> allows the possibility of "negative absolute temperature". We
- >> ignore this definition.
- >
- > Why? That T corresponds to the integrating factor in
- > dS=dQ/T. It therefore corresponds to the thermodynamic
- > definition.
-
- That's an interesting point. Could you explain this a bit further?
- The correspondence isn't immediately obvious to me...
-
- > As I said earlier, the average energy definition is a BAD
- > one. In fact it only works for classical particles, with
- > no exchange interaction. All the particles in the ground
- > state is 0K. ($\beta=1/kT$ is infinite)
-
- My definition was E per state, not per particle.
-
- In any case, the REAL test of which temperature definition is
- correct must be settled by checking which one gives the correct
- direction of heat flow when two systems of different temperature
- are "weakly coupled". I would think that the average energy per
- state would equalize between the two---are you claiming that the
- exponential decay rates of the energy distribution equalize
- instead, in the case when these two definitions disagree?
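- A toy case where the two definitions visibly part ways (Python
- sketch; a two-level system in units of kT, all numbers invented):
- under a population inversion, the exponential decay rate of p(E)
- goes negative, while the average energy per state stays finite.

```python
import math

def beta_from_populations(E0, E1, p0, p1):
    """Exponential decay rate of p(E): beta = ln(p0/p1) / (E1 - E0)."""
    return math.log(p0 / p1) / (E1 - E0)

def avg_energy(levels, probs):
    """Average energy per state, <E> = sum_i E_i p_i."""
    return sum(E * p for E, p in zip(levels, probs))

# Ordinary two-level system at kT = 1: lower level more populated.
E = [0.0, 1.0]
Z = 1.0 + math.exp(-1.0)
p_normal = [1.0 / Z, math.exp(-1.0) / Z]
assert beta_from_populations(E[0], E[1], *p_normal) > 0

# Population inversion: the decay-rate "temperature" is negative,
# yet the average energy per state is still perfectly finite.
p_inverted = list(reversed(p_normal))
assert beta_from_populations(E[0], E[1], *p_inverted) < 0
assert avg_energy(E, p_inverted) > avg_energy(E, p_normal)
```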
-
- >> 2. In order to make sense of the limiting maximum entropy
- >> as T goes to zero, we had to assume discreteness
- >> of energy levels. Classically, it's trouble again.
- >> In fact, I'm skeptical that we can make classical sense of
- >> entropy at 0K. The entropy bound's divergence to minus infinity
- >> reflects the infinite information difference between a continuous
- >> and a discrete probability distribution. It might still be
- >> possible to renormalize these calculations somehow, but that's a
- >> slippery business in itself.
- >
- > Yep. You use $\int \rho\log\frac{\rho}{m} d\tau$,
- > where $m$ is the Lebesgue measure on phase space $\tau$.
- > The $m$ is important, because it makes the information
- > entropy invariant under choice of parametrisation of
- > phase space. Klir, the fuzzy set person, (unforgivably)
- > sets up a straw man by leaving out the measure when
- > criticising the maximum entropy procedure.
-
- Wait! I think max entropy is great, and in fact I'm already using
- your definition for continuous distributions.
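- The role of the measure $m$ can even be checked numerically. A
- sketch (Python; the Gaussian density and the reparametrisation
- y = x^3 are made-up choices, and the integration is a crude
- midpoint rule): the integral $\int \rho\log\frac{\rho}{m} d\tau$
- comes out the same in either coordinate system, because $\rho$
- and $m$ pick up the same Jacobian.

```python
import math

def midpoint_integral(f, a, b, n=200000):
    """Crude midpoint-rule quadrature of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# rho: a Gaussian density on a window away from x = 0, so that the
# reparametrisation y = x^3 is smooth and monotone there.
def rho_x(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

a, b = 0.5, 3.0
# In x-coordinates the Lebesgue density is m = 1.
I_x = midpoint_integral(lambda x: rho_x(x) * math.log(rho_x(x) / 1.0), a, b)

# In y-coordinates (y = x^3, dy = 3 x^2 dx), both rho and the
# Lebesgue density m pick up the same Jacobian 1/(3 x^2), so the
# ratio rho/m inside the log is unchanged.
def I_y_integrand(y):
    x = y ** (1.0 / 3.0)
    jac = 3.0 * x * x            # dy/dx
    rho_y = rho_x(x) / jac       # transformed probability density
    m_y = 1.0 / jac              # Lebesgue-in-x, expressed in y
    return rho_y * math.log(rho_y / m_y)

I_y = midpoint_integral(I_y_integrand, a**3, b**3)
assert abs(I_x - I_y) < 1e-5
```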
-
- My point is that we get in trouble defining the *limiting* entropy
- in the continuous case. Measured continuously, the limit is minus
- infinity; however, that corresponds to the fact that the continuous
- distribution is becoming discrete---somehow we need to "subtract an
- infinite constant" to make sense of the limit. As long as the energy
- minimum is non-degenerate, we could hope that this minus infinity
- "renormalizes" to zero, because in that case, the distribution is
- approaching a single-point discrete distribution.
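- For what it's worth, the divergence is easy to see in a one-line
- formula: a Gaussian of width sigma has continuous entropy
- (1/2) ln(2 pi e sigma^2), which runs off to minus infinity as the
- distribution collapses toward a point mass. A sketch (Python; the
- sequence of widths is arbitrary):

```python
import math

def gaussian_diff_entropy(sigma):
    """Differential entropy of N(mu, sigma^2): (1/2) ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

# As the distribution narrows toward a point mass (sigma -> 0),
# the continuous entropy decreases without bound.
widths = [1.0, 1e-3, 1e-6, 1e-9]
entropies = [gaussian_diff_entropy(s) for s in widths]
assert all(a > b for a, b in zip(entropies, entropies[1:]))
```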
-
- >> P.S. Please criticize my *definitions* first, before
- >> flaming my conclusions!
- >
- > I did, and I hope you're satisfied. :-)
-
- Thank you for your interesting comments!
-
- --------------------------------------------------------------------
- Paul Burchard <burchard@geom.umn.edu>
- ``I'm still learning how to count backwards from infinity...''
- --------------------------------------------------------------------
-