- Path: sparky!uunet!biosci!buc.edu.au!kjm
- From: kjm@buc.edu.au (Kevin Moore)
- Newsgroups: bionet.info-theory
- Subject: Re: A Mathematical Fireside Chat at 300K about 0K
- Message-ID: <9211180412.AA11952@eureka.buc.edu.au>
- Date: 18 Nov 92 20:12:34 GMT
- Sender: daemon@net.bio.net
- Distribution: bionet
- Lines: 103
-
-
- In article <9211171601.AA15615@net.bio.net>
- burchard@geom.umn.edu (Paul Burchard) writes
-
- > The other thing that proved to be kindling was my definition
- > of temperature as average energy per state. Here the disagreement
- > is more interesting:
- >
- > >> Note that T also shows up as the exponential decay rate of p(E).
- > >> This suggests an alternative definition of temperature, which
- > >> allows the possibility of "negative absolute temperature". We
- > >> ignore this definition.
- > >
- > > Why? That T corresponds to the integrating factor in
- > > dS = dQ/T. It therefore corresponds to the thermodynamic
- > > definition.
- >
- > That's an interesting point. Could you explain this a bit further?
- > The correspondence isn't immediately obvious to me...
-
- Okay... When you maximise information entropy with an energy constraint,
-
- <E> = \sum_i p_i E_i
-
- in addition to normalisation, you naturally obtain the canonical
- distribution,
-
-     p_i = 1/Z \exp(-\beta E_i),
-
- where \beta is the Lagrange multiplier corresponding to the energy
- constraint, and Z is the partition function. We can write the expected
- energy as
-
- <E> = -\frac{\partial}{\partial\beta} \log Z
-
- The information entropy of this distribution is
-
- S_I = \log Z(\beta) + \beta <E>
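-
- (Aside: if you want to see these identities with numbers rather than symbols,
- here is a rough Python sketch. The five level energies and the value of \beta
- are made up purely for illustration; it checks <E> = -d(log Z)/d\beta by a
- finite difference and S_I = log Z + \beta<E> directly.)
-
- import numpy as np
-
- # Toy spectrum and inverse temperature (arbitrary illustrative values).
- E = np.array([0.0, 0.3, 0.7, 1.0, 1.6])
- beta = 1.2
-
- def logZ(beta, E):
-     """log of the partition function Z(beta) = sum_i exp(-beta * E_i)."""
-     return np.log(np.sum(np.exp(-beta * E)))
-
- # Canonical distribution p_i = exp(-beta * E_i) / Z.
- p = np.exp(-beta * E - logZ(beta, E))
-
- E_mean = np.sum(p * E)                      # <E> = sum_i p_i E_i
- S_I = -np.sum(p * np.log(p))                # information entropy
-
- # Check <E> = -d(log Z)/d(beta) by a central finite difference.
- h = 1e-6
- dlogZ = (logZ(beta + h, E) - logZ(beta - h, E)) / (2 * h)
- print(E_mean, -dlogZ)                       # these should agree
-
- # Check S_I = log Z + beta * <E>.
- print(S_I, logZ(beta, E) + beta * E_mean)   # these should agree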
-
- Now the E_i may depend on externally measurable parameters, such as volume,
- magnetisation, stress, etc. So for each parameter q_r, we have a
- generalised force,
-
- F_r = <\frac{\partial E}{\partial q_r}>
-
-     = \sum_i p_i \frac{\partial E_i}{\partial q_r}
-
-     = \frac{-1}{\beta}\frac{\partial}{\partial q_r} \log Z
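-
- (Again a toy check rather than part of the argument: take a made-up parameter
- dependence E_i(q) = q*i, so \partial E_i/\partial q = i, and verify that the
- two expressions for the generalised force agree.)
-
- import numpy as np
-
- # Toy levels depending on one external parameter q: E_i(q) = q * i.
- def levels(q):
-     return q * np.arange(5)
-
- def logZ(beta, q):
-     return np.log(np.sum(np.exp(-beta * levels(q))))
-
- beta, q, h = 1.2, 0.8, 1e-6
- p = np.exp(-beta * levels(q) - logZ(beta, q))
-
- # Generalised force, computed two ways:
- F_direct = np.sum(p * np.arange(5))   # sum_i p_i dE_i/dq, with dE_i/dq = i
- F_from_Z = -(logZ(beta, q + h) - logZ(beta, q - h)) / (2 * h * beta)
- print(F_direct, F_from_Z)             # these should agree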
-
- (Almost there, guys) Now if we write the differential of the information
- entropy,
-
- dS_I = d\log Z(\beta,q_r) + \beta d<E> + <E> d\beta
-
- = \frac{\partial\log Z}{\partial\beta} d\beta
- +\sum_r \frac{\partial\log Z}{\partial q_r} dq_r
- +\beta d<E> + <E> d\beta
-
- (since \partial\log Z/\partial\beta = -<E>, the d\beta terms cancel, and
- \partial\log Z/\partial q_r = -\beta F_r from the force relation above, leaving)
-
-     = -\beta\sum_r F_r dq_r + \beta d<E>
-
- and rearrange, we get:
-
- d<E> = \frac{dS_I}{\beta} + \sum_r F_r dq_r
-
- = \frac{dS_I}{\beta} + dW
-
- We are now in a position to identify this as the first law, and the first
- term as the heat,
-
-     dQ = T dS,
-
- so \beta = 1/kT is identified with the thermodynamic temperature, and S = kS_I
- with the thermodynamic entropy.
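-
- (One more toy check, with the same made-up E_i(q) = q*i spectrum: perturb
- \beta and q by small amounts and confirm d<E> = dS_I/\beta + F dq to first
- order.)
-
- import numpy as np
-
- def ensemble(beta, q):
-     """Return <E>, S_I and F = <dE/dq> for the toy spectrum E_i(q) = q*i."""
-     E = q * np.arange(5)
-     p = np.exp(-beta * E)
-     p /= p.sum()
-     return np.sum(p * E), -np.sum(p * np.log(p)), np.sum(p * np.arange(5))
-
- beta, q = 1.2, 0.8
- dbeta, dq = 1e-4, 1e-4                # small changes in beta and q
-
- E0, S0, F0 = ensemble(beta, q)
- E1, S1, _  = ensemble(beta + dbeta, q + dq)
-
- # First law in the form derived above: d<E> = dS_I/beta + F dq.
- print(E1 - E0, (S1 - S0) / beta + F0 * dq)   # should agree to first order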
-
- >
- > > As I said earlier, the average energy definition is a BAD
- > > one. In fact it only works for classical particles, with
- > > no exchange interaction. All the particles being in the
- > > ground state is 0K ($\beta = 1/kT$ is infinite).
- >
- > My definition was E per state, not per particle.
-
- Point taken, but it still doesn't work, and the counterexample is still the
- Fermi-Dirac case: at T = 0 a Fermi gas sits in its ground state with every
- single-particle state below the Fermi energy occupied, so the average energy
- per (occupied) state is of order the Fermi energy, not zero.
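-
- (To put numbers on it, a rough sketch with a flat band of made-up
- single-particle levels and an arbitrarily chosen \mu: as \beta -> infinity the
- Fermi-Dirac occupations become a step at \mu, and the mean energy of the
- occupied states tends to a finite value, not zero.)
-
- import numpy as np
- from scipy.special import expit   # numerically stable logistic function
-
- eps = np.linspace(0.0, 2.0, 201)  # toy single-particle energies
- mu = 1.0                          # chemical potential ~ Fermi energy
-
- def occupation(eps, beta, mu):
-     """Fermi-Dirac occupation n(eps) = 1/(exp(beta*(eps-mu)) + 1)."""
-     return expit(-beta * (eps - mu))
-
- for beta in [1.0, 10.0, 1000.0]:  # beta -> infinity is T -> 0
-     n = occupation(eps, beta, mu)
-     print(beta, np.sum(n * eps) / np.sum(n))  # mean energy per occupied state
-
- # The last value is roughly mu/2 = 0.5, not 0, even though beta is huge:
- # an "average energy" reading of temperature would not go to zero at 0K.
-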
- >
- > In any case, the REAL test of which temperature definition is
- > correct must be settled by checking which one gives the correct
- > direction of heat flow when two systems of different temperature
- > are "weakly coupled". I would think that the average energy per
- > state would equalize between the two---are you claiming that the
- > exponential decay rates of the energy distribution equalize
- > instead, in the case when these two definitions disagree?
- >
-
- Given the correspondence with the thermodynamic temperature just
- demonstrated, yes.
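-
- (A toy illustration of that claim, with two made-up spectra: weakly couple a
- two-level system A to a denser system B at a common \beta. The joint canonical
- distribution factorises, so both marginals decay at the SAME rate \beta, while
- their average energies per state come out quite different.)
-
- import numpy as np
-
- E_A = np.array([0.0, 1.0])            # two-level system
- E_B = np.linspace(0.0, 1.0, 11)       # denser spectrum
- beta = 2.0
-
- # Joint canonical distribution p(i,j) ~ exp(-beta*(E_A[i] + E_B[j])).
- joint = np.exp(-beta * (E_A[:, None] + E_B[None, :]))
- joint /= joint.sum()
-
- p_A = joint.sum(axis=1)               # marginal distribution of A
- p_B = joint.sum(axis=0)               # marginal distribution of B
-
- # Exponential decay rate of each marginal: both come out equal to beta.
- print(-(np.log(p_A[1]) - np.log(p_A[0])) / (E_A[1] - E_A[0]))
- print(-(np.log(p_B[1]) - np.log(p_B[0])) / (E_B[1] - E_B[0]))
-
- # Average energy per state, however, differs between the two subsystems.
- print(np.sum(p_A * E_A), np.sum(p_B * E_B))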
-
- Kevin Moore,
-
- kjm@eureka.buc.edu.au
-
-
- Typographical errors notwithstanding, of course.
-