Transcript 15-18
Dr Roger Bennett
[email protected]
Rm. 23 Xtn. 8559
Lecture 15
Free expansion
• No temperature change means no change in kinetic
energy distribution.
• The only physical difference is that the atoms have
more space in which to move.
• We may imagine that there are more ways in which
the atoms may be arranged in the larger volume.
• Statistical mechanics takes this viewpoint and
analyses how many different states are possible that
give rise to the same macroscopic properties.
Statistical View
• The constraints on the system (U, V and n) define the
macroscopic state of the system (macrostate).
• We need to know how many microscopic states
(microstates or quantum states) satisfy the macrostate.
• A microstate for a system is one for which everything
that can in principle be known is known.
• The number of microstates that give rise to a macrostate
is called the thermodynamic probability, Ω, of that
macrostate (alternatively the statistical weight, W).
• The largest thermodynamic probability dominates.
• The essential assumption of statistical mechanics is that
each microstate is equally likely.
Statistical View
• Boltzmann’s Hypothesis:
• The entropy is a function of the statistical weight or
thermodynamic probability: S = φ(W).
• If we have two systems A and B with entropies S_A and
S_B respectively, then we expect the total entropy of
the two systems to be S_AB = S_A + S_B (entropy is extensive).
• Now think about the probabilities.
• W_AB = W_A W_B
• So S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B)
Statistical View
• Boltzmann’s Hypothesis:
• S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B)
• The only functions that behave like this are logarithms.
• S = k ln(W), the Boltzmann relation.
• The microscopic viewpoint thus interprets the increase in
entropy for an isolated system as a consequence of the
natural tendency of the system to move from a less
probable to a more probable state.
Expansion of an ideal gas - microscopic
• Expansion of ideal gas contained in volume V.
• U and T are unchanged; no work is done and no heat
flows.
• Entropy increases – what is the physical basis?
Expansion of an ideal gas - microscopic
• Split the volume into elemental cells of size ΔV.
• The number of ways of placing one atom in the volume
is V/ΔV.
• The number of ways of placing n atoms is
– W = (V/ΔV)^n, so S = nk ln(V/ΔV)
– Is this right? It depends on the size of ΔV.
Expansion of an ideal gas - microscopic
• Is this right? It depends on the size of ΔV.
• Yes, because we only ever measure changes in entropy, and ΔV cancels:
• S_f − S_i = nk(ln(V_f/ΔV) − ln(V_i/ΔV)) = nk ln(V_f/V_i)
• Doubling the volume gives ΔS = nk ln(2) = NR ln(2) (for N moles of gas).
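The ΔV cancellation above is easy to verify numerically. A minimal sketch in Python (the mole number and volumes here are illustrative, not from the lecture; only the ratio V_f/V_i matters):

```python
import math

k = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23    # Avogadro constant, 1/mol

n_atoms = N_A          # illustrative: one mole of gas
V_i, V_f = 1.0, 2.0    # arbitrary units; the elemental cell size dV cancels

delta_S = n_atoms * k * math.log(V_f / V_i)
print(delta_S)         # ~5.76 J/K, i.e. R ln 2 for one mole
```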
Statistical mechanics
• We have seen that the entropy of a system is
related to the probability of its state: entropy is a
statistical phenomenon.
• To calculate thermal properties we must combine
our knowledge of the particles that make up a
system with statistical properties.
• Statistical mechanics starts from conceptually
simple ideas but evolves into a powerful and
general tool.
• The first and cornerstone concept is a clear
understanding of probability.
Probability
• Two common versions
– Classical Probability: the ability to predict
the likely outcome of an experiment.
– Statistical Probability: by repeated
measurement we can determine the
probability of an experimental outcome by
measuring its frequency of occurrence.
This relies on the system being in equilibrium
and on the existence of well-defined
frequencies.
Classical Probability
• Classical Probability
– Must determine all the possible outcomes and assign
equal probabilities to each.
– Why equal probabilities? Surely not all outcomes are equal?
– We ensure this is the case by looking at the system
in the finest possible detail, such that each outcome
corresponds to a simple event.
– By definition no further refinement would enable us
to define the properties of the state in any finer
detail. This is the microstate or quantum state of the
system.
– We have already done this in the example above, by boxing
atoms of a gas in small volumes ΔV.
Example of Classical Probability
• Heads and Tails Coin Toss – best of 3.
• Possible outcomes
– HHH, THH, HTH, HHT, TTH, THT, HTT, TTT
– Each one of these is a microstate.
• Probability of exactly two Heads?
• 3 microstates have exactly two heads in a total of 8
possible outcomes, so the probability is 3/8.
• This is easy - we can count the number of
microstates.
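Because the microstates can be counted, the result is easy to check by brute-force enumeration; a minimal sketch:

```python
from itertools import product

# All 2^3 = 8 microstates of three coin tosses.
microstates = list(product("HT", repeat=3))
two_heads = [s for s in microstates if s.count("H") == 2]
print(len(two_heads), len(microstates))   # 3 8
print(len(two_heads) / len(microstates))  # 0.375 = 3/8
```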
Example of Classical Probability
• Heads and Tails Coin Toss – best of 30.
– What is probability of 30 heads?
– Only one microstate has 30 H, but how many
microstates are possible in total? 2^30, about 10^9.
– The probability of H on each toss is ½, so for
30 tosses P_30H = (½)^30.
– What is the probability of 5 Heads and 25 Tails?
– How many ways can this occur, i.e. how many
microstates?
$$\binom{30}{5} = \frac{30!}{5!\,25!} = 142{,}506$$
Example of Classical Probability
• Heads and Tails Coin Toss – best of 30.
– What is the probability of 5 Heads and 25 Tails?
– How many ways can this occur, i.e. how many
microstates?
$$\binom{30}{5} = \frac{30!}{5!\,25!} = 142{,}506$$
– Each microstate is equally likely.
– Probability = 142,506 × (½)^30 ≈ 1.3×10^−4
• Prob. of 15 H and 15 T = 155,117,520 × (½)^30 ≈ 0.14
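These numbers come straight from the binomial coefficient; a sketch using Python's math.comb:

```python
from math import comb

total = 2 ** 30                 # total number of microstates for 30 tosses
for heads in (30, 5, 15):
    W = comb(30, heads)         # number of microstates with this many heads
    print(heads, W, W / total)
# 30 heads:           1 microstate,   ~9.3e-10
#  5 heads:     142,506 microstates,  ~1.3e-4
# 15 heads: 155,117,520 microstates,  ~0.14
```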
Microstates in a configuration
• No. of microstates in a configuration where
particles are distinguishable.
$$W = \frac{N!}{\prod_i n_i!}$$
• Where N is the total number of
particles/events/options etc.
• n_i is the number of particles in the ith distinct state.
• ∏ means product (cf. Σ for sum).
• E.g. how many distinct anagrams of STATISTICS?
$$W = \frac{10!}{3!\,3!\,2!} = 50{,}400$$
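The formula W = N!/∏ n_i! generalises directly; a minimal sketch of a counter for arrangements of repeated items:

```python
from math import factorial
from collections import Counter

def statistical_weight(word: str) -> int:
    # W = N! / prod(n_i!), where the ith distinct letter appears n_i times
    W = factorial(len(word))
    for n_i in Counter(word).values():
        W //= factorial(n_i)
    return W

print(statistical_weight("STATISTICS"))  # 50400 (S:3, T:3, I:2, A:1, C:1)
```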
Dr Roger Bennett
[email protected]
Rm. 23 Xtn. 8559
Lecture 16
Microstates in a configuration
• No. of microstates in a configuration where
particles are distinguishable.
$$W = \frac{N!}{\prod_i n_i!}$$
• Where N is the total number of
particles/events/options etc.
• n_i is the number of particles in the ith distinct state.
• ∏ means product (cf. Σ for sum).
• E.g. how many distinct anagrams of STATISTICS?
$$W = \frac{10!}{3!\,3!\,2!} = 50{,}400$$
Equilibrium
• Take an isolated system which is partitioned into two
subsystems.
• U = U1 + U2
• V = V1 + V2
• N = N1 + N2
[Figure: an isolated box partitioned into subsystem 1 (U1, V1, N1) and subsystem 2 (U2, V2, N2).]
• The statistical weight W of the entire system (total
number of microstates) is the product of the weights of
the subsystems.
• W(U,V,N,U1,V1,N1) = W1(U1,V1,N1) × W2(U2,V2,N2)
• S(U,V,N,U1,V1,N1) = S1(U1,V1,N1) + S2(U2,V2,N2)
Equilibrium
• S(U,V,N,U1,V1,N1) = S1(U1,V1,N1) + S2(U2,V2,N2)
• Now let us exchange energy through the wall
(the same methodology works for exchange of
particles or volume).
• Use the Clausius entropy principle (at equilibrium
entropy is a maximum); the independent variable is
U1, holding all other terms fixed.
$$\left(\frac{\partial S}{\partial U_1}\right)_{U,V,N,V_1,N_1} = \left(\frac{\partial S_1}{\partial U_1}\right)_{V_1,N_1} + \left(\frac{\partial S_2}{\partial U_2}\right)_{V_2,N_2}\frac{dU_2}{dU_1} = 0$$
Equilibrium
$$\left(\frac{\partial S}{\partial U_1}\right)_{U,V,N,V_1,N_1} = \left(\frac{\partial S_1}{\partial U_1}\right)_{V_1,N_1} + \left(\frac{\partial S_2}{\partial U_2}\right)_{V_2,N_2}\frac{dU_2}{dU_1} = 0$$
• Since U = U1 + U2 is fixed, dU2/dU1 = −1, leaving
$$\left(\frac{\partial S_1}{\partial U_1}\right)_{V_1,N_1} = \left(\frac{\partial S_2}{\partial U_2}\right)_{V_2,N_2}$$
• This is the condition for thermal equilibrium: the two
subsystems must be at the same temperature.
• We can now define an absolute temperature for each
subsystem i. At equilibrium all subsystems are at the same
temperature.
$$\left(\frac{\partial S_i}{\partial U_i}\right)_{V_i,N_i} = \frac{1}{T_i}$$
Example – The Schottky Defect
• At absolute zero all atoms in a crystal are
perfectly ordered on a crystal lattice.
• Raising the temperature introduces point defects
• Schottky defects are atoms displaced from the
lattice that end up on the surface leaving
vacancies
Schottky Defect
• What is the concentration of defects in a crystal at
thermal equilibrium?
• Creation of a defect costs energy ε.
• The energy U associated with n of these defects is U = nε.
• Assumptions?
• Defects are dilute and so do not interact.
• We can now use our understanding of probability to
investigate the configurational entropy.
Schottky Defects
• Configurational entropy – how many ways to distribute
n defects in a crystal of N atoms?
• We can calculate the number of microstates from
$$W = \frac{N!}{\prod_i n_i!} = \frac{N!}{(N-n)!\,n!}$$
$$S(n) = k \ln W = k \ln\!\left(\frac{N!}{(N-n)!\,n!}\right)$$
Schottky Defects
• At equilibrium, and remembering U = nε,
$$\frac{1}{T} = \frac{\partial S}{\partial U} = \frac{\partial S(n)}{\partial n}\frac{dn}{dU} = \frac{1}{\varepsilon}\frac{\partial S(n)}{\partial n}$$
• This leaves us with just the differential of the entropy to
calculate. As crystals contain large numbers of atoms we can
approximate the factorial functions with Stirling's
formula:
$$\ln N! \approx N \ln N - N$$
Schottky Defects
$$\ln N! \approx N \ln N - N$$
$$\frac{1}{T} = \frac{1}{\varepsilon}\frac{\partial S(n)}{\partial n}, \qquad S(n) = k \ln W = k \ln\!\left(\frac{N!}{(N-n)!\,n!}\right)$$
$$S(n) \approx k\left[N \ln N - n \ln n - (N-n)\ln(N-n)\right]$$
$$\frac{\partial S(n)}{\partial n} = k\left[\ln(N-n) - \ln n\right]$$
$$\frac{1}{T} = \frac{1}{\varepsilon}\frac{\partial S(n)}{\partial n} = \frac{k}{\varepsilon}\ln\!\left(\frac{N-n}{n}\right)$$
Schottky Defects
$$\frac{1}{T} = \frac{k}{\varepsilon}\ln\!\left(\frac{N-n}{n}\right)$$
$$\frac{n}{N} = \frac{1}{e^{\varepsilon/kT} + 1}, \qquad n \approx N e^{-\varepsilon/kT} \quad (\varepsilon \gg kT)$$
• For typical values of ε = 1 eV, and kT at room temperature ~
1/40 eV, we find the density of Schottky defects to be
n/N = e^−40 ≈ 4×10^−18, i.e. of order 10^−17. At 1000 K, n/N ≈ 9×10^−6.
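These concentrations follow directly from the result above; a minimal sketch using the exact Boltzmann constant rather than the kT ≈ 1/40 eV round number, so the 300 K value differs slightly from the e^−40 estimate:

```python
import math

k_eV = 8.617e-5        # Boltzmann constant, eV/K
epsilon = 1.0          # defect formation energy, eV (typical value from the lecture)

for T in (300.0, 1000.0):
    x = epsilon / (k_eV * T)
    exact = 1.0 / (math.exp(x) + 1.0)  # n/N = 1/(e^(eps/kT) + 1)
    dilute = math.exp(-x)              # dilute limit, n/N ~ e^(-eps/kT)
    print(f"T = {T:.0f} K: n/N = {exact:.1e} (dilute limit {dilute:.1e})")
# T = 300 K:  n/N ~ 1.6e-17
# T = 1000 K: n/N ~ 9.1e-06
```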
Dr Roger Bennett
[email protected]
Rm. 23 Xtn. 8559
Lecture 17
Equilibrium for an isolated system
• Take an isolated system which is partitioned into two
subsystems.
• U = U1 + U2
• V = V1 + V2
• N = N1 + N2
[Figure: an isolated box partitioned into subsystem 1 (U1, V1, N1) and subsystem 2 (U2, V2, N2).]
• The statistical weight W of the entire system (total
number of microstates) is the product of the weights of
the subsystems.
• W(U,V,N,U1,V1,N1) = W1(U1,V1,N1) × W2(U2,V2,N2)
• S(U,V,N,U1,V1,N1) = S1(U1,V1,N1) + S2(U2,V2,N2)
Equilibrium for an isolated system
• S(U,V,N,U1,V1,N1) = S1(U1,V1,N1) + S2(U2,V2,N2)
• Now let us allow the wall to move, changing the
volumes so as to maximise entropy (same methodology
as before).
• Use the Clausius entropy principle (at equilibrium
entropy is a maximum); the independent variable is
V1, holding all other terms fixed.
$$\left(\frac{\partial S}{\partial V_1}\right)_{U,V,N,N_1,U_1} = \left(\frac{\partial S_1}{\partial V_1}\right)_{N_1,U_1} + \left(\frac{\partial S_2}{\partial V_2}\right)_{N_2,U_2}\frac{dV_2}{dV_1} = 0$$
Equilibrium for an isolated system
• Since V = V1 + V2 is fixed, dV2/dV1 = −1, so
$$\left(\frac{\partial S_1}{\partial V_1}\right)_{U_1,N_1} = \left(\frac{\partial S_2}{\partial V_2}\right)_{U_2,N_2}$$
• This is the condition for mechanical equilibrium: the two subsystems
must be at the same pressure, as the wall has moved to
maximise entropy.
• We can now define pressure:
$$\frac{P_i}{T_i} = \left(\frac{\partial S_i}{\partial V_i}\right)_{U_i,N_i}$$
Equilibrium for an isolated system
• S(U,V,N,U1,V1,N1) = S1(U1,V1,N1) + S2(U2,V2,N2)
• Now let us exchange particles through the wall
(same methodology as before).
• Use the Clausius entropy principle (at equilibrium
entropy is a maximum); the independent variable is
N1, holding all other terms fixed.
$$\left(\frac{\partial S}{\partial N_1}\right)_{U,V,N,V_1,U_1} = \left(\frac{\partial S_1}{\partial N_1}\right)_{V_1,U_1} + \left(\frac{\partial S_2}{\partial N_2}\right)_{V_2,U_2}\frac{dN_2}{dN_1} = 0$$
Equilibrium for an isolated system
• Since N = N1 + N2 is fixed, dN2/dN1 = −1, so
$$\left(\frac{\partial S_1}{\partial N_1}\right)_{U_1,V_1} = \left(\frac{\partial S_2}{\partial N_2}\right)_{U_2,V_2}$$
• This is the condition for particle equilibrium: the two
subsystems must have no driving force to exchange particles.
• We can now define the driving force to exchange particles
as the chemical potential (with the conventional minus sign):
$$\frac{\mu_i}{T_i} = -\left(\frac{\partial S_i}{\partial N_i}\right)_{U_i,V_i}$$
Equilibrium for system in a heat bath
• Take a system which is partitioned into two subsystems;
the same setup as before so far.
• U0 = U + UR
[Figure: the system (U, V, N) embedded in a reservoir (TR, VR, NR); the combined system is isolated.]
• The combined system is again totally isolated and we
assume T,V,N describe the macrostate of the system.
• The system will possess a discrete set of microstates,
however, which we could group and label according to
the energy of that microstate.
Equilibrium for system in a heat bath
• By grouping the microstates by
energy we can associate a
statistical weight with each energy
level, i.e. U1 < U2 < U3 < U4 … < Ur.
• The total energy of the composite
system is conserved, so
U0 = U + UR.
[Figure: the system (U, V, N) in contact with a heat bath at temperature T.]
• The probability of finding our system with U = Ur must be
proportional to the number of microstates associated with
the reservoir having energy UR = U0 − Ur.
• pr = const × W(U0 − Ur) (all volumes and particle numbers
constant)
Equilibrium for system in a heat bath
• pr = const × W(U0 − Ur)
• The constant of proportionality must depend only on all
the available microstates, and so the distribution can be properly
normalised:
$$p_r = \frac{W(U_0 - U_r)}{\sum_r W(U_0 - U_r)}$$
• We can also write W(U0 − Ur) in terms of entropy:
$$W(U_0 - U_r) = e^{S(U_0 - U_r)/k}$$
$$p_r = \text{const} \times W(U_0 - U_r) = \text{const} \times e^{S(U_0 - U_r)/k}$$
Equilibrium for system in a heat bath
$$p_r = \text{const} \times W(U_0 - U_r) = \text{const} \times e^{S(U_0 - U_r)/k}$$
• So far we haven't used the fact that the reservoir is a
heat bath. Its energy U0 is much greater than our system's
energy Ur.
• This is not true for all states r, but it is true for all
overwhelmingly likely states!
• We expand S(U0 − Ur) as a Taylor series:
$$\frac{1}{k}S(U_0 - U_r) = \frac{1}{k}S(U_0) - \frac{U_r}{k}\frac{\partial S(U_0)}{\partial U_0} + \frac{1}{2}\frac{U_r^2}{k}\frac{\partial^2 S(U_0)}{\partial U_0^2} - \dots$$
Equilibrium for system in a heat bath
$$\frac{1}{k}S(U_0 - U_r) = \frac{1}{k}S(U_0) - \frac{U_r}{k}\frac{\partial S(U_0)}{\partial U_0} + \frac{1}{2}\frac{U_r^2}{k}\frac{\partial^2 S(U_0)}{\partial U_0^2} - \dots$$
• The first term is simple.
• The second term is related to the temperature, as before,
through:
$$\left(\frac{\partial S_i}{\partial U_i}\right)_{V_i,N_i} = \frac{1}{T_i}$$
• The third term therefore describes changes in the
temperature of the heat bath due to energy exchange
with the system. By definition of a heat bath this and higher
terms must be negligible. We keep terms up to linear in Ur.
Equilibrium for system in a heat bath
$$p_r = \text{const} \times W(U_0 - U_r) = \text{const} \times e^{S(U_0 - U_r)/k}$$
$$\frac{1}{k}S(U_0 - U_r) \approx \frac{1}{k}S(U_0) - \frac{U_r}{kT}$$
$$p_r = \text{const} \times e^{S(U_0)/k - U_r/kT} = \frac{e^{S(U_0)/k}\,e^{-U_r/kT}}{\sum_r e^{S(U_0)/k}\,e^{-U_r/kT}} = \frac{e^{-U_r/kT}}{\sum_r e^{-U_r/kT}}$$
$$Z = \sum_r e^{-U_r/kT}, \qquad p_r = \frac{1}{Z}e^{-U_r/kT}$$
The Boltzmann Distribution
$$p_r = \frac{1}{Z}e^{-U_r/kT}$$
• This is the Boltzmann distribution and gives
“the probability that a system in contact with
a heat bath at temperature T should be in a
particular state”.
• The only property of the heat bath on which
it depends is the temperature.
• The function Z is called the partition function
of the system. It is fundamental to the study
of systems at fixed temperature.
Dr Roger Bennett
[email protected]
Rm. 23 Xtn. 8559
Lecture 18
The Boltzmann Distribution
$$p_r = \frac{1}{Z}e^{-U_r/kT}, \qquad Z = \sum_r e^{-U_r/kT}$$
• This is the Boltzmann distribution and gives “the
probability that a system in contact with a heat bath
at temperature T should be in a particular state”.
• r labels all the states of the system. At low
temperature only the lowest states have any chance
of being occupied. As the temperature is raised
higher lying states become more and more likely to
be occupied.
• In contact with the heat bath, therefore, the
microstates are not all equally likely to be
populated.
The Boltzmann Distribution - Example
$$p_r = \frac{1}{Z}e^{-U_r/kT}, \qquad Z = \sum_r e^{-U_r/kT}$$
• Take a very simple system that has only three
energy levels, each corresponding to one
microstate (non-degenerate).
• The energies of these states are:
– U1 = 0 J, U2 = 1.4×10^−23 J, U3 = 2.8×10^−23 J
• If the heat bath has a temperature of 2 K:
$$Z = e^0 + e^{-1/2} + e^{-1} = 1.9744$$
• The probabilities of being in each state are p1 = 0.506,
p2 = 0.307 and p3 = 0.186.
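This arithmetic is easy to reproduce; a minimal sketch, using kT ≈ 2.8×10^−23 J at 2 K, the rounding implicit in the slide's Z = e^0 + e^(−1/2) + e^(−1):

```python
import math

kT = 2.8e-23                        # J; k*T for T ~ 2 K, rounded as in the slide
U = [0.0, 1.4e-23, 2.8e-23]         # energies of the three microstates, J

weights = [math.exp(-u / kT) for u in U]  # Boltzmann factors e^(-U_r/kT)
Z = sum(weights)                          # partition function
p = [w / Z for w in weights]
print(round(Z, 4), [round(x, 3) for x in p])
# 1.9744 [0.506, 0.307, 0.186]
```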
The Boltzmann Distribution
$$p_r = \frac{1}{Z}e^{-U_r/kT}, \qquad Z = \sum_r e^{-U_r/kT}$$
• Usually there are huge numbers of microstates that can
all have the same energy. This is called degeneracy.
• In this case we can do our summations above over each
individual energy level rather than sum over each
individual microstate.
$$p(U_r) = \frac{1}{Z}g(U_r)e^{-U_r/kT}, \qquad Z = \sum_{U_r} g(U_r)e^{-U_r/kT}$$
• The summation is now over all the different energies Ur,
and g(Ur) is the number of states possessing the energy
Ur. The probability is that of finding the system with
energy Ur.
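The same machinery works when summing over levels weighted by degeneracy; a minimal sketch with illustrative degeneracies (the g values are not from the lecture):

```python
import math

kT = 2.8e-23                                      # J, as in the three-level example
levels = [(0.0, 1), (1.4e-23, 3), (2.8e-23, 5)]   # (energy U_r, degeneracy g(U_r))

Z = sum(g * math.exp(-u / kT) for u, g in levels)  # Z = sum g(U_r) e^(-U_r/kT)
for u, g in levels:
    p = g * math.exp(-u / kT) / Z                  # probability of finding energy U_r
    print(f"U = {u:.1e} J: p = {p:.3f}")
```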
Entropy in ensembles
• Our system embedded in a heat bath is called a
canonical ensemble (our isolated system on its own
from Lecture 16 is termed a microcanonical
ensemble).
• When isolated the microcanonical ensemble has a
defined internal energy so that the probability of
finding a system in a particular microstate is the
same as any other microstate.
• In a heat bath the energy of the system fluctuates
and the probability of finding any particular
microstate is not equal. Can we now calculate the
entropy for such a system and hence derive
thermodynamic variables from statistical properties?
Entropy in the canonical ensemble
• Embed our system in a heat
bath made up of (M−1)
replica subsystems to the
one we're interested in.
• Each subsystem may be in
one of many microstates.
The number of subsystems
in the ith microstate is ni.
• The number of ways of arranging n1 systems in
µstate 1, n2 systems in µstate 2, n3 … is
$$W = \frac{M!}{\prod_i n_i!}$$
Entropy in the canonical ensemble
• If we make M huge, so that
all ni are also large, then we
can (eventually) use
Stirling's approximation in
calculating the entropy for
the entire ensemble of M
systems, SM:
$$S_M = k \ln W = k \ln\!\left(\frac{M!}{\prod_i n_i!}\right) = k \ln M! - k\sum_i \ln n_i!$$
Entropy in the canonical ensemble
$$S_M = k \ln M! - k\sum_i \ln n_i!$$
$$S_M = k\left[M \ln M - M - \sum_i \left(n_i \ln n_i - n_i\right)\right]$$
$$S_M = k\left[M \ln M - \sum_i n_i \ln n_i\right] \qquad \left(\text{since } \textstyle\sum_i n_i = M\right)$$
$$S_M = k\left[\sum_i n_i \ln M - \sum_i n_i \ln n_i\right]$$
$$S_M = -kM \sum_i \frac{n_i}{M}\ln\!\left(\frac{n_i}{M}\right)$$
Entropy in the canonical ensemble
$$S_M = -kM \sum_i \frac{n_i}{M}\ln\!\left(\frac{n_i}{M}\right)$$
• As M becomes very large the ratios ni/M tend to pi,
the probability of finding the subsystem in state i.
• SM is the entropy for the ensemble of all the
subsystems. But we know that entropy is
extensive and scales with the size of the system,
so the entropy per system is:
$$S = \frac{S_M}{M} = -k\sum_i p_i \ln p_i$$
Entropy in the canonical ensemble
$$S = -k\sum_i p_i \ln p_i$$
• This is the general definition of entropy, and it holds even if
the probabilities of the individual microstates are
different.
• If all microstates are equally probable, pi = 1/W
(the microcanonical ensemble):
$$S = -k\sum_{i=1}^{W} p_i \ln p_i = -k\sum_{i=1}^{W} \frac{1}{W}\ln\!\left(\frac{1}{W}\right) = k \ln W$$
• Which brings us nicely back to the Boltzmann relation
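That reduction is easy to check numerically; a minimal sketch comparing the general formula with k ln W for equal probabilities:

```python
import math

k = 1.380649e-23   # J/K

def gibbs_entropy(probs):
    # S = -k * sum(p_i ln p_i); states with p_i = 0 contribute nothing
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8                                   # number of equally likely microstates
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))           # equals k ln W ...
print(k * math.log(W))                  # ... the Boltzmann relation
```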
Entropy in the canonical ensemble
$$S = -k\sum_i p_i \ln p_i, \qquad p_i = \frac{1}{Z}e^{-U_i/kT}, \qquad Z = \sum_i e^{-U_i/kT}$$
• The general definition of entropy, in combination
with the Boltzmann distribution, allows us to
calculate real properties of the system:
$$\ln p_i = -\frac{U_i}{kT} - \ln Z$$
$$S = -k\sum_i p_i \ln p_i = k\sum_i p_i\left(\frac{U_i}{kT} + \ln Z\right)$$
$$S = \frac{1}{T}\sum_i p_i U_i + k \ln Z \sum_i p_i = \frac{\bar{U}}{T} + k \ln Z$$
Helmholtz Free Energy
$$S = \frac{\bar{U}}{T} + k \ln Z \quad\Rightarrow\quad \bar{U} - TS = -kT \ln Z$$
• Ū is the average value of the internal energy of the
system.
• (Ū – TS) is the average value of the Helmholtz free
energy, F. This is a function of state that we briefly
mentioned in earlier lectures. It is central to statistical
mechanics.
• The partition function Z has appeared in our result; it is
much more than a mere normalising factor.
Z acts as a bridge linking the microscopic world of
microstates (quantum states) to the free energy and
hence to all the large-scale properties of a system.
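As a closing illustration of Z as that bridge, a minimal sketch computing Ū, F and S for the three-level system of Lecture 18 and cross-checking against the general definition S = −k Σ pᵢ ln pᵢ:

```python
import math

k = 1.380649e-23                    # J/K
T = 2.0                             # K
U = [0.0, 1.4e-23, 2.8e-23]         # three-level system from Lecture 18, J

weights = [math.exp(-u / (k * T)) for u in U]
Z = sum(weights)                    # partition function
p = [w / Z for w in weights]        # Boltzmann probabilities

U_bar = sum(pi * ui for pi, ui in zip(p, U))  # average internal energy
F = -k * T * math.log(Z)                      # Helmholtz free energy, F = -kT ln Z
S = (U_bar - F) / T                           # entropy via F = U - TS

S_gibbs = -k * sum(pi * math.log(pi) for pi in p)  # general definition of entropy
print(S, S_gibbs)                   # the two entropies agree
```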