Chapter 12
Statistical Thermodynamics

Introduction to statistical mechanics
Statistical mechanics was developed alongside macroscopic
thermodynamics. Macroscopic thermodynamics has great generality,
but does not explain, in any fundamental way, why certain processes
occur. As our understanding of the molecular nature of matter
developed, this knowledge was used to obtain a deeper understanding
of thermal processes. Some uses:
1) ideal gas: very successful
2) real gases: more difficult, but some success
3) liquids: very difficult, not much success
4) crystalline solids: since they are highly organized, they can be treated successfully
5) electron gas: electrical properties of solids
6) photon gas: radiation
7) plasmas: very important
As the results of kinetic theory can be obtained from statistical
mechanics, we will not discuss kinetic theory.
Stat mech adds something very useful to thermodynamics, but does
not replace it.
Can we use our knowledge of the microscopic nature of a gas
to, say, violate the 2nd law? Maxwell investigated this possibility and
invented an intelligent being, now called a Maxwell demon, who
does just that. As an example, imagine a container with a partition
at the center which has a small trapdoor.
The demon momentarily opens the trapdoor when a fast molecule approaches it from the right. She also opens it when a slow molecule approaches it from the left. (Figure: a container with adiabatic walls and a central partition containing a trapdoor operated by the demon.) As a result the gas on the left becomes hotter and the gas on the right becomes cooler. One can then consider operating a heat engine between the two sides to produce work, violating the second law.
(Figure: a heat engine E operating between a hot reservoir H and a cold reservoir C, with the demon behind an adiabatic wall; heat $Q_H$ is drawn from H, heat $Q_C$ is exchanged with C, and the engine delivers work.)

The demon, clever lady, can keep the energy content of the cold reservoir constant (for a time). The net result is that energy is removed from a single reservoir (H) and is used to do some work. This violates the 2nd law.
Of course(?) no demon exists, but could some clever mechanical device be used? The demon must have information about the molecules if she is to operate successfully. Is there a connection between information and entropy? Yes!
The subject of information theory uses the concept of entropy.
Let us consider another example: free expansion. (Figure: a container whose left half holds the gas and whose right half is a vacuum, with the demon at the partition.) The demon removes the partition, free expansion occurs, and the entropy of the system increases. Because of the random motion of the molecules there is some probability that, at some instant, they will all be in the region initially occupied by the gas. For this to occur you would probably have to wait of order $10^{10}$ years. The demon could, at this instant, slide in the partition and we would have a decrease in the entropy of the universe. Again the demon must have some information about the location of the molecules. No such demon has been sighted.
Before starting Ch. 12 I should warn you that there are two different
types of statistics that have some similarities and have similar names.
These two types of statistics are easily confused.
(1) Maxwell-Boltzmann statistics: the "classical limit"; applies to dilute gases. The particles are indistinguishable.
(2) Boltzmann statistics: the particles are distinguishable.
Some jargon:
assembly (or system): N identical submicroscopic entities, such
as molecules.
macrostate (or configuration): number of particles in each of
the energy levels.
microstate: number of particles in each energy state.
thermodynamic probability: number of different microstates
leading to a given macrostate.
For the $k$th macrostate, the thermodynamic probability is written $w_k$.

Basic postulate: All possible microstates are equally probable.
RECALL: In statistics, independent probabilities are multiplicative. As an example, consider a true (fair) die. The probability of throwing a one is 1/6. Now if there are two dice, the probability of a one coming up on both dice is
$$\frac{1}{6}\times\frac{1}{6}=\frac{1}{36}$$
Elementary Statistics
We begin by considering 3 distinguishable coins (N, D, Q).
The possible macrostates are HHH HHT HTT TTT
Let us consider the microstates for the macrostate HHT
First head   Second head   Tail
    N             D          Q
    D             N          Q
    N             Q          D
    Q             N          D
    D             Q          N
    Q             D          N

The table shows the possible selections of coins. There are 6 ordered possibilities. However, the pairs shown are not different microstates (the order in which the two heads are selected does not matter). Hence we have 3 microstates.
More generally, suppose that we have N distinct coins and we wish to select $N_1$ heads (a particular macrostate).
There are $N$ choices for the first head.
There are $(N-1)$ choices for the second head.
There are $[N-(N_1-1)] = [N-N_1+1]$ choices for the $N_1$th head.
The thermodynamic probability ($w$) is the number of microstates for a given macrostate. We are then tempted to write
$$w = N(N-1)\cdots(N-N_1+1) = \frac{N!}{(N-N_1)!}$$
However, permuting the $N_1$ heads results in the same microstate, so
$$w = \frac{N!}{N_1!\,(N-N_1)!}$$
For the above simple example with 3 coins:
$$w = \frac{3!}{2!\,(3-2)!} = 3$$
This is the number of microstates for the HHT macrostate.
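As a cross-check on this counting (not part of the original slides), here is a minimal Python sketch, with function names of my own choosing, that evaluates $w = N!/[N_1!(N-N_1)!]$ and also counts the microstates by brute force:

```python
from math import factorial
from itertools import product

def w_binomial(N, N1):
    """Thermodynamic probability of the macrostate with N1 heads out of N coins."""
    return factorial(N) // (factorial(N1) * factorial(N - N1))

def w_brute_force(N, N1):
    """Count microstates directly by enumerating all 2^N head/tail sequences."""
    return sum(1 for seq in product("HT", repeat=N) if seq.count("H") == N1)

# The 3-coin example: macrostate HHT (two heads, one tail)
print(w_binomial(3, 2), w_brute_force(3, 2))   # both print 3
```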
Suppose we plot $w$ as a function of $N_1$ for a given $N$. ($N_1$ is the number of heads.) We plot a number of cases (Thermocoin.mws).
Notice that the peak occurs at $N_1 = N/2$:
$$w_{\max} = \frac{N!}{\left(\frac{N}{2}\right)!\,\left(\frac{N}{2}\right)!}$$
For large $N$ we can use Stirling's formula, $\ln(N!) \approx N\ln(N) - N$:
$$\ln(w_{\max}) = \ln(N!) - 2\ln\!\left[\left(\tfrac{N}{2}\right)!\right] \approx N\ln(N) - N - 2\left[\tfrac{N}{2}\ln\!\left(\tfrac{N}{2}\right) - \tfrac{N}{2}\right]$$
$$\ln(w_{\max}) \approx N\ln(N) - N\ln\!\left(\tfrac{N}{2}\right) = N\ln(2) \quad\Rightarrow\quad w_{\max} \approx 2^N$$
For $N = 1000$, $w_{\max} \approx 10^{300}$.
This is the number of distinct microstates for the most probable macrostate ($N_1 = 500$). Note that it is a very large number!
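The estimate $w_{\max} \approx 2^N \approx 10^{300}$ for $N = 1000$ is easy to check numerically. The following Python sketch (my own, not from the notes) compares the exact $\ln(w_{\max})$, evaluated with log-gamma functions, against the Stirling estimate $N\ln 2$:

```python
from math import lgamma, log, factorial

N = 1000
# ln(w_max) = ln(N!) - 2 ln((N/2)!), evaluated with log-gamma to avoid huge numbers
ln_exact = lgamma(N + 1) - 2 * lgamma(N // 2 + 1)
ln_stirling = N * log(2)   # the Stirling-formula result from the slide

print(f"ln(w_max) exact    = {ln_exact:.2f}  (w_max ~ 10^{ln_exact / log(10):.1f})")
print(f"ln(w_max) Stirling = {ln_stirling:.2f}  (w_max ~ 10^{ln_stirling / log(10):.1f})")
# independent check: number of decimal digits of the exact integer value
print("digits in exact w_max:", len(str(factorial(N) // factorial(N // 2) ** 2)))
```

The two values differ only by the sub-leading (logarithmic) corrections to Stirling's formula, which is why the slide quotes $w_{\max}$ only to the nearest power of ten.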
For large $N$ the plot of $w$ versus $N_1$ is very sharp (see the figure below). The total thermodynamic probability is obtained by summing over all macrostates. Let $k$ indicate a particular macrostate; the total is $\sum_k w_k$. Since, for large $N$, the peak is very sharp,
$$\sum_k w_k \approx w_{\max}$$

(Figure: plot of $w$ versus $N_1$ for large $N$, showing the sharply peaked distribution.)
Now we consider $N$ distinguishable particles placed in $n$ boxes, with $N_1$ in the first box, $N_2$ in the second box, etc. We wish to calculate $w(N_1, N_2, \ldots, N_n)$ (a particular macrostate).

Before doing a general calculation, we consider the case of 4 particles (A B C D) with 3 boxes and $N_1 = 2$, $N_2 = 1$, $N_3 = 1$.

We begin by indicating the possibilities for the first box. The ordered selections are:

A B    B C
B A    C B
A C    B D
C A    D B
A D    C D
D A    D C

Since the order is irrelevant, there are 6 distinct possibilities. Now suppose A and B were selected for the first box. This leaves C and D when we consider filling the second box. We obviously have only two possibilities, C or D. Suppose that C was selected. That leaves only one possibility (D) for the third box. The total number of possibilities for this macrostate is (6)(2)(1) = 12.
Now we consider the general problem, macrostate $(N_1, N_2, N_3, \ldots)$. Consider placing $N_1$ of the $N$ distinguishable particles in the first box, then $N_2$ of the remaining particles in the second box, and so on:
$$\text{1st box:}\quad \frac{N(N-1)(N-2)\cdots(N-N_1+1)}{N_1!} = \frac{N!}{N_1!\,(N-N_1)!}$$
$$\text{2nd box:}\quad \frac{(N-N_1)!}{N_2!\,(N-N_1-N_2)!}$$
$$\text{3rd box:}\quad \frac{(N-N_1-N_2)!}{N_3!\,(N-N_1-N_2-N_3)!}$$
The thermodynamic probability for this macrostate is the product:
$$w = \frac{N!}{N_1!\,(N-N_1)!}\cdot\frac{(N-N_1)!}{N_2!\,(N-N_1-N_2)!}\cdot\frac{(N-N_1-N_2)!}{N_3!\,(N-N_1-N_2-N_3)!}\cdots = \frac{N!}{\prod_{k=1}^{n} N_k!}$$
We have been considering distinguishable particles, such as atoms rigidly set in the lattice of a solid. For a gas, the statistics will be different.
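A short Python sketch (mine, for illustration) that evaluates $w = N!/\prod_k N_k!$ and confirms the count of 12 for the 4-particle example above by direct enumeration:

```python
from math import factorial
from itertools import permutations

def w_multinomial(occupations):
    """w = N! / (N1! N2! ... Nn!) for distinguishable particles."""
    N = sum(occupations)
    w = factorial(N)
    for Nk in occupations:
        w //= factorial(Nk)
    return w

def w_count(occupations):
    """Brute force: count distinct assignments of labeled particles to boxes."""
    boxes = [i for i, Nk in enumerate(occupations) for _ in range(Nk)]
    return len(set(permutations(boxes)))   # distinct orderings = distinct assignments

print(w_multinomial([2, 1, 1]), w_count([2, 1, 1]))   # 12 12, the ABCD example
```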
Example (Problem 12.6). We will do an example illustrating the use of the formula derived above.
We have 4 distinguishable particles (ABCD). We wish to place them
in 4 energy levels ("boxes") $0, \varepsilon, 2\varepsilon, 3\varepsilon$, subject to the constraint that the total energy is $U = 6\varepsilon$.
A macrostate will be labeled by k and wk is the thermodynamic
probability for the kth macrostate.
The macrostates consistent with these constraints, together with their thermodynamic probabilities, are:

k    N(0)  N(ε)  N(2ε)  N(3ε)        wk
1     2     0      0      2     4!/(2! 0! 0! 2!) = 6
2     1     1      1      1     4!/(1! 1! 1! 1!) = 24
3     1     0      3      0     4!/(1! 0! 3! 0!) = 4
4     0     2      2      0     4!/(0! 2! 2! 0!) = 6
5     0     3      0      1     4!/(0! 3! 0! 1!) = 4

The most probable macrostate, k = 2, is the most random distribution.
{Students should explicitly display one of the macrostates.}
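The table above can be generated automatically. The sketch below (my own Python, with energies measured in units of $\varepsilon$) enumerates every occupation set $(N_0, N_1, N_2, N_3)$ satisfying the two constraints and evaluates its $w_k$:

```python
from math import factorial
from itertools import product

N, levels, U = 4, [0, 1, 2, 3], 6   # energies in units of epsilon

total = 0
for occ in product(range(N + 1), repeat=len(levels)):        # occ = (N0, N1, N2, N3)
    if sum(occ) == N and sum(n * e for n, e in zip(occ, levels)) == U:
        w = factorial(N)
        for n in occ:
            w //= factorial(n)
        total += w
        print(f"macrostate {occ}:  w = {w}")
print("sum of all w_k =", total)
```

The sum of all the $w_k$ is the total number of microstates compatible with $U = 6\varepsilon$.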
Now consider an isolated system of volume V containing N
distinguishable particles. The internal energy U is then fixed and
the macrostate will be characterized by (N,V,U). There are n
energy levels (like boxes) available and we wish to know the set
$\{N_i\}$ at equilibrium. There are the following restrictions:
$$\sum_{i=1}^{n} N_i = N \qquad \text{(conservation of particles)}$$
$$\sum_{i=1}^{n} N_i\,\varepsilon_i = U \qquad \text{(conservation of energy)}$$
The central problem is then to determine the most probable
distribution. Since the system is isolated the total entropy must be
a maximum with respect to all possible variations within the ensemble.
The actual distribution of particles amongst the energy levels will be
the one that maximizes the entropy of the system.
Can we make a connection between the entropy and some
specification of the macrostate?
A study of simple systems suggests that there is a connection between
entropy and disorder. For example if one considers the free expansion
of a gas, the entropy of the gas increases and so does the disorder. We
know less about the distribution of the molecules after the expansion.
The thermodynamic probability is also a measure of disorder.
The larger the value of w, the greater the disorder. A simple example
is as follows:
Suppose we distribute 5 distinguishable particles among 4 boxes. We can use the equation developed above to determine w.

N1   N2   N3   N4     wk
 5    0    0    0      1
 4    1    0    0      5
 3    2    0    0     10
 3    1    1    0     20
 2    2    1    0     30
 2    1    1    1     60

For example, for the macrostate (2, 2, 1, 0):
$$w = \frac{5!}{2!\,2!\,1!\,0!} = 30$$
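The same enumeration idea reproduces this table; the Python sketch below (mine) lists each distinct occupation pattern of 5 distinguishable particles in 4 boxes with its $w$, and checks that summing $w$ over all ordered occupations gives $4^5 = 1024$, the total number of ways to assign the labeled particles:

```python
from math import factorial
from itertools import product
from collections import Counter

N, boxes = 5, 4

def w(occ):
    result = factorial(N)
    for n in occ:
        result //= factorial(n)
    return result

patterns = Counter()      # distinct occupation "shapes", ignoring which box is which
total = 0
for occ in product(range(N + 1), repeat=boxes):
    if sum(occ) == N:
        total += w(occ)
        patterns[tuple(sorted(occ, reverse=True))] = w(occ)

for shape, wk in sorted(patterns.items(), reverse=True):
    print(shape, "w =", wk)
print("sum of w over all ordered occupations =", total)   # 4**5 = 1024
```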
The most ordered state, that is with all the particles in a single box,
has the lowest w. The most disordered state, that is with the particles
distributed amongst all the boxes, has the largest thermodynamic
probability. As a system approaches equilibrium not only does the
entropy approach a maximum, but the thermodynamic probability
also approaches a maximum.
Is there a relationship between entropy and thermodynamic
probability? If so we would expect that S would be a monotonically
increasing function of w: as the probability increases, so does S.
Thermodynamic Probability and Entropy
Ludwig Boltzmann made many important contributions to
thermodynamics. His most important contribution to physics is the
relationship between w and the classical concept of entropy. His
argument was as follows. Consider an isolated assembly which
undergoes a spontaneous, irreversible process. At equilibrium S has
its maximum value consistent with U and V. But w also increases and
approaches a maximum when equilibrium is achieved. Boltzmann
therefore assumed that there must be some connection between w and
S. He therefore wrote S=f(w), and S and w are state variables. To be
physically meaningful f(w) must be a single-valued monotonically
increasing function. Now consider two systems, A and B, in thermal
contact. (Such a system of two or more assemblies is called a
canonical ensemble.) Entropy is an extensive property and so S for
the composite system is the sum of the individual entropies:
$$S = S_A + S_B$$
Hence
$$f(w) = S_A + S_B, \quad\text{i.e.}\quad f(w) = f(w_A) + f(w_B) \qquad (1)$$
On the other hand, independent probabilities are multiplicative, so
$$w = w_A\, w_B$$
Hence
$$f(w) = f(w_A w_B) \qquad (2)$$
From (1) and (2) we obtain
$$f(w_A w_B) = f(w_A) + f(w_B)$$
The only appropriate (single-valued, monotonically increasing) function for which this relationship is true is a logarithm. Hence Boltzmann wrote
$$S = k \ln w$$
Indeed, with $f(w) = k\ln w$ we have $f(w_A w_B) = k\ln(w_A w_B) = k\ln w_A + k\ln w_B = f(w_A) + f(w_B)$, as required. The constant $k$ has the units of entropy and is, in fact, the Boltzmann constant that we have previously introduced.
This celebrated equation provides the connecting link between statistical and classical thermodynamics. (One can begin with statistical mechanics and define $S$ by the above equation.)
Quantum States and Energy Levels
We consider a closed system containing a monatomic
ideal gas of N particles. They are in some macroscopic
volume V. According to quantum mechanics only
certain discrete energy levels are permitted for the
particles. These allowed energy states are given by
$$\varepsilon = \frac{h^2}{8mV^{2/3}}\left(n_x^2 + n_y^2 + n_z^2\right)$$
where the $n_j$ are integers commencing with 1.
The symbol $h$ represents Planck's constant, which is a fundamental constant. The symbol $m$ is the mass of a molecule. Each $n_j$ is called a quantum number.
At ordinary temperatures the $\varepsilon$'s of the particles are such that the $n$-values are extremely large ($10^9$ is a typical value). When $n$ changes by 1, the change in $\varepsilon$ is so small that $\varepsilon$ may be treated as a continuous variable. This will later permit us to replace sums by integrals.
Example: A Hg atom moves in a cubical box whose edges are 1 m long. Its kinetic energy is equal to the average kinetic energy of an atom of an ideal gas at 1000 K. If the quantum numbers in the three directions are all equal to $n$, calculate $n$.

Hg atom: $m = 201\,\mathrm{amu} = (201)(1.66\times10^{-27}\,\mathrm{kg}) = 3.34\times10^{-25}\,\mathrm{kg}$

With $n_x = n_y = n_z = n$ we have $n_x^2 + n_y^2 + n_z^2 = 3n^2$, so
$$\varepsilon = \frac{h^2}{8mL^2}\,(3n^2)$$
Setting this equal to the average kinetic energy,
$$\frac{3}{2}kT = \frac{3h^2 n^2}{8mL^2} \quad\Rightarrow\quad n^2 = \frac{4mL^2 kT}{h^2} \quad\Rightarrow\quad n = \frac{2L}{h}\sqrt{mkT}$$
$$n = \frac{2(1.00\,\mathrm{m})}{6.63\times10^{-34}\,\mathrm{J\,s}}\sqrt{(3.34\times10^{-25}\,\mathrm{kg})(1.38\times10^{-23}\,\mathrm{J/K})(10^{3}\,\mathrm{K})}$$
$$n = 2.05\times10^{11}$$
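A quick numerical check of this example (a sketch of my own; the constants are the rounded values quoted in the slide):

```python
from math import sqrt

h = 6.63e-34        # Planck's constant, J s
k = 1.38e-23        # Boltzmann's constant, J/K
m = 201 * 1.66e-27  # mass of a Hg atom, kg
L = 1.00            # edge of the cubical box, m
T = 1000.0          # temperature, K

n = (2 * L / h) * sqrt(m * k * T)   # from (3/2) k T = 3 h^2 n^2 / (8 m L^2)
print(f"m = {m:.3g} kg,  n = {n:.3g}")   # n comes out of order 2 x 10^11
```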
Each different value of $\varepsilon$ represents a quantum level.
Each specification of (nx, ny, nz) represents a quantum state.
The energy levels are degenerate in that a number of different
states have the same energy.
The degree of degeneracy of level $i$ will be specified by $g_i$. There is only one way to form the lowest level $\varepsilon_1$ ($n_x = n_y = n_z = 1$), so $g_1 = 1$; that is, the ground state is not degenerate. The next level, $\varepsilon_2$, occurs when one of the $n$'s assumes the value 2, so $g_2 = 3$, and so forth. As one goes to higher energy levels, $g_i$ increases very rapidly.
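The degeneracies $g_i$ can be verified by brute force: enumerate triples $(n_x, n_y, n_z)$ and group them by $n_x^2 + n_y^2 + n_z^2$. The Python sketch below (illustrative, my own) reproduces $g_1 = 1$ and $g_2 = 3$ and lists the next few levels:

```python
from collections import Counter
from itertools import product

nmax = 12   # enumerate quantum numbers 1..nmax in each direction (enough for low levels)
degeneracy = Counter()
for nx, ny, nz in product(range(1, nmax + 1), repeat=3):
    degeneracy[nx**2 + ny**2 + nz**2] += 1   # energy in units of h^2 / (8 m V^(2/3))

for i, s in enumerate(sorted(degeneracy)[:8], start=1):
    print(f"level {i}: nx^2+ny^2+nz^2 = {s:2d},  g = {degeneracy[s]}")
```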
In the terminology of statistical mechanics a number N of
identical particles is called an assembly or a system.
Let us now consider an assembly of N indistinguishable
particles. A macrostate is a given distribution of particles
in the various energy levels. A microstate is a given
distribution of particles in the energy states.
Basic postulate of statistical mechanics: All accessible microstates of an isolated system are equally probable.
We are interested in the macrostates $\{N_i\}$. In particular, what is the macrostate when the system is in equilibrium? We address this problem in succeeding chapters.
Density of Quantum States. A concept that is important for later work is that of the density of states. Under conditions in which the $n$'s are large and the energy levels close together, we regard $n$ and $\varepsilon$ as continuous variables. From
$$\varepsilon = \frac{h^2}{8mV^{2/3}}\left(n_x^2 + n_y^2 + n_z^2\right) = \frac{h^2}{8mV^{2/3}}\,n^2
\quad\Rightarrow\quad n^2 = \frac{8mV^{2/3}}{h^2}\,\varepsilon$$
We consider a quantum-number space $(n_x, n_y, n_z)$. Each point in this space represents an energy state. Each unit volume in this space will contain one state. All the states are in the first octant (each $n_j \ge 1$). We then consider a radius $R$ (which is $n$) in this space and a second radius $R + dR$. The volume between these two surfaces is $\frac{1}{8}(4\pi R^2\,dR)$. This gives the number of states between $\varepsilon$ and $\varepsilon + d\varepsilon$. We represent this number by $g(\varepsilon)\,d\varepsilon$:
$$g(\varepsilon)\,d\varepsilon = \frac{1}{8}\,(4\pi R^2\,dR)$$
But $R^2 = n^2 = \dfrac{8mV^{2/3}}{h^2}\,\varepsilon$, so $R\,dR = \dfrac{4mV^{2/3}}{h^2}\,d\varepsilon$. Substituting in,
$$g(\varepsilon)\,d\varepsilon = \frac{\pi}{2}\left(\frac{8mV^{2/3}}{h^2}\right)^{1/2}\varepsilon^{1/2}\;\frac{4mV^{2/3}}{h^2}\,d\varepsilon
= \frac{4\sqrt{2}\,\pi V}{h^3}\,m^{3/2}\,\varepsilon^{1/2}\,d\varepsilon$$
This result is correct for only certain particles. We have assumed that a state is uniquely specified by the quantum numbers $(n_x, n_y, n_z)$. In many cases other quantum numbers play a role in the unique specification of a state. Particles fall into two categories which are radically different:

Bosons: have integral spin quantum number. Examples: photons, gravitons, pi mesons.
Fermions: have odd half-integral spin quantum number. Examples: electrons, muons, nucleons, quarks.
For electrons, two spin states are possible for each translational state. Thus each point in quantum-number space represents two distinctly different states. This leads to a multiplicative factor of 2 in the density-of-states formula. To be completely general we write
$$g(\varepsilon)\,d\varepsilon = \gamma_s\,\frac{4\sqrt{2}\,\pi V}{h^3}\,m^{3/2}\,\varepsilon^{1/2}\,d\varepsilon$$
where $\gamma_s$ is the spin factor. For spin-1/2 fermions, $\gamma_s = 2$.
The density of states replaces the degeneracy when we go from discrete energy levels to a continuum of energy levels.
Notice that $g$ depends on $V$, but not on $N$.
{Students: Show that the unit of $g(\varepsilon)$ is $\mathrm{J}^{-1}$.}
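The octant-counting argument behind $g(\varepsilon)$ can be tested directly: count the quantum states with $n_x^2 + n_y^2 + n_z^2 \le R^2$ and compare with the octant volume $\frac{1}{8}\cdot\frac{4}{3}\pi R^3$, which is what integrating $g(\varepsilon)\,d\varepsilon$ up to the corresponding energy gives (per spin state). A Python sketch of mine:

```python
from math import pi
from itertools import product

def count_states(R):
    """Number of triples (nx, ny, nz), each >= 1, with nx^2 + ny^2 + nz^2 <= R^2."""
    R2 = R * R
    return sum(1 for nx, ny, nz in product(range(1, R + 1), repeat=3)
               if nx * nx + ny * ny + nz * nz <= R2)

for R in (20, 50, 100):
    counted = count_states(R)
    octant = pi * R**3 / 6          # (1/8)(4/3) pi R^3
    print(f"R = {R:3d}: counted = {counted}, octant volume = {octant:.0f}, ratio = {counted/octant:.3f}")
```

The ratio approaches 1 as R grows; the deficit is, roughly speaking, a surface correction of order $R^2$ from states lying near the coordinate planes.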
Problem 12.1 Consider N “honest” coins.
(a) How many microstates are possible?
Consider the coins lined up in a row. Each coin has two possibilities
(H or T). For the $N$ coins, $w = 2^N$. As an example consider 3 coins, so $w = 2^3 = 8$. We will show these microstates explicitly by considering the possibilities for the 2nd and 3rd coins and then adding H or T for the first coin. The possibilities are displayed below.
The 8 microstates for 3 coins:

COIN 1   COIN 2   COIN 3
  H        H        H
  T        H        H
  H        H        T
  T        H        T
  H        T        H
  T        T        H
  H        T        T
  T        T        T

We use MAPLE to calculate the factorials. Taking $N = 50$:
$$w = 2^{50} = 1.13\times10^{15}$$
(b) How many microstates for the most probable macrostate? The most probable macrostate has the same number of heads and tails (as shown earlier):
$$w_{\max} = \frac{N!}{\left(\frac{N}{2}\right)!\left(\frac{N}{2}\right)!} = \frac{50!}{25!\,25!} = 1.26\times10^{14}$$
(c) True probability:
$$P_{\max} = \frac{w_{\max}}{w} = \frac{1.26\times10^{14}}{1.13\times10^{15}} = 0.112$$
Problem 12.2 This is the same problem as 12.1 except that N=1000
The results are:
$$w = 1.07\times10^{301} \qquad w_{\max} = 2.70\times10^{299} \qquad P_{\max} = 0.0252$$
{Students: Consider 4 identical coins in a row. Display all the possible microstates and indicate the various macrostates.}
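Both problems can be reproduced in a few lines without MAPLE; this Python sketch (my own) prints $w$, $w_{\max}$ and $P_{\max}$ for $N = 50$ and $N = 1000$:

```python
from math import comb, log10

for N in (50, 1000):
    w_total = 2 ** N             # part (a): total number of microstates
    w_max = comb(N, N // 2)      # part (b): microstates of the most probable macrostate
    p_max = w_max / w_total      # part (c): true probability of that macrostate
    print(f"N = {N}: w = 10^{log10(w_total):.2f}, "
          f"w_max = 10^{log10(w_max):.2f}, P_max = {p_max:.3f}")
```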
Problem 12.5 We have N distinguishable coins. The thermodynamic probability for a particular macrostate is (derived earlier)
$$w = \frac{N!}{N_1!\,(N-N_1)!}$$
(a) Taking the logarithm and applying Stirling's formula:
$$\ln(w) = \ln(N!) - \ln(N_1!) - \ln[(N-N_1)!]$$
$$\ln(w) \approx N\ln(N) - N - \left[N_1\ln(N_1) - N_1\right] - \left[(N-N_1)\ln(N-N_1) - (N-N_1)\right]$$
$$\ln(w) \approx N\ln(N) - N_1\ln(N_1) - (N-N_1)\ln(N-N_1)$$
$$\frac{d\ln(w)}{dN_1} = -\ln(N_1) - 1 + \ln(N-N_1) + 1 = \ln(N-N_1) - \ln(N_1)$$
Setting the derivative to zero for the maximum:
$$0 = \ln\!\left(\frac{N-N_1}{N_1}\right) \quad\Rightarrow\quad \frac{N-N_1}{N_1} = 1 \quad\Rightarrow\quad N_1 = \frac{N}{2} \qquad \{\text{Maximum}\}$$
(b) Now for the number of microstates at the maximum:
$$w_{\max} = \frac{N!}{\left(\frac{N}{2}\right)!\,\left(\frac{N}{2}\right)!}$$
$$\ln(w_{\max}) = \ln(N!) - 2\ln\!\left[\left(\tfrac{N}{2}\right)!\right] \approx N\ln(N) - N - 2\left[\tfrac{N}{2}\ln\!\left(\tfrac{N}{2}\right) - \tfrac{N}{2}\right]$$
$$\ln(w_{\max}) \approx N\ln(N) - N\ln\!\left(\tfrac{N}{2}\right) = N\ln(2)$$
$$w_{\max} \approx e^{N\ln 2} = 2^N$$
Problem 12.8. In this problem we show explicitly the microstates associated with each macrostate. There are two distinguishable particles (A and B) and three energy levels ($0, \varepsilon, 2\varepsilon$), with a total energy of $U = 2\varepsilon$.

(a) A macrostate is labeled $k$. The macrostates and their microstates are:

k = 1: one particle at 0, one at 2ε
   microstates: (A: 0, B: 2ε), (A: 2ε, B: 0)                          w1 = 2
k = 2: both particles at ε
   microstate:  (A: ε, B: ε)                                           w2 = 1

w = w1 + w2 = 3, so S = k ln(w) = k ln(3).

(b) Now we have 3 distinguishable particles (A, B, C) with the restriction that at least one particle is in the ground state. (This is obviously necessary, since $U = 2\varepsilon$.)

k = 1: two particles at 0, one at 2ε
   microstates: (A,B: 0; C: 2ε), (A,C: 0; B: 2ε), (B,C: 0; A: 2ε)      w1 = 3
k = 2: one particle at 0, two at ε
   microstates: (A: 0; B,C: ε), (B: 0; A,C: ε), (C: 0; A,B: ε)         w2 = 3

w = 6. For this case S = k ln(6), and
$$\frac{k\ln(6)}{k\ln(3)} = 1.63$$
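The microstate counting in both parts can be verified by brute force. This Python sketch (mine) enumerates every assignment of the distinguishable particles to the levels 0, ε, 2ε, keeps those with the required total energy, and groups them by macrostate:

```python
from collections import Counter
from itertools import product

def enumerate_microstates(num_particles, U):
    """Assignments of labeled particles to levels 0, 1, 2 (in units of epsilon)
    with total energy U, grouped by macrostate (N0, N1, N2)."""
    levels = (0, 1, 2)
    macrostates = Counter()
    for assignment in product(levels, repeat=num_particles):
        if sum(assignment) == U:
            occ = tuple(assignment.count(e) for e in levels)
            macrostates[occ] += 1
    return macrostates

for part, n in (("(a) 2 particles", 2), ("(b) 3 particles", 3)):
    macro = enumerate_microstates(n, 2)
    print(part)
    for occ, w in sorted(macro.items()):
        print(f"  macrostate (N0, N1, N2) = {occ}:  w = {w}")
    print("  total w =", sum(macro.values()))
```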
What have we accomplished in this chapter?
We have started to consider the statistics of the microscopic
particles (atoms, molecules, ...) of a system.
The thermodynamic probability, w, was introduced. For a given
macrostate k, wk is the number of different microstates that give
rise to this particular macrostate. A larger value of w for a
macrostate means that the macrostate is more likely to occur.
We also saw the link between macroscopic thermodynamics ($S$) and statistical mechanics ($w$): $S = k\ln w$.
The basic postulate of statistical mechanics was also introduced:
Basic postulate of statistical mechanics: All accessible microstates of an isolated system are equally probable.
We will be considering situations for which the energy levels
are so closely spaced that they may be considered to form a
continuum. The degeneracy of isolated states is then replaced
by the density of states:
$$g(\varepsilon)\,d\varepsilon = \gamma_s\,\frac{4\sqrt{2}\,\pi V}{h^3}\,m^{3/2}\,\varepsilon^{1/2}\,d\varepsilon$$
We now apply what we have developed in this chapter to
different situations.