Chapter 2: Statistical Thermodynamics
1- Introduction
- The object of statistical thermodynamics is to present a particle
theory leading to an interpretation of the equilibrium properties
of macroscopic systems.
- The foundation upon which the theory rests is quantum mechanics.
- A satisfactory theory can be developed using only the quantum
mechanics concepts of quantum states, and energy levels.
- A thermodynamic system is regarded as an assembly of
submicroscopic entities in an enormous number of ever-changing
quantum states. We use the term assembly or system to denote a
number N of identical entities, such as molecules, atoms, electrons,
photons, oscillators, etc.
- The macrostate of a system, or configuration, is specified by the
number of particles in each of the energy levels of the system.
Nj is the number of particles that occupy the jth energy level.
If there are n energy levels, then $\sum_{i=1}^{n} N_i = N$.
A macrostate is defined by $(N_1, N_2, \ldots, N_i, \ldots, N_n)$.
- A microstate is specified by the number of particles in each
quantum state. In general, there will be more than one quantum
state for each energy level, a situation called degeneracy.
- In general, there are many different microstates corresponding
to a given macrostate.
- The number of microstates leading to a given macrostate is called
the thermodynamic probability. It is the number of ways in which
a given macrostate can be achieved.
- The thermodynamic probability is an “unnormalized” probability,
an integer between one and infinity, rather than a number between
zero and one.
- For the kth macrostate, the thermodynamic probability is taken to
be ωk.
- A true probability pk could be obtained as

$p_k = \frac{\omega_k}{\Omega}$,   where $\Omega = \sum_k \omega_k$ is the total number of microstates available to the system.
2- Coin model example and the most probable distribution
- We assume that we have N=4 coins that we toss on the floor and
then examine to determine the number of heads N1 and the number
of tails N2 = N-N1.
- Each macrostate is defined by the number of heads and the number
of tails.
- A microstate is specified by the state, heads or tails, of each coin.
- We are interested in the number of microstates for each macrostate,
(i.e., thermodynamic probability).
- The coin-tossing model assumes that the coins are distinguishable.
Table. Macrostates and microstates for a coin-tossing experiment with N = 4 distinguishable coins.

Macrostate label k | Macrostate (N1, N2) | Microstates (Coin 1, Coin 2, Coin 3, Coin 4) | ωk | pk
1                  | (4, 0)              | HHHH                                         | 1  | 1/16
2                  | (3, 1)              | HHHT, HHTH, HTHH, THHH                       | 4  | 4/16
3                  | (2, 2)              | HHTT, HTHT, HTTH, TTHH, THTH, THHT           | 6  | 6/16
4                  | (1, 3)              | HTTT, THTT, TTHT, TTTH                       | 4  | 4/16
5                  | (0, 4)              | TTTT                                         | 1  | 1/16
The average occupation number is

$\bar{N}_i = \frac{\sum_k N_{ik}\,\omega_k}{\sum_k \omega_k} = \sum_k N_{ik}\,p_k$

where Nik is the occupation number of level i in the kth macrostate.
For our coin-tossing experiment, the average number of heads is therefore

$\bar{N}_1 = \frac{1}{16}\,[(4 \times 1) + (3 \times 4) + (2 \times 6) + (1 \times 4) + (0 \times 1)] = 2$

Similarly, $\bar{N}_2 = 2$. Then $\bar{N}_1 + \bar{N}_2 = 4 = N$.
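The counts above can be verified by brute force. The following Python sketch (illustrative, not part of the original notes) enumerates all 2^N microstates of N = 4 distinguishable coins, groups them into macrostates by the number of heads, and reproduces ωk, pk, and the average number of heads:

from itertools import product
from collections import Counter
from fractions import Fraction

N = 4  # number of distinguishable coins

# Each microstate is a tuple such as ('H', 'T', 'H', 'H'); there are 2**N of them.
microstates = list(product("HT", repeat=N))

# Group microstates into macrostates labeled by N1, the number of heads.
omega = Counter(state.count("H") for state in microstates)

Omega = sum(omega.values())                               # total number of microstates, 16
p = {n1: Fraction(w, Omega) for n1, w in omega.items()}   # true probabilities p_k

# Average number of heads: sum over macrostates of N1 * p_k.
avg_heads = sum(n1 * pk for n1, pk in p.items())

print(dict(sorted(omega.items())))   # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}
print(avg_heads)                     # 2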
Figure. Thermodynamic probability ωk versus the number of heads N1 for a coin-tossing experiment with 4 coins.
Suppose we want to perform the coin-tossing experiment with a
larger number of coins. We assume that we have N distinguishable
coins.
Question: How many ways are there to select, from the N coins, N1 heads
and N − N1 tails?
The answer is given by the binomial coefficient

$\binom{N}{N_1} = \frac{N!}{N_1!\,(N - N_1)!}$   (1)
Example for N = 8:

N1 :  0   1   2   3   4   5   6   7   8
ωk :  1   8  28  56  70  56  28   8   1

Figure. Thermodynamic probability ωk versus the number of heads N1 for a coin-tossing experiment with 8 coins. The peak has become considerably sharper.
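These values follow directly from Equation (1); as a quick check (an illustrative sketch, not from the original notes), Python's math.comb reproduces the table:

from math import comb

N = 8
# Equation (1): omega_k = C(N, N1), the number of microstates with N1 heads.
omega = [comb(N, n1) for n1 in range(N + 1)]
print(omega)        # [1, 8, 28, 56, 70, 56, 28, 8, 1]
print(sum(omega))   # 256 = 2**8, the total number of microstates Omega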
What is the maximum value of the thermodynamic probability (ωmax) for
N = 8 and for N = 1000?
The peak occurs at N1 = N/2. Thus, Equation (1) gives

For N = 8:    $\omega_{\max} = \frac{8!}{4!\,4!} = 70$

For N = 1000: $\omega_{\max} = \frac{1000!}{500!\,500!} = \;?$

For such large numbers we can use Stirling's approximation:

$\ln(n!) \approx n\ln(n) - n$
$\omega_{\max} = \frac{1000!}{500!\,500!} \;\Rightarrow\; \ln(\omega_{\max}) = \ln(1000!) - 2\ln(500!)$

$\ln(\omega_{\max}) \approx 1000\ln(1000) - 1000 - 2\,[\,500\ln(500) - 500\,]$

$\ln(\omega_{\max}) \approx 1000\ln\!\left(\frac{1000}{500}\right) \approx 693$

$\log_{10}(\omega_{\max}) = \log_{10}(e)\,\ln(\omega_{\max}) \approx 0.4343 \times 693 \approx 300$

$\omega_{\max} \approx 10^{300}$

For N = 1000 we find that ωmax is an astronomically large number.
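A short numerical check of this estimate (an illustrative sketch, not part of the original notes) compares the exact ln(ωmax), computed with the log-gamma function, against Stirling's approximation:

from math import lgamma, log

def ln_factorial(n):
    # Exact ln(n!) from the log-gamma function: ln(n!) = lgamma(n + 1).
    return lgamma(n + 1)

def stirling(n):
    # Stirling's approximation: ln(n!) ~ n ln(n) - n.
    return n * log(n) - n

N = 1000
exact = ln_factorial(N) - 2 * ln_factorial(N // 2)   # ln(omega_max), exact
approx = stirling(N) - 2 * stirling(N // 2)          # ln(omega_max), Stirling

print(round(exact, 1))          # ~689.5
print(round(approx, 1))         # ~693.1  (= 1000 ln 2)
print(round(exact / log(10)))   # ~299, so omega_max ~ 10^300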
Figure. Thermodynamic probability ωk versus the number of heads N1 for
a coin-tossing experiment with N = 1000 coins. The peak, of height
about 10^300, lies in the totally random region (the most probable
distribution is the macrostate with the maximum number of microstates);
the "ordered regions" lie far out in the tails.

The most probable distribution is that of total randomness.
The "ordered regions" almost never occur; ω is extremely small
compared with ωmax.
For N very large:

$\Omega = \sum_k \omega_k \approx \omega_{\max}$

Generalization of Equation (1)
Question: How many ways can N distinguishable objects be arranged if
they are divided into n groups, with N1 objects in the first group,
N2 in the second, etc.?
Answer:

$\omega(N_1, N_2, \ldots, N_i, \ldots, N_n) = \omega_k = \frac{N!}{N_1!\,N_2!\cdots N_i!\cdots N_n!} = \frac{N!}{\prod_i N_i!}$
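A minimal Python sketch of this multiplicity formula (illustrative, not from the original notes); for n = 2 groups it reduces to the binomial coefficient of Equation (1):

from math import factorial, prod

def multiplicity(occupation):
    # omega = N! / (N1! N2! ... Nn!) for distinguishable objects,
    # where 'occupation' is the tuple of group sizes (N1, ..., Nn).
    N = sum(occupation)
    return factorial(N) // prod(factorial(Ni) for Ni in occupation)

print(multiplicity((4, 0)))                 # 1   (two groups: the binomial case)
print(multiplicity((2, 2)))                 # 6
print(multiplicity((2, 0, 0, 1)))           # 3   (two objects in one group, one in another)
print(multiplicity((500, 500)) > 10**299)   # True: omega_max ~ 10^300 for N = 1000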
3- System of distinguishable particles
The constituents of the system under study (a gas, liquid, or solid)
are considered to be:
- a fixed number N of distinguishable particles:

$\sum_{i=1}^{n} N_i = N$   (conservation of particles)
- occupying a fixed volume V.
We seek the distribution (N1, N2,…, Ni,…, Nn) among energy levels
(ε1, ε2,…, εi,…, εn) for an equilibrium state of the system.
We limit ourselves to isolated systems that do not exchange energy
in any form with the surroundings. This implies that the internal
energy U is also fixed:

$\sum_{i=1}^{n} N_i\,\varepsilon_i = U$   (conservation of energy)
Example: Consider three particles, labeled A, B, and C, distributed
among four energy levels, 0, ε, 2ε, 3ε, such that the total energy is
U=3ε.
a) Tabulate the 3 possible macrostates of the system.
b) Calculate ωk (the number of microstates), and pk (True probability)
for each of the 3 macrostates.
c) What is the total number of available microstates, Ω, for the system?
Table. Macrostates and microstates for three distinguishable particles A, B, C with total energy U = 3ε.

Macrostate label k | Macrostate (N0, N1, N2, N3) | Microstates (εA, εB, εC)                                               | ωk | pk
1                  | (2, 0, 0, 1)                | (0, 0, 3ε), (0, 3ε, 0), (3ε, 0, 0)                                     | 3  | 0.3
2                  | (1, 1, 1, 0)                | (0, ε, 2ε), (0, 2ε, ε), (ε, 0, 2ε), (ε, 2ε, 0), (2ε, 0, ε), (2ε, ε, 0) | 6  | 0.6
3                  | (0, 3, 0, 0)                | (ε, ε, ε)                                                              | 1  | 0.1

$\Omega = 3 + 6 + 1 = 10$
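The same result can be obtained by brute-force enumeration. The sketch below (illustrative, not from the original notes) assigns one of the levels 0, ε, 2ε, 3ε to each of the three distinguishable particles, keeps only assignments with total energy 3ε, and tallies the macrostates:

from itertools import product
from collections import Counter

levels = [0, 1, 2, 3]   # energy levels 0, eps, 2*eps, 3*eps, in units of eps
U = 3                   # total energy, in units of eps

# A microstate assigns one level to each of the particles A, B, C;
# keep only assignments whose total energy equals U.
microstates = [m for m in product(levels, repeat=3) if sum(m) == U]

# A macrostate is the occupation-number tuple (N0, N1, N2, N3).
def occupation(micro):
    return tuple(micro.count(level) for level in levels)

omega = Counter(occupation(m) for m in microstates)
Omega = sum(omega.values())

for macro, w in sorted(omega.items(), reverse=True):
    print(macro, "omega =", w, " p =", w / Omega)
print("Omega =", Omega)   # 10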
- The most “disordered” macrostate is the state of highest
probability.
- This state is sharply defined and is the observed equilibrium state
of the system (for a very large number of particles).
4- Thermodynamic probability and Entropy
In classical thermodynamics: as a system proceeds toward a state
of equilibrium the entropy increases, and at equilibrium the entropy
attains its maximum value.
In statistical thermodynamics (our statistical model): system tends
to change spontaneously from states with low thermodynamic
probability to states with high thermodynamic probability (large
number of microstates).
It was Boltzmann who made the connection between the classical
concept of entropy and the thermodynamic probability:

$S = f(\Omega)$
S and Ω are properties of the state of the system (state variables).
Consider two subsystems, A and B:

Subsystem A: $S_A = f(\Omega_A)$        Subsystem B: $S_B = f(\Omega_B)$

The entropy is an extensive property: it is doubled when the mass or
number of particles is doubled.
Consequence: the combined entropy of the two subsystems is
simply the sum of the entropies of each subsystem:

$S_{total} = S_A + S_B$

or

$f(\Omega_{total}) = f(\Omega_A) + f(\Omega_B)$   (3)
One subsystem configuration can be combined with the other to
give the configuration of the total system. That is,

$\Omega_{total} = \Omega_A\,\Omega_B$   (4)
Example of coin-tossing experiment: suppose that the two
subsystems each consist of two distinguishable coins.
Table. Macrostates for two subsystems, each consisting of two distinguishable coins.

Macrostate (N1, N2) | Subsystem A (Coin 1, Coin 2) | Subsystem B (Coin 1, Coin 2) | ωkA | ωkB | pkA | pkB
(2, 0)              | HH                           | HH                           | 1   | 1   | 1/4 | 1/4
(1, 1)              | HT, TH                       | HT, TH                       | 2   | 2   | 2/4 | 2/4
(0, 2)              | TT                           | TT                           | 1   | 1   | 1/4 | 1/4

$\Omega_{total} = \Omega_A\,\Omega_B = 4 \times 4 = 16$
Thus Equation (4) holds, and therefore

$f(\Omega_{total}) = f(\Omega_A\,\Omega_B)$   (5)

Combining Equations (3) and (5), we obtain

$f(\Omega_A) + f(\Omega_B) = f(\Omega_A\,\Omega_B)$

The only function for which this statement is true is the logarithm. Therefore

$S = k \ln(\Omega)$

where k is a constant with the units of entropy. It is, in fact, Boltzmann's constant:

$k = 1.38 \times 10^{-23}\ \mathrm{J\,K^{-1}}$
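A small numerical check of the additivity argument (an illustrative sketch, not part of the original notes), using the two-subsystem coin example above: because S = k ln Ω, the product Ωtotal = ΩA ΩB becomes the sum Stotal = SA + SB.

from math import log

k_B = 1.38e-23   # Boltzmann's constant, J/K

def entropy(omega):
    # Boltzmann relation S = k ln(Omega).
    return k_B * log(omega)

omega_A, omega_B = 4, 4            # each two-coin subsystem has Omega = 2**2 = 4
omega_total = omega_A * omega_B    # Equation (4): Omega_total = Omega_A * Omega_B

print(entropy(omega_total))                  # ~3.83e-23 J/K
print(entropy(omega_A) + entropy(omega_B))   # the same value: entropy is additive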
5- Quantum states and energy levels
- In quantum theory, to each energy level there corresponds one or
more quantum states described by a wave function Ψ(r,t).
- When there are several quantum states that have the same energy,
the states are said to be degenerate.
- The quantum states associated with the lowest energy level are
called the ground states of the system; those that correspond to
higher energies are called excited states.

Figure. Energy levels and quantum states: level ε1 has degeneracy g1 = 1, level ε2 has g2 = 3, and level ε3 has g3 = 5.
- For each energy level εi the number of quantum states is given by
the degeneracy gi (or the degeneracy gi is the number of quantum
states whose energy level is εi).
- The energy levels can be thought of as a set of shelves at different
heights, while the quantum states correspond to a set of boxes on
each shelf.
Example:

Consider a particle of mass m in a one-dimensional box with infinitely
high walls. (The particle is confined within the region 0 ≤ x ≤ L.)
Within the box the particle is free, subject to no forces except those
associated with the walls of the container.

Figure. Infinite potential well: V → ∞ at the walls x = 0 and x = L.

Time-independent Schrödinger equation:

$-\frac{\hbar^2}{2m}\frac{d^2\psi}{dx^2} + V\psi = \varepsilon\psi$

$|\psi(x)|^2\,dx$ is the probability that the particle will be found in the
infinitesimal interval dx about the point x.
For 0 < x < L, V = 0:

$\frac{d^2\psi}{dx^2} = -\frac{2m\varepsilon}{\hbar^2}\,\psi = -k^2\psi$,   where $k = \frac{\sqrt{2m\varepsilon}}{\hbar}$

General solution: $\psi(x) = A\sin(kx)$

$\psi(0) = 0$   (continuity at x = 0)
$\psi(L) = A\sin(kL) = 0$   (continuity at x = L)

$A\sin(kL) = 0 \;\Rightarrow\; kL = n\pi$,   $n = 1, 2, 3, \ldots$

Using k = nπ/L, we find that the particle energies are quantized:

$\varepsilon_n = \frac{\hbar^2 k^2}{2m} = \frac{\pi^2\hbar^2}{2mL^2}\,n^2$,   $n = 1, 2, \ldots$
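As an illustration of these quantized energies, here is a sketch with assumed example values (an electron of mass me confined to a 1 nm box; neither value appears in the original notes):

import math

hbar = 1.054571817e-34    # reduced Planck constant, J*s
m_e = 9.1093837015e-31    # electron mass, kg (assumed particle for illustration)
L = 1e-9                  # box length, m (assumed: 1 nm)

def energy_1d(n, m=m_e, L=L):
    # epsilon_n = (pi**2 * hbar**2 / (2 m L**2)) * n**2, n = 1, 2, ...
    return (math.pi**2 * hbar**2) / (2 * m * L**2) * n**2

for n in range(1, 4):
    eps = energy_1d(n)
    print(f"n = {n}: {eps:.3e} J = {eps / 1.602176634e-19:.3f} eV")
# For an electron in a 1 nm box the ground-state energy is about 0.38 eV.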
The integer n is the quantum number of the one-dimensional box.
- Stationary states for the particle in the box:

$\psi_n(x) = A\sin\!\left(\frac{n\pi x}{L}\right)$   for 0 ≤ x ≤ L and n = 1, 2, ...

For each value of the quantum number n there is a specific
wavefunction ψn(x) describing the state of the particle with energy εn.
- For the case of a three-dimensional box with dimensions Lx, Ly, and
Lz, the energy becomes

$\varepsilon = \frac{\pi^2\hbar^2}{2m}\left(\frac{n_x^2}{L_x^2} + \frac{n_y^2}{L_y^2} + \frac{n_z^2}{L_z^2}\right)$
Any particular quantum state is designated by three quantum
numbers nx, ny, and nz.
If Lx = Ly = Lz = L, then

$\varepsilon_i = \frac{\pi^2\hbar^2}{2mL^2}\left(n_x^2 + n_y^2 + n_z^2\right) = \frac{\pi^2\hbar^2}{2mL^2}\,n_i^2$,   where $n_i^2 = n_x^2 + n_y^2 + n_z^2$ and $n_x, n_y, n_z = 1, 2, \ldots$
- ni is the total quantum number for states whose energy level is εi.
- εi depends only on the value of ni² and not on the individual
values of the integers (nx, ny, nz).
- The volume V of a cubical box equals L³, so L² = V^{2/3} and hence

$\varepsilon_i = \frac{\pi^2\hbar^2}{2mV^{2/3}}\,n_i^2$

As V decreases, the value of εi increases.
Table. First three energy levels of a three-dimensional infinite potential well.

Level | Energy state         | (nx, ny, nz)              | ni² | gi
i = 1 | Ground state         | (1,1,1)                   | 3   | 1
i = 2 | First excited state  | (1,1,2); (1,2,1); (2,1,1) | 6   | 3
i = 3 | Second excited state | (1,2,2); (2,1,2); (2,2,1) | 9   | 3
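These degeneracies can be generated by enumerating quantum-number triples directly; the sketch below (illustrative, not from the original notes) counts states with equal ni² = nx² + ny² + nz²:

from itertools import product
from collections import Counter

# Group quantum states of the cubic box by n_i**2 = nx**2 + ny**2 + nz**2;
# states with the same n_i**2 have the same energy, so the count is the degeneracy g_i.
n_max = 4   # enumerating nx, ny, nz = 1..n_max is enough for the lowest levels

degeneracy = Counter(
    nx**2 + ny**2 + nz**2
    for nx, ny, nz in product(range(1, n_max + 1), repeat=3)
)

for n_sq in sorted(degeneracy)[:3]:
    print(f"n_i^2 = {n_sq}: g_i = {degeneracy[n_sq]}")
# n_i^2 = 3: g_i = 1   (ground state)
# n_i^2 = 6: g_i = 3   (first excited state)
# n_i^2 = 9: g_i = 3   (second excited state)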
6- Density of Quantum States
When the quantum numbers ni are large, the energy levels εi are very
closely spaced, and the discrete spectrum may be treated as an
energy continuum.
Consequence:
We can regard the n’s and the ε’s as forming a continuous function
rather than a discrete set of values.
We are interested in finding the density of states g(ε).
We have:

$\varepsilon_i = \frac{\pi^2\hbar^2}{2mV^{2/3}}\,n_i^2$

Dropping the subscripts in the equation above, using $\hbar = h/2\pi$, and
solving for n², we obtain
$n^2 = n_x^2 + n_y^2 + n_z^2 = \left(\frac{8mV^{2/3}}{h^2}\right)\varepsilon = R^2$

According to the equation above, for a given value of ε, the values of
nx, ny, nz that satisfy this equation lie on the surface of a sphere of
radius R.

g(ε)dε = the number of quantum states whose energy lies in the range
ε to ε + dε.

g(ε)dε is the number of states whose quantum numbers (nx, ny, nz) lie
within the infinitesimally thin shell of the octant of a sphere with
radius proportional to the square root of the energy.

Figure. Octant of the sphere of radius R in (nx, ny, nz) space; the thin
shell between ε and ε + dε contains g(ε)dε states.
Evidently,

$g(\varepsilon)\,d\varepsilon = n(\varepsilon + d\varepsilon) - n(\varepsilon) = \frac{dn(\varepsilon)}{d\varepsilon}\,d\varepsilon$

n(ε) is the number of states contained within the octant of the sphere
of radius R; that is,

$n(\varepsilon) = \frac{1}{8}\cdot\frac{4}{3}\pi R^3 = \frac{\pi}{6}\,V\left(\frac{8m}{h^2}\right)^{3/2}\varepsilon^{3/2}$

$g(\varepsilon)\,d\varepsilon = \frac{dn(\varepsilon)}{d\varepsilon}\,d\varepsilon = \frac{4\sqrt{2}\,\pi V}{h^3}\,m^{3/2}\,\varepsilon^{1/2}\,d\varepsilon$
The last equation takes into account only the translational motion of a
particle of the system. But quantum particles may have spin as well.
Thus we must multiply the above equation by a spin factor γs:

$g(\varepsilon)\,d\varepsilon = \gamma_s\,\frac{4\sqrt{2}\,\pi V}{h^3}\,m^{3/2}\,\varepsilon^{1/2}\,d\varepsilon$
where γs = 1 for spin zero bosons and γs = 2 for spin one-half
fermions.
Bosons are particles with integer spin. Fermions are particles with
half-integer spin.
For molecules, states associated with rotation and vibration may
exist in addition to translation and spin.
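As a rough check of the continuum treatment (an illustrative sketch, not from the original notes, written in dimensionless units where energies are measured in units of h²/(8mL²)), the octant-of-sphere estimate of the number of states below ε can be compared with a direct count over the quantum numbers (nx, ny, nz):

import math
from itertools import product

# Energies are measured in units of e0 = h**2 / (8 m L**2), so that
# eps(nx, ny, nz) = nx**2 + ny**2 + nz**2.
gamma_s = 2   # spin factor for spin-1/2 fermions (use 1 for spin-0 bosons)

def n_continuum(eps):
    # Continuum (octant-of-sphere) estimate of the number of states below eps:
    # n(eps) = gamma_s * (pi/6) * eps**1.5 in these units.
    return gamma_s * (math.pi / 6) * eps ** 1.5

def n_direct(eps, n_max=80):
    # Direct count of quantum-number triples (nx, ny, nz >= 1) with energy <= eps.
    return gamma_s * sum(
        1
        for nx, ny, nz in product(range(1, n_max + 1), repeat=3)
        if nx**2 + ny**2 + nz**2 <= eps
    )

for eps in (100, 1000, 4000):
    print(eps, round(n_continuum(eps)), n_direct(eps))
# The two counts approach each other as eps grows, which is what justifies
# treating the discrete spectrum as a continuum with density g(eps).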