Entropy, probability and disorder
Thermal equilibrium
Experience tells us: two objects in thermal contact will attain the same temperature and keep that temperature.
Why? More than just energy conservation!
It involves the concept of entropy.
Entropy and disorder
It is often said that entropy is a measure of disorder, and hence every system in isolation evolves to the state with “most disorder”.
Consider a box sliding on a floor:
internal energy due to the disorderly motion of the molecules
kinetic energy (of the box) due to the collective, orderly motion of all the molecules
Entropy and disorder II
Now the box comes to rest due to friction.
The temperature rises in both floor and box, so the internal energy increases.
No more collective motion: all kinetic energy has been transferred into internal energy.
More disorder, so entropy has increased.
A vessel of two halves
A large number of identical molecules: how are they distributed?
About 50% in the left half, 50% in the right half.
Why?
Definitions
Microstate: the position and momentum of each molecule accurately specified.
Macrostate: only overall features specified.
Multiplicity: the number of microstates corresponding to the same macrostate.
Fundamental assumption
Statistical Mechanics is built around this one central assumption:
Every microstate is equally likely to occur.
This is just like throwing dice:
A throw of the dice
Roll one die: 1/2/3/4/5/6 all equally likely.
Roll a pair of dice:
for each die, 1/2/3/4/5/6 are equally likely
the sum 7 is the most likely, then 6 and 8, etc.
Why? 6 combinations (microstates) give 7 (the macrostate): 1+6, 2+5, 3+4, 4+3, 5+2, 6+1. There are 5 combinations that give 6 or 8, etc.
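These multiplicities are easy to verify by brute force; a minimal Python sketch enumerating all 36 ordered rolls:

```python
from itertools import product

# Count how many microstates (ordered rolls) produce each macrostate (the sum).
counts = {}
for roll in product(range(1, 7), repeat=2):   # 36 equally likely microstates
    total = sum(roll)
    counts[total] = counts.get(total, 0) + 1

for total in sorted(counts):
    print(f"sum {total:2d}: {counts[total]} microstates, "
          f"probability {counts[total]}/36")
```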
Four identical molecules
4 molecules: A, B, C, D
5 macrostates: #left = 4, 3, 2, 1 or 0
Four identical molecules (2)
All four molecules on the left:
left: A&B&C&D    right: (none)
multiplicity: 1
Three on the left, one on the right:
left: A&B&C    right: D
left: A&B&D    right: C
left: A&C&D    right: B
left: B&C&D    right: A
multiplicity: 4
Four identical molecules (3)
Two on the left, two on the right:
left: A&B    right: C&D
left: A&C    right: B&D
left: A&D    right: B&C
left: B&C    right: A&D
left: B&D    right: A&C
left: C&D    right: A&B
multiplicity: 6
Four identical molecules (4)
#left   #right   multiplicity   probability
4       0        1              1/16
3       1        4              4/16
2       2        6              6/16
1       3        4              4/16
0       4        1              1/16
total            16             1
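The same table can be reproduced by enumerating the 2^4 = 16 microstates directly; a short sketch:

```python
from itertools import product

# Each of the 4 molecules is independently in the left (L) or right (R) half:
# 2**4 = 16 equally likely microstates.
multiplicity = {}
for microstate in product("LR", repeat=4):
    n_left = microstate.count("L")
    multiplicity[n_left] = multiplicity.get(n_left, 0) + 1

for n_left in sorted(multiplicity, reverse=True):
    w = multiplicity[n_left]
    print(f"#left = {n_left}: multiplicity {w}, probability {w}/16")
```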
Ten identical molecules
Multiplicity of finding 10, 9, …, 0 molecules on the left:
1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1
Probability of finding #left = 4, 5 or 6:
(210 + 252 + 210)/1024 ≈ 0.66
For large N: extremely likely that #left is very close to N/2.
Generalisation
Look at a gas of N molecules in a vessel with two “halves”.
The total number of microstates is 2^N:
two possible locations for each molecule
we’ve just seen the N = 4 example
Binomial distribution I
A gas contains N molecules, N1 in the left half (“state 1”) and N2 = N – N1 in state 2 (the right half). How many microstates correspond to this situation?
[Diagram: vessel with N1 molecules in the left half and N2 in the right half.]
Binomial distribution II
Pick the molecules one by one and place them in the left-hand side:
choose from N molecules for the first molecule
choose from N – 1 for the second
choose from N – 2 for the third, …
choose from N – N1 + 1 for the N1-th molecule
Binomial distribution III
Number of ways of getting N1 molecules into the left half:
$$N \cdot (N-1) \cdot (N-2) \cdots (N-N_1+1) = \frac{N!}{(N-N_1)!}$$
The macrostate doesn’t depend on the order of picking these molecules; there are N1! ways of picking them. The multiplicity W is the mathematical “combination”:
$$W = \binom{N}{N_1} = \frac{N!}{(N-N_1)!\,N_1!}$$
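This formula reproduces the N = 10 numbers quoted above; for instance, with Python’s built-in math.comb:

```python
from math import comb

N = 10
W = [comb(N, n1) for n1 in range(N + 1)]
print(W)          # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]

total = 2**N      # 1024 microstates in all
p_middle = sum(comb(N, k) for k in (4, 5, 6)) / total
print(p_middle)   # ≈ 0.66
```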
Verification
Look at a gas with molecules A, B, C, D, E.
Look at the number of ways of putting 2 molecules into the left half of the vessel.
So: N = 5, N1 = 2, N – N1 = 3
Verification II
The first molecule is A, B, C, D, or E.
Pick the second molecule. If I first picked A then I can now pick B, C, D or E, etc.:
AB  AC  AD  AE
BA  BC  BD  BE
CA  CB  CD  CE
DA  DB  DC  DE
EA  EB  EC  ED
That is $5 \cdot 4 = \frac{5 \cdot 4 \cdot 3 \cdot 2 \cdot 1}{3 \cdot 2 \cdot 1} = \frac{5!}{3!} = 20$ possibilities.
Verification III
In the end I don’t care which molecule went in first. So all pairs AB and BA, AC and CA, etc., really correspond to the same situation. We must divide by 2! = 2, leaving 20/2 = 10 = C(5, 2) distinct pairs.
[Diagram: placing A then B gives the same configuration as placing B then A.]
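itertools makes the same count immediate; a quick sketch:

```python
from itertools import combinations, permutations

molecules = "ABCDE"
ordered = list(permutations(molecules, 2))    # 20 ordered pairs: AB, BA, ...
unordered = list(combinations(molecules, 2))  # 10 pairs once order is ignored

print(len(ordered), len(unordered))           # 20 = 5!/3!, 10 = 5!/(3! 2!)
```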
Binomial distribution plotted
Look at N = 4, 10, 1000:
[Plots: P(N1) against N1 for N = 4, N = 10 and N = 1000; the peak at N1 = N/2 becomes relatively sharper as N grows.]
Probability and equilibrium
As time elapses, the molecules will wander all over the vessel.
After a certain length of time any molecule could be in either half with equal probability.
Given this situation it is overwhelmingly probable that very nearly half of them are in the left half of the vessel.
Second Law of Thermodynamics
Microscopic version:
If a system with many molecules is permitted to change in isolation, the system will evolve to the macrostate with the largest multiplicity and will then remain in that macrostate.
Spot the “arrow of time”!
Boltzmann’s Epitaph: S = k log W
Boltzmann linked heat, temperature, and multiplicity (!)
Entropy is defined by
S = k ln W
W: multiplicity; k: Boltzmann’s constant
s = “dimensionless entropy” = ln W
Second Law of Thermodynamics
Macroscopic version:
A system evolves to attain the state with maximum entropy.
Spot the “arrow of time”!
Question 1
Is entropy a state variable?
a) Yes
b) No
c) Depends on the system
Question 2
The total entropy of two systems, with
respective entropies S1 and S2, is given by
a) S = S1 + S2
b) S = S1 · S2
c) S = S1 – S2
d) S = S1 / S2
Entropy and multiplicity
The motion of each molecule of a gas in a vessel can be specified by location and velocity
⇒ multiplicity due to location and velocity.
Ignore the velocity part for the time being and look at the multiplicity due to location only.
Multiplicity due to location I
Divide the available space up into c small cells. Put N particles inside the space: W = c^N.
For c = 3, N = 2: W = 3² = 9:
AB | -  | -
A  | B  | -
A  | -  | B
B  | A  | -
-  | AB | -
-  | A  | B
B  | -  | A
-  | B  | A
-  | -  | AB
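The c^N count is easy to confirm by enumeration, e.g.:

```python
from itertools import product

cells = range(3)   # c = 3 cells, labelled 0, 1, 2
# Assign each of the N = 2 molecules (A and B) to one of the c cells.
microstates = list(product(cells, repeat=2))
print(len(microstates))   # c**N = 3**2 = 9
```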
Multiplicity due to location II
Increasing the available space is equivalent to increasing the number of cells c.
The volume is proportional to the number of cells c.
Hence W ∝ V^N.
“Slow” and “fast” processes
Slow processes are reversible: we’re always very close to equilibrium, so we can run things backwards.
Fast processes are irreversible: we really upset the system and drive it out of equilibrium, so we cannot run things backwards (without expending extra energy).
Slow isothermal expansion
Slow isothermal expansion of an ideal gas; small volume change ΔV:
$$\frac{W_\mathrm{final}}{W_\mathrm{initial}} = \left(\frac{V+\Delta V}{V}\right)^{N} = \left(1+\frac{\Delta V}{V}\right)^{N}$$
The “velocity part” of the multiplicity doesn’t change since T is constant.
[Diagram: gas expanding from volume V to V + ΔV.]
Slow isothermal expansion (2)
Use the First Law (isothermal, so ΔU = 0 and the heat added equals the work done):
$$Q = p\,\Delta V = NkT\,\frac{\Delta V}{V}$$
$$\frac{W_\mathrm{final}}{W_\mathrm{initial}} = \left(1+\frac{\Delta V}{V}\right)^{N} = \left(1+\frac{Q}{NkT}\right)^{N}$$
Big numbers ⇒ take the logarithm
Slow isothermal expansion (3)
Manipulation, using ln(1 + x) ≈ x for small x:
$$\ln\frac{W_\mathrm{final}}{W_\mathrm{initial}} = N\ln\!\left(1+\frac{Q}{NkT}\right) \approx N\,\frac{Q}{NkT} = \frac{Q}{kT}$$
or
$$k\ln W_\mathrm{final} - k\ln W_\mathrm{initial} = \frac{Q}{T}$$
Slow isothermal expansion (4)
Use the definition of entropy:
$$k\ln W_\mathrm{final} - k\ln W_\mathrm{initial} = \frac{Q}{T}$$
$$S_\mathrm{final} - S_\mathrm{initial} = \Delta S = \frac{Q}{T}$$
valid for slow isothermal expansion
Example
To melt an ice cube of 20 g at 0 °C we slowly add 6700 J of heat. What is the change in entropy? In multiplicity?
$$\Delta S = \frac{Q}{T} = \frac{6700\ \mathrm{J}}{273\ \mathrm{K}} \approx 24.5\ \mathrm{J\,K^{-1}};\qquad \frac{W_\mathrm{final}}{W_\mathrm{initial}} = e^{\Delta S/k} \approx 10^{770{,}000{,}000{,}000{,}000{,}000{,}000{,}000}$$
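A short numerical check of both numbers:

```python
from math import log

Q = 6700.0          # heat added (J)
T = 273.15          # melting point of ice (K)
k = 1.381e-23       # Boltzmann's constant (J/K)

dS = Q / T
print(dS)           # ≈ 24.5 J/K

# S = k ln W, so ln(W_final/W_initial) = dS/k; convert to a power of ten:
log10_ratio = dS / k / log(10)
print(log10_ratio)  # ≈ 7.7e23, i.e. W_final/W_initial ≈ 10**(7.7e23)
```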
Very fast adiabatic expansion
Expand very rapidly into the same final volume V + ΔV, where the extra volume is initially empty.
Isothermal: same #collisions, #molecules, etc.
Entropy change: ΔS = Q/T = 0 ???
NO! Entropy is a state variable, and the initial and final states are the same as in the slow isothermal expansion, so here ΔS ≠ Q/T:
ΔS = same as for the slow isothermal expansion.
Slow adiabatic expansion
Same volume change, but now we need to push air out of the way, so the temperature drops.
Again we ask: ΔS = Q/T = 0 ???
YES!
The “location part” of the multiplicity increases as with the slow isothermal expansion.
The “velocity part” decreases as the temperature drops.
The two exactly cancel.
Constant volume process
Heat is added to any (ideal or non-ideal) gas whose volume is kept constant. What is the change in entropy?
$$dQ = nC_V\,dT;\qquad dS = \frac{dQ}{T} = \frac{nC_V\,dT}{T}$$
Integrate (assuming C_V is constant):
$$\Delta S = \int dS = \int_{T_1}^{T_2}\frac{nC_V\,dT}{T} = nC_V\ln\frac{T_2}{T_1}$$
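A worked instance with hypothetical numbers (1 mol of a monatomic ideal gas heated from 300 K to 600 K at constant volume):

```python
from math import log

n = 1.0             # amount of gas (mol), assumed
R = 8.314           # gas constant (J/(mol K))
Cv = 1.5 * R        # molar heat capacity of a monatomic ideal gas
T1, T2 = 300.0, 600.0

dS = n * Cv * log(T2 / T1)
print(dS)           # ≈ 8.64 J/K
```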
Constant pressure processes
Heat is added to an ideal gas under constant pressure. What is the change in entropy?
a) ΔS = nC_V ln(T2/T1)
b) ΔS = nC_p ln(p2/p1)
c) ΔS = nC_p ln(V2/V1)
d) 0
Entropy and isothermal processes
An ideal gas expands isothermally. What is the change in entropy?
Constant temperature, so ΔS = Q/T.
First Law: Q = W = nRT ln(V2/V1) (done previously)
Therefore ΔS = nR ln(V2/V1) = nR ln(p1/p2)
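A quick numerical instance (hypothetical: 1 mol of ideal gas doubling its volume):

```python
from math import log

n = 1.0             # amount of gas (mol), assumed
R = 8.314           # gas constant (J/(mol K))
V1, V2 = 1.0, 2.0   # only the ratio V2/V1 matters

dS = n * R * log(V2 / V1)
print(dS)           # ≈ 5.76 J/K
```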
Entropy and equilibrium
We have established a link between multiplicity and thermodynamic properties such as heat and temperature.
Now we see how maximum entropy corresponds to maximum probability and hence to equilibrium.
Equilibrium volume
In general the number of microstates depends on both the volume available and the momentum (velocity) of the molecules.
Let’s ignore the momentum part and look at the spatial microstates only.
Equilibrium volume II
Say we have 3 molecules in a vessel which we split up into 6 equal parts. A partition can be placed anywhere between the cells. One molecule is on the left-hand side, the other two on the right-hand side. What is the equilibrium volume?
Look for maximum entropy!
Equilibrium volume III
Number of cells on the left c1, on the right c2. We’ll look at c1 = 4, c2 = 2:
[Diagram: example placements of molecule A in the 4 left-hand cells and molecules B and C in the 2 right-hand cells.]
Equilibrium volume IV
Left (1 molecule): W1 = c1 = 4.
Right (2 molecules): W2 = (c2)² = 4.
s = ln 4 + ln 4 = ln 16 = 2.77
Question
The dimensionless entropy of this system of 6 cells and one partition dividing it into c1 and c2 cells is
a) s = ln(c1 + c2)
b) s = ln(c1 + c2²)
c) s = ln c1 + ln 2c2
d) s = ln c1 + 2·ln c2
Equilibrium volume V
W = c1·(c2)² for each partition position:
c1      W      s       P
1       25     3.22    0.24
2       32     3.47    0.30
3       27     3.30    0.26
4       16     2.77    0.15
5        5     1.61    0.05
total  105             1
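The table can be regenerated with a short loop over the partition positions:

```python
from math import log

N1, N2 = 1, 2        # one molecule left, two right
C = 6                # total number of cells

rows = []
for c1 in range(1, C):
    c2 = C - c1
    W = c1**N1 * c2**N2          # multiplicity for this partition position
    rows.append((c1, W, log(W)))

total = sum(W for _, W, _ in rows)
for c1, W, s in rows:
    print(f"c1={c1}: W={W:3d}  s={s:.2f}  P={W/total:.2f}")
```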
Maximum entropy and probability
[Plots: dimensionless entropies s1, s2 and total s (left panel) and the probability (right panel) against c1 = 1…5; both the total entropy and the probability peak at c1 = 2.]
Maximum probability
The probability maximum coincides with the entropy maximum.
Volume V1 = c1·dV, where dV is the cell size.
Most likely situation when ds/dV1 = 0.
Same density on both sides: N1/V1 = 1/2; N2/V2 = 2/4 = 1/2 (molecules per cell).
Question
Which relationship holds for the probabilities of finding the system in a microstate corresponding to c1 = 2, 3, 4?
a) P(2) < P(3) < P(4)
b) P(2) = P(3) = P(4)
c) P(2) > P(3) > P(4)
Entropy and mixing
Suppose we remove the partition. What is the entropy of this system?
Answer: ln 6³ = ln 216 = 5.38
The additional entropy of 1.91 is called “the entropy of mixing”.
Generalisation I
Look at N1 particles occupying c1 cells on the left, N2 particles occupying c2 cells on the right. Volume of each cell = dV.
Multiplicity:
$$W = W_1\,W_2 = c_1^{N_1}\,c_2^{N_2} = \left(\frac{V_1}{dV}\right)^{\!N_1}\left(\frac{V_2}{dV}\right)^{\!N_2} = \frac{V_1^{N_1}\,(V-V_1)^{N_2}}{(dV)^{N}}$$
Generalisation II
Entropy:
$$s = \ln W = \ln\!\left[\frac{V_1^{N_1}(V-V_1)^{N_2}}{(dV)^{N}}\right] = N_1\ln V_1 + N_2\ln(V-V_1) - N\ln dV$$
Maximum entropy for equal densities:
$$\frac{ds}{dV_1} = \frac{N_1}{V_1} - \frac{N_2}{V-V_1} = 0 \;\Rightarrow\; \frac{N_1}{V_1} = \frac{N_2}{V_2}$$
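The maximum can also be located numerically; a sketch with hypothetical numbers (N1 = 2 molecules left, N2 = 4 right, total volume V = 6):

```python
from math import log

N1, N2, V = 2, 4, 6.0   # assumed values

def s(V1: float) -> float:
    """Dimensionless entropy, dropping the constant -N*ln(dV) term."""
    return N1 * log(V1) + N2 * log(V - V1)

# Scan V1 over a fine grid and locate the maximum.
grid = [0.01 * i for i in range(1, 600)]   # V1 from 0.01 to 5.99
V1_best = max(grid, key=s)
print(round(V1_best, 3))   # ≈ 2.0: N1/V1 = N2/V2, equal density on both sides
```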
Multiplicity and energy
According to quantum mechanics, atoms in a crystal have energies 0, ε, 2ε, 3ε, … (this is called the Einstein model of solids).
Say we have three atoms with total energy 3ε.
Microstates are distinguished by the different energies E1, E2, E3.
The microstates
The 10 microstates (E1, E2, E3) with total energy 3ε, grouped by E1:
E1 = 3ε: (3ε, 0, 0)
E1 = 2ε: (2ε, ε, 0), (2ε, 0, ε)
E1 = ε: (ε, 2ε, 0), (ε, 0, 2ε), (ε, ε, ε)
E1 = 0: (0, 3ε, 0), (0, 0, 3ε), (0, 2ε, ε), (0, ε, 2ε)
10 microstates in total.
Question
What is the probability for any of these three atoms to have energy 0? ε? 2ε? 3ε?
a) 4/10, 3/10, 2/10, 1/10
b) 1/4, 1/4, 1/4, 1/4
c) 1/10, 2/10, 3/10, 4/10
d) not sure
Generalisation
If there are n atoms and the total energy is qε, then the number of microstates is given by
$$W(q, n) = \frac{(q+n-1)!}{q!\,(n-1)!}$$
Works for the previous example (n = 3, q = 3):
$$W = \frac{5!}{3!\,2!} = 10$$
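A one-line Python version of this formula, checked against the three-atom example:

```python
from math import factorial

def einstein_multiplicity(q: int, n: int) -> int:
    """Number of ways to share q quanta of energy among n atoms."""
    return factorial(q + n - 1) // (factorial(q) * factorial(n - 1))

print(einstein_multiplicity(3, 3))   # 10, as in the three-atom example
```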
Partition I
Look at 10 particles, with total energy 20ε.
n1 = 3 particles on the left-hand side, n2 = 7 on the right-hand side.
What is the most likely energy distribution?
Plot W as a function of q1, the number of energy quanta on the left.
Partition II
[Plot: W1, W2 and the combined W = W1·W2 against q1, the energy on the left-hand side; W peaks near q1 = 6.]
Partition III
[Plot: entropies s1, s2 and total s = s1 + s2 against q1, the energy on the left-hand side; the total entropy peaks at the same q1.]
Partition IV
You expect the atoms on the left to have the same energy on average as the atoms on the right.
Calculation/plot shows this: W is maximum for
$$q_1 = \frac{n_1}{n_1+n_2}\,(q_1+q_2) \;\Leftrightarrow\; \frac{q_1}{n_1} = \frac{q_2}{n_2}$$
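The peak is easy to locate by enumeration with the multiplicity function from above; note that for numbers this small the exact discrete maximum lands one quantum below the continuum estimate q1 = 6:

```python
from math import factorial

def W(q: int, n: int) -> int:
    """Einstein-solid multiplicity: ways to share q quanta among n atoms."""
    return factorial(q + n - 1) // (factorial(q) * factorial(n - 1))

n1, n2, q = 3, 7, 20
combined = [W(q1, n1) * W(q - q1, n2) for q1 in range(q + 1)]
q1_best = max(range(q + 1), key=lambda q1: combined[q1])

# The discrete maximum (q1 = 5 here) sits within one quantum of the
# continuum estimate q*n1/(n1+n2) = 6; for large systems the two agree.
print(q1_best, combined[q1_best])
```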
Entropy, energy, temperature I
The internal energy on the left is U1 = q1ε.
Equilibrium / entropy maximum when ds/dU1 = 0.
Use s = s1 + s2:
$$\frac{ds_1}{dU_1} + \frac{ds_2}{dU_1} = 0$$
Use U2 = U – U1:
$$\frac{ds_2}{dU_1} = \frac{ds_2}{dU_2}\,\frac{dU_2}{dU_1} = -\frac{ds_2}{dU_2}$$
Entropy, energy, temperature II
It follows that the most likely distribution of energy corresponds to a situation where
$$\frac{ds_1}{dU_1} = \frac{ds_2}{dU_2} \qquad\text{or}\qquad \frac{dS_1}{dU_1} = \frac{dS_2}{dU_2}$$
We know that in this situation T1 = T2.
Clearly the two are linked!
Entropy, energy, temperature III
Remember: we kept V and N constant, so the only way in which energy could be exchanged was through heat transfer.
Remember:
$$\Delta S = \frac{Q}{T} = \left(\frac{\Delta U}{T}\right)_{\text{due to heating only}}$$
$$\frac{1}{T} = \left(\frac{\Delta S}{\Delta U}\right)_{\text{due to heating only}} = \left(\frac{\partial S}{\partial U}\right)_{\text{fixed external parameters}}$$
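This definition can be tried out on the Einstein solid, estimating dS/dU with a finite difference; a sketch with assumed toy values for the quantum size ε and the system size:

```python
from math import factorial, log

def W(q: int, n: int) -> int:
    """Einstein-solid multiplicity: ways to share q quanta among n atoms."""
    return factorial(q + n - 1) // (factorial(q) * factorial(n - 1))

# 1/T = dS/dU at fixed external parameters. For an Einstein solid U = q*eps,
# so dS/dU can be estimated with a finite difference in q.
k = 1.381e-23      # Boltzmann's constant (J/K)
eps = 1.0e-21      # size of one energy quantum (J); an assumed value
n, q = 50, 100     # assumed toy solid

dS = k * (log(W(q + 1, n)) - log(W(q - 1, n)))   # entropy change over 2 quanta
dU = 2 * eps
print(dU / dS)     # ≈ 180 K for these numbers
```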
Entropy, energy, temperature IV
In our example the only external parameter
was the volume
In general, gravitational, electric or magnetic
fields, elastic energy etc. could all change
The definition of temperature only holds if all
of these are held fixed
PS225 – Thermal Physics topics
The atomic hypothesis
Heat and heat transfer
Kinetic theory
The Boltzmann factor
The First Law of Thermodynamics
Specific Heat
Entropy
Heat engines
Phase transitions