Partition Function
Physics 313
Professor Lee Carkner
Lecture 24
Exercise #23 Statistics
Number of microstates from rolling 2 dice
Which macrostate has the most microstates?
7: (1,6), (6,1), (2,5), (5,2), (3,4), (4,3); total = 6 microstates
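As a quick check of this counting, here is a minimal Python sketch (the names and structure are my own, not from the lecture) that enumerates all 36 microstates of two dice and tallies how many correspond to each macrostate (the sum):

```python
from collections import Counter
from itertools import product

# Enumerate all 36 microstates (ordered pairs of faces) for two dice
microstates = list(product(range(1, 7), repeat=2))

# Group microstates by macrostate (the sum of the two faces)
counts = Counter(a + b for a, b in microstates)

for total in sorted(counts):
    print(f"sum = {total:2d}: {counts[total]} microstates")

print("macrostate with the most microstates:", max(counts, key=counts.get))  # 7
```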
Entropy and dice
Since the entropy tends to increase, after rolling a
non-seven your next roll should have higher
entropy
Why is the 2nd law violated?
Partition Function
We can write the partition function as:
Z(V,T) = Σ_i g_i e^(-ε_i/kT)
Z is a function of temperature and volume
We can find other properties in terms of the
partition function
(∂Z/∂T)_V = ZU/(NkT²)
We can rewrite this in terms of U:
U = NkT² (∂ ln Z/∂T)_V
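To make the Z → U recipe concrete, here is a minimal SymPy sketch for a hypothetical two-level system (energies 0 and ε, each with degeneracy 1; this example is mine, not from the lecture), evaluating U = NkT²(∂ ln Z/∂T)_V:

```python
import sympy as sp

# Symbols: temperature T, level spacing eps, Boltzmann constant k, particle number N
T, eps, k, N = sp.symbols('T epsilon k N', positive=True)

# Partition function Z = sum_i g_i e^(-eps_i/kT) for levels at 0 and eps, g = 1 each
Z = 1 + sp.exp(-eps / (k * T))

# Internal energy from U = N k T^2 (d ln Z / dT)_V
U = sp.simplify(N * k * T**2 * sp.diff(sp.log(Z), T))
print(U)                        # equivalent to N*eps / (exp(eps/(k*T)) + 1)
print(sp.limit(U, T, sp.oo))    # high-T limit: N*eps/2 (both levels equally occupied)
```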
Entropy
We can also use the partition function in
relation to entropy
Starting from S = k ln W, where W can be written as a function of N and Z:
S = Nk ln(Z/N) + U/T + Nk
We can also find the pressure:
P = NkT (∂ ln Z/∂V)_T
Ideal Gas Partition Function
To find ideal gas partition function:
Result:
Z = V (2πmkT/h²)^(3/2)
We can use this to get back our ideal gas relations
e.g., the ideal gas law, PV = NkT
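As a sketch of how the ideal-gas relations come back out (my own SymPy check, not the lecture's worked derivation), plug Z = V(2πmkT/h²)^(3/2) into U = NkT²(∂ ln Z/∂T)_V and P = NkT(∂ ln Z/∂V)_T:

```python
import sympy as sp

V, T, m, k, h, N = sp.symbols('V T m k h N', positive=True)

# Single-particle ideal gas partition function: Z = V (2*pi*m*k*T / h^2)^(3/2)
Z = V * (2 * sp.pi * m * k * T / h**2) ** sp.Rational(3, 2)

U = sp.simplify(N * k * T**2 * sp.diff(sp.log(Z), T))   # U = N k T^2 (d ln Z/dT)_V
P = sp.simplify(N * k * T * sp.diff(sp.log(Z), V))      # P = N k T (d ln Z/dV)_T

print(U)   # 3*N*k*T/2   -> U = (3/2) N k T
print(P)   # N*k*T/V     -> P V = N k T, the ideal gas law
```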
Equipartition of Energy
The kinetic energy of a molecule is:
ε = (1/2)mv² = (1/2)mv_x² + (1/2)mv_y² + (1/2)mv_z²
Other forms of energy can also be written in
similar form
The total energy is the sum of all of these
terms
ε = (f/2)kT
This represents equipartition of energy since
each degree of freedom has the same energy
associated with it (1/2 k T)
Degrees of Freedom
For diatomic gases there are 3 translational and 2 rotational degrees of freedom, so f = 5
Energy per mole: u = (5/2)RT (since k = R/N_A)
At constant volume u = c_V T, so c_V = (5/2)R
In general, the number of degrees of freedom increases with increasing T
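A tiny numerical sketch of c_V = (f/2)R for these cases (the monatomic line, f = 3, is my addition for comparison):

```python
R = 8.314  # gas constant, J/(mol K)

# Molar heat capacity at constant volume: c_V = (f/2) R
for gas, f in [("monatomic (3 translational)", 3),
               ("diatomic (3 translational + 2 rotational)", 5)]:
    c_V = (f / 2) * R
    print(f"{gas}: f = {f}, c_V = {c_V:.2f} J/(mol K)")
```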
Speed Distribution
We know the number of particles with a specific
energy:
N_ε = (N/Z) g_ε e^(-ε/kT)
We can then find the speed distribution:
dN_v/dv = (2N/(2π)^(1/2)) (m/kT)^(3/2) v² e^(-mv²/2kT)
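A small numerical sketch (my own check, with illustrative numbers roughly corresponding to N₂ at 300 K) that evaluates dN_v/dv as written above and verifies it integrates back to N:

```python
import numpy as np

k = 1.380649e-23   # Boltzmann constant, J/K
m = 4.65e-26       # kg, roughly the mass of an N2 molecule (illustrative choice)
T = 300.0          # K
N = 1.0            # normalize to one particle for simplicity

v = np.linspace(0.0, 3000.0, 30001)   # speeds in m/s; grid spans well past the peak
dv = v[1] - v[0]

# dN_v/dv = (2N/(2*pi)^(1/2)) (m/kT)^(3/2) v^2 exp(-m v^2 / 2kT)
dNdv = (2 * N / np.sqrt(2 * np.pi)) * (m / (k * T))**1.5 * v**2 \
       * np.exp(-m * v**2 / (2 * k * T))

print("integral of dN_v/dv over all speeds:", np.sum(dNdv) * dv)    # ~ N = 1
print("most probable speed (peak of dN_v/dv):", v[np.argmax(dNdv)], "m/s")
```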
Maxwellian Distribution
What characterizes the Maxwellian
distribution?
The tail is important
Maxwell’s Tail
Most particles in a Maxwellian distribution have a velocity near the root-mean-squared velocity:
v_rms = (3kT/m)^(1/2)
We can approximate the high velocities in the
tail with:
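The exact tail approximation from the slide is not reproduced here; instead, as a rough numerical sketch (same illustrative N₂-at-300 K numbers as above), here is the fraction of particles beyond a few multiples of v_rms, found by integrating the speed distribution directly:

```python
import numpy as np
from scipy.integrate import quad

k = 1.380649e-23   # Boltzmann constant, J/K
m = 4.65e-26       # kg, roughly an N2 molecule (illustrative)
T = 300.0          # K

def dNdv(v, N=1.0):
    """Maxwellian speed distribution dN_v/dv, normalized to N particles."""
    return (2 * N / np.sqrt(2 * np.pi)) * (m / (k * T))**1.5 * v**2 \
           * np.exp(-m * v**2 / (2 * k * T))

v_rms = np.sqrt(3 * k * T / m)   # v_rms = (3kT/m)^(1/2)

# Fraction of particles in the tail beyond 1x, 2x, 3x v_rms
for factor in (1, 2, 3):
    frac, _ = quad(dNdv, factor * v_rms, np.inf)
    print(f"fraction with v > {factor} * v_rms: {frac:.2e}")
```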
Entropy
We can write the entropy as:
S = k ln W
Where W is the number of accessible states to
which particles can be randomly distributed
We have no idea where an individual particle
may end up, only what the bulk distribution
might be
Entropy and Information
More information = less disorder
I = k ln(W_0/W_1)
Information is equal to the decrease in
entropy for a system
Obtaining that information must also cause a greater increase in the entropy of the universe
The process of obtaining information
increases the entropy of the universe
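A tiny worked example connecting this to the earlier dice exercise (the specific scenario is my own illustration): if learning that the sum is 7 narrows the 36 equally likely microstates of two dice down to 6, the information gained is I = k ln(W_0/W_1):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

W0 = 36   # accessible microstates of two dice before we know anything
W1 = 6    # microstates remaining once we learn the sum is 7

I = k * math.log(W0 / W1)   # I = k ln(W0/W1)
print(I, "J/K")             # equals the decrease in the system's entropy
```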
Maxwell’s Demon
If hot and cold are due to the relative
numbers of fast and slow moving particles,
what if you could sort them?
Could transfer heat from cold to hot
But the demon needs to get information about the molecules, which raises entropy