Transcript PowerPoint

Lecture 39: Review for Final
Chapters 1-10, Monday April 21st
• Announcements
• Exam 3 statistics
• Review third exam
• Quiz (not necessarily in this order)
• Review Chapters 3 & 4
Reading:
Chapters 1-10 (pages 1 - 207)
Final: Wed. April 30th, 5:30-7:30pm in here
Exam will be cumulative
Exam 3 statistics
Mean = 64%
Median = 60%
1 perfect score
Q1 - 5.1, Q2 - 7.5, Q3 - 6.5
[Histogram: number of students vs. score (%)]
Exam statistics with 1 drop
Mean = 75%
Median = 76%
2 scores > 100
Scale factors: Exam 1 - 1.18, Exam 2 - 1.00, Exam 3 - 1.08
[Histogram: number of students vs. score (%)]
Review of Chapters 3 & 4
Classical and statistical probability
Classical probability:
•Consider all possible outcomes (simple events) of a process
(e.g. a game).
•Assign an equal probability to each outcome.
Let W = number of possible outcomes (ways).
Assign probability p_i to the ith outcome:
p_i = 1/W  and  Σ_i p_i = W × (1/W) = 1
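The counting above can be checked in a few lines of Python; the six-sided die is an illustrative choice (W = 6), not an example from the slides.

```python
from fractions import Fraction

# Classical probability: W equally likely outcomes, each with p_i = 1/W.
W = 6  # hypothetical example: a fair six-sided die
p = [Fraction(1, W) for _ in range(W)]

# The probabilities sum to W * (1/W) = 1 exactly.
total = sum(p)
```

Using `Fraction` keeps the arithmetic exact, so the normalization check is not clouded by floating-point rounding.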
Classical and statistical probability
Statistical probability:
•Probability determined by measurement (experiment).
•Measure frequency of occurrence.
•Not all outcomes necessarily have equal probability.
•Make N trials
•Suppose the ith outcome occurs n_i times
p_i = lim_{N→∞} (n_i / N)
Statistical fluctuations
Fluctuations in a measured frequency decrease as σ ∝ N^(−1/2).
Fit: log(σ) = a log(N) + b, with slope a ≈ −0.516.

N       σ
1       0.5
10      0.15
100     0.04
1000    0.0132
10000   0.00356
100000  0.00145

[Plot: log(σ) vs. log(N) with linear fit]
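The N^(−1/2) scaling can be checked numerically. This sketch uses a hypothetical coin-flip experiment (not the slide's data): for each N it repeats the "measure the frequency of heads" experiment many times and takes the spread of the results.

```python
import math
import random

random.seed(1)

def freq_spread(N, trials=200):
    """Standard deviation of the measured heads-frequency over many runs."""
    freqs = [sum(random.random() < 0.5 for _ in range(N)) / N
             for _ in range(trials)]
    mean = sum(freqs) / trials
    return math.sqrt(sum((f - mean) ** 2 for f in freqs) / trials)

s10 = freq_spread(10)
s1000 = freq_spread(1000)

# Increasing N by 100x should shrink the spread by roughly 10x (N^(-1/2)).
ratio = s10 / s1000
```

The ratio comes out near 10, consistent with the slope ≈ −1/2 read off the log-log fit.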
The axioms of probability theory
1. pi ≥ 0, i.e. pi is positive or zero
2. pi ≤ 1, i.e. pi is less than or equal to 1
3. For mutually exclusive events, probabilities add.
• Compound events (i + j): this means either event i occurs, or event j
occurs, or both.
• Mutually exclusive: events i and j are said to be mutually exclusive
if it is impossible for both outcomes (events) to occur in a single
trial.
• In general, for r mutually exclusive events, the probability that one
of the r events occurs is given by:
p = p_1 + p_2 + ... + p_r
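A minimal illustration of axiom 3; the "die shows an even face" example is assumed, not from the slides.

```python
from fractions import Fraction

# The events "shows 2", "shows 4", "shows 6" are mutually exclusive:
# no single roll can produce two of them. Their probabilities add.
p_even = Fraction(1, 6) + Fraction(1, 6) + Fraction(1, 6)
```

The result is 1/2, as expected for three mutually exclusive outcomes of a fair die.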
Independent events
Example:
What is the probability of
rolling two sixes?
Classical probabilities:
p_6 = 1/6
Two sixes:
p_{6,6} = (1/6) × (1/6) = 1/36
•Truly independent events always satisfy this multiplicative property.
•In general, the probability of occurrence of r independent events is:
p = p_1 × p_2 × ... × p_r
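The two-sixes result can be cross-checked by simulation; the trial count below is an arbitrary choice.

```python
import random
from fractions import Fraction

random.seed(2)

# Independent events multiply: p(6,6) = (1/6)(1/6) = 1/36.
p_exact = Fraction(1, 6) * Fraction(1, 6)

# Cross-check by rolling two independent dice many times.
N = 200_000
hits = sum(random.randint(1, 6) == 6 and random.randint(1, 6) == 6
           for _ in range(N))
p_sim = hits / N
```

The simulated frequency lands within sampling error of 1/36 ≈ 0.0278.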
Statistical distributions
[Histogram: n_i vs. x_i]
Mean:
x̄ = (1/N) Σ_i n_i x_i, where N = Σ_i n_i
Statistical distributions
[Histogram: n_i/N vs. x_i, in the limit N → ∞]
Mean:
x̄ = Σ_i p_i x_i, where p_i = lim_{N→∞} (n_i / N)
Statistical distributions
[Histogram: n_i vs. x_i]
Standard deviation:
σ² = Σ_i p_i (x_i − x̄)²
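The mean and standard-deviation formulas apply directly to a frequency table; the counts below are hypothetical, not the histogram shown on the slide.

```python
import math

# Frequency table {x_i: n_i} (made-up values for illustration).
counts = {6: 1, 7: 3, 8: 5, 9: 4, 10: 2}

N = sum(counts.values())                           # N = sum_i n_i
mean = sum(n * x for x, n in counts.items()) / N   # x-bar = (1/N) sum_i n_i x_i
# sigma^2 = sum_i p_i (x_i - x-bar)^2, with p_i = n_i / N
var = sum((n / N) * (x - mean) ** 2 for x, n in counts.items())
sigma = math.sqrt(var)
```

Note that the variance uses the probabilities p_i = n_i/N, so it is the population variance of the distribution, matching the slide's definition.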
Statistical distributions
Gaussian distribution (Bell curve):
p(x) = (1 / (σ √(2π))) exp[−(x − x̄)² / (2σ²)]
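A quick numerical check that the Gaussian above is normalized; the values of x̄ and σ are arbitrary choices for the sketch.

```python
import math

xbar, sigma = 2.0, 0.5  # arbitrary mean and standard deviation

def p(x):
    """Gaussian p(x) = (1 / (sigma*sqrt(2*pi))) exp(-(x - xbar)^2 / (2 sigma^2))."""
    return math.exp(-(x - xbar) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Riemann sum over +/- 5 sigma; should come out very close to 1.
dx = 0.001
total = sum(p(xbar - 5 * sigma + i * dx) * dx
            for i in range(int(10 * sigma / dx)))
```

Five standard deviations capture all but ~6 × 10⁻⁷ of the probability, so the sum agrees with 1 to well within the step-size error.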
Statistical Mechanics – ideas and definitions
A quantum state, or microstate
• A unique configuration.
• To know that it is unique, we must specify it as
completely as possible...
Classical probability
• Cannot use statistical probability.
• Thus, we are forced to use classical probability.
An ensemble
• A collection of separate systems prepared in
precisely the same way.
Statistical Mechanics – ideas and definitions
The microcanonical ensemble:
Each system has same:
# of particles
Total energy
Volume
Shape
Magnetic field
Electric field
............
and so on....
These variables (parameters) specify the ‘macrostate’
of the ensemble. A macrostate is specified by ‘an
equation of state’. Many, many different microstates
might correspond to the same macrostate.
Ensembles and quantum states (microstates)
[Sketch: volume V divided into 36 cells of equal cell volume, containing 10 particles]
p_i = (1/36)^10 ≈ 3 × 10^(−16)
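The quoted microstate probability is a direct computation:

```python
# Each of 10 particles independently occupies one of 36 equally likely
# cells, so one particular arrangement has probability (1/36)^10.
p_i = (1 / 36) ** 10
```

This evaluates to roughly 2.7 × 10⁻¹⁶, matching the ≈ 3 × 10⁻¹⁶ on the slide.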
Entropy
Boltzmann hypothesis: the entropy of a system is related to
the probability of its being in a state.
p = 1/W
S = f(W)
S = k_B ln(W)
Rubber band model
[Sketch: N links, each of length d]
W(N, n) = N! / [n! (N − n)!]
Stirling's approximation: ln(N!) ≈ N ln(N) − N
ln W = N ln(N) − n ln(n) − (N − n) ln(N − n)
     = −N [((1 + x)/2) ln((1 + x)/2) + ((1 − x)/2) ln((1 − x)/2)],
where x is defined by n = N(1 + x)/2