Transcript Part I

Chapter 1
Introduction to Statistical Methods
• Some pure math discussion for a while! The math of probability & statistics.
“The true logic of this
world is in the calculus
of probabilities”.
James Clerk Maxwell
Relevance of Probability to Physics
• In this course, we’ll discuss the Physics of systems containing HUGE numbers (>> 10²³) of particles:
⇒ Solids, liquids, gases, EM radiation (photons & other quantum particles), ...
The Challenge
• Formulate a theory to describe a system’s Macroscopic characteristics starting from a Microscopic theory.
• Classical Mechanics: Newton’s Laws. Need to solve >> 10²³ coupled differential equations of motion! (ABSURD!!)
• Quantum Mechanics: Schrödinger’s Equation: need a solution for >> 10²³ particles! (ABSURD!!)
• Historically, this led to the use of a
Statistical description of such a system.
⇒ So, we’ll talk about Probabilities & Average System Properties.
• We are NOT concerned with the
detailed behavior of individual
particles.
Definitions:
Microscopic: ~ atomic dimensions, i.e. ≲ a few Å
Macroscopic: Large enough to be
“visible” in the “ordinary” sense
• An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. This is the usual case in this course!
• But note! Even if its Macroscopic parameters are time-independent, a system’s Microscopic parameters can still vary with time!
Random Walk ⇒ Binomial Distribution
Section 1.1
Elementary Statistical Concepts & Examples
• Math preliminaries (methods) for the next few lectures.
• To treat statistical physics problems, we must
first know something about the mathematics of
Probability & Statistics
• The following should hopefully be a review! (?)
• Keep in mind: Whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, N → ∞) of “similarly prepared systems”.
This assembly is called an
ENSEMBLE
(“Ensemble” = French word for Assembly).
• The Probability of an occurrence of a
particular event is DEFINED with
respect to this particular ensemble & is
given by the fraction of systems in the
ensemble characterized by the
occurrence of this event.
Example
• In throwing a pair of dice, we can give a
statistical description by considering that a
very large number N of similar pairs of dice
are thrown under similar circumstances.
• Alternatively, we could imagine the same
pair of dice thrown N times under similar
circumstances.
• The probability of obtaining two 1’s is then given by the fraction of these experiments in which the outcome is two 1’s. (See the numerical sketch below.)
• Note that this probability depends
strongly on the type of ensemble to
which we are referring.
• See Reif’s flower seed example (p. 5).
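• As a concrete illustration (a minimal Python sketch, not from Reif; the ensemble size of 100,000 throws is an arbitrary choice), we can build the dice ensemble numerically and take the probability as the fraction of throws giving two 1’s. The exact value is 1/36 ≈ 0.028.

```python
import random

# Ensemble of N "similarly prepared" experiments:
# each member is one throw of a pair of fair dice.
N = 100_000
two_ones = sum(
    1 for _ in range(N)
    if random.randint(1, 6) == 1 and random.randint(1, 6) == 1
)

# Probability = fraction of ensemble members showing the event.
print(f"Estimated P(two 1's) = {two_ones / N:.4f}  (exact: {1/36:.4f})")
```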
• To quantitatively introduce probability
concepts, we use a specific, simple
example, which is actually much more
general than you first might think.
• The example is called
“The Random Walk Problem”
The 1-Dimensional Random Walk
“The most important questions
of life are indeed, for the most
part, really only problems of
probability.”
Pierre Simon Laplace
“Théorie Analytique des
Probabilités”, 1812
The One-Dimensional Random Walk
• In its simplest, crudest, most idealized form, the random walk problem can be viewed as in the figure.
• The standard story: a drunk starts out from a lamp post on a street. Obviously, he wants to move down the sidewalk to get somewhere!!
• Each step he takes is of equal length ℓ. He is SO DRUNK that the direction of each step (right or left) is completely independent of the preceding step.
• The (assumed known) probability of stepping to the right is p
& of stepping to the left is q = 1 – p. In general, q ≠ p.
• The x axis is along the sidewalk, the lamp post is at x = 0.
Each step is of length ℓ, so his location on the x axis must
be x = mℓ where m = a positive or negative integer.
• Question: After N steps, what is the
probability that the man is at a specific
location x = mℓ (m specified)?
• To answer, we first consider an
ensemble of a large number N of drunk
men starting from similar lamp posts!!
• Or repeat this with the same drunk man walking on the sidewalk N times!! (A simulation sketch follows.)
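• A minimal Monte Carlo sketch of this ensemble (the parameter values below are arbitrary illustrative choices): simulate many N-step walks and estimate P_N(m) as the fraction of walkers who end at x = mℓ.

```python
import random
from collections import Counter

# Ensemble of many "drunks", each taking N steps of length l = 1.
# Each step is +1 with probability p, -1 with probability q = 1 - p.
p, N, n_walkers = 0.5, 20, 100_000

positions = Counter()
for _ in range(n_walkers):
    m = sum(1 if random.random() < p else -1 for _ in range(N))
    positions[m] += 1

# P_N(m) ~ fraction of ensemble members ending at x = m*l.
for m in sorted(positions):
    print(f"m = {m:+3d}: P ~ {positions[m] / n_walkers:.4f}")
```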
• This is “easily generalized” to 2 dimensions, as shown schematically in the figure.
• The 2-dimensional random walk corresponds to a PHYSICS problem of adding N 2-dimensional vectors of equal length (figure) & random directions & asking:
“What is the probability that the resultant has a certain magnitude & a certain direction?”
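• A short sketch of this 2-dimensional version (assumed illustrative parameters): add N unit vectors with independent random directions and examine the resultant. For such sums the mean squared magnitude is N, so the typical resultant length is ~ √N.

```python
import math
import random

# Add N unit-length vectors with random directions, many times over.
N, trials = 25, 10_000
mean_sq = 0.0
for _ in range(trials):
    x = y = 0.0
    for _ in range(N):
        theta = random.uniform(0.0, 2.0 * math.pi)
        x += math.cos(theta)
        y += math.sin(theta)
    mean_sq += (x * x + y * y) / trials

# For independent random directions, <R^2> = N.
print(f"<R^2> ~ {mean_sq:.1f}  (expected ~ {N})")
```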
Physical Examples to which the
Random Walk Problem applies:
1. Magnetism (Quantum Treatment)
• N atoms, each with magnetic moment μ.
Each has spin ½. By Quantum
Mechanics, each magnetic moment can
point either “up” or “down”. If these are
equally likely, what is the Net magnetic
moment of the N atoms?
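• A minimal sketch of this spin example (μ and N below are arbitrary illustrative values): with “up” & “down” equally likely, the net moment is a random-walk sum, averaging to zero over the ensemble with typical fluctuations of order √N·μ.

```python
import random

# N spin-1/2 moments, each independently "up" (+mu) or "down" (-mu)
# with equal probability -- a random walk with p = q = 1/2.
N, mu = 1000, 1.0
net = mu * sum(random.choice((+1, -1)) for _ in range(N))
print(f"Net moment of one sample: {net:+.1f} mu")
# Ensemble average is 0; typical fluctuations ~ sqrt(N)*mu (~32 mu here).
```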
Physical Examples to which the
Random Walk Problem applies:
2. Diffusion of a Molecule of Gas
(Classical Treatment)
• A molecule travels in 3 dimensions
with a mean distance ℓ between
collisions. How far is it likely to
have traveled after N collisions?
Answer using Classical Mechanics.
The Random Walk Problem
• The Random Walk Problem
illustrates some fundamental results of
Probability Theory.
• The techniques used are
Powerful & General.
• They are used repeatedly throughout
Statistical Mechanics.
• So, it’s very important to spend some time
on this problem & to understand it!
Section 1.2: 1-Dimensional Random Walk
• Forget the drunk, let’s get back to Physics!
Think of a particle moving in 1 dimension in
steps of length ℓ, with
1. Probability p of stepping to the right &
2. Probability q = 1 – p of stepping to the left.
• After N steps, the particle is at position:
x = mℓ (- N ≤ m ≤ N).
Let n1 ≡ # of steps to the right (out of N)
Let n2 ≡ # of steps to the left.
• Clearly, N = n1 + n2.   (1)
• Clearly also, x ≡ mℓ = (n1 − n2)ℓ, or m = n1 − n2.   (2)
• Combining (1) & (2) gives: m = 2n1 − N.   (3)
• If N is odd, so is m; if N is even, so is m.
A Fundamental Assumption is that
Successive Steps are
Statistically Independent
• Let p ≡ the probability of stepping to the
right & q = 1 – p ≡ the probability of
stepping to the left.
• Since each step is statistically independent,
the probability of a given sequence of n1
steps to the right followed by n2 steps to the
left is given by multiplying the respective
probabilities for each step:
(p · p · ··· · p) × (q · q · ··· · q) = p^(n1) q^(n2)
   (n1 factors)        (n2 factors)
• But, also, clearly, there are MANY different possible ways of taking N steps so that n1 are to the right & n2 are to the left!
• The # of distinct possibilities is the SAME
as counting the # of distinct ways we
can place N objects, n1 of one type & n2
of another in N = n1 + n2 places:
1st place: Can be occupied any one of N ways
2nd place: Can be occupied any one of N - 1 ways
3rd place: Can be occupied any one of N - 2 ways
…….
(N – 1)th place: Can be occupied only 2 ways
Nth place: Can be occupied only 1 way
⇒ All available places can be occupied in
N(N − 1)(N − 2) ··· (3)(2)(1) ≡ N! ways.
Here, N! ≡ “N-Factorial”
Note However!
• This analysis doesn’t take into account the fact that there are only 2 distinguishable kinds of objects: n1 of the 1st type & n2 of the 2nd type. All n1! possible permutations of the 1st type of object among themselves lead to exactly the same arrangement of the objects. Similarly, all n2! possible permutations of the 2nd type of object also lead to the same arrangement.
⇒ So, we need to divide the result by n1!n2!
⇒ So, the # of distinct ways in which N objects can be arranged with n1 of the 1st type & n2 of the 2nd type is N!/(n1!n2!).
• This is the same as the # of distinct
ways of taking N steps, with n1 to the
right & n2 to the left.
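• A brute-force check of this counting result (N and n1 below are illustrative choices): enumerate every possible R/L sequence of N steps and compare the number with n1 rights against N!/(n1!(N − n1)!).

```python
import math
from itertools import product

# Enumerate all 2^N step sequences; count those with n1 steps right.
N, n1 = 6, 2
count = sum(1 for seq in product("RL", repeat=N) if seq.count("R") == n1)

# Compare with N! / (n1! * (N - n1)!).
formula = math.factorial(N) // (math.factorial(n1) * math.factorial(N - n1))
print(count, formula)  # both print 15
```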
Summary:
• The probability W_N(n1) of taking N steps, n1 to the right & n2 (= N − n1) to the left, is
W_N(n1) = [N!/(n1!n2!)] p^(n1) q^(n2), or
W_N(n1) = [N!/(n1!(N − n1)!)] p^(n1) (1 − p)^(N − n1)
• Often, this is written using the binomial coefficient (remember that q = 1 − p):
W_N(n1) = C(N, n1) p^(n1) q^(n2), where C(N, n1) ≡ N!/[n1!(N − n1)!]
• This probability distribution is called the Binomial Distribution.
• This is because the Binomial Expansion has the form
(p + q)^N = Σ (n1 = 0 to N) [N!/(n1!(N − n1)!)] p^(n1) q^(N − n1)
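• A minimal sketch computing W_N(n1) directly (the values of N and p are arbitrary): by the binomial expansion with q = 1 − p, the probabilities must sum to (p + q)^N = 1.

```python
from math import comb

def W(N, n1, p):
    """Binomial probability of n1 right-steps out of N."""
    q = 1.0 - p
    return comb(N, n1) * p**n1 * q**(N - n1)

# Normalization check: sum over all n1 gives (p + q)^N = 1.
N, p = 10, 0.3
print(sum(W(N, n1, p) for n1 in range(N + 1)))  # -> 1.0 (up to rounding)
```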
• We really want the probability P_N(m) that x = mℓ after N steps. This is really the same as W_N(n1) if we change notation:
• P_N(m) = W_N(n1). But m = 2n1 − N, so
n1 = (½)(N + m) & n2 = N − n1 = (½)(N − m).
• So the probability P_N(m) that x = mℓ after N steps is:
P_N(m) = {N!/([(½)(N + m)]! [(½)(N − m)]!)} p^((½)(N+m)) (1 − p)^((½)(N−m))
• For the common case of p = q = ½, this is:
P_N(m) = {N!/([(½)(N + m)]! [(½)(N − m)]!)} (½)^N
• This is the usual form of the
Binomial Distribution
which is probably the most
elementary (discrete) probability
distribution.
• As a trivial example, suppose that
p = q = ½, N = 3 steps:
• This gives:
P_3(m) = {3!/([(½)(3 + m)]! [(½)(3 − m)]!)} (½)^3
• So
P_3(3) = P_3(−3) = [3!/(3!0!)](⅛) = ⅛
P_3(1) = P_3(−1) = [3!/(2!1!)](⅛) = ⅜
Table of Possible Step Sequences:

n1   n2   m = n1 − n2
 3    0        3
 2    1        1
 1    2       −1
 0    3       −3
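• A quick numerical check of this N = 3 case (a sketch using Python’s math.comb):

```python
from math import comb

def P(N, m, p=0.5):
    """Probability of ending at x = m*l after N steps (m, N of equal parity)."""
    n1 = (N + m) // 2  # number of right-steps, from m = 2*n1 - N
    return comb(N, n1) * p**n1 * (1 - p)**(N - n1)

print([P(3, m) for m in (-3, -1, 1, 3)])  # [0.125, 0.375, 0.375, 0.125]
```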
• As another example, suppose that p = q = ½, N = 20.
• This gives:
P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 − m)]!)} (½)^20
• Calculation of this gives the histogram results in the figure below.
P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10⁻⁷
P_20(0) = [20!/(10!)²](½)^20 ≈ 1.8 × 10⁻¹
Note: The “envelope” of the histogram is a “bell-shaped” curve. The significance of this is that, after N random steps, the probability of a particle being a distance of N steps away from the start is very small, while the probability of it being at or near the origin is relatively large (compare P_20(20) ≈ 9.5 × 10⁻⁷ with P_20(0) ≈ 1.8 × 10⁻¹ above).
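• These two values can be checked directly (a minimal sketch, same p = q = ½, N = 20):

```python
from math import comb

# Exact binomial values for p = q = 1/2, N = 20 (cf. the histogram above).
P20 = lambda m: comb(20, (20 + m) // 2) * 0.5**20
print(f"P20(20) = {P20(20):.2e}")  # ~ 9.5e-07: all 20 steps one way
print(f"P20(0)  = {P20(0):.2e}")   # ~ 1.8e-01: back at the origin
```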
“It is remarkable that a science
which began with the consideration
of games of chance should have
become the most important object
of human knowledge.”
Pierre Simon Laplace
“Théorie Analytique des
Probabilités”, 1812