Lecture 2 Monte Carlo


Lecture 2
• Molecular dynamics simulates a system by numerically following the path of all particles in phase space as a function of time
• the time T must be long enough to allow the system to explore all accessible regions of phase space
• the time average of a quantity A is calculated from

$$\langle A \rangle = \lim_{T \to \infty} \frac{1}{T} \int_{t_0}^{t_0 + T} A(t)\, dt = \int \cdots \int A(r_1, r_2, \ldots, p_1, p_2, \ldots)\, d^{3N}r\, d^{3N}p$$
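The time-average idea can be made concrete in a few lines. The following is a minimal sketch, not from the lecture: it integrates a 1-D harmonic oscillator with velocity Verlet and accumulates the time average of A(t) = x(t)^2; the model and all parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the lecture's code): time-average an observable A(t)
# along an MD trajectory. System: 1-D harmonic oscillator (m = k = 1),
# integrated with velocity Verlet; parameters are illustrative assumptions.
dt, n_steps = 0.01, 200_000
x, v = 1.0, 0.0                      # initial condition (energy E = 0.5)
a_sum = 0.0                          # accumulator for A(t) = x(t)^2

force = lambda x: -x                 # F = -kx with k = 1

f = force(x)
for _ in range(n_steps):
    v += 0.5 * dt * f                # velocity Verlet: half-kick
    x += dt * v                      # drift
    f = force(x)
    v += 0.5 * dt * f                # second half-kick
    a_sum += x * x                   # accumulate A(t)

# <A> = (1/T) * integral of A(t) dt, approximated by the sum over steps
print("time average of x^2:", a_sum / n_steps)   # -> 0.5 (= E, virial theorem)
```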
Monte Carlo
• no dynamics but random motion in configuration space due to random but uncorrelated forces
• generate configurations or states with a weight proportional to the canonical or grand canonical probability density (a sketch follows below)
• actual steps of the calculation depend on the model
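As a concrete illustration of these steps, here is a minimal Metropolis Monte Carlo sketch that generates configurations with weight proportional to the canonical density exp(-E/kT). The model (a single particle in a 1-D harmonic well, E = x^2/2) and all parameter values are my own choices, not the lecture's example.

```python
import math, random

# Hedged sketch: Metropolis Monte Carlo sampling the canonical density
# exp(-E/kT) for a single particle in a 1-D harmonic well (an assumption,
# not the lecture's specific model).
random.seed(0)
kT, step, n_steps = 1.0, 1.0, 100_000
energy = lambda x: 0.5 * x * x

x = 0.0
x2_sum = 0.0
for _ in range(n_steps):
    x_new = x + step * (2.0 * random.random() - 1.0)   # random trial move
    dE = energy(x_new) - energy(x)
    if dE <= 0.0 or random.random() < math.exp(-dE / kT):
        x = x_new                                       # accept with min(1, e^{-dE/kT})
    x2_sum += x * x                                     # accumulate even on rejection

# Equipartition check: <x^2> = kT for E = x^2/2 (here kT = 1)
print("<x^2> =", x2_sum / n_steps)
```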
Review of Probability and Statistics
Introduction
• Probability and statistics are the foundations of both statistical mechanics and the kinetic theory of gases
• what does the notion of probability mean?
• Classical notion: we assign, a priori, equal probabilities to all possible outcomes of an event
• Statistical notion: we measure the relative frequency of an event and call this the probability
Classical Probability
• Count the number W of outcomes and assign them equal probabilities pi = 1/W
• for example: a coin toss
• each “throw” is a trial with W = 2 outcomes
• pH = pT = 1/2
• for N consecutive trials, a particular sequence of heads and tails constitutes an event HTTHHHTT...
• there are 2^N possible outcomes and the probability of each “event” is pi = 1/2^N
Classical Probability
• We cannot predict which sequence (event) will occur in a given trial
• hence we need a statistical description of the system => a description in terms of probabilities
• instead of focusing on a particular system or sequence, we can think of an assembly of systems called an ensemble
• repeat the N coin flips a large number (M) of times
• if event ‘i’ occurs mi times in these M members of the ensemble, then the fraction mi/M is the probability of the event ‘i’

The probability of a head (H) is the number of coins nH showing a head divided by the total number M in the ensemble:

$$p_H = \lim_{M \to \infty} \frac{n_H}{M}$$
Classical Probability
• in statistical mechanics we use this idea by assuming that all accessible quantum states of a system are equally likely
• the basic idea is that if we wait long enough, the system will eventually flow through all of the microscopic states consistent with any constraints imposed on the system
• measurements must be treated statistically
• the microcanonical ensemble corresponds to an isolated system with fixed total energy E
• however, this is not the most convenient approach
Statistical Probability
• Experimental method of assigning probabilities to events by measuring the relative frequency of occurrence
• if event ‘i’ occurs mi times in M trials, then

$$p_i = \lim_{M \to \infty} \left( \frac{m_i}{M} \right)$$
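A quick numerical illustration of this frequency definition, assuming a fair coin simulated in Python (the ensemble sizes M below are arbitrary):

```python
import random

# Sketch of the statistical (frequency) definition: estimate p_H as
# m_H / M for increasing numbers of trials M.
random.seed(1)
for M in (100, 10_000, 1_000_000):
    m_heads = sum(random.random() < 0.5 for _ in range(M))
    print(M, m_heads / M)   # -> approaches p_H = 1/2 as M grows
```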
Independent Events
• If events are independent, then the probability that both occur is pi,j = pi pj
• e.g. a coin toss with 2 trials => 4 outcomes
• pH,H = pT,T = pH,T = pT,H = (1/2)(1/2) = 1/4
• but the probability of getting one head and one tail in 2 trials = 1/4 + 1/4 = 1/2 (order unimportant!)
• the probability of 2 heads and 2 tails (independent of order) in 4 tosses is

$$\text{prob} = \frac{4!}{2!\,2!}\, p_H^2\, p_T^2$$
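This formula can be checked directly; the sketch below evaluates 4!/(2! 2!) pH^2 pT^2 for a fair coin using Python's math.comb:

```python
from math import comb

# Worked check of the slide's formula: probability of 2 heads and 2 tails
# in 4 tosses, order unimportant, for a fair coin (p_H = p_T = 1/2).
pH = pT = 0.5
prob = comb(4, 2) * pH**2 * pT**2     # 4!/(2! 2!) * pH^2 * pT^2
print(prob)                            # -> 6/16 = 0.375
```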
Random Walks
• Consider a walker confined to one dimension, starting at the point x = 0
• the probability of making a step to the right is p and to the left is q = 1 - p (p + q = 1)
• each step is independent of the preceding step
• let the displacement at step i be denoted si, where si = ±a; each step is of the same magnitude

$$\langle s_i \rangle = p(a) + q(-a) = (p - q)a$$

• where is the walker after N steps? (see the simulation sketch below)
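A minimal simulation sketch of this walk (step size a = 1; the parameter values are arbitrary choices):

```python
import random

# Minimal sketch of the 1-D random walk: N steps of size a = 1, taken to
# the right with probability p and to the left with q = 1 - p.
def walk(N, p, rng):
    x = 0
    for _ in range(N):
        x += 1 if rng.random() < p else -1
    return x

rng = random.Random(2)
print(walk(1000, 0.5, rng))   # one realization of x(N)
```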
Random Walk
Net displacement:

$$x(N) = \sum_{i=1}^{N} s_i$$

$$x(N)^2 = \left( \sum_{i=1}^{N} s_i \right)^2 = \sum_{i=1}^{N} s_i^2 + \sum_{i \neq j} s_i s_j$$
Averages
The average of a sum of independent random variables is equal to the sum of the averages:

$$\langle x(N) \rangle = \left\langle \sum_{i=1}^{N} s_i \right\rangle = \sum_{i=1}^{N} \langle s_i \rangle = N\bigl(pa + q(-a)\bigr) = Na(p - q)$$
Averages
The average of the product of two statistically independent random variables is equal to the product of the averages:

$$\langle x(N)^2 \rangle = \sum_{i=1}^{N} \langle s_i^2 \rangle + \sum_{i \neq j} \langle s_i s_j \rangle = \sum_{i=1}^{N} \langle s_i^2 \rangle + \sum_{i \neq j} \langle s_i \rangle \langle s_j \rangle$$

$$= Na^2 + N(N-1)\,a^2 (p - q)^2$$

$$\langle x(N)^2 \rangle - \langle x(N) \rangle^2 = 4pq\,a^2 N$$
Dispersion or Variance
$$\overline{\Delta x(N)^2} \equiv \left\langle \bigl( x(N) - \langle x(N) \rangle \bigr)^2 \right\rangle = \langle x^2 \rangle - \langle x \rangle^2 = 4pq\,a^2 N$$

Note:

$$\frac{\left[ \langle x(N)^2 \rangle - \langle x(N) \rangle^2 \right]^{1/2}}{\langle x(N) \rangle} = \frac{2a\sqrt{pqN}}{(p - q)\,aN} \propto \frac{1}{\sqrt{N}}$$

Relative to its mean value, the walker does not get very far if N >> 1! (The sketch below checks these moments numerically.)
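The following sketch, with arbitrary N, p, and ensemble size, estimates the mean and variance of x(N) from an ensemble of simulated walks and compares them with Na(p - q) and 4pq a^2 N:

```python
import random

# Check <x(N)> = Na(p - q) and the variance 4pq a^2 N by averaging over
# an ensemble of walks (a = 1; N, p, and M are arbitrary test values).
rng = random.Random(3)
N, p, M = 1000, 0.6, 20_000
q = 1.0 - p

xs = []
for _ in range(M):
    x = sum(1 if rng.random() < p else -1 for _ in range(N))
    xs.append(x)

mean = sum(xs) / M
var = sum((x - mean) ** 2 for x in xs) / M
print(mean, N * (p - q))        # -> ~200 vs 200
print(var, 4 * p * q * N)       # -> ~960 vs 960
```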
Probability Distribution
• What is the probability P(x,N) that the walker ends up at point x in N steps?
• Total number of steps N = nR + nL
• probability of nR steps to the right is p^nR
• probability of nL steps to the left is q^nL
• number of ways of ordering the steps = N!/(nR! nL!)
• but x = (nR - nL)a
• hence nR = (N + x/a)/2 and nL = (N - x/a)/2
Set a = 1:

$$P(x, N) = \frac{N!}{\left( \frac{N + x}{2} \right)! \left( \frac{N - x}{2} \right)!}\; p^{(N + x)/2}\, q^{(N - x)/2}$$
[Plot: P(x, N) for N = 20 and N = 40, over -N < x < N]
Define r = x/N, where -1 < r < 1:

$$P(r, N) = \frac{N!}{\left( \frac{N(1 + r)}{2} \right)! \left( \frac{N(1 - r)}{2} \right)!}\; p^{N(1 + r)/2}\, q^{N(1 - r)/2}$$

[Plot: P(r, N)/P(0, N) versus r]
Moments are computed with respect to this distribution:

$$\langle x^n \rangle = \sum_{x = -N}^{N} x^n\, P(x, N)$$

with P(x, N) the binomial distribution above.
Show that:

$$\sum_{x = -N}^{N} P(x, N) = 1, \qquad \langle x \rangle = N(p - q), \qquad \langle x^2 \rangle - \langle x \rangle^2 = 4pqN$$
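These three identities can be verified exactly (up to floating-point error) by summing the binomial form of P(x, N); the test values of N and p below are arbitrary:

```python
from math import comb

# Exact check of the three identities using the binomial form of P(x, N).
N, p = 20, 0.3
q = 1.0 - p

def P(x, N):
    nR = (N + x) // 2                       # number of steps to the right
    return comb(N, nR) * p**nR * q**(N - nR)

xs = range(-N, N + 1, 2)                    # x has the same parity as N
print(sum(P(x, N) for x in xs))             # -> 1
print(sum(x * P(x, N) for x in xs))         # -> N(p - q) = -8
mean = sum(x * P(x, N) for x in xs)
print(sum(x*x * P(x, N) for x in xs) - mean**2)   # -> 4pqN = 16.8
```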
For large N, P(x, N) approaches a continuous (Gaussian) distribution:

$$\lim_{N \to \infty} P(x, N) = p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\; e^{-(x - \langle x \rangle)^2 / 2\sigma^2}$$

with

$$\langle x \rangle = (p - q)N, \qquad \sigma^2 = 4pqN$$
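A small numerical comparison of the exact P(x, N) with this Gaussian limit; note that since x changes in steps of 2 (x has the parity of N), each allowed x carries weight of approximately 2 p(x). N and p here are arbitrary test values.

```python
from math import comb, exp, pi, sqrt

# Sketch comparing the exact binomial P(x, N) with its Gaussian limit.
N, p = 100, 0.5
q = 1.0 - p
mean, var = (p - q) * N, 4 * p * q * N

def P(x):
    nR = (N + x) // 2
    return comb(N, nR) * p**nR * q**(N - nR)

gauss = lambda x: exp(-(x - mean)**2 / (2 * var)) / sqrt(2 * pi * var)

for x in (0, 4, 10, 20):
    # factor of 2: allowed x values are spaced 2 apart
    print(x, P(x), 2 * gauss(x))   # close agreement for N >> 1
```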