Transcript: Lesson 8 – Random-Variate Generation

ETM 607 – Random-Variate
Generation
• Inverse-transform technique
- Exponential Distribution
- Empirical Continuous Distributions
- Normal Distribution?
• Discrete Distributions
- Empirical
- Discrete Uniform
- Geometric
• Acceptance-Rejection Technique
- Poisson Process
- Non-stationary Poisson Process
ETM 607 – Random Number and
Random Variates
Random-Variate Generation – Chapter 8:
Random-variate generation is the process of converting a random number Ri into a random variate Xi that follows some desired distribution.
Inverse-transform method:
Step 1 – Compute the cdf of the desired random variable X.
Step 2 – Set F(X) = R, where R is a random number ~ U[0,1).
Step 3 – Solve F(X) = R for X in terms of R: X = F⁻¹(R).
Step 4 – Generate random numbers Ri and compute the desired random variates:
Xi = F⁻¹(Ri)
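As a minimal illustration of Step 4, here is a generic Python sketch that applies any supplied inverse cdf to uniform random numbers; the function names and the toy inverse cdf are illustrative assumptions, not part of the slides.

```python
import random

def inverse_transform_variates(inv_cdf, count, rng=random.random):
    """Step 4: generate `count` variates X_i = F^{-1}(R_i) from any supplied inverse cdf."""
    return [inv_cdf(rng()) for _ in range(count)]

# Toy example (hypothetical distribution): F(x) = x^2 on [0, 1], so F^{-1}(R) = sqrt(R)
print(inverse_transform_variates(lambda r: r ** 0.5, 3))
```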
ETM 607 – Random-Variate
Generation
Recall the exponential distribution: λ is often thought of as the arrival rate, and 1/λ as the mean time between arrivals.

$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$$

$$F(x) = \begin{cases} 0, & x < 0 \\ 1 - e^{-\lambda x}, & x \ge 0 \end{cases}$$

$$E[X] = \frac{1}{\lambda}, \qquad V[X] = \frac{1}{\lambda^2}$$
Insert figure 5.9
ETM 607 – Random Number and
Random Variates
Inverse transform method – Exponential Distribution:
Step 1 – Compute the cdf of the desired random variable X:

$$F(x) = \begin{cases} 0, & x < 0 \\ 1 - e^{-\lambda x}, & x \ge 0 \end{cases}$$

Step 2 – Set F(X) = R, where R is a random number ~ U[0,1):

$$F(X) = R = 1 - e^{-\lambda X}$$

Step 3 – Solve F(X) = R for X in terms of R, X = F⁻¹(R):

$$1 - R = e^{-\lambda X}, \qquad X = -\frac{1}{\lambda}\ln(1 - R), \quad \text{or} \quad X = -\frac{1}{\lambda}\ln(R)$$

(Since R and 1 − R are both uniform on [0,1), either form may be used.)

Step 4 – Generate random numbers Ri and compute the desired random variates:

$$X_i = -\frac{1}{\lambda}\ln(R_i)$$
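Below is a minimal Python sketch of Step 4 for the exponential case; the function name and the use of Python's built-in random module are illustrative choices, not part of the slides.

```python
import math
import random

def exponential_variate(lam, rng=random.random):
    """One exponential variate via the inverse transform X = -(1/lambda) ln(1 - R)."""
    r = rng()                          # R ~ U[0, 1)
    return -math.log(1.0 - r) / lam    # 1 - R avoids log(0); -ln(R)/lambda is equivalent in distribution

# Example: five inter-arrival times with lambda = 2 arrivals per unit time
print([exponential_variate(2.0) for _ in range(5)])
```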
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution:
2 distinct methods:
• Few observations
• Many observations
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution – Few Observations:
Given a random variable X which has an empirical continuous distribution,
and the following observed values:
0.7 1.6 2.8 1.9
After sorting the values from smallest to largest, a cdf can be constructed, assuming X can take on values between 0 and the largest observed value.
[Figure: CDF for Continuous Empirical Distribution (X)]
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution – Few Observations:
Considering the inverse-transform method, one needs to find the slope of
each line, ai:
x x
x x
ai 
i
( i 1)
i / n  (1  i) / n

( i 1)
i
1/ n
And knowing that each line
segment begins at x(i-1) , then:
CDF for Continuous Empirical
Distribution (X)
i  1 X i  x(i 1)
F ( X )  Ri 

n
ai
(i  1) 

X i  x(i 1)  ai  R 

n 

0.8
CDF
and
1
0.6
0.4
0.2
0
0
0.5
1
1.5
X
2
2.5
3
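A short Python sketch of this generator, assuming (as the slide does) that X starts at 0 and using the slide's four observations; the function name and the default lower bound are illustrative assumptions.

```python
import math

def empirical_few_obs(values, r, lower=0.0):
    """Inverse-transform variate for a continuous empirical cdf built from a few
    observations, assuming X ranges from `lower` up to the largest observation."""
    x = [lower] + sorted(values)        # x[0] = lower bound, x[1..n] = sorted observations
    n = len(values)
    i = max(1, math.ceil(r * n))        # interval index i with (i-1)/n < r <= i/n
    a_i = (x[i] - x[i - 1]) * n         # slope a_i = (x_(i) - x_(i-1)) / (1/n)
    return x[i - 1] + a_i * (r - (i - 1) / n)

# Slide data: observations 0.7, 1.6, 2.8, 1.9, with X assumed to start at 0
print(empirical_few_obs([0.7, 1.6, 2.8, 1.9], 0.71))   # falls in the third segment
```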
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution – Few Observations:
In-class Exercise:
Create a random-variate generator for the variable X which
has an empirical continuous distribution, but assume there are
technological reasons the value can never be below 2.5, or
above 5.0.
Also, the following values were observed:
2.7 3.6 4.8 3.9 4.2
What are the values of X, given the random numbers 0.05, 0.27, 0.75, and 0.98?
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution – Many Observations:
Major difference from “few observations” approach – don’t use 1/n to
construct cdf, use n/N, where N is the total number of observations, and n is
the observed number of observations for a given interval of the random
variable X.
For the slope ai:

$$a_i = \frac{\text{length of the interval}}{\text{relative frequency of observations within the interval}} = \frac{x_{(i)} - x_{(i-1)}}{c_i - c_{i-1}}$$

and

$$X_i = x_{(i-1)} + a_i\,(R - c_{i-1})$$

where ci is the value of the cumulative distribution at x(i).
See example next slide.
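Since Example 8.3's data are not reproduced in this transcript, the Python sketch below uses hypothetical breakpoints and cumulative frequencies purely to illustrate the grouped-data formula; the function name is also illustrative.

```python
def empirical_many_obs(breakpoints, cum_freq, r):
    """Inverse-transform variate for grouped ("many observations") empirical data.

    breakpoints[i] is the right endpoint of interval i (breakpoints[0] is the lower
    bound of the data); cum_freq[i] is the cumulative relative frequency c_i at
    breakpoints[i], with cum_freq[0] = 0 and cum_freq[-1] = 1.
    """
    for i in range(1, len(breakpoints)):
        if r <= cum_freq[i]:
            # a_i = interval length / relative frequency of observations in the interval
            a_i = (breakpoints[i] - breakpoints[i - 1]) / (cum_freq[i] - cum_freq[i - 1])
            return breakpoints[i - 1] + a_i * (r - cum_freq[i - 1])
    return breakpoints[-1]

# Hypothetical grouped data: intervals [0,1), [1,2), [2,3) holding 30%, 50%, 20% of N observations
print(empirical_many_obs([0, 1, 2, 3], [0.0, 0.3, 0.8, 1.0], 0.66))
```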
ETM 607 – Random Number and
Random Variates
Empirical Continuous Distribution – Many Observations:
Insert example 8.3
ETM 607 – Random Number and
Random Variates
Distributions with no closed-form cdf:
What if a distribution has no closed-form cdf (e.g., normal, gamma, beta)?
An inverse transform approximation for the Normal Dist.:
$$X = F^{-1}(R) \approx \frac{R^{0.135} - (1-R)^{0.135}}{0.1975}$$
Insert Table 8.4
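A small Python sketch of this approximation; the scaling to a general N(mu, sigma²) in the second function is a standard extension added here, not something stated on the slide.

```python
def approx_std_normal(r):
    """The slide's closed-form approximation to the standard normal inverse cdf."""
    return (r ** 0.135 - (1.0 - r) ** 0.135) / 0.1975

def approx_normal(r, mu=0.0, sigma=1.0):
    """Scale the standard-normal approximation to N(mu, sigma^2)."""
    return mu + sigma * approx_std_normal(r)

print(approx_std_normal(0.5))                 # 0.0, the median
print(approx_std_normal(0.975))               # roughly 1.96
print(approx_normal(0.975, mu=10, sigma=2))   # roughly 10 + 2 * 1.96
```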
ETM 607 – Random Number and
Random Variates
Discrete Distributions – Empirical:
Recall that a discrete distribution means the random variable X takes on countable values (e.g., integers).
For example, observations for the number of shipments per day were:

x    p(x)    F(x)
0    0.5     0.5
1    0.3     0.8
2    0.2     1.0

Then,

$$F(x) = \begin{cases} 0, & x < 0 \\ 0.5, & 0 \le x < 1 \\ 0.8, & 1 \le x < 2 \\ 1.0, & 2 \le x \end{cases}$$
insert fig 8.6
ETM 607 – Random Number and
Random Variates
Discrete Distributions – Empirical:
Since,

$$F(x) = \begin{cases} 0, & x < 0 \\ 0.5, & 0 \le x < 1 \\ 0.8, & 1 \le x < 2 \\ 1.0, & 2 \le x \end{cases}$$

then,

$$X = \begin{cases} 0, & R \le 0.5 \\ 1, & 0.5 < R \le 0.8 \\ 2, & 0.8 < R \le 1.0 \end{cases}$$
insert fig 8.6
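A minimal Python table-lookup sketch of this generator for the shipments-per-day example; the function name is an illustrative choice.

```python
def discrete_empirical(r):
    """Table-lookup (inverse-transform) generator for the shipments-per-day example:
    P(X=0) = 0.5, P(X=1) = 0.3, P(X=2) = 0.2."""
    cdf = [(0.5, 0), (0.8, 1), (1.0, 2)]      # (F(x), x) pairs from the slide
    for threshold, x in cdf:
        if r <= threshold:
            return x

print([discrete_empirical(r) for r in (0.05, 0.63, 0.94)])   # -> [0, 1, 2]
```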
ETM 607 – Random Number and
Random Variates
Discrete Uniform Distribution:
In-class Exercise:
Consider a random variable X which follows a uniform discrete
distribution between and including the numbers [5,10]. Develop
an inverse transform for X with respect to R.
ETM 607 – Random Number and
Random Variates
Geometric Distribution:
Probability mass function (pmf):
$$p(x) = (1-p)^{x-1}\,p, \quad x = 1, 2, 3, \ldots$$

or, counting failures before the first success (as the cdf below does),

$$p(x) = (1-p)^{x}\,p, \quad x = 0, 1, 2, \ldots$$
Cumulative distribution function (cdf):
$$F(x) = \sum_{j=0}^{x} p\,(1-p)^{j} = 1 - (1-p)^{x+1}$$
ETM 607 – Random Number and
Random Variates
Geometric Distribution:
Knowing the cdf:
$$F(x) = 1 - (1-p)^{x+1}$$
$$F(x-1) = 1 - (1-p)^{x}$$

Using the inverse-transform method:

$$F(x-1) < R \le F(x)$$
$$1 - (1-p)^{x} < R \le 1 - (1-p)^{x+1}$$
$$(1-p)^{x+1} \le 1 - R < (1-p)^{x}$$

Taking logarithms and dividing by $\ln(1-p)$ (which is negative, so the inequalities reverse):

$$\frac{\ln(1-R)}{\ln(1-p)} - 1 \le x < \frac{\ln(1-R)}{\ln(1-p)}$$

$$X = \left\lceil \frac{\ln(1-R)}{\ln(1-p)} - 1 \right\rceil$$
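A brief Python sketch of the resulting generator; the max(0, ...) guard for the rare R = 0 case and the function name are small additions of mine, not part of the derivation.

```python
import math
import random

def geometric_variate(p, rng=random.random):
    """Geometric variate (failures before the first success, x = 0, 1, 2, ...)
    via the closed form X = ceil(ln(1 - R) / ln(1 - p) - 1)."""
    r = rng()
    x = math.ceil(math.log(1.0 - r) / math.log(1.0 - p) - 1.0)
    return max(0, x)   # guards the rare R = 0 case, which would otherwise give -1

print([geometric_variate(0.3) for _ in range(10)])
```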
ETM 607 – Random Number and
Random Variates
Random Variate generation for a discrete random variable X
with a Poisson Distribution:
Recall the pmf:
$$p(x) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots$$
$$p(x) = 0, \quad \text{otherwise}$$

λ is referred to as the rate parameter (the mean number of occurrences per unit time).

A Poisson process is the number of arrivals from an exponential inter-arrival stream with mean time 1/λ. Then, over one unit of time, x arrivals occur if and only if:

$$A_1 + A_2 + \cdots + A_x \le 1 < A_1 + A_2 + \cdots + A_x + A_{x+1}$$

where Ai is the inter-arrival time of the ith arrival.
ETM 607 – Random Number and
Random Variates
Random Variate for a Poisson Distribution:
Recall inverse transform method for exponential distribution:
$$X_i = -\frac{1}{\lambda}\ln(R_i)$$

Then,

$$A_1 + A_2 + \cdots + A_x \le 1 < A_1 + A_2 + \cdots + A_x + A_{x+1}$$
Becomes:
$$-\frac{1}{\lambda}\sum_{i=1}^{x}\ln(R_i) \;\le\; 1 \;<\; -\frac{1}{\lambda}\sum_{i=1}^{x+1}\ln(R_i)$$

$$\sum_{i=1}^{x}\ln(R_i) \;\ge\; -\lambda \;>\; \sum_{i=1}^{x+1}\ln(R_i)$$

$$\ln\prod_{i=1}^{x} R_i \;\ge\; -\lambda \;>\; \ln\prod_{i=1}^{x+1} R_i$$

$$\prod_{i=1}^{x} R_i \;\ge\; e^{-\lambda} \;>\; \prod_{i=1}^{x+1} R_i$$

This leads to an acceptance-rejection algorithm.
ETM 607 – Random Number and
Random Variates
Using:

$$\prod_{i=1}^{x} R_i \;\ge\; e^{-\lambda} \;>\; \prod_{i=1}^{x+1} R_i$$
Step 1: Set n = 0, P = 1.
Step 2: Generate a random number Rn+1, and replace P by P*Rn+1.
Step 3: If P < e^{-λ}, then accept N = n. Otherwise, reject the current n, increase n by one, and return to Step 2.
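A compact Python sketch of Steps 1–3; the function name and the use of Python's random module are illustrative choices, not part of the slides.

```python
import math
import random

def poisson_variate(lam, rng=random.random):
    """Acceptance-rejection generator for X ~ Poisson(lam): multiply uniforms into P
    until P falls below e^(-lam); the number of accepted multiplications is N = n."""
    threshold = math.exp(-lam)
    n, p = 0, 1.0                  # Step 1: n = 0, P = 1
    while True:
        p *= rng()                 # Step 2: P <- P * R_{n+1}
        if p < threshold:          # Step 3: accept N = n ...
            return n
        n += 1                     # ... otherwise reject, increase n, return to Step 2

print([poisson_variate(6.0) for _ in range(5)])
```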
In-class Exercise: Given X ~ Poisson(λ = 6 arrivals per unit time), use Table A.1 (94737, 08225, 35614, 24826, 88319, 05595, 58701, 57365, 74759, etc.) to generate the first value of X for one unit of time.
ETM 607 – Random Number and
Random Variates
Nonstationary Poisson Process (NSPP):
A Poisson arrival process whose arrival rate (λi) changes over time.
Think "fast food": arrival rates at the lunch and dinner hours are much greater than arrival rates during "off hours".
Thinning Process:
Generates Poisson arrivals at the fastest rate, but “accept” only a
portion of the arrivals, in effect thinning out just enough to get
the desired time-varying rate.
ETM 607 – Random Number and
Random Variates
Nonstationary Poisson Process (NSPP):
Suppose arrival rates of 10, 5, and 15 per hour for the first three hours.
Thinning (in effect) generates arrivals at a rate of 15 for all three hours, but accepts approximately 10/15ths of the arrivals in the first hour and 5/15ths of the arrivals in the second hour.
[Figure: original arrival stream (hours 0–3) vs. thinned arrival stream]
ETM 607 – Random Number and
Random Variates
Nonstationary Poisson Process (NSPP) – Thinning Algorithm:
To generate successive arrival times Ti when rates vary:
Step 1 – Let λ* = max{λ(t)} be the maximum arrival rate, and set t = 0 and i = 1.
Step 2 – Generate E from the exponential distribution with rate λ*, i.e. E = -(1/λ*) ln(R), and let t = t + E (the arrival time of the next arrival using the max rate).
Step 3 – Generate a random number R from U[0,1). If R < λ(t) / λ*, then Ti = t and i = i + 1.
Step 4 – Go to Step 2.
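A rough Python sketch of the thinning algorithm; the finite `horizon` argument (so the sketch terminates) and the example rate function are assumptions added here, not part of the algorithm as stated.

```python
import math
import random

def nspp_thinning(rate_fn, lam_star, horizon, rng=random.random):
    """Thinning algorithm for a nonstationary Poisson process.

    rate_fn(t) returns the arrival rate lambda(t); lam_star must be >= its maximum.
    Returns the accepted arrival times T_i up to the (assumed) time horizon.
    """
    t = 0.0
    arrivals = []
    while True:
        # Step 2: candidate arrival from the max-rate exponential stream
        t += -math.log(1.0 - rng()) / lam_star
        if t > horizon:            # added stopping rule so the sketch terminates
            return arrivals
        # Step 3: accept the candidate with probability lambda(t) / lambda*
        if rng() < rate_fn(t) / lam_star:
            arrivals.append(t)

# Slide example: rates of 10, 5, and 15 arrivals per hour over the first three hours
hourly_rates = lambda t: [10, 5, 15][min(int(t), 2)]
print(nspp_thinning(hourly_rates, lam_star=15, horizon=3.0))
```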