Probability - Missouri Western State University

Poisson Random Variables
Number of occurrences
• Let Y represent the number of occurrences of an
event in an interval of size s.
• Here we may be referring to an interval of time,
distance, space, etc.
• For example, we may be interested in the number
of customers Y arriving during a given time
interval.
• We call Y a Poisson random variable.
Poisson R. V.
• A random variable Y has a Poisson distribution with
parameter λ if its probability function is given by

p(y) = λ^y e^(−λ) / y!,   where y = 0, 1, 2, …

We’ll see that λ is the “average rate” at which
the events occur. That is, E(Y) = λ.
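As a quick check (not on the original slide), these probabilities really do sum to 1, because the sum is just the power series for e^λ:

```latex
\sum_{y=0}^{\infty} p(y)
  = e^{-\lambda} \sum_{y=0}^{\infty} \frac{\lambda^{y}}{y!}
  = e^{-\lambda}\, e^{\lambda}
  = 1.
```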
Queries
• If the number of database queries processed by a
computer in a time interval is a Poisson random
variable with an average of 6 queries per minute,
find the probability that 4 queries occur in a
one-minute interval.
p(4) = 6^4 e^(−6) / 4! ≈ 0.13385

poissonpdf(λ, y)
poissoncdf(λ, y) is also provided on the TI-83
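The TI-83 commands above are easy to mirror in a few lines of Python. The sketch below is not from the original slides; the names poisson_pdf and poisson_cdf are mine, chosen to echo the calculator commands. It reproduces the query example with λ = 6 and y = 4:

```python
from math import exp, factorial

def poisson_pdf(lam, y):
    """P(Y = y) for a Poisson random variable with rate lam."""
    return lam ** y * exp(-lam) / factorial(y)

def poisson_cdf(lam, y):
    """P(Y <= y): sum the pmf from 0 up to y."""
    return sum(poisson_pdf(lam, k) for k in range(y + 1))

# Probability of exactly 4 queries when the average is 6 per minute.
print(poisson_pdf(6, 4))   # about 0.13385
```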
Fewer Queries
• As before, for the Poisson random variable with
an average of 6 queries per minute…
• find the probability that there are fewer than 6 queries
in a one-minute interval:

P(Y < 6) = P(Y ≤ 5) = poissoncdf(6, 5) ≈ 0.44568
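Using the poisson_cdf helper sketched above (or SciPy's built-in distribution, if it is available) gives the same value:

```python
print(poisson_cdf(6, 5))       # about 0.44568

# Optional cross-check with SciPy:
from scipy.stats import poisson
print(poisson.cdf(5, 6))       # k = 5, mu = 6; about 0.44568
```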
Some Poisson Variables
• Number of incoming telephone calls to a
switchboard within a given time interval;
• Number of errors (incorrect bits) received by a
modem during a given time interval;
• Number of chocolate chips in one of Dr. Vestal’s
chocolate chip cookies;
• Number of claims processed by a particular
insurance company on a single day;
• Number of white blood cells in a drop of blood;
• Number of dead deer along a mile of highway.
Many short intervals
• To derive the Poisson probability distribution,
think of the interval as being composed of many,
say n, very short successive subintervals.
[Diagram: the interval from 0 to s, with x's marking where the occurrences fall.]
• Suppose that in each short subinterval,
either there is an occurrence or there is not.
“like Bernoulli trials”
• So Y = y occurrences is like y successes in n trials.
• Treat it like a binomial experiment, but let the time
intervals get very short (i.e., let n get very large).
As n goes to infinity…
• Let λ = np, the expected number of successes.
Taking the limit
Substituting p = λ/n,

lim(n→∞) C(n, y) p^y (1 − p)^(n−y)

  = lim(n→∞) C(n, y) (λ/n)^y (1 − λ/n)^(n−y)

  = lim(n→∞) [ n(n−1)⋯(n−y+1) / y! ] · (λ^y / n^y) · (1 − λ/n)^n · (1 − λ/n)^(−y)

  = (λ^y / y!) · lim(n→∞) [ (n/n) · ((n−1)/n) · ((n−2)/n) ⋯ ((n−(y−1))/n) ] · (1 − λ/n)^n · (1 − λ/n)^(−y)

Each factor (n − k)/n → 1, (1 − λ/n)^n → e^(−λ), and (1 − λ/n)^(−y) → 1, so the limit is

  λ^y e^(−λ) / y!
As n goes to infinity…
• With λ = np held constant, as n gets large,
we have found

C(n, y) p^y (1 − p)^(n−y) ≈ λ^y e^(−λ) / y!
• Consequently, we may use a Poisson probability to
approximate binomial probabilities when n is large
(and p is small).
(Suggests large n and λ = np < 7.)
Compare
• Consider a binomial experiment with
n = 200 and p = 0.03, so that λ = np = 6.
• Determine the probability of 4 successes. Also,
approximate the probability using the Poisson
distribution.
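One way to carry out this comparison (a sketch, not from the original slides; it assumes SciPy is available) is to evaluate both probabilities directly:

```python
from scipy.stats import binom, poisson

n, p = 200, 0.03
lam = n * p                        # lambda = np = 6

exact = binom.pmf(4, n, p)         # exact binomial probability of 4 successes
approx = poisson.pmf(4, lam)       # Poisson approximation with lambda = 6

print(f"binomial: {exact:.5f}")    # about 0.1338
print(f"poisson:  {approx:.5f}")   # about 0.13385
```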
Poisson mean, variance
• If Y is a Poisson random variable with
parameter λ, the expected value and variance
for Y are given by

E(Y) = λ and V(Y) = λ

(and the proof is too good to pass up)
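A quick simulation, not part of the slides, makes the claim plausible before we prove it; the sketch assumes NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 6
samples = rng.poisson(lam, size=1_000_000)   # a million simulated Poisson(6) counts

# Both the sample mean and the sample variance should be close to lambda = 6.
print(samples.mean())   # roughly 6.0
print(samples.var())    # roughly 6.0
```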
Expected number of arrivals
• The expected value for a Poisson random variable:

E(Y) = Σ(y=0 to ∞) y·p(y)

     = Σ(y=0 to ∞) y · λ^y e^(−λ) / y!

     = Σ(y=1 to ∞) y · λ^y e^(−λ) / y!            (the y = 0 term is zero, so start with y = 1)

     = Σ(y=1 to ∞) λ^y e^(−λ) / (y − 1)!          (cancelling the common factor of y)

     = λ · Σ(y=1 to ∞) λ^(y−1) e^(−λ) / (y − 1)!  (distributing out one λ)
Expected number of arrivals
• The expected value for a Poisson random variable:

E(Y) = λ · Σ(y=1 to ∞) λ^(y−1) e^(−λ) / (y − 1)!

     = λ · Σ(z=0 to ∞) λ^z e^(−λ) / z!            (let z = y − 1; if y = 1, 2, 3, … then y − 1 = 0, 1, 2, …)

     = λ · Σ(z=0 to ∞) p(z)                       (a sum of Poisson probabilities, which equals 1)

     = λ, as claimed.
Deriving Variance
• Deriving the variance for a Poisson random
variable proceeds in a similar manner.
• As we’ve seen before, to get E(Y²), you first
determine that E[Y(Y − 1)] = λ² (how? see the sketch below).
• And so, E(Y²) = E[Y(Y − 1)] + E(Y) = λ² + λ.
• Finally,
V(Y) = E(Y²) − [E(Y)]² = (λ² + λ) − λ² = λ.
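Here is one way to fill in the “how?” step, using the same index-shifting trick as in the E(Y) derivation (this sketch is mine, not from the original slides):

```latex
\begin{aligned}
E[Y(Y-1)]
  &= \sum_{y=0}^{\infty} y(y-1)\,\frac{\lambda^{y} e^{-\lambda}}{y!}
   = \sum_{y=2}^{\infty} \frac{\lambda^{y} e^{-\lambda}}{(y-2)!}
     && \text{(the $y=0,1$ terms vanish; cancel $y(y-1)$)} \\
  &= \lambda^{2} \sum_{y=2}^{\infty} \frac{\lambda^{y-2} e^{-\lambda}}{(y-2)!}
   = \lambda^{2} \sum_{z=0}^{\infty} \frac{\lambda^{z} e^{-\lambda}}{z!}
     && \text{(let $z = y-2$)} \\
  &= \lambda^{2} \cdot 1
   = \lambda^{2}.
\end{aligned}
```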