Transcript: Lecture 9
Physics 114: Lecture 9
Probability Density Functions
Dale E. Gary
NJIT Physics Department
February 12, 2010
Binomial Distribution
If you raise the sum of two variables to a power, you get:
(a + b)^0 = 1
(a + b)^1 = a + b
(a + b)^2 = a^2 + 2ab + b^2
(a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3
(a + b)^4 = a^4 + 4a^3 b + 6a^2 b^2 + 4ab^3 + b^4
Writing only the coefficients, you begin to see a pattern:
1
1   1
1   2   1
1   3   3   1
1   4   6   4   1
Binomial Distribution
Remarkably, this pattern is also the one that governs the possibilities of
tossing n coins:
n    ways for n coins to land       total = 2^n
0    1                               1
1    1  1                            2
2    1  2  1                         4
3    1  3  3  1                      8
4    1  4  6  4  1                  16
With 3 coins, there are 8 ways for them to land, as shown above.
In general, there are 2^n possible ways for n coins to land.
How many permutations are there for a given entry in a row above, e.g. how many permutations give 1 head and 2 tails? Obviously, 3.
How many permutations are there for x heads and n - x tails, for general n and x?
Number of combinations in each row: C(n, x) = \binom{n}{x} = \frac{n!}{x!(n-x)!}
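As a quick cross-check, here is a minimal MatLAB sketch that builds one row of the triangle from C(n, x) (nchoosek is a built-in function):

% Build row n = 4 of the triangle from the combination formula C(n, x).
n = 4;
row = arrayfun(@(x) nchoosek(n, x), 0:n)   % 1  4  6  4  1
sum(row)                                   % 16 = 2^n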
Probability
With a fair coin, a toss has an equal chance, 50% or ½, of ending up heads. Let us call this probability p. Obviously, the
probability of tossing tails, q, is q = (1 - p).
With 3 coins, the probability of getting any single one of the combinations is 1/2^n = 1/8 (since there are 8 combinations, and each is
equally probable). This comes from (½)(½)(½), i.e. the product of the probability p = ½ of getting heads on each coin.
If we want to know the probability of getting, say 1 heads and 2 tails,
we just need to multiply the probability of any combination (1/8th) by the
number of ways of getting 1 heads and 2 tails, i.e. 3, for a total
probability of 3/8.
To be really general, say the coins were not fair, so p ≠ q. Then the
probability of getting heads, tails, tails would be (p)(q)(q) = p^1 q^2.
Finally, the probability P(x; n, p) of getting x heads given n coins, each of which has probability p, is

P(x; n, p) = \binom{n}{x} p^x q^{n-x} = \frac{n!}{x!(n-x)!} p^x (1-p)^{n-x}.
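A minimal MatLAB sketch that checks this formula by brute force, enumerating all 2^3 equally likely ways three fair coins can land:

% Enumerate all 2^3 outcomes (1 = heads), count how often exactly 1 head
% appears, and compare with the formula above.
n = 3; p = 0.5;
outcomes = dec2bin(0:2^n-1) - '0';      % one row per outcome
heads = sum(outcomes, 2);
mean(heads == 1)                        % 0.3750 = 3/8 from counting
nchoosek(n, 1) * p^1 * (1-p)^(n-1)      % 0.3750 = 3/8 from the formula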
Binomial Distribution
This is the binomial distribution, which we write P_B:

P_B(x; n, p) = \frac{n!}{x!(n-x)!} p^x (1-p)^{n-x}.
Let’s see if it works. For 1 heads with a toss of 3 fair coins, x = 1, n = 3, p = ½, and we get

P_B(1; 3, ½) = \frac{3!}{1!(3-1)!} (1/2)^1 (1/2)^2 = 3/8.
For no heads, and all tails, we get

P_B(0; 3, ½) = \frac{3!}{0!(3)!} (1/2)^0 (1/2)^3 = 1/8.     (Note: 0! = 1)
Say the coins are not fair, but p = ¼. Then the probability of 2 heads and 1 tails is:

P_B(2; 3, ¼) = \frac{3!}{2!(3-2)!} (1/4)^2 (3/4)^1 = 3 \cdot 3/64 = 9/64.
You’ll show for homework that the sum of all probabilities for this (and
any) case is 1, i.e. the probabilities are normalized.
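A quick MatLAB check of these numbers (binopdf is in the Statistics Toolbox):

binopdf(1, 3, 0.5)            % 0.3750 = 3/8
binopdf(0, 3, 0.5)            % 0.1250 = 1/8
binopdf(2, 3, 0.25)           % 0.1406 = 9/64
sum(binopdf(0:3, 3, 0.25))    % 1, i.e. the probabilities are normalized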
Binomial Distribution
To see the connection of this to the sum of two variables raised to a
power, replace a and b with p and q:

(p + q)^0 = 1
(p + q)^1 = p + q
(p + q)^2 = p^2 + 2pq + q^2
(p + q)^3 = p^3 + 3p^2 q + 3pq^2 + q^3
Since p + q = 1, each of these powers also equals one on the left side,
while the right side expresses how the probabilities are split among the
different combinations. When p = q = ½, for example, the binomial triangle

1
1   1
1   2   1
1   3   3   1
1   4   6   4   1

becomes

1
1/2   1/2
1/4   2/4   1/4
1/8   3/8   3/8   1/8
1/16  4/16  6/16  4/16  1/16
In MatLAB, use binopdf(x,n,p) to calculate one row of this triangle,
e.g. binopdf(0:3,3,0.5) prints 0.125, 0.375, 0.375, 0.125.
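The whole probability triangle above can be generated the same way, as in this minimal sketch:

for n = 0:4
    disp(binopdf(0:n, n, 0.5))   % prints one row of the probability triangle
end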
Binomial Distribution
Let’s say we toss 10 coins and ask how many heads we will see. The 10th row of the triangle would be plotted as in the figure below.
The binomial distribution applies to yes/no cases, i.e. cases where you want to know the probability of something happening vs. it not happening.
Say we want to know the probability of getting a 1 when rolling five 6-sided dice. Then p = 1/6 (the probability of rolling a 1 on one die), and q = 1 - p = 5/6 (the probability of NOT rolling a 1). The binomial distribution applies to this case, with P_B(x; 5, 1/6). The plot is shown below.
[Figure: Binomial PDF for 10 coins (P_B(x; 10, ½) vs. x)]

>> binopdf(0:5,5,1/6.)
ans = 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001

[Figure: PDF for rolling a 1 on five 6-sided dice (P_B(x; 5, 1/6) vs. number of 1's rolled)]
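A minimal MatLAB sketch to reproduce the two plots (titles and labels here are only illustrative):

figure
subplot(1,2,1)
bar(0:10, binopdf(0:10, 10, 0.5))     % 10 fair coins
xlabel('x'), title('Binomial PDF for 10 coins')
subplot(1,2,2)
bar(0:5, binopdf(0:5, 5, 1/6))        % five 6-sided dice, "success" = rolling a 1
xlabel('Number of 1''s rolled'), title('PDF for rolling a 1 on five 6-sided dice')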
Binomial Distribution Mean
Let’s say we toss 10 coins N = 100 times. Then we would multiply the PDF by N to find out how many times we would get x heads.
The mean of the distribution is, as before:

\mu = \lim_{N \to \infty} \frac{1}{N} \sum_i x_i = \sum_{x=0}^{n} x P_B(x; n, p) = \sum_{x=0}^{n} x \frac{n!}{x!(n-x)!} p^x (1-p)^{n-x} = np

[Figure: Binomial PDF for 10 coins, multiplied by N = 100, with the mean μ marked]
For 10 coins, with p = ½, we get μ = np = 5.
For 5 dice, with p = 1/6, we get μ = np = 5/6.
[Figure: PDF for rolling a 1 on five 6-sided dice, with the mean μ marked]
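A quick numerical check of μ = np:

x = 0:10;
sum(x .* binopdf(x, 10, 0.5))    % 5      = np for 10 coins with p = 1/2
x = 0:5;
sum(x .* binopdf(x, 5, 1/6))     % 0.8333 = np for 5 dice with p = 1/6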
Binomial Standard Deviation
The standard deviation of the distribution is the square root of the variance (the second central moment):

\sigma^2 = \lim_{N \to \infty} \frac{1}{N} \sum_i (x_i - \mu)^2 = \sum_{x=0}^{n} (x - \mu)^2 P_B(x; n, p) = \sum_{x=0}^{n} (x - \mu)^2 \frac{n!}{x!(n-x)!} p^x (1-p)^{n-x} = np(1-p)
For 10 coins, with p = ½, we get \sigma = \sqrt{np(1-p)} = \sqrt{2.5} \approx 1.58.
For 5 dice, with p = 1/6, we get \sigma = \sqrt{np(1-p)} = \sqrt{25/36} \approx 0.83.

[Figure: Binomial PDF for 10 coins and PDF for rolling a 1 on five 6-sided dice, with the mean μ and standard deviation σ marked]
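And a similar check of σ² = np(1-p) for the 10-coin case:

x = 0:10;  P = binopdf(x, 10, 0.5);      % 10 fair coins
mu = sum(x .* P);                        % 5
sum((x - mu).^2 .* P)                    % 2.5000 = np(1-p)
sqrt(sum((x - mu).^2 .* P))              % 1.5811 = standard deviation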
Summary of Binomial Distribution
The binomial distribution is P_B:

P_B(x; n, p) = \frac{n!}{x!(n-x)!} p^x (1-p)^{n-x}

The mean is \mu = np.
The standard deviation is \sigma = \sqrt{np(1-p)}.
Poisson Distribution
An approximation to the binomial distribution is very useful for the case where n is very large (think of rolling a die with an effectively infinite number of sides) and p is very small; this limit is called the Poisson distribution.
This is the case for counting experiments, such as the decay of radioactive material, or measuring photons at low light levels.
To derive it, start with the binomial distribution with n large and p << 1, but with a well-defined mean μ = np. Then

P_B(x \ll n; n, p \ll 1) = \frac{1}{x!} \frac{n!}{(n-x)!} p^x (1-p)^{n-x}.

The term \frac{n!}{(n-x)!} \approx n^x, because x is small compared with n, so most of the factors cancel, leaving a total of x factors each approximately equal to n. This gives

P_B(x \ll n; n, p \ll 1) \approx \frac{1}{x!} (np)^x (1-p)^{n-x} = \frac{\mu^x}{x!} (1-p)^{-x} (1-p)^{n}.
Poisson Distribution
Now, the term (1 - p)^{-x} \to 1 for small p, and with some algebra we can show that the term (1 - p)^n \to e^{-\mu}.
Thus, the final Poisson distribution depends only on x and μ, and is defined as

P_P(x; \mu) = \frac{\mu^x}{x!} e^{-\mu}.
The text shows that the expectation value of x (i.e. the mean) is

\langle x \rangle = \sum_{x=0}^{\infty} x \frac{\mu^x}{x!} e^{-\mu} = \mu.

Remarkably, the standard deviation is given by the second moment as

\sigma^2 = \langle (x - \mu)^2 \rangle = \sum_{x=0}^{\infty} (x - \mu)^2 \frac{\mu^x}{x!} e^{-\mu} = \mu.

These are a little tedious to prove, but all we need for now is to know that the standard deviation is the square root of the mean.
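A minimal MatLAB sketch comparing the two distributions for large n and small p with the same mean μ = np (the two rows of output should be nearly identical):

mu = 2;  n = 1000;  p = mu/n;   % large n, small p, same mean
x = 0:8;
binopdf(x, n, p)                % binomial probabilities
poisspdf(x, mu)                 % Poisson probabilities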
Example 2.3
Some students measure some background counts of cosmic rays. They
recorded numbers of counts in their detector for a series of 100 2-s
intervals, and found a mean of 1.69 counts/interval. They can use the
standard deviation formula from Chapter 1, which is

s^2 = \frac{1}{N} \sum (x_i - \bar{x})^2,

to get a standard deviation directly from the data. They do this and get s = 1.29. They can also estimate the standard deviation by \sqrt{1.69} \approx 1.30.
Now they change the length of time they count from 2-s intervals to 15-s intervals, so the mean number of counts in each interval will increase.
Now they measure a mean of 11.48, which implies \sqrt{11.48} \approx 3.39, while they again calculate s directly from their measurements and find s = 3.17.
We can plot the theoretical distributions using MatLAB poisspdf(x,mu),
e.g. poisspdf(0:8,1.69) gives
ans = 0.1845 0.3118 0.2635 0.1484 0.0627 0.0212 0.0060 0.0014 0.0003
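A short MatLAB sketch that mimics this counting experiment with simulated data (not the students' measurements); poissrnd is in the Statistics Toolbox, and the exact values change from run to run:

counts = poissrnd(1.69, 1, 100);        % simulate 100 2-s intervals
mean(counts)                            % close to 1.69
std(counts, 1)                          % close to sqrt(mean), per the 1/N formula above
sqrt(mean(counts))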
Example 2.3, cont’d
[Figure: Poisson Distribution for mean 1.69 (probability of counts per interval vs. number of counts)]
[Figure: Poisson Distribution for mean 11.48 (probability of counts per interval vs. number of counts per 15 s)]
The distributions for these two cases are shown in the plots above.
You can see that for a small mean, the
distribution is quite asymmetrical. As
the mean increases, the distribution
becomes somewhat more symmetrical
(but is still not symmetrical at 11.48
counts/interval).
I have overplotted the mean and
standard deviation. You can see that
the mean does not coincide with the
peak (the most probable value).
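A minimal MatLAB sketch of the higher-mean plot with the mean and plus/minus one standard deviation marked (axis limits here are only illustrative):

mu = 11.48;  x = 0:25;
bar(x, poisspdf(x, mu)), hold on
plot([mu mu], [0 0.14], 'k')                          % mean
plot([mu-sqrt(mu) mu-sqrt(mu)], [0 0.14], 'k--')      % mean - sigma
plot([mu+sqrt(mu) mu+sqrt(mu)], [0 0.14], 'k--')      % mean + sigma
xlabel('Number of Counts per 15 s')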
Example 2.3, cont’d
Here is the higher-mean plot with the equivalent Gaussian (normal
distribution) overlaid.
[Figure: Poisson Distribution for mean 11.48 with the equivalent Gaussian overlaid (probability of counts per interval vs. number of counts per 15 s)]
For large means (high counts), the Poisson distribution approaches the
Gaussian distribution, which we will describe further next time.
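A minimal MatLAB sketch of this comparison; normpdf is in the Statistics Toolbox, and the Gaussian is drawn with the same mean and σ = √μ:

mu = 11.48;  x = 0:25;  xx = 0:0.1:25;
bar(x, poisspdf(x, mu)), hold on
plot(xx, normpdf(xx, mu, sqrt(mu)), 'r')   % Gaussian with mean mu and sigma = sqrt(mu)
xlabel('Number of Counts per 15 s')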