
Physics 114: Lecture 7
Probability Density Functions
John F. Federici
NJIT Physics Department
New Jersey TRIVIA QUESTION!
Which Actor/ Actress for the following characters from
GAME OF THRONES is from New Jersey?
(a) Tyrion Lannister
(b) Cersei Lannister
(c) Arya Stark
(d) Jon Snow
(e) Samwell Tarly
New Jersey TRIVIA QUESTION!
Which Actor/ Actress for the following characters from
GAME OF THRONES is from New Jersey?
(a) Tyrion Lannister –
Peter Dinklage was born in Morristown, NJ
Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian
• Binomial Distribution – Applied in experiments where the small
number of FINAL states is important rather than the details of
HOW the state is exactly created. Final-state answers are
usually ‘yes’ or ‘no’.
EXAMPLE: If one flips 10 coins, how often do 6 heads and
4 tails occur? …. Note that the state of an INDIVIDUAL coin is not
important.
EXAMPLE: Rolling of dice (6-sided or N-sided). In the game
of Craps, how often is a “7” rolled with two six-sided dice?
(See the short MatLAB check below.)
Poisson and Gaussian distributions are limiting cases of the
Binomial.
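A quick numerical check of the Craps example (a minimal sketch in MatLAB; the enumeration of all 36 outcomes and the variable names are my own illustration):

[d1, d2] = meshgrid(1:6, 1:6);           % all 36 equally likely two-dice outcomes
pSeven = sum(d1(:) + d2(:) == 7) / 36    % 6/36 = 1/6, the chance of rolling a "7"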
Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian
• Poisson Distribution – The average number of ‘successes’ (e.g.
counting something) is much smaller than the possible number
of events.
EXAMPLE: Counting the number of alpha particles which are
emitted during radioactive decay.
EXAMPLE: Counting the number of photons (particles of
light) emitted by a light source.
Probability Distribution Functions
Most important – Binomial, Poisson, Gaussian
• Gaussian Distribution – The number of different observations is
large AND the probability of a ‘success’ (e.g. counting
something) is large as well.
Used to describe the distribution of random observations for
many types of experiments.
Binomial Distribution

If you raise the sum of two variables to a power, you get:

(a + b)^0 = 1
(a + b)^1 = a + b
(a + b)^2 = a^2 + 2ab + b^2
(a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3
(a + b)^4 = a^4 + 4a^3 b + 6a^2 b^2 + 4ab^3 + b^4

Writing only the coefficients, you begin to see a pattern:

            1
          1   1
        1   2   1
      1   3   3   1
    1   4   6   4   1
Binomial Distribution
Remarkably, this pattern is also the one that governs the possibilities of
tossing n coins:

  n       coefficients        2^n
  0            1                1
  1          1   1              2
  2        1   2   1            4
  3      1   3   3   1          8
  4    1   4   6   4   1       16

Number of combinations in each row: C(n, x) = n!/(x!(n - x)!) = (n choose x)
(n is the TOTAL number of coins, x is the number of tails)




With 3 coins, there are 8 ways for them to land, as shown above.
In general, there are 2^n possible ways for n coins to land.
How many permutations are there for a given row above, e.g. how
many permutations for getting 1 head and 2 tails? Obviously, 3.
How many permutations are there for x heads and n - x tails, for general n and x?
Probability





With fair coins, tossing a coin will result in an equal chance of 50%,
or ½, of its ending up heads. Let us call this probability p.
Obviously, the probability of tossing tails, q, is q = (1 - p).
With 3 coins, the probability of getting any single one of the
combinations is 1/2^n = 1/8 (since there are 8 combinations, and
each is equally probable). This comes from (½)(½)(½), or the
product of each probability p = ½ to get heads.
If we want to know the probability of getting, say, 1 heads and 2
tails, we just need to multiply the probability of any combination
(1/8) by the number of ways of getting 1 heads and 2 tails, i.e.
3, for a total probability of 3/8.
To be really general, say the coins were not fair, so p ≠ q. Then
the probability to get heads, tails, tails would be (p)(q)(q) = p^1 q^2.
Finally, the probability P(x; n, p) of getting x heads given n coins,
each of which has probability p, is

P(x; n, p) = (n choose x) p^x q^(n - x) = [n!/(x!(n - x)!)] p^x (1 - p)^(n - x).
Binomial Distribution

This is the binomial distribution, which we write PB:

P_B(x; n, p) = [n!/(x!(n - x)!)] p^x (1 - p)^(n - x).

Let’s see if it works. For 1 heads with a toss of 3 fair coins, x = 1, n = 3, p = ½, we get

P_B(1; 3, ½) = [3!/(1!(3 - 1)!)] (½)^1 (½)^2 = 3/8.

For no heads, and all tails, we get (note: 0! = 1)

P_B(0; 3, ½) = [3!/(0!(3)!)] (½)^0 (½)^3 = 1/8.

Say the coins are not fair, but p = ¼. Then the probability of 2 heads
and 1 tails is:

P_B(2; 3, ¼) = [3!/(2!(3 - 2)!)] (¼)^2 (¾)^1 = 3 × 3/64 = 9/64.

You’ll show for homework that the sum of all probabilities for this (and
any) case is 1, i.e. the probabilities are normalized.
Binomial Distribution

To see the connection of this to the sum of two variables raised to a
power, replace a and b with p and q:

(p + q)^0 = 1
(p + q)^1 = p + q
(p + q)^2 = p^2 + 2pq + q^2
(p + q)^3 = p^3 + 3p^2 q + 3pq^2 + q^3

Since p + q = 1, each of these powers also equals one on the left side,
while the right side expresses how the probabilities are split among the
different combinations. When p = q = ½, for example, the binomial
triangle becomes

                    1
               1/2     1/2
            1/4    2/4    1/4
         1/8    3/8    3/8    1/8
      1/16   4/16   6/16   4/16   1/16
In MatLAB, use binopdf(x,n,p) to calculate one row of this triangle,
e.g. binopdf(0:3,3,0.5) prints 0.125, 0.375, 0.375, 0.125.
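binopdf can also be used to double-check the coin-toss examples worked above (a minimal sketch, assuming MatLAB's Statistics Toolbox binopdf and the built-in nchoosek):

binopdf(1, 3, 0.5)     % 1 head with 3 fair coins: 0.3750 = 3/8
binopdf(0, 3, 0.5)     % 0 heads with 3 fair coins: 0.1250 = 1/8
binopdf(2, 3, 0.25)    % 2 heads with unfair coins, p = 1/4: 0.1406 = 9/64
nchoosek(3, 1)         % number of ways to get 1 head and 2 tails: 3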
Binomial Distribution

[Figure: Binomial PDF for 10 coins]
[Figure: PDF for rolling a 1 on five 6-sided dice]

Let’s say we toss 10 coins, and ask how
many heads we will see. The 10th row of
the triangle would be plotted as at right.
The binomial distribution applies to
yes/no cases, i.e. cases where you want
to know the probability of something
happening, vs. it not happening.
Say we want to know the probability of
getting a 1 when rolling five 6-sided dice.
Then p = 1/6 (the probability of rolling a 1
on one die), and q = 1 - p = 5/6 (the
probability of NOT rolling a 1). The
binomial distribution applies to this case,
with P_B(x; 5, 1/6). The plot is shown at
right.
>> binopdf(0:5,5,1/6.)
ans = 0.4019 0.4019 0.1608
0.0322 0.0032 0.0001
Binomial Distribution

Y = binopdf(X,N,P) computes the binomial pdf:
X: number of ‘1’s in the final state.
N: number of dice which are rolled.
P: probability of success. For a 6-sided die, there
is only one ‘1’, so the probability is just 1/6.

>> binopdf(0:5,5,1/6.)
ans = 0.4019 0.4019 0.1608 0.0322 0.0032 0.0001
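The two plots referenced in the last few slides can be reproduced directly from binopdf. A minimal sketch (assuming the Statistics Toolbox; the plotting commands and labels are my own):

% Binomial PDF for 10 coins (p = 1/2)
figure; bar(0:10, binopdf(0:10, 10, 0.5));
xlabel('x'); title('Binomial PDF for 10 coins');

% PDF for rolling a 1 on five 6-sided dice (p = 1/6)
figure; bar(0:5, binopdf(0:5, 5, 1/6));
xlabel('Number of 1''s rolled'); title('PDF for rolling a 1 on five 6-sided dice');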
Binomial Distribution Mean

μ = lim_{N→∞} (1/N) Σ_i x_i = Σ_{x=0}^{n} x P_B(x; n, p)
  = Σ_{x=0}^{n} x [n!/(x!(n - x)!)] p^x (1 - p)^(n - x) = np
Binomial Distribution

Let’s say we toss 10 coins N = 100 times. Then we would multiply the
PDF by N to find out how many times we would have x number of heads.
The mean of the distribution is, as before, μ = np.
For 10 coins, with p = ½, we get μ = np = 5.
For 5 dice, with p = 1/6, we get μ = np = 5/6.

[Figure: Binomial PDF for 10 coins, scaled by N = 100, with the mean μ marked]
[Figure: PDF for rolling a 1 on five 6-sided dice, with the mean μ marked]
Binomial Standard Deviation

The standard deviation of the distribution
is the “second moment,” given by the
variance:

σ^2 = lim_{N→∞} (1/N) Σ_i (x_i - μ)^2 = Σ_{x=0}^{n} (x - μ)^2 P_B(x; n, p)
    = Σ_{x=0}^{n} (x - μ)^2 [n!/(x!(n - x)!)] p^x (1 - p)^(n - x) = np(1 - p)

Binomial Distribution

For 10 coins, with p = ½, we get
σ = sqrt(np(1 - p)) = sqrt(2.5) ≈ 1.58.
For 5 dice, with p = 1/6, we get
σ = sqrt(np(1 - p)) = sqrt(25/36) ≈ 0.83.

[Figure: Binomial PDF for 10 coins, with μ and σ marked]
[Figure: PDF for rolling a 1 on five 6-sided dice, with μ and σ marked]
Summary of Binomial Distribution
x: number of times the ‘die number’ (e.g. a “1”) comes up on the n dice.
n: number of dice.
p: probability that the ‘die number’ comes up on a single die (e.g. 1/M,
where M is the number of faces on the die).

The binomial distribution is PB: P_B(x; n, p) = [n!/(x!(n - x)!)] p^x (1 - p)^(n - x).

The mean is μ = np.

The standard deviation is σ = sqrt(np(1 - p)).
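All three results in this summary are easy to verify numerically, e.g. for the 10-coin case (a minimal sketch, assuming binopdf from the Statistics Toolbox):

n = 10; p = 0.5; x = 0:n;
P = binopdf(x, n, p);
sum(P)                                 % normalization: 1
mu = sum(x .* P)                       % mean: np = 5
sigma = sqrt(sum((x - mu).^2 .* P))    % standard deviation: sqrt(np(1-p)) = sqrt(2.5) = 1.58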
Example from Exam
Problem 3. Assume you have 3 marked, 6-sided dice (faces are numbered 1 through 6)
and roll the collection simultaneously.
a) Assuming fair dice, what are the arguments (x, n and p) of PB(x; n, p) in the case of
rolling all 3 dice such that only ONE of the dice shows a “4”?
b) How many different ways are there to roll the dice with only ONE of the dice showing a
“4”?
c) On any given roll of all 3 dice, what would be the probability of rolling two “4s”?
d) Given your answer above, if you were to roll the collection of dice 1000 times, how
many times would you expect to roll two “4s”?
Let’s go through this example……
Example from Exam
a) Assuming fair dice, what are the arguments (x, n and p) of PB(x; n, p) in the case of
rolling all 3 dice such that only ONE of the dice shows a “4”?
x = 1: the number of times the ‘correct’ die number (e.g. a “4”) comes up on the n dice.
n = 3: the number of dice thrown SIMULTANEOUSLY (NOT the total number of rolls).
p = 1/6: for a ‘fair’ 6-sided die.
b) How many different ways are there to roll the dice with only ONE of the dice showing a
“4”?

P_B(x; n, p) = [n!/(x!(n - x)!)] p^x (1 - p)^(n - x)

P_B(1; 3, 1/6) = [3!/(1!(3 - 1)!)] (1/6)^1 (1 - 1/6)^(3-1) = 3 × (1/6) × (5/6)^2 = 75/216
Example from Exam
b) How many different ways are there to roll the dice with only ONE of the dice showing a
“4”?

P_B(1; 3, 1/6) = [3!/(1!(3 - 1)!)] (1/6)^1 (1 - 1/6)^(3-1) = 3 × (1/6) × (5/6)^2 = 75/216

N × P_B(1; 3, 1/6) = 6^3 × 75/216 = 75

(6^3 = 216 is the total number of possible combinations; PB(1; 3, 1/6) is the probability of
the “correct” combination.)
Example from Exam
a) x = 1, n = 3, p = 1/6
b) 75
c) On any given roll of all 3 dice, what would be the probability of rolling two “4s”?
x = 2, n = 3, p = 1/6

P_B(2; 3, 1/6) = [3!/(2!(3 - 2)!)] (1/6)^2 (1 - 1/6)^(3-2) = 3 × (1/36) × (5/6) ≈ 0.0694
Example from Exam
a) x = 1, n = 3, p = 1/6
b) 75
c) x = 2, n = 3, p = 1/6; PB(2; 3, 1/6) ≈ 0.0694
d) Given your answer above, if you were to roll the collection of dice 1000 times, how
many times would you expect to roll two “4s”?

N × P_B(2; 3, 1/6) = 1000 × 0.0694 ≈ 69

(Here 1000 is the total number of rolls; PB(2; 3, 1/6) is the probability of rolling
two “4s” on any single roll.)
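These answers can be checked quickly with binopdf (a minimal sketch, assuming MatLAB's Statistics Toolbox):

6^3 * binopdf(1, 3, 1/6)     % part (b): number of ways = 216 * (75/216) = 75
binopdf(2, 3, 1/6)           % part (c): probability of two "4"s = 0.0694
1000 * binopdf(2, 3, 1/6)    % part (d): expected number in 1000 rolls, about 69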
Poisson Distribution

An approximation to the binomial distribution is very useful for the case
where n is very large (i.e. rolls of a die with an infinite number of sides?)
and p is very small. This is called the Poisson distribution.
Equivalently: a LARGE number of possible outcomes, but the probability of any
individual outcome is small.
This is the case for counting experiments, such as the decay of
radioactive material, or measuring photons at low light levels.
To derive it, start with the binomial distribution with n large and p << 1,
but with a well-defined mean μ = np. Then

P_B(x << n; n, p << 1) = (1/x!) [n!/(n - x)!] p^x (1 - p)^(n - x).

The term n!/(n - x)! ≈ n^x, because x is small, so most of the terms cancel,
leaving a total of x terms each approximately equal to n.
(Stirling’s approximation: N! ≈ N^N e^(-N) √(2πN).)
This gives

P_B(x << n; n, p << 1) ≈ (n^x/x!) p^x (1 - p)^(n - x) = [(np)^x/x!] (1 - p)^(-x) (1 - p)^n
                       = (μ^x/x!) (1 - p)^(-x) (1 - p)^n.
Poisson Distribution


Now, the term (1 - p)^(-x) ≈ 1 for small p, and with some algebra we can show
that the term (1 - p)^n ≈ e^(-μ).
Thus, the final Poisson distribution depends only on x and μ, and is defined as

P_P(x; μ) = (μ^x/x!) e^(-μ).

The text shows that the expectation value of x (i.e. the mean) is

⟨x⟩ = Σ_{x=0}^{∞} x (μ^x/x!) e^(-μ) = μ.

Remarkably, the standard deviation is given by the second moment (the variance) as

σ^2 = ⟨(x - μ)^2⟩ = Σ_{x=0}^{∞} (x - μ)^2 (μ^x/x!) e^(-μ) = μ.

These are a little tedious to prove, but all we need for now is to know that
the standard deviation is the square root of the mean.
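Both the limiting relationship to the binomial and the fact that the mean and variance each equal μ can be checked numerically. A minimal sketch (assuming binopdf and poisspdf from the Statistics Toolbox; the particular values n = 1000 and μ = 2 are just an illustration):

mu = 2; n = 1000; p = mu/n;                    % large n, small p, fixed mean mu = np
x = 0:20;
max(abs(binopdf(x, n, p) - poisspdf(x, mu)))   % tiny: the binomial approaches the Poisson
PP = poisspdf(x, mu);
sum(x .* PP)                                   % approximately mu = 2 (mean)
sum((x - mu).^2 .* PP)                         % approximately mu = 2 (variance)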
Example 2.4
Note: SMALL number of counts, so the Poisson distribution is appropriate.

Some students measure some background counts of cosmic rays. They
recorded the numbers of counts in their detector for a series of 100 2-s
intervals, and found a mean of 1.69 counts/interval. They can use the
standard deviation formula from Chapter 1,

s^2 = (1/N) Σ_i (x_i - x̄)^2,

to get a standard deviation directly from the data. They do this and get s =
1.29. They can also estimate the standard deviation by σ ≈ √1.69 ≈ 1.30.
Now they change the length of time they count from 2-s intervals to 15-s
intervals, so the mean number of counts in each interval will increase. They
now measure a mean of 11.48, which implies σ ≈ √11.48 ≈ 3.39, while
they again calculate s directly from their measurements to find s = 3.39.
We can plot the theoretical distributions using the MatLAB function poisspdf(x,mu),
e.g. poisspdf(0:8,1.69) gives
ans = 0.1845 0.3118 0.2635 0.1484 0.0627 0.0212 0.0060 0.0014 0.0003
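The students' comparison of s (computed directly from the data) with the Poisson estimate √(mean) can be mimicked with simulated counts. A sketch (assuming poissrnd from the Statistics Toolbox; the means 1.69 and 11.48 are taken from the example, but the simulated data are not the students' actual measurements):

counts2s  = poissrnd(1.69,  [100, 1]);      % simulate 100 2-s intervals
counts15s = poissrnd(11.48, [100, 1]);      % simulate 100 15-s intervals
[std(counts2s, 1),  sqrt(mean(counts2s))]   % the two estimates should be close
[std(counts15s, 1), sqrt(mean(counts15s))]  % and close to sqrt(11.48) = 3.39

Here std(X,1) uses the 1/N normalization, matching the s^2 formula above.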
Example 2.4, cont’d
[Figure: Poisson distribution for mean 1.69; probability of counts per interval vs. number of counts]
[Figure: Poisson distribution for mean 11.48; probability of counts per interval vs. number of counts per 15 s]

The distributions for these two cases are shown in the plots at right.
You can see that for a small mean, the distribution is quite asymmetrical. As
the mean increases, the distribution becomes somewhat more symmetrical
(but is still not symmetrical at 11.48 counts/interval).
I have overplotted the mean and standard deviation. You can see that
the mean does not coincide with the peak (the most probable value).
Example 2.4, cont’d

Here is the higher-mean plot with the equivalent Gaussian (normal
distribution) overlaid.

[Figure: Poisson distribution for mean 11.48 with the equivalent Gaussian overlaid; probability of counts per interval vs. number of counts per 15 s]

For large means (high counts), the Poisson distribution approaches the
Gaussian distribution, which we will describe further next time.
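The overlay can be reproduced with poisspdf and normpdf (a minimal sketch, assuming the Statistics Toolbox; the Gaussian uses mean 11.48 and standard deviation sqrt(11.48)):

mu = 11.48; x = 0:25;
bar(x, poisspdf(x, mu)); hold on;
xx = 0:0.1:25;
plot(xx, normpdf(xx, mu, sqrt(mu)), 'r-', 'LineWidth', 2);
xlabel('Number of Counts per 15 s'); ylabel('Probability of counts per interval');
hold off;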
Rolling Dice - Electronically
For the homework assignment for this week, you will be ‘rolling
dice’ a large number of times to generate some data for analysis.
Rather than rolling physical dice, we are going to ‘roll’ the
dice in software.
Write a MatLAB function that mimics the rolling of a 6-sided die.
• Use the ‘randi’ function (NOT randn) to generate a vector of 1000
random integers between 1 and 6. The syntax is
DieNum=randi(6,[1000,1]);
• Mimic the rolling of 2 dice. Add the results of both dice together.
• ‘Roll’ the two dice together a total of 100 times.
• How often does a ‘7’ come up? How does this compare to the probability of
rolling a 7?
• ‘Roll’ the two dice together a total of 1000 times.
• How often does a ‘7’ come up? How does this compare to the probability of
rolling a 7?
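A minimal sketch of the kind of script the assignment asks for (the randi call is the one given above; the variable names and the counting of sevens are my own illustration):

Nrolls = 100;                     % change to 1000 for the second part
Die1 = randi(6, [Nrolls, 1]);     % 'roll' the first die Nrolls times
Die2 = randi(6, [Nrolls, 1]);     % 'roll' the second die Nrolls times
Total = Die1 + Die2;              % sum of the two dice on each roll
NumSevens = sum(Total == 7)       % how often a '7' comes up
NumSevens / Nrolls                % compare to the expected probability 6/36 = 1/6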