Random Variable
Stat 31, Section 1, Last Time
• Big Rules of Probability
  – The not rule
  – The or rule
  – The and rule: P{A & B} = P{A|B}P{B} = P{B|A}P{A}
• Bayes Rule
  (turn around Conditional Probabilities)
• Independence
Independence
(Need one more major concept at this level)
An event A does not depend on B, when knowledge of B does not change the chances of A:
P{A | B} = P{A}
New Ball & Urn Example
H: RRRRGG
T: RRG
Again toss coin, and draw ball:
P{R | I} = 2/3
P{R} = P{R | I} P{I} + P{R | II} P{II} = (2/3)(1/2) + (2/3)(1/2) = 2/3
Same, so R & I are independent events
(Not true above, but works here, since the proportions of R & G are the same)
Independence
Note, when A is independent of B:
P{A} = P{A | B} = P{A & B} / P{B}
so P{A & B} = P{A} P{B}
And thus P{B | A} = P{A & B} / P{A} = P{A} P{B} / P{A} = P{B}
i.e. B is independent of A
Independence
Note, when A is independent of B:
It follows that: B is independent of A
I.e. “independence” is symmetric in A and B
(as expected)
More formal treatments use symmetric version
as definition
(to avoid hassles with 0 probabilities)
Independence
HW:
4.33
Special Case of “And” Rule
For A and B independent:
P{A & B} = P{A | B} P{B} = P{B | A} P{A} = P{A} P{B}
i.e. When independent, just multiply probabilities…
Independent “And” Rule
E.g. Toss a coin until the 1st Head appears,
find P{3 tosses}:
Model: tosses are independent
(saw this was reasonable last time, using “equally likely sample space” ideas)
P{3 tosses} = P{T1 & T2 & H3}
When have 3 events: group with parentheses
Independent “And” Rule
E.g. Toss a coin until the 1st Head appears, find P{3 tosses}
P{3 tosses} = P{T1 & T2 & H3}
            = P{H3 | T1 & T2} P{T1 & T2}
(by indep:)
            = P{H3} P{T2 | T1} P{T1}
            = P{H3} P{T2} P{T1}
I.e. “just multiply”
Independent “And” Rule
E.g. Toss a coin until the 1st Head appears,
P{3 tosses} = P{H3} P{T2} P{T1} = (1/2)(1/2)(1/2) = 1/8 for a fair coin
(a simulation check appears after this slide)
• Multiplication idea holds in general
• So from now on will just say:
  “Since Independent, multiply probabilities”
• Similarly for Exclusive Or rule,
  will just “add probabilities”
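That “just multiply” answer is easy to sanity-check by simulation. The sketch below is my own illustration (not from the slides; the helper name simulate_first_head is made up): it tosses a fair coin until a head appears and estimates P{first head on toss 3}, which should come out near 1/8 = 0.125.

    import random

    def simulate_first_head(n_trials=100_000, seed=0):
        """Estimate P{first head appears on toss 3} for a fair coin."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_trials):
            tosses = 0
            while True:
                tosses += 1
                if rng.random() < 0.5:   # heads ends the game
                    break
            if tosses == 3:              # pattern T, T, H
                hits += 1
        return hits / n_trials

    print(simulate_first_head())         # roughly 0.125 = (1/2)(1/2)(1/2)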
Independent “And” Rule
HW:
4.31
4.35
Overview of Special Cases
Careful: these can be tricky to keep separate
OR works like adding, for mutually exclusive
AND works like multiplying, for independent
Overview of Special Cases
Caution: the special cases are different:
mutually exclusive ≠ independent
For A and B mutually exclusive (with P{A} > 0):
P{A | B} = 0 ≠ P{A}
Thus not independent
Overview of Special Cases
HW: C13 Suppose events A, B, C all have
probability 0.4, A & B are independent,
and A & C are mutually exclusive.
(a) Find P{A or B}   (0.64)
(b) Find P{A or C}   (0.8)
(c) Find P{A and B}   (0.16)
(d) Find P{A and C}   (0)
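These answers follow directly from the special-case rules above. As a check only (my own illustration, not part of the homework), the arithmetic in Python:

    from fractions import Fraction

    # HW C13: P{A} = P{B} = P{C} = 0.4; A,B independent; A,C mutually exclusive
    pA = pB = pC = Fraction(4, 10)

    p_A_and_B = pA * pB              # independent: multiply    = 0.16
    p_A_and_C = Fraction(0)          # mutually exclusive       = 0
    p_A_or_B = pA + pB - p_A_and_B   # general or rule          = 0.64
    p_A_or_C = pA + pC - p_A_and_C   # reduces to just adding   = 0.8

    for p in (p_A_or_B, p_A_or_C, p_A_and_B, p_A_and_C):
        print(float(p))              # 0.64, 0.8, 0.16, 0.0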
Random Variables
Text, Section 4.3 (we are currently jumping)
Idea: take probability to next level
Needed for probability structure of political
polls, etc.
Random Variables
Definition:
A random variable, usually denoted as X,
is a quantity that “takes on values at
random”
Random Variables
Two main types
(that require different mathematical models)
• Discrete, i.e. counting
  (so look only at “counting numbers”, 1, 2, 3, …)
• Continuous, i.e. measuring
  (harder math, since need all fractions, etc.)
Random Variables
E.g:
X = # for Candidate A in a randomly
selected political poll: discrete
(recall all that means)
Power of the random variable idea:
• Gives something to “get a hold of…”
• Similar in spirit to high school algebra:
  give unknowns a name, so we can work with them
Random Variables
E.g:
X = # that comes up, in die rolling: Discrete
• But not too interesting
• Since can study by simple methods
• As done above
• Don’t really need random variable concept
Random Variables
E.g: Measurement error:
Let X = measurement:
Continuous
• How to model probabilities???
Random Variables
HW on discrete vs. continuous:
4.40
((b) discrete, (c) continuous, (d) could be either, but discrete is more common)
And now for something
completely different
My idea about “visualization” last time:
• 30% really liked it
• 70% less enthusiastic…
• Depends on mode of thinking
  – “Visual thinkers” loved it
  – But didn’t connect with others
• So don’t plan to continue that…
Random Variables
A die rolling example
(where random variable concept is useful)
Win $9 if 5 or 6; pay $4 if 1, 2 or 3; otherwise (4) break even
Notes:
• Don’t care about number that comes up
• Random Variable abstraction allows
focussing on important points
• Are you keen to play? (will calculate…)
Random Variables
Die rolling example
Win $9 if 5 or 6; pay $4 if 1, 2 or 3; otherwise (4) break even
Let X = “net winnings”
Note: X takes on values 9, -4 and 0
Probability Structure of X is summarized by:
P{X = 9} = 1/3, P{X = -4} = 1/2, P{X = 0} = 1/6
(should you want to play? study later)
Random Variables
Die rolling example,
for X = “net winnings”:
Win $9 if 5 or 6; pay $4 if 1, 2 or 3; otherwise (4) break even
Probability Structure of X is summarized by:
P{X = 9} = 1/3, P{X = -4} = 1/2, P{X = 0} = 1/6
Convenient form: a table

Winning    9     -4     0
Prob.     1/3    1/2   1/6
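Since each die face has probability 1/6, the table comes from grouping the faces by payoff. A small sketch of that bookkeeping (my own illustration; the winnings helper and the use of Fraction are not from the slides):

    from fractions import Fraction
    from collections import defaultdict

    def winnings(face):
        """Net winnings X for one roll of a fair die."""
        if face in (5, 6):
            return 9       # win $9
        if face in (1, 2, 3):
            return -4      # pay $4
        return 0           # face 4: break even

    dist = defaultdict(Fraction)
    for face in range(1, 7):
        dist[winnings(face)] += Fraction(1, 6)

    print(dict(dist))      # shows -4: 1/2, 0: 1/6, 9: 1/3 (as Fractions), matching the table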
Summary of Prob. Structure
In general: for discrete X, summarize “distribution” (i.e. full prob. structure) by a table:

Values    x1    x2    …    xk
Prob.     p1    p2    …    pk

Where:
i.  All pi are between 0 and 1
ii. p1 + p2 + … + pk = 1   (so get a prob. funct’n as above)
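Conditions i and ii are easy to check mechanically for any proposed table. A minimal sketch (my addition, reusing the die-rolling table as the example):

    from fractions import Fraction

    def is_valid_distribution(table):
        """Check condition i (each p between 0 and 1) and ii (probabilities sum to 1)."""
        probs = table.values()
        return all(0 <= p <= 1 for p in probs) and sum(probs) == 1

    # The die-rolling game table from above
    table = {9: Fraction(1, 3), -4: Fraction(1, 2), 0: Fraction(1, 6)}
    print(is_valid_distribution(table))   # True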
Summary of Prob. Structure
Summarize distribution, for discrete X,
by a table:
Values    x1    x2    …    xk
Prob.     p1    p2    …    pk

Power of this idea:
• Get probs by summing table values
• Special case of disjoint OR rule
Summary of Prob. Structure
E.g.
Die Rolling game above:
Winning    9     -4     0
Prob.     1/3    1/2   1/6

P{X = 9} = 1/3
P{X < 2} = P{X = 0} + P{X = -4} = 1/6 + 1/2 = 2/3
P{X = 5} = 0   (not in table!)
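Getting a probability from the table really is just adding the relevant entries. A short sketch (my own illustration, using a plain dict for the table):

    from fractions import Fraction

    # Distribution table for the die-rolling game
    table = {9: Fraction(1, 3), -4: Fraction(1, 2), 0: Fraction(1, 6)}

    p_less_than_2 = sum(p for x, p in table.items() if x < 2)
    p_equals_5 = table.get(5, Fraction(0))    # value not in table -> probability 0

    print(p_less_than_2)   # 2/3
    print(p_equals_5)      # 0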
Summary of Prob. Structure
E.g.
Die Rolling game above:
Winning    9     -4     0
Prob.     1/3    1/2   1/6

P{X = 9 | X ≥ 0} = P{X = 9 & X ≥ 0} / P{X ≥ 0}
                 = P{X = 9} / P{X ≥ 0}
                 = (1/3) / (1/2) = 2/3
where P{X ≥ 0} = 1/6 + 1/3 = 1/2
Summary of Prob. Structure
HW:
4.47 & (d) Find P{X = 3 | X >= 2}   (0.144, …, 0.352)
4.50   (0.24)
Random Variables
Now consider continuous random variables
Recall: for measurements (not counting)
Model for continuous random variables:
Calculate probabilities as areas,
under “probability density curve”, f(x)
Continuous Random Variables
Model probabilities for continuous random
variables, as areas under “probability
density curve”, f(x):
P{a ≤ X ≤ b} = Area under f(x), from a to b
             = ∫ from a to b of f(x) dx
(calculus notation)
Continuous Random Variables
e.g. Uniform Distribution
Idea:
choose random number from [0,1]
Use constant density:
f(x) = C
Models “equally likely”
To choose C, want:
1 = P{X in [0, 1]} = Area under f(x) = C
So want C = 1.
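The area picture is easy to verify numerically for the uniform density. The sketch below (my addition, not from the slides, using scipy's numerical integrator) checks that the total area is 1 and computes one interval probability:

    from scipy.integrate import quad

    def f(x):
        """Uniform(0, 1) density: the constant C = 1 on [0, 1]."""
        return 1.0

    total_area, _ = quad(f, 0, 1)      # must be 1, which is how C was chosen
    prob, _ = quad(f, 0.25, 0.75)      # P{0.25 <= X <= 0.75} = 0.5

    print(total_area, prob)            # 1.0 0.5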
Uniform Random Variable
HW:
4.52   (0.73, 0, 0.73, 0.2, 0.5)
4.54   (1, 1/2, 1/8)
Continuous Random Variables
e.g. Normal Distribution
Idea: Draw at random from a normal
population
f(x) is the normal curve (studied above)
Review some earlier concepts:
Normal Curve Mathematics
The “normal (μ, σ) density curve” is:

f(x) = (1 / (σ √(2π))) e^( -(1/2) ((x − μ) / σ)² )

where:
  f(x) is the usual “function” of x
  π is the circle constant = 3.14…
  e is the natural number = 2.7…
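Written out directly in code, the formula above looks like this (my own sketch; the comparison with scipy.stats.norm.pdf is just a sanity check, not part of the slides):

    import math
    from scipy.stats import norm

    def normal_density(x, mu, sigma):
        """Normal(mu, sigma) density curve, transcribed from the formula above."""
        return (1.0 / (sigma * math.sqrt(2 * math.pi))) * math.exp(
            -0.5 * ((x - mu) / sigma) ** 2
        )

    # Check against the library version at a few points, for N(1, 0.5)
    for x in (0.0, 1.0, 1.3):
        print(normal_density(x, 1, 0.5), norm.pdf(x, loc=1, scale=0.5))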
Normal Curve Mathematics
Main Ideas:
• Basic shape is: e^( -(1/2) x² )
• “Shifted to mu”: e^( -(1/2) (x − μ)² )
• “Scaled by sigma”: e^( -(1/2) ((x − μ) / σ)² )
• Make Total Area = 1: divide by σ √(2π)
• f(x) → 0 as x → ±∞, but never = 0
Computation of Normal Areas
EXCEL Computation: works in terms of “lower areas”
E.g. for N(1, 0.5): area below 1.3
Computation of Normal Probs
EXCEL Computation: probs given by “lower areas”
E.g. for X ~ N(1, 0.5):
P{X < 1.3} = 0.73
Normal Random Variables
As above, compute probabilities as areas,
In EXCEL, use NORMDIST & NORMINV
E.g. above:
X ~ N(1,0.5)
P{X < 1.3} = NORMDIST(1.3, 1, 0.5, TRUE) = 0.73
(as in pic above)
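The same “lower area” is the normal CDF; outside of EXCEL, a short Python equivalent (my addition, using scipy) is:

    from scipy.stats import norm

    # P{X < 1.3} for X ~ N(1, 0.5): same lower area as NORMDIST(1.3, 1, 0.5, TRUE)
    print(norm.cdf(1.3, loc=1, scale=0.5))    # about 0.726

    # NORMINV goes the other way (area -> cutoff); the scipy analogue is norm.ppf
    print(norm.ppf(0.726, loc=1, scale=0.5))  # about 1.3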
Normal Random Variables
HW:
4.55