Random Variables
Math 480 Lecture
By Sara Billey
Outline
• Last time: Defined sample spaces,
probability distribution functions
independence, uniform distribution,
Bernoulli process, and histograms.
• Discussed Law of Large Numbers: Roughly,
histograms with enough data will approach
the graph of the probability distribution
function
P: S -> [0,1].
• Today: Random variables and expectations.
• Model the game Pass the Pigs.
Random Variables
• Def: Given a random process with sample space
S and probability P: S -> [0,1], a random variable
N is any function
N: S -> Set of Numbers
such as the integers, reals, or complexes.
• Every random variable has associated
probabilities for each event of the form
(N=i) = { w in S : N(w)=i }.
P(N=i) = P({ w in S : N(w)=i }) = Σ_{w : N(w)=i} P(w).
Random Variables
• Def: Given a random process with sample space S and
probability P: S -> [0,1], a random variable N is any
function
N: S -> Set of Numbers
such as the integers, reals, or complexes.
• Example: Nbox = height in mm of object
Height of tennis ball = 68 mm
Height of ping pong ball = 40 mm
Height of marble = 22 mm
Height of zometool = 9 mm
Random Variables
• Example: Nbox = height in mm of object, Nbox: S -> {9, 22, 40, 68}.
P(Nbox = 68) “=” 16/20
P(Nbox = 40) “=” 2/20
P(Nbox = 22) “=” 1/20
P(Nbox = 9) “=” 1/20
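A minimal Python sketch of this setup (ours, not from the lecture), assuming the sample space behind these estimates consists of 20 objects drawn uniformly at random: 16 tennis balls, 2 ping pong balls, 1 marble, and 1 Zometool piece (counts read off the 16/20, 2/20, 1/20, 1/20 figures above).

```python
from fractions import Fraction

# Sample space S: 20 objects, one drawn uniformly at random.
# Assumed counts: 16 tennis balls, 2 ping pong balls, 1 marble, 1 Zometool.
S = ["tennis"] * 16 + ["ping pong"] * 2 + ["marble", "zometool"]
P = Fraction(1, len(S))                       # P(w) = 1/20 for every w in S

# The random variable Nbox: S -> {9, 22, 40, 68} (height in mm).
HEIGHT = {"tennis": 68, "ping pong": 40, "marble": 22, "zometool": 9}

def prob(i):
    """P(Nbox = i) = sum of P(w) over the outcomes w with Nbox(w) = i."""
    return sum(P for w in S if HEIGHT[w] == i)

print(prob(68), prob(40), prob(22), prob(9))  # 4/5 1/10 1/20 1/20
```

The `prob` helper computes P(Nbox = i) exactly as in the definition: sum P(w) over the event { w in S : Nbox(w) = i }.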
Random Variables
• What other random variables (RV’s) could we measure on
the same random process?
Expectation
• Def: The expectation of a random variable
X: S -> {x1, x2, …, xk}
with P(X = xi) = pi is
E[X] = x1 p1 + x2 p2 + … + xk pk.
• Ex:
E(Nbox) “=” 68 (16/20) + 40 (2/20) + 22 (1/20) + 9 (1/20) = 59.95
Expectation
• Def: The expectation of a random variable
X: S -> {x1, x2, …, xk}
with P(X = xi) = pi is
E[X] = x1 p1 + x2 p2 + … + xk pk.
• Ex:
E(Nbox) “=” 59.95 = (68*16 + 40*2 + 22 + 9)/20.
By the Law of Large Numbers, E[X] should be close to the
average value of X on a large number of samples from the
random process.
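A quick sketch (our own, not from the slides) that computes E(Nbox) from the definition and then illustrates the Law of Large Numbers claim by averaging simulated samples:

```python
import random

# E[X] from the definition: E[X] = x1*p1 + ... + xk*pk.
values = [68, 40, 22, 9]
probs  = [16/20, 2/20, 1/20, 1/20]
print(sum(x * p for x, p in zip(values, probs)))   # 59.95

# Law of Large Numbers: the average over many samples approaches E[X].
samples = random.choices(values, weights=probs, k=100_000)
print(sum(samples) / len(samples))                 # close to 59.95
```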
Basic Fact of Expectations
• Basic Fact: If we have two random variables on the
same random process
X: S  {n1, n2, …, nk}
Y: S  {n1, n2, …, nk},
then E[X+Y] = E[X]+E[Y].
Proof: First, E[X] = Σ_{w in S} X(w) P(w) and E[Y] = Σ_{w in S} Y(w) P(w) as a
consequence of the definition. Therefore,
E[X+Y] = Σ_{w in S} (X(w)+Y(w)) P(w)
= Σ_{w in S} X(w) P(w) + Σ_{w in S} Y(w) P(w)
= E[X]+E[Y].
Doesn’t even matter if X and Y are independent! (Try E[X + X²].)
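A small numerical check of this fact, using the Nbox distribution from earlier and the suggested pair X and X², which are clearly not independent; the `expect` helper is ours, not the lecture's:

```python
from fractions import Fraction

# Distribution of Nbox from the height example.
dist = {68: Fraction(16, 20), 40: Fraction(2, 20),
        22: Fraction(1, 20),  9: Fraction(1, 20)}

def expect(f):
    """E[f(X)] = sum over x of f(x) * P(X = x)."""
    return sum(f(x) * p for x, p in dist.items())

EX  = expect(lambda x: x)          # E[X]
EX2 = expect(lambda x: x**2)       # E[X^2]
# X and X^2 are far from independent, yet linearity still holds:
print(expect(lambda x: x + x**2) == EX + EX2)   # True
```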
Expectations in Modeling
Decision Problem: Should we buy a Mega Millions lottery ticket?
Cost = $1 to play. Pick 5 numbers between 1 and 75 plus one Mega
Ball number between 1 and 15. Jackpot = $161,000,000.
Expectations in Modeling
Decision Problem: Should we buy a Mega Millions lottery ticket?
Next drawing is at 11pm tonight!
Let W = net profit from one ticket.
E[W] = $161M (1/258890850) + $1M (1/18492204) +
$5K (1/739688) + $500 (1/52835) + $50 (1/10720) + $5 (1/766) +
$5 (1/473) + $2 (1/56) + $1 (1/21) - $1 (ticket cost)
= -$0.20272
Check That!
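Taking up the invitation: a sketch that recomputes E[W] from the prize amounts and odds as read off this slide (assumed accurate as transcribed). The jackpot is left as a parameter so the same function also answers the $686M question on the next slide:

```python
# Prize amounts and their "1 in n" odds, as read off the slide.
def expected_net(jackpot):
    prizes = [jackpot, 1_000_000, 5_000, 500, 50, 5, 5, 2, 1]
    odds   = [258_890_850, 18_492_204, 739_688, 52_835,
              10_720, 766, 473, 56, 21]
    return sum(v / n for v, n in zip(prizes, odds)) - 1  # minus the $1 ticket

print(expected_net(161_000_000))   # about -0.2027
print(expected_net(686_000_000))   # about  1.8251 (the $686M question below)
```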
Expectations in Modeling
Decision Problem: If Mega Millions gets up to $686M again, how
many tickets should I buy?
Let W = net profit from one ticket.
E[W] = $686M (1/258890850) + $1M (1/18492204) +
$5K (1/739688) + $500 (1/52835) + $50 (1/10720) + $5 (1/766) +
$5 (1/473) + $2 (1/56) + $1 (1/21) - $1 (ticket cost)
= $1.82516
Expectations in Modeling
Decision Problem: If Mega Millions gets up to $686M again, how
many tickets should I buy?
If E[W] = $1.82516, then what is the expected net value of 5 tickets?
E[W+W+W+W+W] = 5 * 1.82516 = $9.1258.
E[50*W] = 50 * 1.82516 = $91.258.
E[500*W] = 500 * 1.82516 = $912.58.
(Don’t forget to keep the probability of actually winning in mind!)
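A sketch combining both points, assuming the E[W] and jackpot odds from the slides above: linearity scales the expected net with the number of tickets, but the probability of hitting the jackpot with n distinct tickets is still at most n/258890850.

```python
# Expected net of n tickets, by linearity of expectation.
E_W = 1.82516
for n in (5, 50, 500):
    print(n, round(n * E_W, 4))    # 5 -> 9.1258, 50 -> 91.258, 500 -> 912.58

# The chance of actually hitting the jackpot stays tiny:
# with n distinct tickets it is at most n / 258890850.
print(500 / 258_890_850)           # roughly 1.9e-06
```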
Pass the Pigs