Distribution of a function of a random variable


Joint and marginal distribution functions
• For any two random variables X and Y defined on the same
sample space, the joint c.d.f. is
F(a,b) = P\{X \le a,\ Y \le b\}.
• For an example, see next slide.
• The marginal distributions can be obtained from the joint
distributions as follows:
F_X(a) = F(a, \infty)
F_Y(b) = F(\infty, b).
• When X and Y are both discrete, the joint probability mass
function is given by
p(x,y) = P\{X = x,\ Y = y\}.
The probability mass function of X is obtained by summing over y:
p_X(x) = \sum_y p(x, y). Similarly, p_Y(y) = \sum_x p(x, y).
C.D.F. for a Bivariate Normal (density shown later)
Example for joint probability mass function
• Consider the following table:
         Y=0    Y=3    Y=4    pX
  X=5    1/7    1/7    1/7    3/7
  X=8    3/7     0     1/7    4/7
  pY     4/7    1/7    2/7
• Using the table, we have
P\{X \le 7,\ Y \ge 3\} = p(5,3) + p(5,4) = 2/7
P\{X \le 7\} = p_X(5) = 3/7
P\{Y \ge 3\} = p_Y(3) + p_Y(4) = 1/7 + 2/7 = 3/7.
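A quick numerical check of these computations (a minimal sketch; the dictionary below simply encodes the table above):

```python
from fractions import Fraction as F

# Joint p.m.f. from the table, encoded as {(x, y): p(x, y)}
p = {(5, 0): F(1, 7), (5, 3): F(1, 7), (5, 4): F(1, 7),
     (8, 0): F(3, 7), (8, 3): F(0),    (8, 4): F(1, 7)}

# Marginals, obtained by summing over the other variable
pX = {x: sum(v for (a, b), v in p.items() if a == x) for x in (5, 8)}
pY = {y: sum(v for (a, b), v in p.items() if b == y) for y in (0, 3, 4)}

print(pX[5], pX[8], pY[0], pY[3], pY[4])                       # 3/7 4/7 4/7 1/7 2/7
print(sum(v for (x, y), v in p.items() if x <= 7 and y >= 3))  # 2/7
print(sum(v for (x, y), v in p.items() if y >= 3))             # 3/7
```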
Expected Values for Jointly Distributed Random Variables
• Let X and Y be discrete random variables with joint
probability mass function p(x, y). Let the sets of values of X
and Y be A and B, resp. We define E(X) and E(Y) as
E(X) = \sum_{x \in A} x\,p_X(x) \quad \text{and} \quad E(Y) = \sum_{y \in B} y\,p_Y(y).
• Example. For the random variables X and Y from the previous
slide,
 3   4  47
E(X)  5   8   .
7 7 7
 1   2  11
E(Y)  3   4   .
7 7 7
Law of the Unconscious Statistician Revisited
• Theorem. Let p(x, y) be the joint probability mass function of
discrete random variables X and Y. Let A and B be the sets of
possible values of X and Y, resp. If h is a function of two
variables from ℝ² to ℝ, then h(X, Y) is a discrete random
variable with expected value given by
E(h(X,Y)) = \sum_{x \in A} \sum_{y \in B} h(x,y)\,p(x,y),
provided that the sum is absolutely convergent.
• Corollary. For discrete random variables X and Y,
E(X + Y) = E(X) + E(Y).
• Problem. Verify the corollary for X and Y from two slides
previous.
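A numerical check of this problem (a sketch reusing the joint p.m.f. table from two slides back):

```python
from fractions import Fraction as F

# Joint p.m.f. of the earlier table
p = {(5, 0): F(1, 7), (5, 3): F(1, 7), (5, 4): F(1, 7),
     (8, 0): F(3, 7), (8, 3): F(0),    (8, 4): F(1, 7)}

# E(X + Y) via the law of the unconscious statistician, with h(x, y) = x + y
E_sum = sum((x + y) * v for (x, y), v in p.items())

# E(X) + E(Y) computed separately from the joint p.m.f.
E_X = sum(x * v for (x, y), v in p.items())
E_Y = sum(y * v for (x, y), v in p.items())

print(E_sum, E_X + E_Y)   # both 58/7 (= 47/7 + 11/7)
```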
Joint and marginal distribution functions for continuous r.v.’s
• Random variables X and Y are jointly continuous if there
exists a nonnegative function f(x, y) such that
P\{(X, Y) \in C\} = \iint_{(x,y) \in C} f(x, y)\,dx\,dy
for every well-behaved subset C of ℝ². The function f(x, y) is
called the joint probability density function of X and Y.
• It follows that
F(a,b) = \int_{-\infty}^{b} \int_{-\infty}^{a} f(x,y)\,dx\,dy, \quad \text{and} \quad f(a,b) = \frac{\partial^2 F(a,b)}{\partial a\,\partial b}.
• Also,
f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy, \quad \text{and} \quad f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx.
Density for a Bivariate Normal (see page 449 for formula)
Example of joint density for continuous r.v.’s
• Let the joint density of X and Y be
f(x, y) = \begin{cases} 2e^{-x}e^{-2y}, & 0 < x < \infty,\ 0 < y < \infty \\ 0, & \text{otherwise.} \end{cases}
• Prove that
(1) P\{X > 1,\ Y < 1\} = e^{-1}(1 - e^{-2})
(2) P\{X < Y\} = 1/3
(3) F_X(a) = 1 - e^{-a} for a > 0, and 0 otherwise.
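These values can be checked numerically; a sketch using SciPy's dblquad (assuming SciPy is available; the integration limits simply encode the two events):

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density f(x, y) = 2 e^{-x} e^{-2y} on the positive quadrant.
# dblquad integrates func(y, x) with y as the inner variable.
f = lambda y, x: 2 * np.exp(-x) * np.exp(-2 * y)

# (1) P{X > 1, Y < 1}: x from 1 to infinity, y from 0 to 1
p1, _ = dblquad(f, 1, np.inf, lambda x: 0, lambda x: 1)

# (2) P{X < Y}: x from 0 to infinity, y from x to infinity
p2, _ = dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)

print(p1, np.exp(-1) * (1 - np.exp(-2)))   # both about 0.318
print(p2)                                  # about 1/3
```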
Expected Values for Jointly Distributed Continuous R.V.s
• Let X and Y be continuous random variables with joint
probability density function f(x, y). We define E(X) and E(Y) as
E(X) = \int_{-\infty}^{\infty} x\,f_X(x)\,dx \quad \text{and} \quad E(Y) = \int_{-\infty}^{\infty} y\,f_Y(y)\,dy.
• Example. For the random variables X and Y from the previous
slide,
f_X(x) = e^{-x},\ x > 0, \quad \text{and} \quad f_Y(y) = 2e^{-2y},\ y > 0.
That is, X and Y are exponential random variables with rates 1
and 2, respectively. It follows that
E(X) = 1 \quad \text{and} \quad E(Y) = \tfrac{1}{2}.
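Explicitly, E(X) = \int_0^{\infty} x e^{-x}\,dx = 1 and E(Y) = \int_0^{\infty} y \cdot 2e^{-2y}\,dy = \tfrac{1}{2}.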
Law of the Unconscious Statistician Again
• Theorem. Let f(x, y) be the joint density function of random
variables X and Y. If h is a function of two variables from ℝ²
to ℝ, then h(X, Y) is a random variable with expected value
given by
E[h(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(x, y)\,f(x, y)\,dx\,dy,
provided the integral is absolutely convergent.
• Corollary. For random variables X and Y as in the above
theorem,
E(X + Y) = E(X) + E(Y).
• Example. For X and Y defined two slides previous,
E(X + Y) = E(X) + E(Y) = \tfrac{3}{2}.
Random Selection of a Point from a Planar Region
• Let S be a subset of the plane with area A(S). A point is said to be
randomly selected from S if for any subset R of S with area
A(R), the probability that R contains the point is A(R)/A(S).
• Problem. Two people arrive at a restaurant at random times from
11:30am to 12:00 noon. What is the probability that their arrival
times differ by ten minutes or less?
Solution. Let X and Y be the minutes past 11:30 am that the two
people arrive. Let
S = \{(x, y) : 0 \le x \le 30,\ 0 \le y \le 30\},
R = \{(x, y) \in S : |x - y| \le 10\}.
The desired probability is
P(|X - Y| \le 10) = \frac{\text{area of } R}{\text{area of } S} = \frac{5}{9}.
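A simulation agrees with the geometric answer (a sketch; the two arrival times are taken independent and uniform on (0, 30), which is what random selection of the point (X, Y) from S amounts to):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Minutes past 11:30 at which the two people arrive
x = rng.uniform(0, 30, n)
y = rng.uniform(0, 30, n)

print(np.mean(np.abs(x - y) <= 10))   # about 0.556 = 5/9
```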
Independent random variables
• Random variables X and Y are independent if for any two
sets of real numbers A and B,
P\{X \in A,\ Y \in B\} = P\{X \in A\}\,P\{Y \in B\}.
That is, the events E_A = \{X \in A\} and E_B = \{Y \in B\} are independent.
• In terms of F, X and Y are independent if and only if
F(a,b) = F_X(a)\,F_Y(b) \quad \text{for all } a, b.
• When X and Y are discrete, they are independent if and only if
p(x,y) = p_X(x)\,p_Y(y) \quad \text{for all } x, y.
• In the jointly continuous case, X and Y are independent if and
only if
f(x,y) = f_X(x)\,f_Y(y) \quad \text{for all } x, y.
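• For example, the discrete X and Y of the earlier table are not independent: p(8,3) = 0, while p_X(8)\,p_Y(3) = (4/7)(1/7) \ne 0.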
Example for independent jointly distributed r.v.’s
• A man and a woman decide to meet at a certain location. If
each person independently arrives at a time uniformly
distributed between 12 noon and 1 pm, find the probability that
the first to arrive has to wait longer than 10 minutes.
Solution.
Let X and Y denote, resp., the minutes past noon at which the man
and the woman arrive; X and Y are independent and uniform on (0, 60).
P\{X + 10 < Y\} + P\{Y + 10 < X\} = 2\,P\{X + 10 < Y\}
= 2 \int_{10}^{60} \int_{0}^{y-10} \left(\tfrac{1}{60}\right)^2 dx\,dy
= \frac{2}{60^2} \int_{10}^{60} (y - 10)\,dy = \frac{25}{36}.
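Equivalently, by the geometric argument of the previous slide: the region \{|x - y| > 10\} inside the square [0, 60]^2 consists of two triangles of total area 50^2, so the probability is 50^2/60^2 = 25/36.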
Sums of independent random variables
• Suppose that X and Y are independent continuous random
variables having probability density functions fX and fY. Then
F_{X+Y}(a) = P\{X + Y \le a\}
= \iint_{x + y \le a} f_X(x)\,f_Y(y)\,dx\,dy
= \int_{-\infty}^{\infty} F_X(a - y)\,f_Y(y)\,dy.
• We obtain the density of the sum by differentiating:
f_{X+Y}(a) = \int_{-\infty}^{\infty} \frac{dF_X(a - y)}{da}\,f_Y(y)\,dy
= \int_{-\infty}^{\infty} f_X(a - y)\,f_Y(y)\,dy.
The right-hand side of the latter equation defines the
convolution of f_X and f_Y.
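For intuition, the convolution integral can be approximated numerically on a grid; the sketch below does this with a Riemann sum (the helper name density_of_sum and the choice of standard normal densities are ours, for illustration only):

```python
import numpy as np

def density_of_sum(fX, fY, grid):
    """Riemann-sum approximation of the convolution
    f_{X+Y}(a) = integral of f_X(a - y) f_Y(y) dy, at each a in grid."""
    dy = grid[1] - grid[0]
    return np.array([np.sum(fX(a - grid) * fY(grid)) * dy for a in grid])

# Illustration: two independent standard normals; their sum is N(0, 2).
grid = np.linspace(-10, 10, 2001)
phi = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)   # N(0, 1) density
approx = density_of_sum(phi, phi, grid)
exact = np.exp(-grid**2 / 4) / np.sqrt(4 * np.pi)        # N(0, 2) density
print(np.max(np.abs(approx - exact)))   # very small
```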
Example for sum of two independent random variables
• Suppose X and Y are independent random variables, both
uniformly distributed on (0,1). The density of X+Y is
computed as follows:
f_{X+Y}(a) = \int_0^1 f_X(a - y)\,dy =
\begin{cases} a, & 0 \le a \le 1 \\ 2 - a, & 1 < a \le 2 \\ 0, & \text{otherwise.} \end{cases}
• Because of the shape of its density function, X+Y is said to
have a triangular distribution.
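A simulation check of the triangular shape (a sketch assuming NumPy; the sample size and bin count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X + Y for independent Uniform(0, 1) samples
s = rng.uniform(0, 1, n) + rng.uniform(0, 1, n)

# Empirical density on (0, 2) versus the triangular density above
hist, edges = np.histogram(s, bins=40, range=(0, 2), density=True)
mid = (edges[:-1] + edges[1:]) / 2
tri = np.where(mid <= 1, mid, 2 - mid)
print(np.max(np.abs(hist - tri)))   # small (sampling and binning error)
```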
Functions of Independent Random Variables
• Theorem. Let X and Y be independent random variables and
let g and h be real valued functions of a single real variable.
Then
(i) g(X) and h(Y) are also independent random variables
(ii) E[g(X)h(Y)] = E[g(X)]\,E[h(Y)].
• Example. If X and Y are independent, then
E[(\sin X)e^{Y}] = E[\sin X]\,E[e^{Y}].
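A simulation check of the example (a sketch; taking X and Y independent and uniform on (0, 1) is our choice, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1_000_000)
y = rng.uniform(0, 1, 1_000_000)

lhs = np.mean(np.sin(x) * np.exp(y))            # E[(sin X) e^Y]
rhs = np.mean(np.sin(x)) * np.mean(np.exp(y))   # E[sin X] E[e^Y]
print(lhs, rhs)   # both approximately (1 - cos 1)(e - 1) ≈ 0.79
```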