Transcript C9_CIS2033
CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007
Slides by Michael Maurizi
Instructor Longin Jan Latecki
C9: Joint Distributions and Independence
9.1 – Joint Distributions of Discrete Random Variables
Joint Distribution: the combined distribution of two or more random
variables defined on the same sample space Ω
Joint Distribution of two discrete random variables:
The joint distribution of two discrete random variables X and Y is determined by
the probabilities of all possible values (a, b) of the pair (X, Y)
Joint probability mass function p of two discrete random variables X and Y:
$p : \mathbb{R}^2 \to [0,1]$
$p_{XY}(a, b) = p(a, b) = P(X = a, Y = b)$ for all $a, b$
Joint distribution function F of two random variables X and Y: in a pmf table,
F(a, b) can be thought of as the sum of the entries in the box that the cell
(a, b) makes with the upper-left corner of the table.
$F : \mathbb{R}^2 \to [0,1]$
$F(a, b) = P(X \le a, Y \le b)$ for all $a, b$
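As a concrete illustration (my sketch, not from the slides), the following Python snippet tabulates a joint pmf and evaluates the joint distribution function as exactly such a box sum, using S = the sum and M = the maximum of two fair dice, the pair studied in the example below. The names `joint_pmf` and `F` are ad hoc.

```python
from itertools import product
from fractions import Fraction

# Joint pmf of S = sum and M = max of two fair dice:
# p(a, b) = P(S = a, M = b), built by enumerating all 36 outcomes.
joint_pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    a, b = d1 + d2, max(d1, d2)
    joint_pmf[(a, b)] = joint_pmf.get((a, b), 0) + Fraction(1, 36)

def F(a, b):
    """Joint distribution function F(a, b) = P(S <= a, M <= b):
    the sum of all pmf entries in the box up to (a, b)."""
    return sum(p for (s, m), p in joint_pmf.items() if s <= a and m <= b)

print(joint_pmf[(4, 2)])  # 1/36: only the outcome (2, 2)
print(F(4, 2))            # 1/9 = 4/36: (1,1), (1,2), (2,1), (2,2)
```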
Marginal distribution: obtained by adding up the rows or columns of a joint
probability mass function table. It is literally written in the margins.
Let p(a, b) be the joint pmf of RVs S and M. The marginal pmfs are then given by
$P(S = a) = p_S(a) = \sum_b p(a, b)$
$P(M = b) = p_M(b) = \sum_a p(a, b)$
Example: the joint distribution of S and M, where S = the sum of two dice and
M = the maximum of two dice.

a \ b      1      2      3      4      5      6     p_S(a)
 2        1/36    0      0      0      0      0      1/36
 3         0     2/36    0      0      0      0      2/36
 4         0     1/36   2/36    0      0      0      3/36
 5         0      0     2/36   2/36    0      0      4/36
 6         0      0     1/36   2/36   2/36    0      5/36
 7         0      0      0     2/36   2/36   2/36    6/36
 8         0      0      0     1/36   2/36   2/36    5/36
 9         0      0      0      0     2/36   2/36    4/36
10         0      0      0      0     1/36   2/36    3/36
11         0      0      0      0      0     2/36    2/36
12         0      0      0      0      0     1/36    1/36
p_M(b)    1/36   3/36   5/36   7/36   9/36  11/36     1
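The margins can be checked mechanically (a sketch, reusing the ad hoc `joint_pmf` from the snippet above):

```python
# Marginal pmfs: sum over the other variable, exactly the
# row and column sums written in the margins of the table.
p_S, p_M = {}, {}
for (a, b), p in joint_pmf.items():
    p_S[a] = p_S.get(a, 0) + p   # P(S = a) = sum over b of p(a, b)
    p_M[b] = p_M.get(b, 0) + p   # P(M = b) = sum over a of p(a, b)

print(p_S[7])             # 1/6 = 6/36, the most likely sum
print(p_M[6])             # 11/36, matching the bottom row
print(sum(p_S.values()))  # 1: the marginal is a full distribution
```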
9.2 – Joint Distributions of Continuous Random Variables
Joint continuous distribution: as with an ordinary continuous random variable,
probabilities are assigned to ranges of values rather than to single points.
Random variables X and Y have a joint continuous distribution if there exists a
function f with the following properties:
$f : \mathbb{R}^2 \to \mathbb{R}$
$P(a_1 \le X \le b_1, a_2 \le Y \le b_2) = \int_{a_1}^{b_1}\!\int_{a_2}^{b_2} f(x, y)\,dy\,dx$ for all $a_1 \le b_1$ and $a_2 \le b_2$
$f(x, y) \ge 0$ for all $x$ and $y$
$\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$
Marginal distribution function of X:
$F_X(a) = P(X \le a) = F(a, +\infty) = \lim_{b \to \infty} F(a, b)$
Marginal distribution function of Y:
$F_Y(b) = P(Y \le b) = F(+\infty, b) = \lim_{a \to \infty} F(a, b)$
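To make the definition concrete, here is a small numerical sketch (my own example, not from the slides) using the joint density f(x, y) = x + y on the unit square:

```python
from scipy.integrate import dblquad

# Hypothetical joint density: f(x, y) = x + y on [0, 1]^2, 0 elsewhere.
def f(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

# dblquad integrates func(y, x) dy dx, hence the swapped arguments.
total, _ = dblquad(lambda y, x: f(x, y), 0, 1, 0, 1)
print(total)  # ~1.0: f is a valid joint density

# P(0 <= X <= 1/2, 0 <= Y <= 1/2): integrate f over that rectangle.
p, _ = dblquad(lambda y, x: f(x, y), 0, 0.5, 0, 0.5)
print(p)      # ~0.125
```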
Joint distribution function: F(a, b) can be constructed given f(x, y), and vice versa:
$F(a, b) = \int_{-\infty}^{a}\!\int_{-\infty}^{b} f(x, y)\,dy\,dx$ and $f(x, y) = \frac{\partial^2}{\partial x\,\partial y} F(x, y)$
Marginal probability density function: integrate out the unwanted random
variable to get the marginal density.
$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$
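Continuing the hypothetical f(x, y) = x + y example, integrating y out gives the closed form f_X(x) = x + 1/2 on [0, 1]; a quick numerical check:

```python
from scipy.integrate import quad

# Marginal density of X: integrate the unwanted variable y out.
def f_X(x):
    value, _ = quad(lambda y: x + y, 0, 1)
    return value

print(f_X(0.3))            # ~0.8, matching 0.3 + 1/2
print(quad(f_X, 0, 1)[0])  # ~1.0: the marginal is itself a density
```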
9.3 – More than Two Random Variables
Assume we have n random variables X1, X2, ..., Xn defined on the same sample
space. We can then define the joint distribution function and the joint
probability mass function:
Joint distribution function:
$F(a_1, a_2, \dots, a_n) = P(X_1 \le a_1, X_2 \le a_2, \dots, X_n \le a_n)$
Joint probability mass function:
$p(a_1, a_2, \dots, a_n) = P(X_1 = a_1, X_2 = a_2, \dots, X_n = a_n)$
for all $a_1, a_2, \dots, a_n$
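For instance (my example, not from the slides): for three tosses of a fair coin with $X_i = 1$ if toss $i$ is heads and $0$ otherwise, $p(1, 1, 0) = P(X_1 = 1, X_2 = 1, X_3 = 0) = 1/8$.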
9.4 – Independent Random Variables
Tests for independence: two random variables X and Y are independent if and
only if every event involving X is independent of every event involving Y.
The same idea applies to joint distributions of more than two random variables.
Equivalently, for all possible a and b:
$P(X = a, Y = b) = P(X = a)\,P(Y = b)$
or
$P(X \le a, Y \le b) = P(X \le a)\,P(Y \le b)$
or
$F(a, b) = F_X(a)\,F_Y(b)$
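Such a test can be run by exhaustive enumeration (my own illustration): the two individual dice are independent, while S and M from Section 9.1 are not, and comparing p(a, b) with p_X(a) p_Y(b) everywhere detects both. The helper names are ad hoc.

```python
from itertools import product
from fractions import Fraction

def joint_and_marginals(pair_of):
    """Joint pmf and both marginals of (X, Y) = pair_of(d1, d2)
    for two fair dice, by enumerating all 36 outcomes."""
    joint, px, py = {}, {}, {}
    for d1, d2 in product(range(1, 7), repeat=2):
        x, y = pair_of(d1, d2)
        joint[(x, y)] = joint.get((x, y), 0) + Fraction(1, 36)
        px[x] = px.get(x, 0) + Fraction(1, 36)
        py[y] = py.get(y, 0) + Fraction(1, 36)
    return joint, px, py

def independent(joint, px, py):
    """X and Y are independent iff p(a, b) = pX(a) pY(b) for all a, b."""
    return all(joint.get((a, b), 0) == px[a] * py[b]
               for a in px for b in py)

two_dice = lambda d1, d2: (d1, d2)                 # independent
sum_max = lambda d1, d2: (d1 + d2, max(d1, d2))    # not independent
print(independent(*joint_and_marginals(two_dice)))  # True
print(independent(*joint_and_marginals(sum_max)))   # False
```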
9.5 – Propagation of Independence
Independence after a change of variable:
If X1, ..., Xn are independent random variables and each is transformed by its
own function, Yi = hi(Xi), then the resulting random variables Y1, ..., Yn are
also independent.
See Example 3.6, p. 48, in the Baron book.
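A minimal sketch of this propagation, reusing the ad hoc helpers above with the hypothetical transforms U = d1 mod 2 and V = d2 squared:

```python
# U depends only on the first die, V only on the second; since the
# dice are independent, U and V must be independent as well.
transformed = lambda d1, d2: (d1 % 2, d2 ** 2)
print(independent(*joint_and_marginals(transformed)))  # True
```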