2.2 Multiple random variables and distribution



Probability Theory
School of Mathematical Science and Computing Technology, CSU
Course Group of Probability and Statistics

§2.2 Multi-dimensional random variables and their distributions
So far we have discussed only one-dimensional random variables and their distributions. However, a single random variable is not always enough to describe a random phenomenon, so we sometimes need several random variables. At target practice, for example, the hit point is determined by a pair of random variables (two coordinates); the position of the center of gravity of a plane in the air is determined by three random variables (three coordinates); and so on.
Two-dimensional discrete random variable

Definition: If (X, Y) takes only finitely many or countably many pairs of real values
    (x_i, y_j),   i, j = 1, 2, …,
then (X, Y) is called a two-dimensional discrete random variable.
Two-dimensional random variable (X, Y) vs. one-dimensional random variable X:

For a two-dimensional discrete (X, Y), the joint probability distribution of X and Y is
    P(X = x_i, Y = y_j) = p_ij,   i, j = 1, 2, …,
where
    p_ij ≥ 0,   i, j = 1, 2, …,   and   Σ_i Σ_j p_ij = 1.

For a one-dimensional discrete X, the probability distribution of X is
    P(X = x_k) = p_k,   k = 1, 2, …,
where
    p_k ≥ 0,   k = 1, 2, …,   and   Σ_k p_k = 1.
The table form of the joint probability distribution of (X, Y) is as follows:

    X \ Y |  y_1    y_2    …    y_j    …
    ------+-------------------------------
     x_1  |  p_11   p_12   …    p_1j   …
     x_2  |  p_21   p_22   …    p_2j   …
      …   |   …      …     …     …     …
     x_i  |  p_i1   p_i2   …    p_ij   …
      …   |   …      …     …     …     …
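As a quick sanity check, the two defining conditions p_ij ≥ 0 and Σ_i Σ_j p_ij = 1 can be verified mechanically. The sketch below is illustrative and not part of the lecture; the table values are taken from the with-replacement case of Example 1 later in this section, stored as exact fractions to avoid rounding error:

```python
from fractions import Fraction

# Joint probability table p_ij, keyed by the pair (x_i, y_j).
joint = {
    (0, 0): Fraction(9, 25), (0, 1): Fraction(6, 25),
    (1, 0): Fraction(6, 25), (1, 1): Fraction(4, 25),
}

# Check the two defining conditions of a joint distribution:
# every p_ij is non-negative, and all p_ij sum to 1.
assert all(p >= 0 for p in joint.values())
assert sum(joint.values()) == 1
print("valid joint distribution")
```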
The joint distribution gives a complete description of the values taken by the two-dimensional random variable (X, Y) and of their probabilities, while each of the single random variables X and Y also has its own probability distribution. The probability distributions of X and Y are called the marginal (probability) distributions of (X, Y) with respect to X and Y.

A natural question: what is the relation between the joint distribution and the marginal distributions? Does either determine the other? First let us see how the two marginal distributions are determined by the joint distribution.
In general, for a two-dimensional discrete random variable (X, Y), the joint probability distribution of X and Y is
    P(X = x_i, Y = y_j) = p_ij,   i, j = 1, 2, …
The marginal probability distribution of (X, Y) with respect to X is then
    P(X = x_i) = P( (X = x_i) ∩ ∪_j (Y = y_j) )
               = Σ_j P( (X = x_i) ∩ (Y = y_j) )
               = Σ_j P(X = x_i, Y = y_j)
               = Σ_j p_ij.
In the same way,
    P(Y = y_j) = Σ_i p_ij.
In general we write
    P(X = x_i) = p_i·,   i = 1, 2, …,
    P(Y = y_j) = p_·j,   j = 1, 2, ….
We often write the marginal probability functions in the margins of the joint probability table, which is where the term "marginal distribution" comes from.
Example 1. A bag contains two white balls and three black balls. A ball is drawn twice. Define
    X = 1 if the first draw is white,  X = 0 if the first draw is black;
    Y = 1 if the second draw is white, Y = 0 if the second draw is black.
Find the joint distribution and the marginal distributions of (X, Y), in the cases of drawing with replacement and without replacement.
Solution:

With replacement:

    X \ Y |        0                  1         | p_i·
    ------+--------------------------------------+-----
      0   | (3/5)(3/5) = 9/25  (3/5)(2/5) = 6/25 | 3/5
      1   | (2/5)(3/5) = 6/25  (2/5)(2/5) = 4/25 | 2/5
    ------+--------------------------------------+-----
     p_·j |       3/5                2/5         |  1

Without replacement:

    X \ Y |        0                  1         | p_i·
    ------+--------------------------------------+-----
      0   | (3/5)(2/4) = 3/10  (3/5)(2/4) = 3/10 | 3/5
      1   | (2/5)(3/4) = 3/10  (2/5)(1/4) = 1/10 | 2/5
    ------+--------------------------------------+-----
     p_·j |       3/5                2/5         |  1
When drawing with replacement, X and Y are independent of each other; when drawing without replacement, they are not!

The relationship between the joint distribution and the marginal distributions: the marginal distributions are determined by the joint distribution, while the joint distribution is not determined by the marginal distributions.
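This asymmetry can be checked with the two tables from Example 1. The sketch below (illustrative, not part of the lecture) shows that the with-replacement and without-replacement joints differ, yet yield identical marginals:

```python
from fractions import Fraction

F = Fraction

# Example 1 joint tables: with replacement vs. without replacement.
with_repl = {(0, 0): F(9, 25), (0, 1): F(6, 25), (1, 0): F(6, 25), (1, 1): F(4, 25)}
no_repl   = {(0, 0): F(3, 10), (0, 1): F(3, 10), (1, 0): F(3, 10), (1, 1): F(1, 10)}

def marginals(joint):
    """Row and column sums: p_i. and p_.j of a joint table."""
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0) + p
        p_y[y] = p_y.get(y, 0) + p
    return p_x, p_y

# The two joint distributions are different ...
assert with_repl != no_repl
# ... but both have marginals P(0) = 3/5, P(1) = 2/5 for X and for Y.
assert marginals(with_repl) == marginals(no_repl)
```

So two different joint distributions can share the same marginals, which is exactly why the marginals alone cannot determine the joint distribution.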
Recall the definition of independence for events A and B: if
    P(AB) = P(A)P(B),
then A and B are independent. This notion of independence extends from events to random variables, and the independence of random variables is an important concept in probability theory.

If (X, Y) is a discrete random variable, the definition of independence is equivalent to: for every possible value (x_i, y_j) of (X, Y),
    P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j),
i.e.
    p_ij = p_i· p_·j.
In this case we say that X and Y are independent.
Example 2. A bag contains five products, two defective and three genuine. Two products are drawn from the bag one after the other; after each inspection the product is put back into the bag, and every product in the bag is equally likely to be drawn. Define the random variables
    ξ = 0 if the first draw is defective,  ξ = 1 if the first draw is genuine;
    η = 0 if the second draw is defective, η = 1 if the second draw is genuine.
Find the distribution law of (ξ, η).
Solution: By the independence of the events and the classical definition of probability, the distribution law of (ξ, η) is

    P(ξ = 0, η = 0) = P(ξ = 0) P(η = 0) = (2/5)(2/5) = 4/25
    P(ξ = 0, η = 1) = P(ξ = 0) P(η = 1) = (2/5)(3/5) = 6/25
    P(ξ = 1, η = 0) = P(ξ = 1) P(η = 0) = (3/5)(2/5) = 6/25
    P(ξ = 1, η = 1) = P(ξ = 1) P(η = 1) = (3/5)(3/5) = 9/25

In table form:

    ξ \ η |   0      1
    ------+-------------
      0   |  4/25   6/25
      1   |  6/25   9/25
To be continued after a break.