Random Processes
Introduction (2)
Professor Ke-Sheng Cheng
Department of Bioenvironmental Systems Engineering
E-mail: [email protected]
Stochastic Continuity
Stochastic Convergence

A random sequence or a discrete-time random process is a sequence of random variables $\{X_1(\omega), X_2(\omega), \dots, X_n(\omega), \dots\} = \{X_n(\omega)\}$, $\omega \in \Omega$.
For a specific $\omega$, $\{X_n(\omega)\}$ is a sequence of numbers that may or may not converge. The notion of convergence of a random sequence can be given several interpretations.
Sure convergence
(convergence everywhere)
The sequence of random variables $\{X_n(\omega)\}$ converges surely to the random variable $X(\omega)$ if the sequence of functions $X_n(\omega)$ converges to $X(\omega)$ as $n \to \infty$ for all $\omega \in \Omega$, i.e.,
$$X_n(\omega) \to X(\omega) \ \text{as} \ n \to \infty \ \text{for all} \ \omega \in \Omega.$$
Almost-sure convergence
(Convergence with probability 1)
Mean-square convergence
Convergence in probability
Convergence in distribution
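In standard form, for random variables $X_n$ and limit $X$, these modes of convergence are defined by:
$$\begin{aligned}
\text{Almost surely:}\quad & P\{\omega : X_n(\omega) \to X(\omega) \text{ as } n \to \infty\} = 1 \\
\text{In mean square:}\quad & E[(X_n - X)^2] \to 0 \text{ as } n \to \infty \\
\text{In probability:}\quad & P\{|X_n - X| > \varepsilon\} \to 0 \text{ as } n \to \infty, \text{ for every } \varepsilon > 0 \\
\text{In distribution:}\quad & F_{X_n}(x) \to F_X(x) \text{ at every continuity point } x \text{ of } F_X
\end{aligned}$$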
Remarks
- Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.
- The weak law of large numbers is an example of convergence in probability.
- The strong law of large numbers is an example of convergence with probability 1.
- The central limit theorem is an example of convergence in distribution.
Weak Law of Large Numbers (WLLN)
Strong Law of Large Numbers (SLLN)
The Central Limit Theorem
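For iid random variables $X_1, X_2, \dots$ with finite mean $\mu$ and (for the CLT) finite variance $\sigma^2 > 0$, these results state:
$$\begin{aligned}
\text{WLLN:}\quad & \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{\ P\ } \mu \\
\text{SLLN:}\quad & \bar{X}_n \xrightarrow{\ \text{a.s.}\ } \mu \\
\text{CLT:}\quad & \frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma} \xrightarrow{\ d\ } N(0, 1)
\end{aligned}$$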
Venn diagram of relation of types of convergence
(Diagram: sure convergence implies almost-sure convergence; almost-sure and mean-square convergence each imply convergence in probability, which in turn implies convergence in distribution.)
Note that even sure convergence may not imply mean-square convergence.
Example
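A standard construction showing that sure convergence need not imply mean-square convergence: take $\Omega = (0, 1]$ with uniform probability and define
$$X_n(\omega) = \begin{cases} n, & 0 < \omega \le 1/n \\ 0, & \text{otherwise.} \end{cases}$$
For every $\omega \in \Omega$, $X_n(\omega) = 0$ once $n > 1/\omega$, so $X_n(\omega) \to 0$ for all $\omega$ (sure convergence). Yet $E[(X_n - 0)^2] = n^2 \cdot (1/n) = n \to \infty$, so $X_n$ does not converge to 0 in mean square.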
Ergodic Theorem
The Mean-Square Ergodic Theorem
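In one standard formulation: let $\{X_n\}$ be a discrete-time random process and $\bar{X}_N = \frac{1}{N}\sum_{n=1}^{N} X_n$ its sample average. Then $\bar{X}_N$ converges in mean square to a constant $m$ if and only if
$$\frac{1}{N}\sum_{n=1}^{N} E[X_n] \to m
\quad \text{and} \quad
\operatorname{Var}(\bar{X}_N) = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N} \operatorname{Cov}(X_i, X_j) \to 0.$$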
The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
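A minimal numerical sketch of this behavior, assuming an AR(1) process $X_n = \phi X_{n-1} + \varepsilon_n$, whose covariance decays geometrically with the lag so the memory dies out (all parameter values below are illustrative):

import numpy as np

# Sketch: mean-square ergodicity of an assumed AR(1) process
# X[n] = phi * X[n-1] + eps[n]; its covariance decays like phi**lag,
# so the time average of each realization should converge to the mean (0).
rng = np.random.default_rng(0)
phi, n_steps, n_paths = 0.8, 20_000, 200

x = np.zeros((n_paths, n_steps))
for n in range(1, n_steps):
    x[:, n] = phi * x[:, n - 1] + rng.standard_normal(n_paths)

for N in (100, 1_000, 10_000, 20_000):
    avg = x[:, :N].mean(axis=1)   # time average of each realization
    mse = np.mean(avg ** 2)       # estimate of E[(sample average - 0)^2]
    print(f"N={N:6d}  mean-square error of the time average: {mse:.5f}")

The printed error shrinks roughly like 1/N, which is the mean-square convergence the theorem describes.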
Mean-Ergodic Processes
Strong or Individual Ergodic Theorem
Examples of Stochastic Processes

iid random process
A discrete-time random process {X(t), t = 1, 2, …} is said to be independent and identically distributed (iid) if any finite number, say k, of the random variables X(t1), X(t2), …, X(tk) are mutually independent and have a common cumulative distribution function $F_X(\cdot)$.
The joint cdf for X(t1), X(t2), …, X(tk) is given by
$$F_{X_1, X_2, \dots, X_k}(x_1, x_2, \dots, x_k) = P[X_1 \le x_1, X_2 \le x_2, \dots, X_k \le x_k] = F_X(x_1)\, F_X(x_2) \cdots F_X(x_k)$$
It also yields
$$p_{X_1, X_2, \dots, X_k}(x_1, x_2, \dots, x_k) = p_X(x_1)\, p_X(x_2) \cdots p_X(x_k)$$
where $p_X(x)$ represents the common probability mass function.
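A quick numerical illustration of this factorization (a minimal sketch; the uniform marginal and the evaluation points below are arbitrary choices):

import numpy as np

# For an iid pair (X1, X2), the joint cdf should factor into the product
# of the common marginal cdf: P[X1 <= a, X2 <= b] = F(a) * F(b).
rng = np.random.default_rng(1)
n_samples = 200_000
x1 = rng.uniform(size=n_samples)
x2 = rng.uniform(size=n_samples)

a, b = 0.3, 0.7
joint = np.mean((x1 <= a) & (x2 <= b))            # estimate of the joint cdf
product = np.mean(x1 <= a) * np.mean(x2 <= b)     # product of marginal cdfs
print(f"joint cdf ~ {joint:.4f}, product of marginals ~ {product:.4f}")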
Random walk process
Let $X_n = X_0 + \Delta_1 + \Delta_2 + \dots + \Delta_n$, where the steps $\Delta_i$ are iid with common pmf $f$ and independent of $X_0$. Let $\pi_0$ denote the probability mass function of $X_0$. The joint probability of $X_0, X_1, \dots, X_n$ is
$$\begin{aligned}
P(X_0 = x_0, X_1 = x_1, \dots, X_n = x_n)
&= P[X_0 = x_0, \Delta_1 = x_1 - x_0, \dots, \Delta_n = x_n - x_{n-1}] \\
&= P(X_0 = x_0)\, P(\Delta_1 = x_1 - x_0) \cdots P(\Delta_n = x_n - x_{n-1}) \\
&= \pi_0(x_0)\, f(x_1 - x_0) \cdots f(x_n - x_{n-1}) \\
&= \pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})
\end{aligned}$$
It follows that
$$\begin{aligned}
P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \dots, X_n = x_n)
&= \frac{P(X_0 = x_0, X_1 = x_1, \dots, X_n = x_n, X_{n+1} = x_{n+1})}{P(X_0 = x_0, X_1 = x_1, \dots, X_n = x_n)} \\
&= \frac{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})\, P(x_{n+1} \mid x_n)}{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})} \\
&= P(x_{n+1} \mid x_n)
\end{aligned}$$
The property
$$P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \dots, X_n = x_n) = P(X_{n+1} = x_{n+1} \mid X_n = x_n)$$
is known as the Markov property.
A special case of the random walk: Brownian motion.
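A minimal simulation sketch of such a walk, with iid ±1 steps and X0 = 0 (step distribution, path count, and length are illustrative choices):

import numpy as np

# Simulate random-walk realizations X[n] = X[n-1] + D[n] with iid steps
# D[n] = +1 or -1, each with probability 1/2, starting from X[0] = 0.
rng = np.random.default_rng(2)
n_steps, n_paths = 1_000, 5

steps = rng.choice([-1, 1], size=(n_paths, n_steps))
paths = np.cumsum(steps, axis=1)   # partial sums give each realization

for i, path in enumerate(paths):
    print(f"path {i}: X[10]={path[9]:4d}  X[100]={path[99]:4d}  X[1000]={path[999]:4d}")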
Gaussian process
A random process {X(t)} is said to be a Gaussian random process if all finite collections of the random process, X1 = X(t1), X2 = X(t2), …, Xk = X(tk), are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.
Joint pdf of jointly Gaussian random variables X1, X2, …, Xk:
$$f_{X_1, \dots, X_k}(\mathbf{x}) = \frac{1}{(2\pi)^{k/2}\, |\boldsymbol{\Sigma}|^{1/2}} \exp\!\left( -\tfrac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{\mathsf{T}} \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right)$$
where $\mathbf{x} = (x_1, \dots, x_k)^{\mathsf{T}}$, $\boldsymbol{\mu}$ is the mean vector, and $\boldsymbol{\Sigma}$ is the covariance matrix.
Time series – AR random process
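In standard form, an autoregressive process of order p, AR(p), is defined by the recursion
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + \varepsilon_t$$
where the $\varepsilon_t$ are iid zero-mean random variables (white noise) and $\phi_1, \dots, \phi_p$ are fixed coefficients. The AR(1) case, $X_t = \phi X_{t-1} + \varepsilon_t$, is the process used in the ergodicity sketch above.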
The Brownian motion
(one-dimensional, also known as random walk)

- Consider a particle that moves randomly on the real line.
- Suppose that at small time intervals $\Delta t$ the particle jumps a small distance $\Delta x$, equally likely to the left or to the right.
- Let $X_{\Delta t}(t)$ be the position of the particle on the real line at time $t$.
- Assume the initial position of the particle is at the origin, i.e., $X_{\Delta t}(0) = 0$.
- The position of the particle at time $t$ can be expressed as
$$X_{\Delta t}(t) = \Delta x \left( Y_1 + Y_2 + \dots + Y_{[t/\Delta t]} \right)$$
where $Y_1, Y_2, \dots$ are independent random variables, each having probability 1/2 of equaling $+1$ and $-1$. ($[t/\Delta t]$ represents the largest integer not exceeding $t/\Delta t$.)
Distribution of X(t)
 Let
the step length  equal
X (t )   Y1  Y2    Y[t /  ] 

, then
t, if  is small then the
distribution of X (t ) is approximately
normal with mean 0 and variance t,
X (t ) ~ N 0,.t 
i.e.,
 For fixed
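A minimal simulation sketch of this limiting distribution (the horizon t, step Δt, and path count are illustrative):

import numpy as np

# Approximate X(t) by the scaled walk sqrt(dt) * (Y1 + ... + Y[t/dt]) and
# compare its sample moments with the N(0, t) limit.
rng = np.random.default_rng(3)
t, dt, n_paths = 2.0, 1e-4, 100_000
n_jumps = int(t / dt)   # [t / dt] jumps up to time t

# The sum of n_jumps independent +/-1 steps equals 2 * Binomial(n, 1/2) - n.
walk_sum = 2 * rng.binomial(n_jumps, 0.5, size=n_paths) - n_jumps
x_t = np.sqrt(dt) * walk_sum   # one value of X(t) per realization

print(f"sample mean     : {x_t.mean():+.4f}  (theory: 0)")
print(f"sample variance : {x_t.var():.4f}  (theory: t = {t})")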
Graphical illustration of the distribution of X(t)
(figure: pdf of X(t) shown against time t)
If $t$ and $h$ are fixed and $\Delta t$ is sufficiently small, then
$$\begin{aligned}
X_{\Delta t}(t+h) - X_{\Delta t}(t)
&= \Delta x \left( Y_1 + Y_2 + \dots + Y_{[(t+h)/\Delta t]} \right) - \Delta x \left( Y_1 + Y_2 + \dots + Y_{[t/\Delta t]} \right) \\
&= \Delta x \left( Y_{[t/\Delta t]+1} + Y_{[t/\Delta t]+2} + \dots + Y_{[(t+h)/\Delta t]} \right)
\end{aligned}$$
Distribution of the displacement X(t+h) − X(t)
The random variable $X_{\Delta t}(t+h) - X_{\Delta t}(t)$ is normally distributed with mean 0 and variance $h$, i.e.,
$$P[X_{\Delta t}(t+h) - X_{\Delta t}(t) \le x] = \frac{1}{\sqrt{2\pi h}} \int_{-\infty}^{x} \exp\!\left( -\frac{u^2}{2h} \right) du$$
The variance of $X_{\Delta t}(t)$ is dependent on $t$, while the variance of $X_{\Delta t}(t+h) - X_{\Delta t}(t)$ is not.
If $0 \le t_1 \le t_2 \le \dots \le t_{2m}$, then $X(t_2) - X(t_1),\ X(t_4) - X(t_3),\ \dots,\ X(t_{2m}) - X(t_{2m-1})$ are independent random variables.
Covariance and correlation functions of $X_{\Delta t}(t)$
For $h \ge 0$,
$$\begin{aligned}
\operatorname{Cov}[X_{\Delta t}(t), X_{\Delta t}(t+h)]
&= E[X_{\Delta t}(t)\, X_{\Delta t}(t+h)] \\
&= E\!\left[ \Delta x \left( Y_1 + \dots + Y_{[t/\Delta t]} \right) \cdot \Delta x \left( Y_1 + \dots + Y_{[(t+h)/\Delta t]} \right) \right] \\
&= E\!\left[ \Delta x \left( Y_1 + \dots + Y_{[t/\Delta t]} \right) \cdot \Delta x \left( \left( Y_1 + \dots + Y_{[t/\Delta t]} \right) + \left( Y_{[t/\Delta t]+1} + \dots + Y_{[(t+h)/\Delta t]} \right) \right) \right] \\
&= E\!\left[ (\Delta x)^2 \left( Y_1 + \dots + Y_{[t/\Delta t]} \right)^2 \right] \\
&= (\Delta x)^2\, [t/\Delta t] = t
\end{aligned}$$
(the cross terms vanish because the $Y_i$ are independent with zero mean). Therefore
$$\operatorname{Correl}[X_{\Delta t}(t), X_{\Delta t}(t+h)]
= \frac{\operatorname{Cov}[X_{\Delta t}(t), X_{\Delta t}(t+h)]}{\sqrt{t}\,\sqrt{t+h}}
= \frac{t}{\sqrt{t}\,\sqrt{t+h}}
= \sqrt{\frac{t}{t+h}}$$
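A quick numerical check of these formulas (t, h, Δt, and the path count are illustrative):

import numpy as np

# Estimate Cov[X(t), X(t+h)] and Correl[X(t), X(t+h)] for the scaled walk
# and compare with the derived values t and sqrt(t / (t + h)).
rng = np.random.default_rng(4)
t, h, dt, n_paths = 1.0, 0.5, 0.01, 50_000
n_t, n_th = int(t / dt), int((t + h) / dt)

steps = np.sqrt(dt) * rng.choice([-1.0, 1.0], size=(n_paths, n_th))
x_t = steps[:, :n_t].sum(axis=1)   # X(t) on each path
x_th = steps.sum(axis=1)           # X(t+h) on each path

cov = np.mean(x_t * x_th)          # both means are 0, so Cov = E[X(t) X(t+h)]
corr = cov / (x_t.std() * x_th.std())
print(f"covariance  : {cov:.4f}  (theory: t = {t})")
print(f"correlation : {corr:.4f}  (theory: {np.sqrt(t / (t + h)):.4f})")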