PART 3
Random Processes
Huseyin Bilgekul
EENG571 Probability and Stochastic Processes
Department of Electrical and Electronic Engineering
Eastern Mediterranean University
Random Processes
Kinds of Random Processes
Random Processes
• A RANDOM VARIABLE X is a rule for assigning to every outcome w of an experiment a number X(w).
  – Note: X denotes a random variable and X(w) denotes a particular value.
• A RANDOM PROCESS X(t) is a rule for assigning to every w a function X(t, w), as sketched in the example below.
  – Note: for notational simplicity we often omit the dependence on w.
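A minimal numerical sketch of this definition (not from the slides; numpy is assumed, and the random-phase cosine is an arbitrary illustrative choice): each outcome w fixes a random phase, and the rule assigns to that outcome a whole waveform.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0, 200)          # common time axis

    def sample_function(w_phase):
        """The rule w -> X(t, w): here X(t, w) = cos(2*pi*t + Theta(w))."""
        return np.cos(2 * np.pi * t + w_phase)

    # Draw a few outcomes w (random phases) and form the ensemble of waveforms.
    phases = rng.uniform(0, 2 * np.pi, size=5)
    ensemble = np.array([sample_function(p) for p in phases])   # shape (5, 200)

    # For a fixed time index, the column is a realization of a random variable X(t).
    print(ensemble[:, 50])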
Conceptual Representation of RP
Ensemble of Sample Functions
The set of all possible functions is called the ENSEMBLE.
Random Processes
• A general random or stochastic process can be described as:
  – a collection of time functions (signals) corresponding to various outcomes of random experiments;
  – a collection of random variables observed at different times.
• Examples of random processes in communications:
  – channel noise,
  – information generated by a source,
  – interference.
Random Processes
Let $\xi$ denote the random outcome of an experiment. To every such outcome suppose a waveform $X(t, \xi)$ is assigned. The collection of such waveforms forms a stochastic process. The set $\{\xi_k\}$ and the time index $t$ can be continuous or discrete (countably infinite or finite) as well.

[Figure: an ensemble of sample waveforms $X(t, \xi_1), X(t, \xi_2), \ldots, X(t, \xi_n)$ plotted against $t$.]

For fixed $\xi_i \in S$ (the set of all experimental outcomes), $X(t, \xi_i)$ is a specific time function. For fixed $t = t_1$,
$$X_1 = X(t_1, \xi_i)$$
is a random variable. The ensemble of all such realizations $X(t, \xi)$ over time represents the stochastic process $X(t)$.
Random Process for a Continuous Sample Space
Wiener Process Sample Function
Sample Sequence for Random Walk
Sample Function of the Poisson Process
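Only the slide title survives; as a hedged illustration of what the figure shows (numpy assumed; the rate lam and horizon t_max are made-up parameters), a Poisson counting sample function can be built from iid exponential inter-arrival times:

    import numpy as np

    rng = np.random.default_rng(1)
    lam, t_max = 2.0, 10.0                      # assumed rate and time horizon

    # Inter-arrival times of a rate-lam Poisson process are iid Exponential(lam).
    arrivals = np.cumsum(rng.exponential(1 / lam, size=100))
    arrivals = arrivals[arrivals <= t_max]

    # N(t) = number of arrivals in [0, t]: a piecewise-constant, staircase sample path.
    def N(t):
        return int(np.searchsorted(arrivals, t, side="right"))

    print([N(t) for t in (1.0, 5.0, 10.0)])     # nondecreasing step counts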
Random Binary Waveform
Autocorrelation Function of the Random Binary Signal
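The two figures above are not reproduced. A hedged sketch of their content (the standard random binary wave: levels ±A, bit duration T, random starting epoch; the names A, T, dt, n_bits are assumptions) that also estimates the autocorrelation by ensemble averaging:

    import numpy as np

    rng = np.random.default_rng(2)
    A, T, dt = 1.0, 1.0, 0.01                  # amplitude, bit duration, time step
    n_bits, n_paths = 200, 500
    spb = int(T / dt)                          # samples per bit

    def binary_wave():
        """One sample function: +/-A levels, equally likely, random start delay."""
        bits = rng.choice([-A, A], size=n_bits)
        wave = np.repeat(bits, spb)
        delay = rng.integers(0, spb)           # random epoch makes the process WSS
        return wave[delay: delay + (n_bits - 1) * spb]

    paths = np.array([binary_wave() for _ in range(n_paths)])

    # Ensemble estimate of R(tau); theory: A^2 * (1 - |tau|/T) for |tau| < T, else 0.
    t0 = 50
    for k in (0, 25, 50, 100, 150):
        print(f"tau={k*dt:4.2f}  R≈{np.mean(paths[:, t0] * paths[:, t0 + k]):+.3f}")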
Random Processes
Introduction (1)
Introduction
• A random process is a process (i.e., a variation in time or one-dimensional space) whose behavior is not completely predictable and can be characterized by statistical laws.
• Examples of random processes:
  – daily stream flow,
  – hourly rainfall of storm events,
  – stock indices.
Random Variable
• A random variable is a mapping that assigns outcomes of a random experiment to real numbers. Occurrence of the outcomes follows a certain probability distribution; therefore, a random variable is completely characterized by its probability density function (PDF).
STOCHASTIC PROCESS
• The term “stochastic process” appears mostly in statistical textbooks, whereas the term “random process” is frequently used in books on engineering applications.
DENSITY OF STOCHASTIC PROCESSES
• First-order densities of a random process
A stochastic process is said to be completely or totally characterized if the joint densities for the random variables $X(t_1), X(t_2), \ldots, X(t_n)$ are known for all times $t_1, t_2, \ldots, t_n$ and all $n$.
In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of the application, a partial characterization often suffices to ensure the desired outputs.
DENSITY OF STOCHASTIC PROCESSES
• For a specific $t$, $X(t)$ is a random variable with distribution
$$F(x, t) = P[X(t) \le x].$$
• The function $F(x, t)$ is defined as the first-order distribution of the random variable $X(t)$. Its derivative with respect to $x$,
$$f(x, t) = \frac{\partial F(x, t)}{\partial x},$$
is the first-order density of $X(t)$.
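A minimal sketch (numpy assumed; the random-phase cosine ensemble follows the earlier illustration, and all names are made up) of estimating the first-order density at a fixed t by histogramming across realizations:

    import numpy as np

    rng = np.random.default_rng(3)
    t = np.linspace(0, 2, 200)
    ensemble = np.cos(2 * np.pi * t + rng.uniform(0, 2 * np.pi, (10_000, 1)))

    # Fix t (one column): the values across the ensemble are samples of X(t).
    samples = ensemble[:, 50]

    # Histogram estimate of the first-order density f(x, t) at that t.
    density, edges = np.histogram(samples, bins=30, range=(-1, 1), density=True)
    print(density.round(2))   # peaks near x = ±1, as expected for a random-phase cosine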
DENSITY OF STOCHASTIC PROCESSES
• If the first-order densities defined for all times $t$, i.e. $f(x, t)$, are all the same, then $f(x, t)$ does not depend on $t$ and we call the resulting density the first-order density of the random process $X(t)$; otherwise, we have a family of first-order densities.
• The first-order densities (or distributions) are only a partial characterization of the random process, as they do not contain information that specifies the joint densities of the random variables defined at two or more different times.
MEAN AND VARIANCE OF RP
• Mean and variance of a random process
The first-order density of a random process, $f(x, t)$, gives the probability density of the random variables $X(t)$ defined for all times $t$. The mean of a random process, $m_X(t)$, is thus a function of time, specified by
$$m_X(t) = E[X(t)] = E[X_t] = \int_{-\infty}^{\infty} x_t \, f(x_t, t) \, dx_t.$$
• For the case where the mean of $X(t)$ does not depend on $t$, we have $m_X(t) = E[X(t)] = m_X$ (a constant).
• The variance of a random process, also a function of time, is defined by
$$\sigma_X^2(t) = E\big\{[X(t) - m_X(t)]^2\big\} = E[X_t^2] - [m_X(t)]^2.$$
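An illustrative estimate of $m_X(t)$ and $\sigma_X^2(t)$ by averaging across an ensemble (numpy assumed; the process with a time-varying mean is a made-up example):

    import numpy as np

    rng = np.random.default_rng(4)
    t = np.linspace(0, 2, 200)
    # Example process with a time-varying mean: X(t) = t + noise.
    ensemble = t + rng.normal(0.0, 0.5, size=(5_000, 200))

    m_X = ensemble.mean(axis=0)          # ensemble mean at each time instant
    var_X = ensemble.var(axis=0)         # ensemble variance at each time instant

    print(m_X[[0, 100, 199]].round(2))   # tracks t: ~0.0, ~1.0, ~2.0
    print(var_X[[0, 100, 199]].round(2)) # ~0.25 at all times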
HIGHER ORDER DENSITY OF RP
• Second-order densities of a random process
For any pair of random variables $X(t_1)$ and $X(t_2)$, we define the second-order densities of a random process as $f(x_1, x_2;\, t_1, t_2)$ or $f(x_1, x_2)$.
• Nth-order densities of a random process
The $n$th-order density functions for $X(t)$ at times $t_1, t_2, \ldots, t_n$ are given by $f(x_1, x_2, \ldots, x_n;\, t_1, t_2, \ldots, t_n)$ or $f(x_1, x_2, \ldots, x_n)$.
Autocorrelation function of RP
• Given two random variables $X(t_1)$ and $X(t_2)$, a measure of the linear relationship between them is specified by $E[X(t_1)X(t_2)]$. For a random process, $t_1$ and $t_2$ range over all possible values; therefore $E[X(t_1)X(t_2)]$ can change and is a function of $t_1$ and $t_2$. The autocorrelation function of a random process is thus defined by
$$R(t_1, t_2) = E[X(t_1)\, X(t_2)] = R(t_2, t_1).$$
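A short sketch (numpy assumed; names illustrative) estimating $R(t_1, t_2)$ as an ensemble average over realizations of the random-phase cosine used earlier:

    import numpy as np

    rng = np.random.default_rng(5)
    t = np.linspace(0, 2, 200)
    ensemble = np.cos(2 * np.pi * t + rng.uniform(0, 2 * np.pi, (20_000, 1)))

    def R_hat(i1, i2):
        """Ensemble estimate of R(t1, t2) = E[X(t1) X(t2)] at time indices i1, i2."""
        return np.mean(ensemble[:, i1] * ensemble[:, i2])

    # For the random-phase cosine, theory gives R(t1, t2) = 0.5*cos(2*pi*(t2 - t1)).
    print(round(R_hat(50, 50), 3))   # ≈ 0.5 (zero lag)
    print(round(R_hat(50, 75), 3))   # ≈ 0 (lag near a quarter period)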
Autocovariance Functions of RP
Stationarity of Random Processes
f x1, x2 ,, xn ; t1, t2 ,, tn )  f x1, x2 ,, xn ; t1  , t2  ,, tn  )
• Strict-sense stationarity seldom holds for random
processes, except for some Gaussian processes.
Therefore, weaker forms of stationarity are needed.
Stationarity of Random Processes
[Figure: sample paths of X(t) versus time t, with the PDF of X(t) identical at every time instant.]
Wide Sense Stationarity (WSS) of Random Processes
EX (t )  m (constant)for all t.
R(t1, t2 )  Rt2  t1 )  Rt2  t1 ), for all t1 and t2 .
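A rough empirical check of the two WSS conditions on the random-phase cosine ensemble (illustrative only; numpy assumed):

    import numpy as np

    rng = np.random.default_rng(6)
    t = np.linspace(0, 2, 200)
    ensemble = np.cos(2 * np.pi * t + rng.uniform(0, 2 * np.pi, (20_000, 1)))

    # Condition 1: the ensemble mean should be (approximately) constant in t.
    print(np.allclose(ensemble.mean(axis=0), 0.0, atol=0.05))   # True

    # Condition 2: R(t1, t2) should depend only on the difference t2 - t1.
    r_a = np.mean(ensemble[:, 10] * ensemble[:, 40])     # lag of 30 samples
    r_b = np.mean(ensemble[:, 100] * ensemble[:, 130])   # same lag, different t
    print(abs(r_a - r_b) < 0.05)                          # True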
Equality and Continuity of RP
• Equality
• Note that “$x(t, w_i) = y(t, w_i)$ for every $w_i$” is not the same as “$x(t, w_i) = y(t, w_i)$ with probability 1”.
Mean Square Equality of RP
• Mean square equality: $E\big\{[x(t) - y(t)]^2\big\} = 0$ for all $t$.
Random Processes
Introduction (2)
Stochastic Continuity
Stochastic Convergence
• A random sequence or a discrete-time random process is a sequence of random variables $\{X_1(w), X_2(w), \ldots, X_n(w), \ldots\} = \{X_n(w)\}$, $w \in \Omega$.
• For a specific $w$, $\{X_n(w)\}$ is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.
Sure Convergence (Convergence Everywhere)
• The sequence of random variables $\{X_n(w)\}$ converges surely to the random variable $X(w)$ if the sequence of functions $X_n(w)$ converges to $X(w)$ as $n \to \infty$ for all $w \in \Omega$, i.e.,
$$X_n(w) \to X(w) \text{ as } n \to \infty \text{ for all } w \in \Omega.$$
Almost-sure Convergence (Convergence with Probability 1)
Mean-square Convergence
Convergence in Probability
Convergence in Distribution
Remarks
• Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.
• The weak law of large numbers is an example of convergence in probability.
• The strong law of large numbers is an example of convergence with probability 1.
• The central limit theorem is an example of convergence in distribution.
Weak Law of Large Numbers (WLLN)
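The body of this slide is not reproduced. As a reminder, the WLLN states that the sample mean of iid variables with mean $\mu$ converges to $\mu$ in probability. A quick Monte Carlo illustration (numpy assumed; parameters made up):

    import numpy as np

    rng = np.random.default_rng(7)
    mu = 0.3

    # P(|sample mean - mu| > eps) should shrink as n grows.
    for n in (10, 100, 10_000):
        means = rng.binomial(1, mu, size=(2_000, n)).mean(axis=1)
        print(n, np.mean(np.abs(means - mu) > 0.05))   # fraction of "bad" runs drops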
Strong Law of Large Numbers (SLLN)
The Central Limit Theorem
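Again only the title survives. As an illustration of convergence in distribution (numpy assumed; n and trials are arbitrary), standardized sums of iid uniforms approach the standard normal:

    import numpy as np

    rng = np.random.default_rng(8)
    n, trials = 200, 20_000

    # Standardize the sum of n iid Uniform(0,1): mean n/2, variance n/12.
    s = rng.random((trials, n)).sum(axis=1)
    z = (s - n / 2) / np.sqrt(n / 12)

    # Compare the empirical tail with the standard normal value P(Z > 1.96) ≈ 0.025.
    print(np.mean(z > 1.96))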
Venn Diagram of Relation of Types of Convergence
Note that even sure convergence may not imply mean-square convergence.
Ergodic Theorem
The Mean-Square Ergodic Theorem
The Mean-Square Ergodic Theorem
The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
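An illustrative contrast (numpy assumed; both processes are made-up examples): when the covariance decays, the time average approaches the ensemble mean, which is the behavior the theorem describes; when the memory never dies out, it does not.

    import numpy as np

    rng = np.random.default_rng(9)
    n = 100_000

    # AR(1)-style process: covariance decays geometrically, so memory dies out.
    x = np.zeros(n)
    eps = rng.normal(0, 1, n)
    for k in range(1, n):
        x[k] = 0.9 * x[k - 1] + eps[k]
    print(x.mean())               # time average ≈ ensemble mean 0

    # Counterexample: X(t) = C, a random constant; the covariance never decays,
    # and the time average equals C, not the ensemble mean E[C] = 0.
    c = rng.normal(0, 1)
    print(np.full(n, c).mean())   # equals c, generally != 0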
Mean-Ergodic Process
Strong or Individual Ergodic Theorem
Examples of Stochastic Processes
• iid random process
A discrete-time random process $\{X(t),\ t = 1, 2, \ldots\}$ is said to be independent and identically distributed (iid) if any finite number, say $k$, of the random variables $X(t_1), X(t_2), \ldots, X(t_k)$ are mutually independent and have a common cumulative distribution function $F_X(\cdot)$.
iid Random Stochastic Processes
• The joint cdf for $X(t_1), X(t_2), \ldots, X(t_k)$ is given by
$$F_{X_1, X_2, \ldots, X_k}(x_1, x_2, \ldots, x_k) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_k \le x_k) = F_X(x_1)\, F_X(x_2) \cdots F_X(x_k).$$
• It also yields
$$p_{X_1, X_2, \ldots, X_k}(x_1, x_2, \ldots, x_k) = p_X(x_1)\, p_X(x_2) \cdots p_X(x_k),$$
where $p_X(x)$ represents the common probability mass function.
Bernoulli Random Process
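Only the title survives here. The Bernoulli process is the canonical iid example: a sequence of iid Bernoulli(p) variables. A minimal sketch (numpy assumed; p is an illustrative value):

    import numpy as np

    rng = np.random.default_rng(10)
    p = 0.5

    # One realization of the Bernoulli random process: iid 0/1 values.
    x = rng.binomial(1, p, size=20)
    print(x)

    # The joint pmf factorizes: P(X1=1, X2=0, X3=1) = p * (1-p) * p.
    print(p * (1 - p) * p)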
Random walk process
Random walk process
• Let $\pi_0$ denote the probability mass function of $X_0$. The joint probability of $X_0, X_1, \ldots, X_n$ is
$$\begin{aligned}
P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)
&= P(X_0 = x_0,\ \epsilon_1 = x_1 - x_0, \ldots, \epsilon_n = x_n - x_{n-1}) \\
&= P(X_0 = x_0)\, P(\epsilon_1 = x_1 - x_0) \cdots P(\epsilon_n = x_n - x_{n-1}) \\
&= \pi_0(x_0)\, f(x_1 - x_0) \cdots f(x_n - x_{n-1}) \\
&= \pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})
\end{aligned}$$
Random walk process
$$\begin{aligned}
P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)
&= \frac{P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n, X_{n+1} = x_{n+1})}{P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n)} \\
&= \frac{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})\, P(x_{n+1} \mid x_n)}{\pi_0(x_0)\, P(x_1 \mid x_0) \cdots P(x_n \mid x_{n-1})} \\
&= P(x_{n+1} \mid x_n)
\end{aligned}$$
Random walk process
The property
$$P(X_{n+1} = x_{n+1} \mid X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n) = P(X_{n+1} = x_{n+1} \mid X_n = x_n)$$
is known as the Markov property.
A special case of the random walk is Brownian motion, treated below; a simulation sketch of the walk itself follows.
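A brief simulation sketch of the random walk (numpy assumed; the ±1 step distribution is one common choice):

    import numpy as np

    rng = np.random.default_rng(11)
    n = 1_000

    # Random walk: X_k = X_{k-1} + eps_k with iid +/-1 steps, X_0 = 0.
    steps = rng.choice([-1, 1], size=n)
    x = np.concatenate(([0], np.cumsum(steps)))
    print(x[:10])

    # Markov property in action: the distribution of X_{n+1} given the whole past
    # depends only on X_n, because X_{n+1} = X_n + eps_{n+1} with eps independent.
    x_next = x[-1] + rng.choice([-1, 1])
    print(x[-1], x_next)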
Gaussian process
• A random process $\{X(t)\}$ is said to be a Gaussian random process if all finite collections of the random process, $X_1 = X(t_1), X_2 = X(t_2), \ldots, X_k = X(t_k)$, are jointly Gaussian random variables for all $k$ and all choices of $t_1, t_2, \ldots, t_k$.
• Joint pdf of jointly Gaussian random variables $X_1, X_2, \ldots, X_k$ (the standard form, with mean vector $\mathbf{m}$ and covariance matrix $\mathbf{C}$; the slide's own formula was not reproduced):
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{k/2}\, |\mathbf{C}|^{1/2}} \exp\!\left(-\tfrac{1}{2}\, (\mathbf{x} - \mathbf{m})^{\mathsf T} \mathbf{C}^{-1} (\mathbf{x} - \mathbf{m})\right)$$
Time series – AR random process
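The slide content is not reproduced; as an illustrative stand-in for the topic, a first-order autoregressive (AR(1)) process $X_t = \phi X_{t-1} + \epsilon_t$ with $|\phi| < 1$ (numpy assumed, parameters made up):

    import numpy as np

    rng = np.random.default_rng(13)
    phi, n = 0.8, 5_000

    # AR(1): each value is a damped copy of the previous one plus fresh noise.
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal()

    # The stationary variance should approach 1 / (1 - phi^2) ≈ 2.78.
    print(round(x.var(), 2))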
The Brownian motion
(one-dimensional, also known as random walk)
• Consider a particle that moves randomly on the real line.
• Suppose that at small time intervals $\Delta$ the particle jumps a small distance $\delta$ randomly, equally likely to the left or to the right.
• Let $X_\Delta(t)$ be the position of the particle on the real line at time $t$.
The Brownian motion
• Assume the initial position of the particle is at the origin, i.e. $X_\Delta(0) = 0$.
• The position of the particle at time $t$ can be expressed as
$$X_\Delta(t) = \delta\,\big(Y_1 + Y_2 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big),$$
where $Y_1, Y_2, \ldots$ are independent random variables, each having probability 1/2 of equaling $1$ and $-1$. ($\lfloor t/\Delta \rfloor$ denotes the largest integer not exceeding $t/\Delta$.)
Distribution of X(t)
• Let the step length $\delta$ equal $\sqrt{\Delta}$; then
$$X_\Delta(t) = \sqrt{\Delta}\,\big(Y_1 + Y_2 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big).$$
• For fixed $t$, if $\Delta$ is small then the distribution of $X_\Delta(t)$ is approximately normal with mean 0 and variance $t$, i.e., $X_\Delta(t) \sim N(0, t)$.
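A direct simulation of this construction (numpy assumed; Delta and t are illustrative values) confirming that $X_\Delta(t) \sim N(0, t)$ approximately for small $\Delta$:

    import numpy as np

    rng = np.random.default_rng(14)
    delta_t, t = 1e-3, 2.0                 # time step Delta and observation time
    n_steps = int(t / delta_t)

    # X(t) = sqrt(Delta) * (Y1 + ... + Y_[t/Delta]), with Yi = +/-1 equally likely.
    Y = rng.choice([-1.0, 1.0], size=(5_000, n_steps))
    X_t = np.sqrt(delta_t) * Y.sum(axis=1)

    print(round(X_t.mean(), 2), round(X_t.var(), 2))   # ≈ 0 and ≈ t = 2.0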
Graphical illustration of Distribution of X(t)
[Figure: sample paths of $X_\Delta(t)$ versus time $t$; at each fixed $t$ the PDF of $X_\Delta(t)$ is Gaussian, with spread growing with $t$.]
• If $t$ and $h$ are fixed and $\Delta$ is sufficiently small, then
$$\begin{aligned}
X_\Delta(t+h) - X_\Delta(t)
&= \sqrt{\Delta}\,\big(Y_1 + Y_2 + \cdots + Y_{\lfloor (t+h)/\Delta \rfloor}\big) - \sqrt{\Delta}\,\big(Y_1 + Y_2 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big) \\
&= \sqrt{\Delta}\,\big(Y_{\lfloor t/\Delta \rfloor + 1} + Y_{\lfloor t/\Delta \rfloor + 2} + \cdots + Y_{\lfloor (t+h)/\Delta \rfloor}\big)
\end{aligned}$$
Distribution of the Displacement $X_\Delta(t+h) - X_\Delta(t)$
• The random variable $X_\Delta(t+h) - X_\Delta(t)$ is normally distributed with mean 0 and variance $h$, i.e.
$$P\big(X_\Delta(t+h) - X_\Delta(t) \le x\big) = \frac{1}{\sqrt{2\pi h}} \int_{-\infty}^{x} \exp\!\left(-\frac{u^2}{2h}\right) du.$$
• The variance of $X_\Delta(t)$ depends on $t$, while the variance of $X_\Delta(t+h) - X_\Delta(t)$ does not.
• If $0 \le t_1 \le t_2 \le \cdots \le t_{2m}$, then $X_\Delta(t_2) - X_\Delta(t_1)$, $X_\Delta(t_4) - X_\Delta(t_3)$, $\ldots$, $X_\Delta(t_{2m}) - X_\Delta(t_{2m-1})$ are independent random variables.
Covariance and Correlation Functions of $X_\Delta(t)$
$$\begin{aligned}
\operatorname{Cov}\big[X_\Delta(t),\, X_\Delta(t+h)\big]
&= E\big[X_\Delta(t)\, X_\Delta(t+h)\big] \\
&= E\Big[\Delta\,\big(Y_1 + Y_2 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big)\big(Y_1 + Y_2 + \cdots + Y_{\lfloor (t+h)/\Delta \rfloor}\big)\Big] \\
&= E\Big[\Delta\,\big(Y_1 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big)\Big(\big(Y_1 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big) + \big(Y_{\lfloor t/\Delta \rfloor + 1} + \cdots + Y_{\lfloor (t+h)/\Delta \rfloor}\big)\Big)\Big] \\
&= \Delta\, E\Big[\big(Y_1 + Y_2 + \cdots + Y_{\lfloor t/\Delta \rfloor}\big)^2\Big] \\
&= \Delta\, \lfloor t/\Delta \rfloor \approx t
\end{aligned}$$
$$\operatorname{Correl}\big[X_\Delta(t),\, X_\Delta(t+h)\big] = \frac{\operatorname{Cov}\big[X_\Delta(t),\, X_\Delta(t+h)\big]}{\sqrt{t\,(t+h)}} \approx \frac{t}{\sqrt{t\,(t+h)}} = \sqrt{\frac{t}{t+h}}$$