
Chapter 6
Random Processes




• Description of Random Processes
• Stationarity and ergodicity
• Autocorrelation of Random Processes
• Properties of autocorrelation
Huseyin Bilgekul
EEE 461 Communication Systems II
Department of Electrical and Electronic Engineering
Eastern Mediterranean University
Homework Assignments
• Return date: November 8, 2005.
• Assignments:
Problem 6-2
Problem 6-3
Problem 6-6
Problem 6-10
Problem 6-11
Random Processes
• A RANDOM VARIABLE X, is a rule for
assigning to every outcome, w, of an
experiment a number X(w).
– Note: X denotes a random variable and X(w) denotes
a particular value.
• A RANDOM PROCESS X(t) is a rule for
assigning to every w, a function X(t,w).
– Note: for notational simplicity we often omit the
dependence on w.
Ensemble of Sample Functions
The set of all possible functions is called the
ENSEMBLE.
Random Processes
• A general Random or Stochastic
Process can be described as:
– Collection of time functions
(signals) corresponding to various
outcomes of random experiments.
– Collection of random variables
observed at different times.
• Examples of random processes
in communications:
– Channel noise,
– Information generated by a source,
– Interference.
[Figure: an ensemble of sample functions of a random process, observed at two times t1 and t2.]
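To make the ensemble idea concrete, here is a short illustrative Python/NumPy sketch (not part of the original slides); the sinusoidal process, its amplitude A, frequency f0, and the observation time are assumed values for illustration only.

# A minimal sketch: a random process as an ensemble of sample functions
# x(t, w). Assumed model: x(t, w) = A*sin(2*pi*f0*t + theta(w)),
# theta uniform on [-pi, pi).
import numpy as np

rng = np.random.default_rng(0)

A, f0 = 1.0, 5.0                 # assumed amplitude and frequency
t = np.linspace(0.0, 1.0, 1000)  # common time axis for every sample function
n_outcomes = 8                   # number of outcomes w_i in the ensemble

theta = rng.uniform(-np.pi, np.pi, size=n_outcomes)         # one outcome per row
ensemble = A * np.sin(2 * np.pi * f0 * t + theta[:, None])  # shape (n_outcomes, len(t))

# Viewed at a fixed time t1, the column ensemble[:, k] is a random variable.
t1_index = 250
print("values of the RV x(t1) across the ensemble:", ensemble[:, t1_index])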
Collection of Time Functions
• Consider the time-varying function representing a
random process where wi represents an outcome of a
random event.
• Example:
– a box has infinitely many resistors (i = 1, 2, . . .) of the same
resistance R.
– Let wi be the event that the ith resistor has been picked from
the box.
– Let v(t, wi) represent the voltage of the thermal noise
measured on this resistor.
Collection of Random Variables
• For a particular time t = t0, the value x(t0, wi) is a random variable.
• To describe a random process we can use a collection of random
variables {x(t0, w1), x(t0, w2), x(t0, w3), . . . }.
• Type: a random process can be either discrete-time or continuous-time.
• The probability of obtaining a sample function of a RP that passes
through a given set of windows is the probability of a joint event
(a numerical sketch of this follows below).
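As a hedged illustration of the joint "windows" event (the window times and amplitude edges below are assumed, since the original figure is not reproduced), a Monte Carlo sketch in Python/NumPy:

# A minimal sketch: estimating the probability that a sample function passes
# through a set of amplitude windows at given times, i.e. the joint event
# {a_k < x(t_k) < b_k for all k}. Process model and window values are assumed.
import numpy as np

rng = np.random.default_rng(1)

A, f0 = 1.0, 5.0
times = np.array([0.05, 0.12, 0.30])    # window locations t_k (assumed)
lo = np.array([-0.5, -1.0, 0.0])        # lower edges a_k (assumed)
hi = np.array([0.5, 0.0, 1.0])          # upper edges b_k (assumed)

n_trials = 100_000
theta = rng.uniform(-np.pi, np.pi, size=(n_trials, 1))
samples = A * np.sin(2 * np.pi * f0 * times + theta)   # x(t_k) for every outcome

inside_all = np.all((samples > lo) & (samples < hi), axis=1)
print("P(pass through all windows) ~", inside_all.mean())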
Description of Random Processes
• Analytical description: X(t) =f(t,w) where w is an
outcome of a random event.
• Statistical description: For any integer N and any choice of
(t1, t2, . . ., tN), the joint pdf of {X(t1), X(t2), . . ., X(tN)}
is known. To describe the random process completely, the joint
PDF f(x) is required:
x1  x(t1 ), x  [ x1 , x2 ,... xN ]
f  x )  f {x(t1 ), x(t2 ),.... x(t N )}
Example: Analytical Description
• Let X t )  A cos  2p f0t  q ) where q is a random variable
uniformly distributed on [0,2p].
• Complete statistical description of X(to) is:
– Introduce Y = 2πf0t0 + θ
– Then, we need to transform from y to x:
pX  x ) dx  pY  y1 ) dy  pY  y2 ) dy
– We need both y1 and y2 because for a given x the equation
x=A cos (y) has two solutions in [0,2p].
Analytical (continued)
• Note x and y are actual values of the random
variables X and Y.
• Since |dx/dy| = |A sin y| = √(A² − x²)
and pY is uniform on [2πf0t0, 2πf0t0 + 2π], we get
pX(x) = 1/(π √(A² − x²)),   |x| < A
      = 0,                  elsewhere
• Using the analytical description of X(t), we
obtained its statistical description at any time t.
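A small numerical cross-check of this result (illustrative only; the values of A, f0 and t0 below are assumed): the histogram of X(t0) over many draws of θ should follow pX(x) = 1/(π √(A² − x²)).

# A minimal sketch: compare a histogram of X = A*cos(2*pi*f0*t0 + theta),
# theta uniform on [0, 2*pi), against the derived density.
import numpy as np

rng = np.random.default_rng(2)

A, f0, t0 = 1.0, 5.0, 0.3
theta = rng.uniform(0.0, 2 * np.pi, size=1_000_000)
x = A * np.cos(2 * np.pi * f0 * t0 + theta)

hist, edges = np.histogram(x, bins=50, range=(-A, A), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p_theory = 1.0 / (np.pi * np.sqrt(A**2 - centers**2))

# The histogram should track the analytical density away from the endpoints +/-A.
for c, h, p in zip(centers[::10], hist[::10], p_theory[::10]):
    print(f"x = {c:+.2f}   histogram = {h:.3f}   theory = {p:.3f}")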
Example: Statistical Description
• Suppose a random process x(t) has the property that for any N
and any choice of times (t1, . . ., tN), the vector {x(ti)} is
jointly Gaussian with zero mean and covariance
σij = σ² min(ti, tj)
• This gives complete statistical description of the random
process x(t).
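For illustration (a sketch under assumed values of σ and an assumed time grid, not part of the slides), realizations of {x(t1), . . ., x(tN)} can be drawn directly from this covariance:

# A minimal sketch: sample the zero-mean Gaussian vector with covariance
# sigma^2 * min(t_i, t_j) on a chosen time grid.
import numpy as np

rng = np.random.default_rng(3)

sigma = 1.0
t = np.linspace(0.01, 1.0, 100)          # assumed grid; t > 0 keeps the covariance positive definite
cov = sigma**2 * np.minimum.outer(t, t)  # covariance matrix sigma^2 * min(t_i, t_j)

# Each row is one realization of the jointly Gaussian vector {x(t_1), ..., x(t_N)}.
paths = rng.multivariate_normal(mean=np.zeros_like(t), cov=cov, size=2000)

print("realizations, shape:", paths.shape)
print("empirical var at t =", t[-1], "is ~", round(paths[:, -1].var(), 3),
      "(theory:", sigma**2 * t[-1], ")")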
Activity: Ensembles
• Consider the random process: x(t)=At+B
• Draw ensembles of the waveforms:
– B is constant, A is uniformly distributed between [-1,1]
– A is constant, B is uniformly distributed between [0,2]
• Does having an “Ensemble” of waveforms give
you a better picture of how the system performs?
[Figures: ensembles of x(t) = At + B, one with the slope A random and one with the intercept B random.]
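A sketch for this activity (illustrative only; the fixed values B = 1 and A = 0.5 below are assumed), generating the two ensembles of x(t) = At + B:

# A minimal sketch: draw the two ensembles from the activity.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 50)

# Case 1: B fixed, A uniform on [-1, 1]  (random slope)
B = 1.0
A = rng.uniform(-1.0, 1.0, size=(10, 1))
ensemble_slope = A * t + B

# Case 2: A fixed, B uniform on [0, 2]   (random intercept)
A_fixed = 0.5
B_rand = rng.uniform(0.0, 2.0, size=(10, 1))
ensemble_intercept = A_fixed * t + B_rand

# Every member of case 1 passes through (0, B); members of case 2 are parallel lines.
print("case 1 values at t=0:", ensemble_slope[:, 0])     # all equal to B
print("case 2 values at t=0:", ensemble_intercept[:, 0])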
Stationarity
• Definition: A random process is STATIONARY to the
order N if, for any t1, t2, . . . , tN and any t0,
fx{x(t1), x(t2), . . ., x(tN)} = fx{x(t1+t0), x(t2+t0), . . ., x(tN+t0)}
• This means that the process behaves similarly (follows
the same PDF) regardless of when you measure it.
• A random process is said to be STRICTLY
STATIONARY if it is stationary to the order of N→∞.
• Is the random process from the coin tossing experiment
stationary?
Illustration of Stationarity
Time functions pass
through the corresponding
windows at different times
with the same probability.
Example of First-Order Stationarity
RANDOM PROCESS is x(t) = A sin(ω0t + θ0)
• Assume that A and ω0 are constants; θ0 is a uniformly
distributed RV on [−π, π); t is time.
• From last lecture, recall that the PDF of x(t):
fx(x) = 1/(π √(A² − x²)),   |x| < A
      = 0,                  elsewhere
• Note: there is NO dependence on time, the PDF is
not a function of t.
• The RP is (first-order) STATIONARY.
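An illustrative empirical check (the numeric values of A and ω0 are assumed): over a large ensemble of θ0, the histogram of x(t) is the same at any two observation times.

# A minimal sketch: first-order stationarity of x(t) = A*sin(w0*t + theta0),
# theta0 uniform on [-pi, pi): histograms at two times should agree.
import numpy as np

rng = np.random.default_rng(5)

A, w0 = 1.0, 2 * np.pi * 5.0
theta0 = rng.uniform(-np.pi, np.pi, size=500_000)

for t in (0.1, 0.37):                      # two arbitrary observation times
    x_t = A * np.sin(w0 * t + theta0)
    hist, _ = np.histogram(x_t, bins=20, range=(-A, A), density=True)
    print(f"t = {t}: histogram (first 5 bins) {np.round(hist[:5], 3)}")
# The two histograms agree up to sampling noise: the first-order PDF has no t dependence.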
Non-Stationary Example
RANDOM PROCESS is x(t) = A sin(ω0t + θ0)
• Now assume that A, θ0 and ω0 are constants; t is time.
• Value of x(t) is always known for any time
with a probability of 1. Thus the first order
PDF of x(t) is
f  x )    x  A sin w0t  q0 ) )
• Note: The PDF depends on time, so it is
NONSTATIONARY.
Ergodic Processes
• Definition: A random process is ERGODIC if all time averages of any
sample function are equal to the corresponding ensemble averages
(expectations)
• For example, for ergodic processes we can use ensemble statistics to
compute DC and RMS values:
xDC = ⟨x(t)⟩ = E[x(t)] = mx

Time average:      ⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T x(t) dt
Ensemble average:  E[x(t)] = ∫ x f(x) dx = mx

xRMS = √⟨x²(t)⟩,   where ⟨x²(t)⟩ = E[x²] = σ² + mx²
• Ergodic processes are always stationary; Stationary processes are not
necessarily ergodic
Ergodic ⇒ Stationary
Example: Ergodic Process
RANDOM PROCESS is x(t) = A sin(ω0t + θ0)
• A and ω0 are constants; θ0 is a uniformly distributed RV
on [−π, π); t is time.
• Mean (Ensemble statistics)
mx = E[x] = ∫_{−π}^{π} x(θ) fθ(θ) dθ = ∫_{−π}^{π} A sin(ω0t + θ) (1/2π) dθ = 0
• Variance
E[x²] = ∫_{−π}^{π} A² sin²(ω0t + θ) (1/2π) dθ = A²/2
Example: Ergodic Process
• Mean (time average, taking T large)
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T A sin(ω0t + θ) dt = 0
• Variance
x2 t )
1 T 2 2
A2
 lim  A sin w0t  q ) dt 
0
2
T  T
• The ensemble and time averages are the same, so the
process is ERGODIC
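An illustrative Python/NumPy sketch comparing the two kinds of averages (the values of A and ω0 are assumed, and a finite T stands in for the limit):

# A minimal sketch: time averages of one sample function versus ensemble
# averages over theta0 for x(t) = A*sin(w0*t + theta0), theta0 uniform on [-pi, pi).
import numpy as np

rng = np.random.default_rng(6)

A, w0 = 2.0, 2 * np.pi * 5.0

# Time averages along a single, long sample function (one fixed theta0)
theta_single = rng.uniform(-np.pi, np.pi)
t = np.linspace(0.0, 100.0, 1_000_000)
x_single = A * np.sin(w0 * t + theta_single)
print("time-average mean     ~", x_single.mean(), "   (theory 0)")
print("time-average of x^2   ~", (x_single**2).mean(), "(theory A^2/2 =", A**2 / 2, ")")

# Ensemble averages at one fixed time t0, over many outcomes theta0
theta_many = rng.uniform(-np.pi, np.pi, size=1_000_000)
x_ens = A * np.sin(w0 * 0.3 + theta_many)
print("ensemble mean at t0   ~", x_ens.mean(), "   (theory 0)")
print("ensemble mean of x^2  ~", (x_ens**2).mean(), "(theory A^2/2)")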
EXERCISE
• Write down the definition of :
– Wide sense stationary
– Ergodic processes
• How do these concepts relate to each other?
• Consider: x(t) = K; K is uniformly distributed
between [-1, 1]
– WSS?
– Ergodic?
Autocorrelation of Random Process
• The Autocorrelation function of a real random
process x(t) at two times is:
) )
_________________
 
Rx  t1 , t2 )  x t x t    x1 x2 f x  x1 , x2 ) dx1dx2
1
2
 
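An illustrative ensemble-average estimate of Rx(t1, t2) (the values of A, ω0 and the time pairs are assumed). For the sinusoid with uniform random phase, Rx(t1, t2) = (A²/2) cos(ω0(t2 − t1)), so the estimate should depend only on the gap t2 − t1:

# A minimal sketch: estimate R_x(t1, t2) = E[x(t1) x(t2)] by ensemble averaging
# for x(t) = A*sin(w0*t + theta), theta uniform on [-pi, pi).
import numpy as np

rng = np.random.default_rng(7)

A, w0 = 1.0, 2 * np.pi * 5.0
theta = rng.uniform(-np.pi, np.pi, size=1_000_000)

def Rx_estimate(t1, t2):
    """Ensemble-average estimate of E[x(t1) x(t2)]."""
    return np.mean(A * np.sin(w0 * t1 + theta) * A * np.sin(w0 * t2 + theta))

for (t1, t2) in [(0.0, 0.02), (0.10, 0.12), (0.30, 0.32)]:   # same gap tau = 0.02
    print(f"R_x({t1}, {t2}) ~ {Rx_estimate(t1, t2):+.4f}",
          "  theory:", round(0.5 * A**2 * np.cos(w0 * (t2 - t1)), 4))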
Wide-sense Stationary
• A random process that is stationary to order 2 or greater is
Wide-Sense Stationary:
• A random process is Wide-Sense Stationary if:
E[x(t)] = constant
Rx(t1, t2) = Rx(τ)
• Usually t1 = t and t2 = t + τ, so that t2 − t1 = τ.
• Wide-sense stationary process does not DRIFT with time.
• The autocorrelation depends only on the time difference τ, not on
where the interval is located in time.
• The autocorrelation gives an idea of the frequency content of the RP.
Autocorrelation Function of RP
• Properties of the autocorrelation function of wide-sense
stationary processes
Rx(0) = E[x²(t)]      Second moment
Rx(τ) = Rx(−τ)        Symmetric
Rx(0) ≥ |Rx(τ)|       Maximum value at 0
[Figure: autocorrelation of slowly and rapidly fluctuating random processes.]
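A numerical check of these three properties (illustrative; the moving-average noise model and the tested lags below are assumed), using a time-average estimate of Rx for a single long sample function:

# A minimal sketch: verify R_x(0) = power, symmetry, and maximum at lag 0
# with a biased time-average autocorrelation estimate.
import numpy as np

rng = np.random.default_rng(8)

# A "slowly fluctuating" signal: white noise smoothed by a moving average (assumed model).
n = 200_000
white = rng.standard_normal(n)
x = np.convolve(white, np.ones(20) / 20.0, mode="same")
x -= x.mean()

def Rx(lag):
    """Biased time-average estimate of R_x(lag) for a real sample function."""
    if lag >= 0:
        return np.dot(x[: n - lag], x[lag:]) / n
    return np.dot(x[-lag:], x[: n + lag]) / n

print("R_x(0)  =", Rx(0), " (second moment, the signal power)")
print("R_x(15) =", Rx(15), "  R_x(-15) =", Rx(-15), " (symmetric)")
print("max at 0:", all(Rx(0) >= abs(Rx(k)) for k in range(1, 100)))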
Cross Correlations of RP
• Cross Correlation of two RP x(t) and y(t) is
defined similarly as:
) )
_________________
 
Rxy  t1 , t2 )  x t y t    x1 y2 f xy  x1 , y2 ) dx1dy2
1
2
 
• If x(t) and y(t) are Jointly Stationary processes,
Rxy  t1 , t2 )  Rxy t2  t1 )  Rxy t )
t  t2  t1
• If the RP’s are jointly ERGODIC,
Rxy(τ) = E[x(t) y(t + τ)] = ⟨x(t) y(t + τ)⟩
Cross Correlation Properties of Jointly
Stationary RP’s
• Some properties of cross-correlation functions are
Rxy t )  Rxy  t )
Rxy t )  Rx  0 ) Ry  0 )
Rxy t )  12  Rx  0 )  Ry  0 ) 
• Uncorrelated:
• Orthogonal:
__________________
Rxy t )  x  t ) y  t  t )  x y
Rxy t )  0
• Independent: if x(t1) and y(t2) are independent (joint
distribution is product of individual distributions)
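An illustrative numerical check of the two magnitude bounds (the correlated noise model below is assumed):

# A minimal sketch: verify |R_xy| <= sqrt(R_x(0) R_y(0)) and
# |R_xy| <= (R_x(0) + R_y(0)) / 2 for a pair of correlated noise-like signals.
import numpy as np

rng = np.random.default_rng(10)

n = 200_000
x = np.convolve(rng.standard_normal(n), np.ones(10) / 10.0, mode="same")
y = 0.7 * np.roll(x, 5) + 0.3 * rng.standard_normal(n)   # y partially correlated with x

def corr(a, b, lag):
    """Time-average estimate of E[a(t) b(t + lag)] (circular shift)."""
    return np.mean(a * np.roll(b, -lag))

Rx0, Ry0 = corr(x, x, 0), corr(y, y, 0)
bounds_ok = True
for lag in range(-50, 51):
    Rxy = corr(x, y, lag)
    bounds_ok &= abs(Rxy) <= np.sqrt(Rx0 * Ry0) + 1e-12
    bounds_ok &= abs(Rxy) <= 0.5 * (Rx0 + Ry0) + 1e-12
print("both bounds hold for all tested lags:", bool(bounds_ok))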