Transcript Document
Discrete-time Random Signals
Until now, we have assumed that the signals are
deterministic, i.e., each value of a sequence is
uniquely determined.
In many situations, the processes that generate
signals are so complex as to make precise
description of a signal extremely difficult or
undesirable.
A random or stochastic signal is considered to be
characterized by a set of probability density
functions.
Stochastic Processes
Random (or stochastic) process (or signal)
A random process is an indexed family of random
variables characterized by a set of probability
distribution functions.
A sequence x[n], $-\infty < n < \infty$, where each individual
sample x[n] is assumed to be an outcome of some
underlying random variable $X_n$.
The difference between a single random variable and a
random process is that for a random variable the
outcome of a random-sampling experiment is mapped
into a number, whereas for a random process the
outcome is mapped into a sequence.
Stochastic Processes (continued)
Probability density function of x[n]: $p_{x_n}(x_n, n)$
Joint distribution of x[n] and x[m]: $p_{x_n, x_m}(x_n, n, x_m, m)$
E.g., $x_1[n] = A_n \cos(\omega n + \phi_n)$, where $A_n$ and $\phi_n$ are
random variables for all $-\infty < n < \infty$; then $x_1[n]$ is a
random process.
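As a quick illustration of "outcome mapped into a sequence," one realization of a process like $x_1[n]$ can be drawn numerically. This is a minimal sketch: the frequency and the distributions of $A_n$ and $\phi_n$ below are assumptions for illustration, not taken from the text.

```python
import numpy as np

# Hypothetical sketch: draw one sample sequence of the random process
# x1[n] = A_n * cos(w*n + phi_n).  The frequency w and the distributions
# of A_n and phi_n are assumed for illustration only.
rng = np.random.default_rng(0)
w = 0.2 * np.pi
n = np.arange(16)

A = rng.normal(1.0, 0.1, size=n.size)          # random amplitude A_n
phi = rng.uniform(-np.pi, np.pi, size=n.size)  # random phase phi_n
x1 = A * np.cos(w * n + phi)

# Each run with a different seed yields a different outcome (sample
# sequence) of the same underlying random process.
print(x1.shape)
```

Re-running with a different seed gives a different sample sequence, which is exactly the random-process (rather than random-variable) point of view.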
Independence and Stationarity
x[n] and x[m] are independent iff
$p_{x_n, x_m}(x_n, n, x_m, m) = p_{x_n}(x_n, n)\, p_{x_m}(x_m, m)$
x is a stationary process iff
$p_{x_{n+k}, x_{m+k}}(x_{n+k}, n+k, x_{m+k}, m+k) = p_{x_n, x_m}(x_n, n, x_m, m)$
for all k.
That is, the joint distribution of x[n] and x[m]
depends only on the time difference $m - n$.
Stationarity (continued)
In particular, when m = n for a stationary process:
$p_{x_{n+k}}(x_{n+k}, n+k) = p_{x_n}(x_n, n)$
This implies that the first-order distribution of x[n] is
shift invariant.
Stochastic Processes vs. Deterministic Signals
In many of the applications of discrete-time signal
processing, random processes serve as models for
signals in the sense that a particular signal can be
considered a sample sequence of a random
process.
Although such signals are unpredictable, making a
deterministic approach to signal representation
inappropriate, certain average properties of the
ensemble can be determined, given the probability
law of the process.
Expectation
Mean (or average):
$m_{x_n} = E\{x_n\} = \int_{-\infty}^{\infty} x_n\, p_{x_n}(x_n, n)\, dx_n$
$E\{\cdot\}$ denotes the expectation operator:
$E\{g(x_n)\} = \int_{-\infty}^{\infty} g(x_n)\, p_{x_n}(x_n, n)\, dx_n$
For independent random variables,
$E\{x_n y_m\} = E\{x_n\}\, E\{y_m\}$
Mean Square Value and Variance
Mean squared value:
$E\{x_n^2\} = \int_{-\infty}^{\infty} x_n^2\, p_{x_n}(x_n, n)\, dx_n$
Variance:
$\mathrm{var}(x_n) = E\{(x_n - m_{x_n})^2\}$
Autocorrelation and Autocovariance
Autocorrelation:
$\phi_{xx}[n, m] = E\{x_n x_m^*\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_n x_m^*\, p_{x_n, x_m}(x_n, n, x_m, m)\, dx_n\, dx_m$
Autocovariance:
$\gamma_{xx}[n, m] = E\{(x_n - m_{x_n})(x_m - m_{x_m})^*\}$
$\gamma_{xx}[n, m] = \phi_{xx}[n, m] - m_{x_n} m_{x_m}^*$
Stationary Process
For a stationary process, the autocorrelation depends
only on the time difference $m - n$.
Thus, for a stationary process, we can write
$m_x = m_{x_n} = E\{x_n\}$
$\sigma_x^2 = E\{(x_n - m_x)^2\}$
If we denote the time difference by k, we have
$\phi_{xx}[n+k, n] = \phi_{xx}[k] = E\{x_{n+k}\, x_n^*\}$
Wide-sense Stationary
In many instances, we encounter random
processes that are not stationary in the strict
sense.
If the following equations hold, we call the
process wide-sense stationary (w. s. s.).
$m_x = m_{x_n} = E\{x_n\}$
$\sigma_x^2 = E\{(x_n - m_x)^2\}$
$\phi_{xx}[n+k, n] = \phi_{xx}[k] = E\{x_{n+k}\, x_n^*\}$
Time Averages
For any single sample sequence x[n], define its
time average to be
$\langle x[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n]$
Similarly, the time-average autocorrelation is
$\langle x[n+m]\, x^*[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n+m]\, x^*[n]$
Ergodic Process
A stationary random process for which time
averages equal ensemble averages is called an
ergodic process:
$\langle x[n] \rangle = m_x$
$\langle x[n+m]\, x^*[n] \rangle = \phi_{xx}[m]$
Ergodic Process (continued)
It is common to assume that a given sequence is
a sample sequence of an ergodic random
process, so that averages can be computed from
a single sequence.
In practice, we cannot compute the limits, but
instead compute the finite-length quantities
$\hat{m}_x = \frac{1}{L} \sum_{n=0}^{L-1} x[n]$
$\hat{\sigma}_x^2 = \frac{1}{L} \sum_{n=0}^{L-1} (x[n] - \hat{m}_x)^2$
$\frac{1}{L} \sum_{n=0}^{L-1} x[n+m]\, x^*[n]$
Such quantities are often computed as estimates of
the mean, variance, and autocorrelation.
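Under the ergodicity assumption, these finite-L estimates can be computed directly from one realization. A minimal sketch follows; the i.i.d. Gaussian sequence (mean 2, variance 9) is an assumed example, chosen so the true values are known.

```python
import numpy as np

# Sketch: finite-L estimates of the mean, variance, and autocovariance
# from a single sample sequence, assuming ergodicity.  The i.i.d.
# Gaussian process below (m_x = 2, var = 9) is an assumed example.
rng = np.random.default_rng(1)
L = 100_000
x = rng.normal(2.0, 3.0, size=L)

m_hat = x.mean()                         # (1/L) * sum x[n]
var_hat = np.mean((x - m_hat) ** 2)      # (1/L) * sum (x[n] - m_hat)^2

def autocov_hat(x, m):
    """Estimate gamma_xx[m] from one real sequence (biased, 1/L norm)."""
    xc = x - x.mean()
    return np.mean(xc[m:] * xc[:x.size - m]) if m > 0 else np.mean(xc * xc)

# For i.i.d. samples the autocovariance should vanish at nonzero lags.
print(round(m_hat, 2), round(var_hat, 1), round(autocov_hat(x, 5), 2))
```

With L = 100 000 samples, the estimates land close to the true ensemble values (2 and 9), and the lag-5 autocovariance estimate is near zero, as ergodicity predicts for this process.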
Properties of correlation and covariance sequences
$\phi_{xx}[m] = E\{x[n+m]\, x^*[n]\}$
$\gamma_{xx}[m] = E\{(x[n+m] - m_x)(x[n] - m_x)^*\}$
$\phi_{xy}[m] = E\{x[n+m]\, y^*[n]\}$
$\gamma_{xy}[m] = E\{(x[n+m] - m_x)(y[n] - m_y)^*\}$
Property 1:
$\gamma_{xx}[m] = \phi_{xx}[m] - |m_x|^2$
$\gamma_{xy}[m] = \phi_{xy}[m] - m_x m_y^*$
Properties of correlation and covariance sequences (continued)
Property 2:
$\phi_{xx}[0] = E\{|x[n]|^2\}$  (mean squared value)
$\gamma_{xx}[0] = \sigma_x^2$  (variance)
Property 3:
$\phi_{xx}[-m] = \phi_{xx}^*[m]$,  $\gamma_{xx}[-m] = \gamma_{xx}^*[m]$
$\phi_{xy}[-m] = \phi_{yx}^*[m]$,  $\gamma_{xy}[-m] = \gamma_{yx}^*[m]$
Properties of correlation and covariance sequences (continued)
Property 4:
$|\phi_{xy}[m]|^2 \le \phi_{xx}[0]\, \phi_{yy}[0]$
$|\gamma_{xy}[m]|^2 \le \gamma_{xx}[0]\, \gamma_{yy}[0]$
$|\phi_{xx}[m]| \le \phi_{xx}[0]$
$|\gamma_{xx}[m]| \le \gamma_{xx}[0]$
Properties of correlation and covariance sequences (continued)
Property 5:
If $y[n] = x[n - n_0]$, then
$\phi_{yy}[m] = \phi_{xx}[m]$
$\gamma_{yy}[m] = \gamma_{xx}[m]$
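Properties 3 and 4 can be illustrated numerically with biased sample estimates of $\phi_{xx}[m]$ from one realization. This is an illustration under assumptions (a made-up correlated process), not a proof.

```python
import numpy as np

# Numerical illustration of Property 4, |phi_xx[m]| <= phi_xx[0], using
# biased sample estimates from one realization of an assumed (made-up)
# correlated process.
rng = np.random.default_rng(2)
L = 10_000
x = np.convolve(rng.standard_normal(L), [1.0, 0.5, 0.25], mode="same")

def phi_hat(x, m):
    """Biased estimate of phi_xx[m] = E{x[n+m] x[n]} for a real sequence.
    For real x, phi_xx[-m] = phi_xx[m] (Property 3), so |m| is used."""
    m = abs(m)
    return np.dot(x[m:], x[:x.size - m]) / x.size

vals = [phi_hat(x, m) for m in range(-5, 6)]
print(max(abs(v) for v in vals) <= phi_hat(x, 0))  # -> True
```

The bound holds for the biased (1/L-normalized) estimator as well, by the same Cauchy-Schwarz argument that proves Property 4 for the ensemble quantities.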
Fourier Transform Representation of Random Signals
Since the autocorrelation and autocovariance
sequences are all (aperiodic) one-dimensional
sequences, their Fourier transforms exist and are
bounded in $|\omega|$ (assuming the sequences are
absolutely summable).
Let the Fourier transforms of the autocorrelation
and autocovariance sequences be
$\phi_{xx}[m] \leftrightarrow \Phi_{xx}(e^{j\omega})$,  $\phi_{xy}[m] \leftrightarrow \Phi_{xy}(e^{j\omega})$
$\gamma_{xx}[m] \leftrightarrow \Gamma_{xx}(e^{j\omega})$,  $\gamma_{xy}[m] \leftrightarrow \Gamma_{xy}(e^{j\omega})$
Fourier Transform Representation of Random Signals (continued)
Consider the inverse Fourier transforms:
$\phi_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega$
$\gamma_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega$
Fourier Transform Representation of Random Signals (continued)
Consequently,
$E\{|x[n]|^2\} = \phi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega$
$\sigma_x^2 = \gamma_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, d\omega$
Denote $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$
to be the power density spectrum (or power
spectrum) of the random process x.
Power Density Spectrum
$E\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega$
The total area under the power density spectrum in
$[-\pi, \pi]$, divided by $2\pi$, is the total average power of
the signal.
$P_{xx}(\omega)$ is always real-valued, since $\phi_{xx}[m]$ is
conjugate symmetric.
For real-valued random processes, $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$
is both real and even.
Mean and Linear System
Consider an LTI system with impulse response
h[n]. If x[n] is a stationary random signal with
mean $m_x$, then the output y[n] is also a stationary
random signal with mean
$m_y[n] = E\{y[n]\} = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[n-k]\} = \sum_{k=-\infty}^{\infty} h[k]\, m_x[n-k]$
Since the input is stationary, $m_x[n-k] = m_x$, and
consequently,
$m_y = m_x \sum_{k=-\infty}^{\infty} h[k] = H(e^{j0})\, m_x$
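This relation is easy to check numerically. In the sketch below, the 5-tap FIR filter and the input statistics are assumptions for illustration; the point is that the output sample mean matches $m_x \sum_k h[k] = H(e^{j0})\, m_x$.

```python
import numpy as np

# Sketch: the output mean of an LTI system equals m_x * sum_k h[k],
# i.e. m_y = H(e^{j0}) * m_x.  The FIR filter and input statistics
# are assumed for illustration.
rng = np.random.default_rng(3)
h = np.array([0.2, 0.3, 0.1, 0.25, 0.15])   # sums to 1, so H(e^{j0}) = 1
m_x = 4.0
x = rng.normal(m_x, 1.0, size=200_000)      # stationary input, mean 4

y = np.convolve(x, h, mode="valid")         # steady-state output samples
m_y_hat = y.mean()
m_y_theory = m_x * h.sum()                  # = H(e^{j0}) * m_x

print(round(m_y_hat, 2), m_y_theory)
```

Because this particular filter has unit DC gain, the output mean equals the input mean; scaling the taps would scale the output mean by the same factor.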
Stationary and Linear System
If x[n] is a real and stationary random signal, the
autocorrelation function of the output process is
$\phi_{yy}[n, n+m] = E\{y[n]\, y[n+m]\}$
$= E\left\{\sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, x[n-k]\, x[n+m-r]\right\}$
$= \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, E\{x[n-k]\, x[n+m-r]\}$
Since x[n] is stationary, $E\{x[n-k]\, x[n+m-r]\}$
depends only on the time difference $m + k - r$.
Stationary and Linear System (continued)
Therefore,
$\phi_{yy}[n, n+m] = \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, \phi_{xx}[m+k-r] = \phi_{yy}[m]$
The output autocorrelation depends only on the time
difference, so the output process is also stationary.
Generally, for an LTI system having a wide-sense
stationary input, the output is also wide-sense
stationary.
Power Density Spectrum and Linear System
By substituting $l = r - k$,
$\phi_{yy}[m] = \sum_{l=-\infty}^{\infty} \phi_{xx}[m-l] \sum_{k=-\infty}^{\infty} h[k]\, h[l+k] = \sum_{l=-\infty}^{\infty} \phi_{xx}[m-l]\, c_{hh}[l]$
where
$c_{hh}[l] = \sum_{k=-\infty}^{\infty} h[k]\, h[l+k]$
A sequence of the form of $c_{hh}[l]$ is called a
deterministic autocorrelation sequence.
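For an FIR filter, $c_{hh}[l]$ is just the correlation of the tap vector with itself. A minimal sketch, with an assumed 3-tap filter: for unit-variance white-noise input, $\phi_{xx}[m] = \delta[m]$, so the convolution $\phi_{yy}[m] = (\phi_{xx} * c_{hh})[m]$ collapses to $\phi_{yy}[m] = c_{hh}[m]$, which we check at lag 0.

```python
import numpy as np

# Sketch: c_hh[l] = sum_k h[k] h[l+k] via np.correlate, checked against a
# time-average estimate of phi_yy[0] for unit-variance white-noise input
# (where phi_xx[m] = delta[m] and hence phi_yy[m] = c_hh[m]).
# The 3-tap filter is an assumed example.
rng = np.random.default_rng(4)
h = np.array([1.0, -0.5, 0.25])

c_hh = np.correlate(h, h, mode="full")   # lags l = -2..2; even in l
x = rng.standard_normal(500_000)
y = np.convolve(x, h, mode="valid")

phi_yy0_hat = np.mean(y * y)             # time-average estimate of phi_yy[0]
print(c_hh[2], round(phi_yy0_hat, 2))    # center entry is c_hh[0] = sum h^2
```

The symmetry of `c_hh` in the lag variable mirrors the evenness of the deterministic autocorrelation for real h.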
Power Density Spectrum and Linear System (continued)
Since $\phi_{yy}[m]$ is the convolution of $\phi_{xx}[m]$ with $c_{hh}[l]$,
taking Fourier transforms gives
$\Phi_{yy}(e^{j\omega}) = C_{hh}(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$
where $C_{hh}(e^{j\omega})$ is the Fourier transform of $c_{hh}[l]$.
For real h,
$c_{hh}[l] = h[l] * h[-l]$
Thus
$C_{hh}(e^{j\omega}) = H(e^{j\omega})\, H^*(e^{j\omega}) = |H(e^{j\omega})|^2$
Power Density Spectrum and Linear System (continued)
We have the following relation between the input and
the output power spectra:
$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})$
$E\{|x[n]|^2\} = \phi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega$ = total average power of the input
$E\{|y[n]|^2\} = \phi_{yy}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega$ = total average power of the output
Power Density Property
Key property: the area under $P_{xx}(\omega)$ over a band of
frequencies, $\omega_a < |\omega| < \omega_b$, is proportional to the
power of the signal in that band.
To show this, consider an ideal band-pass filter. Let
$H(e^{j\omega})$ be the frequency response of the ideal band-pass
filter for the band $\omega_a < |\omega| < \omega_b$.
Note that $|H(e^{j\omega})|^2$ and $\Phi_{xx}(e^{j\omega})$ are both even
functions. Hence,
$\phi_{yy}[0]$ = average power in output
$= \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega = \frac{1}{\pi} \int_{\omega_a}^{\omega_b} \Phi_{xx}(e^{j\omega})\, d\omega$
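This band-power property can be sketched numerically. For white noise the power density is flat, $P_{xx}(\omega) = \sigma_x^2$, so the power passed by an ideal band-pass filter over $\omega_a < |\omega| < \omega_b$ should be $\sigma_x^2 (\omega_b - \omega_a)/\pi$. The FFT-bin masking below is an assumed, idealized way to realize the band-pass filter, used only for this check.

```python
import numpy as np

# Sketch: white noise (flat P_xx = sigma^2) through an ideal band-pass
# filter over wa < |w| < wb should carry power sigma^2 * (wb - wa) / pi.
# The filter is realized (idealized) by masking FFT bins.
rng = np.random.default_rng(7)
N = 1 << 18
sigma2 = 1.0
wa, wb = 0.2 * np.pi, 0.5 * np.pi

x = rng.normal(0.0, np.sqrt(sigma2), size=N)
w = 2 * np.pi * np.fft.fftfreq(N)             # bin frequencies in [-pi, pi)
mask = (np.abs(w) > wa) & (np.abs(w) < wb)    # ideal band-pass response

y = np.fft.ifft(np.fft.fft(x) * mask).real    # filtered signal (real input)
power = np.mean(y * y)

print(round(power, 2), round(sigma2 * (wb - wa) / np.pi, 2))
```

Both printed values are near 0.3, the fraction of the total power lying in the selected band.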
White Noise (or White Gaussian Noise)
A white-noise signal is a signal for which
$\phi_{xx}[m] = \sigma_x^2\, \delta[m]$
Hence, its samples at different instants of time are
uncorrelated.
The power spectrum of a white-noise signal is a
constant:
$\Phi_{xx}(e^{j\omega}) = \sigma_x^2$
The concept of white noise is very useful in
quantization-error analysis.
White Noise (continued)
The average power of a white-noise signal is therefore
$\phi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} \sigma_x^2\, d\omega = \sigma_x^2$
White noise is also useful in the representation
of random signals whose power spectra are not
constant with frequency.
A random signal y[n] with power spectrum $\Phi_{yy}(e^{j\omega})$ can
be modeled as the output of a linear time-invariant
system with a white-noise input:
$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \sigma_x^2$
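Spectrum shaping by filtering white noise can be illustrated with an averaged periodogram. The 3-tap low-pass filter, segment length, and segment count below are assumptions for the sketch; the averaged periodogram of the output should approach $|H(e^{j\omega})|^2 \sigma_x^2$.

```python
import numpy as np

# Sketch: shape white noise with an LTI filter, then check that the
# averaged periodogram of the output approaches |H(e^{jw})|^2 * sigma^2.
# The filter and segment sizes are assumed for illustration.
rng = np.random.default_rng(5)
h = np.array([0.5, 1.0, 0.5])        # simple low-pass FIR
sigma2 = 2.0
N, K = 256, 2000                     # segment length, number of segments

x = rng.normal(0.0, np.sqrt(sigma2), size=N * K)
y = np.convolve(x, h, mode="same")

segs = y.reshape(K, N)
P_hat = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / N

H = np.fft.fft(h, N)
P_theory = np.abs(H) ** 2 * sigma2   # |H(e^{jw})|^2 * sigma_x^2

print(round(P_hat[0], 1), round(P_theory[0], 1))   # compare at w = 0
```

At $\omega = 0$ the filter gain is $\sum_k h[k] = 2$, so the shaped spectrum is $4 \cdot 2 = 8$ there, while at $\omega = \pi$ this filter nulls the spectrum; the estimate reproduces both features.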
Cross-correlation
The cross-correlation between the input and output of
an LTI system:
$\phi_{xy}[m] = E\{x[n]\, y[n+m]\} = E\left\{x[n] \sum_{k=-\infty}^{\infty} h[k]\, x[n+m-k]\right\} = \sum_{k=-\infty}^{\infty} h[k]\, \phi_{xx}[m-k]$
That is, the cross-correlation between the input and
output is the convolution of the impulse response
with the input autocorrelation sequence.
Cross-correlation (continued)
By further taking the Fourier transform on both sides
of the above equation, we have
$\Phi_{xy}(e^{j\omega}) = H(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$
This result has a useful application when the input is
white noise with variance $\sigma_x^2$:
$\phi_{xy}[m] = \sigma_x^2\, h[m]$,  $\Phi_{xy}(e^{j\omega}) = \sigma_x^2\, H(e^{j\omega})$
These equations serve as the basis for estimating the
impulse or frequency response of an LTI system when it is
possible to observe the output of the system in response to
a white-noise input.
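The identity $\phi_{xy}[m] = \sigma_x^2\, h[m]$ suggests a simple system-identification sketch: drive an unknown system with white noise and estimate the cross-correlation by a time average. The 4-tap "unknown" filter below is an assumed example.

```python
import numpy as np

# Sketch: estimate an unknown FIR impulse response by cross-correlating
# the output with a white-noise input, using phi_xy[m] = sigma_x^2 * h[m].
# The "unknown" filter h_true is an assumed example.
rng = np.random.default_rng(6)
h_true = np.array([0.7, -0.3, 0.2, 0.1])
sigma2 = 1.0

x = rng.normal(0.0, np.sqrt(sigma2), size=400_000)
y = np.convolve(x, h_true)[: len(x)]     # causal FIR output

# phi_xy[m] = E{x[n] y[n+m]} estimated as a time average, m = 0..3
h_hat = np.array([np.mean(x[: len(x) - m] * y[m:]) for m in range(4)]) / sigma2

print(np.round(h_hat, 2))
```

With 400 000 samples the estimated taps match `h_true` to a couple of decimal places, which is exactly the white-noise identification method the slide describes.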
Remaining Materials Not Included
The materials from Chap. 4 will be taught in
class without using slides.