Modulation, Demodulation and Coding Course


Digital Communications I: Modulation and Coding Course
Spring 2013
Jeffrey N. Denenberg
Lecture 3c: Signal Detection in AWGN
Last time we talked about:

- Receiver structure
- Impact of AWGN and ISI on the transmitted signal
- Optimum filter to maximize SNR
- Matched filter and correlator receiver
- Signal space used for detection
  - Orthogonal N-dimensional space
  - Signal-to-waveform transformation and vice versa
Today we are going to talk about:

- Signal detection in AWGN channels
  - Minimum distance detector
  - Maximum likelihood
- Average probability of symbol error
  - Union bound on error probability
  - Upper bound on error probability based on the minimum distance
Detection of signal in AWGN

Detection problem: Given the observation vector $\mathbf{z}$, perform a mapping from $\mathbf{z}$ to an estimate $\hat{m}$ of the transmitted symbol, $m_i$, such that the average probability of error in the decision is minimized.

[Block diagram: $m_i$ -> Modulator -> $\mathbf{s}_i$ -> (+ noise $\mathbf{n}$) -> $\mathbf{z}$ -> Decision rule -> $\hat{m}$]
Statistics of the observation vector

AWGN channel model: $\mathbf{z} = \mathbf{s}_i + \mathbf{n}$

- The signal vector $\mathbf{s}_i = (a_{i1}, a_{i2}, \ldots, a_{iN})$ is deterministic.
- The elements of the noise vector $\mathbf{n} = (n_1, n_2, \ldots, n_N)$ are i.i.d. Gaussian random variables with zero mean and variance $N_0/2$. The noise vector pdf is

$$p_{\mathbf{n}}(\mathbf{n}) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{n}\|^2}{N_0}\right)$$

- The elements of the observed vector $\mathbf{z} = (z_1, z_2, \ldots, z_N)$ are independent Gaussian random variables. Its pdf is

$$p_{\mathbf{z}}(\mathbf{z} \mid \mathbf{s}_i) = \frac{1}{(\pi N_0)^{N/2}} \exp\left(-\frac{\|\mathbf{z} - \mathbf{s}_i\|^2}{N_0}\right)$$
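This conditional pdf is what every decision rule in this lecture builds on. As a minimal numerical sketch (not part of the original slides; the function name and the use of NumPy are my own assumptions):

```python
import numpy as np

def awgn_likelihood(z, s_i, N0):
    """Evaluate p_z(z | s_i) = (pi*N0)^(-N/2) * exp(-||z - s_i||^2 / N0).

    Each noise component is Gaussian with zero mean and variance N0/2,
    which is where the (pi*N0) normalization comes from.
    """
    z = np.asarray(z, dtype=float)
    s_i = np.asarray(s_i, dtype=float)
    N = z.size
    return (np.pi * N0) ** (-N / 2) * np.exp(-np.sum((z - s_i) ** 2) / N0)

# An observation close to s_i is more likely than one far from it:
s = np.array([1.0, 0.0])
print(awgn_likelihood([0.9, 0.1], s, N0=0.5))   # larger
print(awgn_likelihood([-1.0, 0.0], s, N0=0.5))  # smaller
```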
Detection

Optimum decision rule (maximum a posteriori probability):

Set $\hat{m} = m_i$ if $\Pr(m_i \text{ sent} \mid \mathbf{z}) \geq \Pr(m_k \text{ sent} \mid \mathbf{z})$, for all $k \neq i$, where $k = 1, \ldots, M$.

Applying Bayes' rule gives:

Set $\hat{m} = m_i$ if $\dfrac{p_k \, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}$ is maximum for $k = i$, where $p_k = \Pr(m_k \text{ sent})$.
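To make the MAP rule concrete, here is a hedged sketch (not from the slides; the function name and toy constellation are illustrative). For Gaussian noise, maximizing $p_k \, p_{\mathbf{z}}(\mathbf{z} \mid m_k)$ is equivalent to maximizing $\ln p_k - \|\mathbf{z} - \mathbf{s}_k\|^2 / N_0$, since terms common to all $k$ cancel:

```python
import numpy as np

def map_detect(z, signal_vectors, priors, N0):
    """MAP detection in AWGN: choose k maximizing ln p_k - ||z - s_k||^2 / N0."""
    z = np.asarray(z, dtype=float)
    S = np.asarray(signal_vectors, dtype=float)
    metrics = np.log(np.asarray(priors)) - np.sum((S - z) ** 2, axis=1) / N0
    return int(np.argmax(metrics))

# A strong prior on m_1 pulls a borderline observation into its region:
S = np.array([[1.0], [-1.0]])  # antipodal signal vectors
print(map_detect([-0.05], S, priors=[0.9, 0.1], N0=1.0))  # -> 0 despite z < 0
```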
Detection …

Partition the signal space into $M$ decision regions, $Z_1, \ldots, Z_M$, such that

Vector $\mathbf{z}$ lies inside region $Z_i$ if $\ln\left[\dfrac{p_k \, p_{\mathbf{z}}(\mathbf{z} \mid m_k)}{p_{\mathbf{z}}(\mathbf{z})}\right]$ is maximum for $k = i$.

That means $\hat{m} = m_i$.
Detection (ML rule)

For equally probable symbols, the optimum decision rule (maximum a posteriori probability) is simplified to:

Set $\hat{m} = m_i$ if $p_{\mathbf{z}}(\mathbf{z} \mid m_k)$ is maximum for $k = i$

or equivalently:

Set $\hat{m} = m_i$ if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$

which is known as the maximum likelihood (ML) rule.
Detection (ML)…

Partition the signal space into $M$ decision regions, $Z_1, \ldots, Z_M$.

Restate the maximum likelihood decision rule as follows:

Vector $\mathbf{z}$ lies inside region $Z_i$ if $\ln[p_{\mathbf{z}}(\mathbf{z} \mid m_k)]$ is maximum for $k = i$.

That means $\hat{m} = m_i$.
Detection rule (ML)…

It can be simplified to:

Vector $\mathbf{z}$ lies inside region $Z_i$ if $\|\mathbf{z} - \mathbf{s}_k\|$ is minimum for $k = i$

or equivalently:

Vector $\mathbf{z}$ lies inside region $Z_i$ if $\displaystyle\sum_{j=1}^{N} z_j a_{kj} - \frac{1}{2} E_k$ is maximum for $k = i$

where $E_k$ is the energy of $s_k(t)$.
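The two forms are equivalent because $\|\mathbf{z} - \mathbf{s}_k\|^2 = \|\mathbf{z}\|^2 - 2\langle \mathbf{z}, \mathbf{s}_k\rangle + E_k$ and $\|\mathbf{z}\|^2$ is common to all $k$. A hedged sketch of both detectors (names and the test constellation are illustrative, not from the slides):

```python
import numpy as np

def detect_min_distance(z, S):
    """Pick the index k minimizing ||z - s_k||."""
    return int(np.argmin(np.linalg.norm(S - z, axis=1)))

def detect_correlation(z, S):
    """Equivalent metric: maximize sum_j z_j a_kj - E_k / 2."""
    energies = np.sum(S ** 2, axis=1)  # E_k for each signal vector
    return int(np.argmax(S @ z - 0.5 * energies))

# The two rules always agree on the decision:
S = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
z = np.array([0.8, -0.3])
assert detect_min_distance(z, S) == detect_correlation(z, S)
print(detect_min_distance(z, S))  # -> 0
```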
Maximum likelihood detector block diagram

[Block diagram: the observation $\mathbf{z}$ is correlated with each signal vector to form $\langle \mathbf{z}, \mathbf{s}_1 \rangle - \frac{1}{2}E_1, \ldots, \langle \mathbf{z}, \mathbf{s}_M \rangle - \frac{1}{2}E_M$; the detector chooses the largest and outputs $\hat{m}$.]
Schematic example of the ML decision regions

[Figure: four signal vectors $\mathbf{s}_1, \ldots, \mathbf{s}_4$ in the $(\psi_1(t), \psi_2(t))$ plane with the corresponding decision regions $Z_1, \ldots, Z_4$.]
Average probability of symbol error

Erroneous decision: For the transmitted symbol $m_i$, or equivalently the signal vector $\mathbf{s}_i$, an error in decision occurs if the observation vector $\mathbf{z}$ does not fall inside region $Z_i$.

Probability of erroneous decision for a transmitted symbol:

$$P_e(m_i) = \Pr(\hat{m} \neq m_i \text{ and } m_i \text{ sent})$$

or equivalently

$$\Pr(\hat{m} \neq m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ does not lie inside } Z_i \mid m_i \text{ sent})$$

Probability of correct decision for a transmitted symbol:

$$\Pr(\hat{m} = m_i) = \Pr(m_i \text{ sent}) \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent})$$

$$P_c(m_i) = \Pr(\mathbf{z} \text{ lies inside } Z_i \mid m_i \text{ sent}) = \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i) \, d\mathbf{z}$$

$$P_e(m_i) = 1 - P_c(m_i)$$
Av. prob. of symbol error …

Average probability of symbol error:

$$P_E(M) = \sum_{i=1}^{M} \Pr(\hat{m} \neq m_i)$$

For equally probable symbols:

$$P_E(M) = \frac{1}{M} \sum_{i=1}^{M} P_e(m_i) = 1 - \frac{1}{M} \sum_{i=1}^{M} P_c(m_i) = 1 - \frac{1}{M} \sum_{i=1}^{M} \int_{Z_i} p_{\mathbf{z}}(\mathbf{z} \mid m_i) \, d\mathbf{z}$$
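This average can be checked by simulation. A minimal Monte Carlo sketch under the lecture's assumptions (equally probable symbols, minimum-distance detection; all names are illustrative):

```python
import numpy as np

def simulate_ser(S, N0, n_trials=200_000, seed=0):
    """Monte Carlo estimate of P_E(M) for minimum-distance detection in AWGN."""
    rng = np.random.default_rng(seed)
    S = np.asarray(S, dtype=float)
    M, N = S.shape
    sent = rng.integers(M, size=n_trials)  # equally probable symbols
    z = S[sent] + rng.normal(0.0, np.sqrt(N0 / 2), (n_trials, N))  # z = s_i + n
    detected = np.argmin(np.linalg.norm(z[:, None, :] - S, axis=2), axis=1)
    return np.mean(detected != sent)

S = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
print(simulate_ser(S, N0=0.2))
```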
Example for binary PAM

[Figure: conditional pdfs $p_{\mathbf{z}}(z \mid m_2)$ and $p_{\mathbf{z}}(z \mid m_1)$ centered on $s_2 = -\sqrt{E_b}$ and $s_1 = +\sqrt{E_b}$ along the $\psi_1(t)$ axis.]

$$P_e(m_1) = P_e(m_2) = Q\left(\frac{\|\mathbf{s}_1 - \mathbf{s}_2\| / 2}{\sqrt{N_0 / 2}}\right)$$

$$P_B = P_E(2) = Q\left(\sqrt{\frac{2 E_b}{N_0}}\right)$$
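A quick numeric check of this closed form (a sketch; SciPy's `erfc` gives the Q-function via $Q(x) = \frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$, and the helper names are my own):

```python
import numpy as np
from scipy.special import erfc

def qfunc(x):
    """Gaussian Q-function: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2))

def binary_pam_pb(ebn0_db):
    """P_B = Q(sqrt(2 Eb/N0)) for antipodal binary PAM."""
    ebn0 = 10 ** (ebn0_db / 10)
    return qfunc(np.sqrt(2 * ebn0))

for db in (0, 4, 8):
    print(f"Eb/N0 = {db} dB -> P_B = {binary_pam_pb(db):.3e}")
```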
Union bound

Union bound: The probability of a finite union of events is upper bounded by the sum of the probabilities of the individual events.

Let $A_{ki}$ denote the event that the observation vector $\mathbf{z}$ is closer to the symbol vector $\mathbf{s}_k$ than to $\mathbf{s}_i$, when $\mathbf{s}_i$ is transmitted. $\Pr(A_{ki}) = P_2(\mathbf{s}_k, \mathbf{s}_i)$ depends only on $\mathbf{s}_i$ and $\mathbf{s}_k$.

Applying the union bound yields

$$P_e(m_i) \leq \sum_{\substack{k=1 \\ k \neq i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)$$

$$P_E(M) \leq \frac{1}{M} \sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \neq i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i)$$
Example of union bound

[Figure: a four-symbol constellation $\mathbf{s}_1, \ldots, \mathbf{s}_4$ with decision regions $Z_1, \ldots, Z_4$, and three panels showing the half-plane regions $A_2$, $A_3$, $A_4$ used in the pairwise error terms.]

$$P_e(m_1) = \int_{Z_2 \cup Z_3 \cup Z_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}$$

Union bound:

$$P_e(m_1) \leq \sum_{k=2}^{4} P_2(\mathbf{s}_k, \mathbf{s}_1)$$

where

$$P_2(\mathbf{s}_2, \mathbf{s}_1) = \int_{A_2} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}, \quad
P_2(\mathbf{s}_3, \mathbf{s}_1) = \int_{A_3} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}, \quad
P_2(\mathbf{s}_4, \mathbf{s}_1) = \int_{A_4} p_{\mathbf{r}}(\mathbf{r} \mid m_1) \, d\mathbf{r}$$
Upper bound based on minimum distance

$$P_2(\mathbf{s}_k, \mathbf{s}_i) = \Pr(\mathbf{z} \text{ is closer to } \mathbf{s}_k \text{ than to } \mathbf{s}_i \text{, when } \mathbf{s}_i \text{ is sent}) = \int_{d_{ik}/2}^{\infty} \frac{1}{\sqrt{\pi N_0}} \exp\left(-\frac{u^2}{N_0}\right) du = Q\left(\frac{d_{ik}}{\sqrt{2 N_0}}\right)$$

where $d_{ik} = \|\mathbf{s}_i - \mathbf{s}_k\|$.

$$P_E(M) \leq \frac{1}{M} \sum_{i=1}^{M} \sum_{\substack{k=1 \\ k \neq i}}^{M} P_2(\mathbf{s}_k, \mathbf{s}_i) \leq (M - 1) \, Q\left(\frac{d_{\min}}{\sqrt{2 N_0}}\right)$$

Minimum distance in the signal space:

$$d_{\min} = \min_{\substack{i,k \\ i \neq k}} d_{ik}$$
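Both bounds are easy to compute for any constellation. A hedged sketch (helper names are my own; the QPSK-like signal set from the next slide is used as the test case):

```python
import numpy as np
from scipy.special import erfc

def qfunc(x):
    return 0.5 * erfc(x / np.sqrt(2))

def pairwise_distances(S):
    """Matrix of d_ik = ||s_i - s_k|| for all symbol pairs."""
    S = np.asarray(S, dtype=float)
    return np.linalg.norm(S[:, None, :] - S[None, :, :], axis=2)

def union_bound(S, N0):
    """P_E(M) <= (1/M) sum_i sum_{k != i} Q(d_ik / sqrt(2 N0))."""
    d = pairwise_distances(S)
    off_diag = ~np.eye(len(S), dtype=bool)
    return np.sum(qfunc(d[off_diag] / np.sqrt(2 * N0))) / len(S)

def min_distance_bound(S, N0):
    """Looser bound: P_E(M) <= (M - 1) Q(d_min / sqrt(2 N0))."""
    d = pairwise_distances(S)
    d_min = np.min(d[~np.eye(len(S), dtype=bool)])
    return (len(S) - 1) * qfunc(d_min / np.sqrt(2 * N0))

Es, N0 = 1.0, 0.1
S = np.sqrt(Es) * np.array([[1, 0], [0, 1], [-1, 0], [0, -1]])
print(union_bound(S, N0), min_distance_bound(S, N0))
```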
Example of upper bound on av. symbol error prob. based on union bound

[Figure: four signal vectors on the $(\psi_1(t), \psi_2(t))$ axes at $\pm\sqrt{E_s}$, with the pairwise distances $d_{1,2}, d_{2,3}, d_{3,4}, d_{1,4}$ labeled.]

$$\|\mathbf{s}_i\| = \sqrt{E_i} = \sqrt{E_s}, \quad i = 1, \ldots, 4$$

$$d_{i,k} = \sqrt{2 E_s}, \quad i \neq k$$

$$d_{\min} = \sqrt{2 E_s}$$
Eb/No figure of merit in digital communications

SNR or S/N is the ratio of average signal power to average noise power. SNR should be restated in terms of bit energy in a DCS, because:

- Signals are transmitted within a symbol duration and hence are energy signals (zero average power).
- A figure of merit at the bit level facilitates comparison of different DCSs transmitting different numbers of bits per symbol.

$$\frac{E_b}{N_0} = \frac{S T_b}{N / W} = \frac{S}{N} \cdot \frac{W}{R_b}$$

where $R_b = 1/T_b$ is the bit rate and $W$ is the bandwidth.
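A sketch of this conversion (the function name and sample numbers are my own):

```python
import math

def ebn0_db_from_snr(snr_db, bandwidth_hz, bit_rate_bps):
    """Eb/N0 = (S/N) * (W / Rb), returned in dB."""
    snr = 10 ** (snr_db / 10)
    return 10 * math.log10(snr * bandwidth_hz / bit_rate_bps)

# Example: 10 dB SNR in a 1 MHz channel carrying 500 kb/s -> ~13 dB Eb/N0
print(ebn0_db_from_snr(10.0, 1e6, 500e3))
```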
Example of symbol error prob. for PAM signals

[Figure: Binary PAM constellation with $s_2 = -\sqrt{E_b}$ and $s_1 = +\sqrt{E_b}$ on the $\psi_1(t)$ axis; 4-ary PAM constellation with $s_4 = -6\sqrt{E_b/5}$, $s_3 = -2\sqrt{E_b/5}$, $s_2 = +2\sqrt{E_b/5}$, $s_1 = +6\sqrt{E_b/5}$; basis function $\psi_1(t) = \sqrt{1/T}$ for $0 \leq t \leq T$.]