Robust Entropy Rate for Uncertain
Sources: Applications to Communication
and Control Systems
Charalambos D. Charalambous
Dept. of Electrical and Computer Engineering
University of Cyprus, Cyprus.
Also with the School of Information Technology and Engineering
University of Ottawa, Canada.
and
Alireza Farhadi
School of Information Technology and Engineering
University of Ottawa, Canada.
CWIT 2005
Overview
Robust Entropy
Examples
Solution and Relations to Renyi Entropy
Uncertainty in Distribution
Uncertainty in Frequency Response
Stabilization of Communication-Control Systems
Applications
Applications: Robust Lossless Coding
Theorem [1]. Source words of block length n produced by a discrete uncertain memoryless source f ∈ D_SU with Shannon entropy H_S(f) can be encoded into code words of block length r from a coding alphabet of size D with arbitrarily small probability of error if

    sup_{f ∈ D_SU} H_S(f) ≤ (r/n) log D.
[Block diagram: Uncertain Source (entropy H_S(f)) → X^n → Encoder → Decoder → X̂^r → Destination; the encoder/decoder pair is a uniform source encoder.]
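As a numeric sanity check of the rate condition above, the following sketch (illustrative function name and numbers, not from the slides) computes the smallest code word length r for which (r/n) log D covers the worst-case source entropy.

```python
import math

def min_code_length(sup_entropy_nats, n, D):
    """Smallest code word length r with sup_f H_S(f) <= (r/n) * log D (entropies in nats)."""
    return math.ceil(n * sup_entropy_nats / math.log(D))

# Worst-case entropy ln 3 (e.g. three equally likely symbols), binary code alphabet:
r = min_code_length(math.log(3), n=100, D=2)   # 100 * log2(3) ~ 158.5 symbols
```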
Applications: Observability and Stabilizability
of Networked Control Systems [2]
Definition. The uncertain source is uniformly asymptotically observable in probability if there exist an encoder and decoder such that

    lim_{t→∞} sup_{f ∈ D_SU} (1/t) Σ_{k=0}^{t−1} Pr(||Y_k − Ỹ_k||₂ ≥ δ) ≤ ε,

where f ∈ D_SU is the joint density of (Y_0, ..., Y_{t−1})^tr, D_SU is the source density uncertainty set, δ > 0 and 0 < ε < 1 are fixed, and ||y||₂ ≜ (y^tr y)^{1/2}.
Applications: Observability and Stabilizability
of Networked Control Systems [2]
Definition. The uncertain controlled source is uniformly asymptotically stabilizable in probability if there exist an encoder, decoder and controller such that

    lim_{t→∞} sup_{f ∈ D_SU} (1/t) Σ_{k=0}^{t−1} Pr(||Y_k||₂ ≥ δ) ≤ ε.

For δ and ε small enough, a necessary condition for uniform observability and stabilizability in probability is

    C_robust ≥ lim_{t→∞} (1/t) sup_{f ∈ D_SU} H_S(f) ≜ H̄_robust(·).
Problem Formulation,
Solution and Relations
Problem Formulation
Let D be the space of density functions defined on ℝ^d, let f(y) be the true unknown source density, belonging to the uncertainty set D_SU ⊂ D, and let g(y) be the fixed nominal source density.

Definition. The robust entropy of an observed process Y having density function f(y) ∈ D_SU is defined by

    H_robust(f*) ≜ sup_{f ∈ D_SU} H_S(f),

where H_S(·) is the Shannon entropy and f* = arg sup_{f ∈ D_SU} H_S(f).
Problem Formulation
Definition. In the previous definition, if Y = (Y_0, ..., Y_{T−1})^tr represents a sequence of T random variables of source symbols produced by the uncertain source with joint density f(y) ∈ D_SU (y ∈ ℝ^{Td}), the robust entropy rate is defined by

    H̄_robust(·) ≜ lim_{T→∞} (1/T) H_robust(f*),

provided the limit exists.
Solution to the Robust Entropy
Let the uncertainty set be defined by D_SU ≜ {f ∈ D; H(f|g) ≤ R_c}, where H(·|·) is the relative entropy and R_c ∈ [0, ∞) is fixed.

Lemma. Given a fixed nominal density g(y) and the uncertainty set D_SU defined above, the robust entropy is given by

    H_robust(f^{*,s*}) = min_{s ≥ 0} [ s R_c + (1+s) ln ∫ g(y)^{s/(1+s)} dy ],

    f^{*,s}(y) = g(y)^{s/(1+s)} / ∫ g(y)^{s/(1+s)} dy,

where the minimizing s* ≥ 0 above is the unique solution of

    H(f^{*,s*} | g) = R_c.
Solution to the Robust Entropy
Corollary [1]. Suppose R_c ≤ H(h|g), where h(y) is the uniform Probability Mass Function (PMF), that is,

    h(y) = Σ_{i=1}^{M} h(y_i) δ(y_i),  h(y_i) = 1/M.

When g(y), and consequently f(y), correspond to PMFs, that is,

    g(y) = Σ_{i=1}^{M} g(y_i) δ(y_i),  f(y) = Σ_{i=1}^{M} f(y_i) δ(y_i),

the previous solution reduces to

    H_robust(f^{*,s*}) = min_{s ≥ 0} [ s R_c + (1+s) ln Σ_{i=1}^{M} g(y_i)^{s/(1+s)} ],

    f^{*,s}(y_i) = g(y_i)^{s/(1+s)} / Σ_{j=1}^{M} g(y_j)^{s/(1+s)},  1 ≤ i ≤ M.
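The PMF form of the solution can be evaluated numerically. The sketch below is a hypothetical illustration (the nominal PMF and R_c are arbitrary choices): it minimizes s R_c + (1+s) ln Σ_i g(y_i)^{s/(1+s)} by ternary search, which is valid because the objective is a pointwise supremum of affine functions of s and hence convex.

```python
import math

def robust_entropy_pmf(g, Rc, s_lo=1e-6, s_hi=1e6, iters=200):
    """Minimize phi(s) = s*Rc + (1+s) * ln(sum_i g_i^(s/(1+s))) over s >= 0.

    phi is convex in s (a Lagrangian dual of maximizing entropy over
    {H(f|g) <= Rc}), so ternary search converges to the minimizer s*.
    """
    def phi(s):
        a = s / (1.0 + s)
        return s * Rc + (1.0 + s) * math.log(sum(gi ** a for gi in g))

    lo, hi = s_lo, s_hi
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if phi(m1) < phi(m2):
            hi = m2
        else:
            lo = m1
    s_star = 0.5 * (lo + hi)
    a = s_star / (1.0 + s_star)
    Z = sum(gi ** a for gi in g)
    f_star = [gi ** a / Z for gi in g]      # the maximizing (tilted) PMF f^{*,s*}
    return phi(s_star), f_star, s_star

g = [0.7, 0.2, 0.1]                          # nominal PMF (illustrative)
H_rob, f_star, s_star = robust_entropy_pmf(g, Rc=0.1)
H_g = -sum(gi * math.log(gi) for gi in g)    # Shannon entropy of the nominal PMF
rel_ent = sum(fi * math.log(fi / gi) for fi, gi in zip(f_star, g))  # ~= Rc (tight)
```

At the minimizer the relative-entropy constraint is active, so `rel_ent` recovers R_c, and the robust entropy sits strictly between the nominal entropy and ln M.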
Solution to the Robust Entropy Rate
Corollary. Let Y = (Y_0, ..., Y_{T−1})^tr be a sequence of T source symbols with uncertain joint density function f(y) ∈ D_SU, y ∈ ℝ^{Td}, and let R_c be exchanged with T R_c. Then the robust entropy rate is given by

    H̄_robust(·) = lim_{T→∞} (1/T) H_robust(f^{*,s*}),

    H_robust(f^{*,s*}) = min_{s ≥ 0} [ s T R_c + (1+s) ln ∫ g(y)^{s/(1+s)} dy ],

    f^{*,s}(y) = g(y)^{s/(1+s)} / ∫ g(y)^{s/(1+s)} dy,

where the minimizing s* ≥ 0 above is the unique solution of H(f^{*,s*} | g) = T R_c.
Robust Entropy Rate of Uncertain Sources
with Nominal Gaussian Source Density
Example. From the previous result it follows that if the nominal source density g(y) is a Td-dimensional Gaussian density function with mean m and covariance Γ̄_Y, and R_c ∈ [0, ∞), then

    (1/T) H_robust(f^{*,s*}) = (d/2) ln((1+s*)/s*) + (d/2) ln(2πe) + (1/(2T)) ln det Γ̄_Y,

where s* > 0 is the unique solution of the following nonlinear equation:

    R_c = d/(2s) − (d/2) ln((1+s)/s).
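A minimal sketch of the Gaussian example (illustrative numbers, not from the slides): the right-hand side of the nonlinear equation decreases monotonically in s, so s* can be found by bisection, after which the rate formula is direct.

```python
import math

def solve_s_star(Rc, d, iters=200):
    """Solve Rc = d/(2s) - (d/2) ln((1+s)/s); the RHS strictly decreases from +inf to 0."""
    def gap(s):
        return d / (2.0 * s) - (d / 2.0) * math.log((1.0 + s) / s) - Rc
    lo, hi = 1e-9, 1e9
    for _ in range(iters):
        mid = math.sqrt(lo * hi)     # bisect on a log scale
        if gap(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

def robust_entropy_rate_gauss(Rc, d, log_det_rate):
    """(d/2) ln((1+s*)/s*) + (d/2) ln(2*pi*e) + (1/2) * lim_T (1/T) ln det(cov)."""
    s = solve_s_star(Rc, d)
    return ((d / 2.0) * math.log((1.0 + s) / s)
            + (d / 2.0) * math.log(2.0 * math.pi * math.e)
            + 0.5 * log_det_rate)

# Rc -> 0 forces s* -> infinity and recovers the nominal Shannon entropy rate:
h0 = robust_entropy_rate_gauss(1e-12, d=1, log_det_rate=0.0)
h1 = robust_entropy_rate_gauss(0.5, d=1, log_det_rate=0.0)   # larger ball, larger rate
```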
Relation among Renyi, Tsallis and
Shannon Entropies
Definition. For α > 0, α ≠ 1 and f^α ∈ L₁, the Rényi entropy [3] is defined by

    H_R^α(f) ≜ (1/(1−α)) ln ∫ f^α(y) dy.

Moreover, the Tsallis entropy [4] is defined by

    H_T^α(f) ≜ (1/(1−α)) ( ∫ f^α(y) dy − 1 ).

We have the following relations among the Shannon, Rényi and Tsallis entropies:

    H_S(f) = lim_{α→1} H_R^α(f),

    H_T^α(f) = (1/(1−α)) [ e^{(1−α) H_R^α(f)} − 1 ],

    H_S(f) = lim_{α→1} H_T^α(f) = lim_{α→1} H_R^α(f).
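The identity linking the Tsallis and Rényi entropies, and their common Shannon limit, can be checked numerically for a PMF (illustrative distribution; α near 1 approximates the limit):

```python
import math

def renyi(p, alpha):
    """H_R^alpha(p) = (1/(1-alpha)) * ln sum_i p_i^alpha, alpha > 0, alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def tsallis(p, alpha):
    """H_T^alpha(p) = (1/(1-alpha)) * (sum_i p_i^alpha - 1)."""
    return (sum(pi ** alpha for pi in p) - 1.0) / (1.0 - alpha)

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]          # illustrative PMF
a = 0.999                      # alpha close to 1 approximates the Shannon limit
lhs = tsallis(p, a)
rhs = (math.exp((1.0 - a) * renyi(p, a)) - 1.0) / (1.0 - a)   # the stated identity
```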
Relation between Robust Entropy and
Renyi Entropy
The robust entropy found previously is related to the Rényi entropy as follows. Let α = s/(1+s); then

    H_robust(f^{*,s*}) = min_{α ∈ [0,1)} [ (α/(1−α)) R_c + H_R^α(g) ].

Moreover,

    min_{α ∈ [0,1)} H_R^α(g) ≤ H_robust(f^{*,s*}) ≤ (α/(1−α)) R_c + H_R^α(g),  ∀ α ∈ [0,1).
Examples
Examples: Partially Observed Gauss
Markov Nominal Source
Assumptions. Let f(y) and g(y) be the Probability Density Functions (PDFs) corresponding to a sequence of T symbols produced by the uncertain and nominal sources, respectively. Let also the uncertain source and nominal source be related by

    D_SU = { f ∈ D; (1/T) H(f|g) ≤ R_c }.
Example: Partially Observed Gauss
Markov Nominal Source
Nominal Source. The nominal density is induced by a partially observed Gauss Markov nominal source defined by

    X_{t+1} = A X_t + B W_t,  X_0,
    Y_t = C X_t + D V_t,  t ∈ {0, 1, ...}.

Assumptions. X_t ∈ ℝ^n is the unobserved process, Y_t ∈ ℝ^d is the observed process, W_t ∈ ℝ^m, V_t ∈ ℝ^l, W_t is i.i.d. ~ N(0, I_{m×m}), V_t is i.i.d. ~ N(0, I_{l×l}), X_0 ~ N(x̄_0, V̄_0), {X_0, V_t, W_t} are mutually independent for all t, (C, A) is detectable, (A, (B B^tr)^{1/2}) is stabilizable, and D ≠ 0.
Partially Observed Gauss Markov
Nominal Source: Lemmas
Lemma. Let Y: Ω → ℝ^d be a stationary Gaussian process with power spectral density S_Y(e^{jw}). Define

    Z_t ≜ Y_t − E[Y_t | Y^{t−1}],  Y^{t−1} ≜ {Y_0, ..., Y_{t−1}},  Λ_t ≜ Cov(Z_t),

and assume Λ ≜ lim_{t→∞} Λ_t exists. Then

    ln det Λ = (1/2π) ∫_{−π}^{π} ln det S_Y(e^{jw}) dw = lim_{T→∞} (1/T) ln det Γ̄_Y,

where Γ̄_Y ≜ Cov[(Y_0, Y_1, ..., Y_{T−1})^tr].
Partially Observed Gauss Markov
Nominal Source: Lemmas
Lemma. For the partially observed Gauss Markov nominal source,

    Λ = C V C^tr + D D^tr,

where V is the unique positive semi-definite solution of the following algebraic Riccati equation:

    V = A V A^tr − A V C^tr [C V C^tr + D D^tr]^{−1} C V A^tr + B B^tr.
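For the scalar case, the Riccati equation above can be solved by fixed-point iteration. In the sketch below the positive initial value V0 is an assumption made here to steer the iteration away from the spurious root V = 0 that appears when B = 0; the system numbers are illustrative.

```python
import math

def riccati_fixed_point(A, B, C, D, V0=1.0, iters=500):
    """Iterate the scalar algebraic Riccati map
    V <- A V A^tr - A V C^tr [C V C^tr + D D^tr]^{-1} C V A^tr + B B^tr
    until it settles at its positive fixed point."""
    V = V0
    for _ in range(iters):
        V = A * V * A - (A * V * C) * (C * V * A) / (C * V * C + D * D) + B * B
    return V

# Scalar unstable example with B = 0 (the setting of a later remark):
A, B, C, D = 1.5, 0.0, 1.0, 1.0
V = riccati_fixed_point(A, B, C, D)                # fixed point: D^2 (A^2 - 1) / C^2
Lam = C * V * C + D * D                            # Lambda = C V C^tr + D D^tr
rate = 0.5 * math.log(2 * math.pi * math.e * Lam)  # (1/2) ln(2*pi*e) + (1/2) ln det(Lambda)
```

For this unstable scalar system Λ = A²D², so the entropy rate picks up exactly the ln|A| term of the later remark.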
Partially Observed Gauss Markov Nominal
Source: Robust Entropy Rate
Proposition. The robust entropy rate of an uncertain source with corresponding partially observed Gauss Markov nominal source is

    H̄_robust(·) = (d/2) ln((1+s*)/s*) + H̄_S(·),

    H̄_S(·) = (d/2) ln(2πe) + (1/2) ln det Λ.

H̄_S(·) is the Shannon entropy rate of the nominal source, Λ is given in the previously mentioned Lemma, and s* > 0 is the unique solution of

    R_c = d/(2s) − (d/2) ln((1+s)/s).
Partially Observed Gauss Markov
Nominal Source: Remarks
Remark. For the scalar case with B = 0, after solving for V we obtain

    H̄_robust(·) = (1/2) ln((1+s*)/s*) + (1/2) ln(2πe D²) + max{0, ln |A|}.

Remark. The case R_c = 0 corresponds to s* → ∞. Letting s → ∞, we have H̄_robust(·) = H̄_S(·).
Example: Partially Observed Controlled
Gauss Markov Nominal Source
The nominal source is defined via a partially observed controlled Gauss Markov system given by

    X_{t+1} = A X_t + B U_t,  U_t = K Y_t,
    Y_t = C X_t + D V_t,  t ∈ {0, 1, ...}.

Assumptions. The uncertain source and nominal source are related by D_SU = {f ∈ D; (1/T) H(f|g) ≤ R_c}, K is a stabilizing matrix, X_t ∈ ℝ^n is the unobserved process, Y_t ∈ ℝ is the observed process, U_t, V_t ∈ ℝ, V_t is i.i.d. ~ N(0, 1), X_0 ~ N(x̄_0, V̄_0), {X_0, V_t} are mutually independent for all t, and D ≠ 0.
Partially Observed Controlled Gauss Markov
Nominal Source: Robust Entropy Rate
Proposition. Using the Bode integral formula [5] (i.e., the relation between the sensitivity transfer function and the unstable eigenvalues of the system), the robust entropy rate of the family of uncertain sources with corresponding partially observed controlled Gauss Markov nominal source is

    H̄_robust(·) = (1/2) ln((1+s*)/s*) + H̄_S(·),

    H̄_S(·) = (1/2) ln(2πe D²) + Σ_{i: |λ_i(A)| ≥ 1} ln |λ_i(A)|.

λ_i(A) are the eigenvalues of the system matrix A, and s* > 0 is the unique solution of

    R_c = d/(2s) − (d/2) ln((1+s)/s).
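The Shannon entropy rate above depends on A only through its unstable eigenvalues, which a short sketch makes concrete (eigenvalues and D chosen purely for illustration):

```python
import math

def nominal_entropy_rate(eigs, D):
    """H_S = (1/2) ln(2*pi*e*D^2) + sum of ln|lambda_i(A)| over |lambda_i(A)| >= 1."""
    unstable = sum(math.log(abs(lam)) for lam in eigs if abs(lam) >= 1.0)
    return 0.5 * math.log(2.0 * math.pi * math.e * D * D) + unstable

# Eigenvalues 2.0 and -1.5 are unstable and contribute; 0.5 does not:
h_nominal = nominal_entropy_rate([2.0, -1.5, 0.5], D=1.0)
```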
Example: Uncertain Sources in
Frequency Domain
Defining Spaces. Let D(1) ≜ {z; z ∈ ℂ, |z| > 1} and let H^∞ be the space of scalar, bounded, analytic functions of z ∈ D(1). This space is equipped with the norm ||·||_∞ defined by

    ||H(e^{jω})||_∞ ≜ sup_{−π ≤ ω ≤ π} |H(e^{jω})|,  (z = e^{jω}).

Assumptions. Suppose the uncertain source is obtained by passing a stationary Gaussian random process X: Ω → ℝ, with known power spectral density S_X(e^{jω}), through an uncertain linear filter H̃(z), where H̃(z) belongs to the following additive uncertainty model:

    H̃ ∈ D_ad ≜ { H̃ ∈ H^∞; H̃(z) = H(z) + Δ(z) W(z); H(z), Δ(z), W(z) ∈ H^∞;
                  H(z), W(z) are fixed, Δ(z) is unknown, and ||Δ||_∞ ≤ 1 }.
Uncertain Sources in Frequency
Domain: Robust Entropy Rate
Results. The observed process Y is a Gaussian random process with S_Y(e^{jω}) = |H̃(e^{jω})|² S_X(e^{jω}), and the Shannon entropy rate of the observed process is

    H̄_S(·) = (1/2) ln(2πe) + (1/4π) ∫_{−π}^{π} ln S_Y(e^{jω}) dω.

Subsequently, the robust entropy rate is defined via

    H̄_robust(·) = (1/2) ln(2πe) + (1/4π) sup_{H̃ ∈ D_ad} ∫_{−π}^{π} ln S_Y(e^{jω}) dω,

and the solution is given by [6]:

    H̄_robust(·) = (1/2) ln(2πe) + (1/4π) ∫_{−π}^{π} ln[ (|H(e^{jω})| + |W(e^{jω})|)² S_X(e^{jω}) ] dω.
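The frequency-domain formulas can be evaluated by numeric quadrature. The sketch below uses illustrative flat magnitudes |H| = 1, |W| = 0.5, S_X = 1 (not from the slides), for which the robust rate exceeds the nominal rate by exactly ln(1 + |W|/|H|):

```python
import math

def entropy_rate_from_psd(S, N=4096):
    """(1/2) ln(2*pi*e) + (1/(4*pi)) * integral_{-pi}^{pi} ln S(w) dw (midpoint rule)."""
    h = 2.0 * math.pi / N
    acc = sum(math.log(S(-math.pi + (k + 0.5) * h)) for k in range(N))
    return 0.5 * math.log(2.0 * math.pi * math.e) + acc * h / (4.0 * math.pi)

# Flat illustrative magnitudes: |H(e^{jw})| = 1, |W(e^{jw})| = 0.5, S_X(e^{jw}) = 1.
S_nominal = lambda w: 1.0 ** 2 * 1.0          # |H|^2 S_X
S_worst = lambda w: (1.0 + 0.5) ** 2 * 1.0    # (|H| + |W|)^2 S_X
h_nom = entropy_rate_from_psd(S_nominal)
h_rob = entropy_rate_from_psd(S_worst)
```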
Applications in Stabilizability of
Networked Control Systems
Observability and Stabilizability of
Networked Control Systems
Definition. The uncertain source is uniformly asymptotically observable in probability if there exist an encoder and decoder such that

    lim_{t→∞} sup_{f ∈ D_SU} (1/t) Σ_{k=0}^{t−1} Pr(||Y_k − Ỹ_k||₂ ≥ δ) ≤ ε.

Moreover, a controlled uncertain source is uniformly asymptotically stabilizable in probability if there exist an encoder, decoder and controller such that

    lim_{t→∞} sup_{f ∈ D_SU} (1/t) Σ_{k=0}^{t−1} Pr(||Y_k||₂ ≥ δ) ≤ ε.
Necessary Condition for Observability and Stabilizability of Networked Control Systems [2]
Proposition. A necessary condition for uniform asymptotic observability and stabilizability in probability is

    C_robust ≥ H̄_robust(·) − (1/2) ln[ (2πe)^d det Γ_g ].

Γ_g is the covariance matrix of the Gaussian distribution g(y) ~ N(0, Γ_g) (y ∈ ℝ^d) that satisfies

    ∫_{||y||₂ ≤ δ} g(y) dy = 1 − ε.
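In the scalar case (d = 1) the bound can be computed once Γ_g = σ² is pinned down by the concentration constraint. The sketch below assumes the constraint holds with equality and uses the two-sided Gaussian quantile; the function name and numbers are illustrative, not from the slides.

```python
import math
from statistics import NormalDist

def capacity_lower_bound(h_robust, delta, eps):
    """C_robust >= h_robust - (1/2) ln(2*pi*e*sigma^2) in the scalar (d = 1) case,
    with sigma chosen so that Pr(|Y| <= delta) = 1 - eps for Y ~ N(0, sigma^2)."""
    z = NormalDist().inv_cdf(1.0 - eps / 2.0)   # two-sided Gaussian quantile
    sigma2 = (delta / z) ** 2
    return h_robust - 0.5 * math.log(2.0 * math.pi * math.e * sigma2)

# Tightening the observation accuracy (smaller delta) raises the required capacity:
b_coarse = capacity_lower_bound(h_robust=1.0, delta=0.5, eps=0.05)
b_fine = capacity_lower_bound(h_robust=1.0, delta=0.05, eps=0.05)
```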
References
[1] C. D. Charalambous and F. Rezaei, "Robust Coding for Uncertain Sources," submitted to IEEE Transactions on Information Theory, 2004.
[2] C. D. Charalambous and A. Farhadi, "Mathematical Theory for Robust Control and Communication Subject to Uncertainty," preprint, 2005.
[3] A. Rényi, "On Measures of Entropy and Information," in Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. I, Berkeley, CA, pp. 547-561, 1961.
References
[4] C. Tsallis, "Possible Generalization of Boltzmann-Gibbs Statistics," Journal of Statistical Physics, vol. 52, pp. 479-487, 1988.
[5] B. F. Wu and E. A. Jonckheere, "A Simplified Approach to Bode's Theorem for Continuous-Time and Discrete-Time Systems," IEEE Transactions on Automatic Control, vol. 37, no. 11, pp. 1797-1802, Nov. 1992.
[6] C. D. Charalambous and A. Farhadi, "Robust Entropy and Robust Entropy Rate for Uncertain Sources: Applications in Networked Control Systems," preprint, 2005.