
The 1st annual (?) workshop
Communication under Channel Uncertainty: Oblivious channels
Michael Langberg
California Institute of Technology
Slide 2: Coding theory
• Error correcting codes: an encoder C: {0,1}^k → {0,1}^n.
• Sender X encodes message m ∈ {0,1}^k as x = C(m) ∈ {0,1}^n.
• Noise corrupts x; receiver Y gets y and decodes to recover m.
Slide 3: Communication channels
X transmits x; the channel imposes error e; Y receives y = x ⊕ e.
• Design of C depends on properties of the channel.
• Channel W: W(e|x) = probability that error e is imposed by the channel when x = C(m) is transmitted. In this case y = x ⊕ e is received.
• BSC_p: Binary Symmetric Channel. Each bit flipped independently with probability p:
  W(e|x) = p^|e| (1-p)^(n-|e|).
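The BSC_p distribution above can be sketched in Python (an illustration, not from the talk): the probability p^|e| (1-p)^(n-|e|) depends only on the weight of e, not on x, and sums to 1 over all 2^n error vectors.

```python
import itertools

def bsc_error_prob(e, p):
    """W(e|x) for BSC_p: each bit flipped independently with probability p,
    so error e has probability p^|e| * (1-p)^(n-|e|), independent of x."""
    w = sum(e)   # Hamming weight |e|
    n = len(e)
    return (p ** w) * ((1 - p) ** (n - w))

# The probabilities over all 2^n error vectors sum to 1.
n, p = 6, 0.1
total = sum(bsc_error_prob(e, p) for e in itertools.product([0, 1], repeat=n))
print(round(total, 10))  # -> 1.0
```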
Slide 4: Success criteria
C: {0,1}^k → {0,1}^n; X sends x = C(m), Y receives y = x ⊕ e.
• Let D: {0,1}^n → {0,1}^k be a decoder.
• C is said to allow the communication of m over W (with D) if Pr_e[D(C(m) ⊕ e) = m] ~ 1.
  • Probability over W(e|C(m)).
• C is said to allow the communication of {0,1}^k over W (with D) if Pr_{m,e}[D(C(m) ⊕ e) = m] ~ 1.
  • Probability uniform over {0,1}^k and over W(e|C(m)).
• Rate of C is k/n.
• BSC_p [Shannon]: exist codes with rate ~ 1 - H(p) (optimal).
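As a toy illustration of the success criterion Pr_{m,e}[D(C(m) ⊕ e) = m] ~ 1, here is a minimal sketch using a 3-fold repetition code over BSC_p (the repetition code and all parameters are hypothetical examples, not codes from the talk):

```python
import random

def encode(m_bit, r=3):
    """Repetition code C: {0,1}^1 -> {0,1}^r (rate 1/r)."""
    return [m_bit] * r

def decode(y):
    """Majority vote = nearest-neighbor decoding for the repetition code."""
    return 1 if sum(y) > len(y) / 2 else 0

def success_rate(p, trials=20000, rng=None):
    """Estimate Pr_{m,e}[D(C(m) xor e) = m] over BSC_p by simulation."""
    rng = rng or random.Random(0)
    ok = 0
    for _ in range(trials):
        m = rng.randint(0, 1)
        x = encode(m)
        y = [b ^ (1 if rng.random() < p else 0) for b in x]  # BSC_p noise
        ok += (decode(y) == m)
    return ok / trials

# With p = 0.1, failure needs >= 2 of 3 flips: 3*p^2*(1-p) + p^3 = 0.028.
print(success_rate(0.1) > 0.95)  # -> True
```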
Slide 5: Channel uncertainty
• What if the properties of the channel are not known?
• The channel can be any channel in a family W = {W}.
• Objective: design a code that will allow communication no matter which W is chosen from W.
• C is said to allow the communication of {0,1}^k over the channel family W if there exists a decoder D s.t. for each W ∈ W: C, D allow communication of {0,1}^k over W.
Slide 6: Adversarial model
The family W_p: the channel W is chosen maliciously by an adversarial jammer within the limits of W_p.
• A channel W is a p-channel if it can only change a p-fraction of the bits transmitted: W(e|x) = 0 if |e| > pn.
• Power constraint on W.
• W_p = the family of all p-channels.
• Communicating over W_p: design a code that enables communication no matter which p-fraction of bits are flipped.
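A minimal sketch of the power constraint above; the jammer model (`worst_case_flip` and its inputs) is a hypothetical illustration, not from the talk:

```python
def is_valid_p_channel_error(e, p):
    """Power constraint: a p-channel only imposes errors with |e| <= p*n."""
    return sum(e) <= p * len(e)

def worst_case_flip(x, p, positions):
    """A hypothetical jammer: flips up to floor(p*n) chosen positions of x."""
    budget = int(p * len(x))
    y = list(x)
    for i in positions[:budget]:
        y[i] ^= 1
    return y

x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]                      # n = 10
y = worst_case_flip(x, p=0.3, positions=[0, 4, 7, 9])   # budget = 3; last position ignored
e = [a ^ b for a, b in zip(x, y)]
print(sum(e))                          # -> 3
print(is_valid_p_channel_error(e, 0.3))  # -> True
```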
Slide 7: Communicating over W_p
• Communicating over W_p: design a code C that enables communication no matter which p-fraction of bits are flipped.
• "Equivalently": the minimum distance of C is 2pn.
• What is the maximum achievable rate over W_p?
• Major open problem.
• Known: 1 - H(2p) ≤ R < 1 - H(p).
[Figure: codewords of C scattered in {0,1}^n, with the minimum distance marked.]
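The bounds 1 - H(2p) ≤ R < 1 - H(p) can be evaluated numerically with the binary entropy function (a small illustration; the choice p = 0.05 is arbitrary):

```python
import math

def H(p):
    """Binary entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.05
lower = 1 - H(2 * p)   # known lower bound 1 - H(2p)
upper = 1 - H(p)       # known upper bound 1 - H(p) (the BSC_p rate)
print(lower < upper)   # -> True: the gap this talk aims to close
print(round(lower, 3), round(upper, 3))
```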
Slide 8: This talk
• Communication over W_p is not fully understood.
• W_p does not allow communication with rate 1 - H(p).
• BSC_p allows communication at rate 1 - H(p).
• In "essence" BSC_p ∈ W_p (power constraint).
• Close the gap by considering restrictions of W_p.
• Oblivious channels: communication over W_p with the assumption that the channel has a limited view of the transmitted codeword.
Slide 9: Oblivious channels
• Communicating over W_p: only a p-fraction of bits can be flipped.
• Think of the channel as an adversarial jammer.
• The jammer acts maliciously according to the codeword sent.
• Additional constraint: would like to limit the amount of information the adversary has on the codeword x sent.
• For example: a channel with a "window" view.
• In general: the correlation between codeword x and the error e imposed by W is limited.
Slide 11: Oblivious channels: model
• A channel W is oblivious if W(e|x) is independent of x.
• BSC_p is an oblivious channel.
• A channel W is partially oblivious if the dependence of W(e|x) on x is limited:
  • Intuitively, I(e,x) is small.
• Example: let W_0 and W_1 be two distributions over errors. Define W as follows:
  W(e|x) = W_0(e) if the first bit of x is 0.
  W(e|x) = W_1(e) if the first bit of x is 1.
  W is almost completely oblivious.
• Partially oblivious, definition:
  • For each x: W(e|x) = W_x(e) is a distribution over {0,1}^n.
  • Limit the size of the family {W_x | x}.
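The first-bit example above can be sketched in Python; the concrete distributions `W0` and `W1` below are hypothetical, chosen so that {W_x | x} has size 2:

```python
import random

def make_partially_oblivious_channel(W0, W1):
    """The slide's example: W(e|x) draws from W0 if x starts with 0, else W1.
    {W_x | x} has size 2, so W is 'almost completely oblivious'."""
    def channel(x, rng):
        dist = W0 if x[0] == 0 else W1
        errors, probs = zip(*dist.items())
        return list(rng.choices(errors, weights=probs)[0])
    return channel

# Two hypothetical error distributions over {0,1}^4 (errors of weight <= 1).
W0 = {(0, 0, 0, 0): 0.7, (1, 0, 0, 0): 0.3}
W1 = {(0, 0, 0, 0): 0.7, (0, 0, 0, 1): 0.3}
channel = make_partially_oblivious_channel(W0, W1)

e = channel([0, 1, 1, 0], random.Random(1))
print(e in ([0, 0, 0, 0], [1, 0, 0, 0]))  # -> True: drawn from W0
```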
Slide 12: Families of oblivious channels
• A family of channels W* ⊆ W_p is (partially) oblivious if each W ∈ W* is (partially) oblivious.
• Study the rate achievable when communicating over W*.
• The jammer W is limited in power and knowledge.
• BSC_p is an oblivious channel "in" W_p.
• Rate on BSC_p ~ 1 - H(p).
• Natural question: can this be extended to any family of oblivious channels?
Slide 13: Our results
• Study both oblivious and partially oblivious families.
• For oblivious families W* one can achieve rate ~ 1 - H(p).
• For families W* of partially oblivious channels in which, for every W ∈ W*, {W_x | x} is of size at most 2^(εn):
  achievable rate ~ 1 - H(p) - ε (if ε < (1 - H(p))/3).
• Sketch proof for oblivious W*.
Slide 14: Previous work
• Oblivious channels in W* have been addressed by [CsiszarNarayan] as a special case of Arbitrarily Varying Channels with state constraints.
• [CsiszarNarayan] show that rate ~ 1 - H(p) is achievable for oblivious channels in W* using the "method of types".
• Partially oblivious channels were not defined previously.
• For partially oblivious channels [CsiszarNarayan] implicitly show 1 - H(p) - 30ε (compare with our 1 - H(p) - ε).
• Our proof techniques are substantially different.
Slide 15: Proof technique: random codes
• Let C be a code (of rate 1 - H(p)) in which each codeword is picked at random.
• Show: with high probability C allows communication over any oblivious channel in W* (any channel W which always imposes the same distribution over errors).
• Implies: there exists a code C that allows communication over W* with rate 1 - H(p).
Slide 17: Proof sketch
• Show: with high probability C allows communication over any oblivious channel in W*.
• Step 1: show that C allows communication over W* iff C allows communication over channels W that always impose a single error e (|e| ≤ pn).
• Step 2: let W_e be the channel that always imposes error e. Show that w.h.p. C allows communication over W_e.
• Step 3: as there are only ~ 2^(H(p)n) channels W_e, use a union bound.
Slide 18: Proof of Step 2
• Step 2: let W_e be the channel that always imposes error e. Show that w.h.p. C allows communication over W_e.
• Let D be the nearest-neighbor decoder.
• By definition: C allows communication over W_e iff for most codewords x = C(m): D(x ⊕ e) = m.
• Codeword x = C(m) is disturbed if D(x ⊕ e) ≠ m.
• Random C: the expected number of disturbed codewords is small (i.e. in expectation C allows communication).
• Need to prove that the number of disturbed codewords is small w.h.p.
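A toy sketch of the disturbed-codeword count from Step 2 (the repetition-code sanity check and the tiny random-code parameters are hypothetical illustrations):

```python
import random

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

def count_disturbed(code, e):
    """Number of codewords x = C(m) with D(x xor e) != m under
    nearest-neighbor decoding D (ties count as disturbed)."""
    disturbed = 0
    for m, x in enumerate(code):
        y = [a ^ b for a, b in zip(x, e)]
        dists = [hamming(y, c) for c in code]
        if dists[m] != min(dists) or dists.count(min(dists)) > 1:
            disturbed += 1
    return disturbed

# Sanity check on a code we understand: the 6-bit repetition code has
# minimum distance 6, so a weight-2 error disturbs nothing, while a
# weight-3 error creates a tie and disturbs both codewords.
rep = [[0] * 6, [1] * 6]
print(count_disturbed(rep, [1, 1, 0, 0, 0, 0]))  # -> 0
print(count_disturbed(rep, [1, 1, 1, 0, 0, 0]))  # -> 2

# A random code, as in the proof: 2^3 random codewords in {0,1}^16.
rng = random.Random(0)
code = [[rng.randint(0, 1) for _ in range(16)] for _ in range(2 ** 3)]
e = [1, 1] + [0] * 14                            # a fixed error of weight 2
print(count_disturbed(code, e))                  # small in expectation
```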
Slide 19: Concentration
• The expected number of disturbed codewords is small.
• Need to prove that the number of disturbed codewords is small w.h.p.
• Standard tool: concentration inequalities (Azuma, Talagrand, Chernoff).
• These work well when the random variable has a small Lipschitz coefficient.
• Study the Lipschitz coefficient of our process.
Slide 20: Lipschitz coefficient
Lipschitz coefficient in our setting:
• Let C and C' be two codes that differ in a single codeword.
• Lipschitz coefficient = the difference between the number of disturbed codewords in C and in C' w.r.t. W_e.
• One can show that the Lipschitz coefficient is very large.
• Cannot apply standard concentration techniques.
• What next?
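A small illustration of the Lipschitz coefficient: moving a single codeword can break the decoding of other codewords too, so the difference in disturbed counts can exceed 1 (the 6-bit code below is a hypothetical example, not from the talk):

```python
def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

def disturbed(code, e):
    """Codewords x = C(m) whose nearest-neighbor decoding of x xor e fails
    (ties count as failures)."""
    bad = 0
    for m, x in enumerate(code):
        y = [a ^ b for a, b in zip(x, e)]
        d = [hamming(y, c) for c in code]
        if d[m] != min(d) or d.count(min(d)) > 1:
            bad += 1
    return bad

e = [1, 0, 0, 0, 0, 0]                      # a fixed error of weight 1
C = [[0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1], [0, 0, 0, 1, 1, 1]]
C2 = [[0, 1, 1, 1, 1, 0]] + C[1:]           # C' differs from C in one codeword

print(disturbed(C, e))    # -> 0
print(disturbed(C2, e))   # -> 2: moving one codeword broke two decodings
```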
21
Lipschitz coefficient
[Vu]: Random process in which Lipschitz coefficient has
X
small “expectation” and “variance” will have exponential
x e
concentration:
probability of deviation from expectation is exponential in
deviation.
Y
y=xe
Lipschitz coefficient in our setting is large.
concentration
of that
low degree
multivariate
•[KimVu]:
However
one may show
“on average”
polynomials (extends Chernoff).
Lipschitz coefficient is small.
• This is done by studying the list decoding
properties of random C.
• Once we establish that “average” Lipschitz
coef. is small one may use recent concentration
*
*
*
*
e
*
*
*
*
*
*
C
result of Vu to obtain proof.
• Establishing “average” Lipschitz coef. is technically involved.
Slide 22: Conclusions and future research
• Theme: communication over W_p is not fully understood. Gain understanding of certain relaxations of W_p.
• Seen:
  • Oblivious channels W* ⊆ W_p.
  • Allow rate 1 - H(p).
• Other relaxations:
  • "Online adversaries".
  • Adversaries restricted to changing certain locations (unknown to X and Y).