Stochastic Pooling Networks:
On the interaction between redundancy,
noise and lossy compression in biological neurons.
Mark D. McDonnell, University of South Australia
Pierre-Olivier Amblard, CNRS, Grenoble, France
Nigel G. Stocks, University of Warwick, UK
6 July 2008
This talk is about understanding how biological
neurons efficiently compress stimuli during coding.
What is a Stochastic Pooling Network?
What do SPNs Model?
Emergent properties of SPNs
Real signals are analog but nearly everything in
modern electronics is digital. Why is this?
[Diagram: the CEO problem. N agents observe noisy versions x_1, …, x_N of a source X and send encodings at rates R_1, …, R_N to a single decoder (the CEO), subject to a total rate constraint ∑_{i=1}^{N} R_i ≤ R.]
Information theory underpins fast and accurate
communication. But it is also about selecting what
information is worth storing and communicating.
What mechanisms do biological sensory systems use
to compress information during transduction?
There is lossy and lossless compression…
and maybe something in between: “loss-least!”
A “Recipe” for Stochastic Pooling Networks
Ingredients
• 1 information source
• N > 1 independent sensors (compressive types)
• N > 1 random noise sources
Preparation
Step 1: Each sensor measures the information source mixed with one of the noise sources, and then compresses its measurement.
Step 2: Ensure the whole network produces a single measurement by pooling each sensor’s output.
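To make the recipe concrete, here is a minimal Python sketch of an SPN, assuming the simplest compressive sensor type (a binary threshold) and pooling by summation; the values of N, the noise level and the threshold are illustrative choices, not values from the talk.

# Minimal SPN sketch: N binary-threshold sensors, pooling by summation.
# All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

N = 15        # number of sensors (N > 1)
sigma = 0.8   # standard deviation of each independent noise source
theta = 0.0   # common sensor threshold

# Ingredient 1: one information source (a random analog signal).
x = rng.normal(0.0, 1.0, size=10_000)

# Step 1: each sensor measures the source mixed with its own noise source,
# then compresses the measurement to a single bit.
noise = rng.normal(0.0, sigma, size=(N, x.size))   # N independent noise sources
y_i = (x + noise > theta).astype(int)              # compressive sensor outputs in {0, 1}

# Step 2: pooling -- the network produces a single measurement,
# here simply the sum of the sensor outputs, so y is in {0, ..., N}.
y = y_i.sum(axis=0)
print(y[:10])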
SPNs can model many sensor network or
source coding scenarios
• Digital Beamforming Arrays
– Sonar, radar, MIMO communications
• Digital signal processing
– Noise reduction via coherent averaging after digitization in
ADCs.
• Distributed/Decentralized sensor networks
– CEO problem
– Multiaccess Communication Channels
– Power constrained wireless sensor networks
• Biological neurons
– Representation of analog stimuli by rate coding action
potentials
– Quantal release of chemical neurotransmitters at synaptic
junctions
– Maybe some ion channels?
• Reliability schemes in nano-electronics
– Averaging and redundancy to overcome
parameter variations and noise [Ferran Martorell, Spain]
• Social networks
– Subjective voting on a continuous variable
• Quantum optical communication using
polarization detection of single photons?
• Coupled multistable dynamic systems?
• Reconfigurable chaotic logic gates?
The neurons that code sounds immediately after
transduction can be modelled as an SPN
[Figures: sound enters the ear and is transduced in the cochlea, where the basilar membrane, outer hair cells and inner hair cells convert it into information carried by the auditory nerve to the brain.]
Slide courtesy of Prof Tony Burkett, Uni. Melbourne
We assume combining of the ingredients
is left to physics: POOLING
1
g1 (.)
g2 (.)
x
y1  {0,1,..,M}
y2  {0,1,..,M}
P
2
y = h(y1, ..,yN)
|y| << M
N
gN(.)
yN  {0,1,..,M}
Pooling must occur “naturally,” without external
intervention, e.g. by addition or superposition.
We will not have a pooling network otherwise!
Pooling loses no (or negligible) information! (For binary sensors with i.i.d. noise, for example, the outputs are conditionally i.i.d. given x, so their sum is a sufficient statistic for x.)
Assume the information source, x, is random.
The mutual information I(x; y) loosely measures how well, on average, the SPN output y estimates x.
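As a rough illustration of this measure, the sketch below estimates I(x; y) for a small binary-threshold SPN using a plug-in (histogram) estimator; binning x, and all parameter values, are assumptions made for illustration, and the estimate carries the usual plug-in bias.

# Plug-in (histogram) estimate of I(x; y) for a small binary-threshold SPN.
# Illustrative parameters; treat the result as indicative only.
import numpy as np

rng = np.random.default_rng(1)
N, sigma, n_samples = 15, 0.8, 200_000

x = rng.normal(0.0, 1.0, size=n_samples)
noise = rng.normal(0.0, sigma, size=(N, n_samples))
y = (x + noise > 0.0).astype(int).sum(axis=0)      # pooled output in {0, ..., N}

# Quantize x into bins so a joint histogram is well defined.
x_bins = np.digitize(x, np.linspace(-3, 3, 64))

# Joint and marginal probabilities from counts.
joint = np.zeros((x_bins.max() + 1, N + 1))
np.add.at(joint, (x_bins, y), 1.0)
joint /= joint.sum()
px = joint.sum(axis=1, keepdims=True)
py = joint.sum(axis=0, keepdims=True)

mask = joint > 0
mi = np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask]))
print(f"I(x; y) estimate: {mi:.2f} bits")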
A surprising emergent property of SPNs is that they
digitize (quantize) their input
McDonnell, Stocks et al., Fluct. Noise Lett. 5, L457-L468, 2005.
McDonnell, “Applying stochastic signal quantization theory to the robust digitization of noisy analog signals”, book chapter in Springer Verlag Complexity Series, in press, 2008.
McDonnell and Abbott, Proc. SPIE, 2006.
SPNs reduce noise via coherent averaging…
but not in a linear way!
[Plot: mutual information vs input SNR for the extreme configurations (N=1, M=511) and (N=511, M=1), compared with the bound 0.5·log2(1 + N·SNR).]
• For small noise, performance is limited by compression: I(x; y) < log2(1 + MN)
• For large noise, performance is limited by “averaging”: I(x; y) < 0.5·log2(1 + N·SNR)
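A quick numerical check of these bounds for the two plotted configurations (the input SNR value below is an arbitrary assumption): both extremes share the same small-noise bound, log2(1 + 511) = 9 bits, but very different large-noise bounds.

# Evaluate the two bounds for the plotted configurations.
import numpy as np

for N, M in [(1, 511), (511, 1)]:
    # Compression-limited (small-noise) bound.
    print(f"N={N}, M={M}: log2(1 + MN) = {np.log2(1 + M * N):.1f} bits")

snr = 10.0   # assumed input SNR (linear scale), for illustration only
for N in (1, 511):
    # Averaging-limited (large-noise) bound.
    print(f"N={N}: 0.5*log2(1 + N*SNR) = {0.5 * np.log2(1 + N * snr):.2f} bits")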
There are a number of differences in the final SNR
achieved by linear analog averaging vs SPNs
Output SNR                             | Analog       | SPN
Dependency on source distribution      | no           | yes
Decoding dependency                    | N/A          | yes
Scaling with input SNR                 | linear       | linear for small SNRs; nonlinear for large SNRs
Scaling with N (no. of times averaged) | proportional | approx. proportional for small SNRs; no effect for large SNRs
Scaling with M (bits)                  | N/A          | increases for small SNRs; no effect for large SNRs
McDonnell, “Signal Estimation Via Averaging of Coarsely Quantised Signals,” Proc. IEEE Information, Decision and Control, Adelaide, Australia, pp. 100-105, 2005.
Suprathreshold Stochastic Resonance in SPNs
SSR measured by mutual information for
additive random noise.
[Stocks, Phys. Rev. Lett., 2000]
Probability of Error: binary detection.
[Zozor, Amblard and Duchene, Fluct. Noise Lett.
7, L39-L60, 2007]
Multiplicative Noise at the input to network nodes.
[Nikitin, Stocks and Morse, Phys. Rev. E, 2007]
Simulation of cochlear implant coding.
[Stocks et al., Proc. SPIE, 2007]
Also studied by others, e.g. Rousseau & Chapeau-Blondeau (2003), Hoch et al. (2003), Martorell (2005)…
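The basic SSR effect is straightforward to reproduce. The sketch below, loosely following the setup of Stocks (2000) but with illustrative parameters, computes I(x; y) for N identical binary threshold devices using the binomial conditional law of the pooled output; the mutual information is non-monotonic in the noise level, peaking at nonzero noise.

# Suprathreshold stochastic resonance sketch: N identical binary threshold
# devices, Gaussian source and Gaussian noise. Illustrative parameters only.
import numpy as np
from scipy.stats import binom, norm

rng = np.random.default_rng(2)
N, theta = 31, 0.0
x = rng.normal(0.0, 1.0, size=5_000)               # samples of the source

def mutual_information(sigma):
    # P(y_i = 1 | x): the noise pushes x + noise over the threshold.
    p = 1.0 - norm.cdf((theta - x) / sigma)
    # Pooled output given x is binomial: y | x ~ Binomial(N, p(x)).
    pmf = binom.pmf(np.arange(N + 1)[:, None], N, p[None, :])  # shape (N+1, samples)
    py = pmf.mean(axis=1)                                      # marginal P(y)
    h_y = -np.sum(py * np.log2(py, where=py > 0, out=np.zeros_like(py)))
    h_y_x = -np.mean(np.sum(pmf * np.log2(pmf, where=pmf > 0, out=np.zeros_like(pmf)), axis=0))
    return h_y - h_y_x                                         # I(x; y) = H(y) - H(y|x)

for sigma in (0.05, 0.2, 0.5, 1.0, 2.0):
    print(f"sigma={sigma:4.2f}  I(x; y) = {mutual_information(sigma):.3f} bits")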
Optimizing the nodes: more noise means more
nodes that are identical
McDonnell, Stocks et al., “Optimal information transmission in nonlinear arrays through suprathreshold stochastic resonance,” Phys. Lett. A 352, pp. 183-189, 2006.
We have observed similar effects for optimized networks of Poisson
neuron models, and binary detection networks.
There are many other surprising emergent
properties
• Very noisy SPNs behave like analogue
Gaussian channels [McDonnell, IEEE Aus. Comm Theory Wkshp, 2008]
• Very large SPNs behave like multiplicative
noise channels [McDonnell and Stocks, Proc. SPIE. 2007]
• Optimal reconstruction depends only on the noise distribution and the number of sensors (a sketch follows this list)
[McDonnell, Stocks and Abbott, Phys. Rev. E 75, Art. No. 061105.]
• Optimizing the noise distribution is like
optimizing a neuron’s stimulus-response
curve.
• Negative correlation provides improved mutual information.
• The optimal stimulus is actually discrete!
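The reconstruction point can be illustrated directly: because the pooled output y takes only N + 1 values, the minimum mean-square-error decoder E[x | y] is just a small lookup table. The sketch below estimates that table from samples; the SPN parameters are again illustrative.

# Nonlinear decoding sketch: estimate the MMSE decoder x_hat(y) = E[x | y]
# as a lookup table with one entry per pooled output value.
import numpy as np

rng = np.random.default_rng(3)
N, sigma = 15, 0.8

x = rng.normal(0.0, 1.0, size=100_000)
noise = rng.normal(0.0, sigma, size=(N, x.size))
y = (x + noise > 0.0).astype(int).sum(axis=0)

# Estimate E[x | y = k] for each of the N + 1 possible pooled outputs.
decoder = np.array([x[y == k].mean() if np.any(y == k) else 0.0
                    for k in range(N + 1)])
x_hat = decoder[y]
print("mean-square error of table decoder:", np.mean((x - x_hat) ** 2))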
Unsolved problems on SPNs in biology
• Do biological senses really use SPN
principles?
• What mechanisms does the brain use
to compress/reduce information?
• Will the controlled use of random noise
in cochlear implants improve the
hearing of patients?
If healthy auditory neurons act like SPNs then
bionic ears should stimulate them randomly!
[Image: a cochlear implant electrode array in the first turn of the inner ear, adjacent to the auditory nerve fibres. Image courtesy of Cochlear Ltd, 2008]
“Stochastic beamforming coding strategy”,
Morse, Holmes, Shulgin, Nikitin and Stocks, 2007
[Image © Australasian Science, 2008]
There are many unsolved
theoretical problems on SPNs
• Can the clustering of nodes be
predicted mathematically?
• What further complexities can be added
to SPNs without changing the basic
ingredients?
In summary, stochastic pooling networks are a
versatile and surprising concept for achieving the
twin goals of accuracy and efficiency.
Redundancy allows noise
reduction and simplicity
Lossy compression is
required for efficiency
Random noise improves
sub-optimal compression
Questions?