Transcript Slide 1
Sensing and expectation
Life is uncertain.
Optimal statistical inference in an uncertain world
requires that we integrate new information into a
framework of prior expectations.
Suppose your patient has just tested positive for a disease.
How likely is it that he actually has the disease?
What factors do we need to consider?
• The probability of testing positive if you have the disease.
• The probability of testing positive if you are healthy.
• The probability that anybody has the disease, regardless of their test.
Bayes’ rule:

P(s | r) = P(r | s) P(s) / P(r)

P(s | r): the probability of sickness, given the result.
P(r | s): the probability of the result, given sickness.
P(s): the probability of sickness.
P(r): the probability of the result.

Thomas Bayes
1701?-1761
Suppose your patient has just tested positive for a disease.
How likely is it that he actually has the disease?
Consider this scenario:
• the probability of testing positive if you have the disease is 95%,
• the probability of testing positive if you are healthy is 5%,
• but the disease is rare, so the probability that anybody has it is 1%.
P(sick | +) = P(+ | sick) P(sick) / P(+)
            = P(+ | sick) P(sick) / [ P(+ | sick) P(sick) + P(+ | healthy) P(healthy) ]

P(+ | sick) = 0.95
P(+ | healthy) = 0.05
P(sick) = 0.01
P(healthy) = 0.99

P(sick | +) = (0.95 × 0.01) / (0.95 × 0.01 + 0.05 × 0.99) ≈ 16%
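To make the arithmetic above concrete, here is a minimal Python sketch of the same calculation; the variable names are introduced here purely for illustration.

```python
# Minimal sketch of the calculation above; variable names are for illustration only.

p_pos_given_sick = 0.95      # probability of testing positive if you have the disease
p_pos_given_healthy = 0.05   # probability of testing positive if you are healthy
p_sick = 0.01                # prior probability of having the disease
p_healthy = 1.0 - p_sick     # prior probability of being healthy

# Total probability of a positive test, P(+)
p_pos = p_pos_given_sick * p_sick + p_pos_given_healthy * p_healthy

# Posterior probability of disease given a positive test, P(sick | +)
p_sick_given_pos = p_pos_given_sick * p_sick / p_pos

print(f"P(sick | +) = {p_sick_given_pos:.3f}")   # ~0.161, i.e., about 16%
```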
Bayes’ rule:
P(s | r) = P(r | s) P(s) / P(r)

Thomas Bayes
1701?-1761

P(s | r): the probability of the stimulus after taking into account the response (the posterior distribution).
P(r | s): the probability of the response after taking into account the stimulus.
P(s): the probability of the stimulus occurring (the prior distribution).
P(r): the probability of the response occurring.
eye movements
head movements
hand movements
Sensing often occurs within the context of motor actions.
Motor actions create expectations that are important for the optimal interpretation
of sensory signals.
How does the nervous system account for the expected
consequences of motor actions?
1. Solving this problem can involve filtering out the expected (“trivial”)
consequences of motor actions.
2. Solving this problem can also involve performing a coordinate transform to
re-register sensory information within a more stable coordinate system that
is invariant to the motor action.
Studying how the nervous system integrates sensory information with the expected
consequences of motor actions provides a concrete way to study the integration of
new information with prior expectations.
Filtering out the expected
consequences of
motor actions
Hearing:
filtering out “expected” sounds
wing nerve
neuron carrying corollary discharge
auditory neuron
A dedicated single neuron in the cricket conveys corollary discharge to auditory
processing centers and inhibits auditory interneurons when the cricket is singing.
Hedwig & Poulet Science 2006
Vision:
We move our eyes because our vision is poor outside the fovea.
[Figure: visual acuity (normalized) across the visual field; from Hans-Werner Hunziker (2006), “Im Auge des Lesers”, Transmedia Stäubli Verlag, Zürich]
Saccadic eye movements bring objects into the fovea
We make a saccade several times per second.
Vision:
filtering out “expected” visual motion
Problem #1: fast (saccadic) eye movements blur the image on the retina.
Solution: visual signals are suppressed during saccades.
cat LGN, spontaneous saccades in the dark, avg of 71 cells
Lee & Malpeli, J. Neurophysiol. 1998
Vision:
filtering out “expected” visual motion
Problem #2: even slow eye movements create fictive motion of the image on the retina.
Solution: the visual system interprets visual signals
in the context of knowledge about coordinated eye movements.
-- Hermann von Helmholtz, Physiologische Optik; translation from William James, The Principles of Psychology
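One way to make Helmholtz's point concrete is as a signal comparison: the motion measured on the retina is interpreted together with the motion expected from the commanded eye movement. The sketch below is only an illustration of that idea; the signals, units, and sign conventions are assumptions, not a model taken from these slides.

```python
# Illustrative sketch: interpreting retinal image motion in the context of a known eye movement.
# All quantities are hypothetical 1-D velocities in degrees per second.

def estimated_world_motion(retinal_slip, commanded_eye_velocity):
    """Perceived motion of the world = motion measured on the retina plus the motion
    expected from the eye movement itself (the efference copy / corollary discharge)."""
    return retinal_slip + commanded_eye_velocity

# The eye tracks rightward at 10 deg/s while the world is stationary: the image slips
# across the retina at -10 deg/s, the two signals cancel, and the world is judged stable.
print(estimated_world_motion(retinal_slip=-10.0, commanded_eye_velocity=10.0))   # 0.0

# The eye is still but the image slips at -10 deg/s: the world itself is judged to move.
print(estimated_world_motion(retinal_slip=-10.0, commanded_eye_velocity=0.0))    # -10.0
```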
Electrosensation:
filtering out “expected” electric field disruptions
a “weakly electric fish”: Gnathonemus petersii
see e.g., Sensory Exotica: A World beyond Human Experience, by H. Hughes (MIT Press, 2001)
Active sensing in electrosensation
[Diagram: in higher brain regions, an electric organ discharge command nucleus drives the electric organ; the resulting electric organ discharge (EOD) spreads through the water surrounding the fish; electrosensory receptor neurons project to the electrosensory lobe (ELL).]
adapted from Bell J. Exp. Biol. 1989
Active sensing in electrosensation
The fish actively
produces electric organ
discharges (EODs).
Objects in the water
perturb the amplitude
of the electric field.
This changes the
latency of spikes in
electrosensory
afferents.
[Diagram: as above, but the electric organ discharge command nucleus also sends an electric organ corollary discharge (EOCD) to the electrosensory lobe (ELL), and the electric organ has been silenced with intramuscular curare, so the command no longer produces an EOD in the water.]
adapted from Bell J. Exp. Biol. 1989
Plastic responses to corollary discharge in the electrosensory lobe
[Figure: recordings from mormyrid ELL, triggered on the EOD command (the effect of the command on the electric organ was blocked for several minutes with curare). Traces show the response to the command alone (9 min), to the command paired with an electrosensory stimulus 1.5 msec later, and to the command alone again 1 min after pairing. Time scale: 80 msec. Bell Brain Res. 1986]
Plastic responses to corollary discharge
recording from mormyrid ELL
Bell Curr. Opin. Neurobiol. 2001
[Diagram: as above, with the electric organ corollary discharge (EOCD) reaching the electrosensory lobe (ELL) and the electric organ discharge (EOD) again produced in the water; in addition to the electrosensory receptor neurons, proprioceptors also project to the ELL.]
adapted from Bell J. Exp. Biol. 1989
Plastic responses to proprioceptive stimuli in the electrosensory lobe
[Figure: recordings from gymnotid ELL. Traces show the response to a tail bend alone, to a tail bend paired with an electrosensory stimulus, and to a tail bend alone again after pairing. Bastian J. Comp. Physiol. 1995]
A circuit for predicting the effects of sensing actions
[Diagram of the mormyrid ELL circuit: motor corollary discharge and proprioceptive afferents (i.e., the “expectation” signal) arrive via parallel fibers, whose synapses are the site of plasticity; electroreceptive afferents (i.e., the sensory signal) provide the direct sensory input.]
Sawtell & Williams J. Neurosci. 2008
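The negative-image results above can be caricatured with a simple anti-Hebbian rule at the parallel fiber synapses: corollary-discharge-driven inputs are adjusted until they cancel the sensory response that reliably follows the command. The toy sketch below assumes a particular learning rate, time base, and response shape; it is meant to illustrate the idea, not to reproduce the published models.

```python
import numpy as np

# Toy anti-Hebbian plasticity at parallel-fiber synapses: corollary-discharge-driven
# inputs learn a "negative image" of the sensory input that reliably follows the command.
# Time base, learning rate, and response shape are assumptions for illustration only.

T = 50                                              # time bins following each EOD command
rng = np.random.default_rng(0)

cd = np.eye(T)                                      # corollary-discharge basis: one input per time bin
expected_sensory = np.exp(-((np.arange(T) - 15.0) ** 2) / 20.0)   # reafferent response after the command

w = np.zeros(T)                                     # parallel-fiber weights (the learned expectation)
lr = 0.1

for trial in range(200):
    sensory = expected_sensory + 0.05 * rng.standard_normal(T)    # noisy reafference on this trial
    prediction = cd @ w                             # corollary-discharge-driven prediction
    output = sensory + prediction                   # cell output = sensory input plus learned image
    w -= lr * (cd.T @ output)                       # anti-Hebbian: inputs co-active with output are weakened

# After pairing, the prediction approximates the negative of the expected sensory response,
# so the command alone now evokes a "negative image", and the paired response is largely cancelled.
print(np.round(w[10:20], 2))                        # roughly -expected_sensory around the response peak
```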
Coordinate transforms that
re-register sensory
information within a more
stable coordinate system
The superior colliculus contains maps of
auditory and visual space
[Diagram: the retina provides a map of visual space directly to the superior colliculus; signals from the cochlear nuclei (bilateral) are used to compute inter-aural time and level differences, yielding a map of auditory space in the external nucleus of the inferior colliculus, which also projects to the superior colliculus.]
Bimodal neurons in the superior colliculus have congruent receptive field locations.
Why might it be useful to map visual and auditory
information onto the same coordinates?
Multimodal neurons with congruent audio-visual receptive field locations
trigger the same behavior even when one source of information is absent
(e.g., if the prey is momentarily silent or out of sight).
Auditory and visual information can be seamlessly combined when both are
present, improving the estimate of the spatial location of the stimulus.
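One standard way to formalize this improvement is maximum-likelihood cue combination: assuming independent Gaussian noise on each cue, the optimal estimate weights each cue by its reliability (inverse variance). The sketch below is a generic textbook illustration, not a model taken from these slides.

```python
def combine_cues(x_visual, var_visual, x_auditory, var_auditory):
    """Reliability-weighted (inverse-variance) combination of two independent Gaussian
    location estimates; returns the combined estimate and its (reduced) variance."""
    w_visual = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_auditory)
    w_auditory = 1.0 - w_visual
    x_combined = w_visual * x_visual + w_auditory * x_auditory
    var_combined = 1.0 / (1.0 / var_visual + 1.0 / var_auditory)
    return x_combined, var_combined

# Example: vision localizes a target at +10 deg (variance 1), hearing at +14 deg (variance 4).
# The combined estimate lies closer to the more reliable visual cue, and its variance is
# smaller than that of either cue alone.
print(combine_cues(10.0, 1.0, 14.0, 4.0))   # (10.8, 0.8)
```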
Vision and hearing:
coordinate transforms
Problem: eye movements change the relationship between the visual world and the head,
so visual and auditory maps are misaligned.
Solution: eye position transiently modifies the auditory receptive fields of superior colliculus
neurons.
Stein & Stanford Nature Neuroscience 2008
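Viewed as a computation, this re-alignment is a change of reference frame: an auditory target location measured relative to the head can be re-expressed in eye-centered coordinates once eye position is known, so that it lines up with the retinotopic visual map. The one-dimensional sketch below is purely illustrative; the angles and sign conventions are assumptions.

```python
def auditory_target_in_eye_coordinates(target_azimuth_re_head, eye_position_re_head):
    """Convert a head-centered auditory azimuth (deg) into eye-centered coordinates
    by taking the current eye-in-head position (deg) into account."""
    return target_azimuth_re_head - eye_position_re_head

# A sound source 20 deg to the right of the head, while the eyes are deviated 15 deg to the
# right, lies only 5 deg to the right of the line of sight -- the same location the
# retinotopic visual map would report.
print(auditory_target_in_eye_coordinates(20.0, 15.0))   # 5.0
```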
Vision and hearing:
coordinate transforms
Problem: the growth of the body changes the relationship between the world and the head,
so visual and auditory maps can become misaligned over time.
Solution: over time, eye position modifies the auditory receptive fields of superior colliculus (or
tectum) neurons in a semi-stable fashion.
www.ardgrain.com/barn-owls
Vision and hearing:
coordinate transforms
[Figure: head turns evoked by visual or auditory stimuli in the barn owl, measured before prisms, on day 1 of prisms (23º), on day 42 of prisms (23º), and after the prisms were removed; visual and auditory responses are shown for each condition.]
adapted from Knudsen & Knudsen 1989
Plasticity of auditory receptive fields in the owl tectum
[Figure: auditory receptive fields before → after 8 weeks of prisms (23º).]
Why do auditory receptive fields shift toward
visual receptive fields?
If visual responses are stronger and/or more reliable,
then a Hebbian mechanism is sufficient to explain this.
[Schematic: alignment of visual and auditory receptive fields before prisms, just after the prisms go on, and after plasticity.]
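A toy version of that Hebbian argument: if a strong, reliable visual response now occurs at the prism-shifted location, the auditory inputs tuned near that location are the ones repeatedly co-active with the postsynaptic cell, and Hebbian strengthening gradually pulls the auditory receptive field toward the visual one. The tuning curves, learning rule, and parameters below are all assumptions chosen for illustration.

```python
import numpy as np

# Toy Hebbian model of the prism-induced shift of an auditory receptive field
# for a single tectum-like neuron. All parameters are assumptions for illustration.

azimuths = np.arange(-40.0, 41.0)          # auditory input channels, labeled by preferred azimuth (deg)

def tuning(center, width=8.0):
    """Gaussian activation of the auditory input channels for a source at `center` deg."""
    return np.exp(-((azimuths - center) ** 2) / (2.0 * width ** 2))

rng = np.random.default_rng(0)
w = tuning(0.0)                            # auditory weights: receptive field initially centered at 0 deg
prism_shift = 23.0                         # prisms displace the neuron's visual receptive field to +23 deg
lr = 0.02

for trial in range(2000):
    s = rng.uniform(-40.0, 40.0)           # azimuth of a single audiovisual source on this trial
    auditory_input = tuning(s)             # presynaptic auditory activity
    visual_drive = 2.0 * np.exp(-((s - prism_shift) ** 2) / (2.0 * 8.0 ** 2))  # strong, reliable visual response
    y = visual_drive + (w @ auditory_input) / w.sum()   # postsynaptic response, dominated by vision
    w += lr * y * auditory_input           # Hebbian strengthening of co-active auditory inputs
    w *= tuning(0.0).sum() / w.sum()       # simple normalization keeps total synaptic weight fixed

# The learned auditory weight profile ends up centered close to the shifted visual field (near +23 deg).
print(azimuths[np.argmax(w)])
```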
Somatosensation:
coordinate transforms
Kleinfeld, Ahissar, & Diamond, Curr. Opin. Neurobiol. 2006
Adapted from Fee, Mitra, & Kleinfeld, J. Neurophysiol. 1997
Somatosensation:
coordinate transforms
A rodent can determine the position of an object relative to its head using a
single moving vibrissa.
This requires interpreting somatosensory information within the coordinate
space dictated by vibrissa position.
Curtis & Kleinfeld Nature Neuroscience 2009
Sensory encoding in rat S1 cortex depends on vibrissa position
Primary somatosensory cortex neurons jointly encode both touch and whisking phase.
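To see why jointly encoding touch and whisking phase is useful, note that knowing where the vibrissa was in its cycle at the moment of contact lets a touch event be converted from "when in the whisk" into "where in head-centered space". The sketch below is a schematic illustration of that coordinate transform; the sinusoidal whisk trajectory and its parameters are assumptions, not measurements.

```python
import math

def vibrissa_angle(phase, midpoint_deg=60.0, amplitude_deg=20.0):
    """Assumed sinusoidal whisk trajectory: vibrissa angle relative to the head (deg)
    as a function of whisking phase (radians)."""
    return midpoint_deg + amplitude_deg * math.cos(phase)

def object_azimuth_from_touch(phase_at_contact, midpoint_deg=60.0, amplitude_deg=20.0):
    """A touch at a given whisking phase implies that the object sits at the vibrissa's
    angle at that instant, i.e., the touch is re-registered in head-centered coordinates."""
    return vibrissa_angle(phase_at_contact, midpoint_deg, amplitude_deg)

# Two touches at different phases of the whisk imply objects at different head-centered
# azimuths, even though the touch itself is the same "contact" event in both cases.
print(object_azimuth_from_touch(0.0))             # 80.0 deg: contact at one extreme of the whisk
print(object_azimuth_from_touch(math.pi / 2.0))   # 60.0 deg: contact at the middle of the whisk
```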
Sensory and motor circuits in the rodent whisking system are
linked by multiple recurrent loops
[Diagram: nested sensorimotor loops at successively higher levels (1°, 2°, 3°, 4°), each closing back through the motor neurons that drive the vibrissae.]
Kleinfeld, Ahissar, & Diamond, Curr. Opin. Neurobiol. 2006