NeuralComp - University of Washington


Biologically Inspired
Computation
Chris Diorio
Computer Science & Engineering
University of Washington
[email protected]
Nature is telling us something...
• Digital computers can add numbers together in nanoseconds, a feat hopelessly beyond the capabilities of brains
• Brains can understand speech trivially, far ahead of digital computers
• …and Moore's law will end
Problem: How do we build circuits that learn?
• One approach: Emulate neurobiology
• Dense arrays of synapses
[Figure: an array of synapses connecting the input vector X = (X1, X2) to two outputs through weights W11, W12, W21, W22; each output is the weighted sum output_i = Σ_j W_ij · X_j, and each synapse receives its own learn and error signals.]
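
The weighted-sum outputs and the per-synapse learn/error signals in the figure above can be sketched in a few lines of code. This is only a minimal software analogue, not the chip's adaptation rule; the error-driven update and the learning rate eta are assumptions made for the sketch.

```python
import numpy as np

def layer_output(W, x):
    # Each output is a weighted sum of the inputs: output_i = sum_j W[i, j] * x[j]
    return W @ x

def local_update(W, x, error, eta=0.01):
    # Toy error-driven rule: each synapse W[i, j] adapts using only its own input
    # x[j] and its row's error signal (local, parallel adaptation).
    return W + eta * np.outer(error, x)

# Two inputs and two outputs, as in the figure (weights W11, W12, W21, W22).
W = np.array([[0.2, -0.1],
              [0.4,  0.3]])
x = np.array([1.0, 0.5])          # input vector X = (X1, X2)
target = np.array([0.5, 0.2])     # hypothetical training target

y = layer_output(W, x)
W = local_update(W, x, error=target - y)
```
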
Silicon synapses
• Use the silicon physics itself for learning
• Local, parallel adaptation
• Nonvolatile memory
[Figure: left, "Silicon Synapse Transistor": device cross-section showing n+, n–, and p regions over a p– substrate, with a floating gate that stores charge Q and arrows marking electron injection and electron tunneling. Right, "Charge Q Sets the Weight": source current (A, log scale from 10^-11 to 10^-5) versus control-gate-to-source voltage (V, 0 to 5), one curve for each stored charge Q1 through Q5.]
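
One way to read the I-V plot: with the transistor in subthreshold, the source current rises roughly exponentially with the control-gate voltage, and the charge Q stored on the floating gate shifts the whole curve, so Q1 through Q5 trace a family of exponentials. The sketch below is a toy model under that assumption; the off current, slope, and charge-induced voltage shifts are illustrative numbers, not measured device parameters.

```python
import numpy as np

def source_current(v_cg, v_shift, i_off=1e-11, decades_per_volt=1.5):
    # Toy subthreshold model: current grows exponentially with the control-gate
    # voltage; the stored floating-gate charge Q shifts the curve by v_shift volts.
    return i_off * 10.0 ** (decades_per_volt * (v_cg - v_shift))

v_cg = np.linspace(0.0, 5.0, 101)          # control-gate-to-source voltage (V)
for q, v_shift in zip("Q1 Q2 Q3 Q4 Q5".split(), (1.0, 1.5, 2.0, 2.5, 3.0)):
    i_s = source_current(v_cg, v_shift)
    print(f"{q}: I_s at 5 V = {i_s[-1]:.1e} A")
```
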
Silicon synapses can mimic biology
• Local, autonomous learning
[Figure: two panels, "Biological Synapses" and "Silicon Synapses", each plotting synapse response (the silicon panel in nA, 0 to 5) against time (–10 to 50 min).
Left: mossy-fiber EPSC amplitudes plotted over time, before and after the induction of LTP; brief tetanic stimulation was applied at the time indicated. From Barrionuevo et al., J. Neurophysiol. 55:540-550, 1986.
Right: synapse transistor source currents plotted over time, before and after we applied a tetanic stimulation of 2×10^5 coincident (row & column) pulses, each of 10 µs duration, at the time indicated.]
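
The silicon panel can be read as a nonvolatile, Hebbian-style weight change: coincident row and column pulses add charge to the floating gate, stepping the source current up and leaving it there. The toy calculation below assumes a fixed current increment per coincident pulse, which is an illustrative simplification, not the device's measured injection behavior.

```python
def apply_tetanus(i_source_amps, n_pulses=200_000, delta_per_pulse=1e-14):
    # Toy model: each coincident (row & column) pulse adds a small, persistent
    # increment to the synapse transistor's source current.
    return i_source_amps + n_pulses * delta_per_pulse

i_before = 1.0e-9                 # ~1 nA baseline source current
i_after = apply_tetanus(i_before) # after 2e5 coincident 10 µs pulses
print(f"before: {i_before * 1e9:.1f} nA, after: {i_after * 1e9:.1f} nA")
```
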
Synaptic circuits can learn complex functions
• Synapse-based circuit operates on probability distributions
• Competitive learning
• Nonvolatile memory
• 11 transistors
• 0.35µm CMOS
• Silicon physics learns "naturally"
[Plot: output value (V, 0 to 0.8) versus number of training examples (0 to 4000), comparing the true means, the circuit output, and a software neural network.]
• Silicon learning circuit versus software neural network
• Both unmix a mixture of Gaussians
• Silicon circuit consumes nanowatts
• Scalable to many inputs and dimensions
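
The experiment behind the plot, learning the means of a mixture of Gaussians from a stream of examples, can be imitated in software with a simple competitive-learning rule: the unit whose stored value is closest to each sample wins and moves toward it. This is a generic software analogue for illustration only; the true means, learning rate, and noise level are made-up values, not the circuit's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-component 1-D mixture of Gaussians (means and spread are illustrative).
true_means = np.array([0.3, 0.7])
def sample():
    return rng.normal(true_means[rng.integers(2)], 0.05)

# Competitive learning: only the winning unit adapts toward each example.
estimates = np.array([0.50, 0.55])   # initial stored values
eta = 0.02                           # learning rate (assumed)
for _ in range(4000):                # number of training examples, as in the plot
    x = sample()
    winner = np.argmin(np.abs(estimates - x))
    estimates[winner] += eta * (x - estimates[winner])

print("true means:", true_means)
print("learned   :", np.round(np.sort(estimates), 3))
```
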
Technology spinoff: Adaptive filters
• Synapse transistors for signal processing
• ~100× lower power and ~10× smaller size than digital
Mixed-signal FIR filter:
  16 taps, 7 bits, 225 MHz, 2.5 mW
  Built and tested in 0.35µm CMOS
  Synaptic tap weights adjusted off-line

FIR filter with on-chip learning:
  64 taps, 10 bits, 200 MHz, 25 mW
  In fabrication in 0.35µm CMOS
  On-line synapse-based LMS
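
Both chips compute the standard FIR sum y[n] = Σ_k w_k · x[n-k]; the second adapts its tap weights on-line with LMS, w_k ← w_k + µ · e[n] · x[n-k], where e[n] is the error against a desired signal. The sketch below shows that behavior in software; the step size µ and the unknown 3-tap target filter are placeholders, not the chips' actual parameters.

```python
import numpy as np

def lms_fir(x, desired, n_taps=16, mu=0.01):
    # FIR filter whose tap weights adapt on-line with the LMS rule.
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(len(x)):
        recent = x[max(0, n - n_taps + 1):n + 1][::-1]      # x[n], x[n-1], ...
        recent = np.pad(recent, (0, n_taps - len(recent)))   # zero-pad at start-up
        y[n] = w @ recent                                    # y[n] = sum_k w[k] * x[n-k]
        e = desired[n] - y[n]                                # error signal
        w += mu * e * recent                                 # LMS tap-weight update
    return y, w

# Example: learn to imitate an unknown 3-tap filter from input/output samples.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
desired = np.convolve(x, [0.5, -0.2, 0.1])[:len(x)]
_, w = lms_fir(x, desired)
print("leading taps:", np.round(w[:3], 3))   # should approach [0.5, -0.2, 0.1]
```
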
Problem: How to study the neural basis of behavior?
• Measure neural signaling in intact animals
• Implant a microcontroller in the Tritonia brain
• Tritonia is a model organism
• Well-studied neurophysiology
• 500µm neurons; tolerant immune response
• Work in progress
[Figure A: Tritonia diomedea and a sea pen. Figure B: dorsal view of the brain with the implanted chip, showing the MEMS probe tip and amplifier, the brain, the visceral cavity, memory, tether, battery, and the microcontroller with A/D and cache. Images courtesy James Beck & Russell Wyeth.]
An in-flight data recorder for insects
• An autonomous microcontroller "in-the-loop"
• Study the neural basis of flight control
[Photo: Manduca sexta, or "hawk moth".]