N. Laskaris
Professor John Hopfield
The Howard A. Prior Professor of Molecular Biology
Dept. of Molecular Biology
Computational Neurobiology; Biophysics
Princeton University
The physicist Hopfield showed that models
of physical systems could be used
to solve computational problems
Such systems could be implemented
in hardware by combining
standard components
such as capacitors and resistors.
The importance of Hopfield nets in practical applications is limited
by theoretical limitations of the structure,
but, in some cases, they form interesting models.
They are usually employed in binary-logic tasks:
e.g. pattern completion and association
The concept
In the early 1980s,
Hopfield published two scientific papers,
which attracted much interest.
(1982): ‘’Neural networks and physical systems with emergent collective computational abilities’’. Proceedings of the National Academy of Sciences, 79:2554-2558.
(1984): ‘’Neurons with graded response have collective computational properties like those of two-state neurons’’. Proceedings of the National Academy of Sciences, 81:3088-3092.
This was the starting point of the new era
of neural networks, which continues today.
‘‘The dynamics of brain computation’’
The core question: How is one to understand
the incredible effectiveness of a brain
in tasks such as recognizing
a particular face in a complex scene?
Like all computers,
a brain is a dynamical system
that carries out its computations
by the change of its 'state' with time.
Simple models of the dynamics of neural circuits are described that have collective dynamical properties. These can be exploited in recognizing sensory patterns.
Using these collective properties in processing information is effective in that it exploits the spontaneous properties of nerve cells and circuits to produce robust computation.
J. Hopfield’s quest
While the brain is totally unlike modern computers, much of what it does can be described as computation. Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation.
His research focuses on understanding how the neural circuits of the brain produce such powerful and complex computations.
Olfaction
The simplest problem in olfaction is simply identifying a known odor. However, olfaction allows remote sensing, and much more complex computations involving wind direction and fluctuating mixtures of odors must be described to account for the ability of homing pigeons or slugs to navigate through the use of odors.
Hopfield has been studying how such computations might be performed by the known neural circuitry of the olfactory bulb and prepiriform cortex of mammals, or the analogous circuits of simpler animals.
Dynamical systems
Any computer does its computation by its changes in internal state. In neurobiology, the change of the potentials of neurons (and changes in the strengths of the synapses) with time is what performs computations.
Systems of differential equations can represent these aspects of neurobiology. He seeks to understand some aspects of neurobiological computation through studying the behavior of equations modeling the time-evolution of neural activity.
Action potential computation
For much of neurobiology,
information is represented
by the paradigm of ‘‘firing rates’’,
i.e. information is represented
by the rate of generation of action potential spikes,
and the exact timing of these spikes is unimportant.
Action potential computation
Since action potentials
last only about a millisecond,
the use of action potential timing
seems a powerful potential means of neural
computation.
Action potential computation
There are cases,
for example the binaural auditory determination
of the location of a sound source,
where information is encoded
in the timing of action potentials.
Speech
Identifying words in natural speech is a difficult
computational task which brains can easily do.
Hopfield’s group uses this task as a test-bed
for thinking about
the computational abilities of neural networks
and neuromorphic ideas.
Simple (e.g. binary-logic) neurons
are coupled in a system
with recurrent signal flow.
A 2-neuron Hopfield network
with continuous states,
characterized by 2 stable states
1st example (contour plot)
2nd example
A 3-neuron Hopfield network with 2^3 = 8 states,
characterized by 2 stable states
The behavior of such a dynamical system
is fully determined by the synaptic weights (Wij = Wji)
and can be thought of as
an Energy-minimization process.
3rd example
Hopfield Nets are fully connected,
symmetrically-weighted networks
that extended the ideas of linear associative memories
by adding cyclic connections.
Note: no self-feedback!
Operation of the network
After the ‘teaching-stage’, in which the weights are defined, the initial state of the network is set (input pattern) and a simple recurrent rule is iterated till convergence to a stable state (output pattern).
There are two modes of operation: Synchronous vs. Asynchronous updating.
Regarding training a Hopfield net as a content-addressable memory, the outer-product rule for storing patterns is used: Hebbian Learning.
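A minimal sketch of the outer-product rule in Python (our own illustration, not from the slides; the function name is ours, while the bipolar patterns and the zeroed diagonal follow the slides' conventions of no self-feedback and symmetric weights):

```python
import numpy as np

def train_hopfield(patterns):
    """Store bipolar (+1/-1) patterns with the outer-product (Hebbian) rule."""
    patterns = np.asarray(patterns)   # shape: (n_patterns, n_neurons)
    W = patterns.T @ patterns         # sum of outer products  x x^T
    np.fill_diagonal(W, 0)            # no self-feedback: W_ii = 0
    return W                          # symmetric by construction: W_ij = W_ji

# the two patterns memorized in the example that follows
W = train_hopfield([[1, -1, 1], [-1, 1, -1]])
print(W)   # [[ 0 -2  2] [-2  0 -2] [ 2 -2  0]]
```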
[Diagram: probe pattern → dynamical evolution]
A Simple Example
Step_1. Design a network
with memorized patterns (vectors) [ 1, -1, 1 ] & [ -1, 1, -1 ]
Step_2. Initialization
There are 8 different states
that can be reached by the net
and therefore can be used as its initial state
(neurons: #1 → y1, #2 → y2, #3 → y3)
Step_3. Iterate till convergence
- Synchronous Updating -
3 different examples of the net’s flow.
It converges immediately.
Step_3. Iterate till convergence
- Synchronous Updating -
Stored pattern
Schematic diagram of all the dynamical trajectories
that correspond to the designed net.
Or
Step_3. Iterate till convergence
- Asynchronous Updating -
Each time, select one neuron at random and update its state with the previous rule, with the (usual) convention that if the total input to that neuron is 0, its state remains unchanged.
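Both update modes can be sketched in Python for the 3-neuron example above (a minimal illustration; the weight matrix is the one produced by the outer-product rule, and the function names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def update_sync(W, y):
    """Synchronous updating: all neurons recompute their states at once."""
    h = W @ y                                          # total inputs
    return np.where(h > 0, 1, np.where(h < 0, -1, y))  # h == 0: state unchanged

def update_async(W, y):
    """Asynchronous updating: one randomly selected neuron at a time."""
    y = y.copy()
    i = rng.integers(len(y))
    h = W[i] @ y
    if h != 0:                                         # zero input: keep the state
        y[i] = 1 if h > 0 else -1
    return y

W = np.array([[0, -2, 2],
              [-2, 0, -2],
              [2, -2, 0]])     # weights from the outer-product rule above
y = np.array([1, 1, 1])        # one of the 8 possible initial states
for _ in range(5):             # iterate till convergence
    y = update_sync(W, y)
print(y)                       # [ 1 -1  1]: it converges immediately
```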
Explanation of the convergence
There is an energy function associated
with each state of the Hopfield network:
E( [y1, y2, …, yn]T ) = -Σi Σj wij yi yj
where [y1, y2, …, yn]T
is the vector of the neurons’ outputs,
wij is the weight from neuron j to neuron i,
and the double sum is over i and j.
The corresponding
dynamical system evolves
toward states of lower Energy
States of lowest energy
correspond to attractors
of Hopfield-net dynamics
[Energy-landscape plot: the attractor-state lies at a minimum of E]
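Tracking E during a run makes the convergence visible; a small sketch using the slides' formula (weight matrix as above, the helper name is ours):

```python
import numpy as np

def energy(W, y):
    """E(y) = -sum_i sum_j w_ij y_i y_j, as in the formula above."""
    return -float(y @ W @ y)

W = np.array([[0, -2, 2],
              [-2, 0, -2],
              [2, -2, 0]])
print(energy(W, np.array([1, 1, 1])))     # initial state:  E =  4
print(energy(W, np.array([1, -1, 1])))    # stored pattern: E = -12 (a minimum)
```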
Capacity of the Hopfield memory
In short, while training the net
(via the outer-product rule)
we store patterns by planting different
attractors in the state-space of the system.
While operating,
the net searches for the closest attractor.
When this is found,
the corresponding pattern of activation is output.
How many patterns can we store in a Hopfield net?
About 0.15 N, where N is the number of neurons.
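The ≈0.15 N figure can be probed empirically; a rough sketch of such an experiment (our own illustration, checking only whether each stored pattern is a fixed point of the update rule):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100                                    # number of neurons

for P in (5, 10, 15, 20, 25):              # number of stored patterns
    X = rng.choice([-1, 1], size=(P, N))   # random bipolar patterns
    W = X.T @ X
    np.fill_diagonal(W, 0)
    stable = 0
    for x in X:                            # a pattern counts as stored if it
        h = W @ x                          # is a fixed point of the update rule
        y = np.where(h > 0, 1, np.where(h < 0, -1, x))
        stable += np.array_equal(y, x)
    print(f"P/N = {P / N:.2f}: {stable}/{P} patterns are stable")
```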
Computer
Experimentation
Class-project
A simple
Pattern Recognition
Example
Stored Patterns (binary images)
Perfect Recall
Image Restoration
Erroneous Recall
Irrelevant results
Note: explain the ‘negatives’…
The continuous Hopfield-Net
as optimization machinery
[ Tank and Hopfield ;
IEEE Trans. Circuits Syst. 1986; 33: 533-541.]:
‘Simple "Neural" Optimization Networks:
An A/D Converter, Signal Decision Circuit,
and a Linear Programming Circuit’
Hopfield modified his network
so as to work with continuous activations and,
by adopting a dynamical-systems approach,
showed that the resulting system is characterized
by a Lyapunov function,
which he termed ‘Computational Energy’
and which can be used to tailor
the net for specific optimizations.
The system of coupled differential equations
describing the operation of the continuous Hopfield net:

dui/dt = -ui/τi + Σ(j=1..n) Tij gj(uj) + Ii

Neuronal outputs: Yi ≡ Vi = g(ui), with g(u) = ½ (1 + tanh(gain·u))
Biases: Ii
Weights: Wij ≡ Tij, with Tij = Tji and Tii = 0

The Computational Energy:
E = -½ Σ(i=1..n) Σ(j=1..n) Tij gi(ui) gj(uj) - Σ(i=1..n) Ii gi(ui)
  = -½ Σ(i=1..n) Σ(j=1..n) Tij Vi Vj - Σ(i=1..n) Ii Vi
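A sketch of these dynamics under forward-Euler integration (our own illustration; τi = 1, the gain value, and the 2-neuron weights are assumptions, not from the slides):

```python
import numpy as np

def g(u, gain=5.0):
    """Output nonlinearity: g(u) = 1/2 * (1 + tanh(gain * u))."""
    return 0.5 * (1.0 + np.tanh(gain * u))

def simulate(T, I, u0, tau=1.0, dt=0.01, steps=2000):
    """Forward-Euler integration of du_i/dt = -u_i/tau + sum_j T_ij g(u_j) + I_i."""
    u = u0.astype(float).copy()
    for _ in range(steps):
        u += dt * (-u / tau + T @ g(u) + I)
    return g(u)                            # outputs V_i near convergence

# a 2-neuron flip-flop: mutual inhibition creates two stable states
T = np.array([[0.0, -2.0],
              [-2.0, 0.0]])                # T_ij = T_ji, T_ii = 0
I = np.array([1.0, 1.0])
print(simulate(T, I, np.array([0.1, -0.1])))   # settles near V = [1, 0]
```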
When Hopfield nets are used for function optimization,
the objective function F to be minimized is written
in the form of the computational energy E.
The comparison between E and F
leads to the design,
i.e. the definition of links and biases,
of the network that can solve the problem.
The actual advantage of doing this
is that the Hopfield net
has a direct hardware implementation,
which enables even a VLSI integration
of the algorithm performing the optimization task.
An example:
‘Dominant-Mode Clustering’
Given a set of N vectors {Xi}, find the k among them
that form the most compact cluster {Zi}.
F({ui}) = Σ(i=1..N) Σ(j=1..N) ||Xi - Xj||^2 ui uj
{ui} with ui = 1 if Xi ∈ {Zi}, ui = 0 if Xi ∉ {Zi}, and Σ(i=1..N) ui = k
The objective function F can easily be written
in the form of the computational energy E.
There is an additional constraint
so that exactly k neurons are ‘on’.
F({ui}) = Σ(i=1..N) Σ(j=1..N) ||Xi - Xj||^2 ui uj
F = -½ Σ(i=1..N) Σ(j=1..N) Tij Vi Vj - Σ(i=1..N) Ii Vi
Matching the two forms gives:
Tij = Tji = -2 D(i, j) = { 0 if i = j ; -2 ||Xi - Xj||^2 if i ≠ j }
Ii^obj = 0
With each pattern Xi we associate a neuron
in the Hopfield network (i.e. #neurons = N).
The synaptic weights are the pairwise distances (times -2).
If a neuron’s activation is ‘1’ when the net converges,
the corresponding pattern is included in the cluster.
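A sketch of this design step in Python (our own illustration: the data are made up, an exhaustive search over k-subsets stands in for the net's relaxation, and the extra k-constraint penalty term is omitted):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)

# N = 8 points: a tight cluster of 4 plus 4 scattered outliers (made-up data)
X = np.vstack([rng.normal(0, 0.1, (4, 2)), rng.normal(0, 3.0, (4, 2))])
N, k = len(X), 4

# one neuron per pattern; T_ij = -2 * ||X_i - X_j||^2 (so T_ii = 0), I_i = 0
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
T = -2.0 * D

def computational_energy(u):
    """E = -1/2 sum_ij T_ij u_i u_j (the biases I_i vanish here)."""
    return -0.5 * u @ T @ u

# exhaustive search over the k-subsets stands in for the net's relaxation
best = min(combinations(range(N), k),
           key=lambda s: computational_energy(np.isin(np.arange(N), s).astype(float)))
print("most compact cluster:", best)   # expected: the 4 tight points (0, 1, 2, 3)
```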
A classical example:
‘The Travelling
Salesman Problem’
The principle
Coding a possible route
as a combination
of neurons’ firings
Route: 5 → 3 → 4 → 1 → 2 → 5
Tour length: |5-3| + |3-4| + |4-1| + |1-2| + |2-5|
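The slides' route can be encoded as a city-by-position matrix of neuron firings; a sketch (our own illustration; the |i - j| distance is only the slides' toy notation, not real city distances):

```python
import numpy as np

cities = [5, 3, 4, 1, 2]                # visiting order; the tour closes 2 -> 5
n = len(cities)

# one neuron per (city, position): V[c, p] = 1 iff city c is visited at step p
V = np.zeros((n, n), dtype=int)
for pos, c in enumerate(cities):
    V[c - 1, pos] = 1                   # rows: cities 1..5, columns: positions

# a valid tour: exactly one active neuron per row and per column
assert (V.sum(axis=0) == 1).all() and (V.sum(axis=1) == 1).all()

# tour length with the slides' toy distance |i - j|
length = sum(abs(cities[i] - cities[(i + 1) % n]) for i in range(n))
print(length)                           # |5-3|+|3-4|+|4-1|+|1-2|+|2-5| = 10
```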
An example
from clinical Encephalography
The problem:
The idea:
The solution:
‘‘Hopfield Neural Nets
for monitoring Evoked Potential Signals’’
N. Laskaris et al.
[ Electroenc. Clin. Neuroph. 1997;104(2) ]
The Boltzmann
Machine
Improving Hopfield nets
by simulated annealing
and by adopting
more complex topologies.
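A sketch of the stochastic-update idea behind this (our own illustration: Glauber dynamics with a geometric cooling schedule; as the temperature drops, the rule approaches the deterministic Hopfield update):

```python
import numpy as np

rng = np.random.default_rng(3)

def anneal(W, y, T0=5.0, cooling=0.995, steps=3000):
    """Glauber dynamics: stochastic flips whose probability depends on T."""
    y, T = y.copy(), T0
    for _ in range(steps):
        i = rng.integers(len(y))
        h = W[i] @ y                               # local field on neuron i
        a = np.clip(2.0 * h / T, -50, 50)          # avoid overflow at low T
        y[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-a)) else -1
        T *= cooling                               # geometric cooling schedule
    return y

W = np.array([[0, -2, 2], [-2, 0, -2], [2, -2, 0]])
print(anneal(W, np.array([1, 1, 1])))              # typically a stored pattern
```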
(430 – 355) BC
‘So let me close here . . . .
..............
. . . . someone else
will perhaps complete
what I could not finish’
- Themistogenes of Syracuse
1st year of the 105th Olympiad
HELLENICA (ΕΛΛΗΝΙΚΑ)
(1979-1982)
(1982)
Hopfield-nets
PNAS
‘‘The kids in the stands
are your only Hope…’’
A Very Last Comment
on Brain-Mind-Intelligence-Life-Happiness
How I Became Stupid
by
Martin Page
Penguin Books, 2004, 160 pp.
ISBN: 0-14-200495-2
In HOW I BECAME STUPID,
the 25-year-old Antoine concludes
‘‘to think is to suffer’’,
a twist on the familiar assertion of Descartes.
For Antoine, intelligence
is the source of unhappiness.
He embarks on a series of hilarious strategies
to make himself
stupid and possibly happy.
Animals
that Abandon
their Brains
Dr. Jun Aruga
Laboratory for Comparative Neurogenesis
A “primitive but successful” animal:
Oxycomanthus japonicus
There is astonishing diversity in the nervous systems of animals.
From the basic, distributed nervous systems of jellyfish and sea
anemones, to the centralized neural networks of squid and
octopuses, to the complex brain structures at the terminal end
of the neural tube in vertebrates,
the variation across species is humbling.
People may claim that “more advanced” species like humans
are the result of an increasingly centralized nervous system that
was produced through evolution.
This claim of advancement through evolution is a common, but
misleading, one. It suggests that evolution always moves in one
direction: the advancement of species by increasing complexity.
Evolution may selectively enable body structures
that are more enhanced and complicated,
but it may just as easily favour species
that abandon complex adaptations
in favour of simplification.
Brains, too, have evolved in the same way.
While the brains of some species, including humans,
developed to allow them to thrive,
others have abandoned their brains
because they are no longer necessary.
For example, the ascidian, or sea squirt, which lives in shallow
coastal waters and is a staple food in certain regions,
has a vertebrate-like neural structure with a neural tube and
notochord in its larval stage.
As the larva becomes an adult, however,
these features disappear until
only very basic ganglions remain.
In evolutionary terms this animal is a “winner”
because it develops a very simplified neural system better
adapted to a stationary life in seawater.
In the long run, however, evolutionary success will be
determined by which species survives longer:
humans with their complex brains (and their weapons),
or the brainless Dicyemida.
1948-1990
Great-grandson of Zorbas and nephew of Elli Alexiou.
He was born in Athens.
He began his career in 1970 in Thessaloniki
with the duo “Damon and Phintias”.
In 1976 he founded the band “Spyridoula”.
Is our thinking
our master
or
our servant?
Emotional Intelligence
also called EI or EQ,
describes an ability, capacity, or skill
to perceive, assess, and manage
the emotions of one’s self, of others, and of groups.
Poetic intelligence
may be absent from the
know-it-alls,
and yet dwell
within the simplest of people.
Class-project Oral-Exams
Oral-Exam Appointments (Time: 1st, 2nd, or 3rd hour of the session)

Date   | AEM
31 May | 794, 845, 893, 899, 915, 920, 932, 949, 1023
5 June | 711, 809, 874, 909, 923, 950, 979, 1024, 1227
7 June | 627, 887, 946, 960, 962, 980, 995, 1202, 1223
Further Inquiries