
ICANN 2006, Greece
Backbone Structure of Hairy Memory
Cheng-Yuan Liou
Department of Computer Science and Information Engineering
National Taiwan University

Discussions

Patterns in the sets N_i,p and N_i,n are the backbones of the Hopfield
model; together they form the backbone structure of the model (see the
sketch below).

The hairy model is a homeostatic system.

All four methods, et-AM, e-AM, g-AM, and b-AM, derive
asymmetric weight matrices with nonzero diagonal elements
while preserving Hebb's postulate.

In almost all of our simulations, the evolution of states during
recall converged in a single iteration (basin-1) after learning.
This is very different from the iterative recall dynamics of many
other models.
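
A minimal NumPy sketch of both points above, assuming bipolar {-1, +1}
patterns and the classic Hebbian outer-product rule rather than the
et-AM/e-AM/g-AM/b-AM matrices themselves: N_i,p and N_i,n are interpreted
here as the stored patterns with the smallest stability margin on each
side of neuron i's hyperplane, and the recall loop counts the iterations
needed to reach a fixed point (basin-1 corresponds to a count of 1).

import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 6                          # neurons, stored patterns
X = rng.choice([-1, 1], size=(P, N))  # rows are bipolar patterns

W = (X.T @ X) / N                     # Hebbian outer-product weights
                                      # (diagonal kept nonzero, w_ii = P/N)

def backbones(W, X, i):
    """Stored patterns nearest neuron i's hyperplane, one per side."""
    h = X @ W[i]                      # local field of neuron i per pattern
    pos = [p for p in range(len(X)) if X[p, i] > 0]
    neg = [p for p in range(len(X)) if X[p, i] < 0]
    n_ip = min(pos, key=lambda p: h[p]) if pos else None  # positive side
    n_in = max(neg, key=lambda p: h[p]) if neg else None  # negative side
    return n_ip, n_in

def recall(W, s, max_iters=50):
    """Synchronous recall; returns the fixed point and iteration count."""
    for t in range(max_iters):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            return s, t               # t = 0 means s was already stable
        s = s_new
    return s, max_iters

probe = X[0].copy()
flip = rng.choice(N, size=4, replace=False)
probe[flip] *= -1                     # corrupt 4 bits of a stored pattern
_, iters = recall(W, probe)
print("converged after", iters, "iteration(s)")  # basin-1: typically 1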
Discussions
All three methods, et-AM, e-AM, and g-AM, operate one shift at a time:
each hyperplane is adjusted in turn, and each iteration improves the
location of a single hyperplane.
Each hyperplane is independent of all the others during learning, which
localizes both neuron damage and learning.
The computational cost is linearly proportional to the network size, N,
and the number of patterns, P (see the sketch below).
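
Since each row W[i] alone defines neuron i's hyperplane, the row
independence and the O(N * P) cost per sweep can be seen in a short
sketch. The perceptron-style correction below is a hypothetical stand-in
for the et-AM/e-AM/g-AM update rules, which the paper defines precisely;
only the row-by-row structure and the cost scaling are the point here.

import numpy as np

def train_row(X, i, eta=0.1, sweeps=100):
    """Fit neuron i's hyperplane so every stored pattern bit is stable."""
    P, N = X.shape
    w = np.zeros(N)
    for _ in range(sweeps):
        stable = True
        for p in range(P):                 # one pass = P margin checks: O(P)
            if X[p, i] * (w @ X[p]) <= 0:  # pattern p on the wrong side
                w += eta * X[p, i] * X[p]  # shift this one hyperplane only
                stable = False
        if stable:
            break
    return w                               # note: w[i] ends up nonzero

rng = np.random.default_rng(1)
N, P = 32, 5
X = rng.choice([-1, 1], size=(P, N))
W = np.vstack([train_row(X, i) for i in range(N)])  # rows trained independently
assert np.array_equal(np.sign(X @ W.T), X)          # all patterns are fixed points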
Discussions

All of the methods, et-AM, e-AM, g-AM, and b-AM, give nonzero values to
the self-connections, w_ii ≠ 0, which is very different from Hopfield's
setting, w_ii = 0.
We are still attempting to understand and clarify the meaning of the
setting w_ii = 0, in which newborn neurons start learning from full
self-reference, w_ii = 1, and end with whole-network reference,
w_ii = 0. This is beneficial for cultured neurons working as a whole,
and it implies that stabilizing memory might not be the only purpose of
learning and evolution.
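
A small illustration of where the diagonal setting enters, again using
plain Hebbian outer-product weights as a stand-in for the paper's
matrices. The limiting cases bracket the developmental reading above:
w_ii = 1 (full self-reference) makes every state trivially stable, while
w_ii = 0 forces each bit to be supported by the rest of the network,
which is one way to read the remark that stabilizing memory cannot be
the whole purpose.

import numpy as np

rng = np.random.default_rng(2)
N, P = 16, 3
X = rng.choice([-1, 1], size=(P, N))

W = (X.T @ X) / N                 # Hebbian weights: w_ii = P/N, not zero
W_hop = W.copy()
np.fill_diagonal(W_hop, 0.0)      # Hopfield's convention: w_ii = 0

s = rng.choice([-1, 1], size=N)   # an arbitrary, unstored state
W_self = np.eye(N)                # w_ii = 1: full self-reference
print(np.array_equal(np.sign(W_self @ s), s))  # True: every state is stable
print(np.array_equal(np.sign(W_hop @ s), s))   # generally False for unstored s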
Discussions

The Boltzmann machine can be designed
according to et-AM, e-AM, or g-AM.
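
The slide leaves that design unspecified; as a hedged sketch, below is
the standard stochastic unit update that a Boltzmann machine built on
any such weight matrix W would plug it into, with the self-connection
term removed from the local field so the conditional distribution stays
well defined.

import numpy as np

def gibbs_sweep(W, s, T=1.0, rng=None):
    """One sweep of stochastic bipolar unit updates at temperature T."""
    rng = np.random.default_rng() if rng is None else rng
    for i in rng.permutation(len(s)):
        h = W[i] @ s - W[i, i] * s[i]                # field, self-term excluded
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # P(s_i = +1)
        s[i] = 1 if rng.random() < p_plus else -1
    return s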