Expanding SOM


A Self-Organizing Map with Expanding Force for Data Clustering and Visualization
Advisor: Dr. Hsu
Graduate: You-Cheng Chen
Authors: Wing-Ho Shum, Hui-Dong Jin, Kwong-Sak Leung
Outline
Motivation
Objective
Introduction
Expanding SOM
Experimental Results
Example
Conclusions
Personal Opinion
Review
Motivation
SOM maps high-dimensional data items onto a low-dimensional grid of neurons. Neighborhood preservation, however, cannot always lead to perfect topology preservation.
Objective

We establish an Expanding SOM (ESOM) to detect and preserve a better topological correspondence.
Introduction
Expanding SOM
To detect and preserve the topology relationship, we consider the distance between the data items and their center.
We introduce a new learning rule that also learns the linear ordering relationship: the expanding coefficient $c_j(t)$, which is used to push neurons away from the center of all data items during the learning process.
Expanding SOM
The ESOM algorithm
1. Linearly transform the coordinates of the data items
$x'_i = (x'_{1i}, x'_{2i}, \dots, x'_{Di})$, $i = 1, \dots, N$.
Let the center of all data items be
$$x'_C = \frac{1}{N} \sum_{i=1}^{N} x'_i$$
The input neurons are given by $x_i = \frac{R}{D_{max}} (x'_i - x'_C)$ for all $i$, where
N is the number of data items,
D is the dimensionality of the data set, and
Dmax is the maximum distance of data items from the data center.
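As an illustration of Step 1, a minimal NumPy sketch of the transformation is given below; the function name and the radius value R = 0.9 are assumptions for this example, not values given in the slides.

```python
import numpy as np

def normalize_to_sphere(X, R=0.9):
    """Step 1 sketch: center the data items and scale them into the
    sphere S_R. X is an (N, D) array of data items; R < 1 is an
    illustrative radius, not a value fixed by the slides."""
    x_center = X.mean(axis=0)                           # x'_C = (1/N) sum_i x'_i
    d_max = np.linalg.norm(X - x_center, axis=1).max()  # D_max: farthest item from the center
    return (R / d_max) * (X - x_center)                 # x_i = (R / D_max) (x'_i - x'_C)
```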
Expanding SOM
The ESOM algorithm
2. Initialize the weight vectors $w_j(0)$ ($j = 1, \dots, M$) with random values within the above sphere $S_R$, where M is the number of output neurons.
3. Select a data item at random, say $x_k(t) = (x_{1k}, x_{2k}, \dots, x_{Dk})$, and feed it to the input neurons.
Expanding SOM
The ESOM algorithm
4. Find the winning output neuron, say $m(t)$:
$$m(t) = \arg\min_j \, \| x_k(t) - w_j(t) \|$$
5. Train neuron $m(t)$ and its neighbors by using the following rule:
$$w_j(t+1) = c_j(t) \left[ w_j(t) + \alpha_j(t) \left( x_k(t) - w_j(t) \right) \right]$$
Expanding SOM
The ESOM algorithm
$$c_j(t) = \left[ 1 - 2 \alpha_j(t) \left( 1 - \alpha_j(t) \right) \kappa_j(t) \right]^{-\frac{1}{2}}$$
where $\kappa_j(t)$ is specified by
$$\kappa_j(t) = 1 - \langle x_k(t), w_j(t) \rangle - \sqrt{ \left( 1 - \| x_k(t) \|^2 \right) \left( 1 - \| w_j(t) \|^2 \right) }$$
Expanding SOM
The ESOM algorithm
6. Update $\alpha(t)$. If the learning loop has not reached a predetermined number of iterations, set $t = t + 1$ and go to Step 3.
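Putting Steps 2 to 6 together, a minimal end-to-end sketch of the training loop follows. It assumes a 1-D chain of output neurons, a Gaussian neighborhood function, and linearly decaying learning-rate and neighborhood schedules; none of these choices are fixed by the slides.

```python
import numpy as np

def train_esom(X, M=10, n_iters=5000, R=0.9, alpha0=0.5, seed=0):
    """Sketch of ESOM Steps 2-6. X is an (N, D) array already scaled
    into the sphere S_R by the Step 1 transformation."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Step 2: initialize weight vectors at random inside the sphere S_R.
    W = rng.normal(size=(M, D))
    W *= rng.uniform(0, R, size=(M, 1)) / np.linalg.norm(W, axis=1, keepdims=True)
    for t in range(n_iters):
        frac = t / n_iters
        alpha = alpha0 * (1 - frac)              # assumed decaying learning rate
        sigma = max(M / 2 * (1 - frac), 0.5)     # assumed shrinking neighborhood width
        x = X[rng.integers(N)]                   # Step 3: pick a data item at random
        m = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # Step 4: winning neuron m(t)
        # Step 5: neighborhood-weighted learning rates alpha_j(t) on the 1-D chain.
        a = alpha * np.exp(-((np.arange(M) - m) ** 2) / (2 * sigma**2))
        # Expanding coefficient c_j(t), with kappa_j(t) as defined above.
        kappa = (1 - W @ x
                 - np.sqrt(np.maximum(0, 1 - x @ x)
                           * np.maximum(0, 1 - np.sum(W * W, axis=1))))
        c = (1 - 2 * a * (1 - a) * kappa) ** -0.5
        W = c[:, None] * (W + a[:, None] * (x - W))  # w_j(t+1)
    return W
```

Combined with the earlier sketch, `train_esom(normalize_to_sphere(X_raw))` maps raw data straight through Steps 1 to 6.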
Expanding SOM
We employ both the quantization error $E_Q$ and the topological error $E_T$ to evaluate the mapping obtained by our ESOM.
$$E_Q = \frac{1}{N} \sum_{k=1}^{N} \| x_k - w_{m(k)} \|$$
where $m(k)$ denotes the winning neuron for data item $x_k$.
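The quantization error above translates directly into code. The sketch below also includes $E_T$ under its standard definition (the fraction of data items whose two nearest neurons are not neighbors on the map), since the slides name $E_T$ but do not spell out its formula.

```python
import numpy as np

def quantization_error(X, W):
    """E_Q = (1/N) sum_k ||x_k - w_m(k)||: the mean distance from each
    data item to its winning neuron, as on this slide."""
    dists = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)  # (N, M) distances
    return float(dists.min(axis=1).mean())

def topological_error(X, W):
    """Assumed standard definition of E_T (not given in the slides):
    the fraction of data items whose two nearest neurons are not
    adjacent on the 1-D neuron chain."""
    dists = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    best2 = np.argsort(dists, axis=1)[:, :2]  # indices of the two nearest neurons
    return float(np.mean(np.abs(best2[:, 0] - best2[:, 1]) > 1))
```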
Expanding SOM
Theoretical analysis
Theorem 1
(1) $w_j(t) \in S_R$, i.e. $\| w_j(t) \| \le R$, for $j \in \{1, 2, \dots, M\}$
(2) $1 \le c_j(t) \le \frac{1}{1 - R^2}$
(3) $c_j(t)$ increases with $\| x_k(t) \|$ when $x_k(t) \ne w_j(t)$
Expanding SOM
Theoretical analysis
Proof
$$c_j(t) = \left[ 1 - 2 \alpha_j(t) \left( 1 - \alpha_j(t) \right) \kappa_j(t) \right]^{-\frac{1}{2}}$$
Since $\kappa_j(t) \ge 0$ and $0 < 2 \alpha_j(t) (1 - \alpha_j(t)) \le \frac{1}{2}$, the bracketed term lies in $(0, 1]$, so $c_j(t) \ge 1$.
Expanding SOM
Theoretical analysis
Property (2) says that $c_j(t)$ is always larger than or equal to 1.0; in other words, it always pushes neurons away from the origin, but it never pushes the output neurons to infinitely distant locations.
Property (3) points out that the larger the distance between a data item and the center of all data items is, the stronger the expanding force $c_j(t)$ exerts on the associated output neuron.
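These bounds are easy to probe numerically. The following Monte-Carlo check samples random data items and weight vectors inside $S_R$ and verifies property (2); the radius R = 0.9 and the dimensionality are chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
R, D = 0.9, 3  # illustrative radius and dimensionality

def expanding_coefficient(x, w, alpha):
    """c_j(t) for a single data item x and weight vector w."""
    kappa = 1 - x @ w - np.sqrt((1 - x @ x) * (1 - w @ w))
    return (1 - 2 * alpha * (1 - alpha) * kappa) ** -0.5

# Sample x_k(t) and w_j(t) inside S_R and check 1 <= c_j(t) <= 1/(1 - R^2).
for _ in range(10_000):
    x, w = rng.normal(size=(2, D))
    x *= rng.uniform(0, R) / np.linalg.norm(x)
    w *= rng.uniform(0, R) / np.linalg.norm(w)
    c = expanding_coefficient(x, w, rng.uniform(0, 1))
    assert 1 - 1e-9 <= c <= 1 / (1 - R**2) + 1e-9  # small tolerance for floating point
```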
Experimental Results
Conclusions
ESOM produces better visualization results than the classic SOM in terms of both the topological and quantization errors.
Personal Opinion
We can apply this idea to improve the Extended SOM
implemented by our lab.
Review
The key points of Expanding SOM:
The input transformation: $x_i = \frac{R}{D_{max}} (x'_i - x'_C)$ for all $i$
The expanding coefficient: $c_j(t) = \left[ 1 - 2 \alpha_j(t) \left( 1 - \alpha_j(t) \right) \kappa_j(t) \right]^{-\frac{1}{2}}$