Artificial Speciation and Automatic Modularisation

Neural Network Ensemble
• A group of neural networks is used to solve a
problem, with each network perhaps concentrating
on part of the dataset
• The final output of the ensemble is determined by
combining the outputs of the individual NNs using:
– Majority voting
– Simple averaging
– Weighted averaging
• Evolutionary NNE: the ensemble is the whole
population, or a sub-population formed by clustering
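The three combination schemes above can be sketched in a few lines of NumPy. This is an illustrative helper, not code from the paper; the function name and the 0.5 voting threshold are assumptions for a binary-output setting.

```python
import numpy as np

def combine_outputs(outputs, method="average", weights=None):
    """Combine per-network outputs into one ensemble prediction.

    outputs: array of shape (n_networks, n_samples); for majority
    voting each output is thresholded at 0.5 to cast a 0/1 vote.
    """
    outputs = np.asarray(outputs, dtype=float)
    if method == "majority":
        votes = (outputs > 0.5).astype(int)               # one vote per network
        return (votes.sum(axis=0) > outputs.shape[0] / 2).astype(int)
    if method == "average":
        return outputs.mean(axis=0)                       # simple averaging
    if method == "weighted":
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                                   # normalise the weights
        return np.tensordot(w, outputs, axes=1)           # weighted average
    raise ValueError(f"unknown method: {method}")
```

For example, three networks emitting [0.9, 0.8, 0.1] for one pattern give a majority vote of 1 but a simple average of 0.6.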
Distance Measure
• The distance between two NNs, D(p, q), is
measured by the (symmetric) cross-entropy:

  D(p, q) = (1/2) Σ_{j=1}^{n} [ p_j log(p_j / q_j) + q_j log(q_j / p_j) ]

• A low D(p, q) means p and q are very similar
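A minimal sketch of this distance in Python, assuming p and q are the two networks' output vectors over the n training patterns; the `eps` clipping is an added safeguard against log(0), not something specified on the slide.

```python
import numpy as np

def distance(p, q, eps=1e-12):
    """Symmetric cross-entropy distance between two output vectors:
        D(p, q) = 1/2 * sum_j [ p_j*log(p_j/q_j) + q_j*log(q_j/p_j) ]
    Values are clipped to eps so the logarithms stay finite.
    """
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    return 0.5 * np.sum(p * np.log(p / q) + q * np.log(q / p))
```

Note that D(p, q) = D(q, p) and D(p, p) = 0, matching the intuition that a low distance means very similar behaviour.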
Fitness Sharing
• Speciation at the phenotypic (behavior) level
• Raw fitness:

  f_raw,p = 1 / MSE_p

• Shared fitness:

  f_shared,p = f_raw,p / Σ_{j=1}^{n} s(D(p, q_j))
• The sharing function, s(D(p, q)), can be linear
– or a Gaussian distribution in this case?
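Putting the two formulas together, a sketch of fitness sharing over a whole population, using the linear (triangular) sharing option mentioned above; `sigma_share` is an assumed niche-radius parameter, not a value from the slides.

```python
import numpy as np

def shared_fitness(mse, dist, sigma_share=1.0):
    """Fitness sharing over a population of n networks.

    mse:  length-n array of mean squared errors (raw fitness = 1/MSE).
    dist: n-by-n matrix of pairwise distances D(p, q).
    """
    mse = np.asarray(mse, dtype=float)
    dist = np.asarray(dist, dtype=float)
    f_raw = 1.0 / mse
    # Linear sharing: s(d) = 1 - d/sigma for d < sigma, else 0.
    s = np.clip(1.0 - dist / sigma_share, 0.0, None)
    niche_count = s.sum(axis=1)        # sum of s(D(p, q_j)) over the population
    return f_raw / niche_count
```

Networks that sit alone in their niche keep their raw fitness (the diagonal term s(0) = 1 is the only contribution), while clusters of behaviourally similar networks have their fitness divided among themselves.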
Negative Correlation Learning
• Error of NN i, averaged over the N training patterns:

  E_i = (1/N) Σ_{n=1}^{N} (1/2)(F_i(n) − d(n))² + (1/N) Σ_{n=1}^{N} λ p_i(n)

• The penalty term:

  p_i(n) = (F_i(n) − F(n)) Σ_{j≠i} (F_j(n) − F(n))

where F(n) is the ensemble (average) output and λ weights the penalty
The Algorithm
• Elitist: keep the top portion with high f_raw and
the one with the best f_shared
• Selection based on shared fitness
• Crossover: exchange subnets with the same
inputs and outputs
• Mutation: add/delete a connection between
two randomly chosen nodes
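The generational structure of the algorithm can be sketched as below. This is a skeleton, not the paper's implementation: genotypes are treated as opaque objects, `elite_frac` is an assumed parameter, and the subnet-crossover and connection-mutation operators are left as a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

def next_generation(pop, f_raw, f_shared, elite_frac=0.1):
    """One generation of the elitist EA described on the slide.

    Keeps the top elite_frac of the population by raw fitness plus the
    single best individual by shared fitness, then fills the remaining
    slots by selection proportional to shared fitness.
    """
    n = len(pop)
    n_elite = max(1, int(elite_frac * n))
    elite_idx = set(np.argsort(f_raw)[-n_elite:].tolist())  # top raw fitness
    elite_idx.add(int(np.argmax(f_shared)))                 # best shared fitness
    new_pop = [pop[i] for i in sorted(elite_idx)]
    probs = np.asarray(f_shared, dtype=float)
    probs = probs / probs.sum()
    while len(new_pop) < n:
        parent = pop[rng.choice(n, p=probs)]   # shared-fitness selection
        child = parent                         # placeholder: crossover/mutation
        new_pop.append(child)
    return new_pop
```

Keeping both the high-f_raw elites and the best-f_shared individual preserves the single best problem solvers while still protecting the most distinct niche.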
Some Thoughts
• Very high computational cost in calculating the
cross-entropy and the fitness sharing function!
• EDO
– Fitness sharing among agents of the same parent?
– Speciation between agents belonging to different
lineages?
• ERA?
References
1. V. Khare and X. Yao, "Artificial Speciation and
Automatic Modularisation," in Proc. of SEAL'02.
2. Y. Liu, X. Yao and T. Higuchi, "Evolutionary
Ensembles with Negative Correlation Learning,"
IEEE Trans. on Evolutionary Computation, 4(4),
pp. 380-387, November 2000.