Transcript Document

SOFT COMPUTING &
COMPUTATIONAL INTELLIGENCE
• Biologically inspired computing models
• Compatible with human expertise/reasoning
• Intensive numerical computations
• Data and goal driven
• Model-free learning
• Fault tolerant
• Real world/novel applications
SOFT COMPUTING &
COMPUTATIONAL INTELLIGENCE
• Artificial Neural Networks (ANN)
• Fuzzy Logic
• Genetic Algorithms (GAs)
• Fractals/Chaos
• Artificial life
• Wavelets
[Diagram: Data mining shown at the overlap of ANNs, FL, and GAs.]
Biological neuron
[Figure: A biological neuron. Labeled parts: hair cell, dendrites (sensory transducer), synapses, cell body, axon hillock, and axon, with the direction of signal flow indicated.]
Artificial neuron
[Figure: An artificial neuron. Inputs $i_1, i_2, i_3$ enter with weights $w_1, w_2, w_3$; the weighted sum of the inputs, $w_1 i_1 + w_2 i_2 + w_3 i_3$, is passed through a nonlinear (sigmoid) transfer function, and the output $o$ lies between 0 and 1.]
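A minimal numeric sketch of this forward pass (Python; the input and weight values below are illustrative choices, not taken from the slide):

import math

# Illustrative inputs and weights (hypothetical values)
i1, i2, i3 = 0.5, -1.0, 2.0
w1, w2, w3 = 0.8, 0.3, -0.5

# Weighted sum of the inputs
s = w1 * i1 + w2 * i2 + w3 * i3            # -0.9

# Nonlinear (sigmoid) transfer function squashes the sum into (0, 1)
o = 1.0 / (1.0 + math.exp(-s))             # about 0.289

print(s, o)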
Neural net yields weights
to map inputs to outputs
[Figure: A neural network (weights $w_{11}, \ldots, w_{23}, \ldots, w_{34}$) projecting molecular descriptors (molecular weight, H-bonding, hydrophobicity, electrostatic interactions) onto observables (boiling point, biological response).]
There are many algorithms that can determine the weights for ANNs.
Neural networks in a nutshell
• A problem can be formulated and represented as a mapping problem from $\mathbb{R}^m$ to $\mathbb{R}^n$
• Such a map can be realized by an ANN, a framework assembled from basic building blocks: McCulloch-Pitts neurons
• The neural net can be trained to conform to the map from samples of the map, and will generalize reasonably to new cases it has not encountered before
Neural network as a map
$\mathbb{R}^m \to \mathbb{R}^n$
[Figure 1.1: The seven-bit parity problem posed as a mapping problem. Every seven-bit input string (0000000, 0000001, 0000010, ..., 1111111) is mapped to its parity bit (0 or 1): a map from $\mathbb{R}^7$ to $\mathbb{R}^1$.]

[Figure 1.2: Determining the radius of a 64x64 B&W image of a circle, posed as a formal mapping problem: a map from $\mathbb{R}^{4096}$ (the $64 \times 64$ pixel values) to $\mathbb{R}^1$ (the radius).]
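One way to make Figure 1.1 concrete is to enumerate the map as input/output samples; a short sketch (Python; representing the bits as float vectors is my assumption, chosen to match the $\mathbb{R}^7$ to $\mathbb{R}^1$ framing):

from itertools import product

def parity_samples(n_bits=7):
    """Enumerate every n-bit string together with its parity: a map from R^n to R^1."""
    samples = []
    for bits in product([0, 1], repeat=n_bits):
        x = [float(b) for b in bits]        # input vector in R^7
        y = [float(sum(bits) % 2)]          # 1.0 if an odd number of bits is set, else 0.0
        samples.append((x, y))
    return samples

data = parity_samples()
print(len(data))        # 128 = 2**7 input/output pairs
print(data[1])          # ([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0], [1.0])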
McCulloch-Pitts Neuron
[Figure: A McCulloch-Pitts neuron: inputs $x_1, \ldots, x_N$ with weights $w_1, \ldots, w_N$ feed a summation node followed by the transfer function $f$.]

$$\mathrm{sum} = \sum_{i=1}^{N} w_i x_i, \qquad y = f(\mathrm{sum}), \qquad f(\mathrm{sum}) = \frac{1}{1 + e^{-\mathrm{sum}}}$$
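These equations transcribe directly into a small Python function (a sketch; the names are mine):

import math

def mp_neuron(x, w):
    """McCulloch-Pitts neuron: weighted sum of the inputs followed by a sigmoid."""
    s = sum(wi * xi for wi, xi in zip(w, x))    # sum = w_1 x_1 + ... + w_N x_N
    return 1.0 / (1.0 + math.exp(-s))           # y = f(sum) = 1 / (1 + e^(-sum))

print(mp_neuron([0.5, -1.0, 2.0], [0.8, 0.3, -0.5]))   # about 0.289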
Neural network as collection of M-P neurons
[Figure: A feedforward network of M-P neurons: inputs $x_1, x_2$ plus a constant bias input of 1, a first hidden layer, a second hidden layer, and an output neuron, with weights such as $w_{111}, w_{112}, w_{113}, w_{123}, w_{211}, w_{232}, w_{321}$ on the connections.]

The training error over the output neurons and the gradient-descent weight update are

$$E = \sum_{j=1}^{n_{\mathrm{outputs}}} \left( y_j^{o} - t_j \right)^2$$

$$w_{ji}^{\,n+1} = w_{ji}^{\,n} + \Delta w_{ji}, \qquad \Delta w_{ji} = -\eta \, \frac{dE}{dw_{ji}}$$

where $y_j^{o}$ is the output of the $j$-th output neuron, $t_j$ the corresponding target, and $\eta$ the learning rate.
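The update rule can be made concrete without deriving the full backpropagation formulas by estimating $dE/dw_{ji}$ with finite differences. The sketch below (Python/NumPy; the network size, learning rate, bias handling, and the two-bit parity data are illustrative assumptions, and numerical gradients stand in for the many training algorithms mentioned earlier) trains a small net of M-P neurons by exactly this rule:

import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def forward(x, W1, W2):
    """Two layers of M-P neurons; the appended 1.0 plays the role of the bias input."""
    h = sigmoid(W1 @ np.append(x, 1.0))      # hidden layer activations
    return sigmoid(W2 @ np.append(h, 1.0))   # output neuron activations y

def error(X, T, W1, W2):
    """E = sum over samples and output neurons of (y_j - t_j)^2."""
    return sum(np.sum((forward(x, W1, W2) - t) ** 2) for x, t in zip(X, T))

# Toy mapping: two-bit parity (XOR), a map from R^2 to R^1
X = [np.array(v, float) for v in ([0, 0], [0, 1], [1, 0], [1, 1])]
T = [np.array([float(a != b)]) for a, b in ([0, 0], [0, 1], [1, 0], [1, 1])]

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))                 # hidden layer weights (incl. bias column)
W2 = rng.normal(size=(1, 5))                 # output layer weights (incl. bias column)

eta, delta = 0.5, 1e-4                       # learning rate and finite-difference step
for epoch in range(2000):
    for W in (W1, W2):
        grad = np.zeros_like(W)
        for idx in np.ndindex(W.shape):      # dE/dw_ji by central differences
            w0 = W[idx]
            W[idx] = w0 + delta; e_plus = error(X, T, W1, W2)
            W[idx] = w0 - delta; e_minus = error(X, T, W1, W2)
            W[idx] = w0
            grad[idx] = (e_plus - e_minus) / (2 * delta)
        W -= eta * grad                      # w_ji <- w_ji - eta * dE/dw_ji

print(error(X, T, W1, W2))                   # total squared error (should have dropped substantially)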
Kohonen SOM for
text retrieval on
WWW newsgroups
WEBSOM node u21 (clicking the arrows moves to neighboring nodes on the map). Postings clustered at this node include:
Re: Fuzzy Neural Net References Needed Derek Long , 27 Oct 1995, Lines: 24.
Distributed Neural Processing Jon Mark Twomey, 28 Oct 1995, Lines: 12.
Re: neural-fuzzy TiedNBound, 11 Dec 1995, Lines: 10.
New neural net C library available Simon Levy, 2 Feb 1996, Lines: 15.
Re: New neural net C library available Michael Glover, Sun, 04 Feb 1996, Lines: 25.
From Guido De Boeck, SOMs for Data Mining, to be published (Springer-Verlag).
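For reference, a minimal sketch of the Kohonen SOM update that underlies WEBSOM-style document maps (Python/NumPy; the grid size, learning-rate and neighborhood schedules, and the random stand-in data are illustrative assumptions, not details from the slide):

import numpy as np

rng = np.random.default_rng(0)
grid_w, grid_h, dim = 10, 10, 16                  # 10x10 map of 16-dimensional prototypes
weights = rng.random((grid_w * grid_h, dim))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], float)

data = rng.random((200, dim))                     # stand-in for document feature vectors

n_steps = 2000
for t in range(n_steps):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))     # best-matching unit
    lr = 0.5 * (1.0 - t / n_steps)                          # decaying learning rate
    radius = 1.0 + 4.0 * (1.0 - t / n_steps)                # shrinking neighborhood
    d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)        # grid distance to the BMU
    h = np.exp(-d2 / (2.0 * radius ** 2))                   # neighborhood function
    weights += lr * h[:, None] * (x - weights)              # pull BMU and neighbors toward x

# Each document then maps to the node with the nearest prototype,
# and neighboring nodes end up holding similar documents.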