Artificial Neural Networks


SELF-ORGANISING NETWORKS/MAPS (SOM)
AND NEURAL NETWORK APPLICATIONS
Outcomes
 Look at the theory of self-organisation
 Look at other self-organising networks
 Look at examples of neural network applications
Four requirements for SOM
The weights in a neuron must represent a class of pattern
 one neuron, one class
Four requirements for SOM
The input pattern is presented to all neurons and each produces an output.
 Output: a measure of the match between the input pattern and the pattern stored by the neuron.
Four requirements for SOM
A competitive learning strategy selects the neuron with the largest response.
Four requirements for SOM
A method of reinforcing the largest response.
Architecture
 The Kohonen network (named after Teuvo
Kohonen from Finland) is a self-organising
network
 Neurons are usually arranged on a 2-dimensional grid
 Inputs are sent to all neurons
 There are no connections between neurons
Architecture
[Figure: input X connected to a 2-D grid of neurons in the Kohonen network]
Theory
 The output of neuron j is a weighted sum:
 net_j = Σ_i w_ji x_i (summed over all inputs i)
 where x_i are the inputs, w_ji are the weights and net_j is the output of the neuron (a short sketch of this follows below)
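A minimal sketch of this weighted sum in code; NumPy and the example sizes are illustrative assumptions, not taken from the slides:

import numpy as np

# Hypothetical example: 3 inputs and 4 neurons, one weight row per neuron.
x = np.array([0.6, 0.6, 0.6])          # input pattern
W = np.array([[0.5, 0.3, 0.8],
              [0.2, 0.9, 0.1],
              [0.7, 0.4, 0.4],
              [0.1, 0.6, 0.3]])        # weights

net = W @ x                            # net_j = sum_i w_ji * x_i for every neuron j
print(net)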
Four requirements in Kohonen networks
 Weights represent a class of pattern: true
 Measure of the match: Euclidean distance and weighted sum
 Competitive selection: winner takes all
 Reinforcement: the Kohonen learning rule
Output value
 The output of each neuron is the weighted
sum
 There is no threshold or bias
 Input values and weights are normalised
“Winner takes all”
 Initially the weights in each neuron are
random
 Input values are sent to all the neurons
 The outputs of each neuron are compared
 The “winner” is the neuron with the largest
output value
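The winner-takes-all step just described can be sketched as follows; the network size and the normalisation code are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)
W = rng.random((4, 3))                            # random initial weights, one row per neuron
W /= np.linalg.norm(W, axis=1, keepdims=True)     # normalise each neuron's weight vector

x = np.array([0.6, 0.6, 0.6])
x /= np.linalg.norm(x)                            # normalise the input values

outputs = W @ x                                   # weighted sum for every neuron
winner = int(np.argmax(outputs))                  # the "winner" has the largest output
print("winning neuron:", winner)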
Training
 Having found the winner, the weights of the
winning neuron are adjusted
 Weights of neurons in a surrounding
neighbourhood are also adjusted
Neighbourhood
[Figure: Kohonen network grid with input X, showing the neighbourhood around the winning neuron]
Training
 As training progresses the neighbourhood
gets smaller
 Weights are adjusted according to the following formula:
 w_new = w_old + alpha * (input - w_old)
Weight adjustment
 The learning coefficient (alpha) starts with a
value of 1 and gradually reduces to 0
 This has the effect of making big changes to the
weights initially, but no changes at the end
 The weights are adjusted so that they more
closely resemble the input patterns
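A sketch of that update applied to the winner (and, in a full implementation, its neighbours); the linear decay of alpha from 1 to 0 is an assumed schedule for illustration:

import numpy as np

def adjust(W, x, winner, neighbours, alpha):
    # Move the weights of the winner and its neighbours towards the input pattern.
    for j in [winner, *neighbours]:
        W[j] += alpha * (x - W[j])
    return W

rng = np.random.default_rng(1)
W = rng.random((4, 3))                 # 4 neurons with 3 weights each (hypothetical)
x = np.array([0.6, 0.6, 0.6])

epochs = 100
for epoch in range(epochs):
    alpha = 1.0 - epoch / epochs       # big changes to the weights initially, none at the end
    winner = int(np.argmax(W @ x))     # winner takes all
    W = adjust(W, x, winner, neighbours=[], alpha=alpha)   # neighbourhood handling omitted here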
Example
 A Kohonen network receives the input
pattern 0.6 0.6 0.6.
 Two neurons in the network have
weights 0.5 0.3 0.8 and -0.6 -0.5 0.6.
 Which neuron will have its weights
adjusted and what will the new values
of the weights be if the learning
coefficient is 0.4?
Answer
The weighted sums are 0.96 and -0.3, so the first neuron wins.
The weights become:
w1 = 0.5 + 0.4 * (0.6 - 0.5)
w1 = 0.5 + 0.4 * 0.1 = 0.5 + 0.04 = 0.54
w2 = 0.3 + 0.4 * (0.6 - 0.3)
w2 = 0.3 + 0.4 * 0.3 = 0.3 + 0.12 = 0.42
w3 = 0.8 + 0.4 * (0.6 - 0.8)
w3 = 0.8 - 0.4 * 0.2 = 0.8 - 0.08 = 0.72
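The same arithmetic can be checked with a few lines of code, using the values from the example above:

import numpy as np

x  = np.array([0.6, 0.6, 0.6])
w1 = np.array([0.5, 0.3, 0.8])
w2 = np.array([-0.6, -0.5, 0.6])

print(w1 @ x, w2 @ x)            # 0.96 and -0.3, so the first neuron wins

alpha = 0.4
print(w1 + alpha * (x - w1))     # [0.54 0.42 0.72]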
Summary
 The Kohonen network is self-organising
 It uses unsupervised training
 All the neurons are connected to the input
 A winner-takes-all mechanism determines which neuron gets its weights adjusted
 Neurons in a neighbourhood also get adjusted
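Putting the pieces together, a minimal Kohonen training loop might look like the sketch below; the grid size, the random training data and the decay schedules for alpha and the neighbourhood radius are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(2)
grid, dim = 5, 3                          # 5 x 5 grid of neurons, 3-element inputs (assumed)
W = rng.random((grid, grid, dim))         # one weight vector per neuron
data = rng.random((200, dim))             # illustrative training patterns

epochs = 50
for epoch in range(epochs):
    alpha = 1.0 - epoch / epochs                           # learning coefficient: 1 -> 0
    radius = round((grid // 2) * (1 - epoch / epochs))     # neighbourhood shrinks as training progresses
    for x in data:
        # Winner: the neuron whose weights best match the input (Euclidean distance).
        dists = np.linalg.norm(W - x, axis=2)
        wi, wj = np.unravel_index(np.argmin(dists), dists.shape)
        # Adjust the winner and every neuron inside the current neighbourhood.
        for i in range(max(0, wi - radius), min(grid, wi + radius + 1)):
            for j in range(max(0, wj - radius), min(grid, wj + radius + 1)):
                W[i, j] += alpha * (x - W[i, j])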
Demonstration
 Demonstrations of Kohonen network learning have been taken from the following websites:
 http://www.patol.com/java/TSP/index.html
 http://www.samhill.co.uk/kohonen/index.htm
Applications of Neural Networks
ARTIFICIAL INTELLIGENCE
TECHNIQUES
Example Applications
 Analysis of data
 Classification of EEG signals
 Pattern recognition in ECG
 Disease detection from EMG
Gueli N et al (2005) The influence of lifestyle on cardiovascular risk factors analysis using a neural network. Archives of Gerontology and Geriatrics 40, 157-172
 To produce a model of risk factors in heart disease.
 MLP used
 The accuracy was relatively good for cholesterolaemia and triglyceridaemia:
 Training phase around 99%
 Testing phase around 93%
 Not so good for HDL
Subasi A (in press) Automatic recognition of alertness level from
EEG by using neural network and wavelet coefficients Expert Systems
with Applications xx (2004) 1–11
 Electroencephalography (EEG)
 Recordings of electrical activity from the brain.
 Classifying the level of alertness:
 Awake
 Drowsy
 Sleep
 MLP
 15-23-3
 Hidden layer – log-tanh function
 Output layer – log-sigmoid function
 Inputs are normalised to be within the range 0 to 1.
 Accuracy
 95% +/- 3% alert
 93% +/- 4% drowsy
 92% +/- 5% sleep
 Features were extracted from the wavelet coefficients and form the input to the network.
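As a rough illustration only, not the authors' implementation, a 15-23-3 feed-forward pass with a tanh-style hidden layer and log-sigmoid outputs might be sketched as:

import numpy as np

rng = np.random.default_rng(3)
W1, b1 = 0.1 * rng.standard_normal((23, 15)), np.zeros(23)   # input -> hidden (15 -> 23)
W2, b2 = 0.1 * rng.standard_normal((3, 23)), np.zeros(3)     # hidden -> output (23 -> 3)

def forward(x):
    h = np.tanh(W1 @ x + b1)                       # hidden layer, tanh-style activation (assumed)
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))    # log-sigmoid outputs: alert, drowsy, sleep

x = rng.random(15)                                 # wavelet features normalised to [0, 1]
print(forward(x))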
Karsten Sternickel (2002) Automatic pattern recognition in
ECG time series Computer Methods and Programs in
Biomedicine 68 109–115
 ECG – electrocardiography – electrical signals from the heart.
 Wavelets again.
 Classification of patterns
 Patterns were spotted
Abel et al (1996) Neural network analysis of the EMG interference pattern. Med. Eng. Phys. 18(1), 12-17
 EMG – Electromyography – muscle activity.
 Interference patterns are signals produced from various parts of a muscle; features are hard to see.
 Applied neural network to EMG interference
patterns.
 Classifying
 Nerve disease
 Muscle disease
 Controls
 Applied various different ways of presenting
the pattern to the ANN.
 Good for less severe cases; severe cases can often be seen by the clinician.
Example Applications
 Wave prediction
 Controlling a vehicle
 Condition monitoring
Wave prediction
 Rao S, Mandal S (2005) Hindcasting of storm waves using neural networks. Ocean Engineering 32, 667-684
 MLP used to predict storm waves.
 2:2:2 network
 Good correlation between ANN model and
another model
van de Ven P, Flanagan C, Toal D (in press)
Neural network control of underwater vehicles
Engineering Applications of Artificial
Intelligence
 Semi-autonomous vehicle
 Control using ANN
 ANN replaces a mathematical model of the
system.
Silva et al (2000) The adaptability of a tool wear monitoring system under changing cutting conditions. Mechanical Systems and Signal Processing 14(2), 287-298
 Modelling tool wear
 Combines ANN with other AI (Expert
systems)
 Self-Organising Maps (SOM) and ART2 investigated
 SOM better for extracting the required
information.
Examples to try yourself
 A.1 Number recognition (ONR)
 http://www.generation5.org/jdk/demos.asp#neuralNetworks
 Details: http://www.generation5.org/content/2004/simple_ocr.asp
 B.1 Kohonen Self Organising Example 1
 http://www.generation5.org/jdk/demos.asp#neuralNetworks
 B.2 Kohonen 3D travelling salesman problem
 http://fbim.fhregensburg.de/~saj39122/jfroehl/diplom/eindex.html