13_Artificial_Neural_Networks


Artificial Neural Networks

Neural networks to the rescue
- Neural network: an information processing paradigm inspired by biological nervous systems, such as the brain.
- Structure: a large number of highly interconnected processing elements (neurons) working together.
- Like people, they learn from experience (by example).

Neural networks to the rescue
- Neural networks are configured for a specific application, such as pattern recognition or data classification, through a learning process.
- In a biological system, learning involves adjustments to the synaptic connections between neurons.
- The same holds for artificial neural networks (ANNs).

Where can neural network systems help?
- When we cannot formulate an algorithmic solution.
- When we can get lots of examples of the behaviour we require ('learning from experience').
- When we need to pick out the structure from existing data.

Inspiration from Neurobiology
- A neuron is a many-inputs / one-output unit.
- The output can be excited or not excited.
- Incoming signals from other neurons determine whether the neuron will excite ("fire").
- The output is subject to attenuation in the synapses, which are the junction parts of the neuron.
Mathematical representation
The neuron calculates a weighted sum of its inputs and compares it to a threshold: if the sum is higher than the threshold, the output is set to 1, otherwise to -1. This threshold step is the source of the model's non-linearity.
The McCulloch-Pitts Model of Neuron

Figure: Symbolic illustration of a linear threshold gate.

The McCulloch-Pitts model of a neuron is simple yet has substantial computing potential, and it has a precise mathematical definition. However, the model is so simplistic that it only generates a binary output, and its weight and threshold values are fixed. Neural computing requires diverse features for various applications, so we need neural models with more flexible computational features.
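As a concrete illustration of the linear threshold gate described above, here is a minimal Python sketch (not from the lecture) of a McCulloch-Pitts style unit with fixed, hand-chosen weights and threshold; the AND example at the end is an illustrative assumption.

```python
# A McCulloch-Pitts style linear threshold unit with fixed, hand-chosen
# weights and threshold, using the +1/-1 output convention from the slides.

def threshold_unit(inputs, weights, threshold):
    """Return +1 if the weighted sum of the inputs exceeds the threshold, else -1."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else -1

# Hand-chosen weights/threshold that make the unit act as a logical AND
# on +1/-1 inputs (an illustrative choice, not a value from the lecture).
print(threshold_unit([1, 1], weights=[1, 1], threshold=1))   # +1
print(threshold_unit([1, -1], weights=[1, 1], threshold=1))  # -1
```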
A simple perceptron
- It is a single-unit network.
- Perceptron learning rule: change each weight by an amount proportional to the difference between the desired output and the actual output,

  Delta W_i = eta * (D - Y) * I_i

  where eta is the learning rate, D the desired output, Y the actual output, and I_i the i-th input (a code sketch of the rule follows below).
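A minimal sketch of the perceptron learning rule above in Python; the function name and the default learning rate eta=0.1 are illustrative choices, not part of the lecture.

```python
# The perceptron weight update Delta W_i = eta * (D - Y) * I_i applied to
# every weight of a single unit.

def perceptron_update(weights, inputs, desired, actual, eta=0.1):
    """Return the updated weight list after one application of the rule."""
    return [w + eta * (desired - actual) * x for w, x in zip(weights, inputs)]

# Example: desired output 1, actual output -1, so each weight moves in the
# direction of its input.
print(perceptron_update([0.5, -0.2], inputs=[1, -1], desired=1, actual=-1))
```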
Example: A simple single-unit adaptive network
- The network has 2 inputs and one output; all are binary.
- The output is
  1 if W0*I0 + W1*I1 + Wb > 0
  0 if W0*I0 + W1*I1 + Wb <= 0
- We want it to learn simple OR: output 1 if either I0 or I1 is 1 (a training sketch follows below).
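A hedged sketch of this OR example: a single unit with weights W0, W1 and bias Wb trained with the perceptron rule on the binary OR truth table. The initial weights, the learning rate, and the epoch count are illustrative assumptions.

```python
# Single-unit network from the slide: output 1 if W0*I0 + W1*I1 + Wb > 0, else 0.
# Trained with the perceptron rule Delta W_i = eta * (D - Y) * I_i (bias input = 1).

def predict(w0, w1, wb, i0, i1):
    return 1 if w0 * i0 + w1 * i1 + wb > 0 else 0

def train_or(epochs=20, eta=0.2):
    w0 = w1 = wb = 0.0
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR truth table
    for _ in range(epochs):
        for (i0, i1), d in data:
            y = predict(w0, w1, wb, i0, i1)
            w0 += eta * (d - y) * i0
            w1 += eta * (d - y) * i1
            wb += eta * (d - y) * 1
    return w0, w1, wb

w0, w1, wb = train_or()
print([predict(w0, w1, wb, i0, i1)
       for i0, i1 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # expect [0, 1, 1, 1]
```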
A simplest network

[Figure: the simplest two-layer network. Inputs x1 and x2 feed Neuron 1 and Neuron 2, whose outputs feed Neuron 3.]
Solving the XOR problem using the simplest network

x1 ⊕ x2 = (x1 ∧ ¬x2) ∨ (¬x1 ∧ x2) = f1(x1, x2) ∨ f2(x1, x2)
[Figure: two-layer network for XOR. Inputs x1 and x2 feed N1 and N2; their outputs feed N3. Connection weights (bias first): 1, -3, 3 into N1; 3, 3, -1 into N2; -1, 3, 3 into N3.]
Solving the XOR problem using the simplest network

Each neuron applies the sign activation, output = sign(z), to its augmented input (1, x1, x2):
Neuron 1: W = (1, -3, 3)
Neuron 2: W = (3, 3, -1)
Neuron 3: W = (-1, 3, 3)   (its inputs are the outputs of Neurons 1 and 2)

 #   x1   x2 | Neuron 1: z, output | Neuron 2: z, output | Neuron 3: z, output | XOR
 1)   1    1 |      1         1    |      5         1    |      5         1    |  1
 2)   1   -1 |     -5        -1    |      7         1    |     -1        -1    | -1
 3)  -1    1 |      7         1    |     -1        -1    |     -1        -1    | -1
 4)  -1   -1 |      1         1    |      1         1    |      5         1    |  1
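The table can be checked with a short script. This sketch assumes sign activation on augmented inputs (1, x1, x2) and uses the weight vectors listed above (signs reconstructed from the figure and table values); it prints the same z and output columns row by row.

```python
# Reproduce the z and output columns of the XOR table with the two-layer
# network: N1 and N2 read (x1, x2), N3 reads their outputs.

def sign(z):
    return 1 if z >= 0 else -1

def neuron(weights, a, b):
    bias, w1, w2 = weights
    z = bias + w1 * a + w2 * b
    return z, sign(z)

W1, W2, W3 = (1, -3, 3), (3, 3, -1), (-1, 3, 3)

for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    z1, o1 = neuron(W1, x1, x2)   # hidden neuron 1
    z2, o2 = neuron(W2, x1, x2)   # hidden neuron 2
    z3, o3 = neuron(W3, o1, o2)   # output neuron 3
    print(x1, x2, z1, o1, z2, o2, z3, o3)
```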
Learning
- From experience: examples / training data.
- The strength of the connection between neurons is stored as a weight value for that specific connection.
- Learning the solution to a problem = changing the connection weights.

Operation mode
- Weights are fixed (unless online learning is used).
- Network simulation = input signals flow through the network to the outputs.
- The output is often a binary decision.
- Inherently parallel.
- Simple operations and thresholding allow fast decisions and real-time response.

Single-Layer Network
- The standard way of connecting neurons into a network is by layers.
- Input layer: these neurons only pass and distribute the inputs and perform no computation. Thus, the only true layer of neurons is the one on the right, the output layer.
- Each of the inputs x1, x2, ..., xN is connected to every artificial neuron in the output layer through a connection weight.
Single-Layer Network
- Since every output value y1, y2, ..., yN is calculated from the same set of input values, the outputs differ only because of their connection weights.
- Although the presented network is fully connected, a true biological neural network may not have all possible connections; a weight value of zero can be used to represent "no connection" (illustrated in the sketch below).
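A minimal sketch (not from the lecture) of how such a single-layer network operates: each output neuron thresholds a weighted sum of all inputs, and a zero weight stands in for "no connection". The weight matrix is an illustrative placeholder.

```python
# Single-layer forward pass: W[j][i] is the weight from input i to output
# neuron j; a weight of 0.0 plays the role of "no connection".

def sign(z):
    return 1 if z >= 0 else -1

def single_layer(x, W):
    return [sign(sum(w_ji * x_i for w_ji, x_i in zip(row, x))) for row in W]

x = [1, -1, 1]
W = [[0.5, -1.0, 0.0],   # neuron 1 is not connected to input 3
     [0.0,  2.0, 1.0]]   # neuron 2 is not connected to input 1
print(single_layer(x, W))  # one +1/-1 decision per output neuron
```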
Multilayer Network
- To achieve a higher level of computational capability, a more complex neural network structure is required.
- The figure shows a multilayer neural network, which distinguishes itself from the single-layer network by having one or more hidden layers.
- In this multilayer structure, the input nodes pass the information to the units in the first hidden layer, the outputs of the first hidden layer are passed to the next layer, and so on.
Multilayer Network
- A multilayer network can also be viewed as a cascade of single-layer networks.
- The level of computational complexity can be seen from the fact that many single-layer networks are combined into this multilayer network (see the forward-pass sketch below).
- The designer of an artificial neural network must decide how many hidden layers are required, depending on the complexity of the desired computation.
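Viewing the multilayer network as a cascade of single-layer networks leads directly to the forward pass sketched here; the layer sizes and weights are illustrative assumptions, not values from the lecture.

```python
# Multilayer (feed-forward) network as a cascade of single-layer networks:
# the output of each layer becomes the input of the next.

def sign(z):
    return 1 if z >= 0 else -1

def layer(x, W):
    return [sign(sum(w * xi for w, xi in zip(row, x))) for row in W]

def multilayer(x, layers):
    for W in layers:   # pass activations through the layers in order
        x = layer(x, W)
    return x

hidden = [[1.0, -1.0], [0.5, 0.5]]   # 2 inputs -> 2 hidden neurons
output = [[1.0, 1.0]]                # 2 hidden neurons -> 1 output neuron
print(multilayer([1, -1], [hidden, output]))
```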
Multilayer Network - Inter-layer connections

Fully connected
- Each neuron on the first layer is connected to every neuron on the second layer.

Partially connected
- A neuron of the first layer does not have to be connected to all neurons on the second layer.

Feed forward
- The neurons on the first layer send their output to the neurons on the second layer, but they do not receive any input back from the neurons on the second layer.

Bi-directional
- There is another set of connections carrying the output of the neurons of the second layer back into the neurons of the first layer.
- Feed-forward and bi-directional connections can be fully or partially connected.

Hierarchical
- If a neural network has a hierarchical structure, the neurons of a lower layer may only communicate with neurons on the next layer.

Resonance
- The layers have bi-directional connections, and they can continue sending messages across the connections a number of times until a certain condition is reached.
Evolving networks
- Learning is a continuous process of: evaluate the output, adapt the weights, take new inputs.
- As the ANN evolves, the weights reach a stable state, but the neurons keep working: the network has 'learned' to deal with the problem.
Learning performance
- Network architecture
- Learning method:
  - Supervised learning: there is a teacher, telling you where to go.
  - Unsupervised learning: no teacher, the net learns by itself.
  - Reinforcement learning: there is a critic, indicating wrong or correct.
- The type of learning used depends on the task at hand.
Where are NNs used?
- Recognizing and matching complicated, vague, or incomplete patterns.
- When the data is unreliable.
- Problems with noisy data.
- Typical task types: prediction, classification, data association, data conceptualization, filtering, planning.
Applications
- Prediction: learning from past experience
  - pick the best stocks in the market
  - predict the weather
  - identify people at risk of cancer
- Classification
  - image processing
  - predict bankruptcy for credit card companies
  - risk assessment

Applications
- Recognition
  - pattern recognition: SNOOPE (bomb detector in U.S. airports)
  - character recognition
  - handwriting: processing checks
- Data association
  - not only identify the characters that were scanned, but also identify when the scanner is not working properly
Strengths of a Neural Network
- Power: can model complex functions; non-linearity is built into the network.
- Ease of use:
  - learns by example
  - very little user domain-specific expertise is needed
- Intuitively appealing: based on a model of biology; will it lead to genuinely intelligent computers/robots?

Neural networks cannot do anything that cannot be done using traditional computing techniques, BUT they can do some things which would otherwise be very difficult.
General Advantages

Advantages
- Adapt to unknown situations
- Robustness: fault tolerance due to network redundancy
- Autonomous learning and generalization

Disadvantages
- Not exact
- Large complexity of the network structure

For motion planning?