Neural Networks - Flathead Valley Community College


Neural Networks
John Riebe and Adam Profitt
What is a neuron?
PR
W
∑
b
n
ƒ
a
Elements of the input vector
Weights
Summer
Bias
Sum of all P elements and b
Translation Function
Output
Weights: Weights are scalars that multiply each input element.
Summer: The summer adds the weighted input elements together with the bias.
Bias: A bias is a number that is added to the total from the summer.
Transfer Function: A transfer function maps the summer's total, n, to the neuron's output, a. It is one of several standard functions used in neural networking.
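The pieces above combine as a = f(Wp + b). The slides use MATLAB; the following is just an illustrative Python sketch of one neuron, with a hard-limit transfer function chosen arbitrarily:

```python
def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def neuron(p, w, b, f):
    """Single neuron: weight each input, sum with the bias, apply f."""
    n = sum(wi * pi for wi, pi in zip(w, p)) + b  # summer output n
    return f(n)                                   # a = f(Wp + b)

a = neuron(p=[1.0, 2.0], w=[0.5, -1.0], b=0.2, f=hardlim)  # n = -1.3, a = 0
```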
Layers of the Neural Network
There are only three different types of layers
in a network:
•The Input Layer
•Moves the input vectors into each
neuron of the first hidden layer
•The Hidden Layers
•Performs the bulk of the computations
in most networks
•Hidden layers are not always required
•The Output Layer
•Each neuron in the output layer
outputs its own result
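The layer structure above can be sketched as a forward pass, where each layer is a list of neurons feeding the next. This is a Python illustration only; the weights, biases, and layer sizes are made up:

```python
import math

def logsig(n):
    """Log-sigmoid transfer function."""
    return 1.0 / (1.0 + math.exp(-n))

def layer_output(p, weights, biases, f):
    """One layer: each neuron weights all inputs, adds its bias, applies f."""
    return [f(sum(w * x for w, x in zip(ws, p)) + b)
            for ws, b in zip(weights, biases)]

p = [0.5, -1.0]                                    # the input layer passes p along
hidden = layer_output(p, [[1.0, 0.5], [-0.5, 1.0]], [0.0, 0.1], logsig)
output = layer_output(hidden, [[1.0, -1.0]], [0.0], lambda n: n)  # linear output layer
```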
Transfer Functions
Types of Neural Networks
Perceptrons:
1. Used to classify data.
2. Applies the hard-limit transfer function.
3. Usually does not have any hidden layers.
Linear Filters:
1. Used to solve linearly separable problems.
2. Applies the linear transfer function.
Backpropagation
1. Generally has only one hidden layer.
2. Can solve any reasonable problem.
3. Hidden layers use sigmoid transfer functions; outputs use the linear transfer function.
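As a concrete instance of the perceptron described above, a single hard-limit neuron can classify linearly separable points. This sketch uses hand-picked (not trained) weights, chosen purely for illustration:

```python
def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def perceptron(p, w, b):
    """Perceptron: hard-limit applied to the weighted, biased sum."""
    return hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)

# Classify points as on-or-above (1) or below (0) the line x + y = 1.
w, b = [1.0, 1.0], -1.0
print(perceptron([2.0, 2.0], w, b))  # prints 1 (above the line)
print(perceptron([0.0, 0.0], w, b))  # prints 0 (below the line)
```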
Training Neurons
Training a network sets the biases and weights in each neuron
To train a network you need:
•A network
•An input
•A target vector
There are many different types of
training algorithms. To name a few:
•Levenberg-Marquardt
•BFGS quasi-Newton
•Bayesian regularization
•One step secant
•Random order incremental
A training algorithm:
1. Gives the network an input
2. Receives the output
3. Calculates the error between the output and the target
4. Adjusts the weights and biases
5. Goes back to step 1
Each pass the algorithm makes through these steps is called an epoch.
Most networks go through many epochs.
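The five steps above can be sketched as a simple incremental loop. Toolbox algorithms such as Levenberg-Marquardt are far more sophisticated; this sketch uses the basic perceptron learning rule only to show the epoch structure, and all names here are illustrative:

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def train_perceptron(inputs, targets, epochs=20):
    """Train a single hard-limit neuron with the perceptron learning rule."""
    w, b = [0.0] * len(inputs[0]), 0.0
    for _ in range(epochs):                        # each pass is one epoch
        for p, t in zip(inputs, targets):
            a = hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)  # steps 1-2
            e = t - a                              # step 3: error vs. target
            w = [wi + e * pi for wi, pi in zip(w, p)]              # step 4: weights
            b += e                                 # step 4: bias
    return w, b

# Learn logical AND, which is linearly separable.
w, b = train_perceptron([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 0, 0, 1])
```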
MATLAB
Application
The newff Function
Create a feed-forward network
Syntax
net = newff
net = newff(PR,[S1 S2...Si],{TF1 TF2...TFi})
Description
net = newff creates a new network with a dialog box.
newff(PR,[S1 S2...Si],{TF1 TF2...TFi}) takes,
PR - R x 2 matrix of min and max values for R input elements.
Si - Size of ith layer, for Nl layers.
TFi - Transfer function of ith layer, default = 'tansig'.
The train Function
Trains a neural network
Syntax
net = train(net,P,T)
Description
train trains a network.
train(net,P,T) takes,
net - Neural network object.
P - Network inputs.
T - Network targets, default = zeros.
The sim Function
The sim function simulates a neural network:
it feeds the network the input, P, and returns the resulting output.
Syntax
sim(net,P)
Description
sim simulates neural networks.
sim(net,P) takes,
net - Network.
P - Network inputs.
Transfer Functions Revisited
Transfer functions:
•Hard-Limit: a = hardlim(n) - outputs either a 1 or a 0
•Linear: a = purelin(n) - outputs the scaled and summed input
•Log-Sigmoid: a = logsig(n) - squeezes the input to between 0 and 1
•Tan-Sigmoid: a = tansig(n) - squeezes the input to between -1 and 1
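The four transfer functions can be written out directly. This Python sketch mirrors the MATLAB names; note that MATLAB's tansig formula, 2/(1+e^(-2n)) - 1, is algebraically identical to tanh:

```python
import math

def hardlim(n):
    return 1 if n >= 0 else 0          # outputs either a 1 or a 0

def purelin(n):
    return n                           # passes the summed input through unchanged

def logsig(n):
    return 1.0 / (1.0 + math.exp(-n))  # squeezes the input to between 0 and 1

def tansig(n):
    return math.tanh(n)                # squeezes the input to between -1 and 1
```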
The Baum-Haussler Rule
The Baum-Haussler Rule is one of the most useful rules of thumb for neural networks.
Nhidden ≤ (Ntrain • Etolerance) / (Npts + Noutputs)
This rule gives an upper bound on the number of hidden neurons you should need
for your network to function properly.
This is NOT a law: it will not work in all situations.
Sometimes you just have to use another method.
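As a quick worked example of the rule above (the counts plugged in here are made up for illustration):

```python
def baum_haussler_max_hidden(n_train, e_tolerance, n_pts, n_outputs):
    """Upper bound on hidden neurons: (Ntrain * Etolerance) / (Npts + Noutputs)."""
    return (n_train * e_tolerance) / (n_pts + n_outputs)

# e.g. 1000 training cases, 10% error tolerance, 4 points, 1 output:
bound = baum_haussler_max_hidden(1000, 0.1, 4, 1)  # 100 / 5 = 20.0
```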
Bibliography
Demuth, Howard, and Mark Beale. Neural Network Toolbox User's Guide. The MathWorks, 1992-2003.
URL: http://www.mathworks.com/access/helpdesk/help/toolbox/nnet/nnet.shtml