Group 10
Application of Neural Networks in Stock Market Prediction
Backpropagation
Backpropagation is the basis for training a neural network under supervision.
At the core of every backpropagation method is the calculation of the sensitivity of a cost function with respect to the internal states and weights of the network. This involves a backward pass that carries the error to each internal node of the network, where it is used to compute the weight gradients for that node. Learning proceeds by alternately propagating the activations forward and the instantaneous errors backward.
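
To make this forward/backward alternation concrete, here is a minimal sketch of one training loop, assuming the smallest possible network: one input, one sigmoid hidden neuron, one sigmoid output neuron, and a squared-error cost. All names and values are illustrative, not taken from the slides.

#include <cmath>
#include <cstdio>

// Minimal backpropagation sketch: 1 input -> 1 hidden -> 1 output,
// sigmoid activations, squared-error cost. Values are illustrative.
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    double w1 = 0.5, w2 = -0.3;    // weights: input->hidden, hidden->output
    double x = 1.0, target = 0.8;  // one training example
    double rate = 0.1;             // learning rate

    for (int epoch = 0; epoch < 1000; ++epoch) {
        // Forward pass: propagate the activations.
        double h = sigmoid(w1 * x);
        double y = sigmoid(w2 * h);

        // Backward pass: propagate the instantaneous error, i.e. the
        // sensitivity of the cost to each neuron's net input.
        double cost = 0.5 * (target - y) * (target - y);
        double delta_out = (y - target) * y * (1.0 - y);   // at the output node
        double delta_hid = delta_out * w2 * h * (1.0 - h); // at the hidden node

        // Weight gradients for each node, then a gradient-descent step.
        w2 -= rate * delta_out * h;
        w1 -= rate * delta_hid * x;

        if (epoch % 200 == 0) std::printf("epoch %d cost %f\n", epoch, cost);
    }
    return 0;
}

Each iteration is exactly the alternation described above: activations flow forward, the error flows backward, and every weight moves against its gradient.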
Example network inputs (from the project cited below):
ADX: average directional movement over the previous 18 days
S&P 500: current value of the S&P 500 index of 500 listed companies
S&P-5d: net change in the S&P 500 value from five days prior
Source: http://www-cse.stanford.edu/classes/sophomore-college/projects-00/neural-networks/Applications/stocks.html
Learning
When we work with yesterday's price, we know not only the price for "day - 1", but also the price we are trying to predict, called the DESIRED OUTPUT of the Neural Net. Comparing the two values lets us compute the Error:

dError = dDesiredOutput - dOutput;

Now we can adjust this particular neuron to work better with this particular input. For example, if dError is 10% of dOutput, we can increase all synaptic weights of the neuron by 10%, as the sketch below illustrates.
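
As a sketch of this proportional rule for a single linear neuron (all names and values below are illustrative, following the d-prefix naming of the snippet above):

#include <cstdio>
#include <vector>

// Illustrative neuron: a weighted sum of three inputs.
int main() {
    std::vector<double> weights = {0.4, 0.2, 0.1};
    std::vector<double> inputs  = {1.0, 2.0, 3.0};
    double dDesiredOutput = 1.2;   // yesterday's known "right answer"

    // Forward pass: the neuron's actual output.
    double dOutput = 0.0;
    for (size_t i = 0; i < weights.size(); ++i)
        dOutput += weights[i] * inputs[i];

    double dError = dDesiredOutput - dOutput;

    // Scale every synaptic weight by the relative error: if dError is
    // 10% of dOutput, every weight grows by 10% (assumes dOutput != 0).
    double dScale = 1.0 + dError / dOutput;
    for (size_t i = 0; i < weights.size(); ++i)
        weights[i] *= dScale;

    std::printf("output %f, error %f, corrected output %f\n",
                dOutput, dError, dOutput * dScale);
    return 0;
}

For a linear neuron this scaling reproduces the desired output exactly on that one input; practical training uses a small learning rate so that one example does not undo what was learned from the others.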
Learning (Contd.)
Once we have decided the adjustment for the neurons in the output layer, we can backpropagate the changes to the previous layers of the network. Indeed, as soon as we have desired outputs for the output layer, we can make an adjustment that reduces the error (the difference between the actual output and the desired output). This adjustment changes the weights on the input nodes of the neurons in the output layer.
But the input nodes of the last layer are the OUTPUT nodes of the previous layer! So we have both the actual output of the previous layer and its desired output (after correction), and we can adjust the previous layer of the net in the same way. And so on, until we reach the first layer, as the sketch below walks through.
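
A minimal sketch of this layer-by-layer pass, assuming a fully connected 2-3-1 network with sigmoid units and the usual gradient form of the correction; sizes, starting weights, and the training pair are illustrative:

#include <cmath>
#include <cstdio>
#include <vector>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    // weights[l][j][i]: weight from input i to neuron j of layer l.
    std::vector<std::vector<std::vector<double>>> weights = {
        {{0.1, 0.2}, {-0.1, 0.3}, {0.2, -0.2}},   // layer 0: 2 inputs -> 3 neurons
        {{0.3, -0.1, 0.2}}                        // layer 1: 3 inputs -> 1 neuron
    };
    std::vector<double> input   = {0.5, -0.2};
    std::vector<double> desired = {0.7};
    double rate = 0.5;

    for (int epoch = 0; epoch <= 2000; ++epoch) {
        // Forward pass: keep every layer's activations for the backward pass.
        std::vector<std::vector<double>> acts = {input};
        for (auto &layer : weights) {
            std::vector<double> out;
            for (auto &w : layer) {
                double sum = 0.0;
                for (size_t i = 0; i < w.size(); ++i) sum += w[i] * acts.back()[i];
                out.push_back(sigmoid(sum));
            }
            acts.push_back(out);
        }
        if (epoch % 500 == 0)
            std::printf("epoch %d output %f (target %f)\n",
                        epoch, acts.back()[0], desired[0]);

        // Error at the output layer: desired output minus actual output.
        std::vector<double> delta(acts.back().size());
        for (size_t j = 0; j < delta.size(); ++j) {
            double y = acts.back()[j];
            delta[j] = (desired[j] - y) * y * (1.0 - y);
        }

        // Walk backward: the inputs of each layer are the outputs of the
        // previous one, so each layer hands a corrected "desired output"
        // (a delta) back to its predecessor, until the first layer.
        for (int l = (int)weights.size() - 1; l >= 0; --l) {
            std::vector<double> prevDelta(acts[l].size(), 0.0);
            for (size_t j = 0; j < weights[l].size(); ++j)
                for (size_t i = 0; i < weights[l][j].size(); ++i) {
                    double a = acts[l][i];
                    prevDelta[i] += delta[j] * weights[l][j][i] * a * (1.0 - a);
                    weights[l][j][i] += rate * delta[j] * acts[l][i];
                }
            delta = prevDelta;
        }
    }
    return 0;
}

The inner backward loop is the slide's argument in code: because the activations of one layer are the inputs of the next, the delta computed for a layer becomes the correction handed back to the layer before it, all the way to the first layer.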