Transcript nn1-02
Introduction to Neural Networks
Jianfeng Feng
School of Cognitive and Computing Sciences
[email protected]
Spring 2001
Tutors

Name               Office      Email
Jianfeng Feng      COGS-5C19   jianfeng    (first half term)
Andy Philippides   CCNR        andrewop    (second half term)
Lectures -- 2 per week

Time          Day   Place
2:00 - 2:50   Thu   Arun 401
4:00 - 4:50   Fri   PEV1 1A7

Seminar -- 1 per week (from the first week) (Andy)
Please see the school notice board.
Lecture notes are available on my homepage
(I will update them before each lecture).
Today’s Topics:
Summary
A comparison of biological neuronal networks with artificial neural networks
A little bit of math
Single artificial neuron
Course Summary
After a short introduction to neurons, synapses, and the concept of learning (the biological and statistical foundations of neural networks), the course covers
• methods of supervised learning (the perceptron and linear separability, backprop, radial basis functions and the problem of generalization, support vector machines)
• methods of unsupervised learning (classification, PCA, Kohonen maps, vector quantization)
• computational neuroscience and robots
Course Topics
1. Introduction to neural networks
2. Formal neurons (2)
3. Learning (1)
4. Single layer perceptron
5. Multilayer perceptron (2)
6. Radial basis function networks (2)
7. Support vector machines
8. Unsupervised learning (2)
9. Preprocessing and PCA
11. Computational neuroscience (2)
(plus one revision lecture)
(2) = two lectures
(3) = three lectures
Why do we need NN?
Neural Networks Are For

Applications            Science
Character recognition   Neuroscience
Optimization            Statistics
Financial prediction    Physics, mathematics
Automatic driving       Computer science
..................      Psychology
History
McCulloch & Pitts (1943) -- neural networks and artificial intelligence were born
Hebb (1949) -- The Organization of Behaviour
Minsky (1954) -- neural networks (PhD thesis)
Rosenblatt (1960) -- perceptron
Minsky & Papert (1969) -- Perceptrons
Hopfield (1982) -- Hopfield networks
Kohonen (1982) -- self-organizing maps
Rumelhart, Hinton & Williams (1986) -- back-propagation
Broomhead & Lowe (1988) -- radial basis functions (RBF)
Linsker (1988) -- Infomax principle
Vapnik (1990) -- support vector machines
Spiking neural networks (Feng, COGS)
Today’s Topics:
Summary
A comparison of biological neuronal networks with artificial neural networks
What are biological neuronal networks?
(see the next lectures for more details)
• UNITS: nerve cells called neurons; they come in many different types, are extremely complex, and number around 10^11 in the brain
• INTERACTIONS: signals are conveyed by action potentials; interactions can be chemical (releasing or receiving ions) or electrical; each neuron makes contact with around 10^3 other neurons
• STRUCTURES: feedforward, feedback, self-activating and recurrent
A cartoon [figure: schematic drawing of a neuron; labelled part: soma]
The complexity of a neuronal system can be partly seen from a picture in a book on computational neuroscience that I am currently editing.
What are (artificial) neural networks?
A network of interacting units, built in an attempt to mimic the brain.
• UNITS: artificial neurons (linear or nonlinear input-output units); small numbers, typically a few hundred
• INTERACTIONS: simply weights, expressing how strongly one neuron affects the others
• STRUCTURES: can be feedforward, feedback or recurrent
It is still far too naive, and the development of the field relies on all of us.
Four-layer networks
[Figure: a four-layer feedforward network; inputs x1, x2, ..., xn (visual input) feed through two hidden layers to the output units (motor output).]
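To make the picture concrete, here is a minimal MATLAB sketch (an added illustration, not from the original slides; the layer sizes and random weights are made-up assumptions) of one forward pass through such a four-layer network:

  f  = @(a) 1 ./ (1 + exp(-a));         % sigmoidal activation, applied componentwise
  x  = rand(4, 1);                      % input vector x1..xn (here n = 4), e.g. visual input
  W1 = randn(5, 4);  b1 = randn(5, 1);  % input -> first hidden layer
  W2 = randn(5, 5);  b2 = randn(5, 1);  % first -> second hidden layer
  W3 = randn(2, 5);  b3 = randn(2, 1);  % second hidden layer -> output
  h1 = f(W1 * x  + b1);                 % first hidden layer activity
  h2 = f(W2 * h1 + b2);                 % second hidden layer activity
  y  = f(W3 * h2 + b3);                 % output, e.g. motor commands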
Reading list: any book on NN
1. Neural Networks, Haykin S. (1999), Prentice Hall International Inc. (nice, but too much)
2. Neural Networks for Pattern Recognition, Bishop C.M. (1995), Oxford: Clarendon Press (too much on statistics, less dynamical theory)
3. Introduction to the Theory of Neural Computation, Hertz J., Krogh A., and Palmer R.G. (1991) (nice, but somewhat out of date)
4. Theoretical Neuroscience, Dayan P., and Abbott L.F. (2001)
More references on specific topics will follow.
Assessment
Master's students: 3 assignments (50%) and an unseen exam (50%)
Undergraduates: 3 assignments (100%), no exam
The assignments will use MATLAB or any other language (your choice).
The seminar run by Andy will help you make sure the assignments are completed.
Don't panic, everything is going to be easy.
The aim is to understand the main ideas, to implement them in one of the languages (MATLAB, C, Java) you are most familiar with, and to be able to apply them to practical problems.
For those students who get bored, please come to see me or visit my homepage to get a rough idea of research on Neural Networks and Computational Neuroscience:
http://www.cogs.susx.ac.uk/users/jianfeng/
Office: 5C19, COGS
Today’s Topics:
Summary
A comparison of biological neuronal networks with artificial neural networks
A little bit of math
A little bit of math

An $n \times m$ matrix

$$A = \begin{pmatrix} a_{11} & \cdots & a_{1m} \\ \vdots & \ddots & \vdots \\ a_{n1} & \cdots & a_{nm} \end{pmatrix}$$

acting on a vector $X = (x_1, \ldots, x_m)$ gives

$$AX = \Big( \sum_{j=1}^{m} a_{1j} x_j, \; \ldots, \; \sum_{j=1}^{m} a_{nj} x_j \Big)$$
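As a quick illustration (the numbers are made up, not from the slides), the product AX in MATLAB:

  A = [1 2 3; 4 5 6];   % a 2-by-3 matrix (n = 2, m = 3)
  X = [1; 0; 2];        % a vector with m = 3 components
  Y = A * X;            % component i is sum_j a_ij * x_j; here Y = [7; 16]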
A little bit of math

$$Y = f(AX) = \Big( f\Big(\sum_{j=1}^{m} a_{1j} x_j\Big), \; \ldots, \; f\Big(\sum_{j=1}^{m} a_{nj} x_j\Big) \Big)$$

where f is a function, for example the sigmoidal function. An underlined variable denotes a vector.
A little bit of math

$$f(x) = \frac{1}{1 + \exp(-x)}$$

the sigmoidal function, which models the output firing rate of a neuron.
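A minimal MATLAB sketch (an added illustration with made-up values) of the sigmoid and of Y = f(AX), with f applied to each component of AX:

  f = @(x) 1 ./ (1 + exp(-x));   % sigmoidal function
  A = [1 2 3; 4 5 6];
  X = [1; 0; 2];
  Y = f(A * X);                  % every component of Y lies between 0 and 1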
A little bit of math

Random variables: $X$ (capital letter); random vector $\underline{X}$.

Discrete case: $P(X = a) = p$.

Continuous case: distribution density $p(x)$, with

$$P(X \le a) = \int_{-\infty}^{a} p(x)\, dx$$

mean: $EX = \int x\, p(x)\, dx$
variance: $\sigma^2(X) = E(X - EX)^2$
standard deviation: $\sigma(X)$
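These quantities can be estimated from samples; a minimal MATLAB sketch (an added illustration, assuming a standard normal distribution, so the true mean is 0 and the true variance is 1):

  x = randn(10000, 1);   % 10000 samples from a standard normal distribution
  m = mean(x);           % sample estimate of EX, close to 0
  v = var(x);            % sample estimate of sigma^2(X), close to 1
  s = std(x);            % sample estimate of sigma(X), close to 1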
More math
http://wwwslab.usc.edu/courses/CS599NCANN.html,
and the appendix of Duda, Hart and Stork's book.
The general artificial neuron model has five components, shown in the following list (the subscript i indicates the i-th input or weight); putting them together gives $y = f\big(\sum_i w_i x_i + u\big)$, as sketched after the list.
1. A set of inputs, x_i.
2. A set of weights, w_i.
3. A bias, u.
4. An activation function, f.
5. The neuron output, y.
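A minimal MATLAB sketch of this five-component model (the input, weight and bias values are made up for the example):

  x = [0.5; -1.0; 2.0];          % 1. inputs x_i
  w = [0.4;  0.3; -0.2];         % 2. weights w_i
  u = 0.1;                       % 3. bias
  f = @(a) 1 ./ (1 + exp(-a));   % 4. activation function (sigmoid)
  y = f(w' * x + u);             % 5. neuron output y = f(sum_i w_i x_i + u)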