Introduction to knowledge-based systems


Lecture 5
Knowledge-based systems
Sanaullah Manzoor
CS&IT, Lahore Leads University
[email protected]
https://sites.google.com/site/engrsanaullahmanzoor/home
Research Project
Submission Date: 10/04/2016
Group members: 3-4
Submission is in the form of a presentation and a report.
Topics
1. Feature Selection Methods
2. Supervised Learning
3. Unsupervised Learning
Description:
Take three recent research papers (2010 to 2016), understand the problem area and the authors' techniques, and identify any drawbacks of or improvements to those techniques. Write a plagiarism-free project report describing the selected papers' techniques, their drawbacks/improvements, and your own algorithm or method for that problem.
Overview
•Classification
•Artificial Neural Networks
•ANN classifier implementation
Classification
Artificial Neural Networks
Applications: Classification
Business
•Credit rating and risk assessment
•Insurance risk evaluation
•Fraud detection
•Insider dealing detection
•Marketing analysis
•Mailshot profiling
•Signature verification
•Inventory control
Engineering
•Machinery defect diagnosis
•Signal processing
•Character recognition
•Process supervision
•Process fault analysis
•Speech recognition
•Machine vision
•Radar signal classification
Security
•Face recognition
•Speaker verification
•Fingerprint analysis
Medicine
•General diagnosis
•Detection of heart defects
Science
•Recognising genes
•Botanical classification
•Bacteria identification
Applications: Modelling
Business
•Prediction of share and commodity prices
•Prediction of economic indicators
•Insider dealing detection
•Marketing analysis
•Mailshot profiling
•Signature verification
•Inventory control
Engineering
•Transducer linearisation
•Colour discrimination
•Robot control and navigation
•Process control
•Aircraft landing control
•Car active suspension control
•Printed circuit auto-routing
•Integrated circuit layout
•Image compression
Science
•Prediction of the performance of drugs from the molecular structure
•Weather prediction
•Sunspot prediction
Medicine
•Medical imaging and image processing
Applications: Forecasting
•Future sales
•Production Requirements
•Market Performance
•Economic Indicators
•Energy Requirements
•Time Based Variables
Applications: Novelty Detection
•Fault Monitoring
•Performance Monitoring
•Fraud Detection
•Detecting Rare Features
•Different Cases
Background
- Neural networks can be:
  - Biological models
  - Artificial models
- The desire is to produce artificial systems capable of sophisticated computations similar to the human brain.
Biological analogy
The brain is composed of a mass of interconnected neurons; each neuron is connected to many other neurons.
Neurons transmit signals to each other.
Whether a signal is sent depends on the strength of the bond (synapse) between two neurons.
How Does the Brain Work?
NEURON
- The cell that performs information processing in the brain.
- The fundamental functional unit of all nervous system tissue.
How Does the Brain Work?
Each neuron consists of a SOMA, DENDRITES, an AXON, and SYNAPSES.
Brain vs. Digital Computers
- Computers require hundreds of cycles to simulate the firing of a single neuron.
- The brain can fire all of its neurons in a single step.
Parallelism
- Serial computers require billions of cycles to perform some tasks that the brain completes in less than a second, e.g. face recognition.
Comparison of Brain and Computer

                          Human                  Computer
Processing elements       100 billion neurons    10 million gates
Interconnects             1,000 per neuron       A few
Cycles per second         1,000                  500 million
Time for 2x improvement   200,000 years          2 years
Definition of Neural Network
A Neural Network is a system composed of many simple processing elements operating in parallel which can acquire, store, and utilize experiential knowledge.
Neurons vs. Units
- Each element of an NN is a node called a unit.
- Units are connected by links.
- Each link has a numeric weight.
Computing Elements
A typical unit:
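The diagram of a typical unit is not reproduced in this transcript. As a rough sketch in Python (my own illustration; the sigmoid activation and the numbers are assumptions), a unit forms a weighted sum of its incoming links plus a bias and passes it through an activation function:

import math

def sigmoid(x):
    # a common choice of activation (squashing) function; assumed here
    return 1.0 / (1.0 + math.exp(-x))

def unit_output(inputs, weights, bias):
    # a typical unit: weighted sum over the incoming links plus the unit's bias,
    # passed through the activation function
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(net)

# a unit with three incoming links (illustrative weights and inputs)
print(unit_output([1.0, 0.0, 1.0], [0.2, -0.3, 0.4], bias=-0.1))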
Network architectures
Three different classes of network architectures:
- single-layer feed-forward
- multi-layer feed-forward
- recurrent
In the feed-forward classes the neurons are organized in acyclic layers.
The architecture of a neural network is closely linked with the learning algorithm used to train it.
Single Layer Feed-forward
Figure: an input layer of source nodes connected directly to an output layer of neurons.
Multi layer feed-forward
Figure: a 3-4-2 network, with an input layer, one hidden layer, and an output layer.
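A rough sketch of a forward pass through the 3-4-2 network (my own illustration: the sigmoid activation, the random weights and the zero biases are assumptions, not values from the slides):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# 3-4-2 network: 3 source nodes, 4 hidden neurons, 2 output neurons
W_hidden = rng.normal(size=(4, 3))   # weights from input layer to hidden layer
W_output = rng.normal(size=(2, 4))   # weights from hidden layer to output layer
b_hidden, b_output = np.zeros(4), np.zeros(2)

def forward(x):
    h = sigmoid(W_hidden @ x + b_hidden)     # hidden-layer activations
    return sigmoid(W_output @ h + b_output)  # output-layer activations

print(forward(np.array([1.0, 0.0, 1.0])))    # two outputs for a 3-element input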
Dimensions of a Neural Network
Various types of neurons
Various network architectures
Various learning algorithms
Various applications
Back Propagation Learning
- In 1969 a method for learning in multi-layer networks, back-propagation, was invented by Bryson and Ho.
- The back-propagation algorithm is a sensible approach for dividing the contribution of each weight to the error.
Back-propagation Learning Principles: Hidden Layers and Gradients
There are two differences in the updating rule:
1) The activation of the hidden unit is used instead of the input value.
2) The rule contains a term for the gradient of the activation function.
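For the sigmoid activation used later in this lecture, the gradient term mentioned in point 2 takes a particularly simple form (a standard result, not spelled out on this slide), which is why the factor O_j(1 - O_j) appears in the error formulas of the worked example:

\sigma(x) = 1 / (1 + e^{-x})
\sigma'(x) = \sigma(x) (1 - \sigma(x))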
Back-propagation Network Training
1. Initialize the network with random weights.
2. For all training cases (called examples):
   a. Present the training inputs to the network and calculate the output.
   b. For all layers (starting with the output layer, back to the input layer):
      i. Compare the network output with the correct output (error function); the correct output is what you want the network to produce.
      ii. Adapt the weights in the current layer.
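A minimal Python sketch of this training procedure, assuming one hidden layer, sigmoid activations, a squared-error criterion and a fixed learning rate (none of these choices are specified on the slide):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(examples, n_in, n_hidden, n_out, rate=0.5, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    # 1. initialize the network with small random weights
    W1 = rng.normal(scale=0.5, size=(n_hidden, n_in)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_out, n_hidden)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        # 2. for all training cases (examples)
        for x, target in examples:
            # 2a. present the training inputs and calculate the output
            h = sigmoid(W1 @ x + b1)          # hidden-layer activations
            o = sigmoid(W2 @ h + b2)          # network output
            # 2b-i. compare with the correct output: squared-error gradient
            #       times the sigmoid gradient at the output layer
            err_o = (target - o) * o * (1 - o)
            # the error propagated back to the hidden layer
            err_h = (W2.T @ err_o) * h * (1 - h)
            # 2b-ii. adapt the weights in each layer
            W2 += rate * np.outer(err_o, h); b2 += rate * err_o
            W1 += rate * np.outer(err_h, x); b1 += rate * err_h
    return W1, b1, W2, b2

# usage: learn XOR from four examples (illustrative data, not from the lecture)
data = [(np.array(x, float), np.array(y, float))
        for x, y in [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]]
W1, b1, W2, b2 = train(data, n_in=2, n_hidden=3, n_out=1)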
Neural Network Example
Figure (not reproduced): an example network with bias terms θ4, θ5 and θ6 and a single output.
Neural Network Example
Steps:
1. Initialize the weights
2. Propagate the inputs forward; the net input to unit j is
   I_j = \sum_i w_{ij} O_i + \theta_j
   where w_ij is the weight of the connection from unit i in the previous layer to unit j, O_i is the output of unit i from the previous layer, and θ_j is the bias of the unit.
Neural Network Example
Steps:
3. Calculate output at each node
4. Calculate error
Neural Network Example
Steps:
5. Update weights
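Putting steps 1-5 together for one pass over a single training example, on a network with the same layout as the example figure (units 1-3 as inputs, units 4 and 5 hidden, unit 6 the output, which is my reading of the figure); every numeric value below is an illustrative assumption, since the figure's numbers are not reproduced in this transcript:

import math

# illustrative inputs, weights, biases, target and learning rate (assumptions)
x = {1: 1.0, 2: 0.0, 3: 1.0}                       # outputs of the input units
w = {(1, 4): 0.2, (2, 4): 0.4, (3, 4): -0.5,       # links into hidden unit 4
     (1, 5): -0.3, (2, 5): 0.1, (3, 5): 0.2,       # links into hidden unit 5
     (4, 6): -0.3, (5, 6): -0.2}                   # links into output unit 6
theta = {4: -0.4, 5: 0.2, 6: 0.1}                  # biases
target, rate = 1.0, 0.9

sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

# steps 2-3: propagate the inputs forward and compute each unit's output
O = dict(x)
for j in (4, 5, 6):
    I_j = sum(w[i, j] * O[i] for i in list(O) if (i, j) in w) + theta[j]
    O[j] = sigmoid(I_j)

# step 4: error at the output node, then at the hidden nodes
err = {6: O[6] * (1 - O[6]) * (target - O[6])}
for j in (4, 5):
    err[j] = O[j] * (1 - O[j]) * err[6] * w[j, 6]

# step 5: update each weight and bias
for (i, j), w_ij in list(w.items()):
    w[i, j] = w_ij + rate * err[j] * O[i]
for j in theta:
    theta[j] = theta[j] + rate * err[j]

print(O[6], err)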
Neural Network Example
Solution:
1. Calculate the input at each node with the following formula.
2. Calculate the output at each node with the following formula.
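Neither formula survived in this transcript; the standard ones for a sigmoid unit (in the notation of Han & Kamber, which this example appears to follow) would be:

I_j = \sum_i w_{ij} O_i + \theta_j        (net input to node j)
O_j = 1 / (1 + e^{-I_j})                  (output of node j)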
Neural Network Example
Solution:
3. Calculate the error at each node:
3a. If the node is a hidden-layer node, then use the following formula.
3b. If the node is an output node, then use the following formula.
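The error formulas are likewise missing here; in the standard sigmoid formulation (again Han & Kamber's notation, with T_j the target output and k ranging over the nodes in the next layer) they would be:

Err_j = O_j (1 - O_j) \sum_k Err_k w_{jk}     (3a: hidden-layer node)
Err_j = O_j (1 - O_j) (T_j - O_j)             (3b: output node)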
Neural Network Example
Solution:
4. Update the biases and weights with the following formulas.
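The update formulas are also not reproduced; with learning rate l, the standard updates (as in Han & Kamber) would be:

w_{ij} = w_{ij} + l \cdot Err_j \cdot O_i
\theta_j = \theta_j + l \cdot Err_j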
Role of Bias
A simple way to understand the bias is that it plays a role similar to the coefficient b of a linear function
y = ax + b
It allows you to move the line up and down to fit the prediction to the data better. Without b the line always goes through the origin (0, 0), and you may get a poorer fit.
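A quick numerical illustration of this point (my own example, not from the slides): fitting a line to data whose true intercept is 3, with and without the intercept term.

import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 20)
y = 2 * x + 3 + rng.normal(scale=0.1, size=x.size)   # true line: slope 2, intercept 3

# least-squares fit with an intercept (analogous to a unit with a bias)
slope_b, intercept_b = np.polyfit(x, y, 1)

# least-squares fit forced through the origin (a unit with no bias)
slope_nb = (x @ y) / (x @ x)

print("with bias:    y = %.2f x + %.2f" % (slope_b, intercept_b))
print("without bias: y = %.2f x" % slope_nb)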
Face Recognition
An ANN can be about 90% accurate at learning head pose and at recognizing 1 of 20 faces.
Handwritten digit recognition
References
Russell, S. and Norvig, P., Artificial Intelligence: A Modern Approach, 2nd ed., Pearson Education.
Han, J. and Kamber, M., Data Mining: Concepts and Techniques, 2nd ed., Morgan Kaufmann.