CS 4100 Artificial Intelligence
Prof. C. Hafner
Class Notes April 3, 2012
Term Project Presentations
Thursday, April 12 Groups:
1.
2.
3.
4.
Tuesday, April 17 Groups:
5.
6.
7.
8.
9.
Naive Bayes Classifiers:
Our next example of machine learning
• A supervised learning method
• By making an independence assumption (features are
conditionally independent given the class), we can explore a
simple subset of Bayesian nets, such that:
• It is easy to estimate the CPTs from sample data
• Estimation uses a technique called “maximum likelihood
estimation”
– Given a set of correctly classified representative
examples
– Q: What estimates of conditional probabilities maximize
the likelihood of the data that was observed?
– A: The estimates that reflect the sample proportions
# Juniors / # Students were Juniors and
# Non-Juniors / # Students were Non-Juniors
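The maximum likelihood estimates above are just counts turned into proportions. A minimal sketch, with an illustrative made-up sample (the data and variable names are assumptions, not from the notes):

```python
from collections import Counter

# Hypothetical labeled sample: each tuple is (class label, Major value).
sample = [("Junior", "Science"), ("Junior", "Arts"),
          ("NonJunior", "Science"), ("NonJunior", "Science"),
          ("NonJunior", "Arts")]

# MLE of the prior P(Class): the sample proportion of each class.
n = len(sample)
class_counts = Counter(label for label, _ in sample)
prior = {c: class_counts[c] / n for c in class_counts}

# MLE of the conditional P(Major | Class): proportions within each class.
joint_counts = Counter(sample)
cpt = {(c, m): joint_counts[(c, m)] / class_counts[c]
       for (c, m) in joint_counts}

print(prior)                          # {'Junior': 0.4, 'NonJunior': 0.6}
print(cpt[("NonJunior", "Science")])  # 2 of 3 Non-Juniors are Science majors
```

These are exactly the estimates that maximize the likelihood of the observed data.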
Class Exercise: Naive Bayes Classifier
with multi-valued variables
Major: Science, Arts, Social Science
Student characteristics:
Gender (M,F),
Race/Ethnicity (W, B, H, A)
International (T/F)
What do the conditional probability tables look like?
Perceptron Learning Algorithm and BackProp
Perceptron Learning (Supervised)
• Assign random weights (or set all to 0)
• Cycle through input data until change < target
• Let α be the “learning coefficient”
• For each input:
– If perceptron gives correct answer, do nothing
– If perceptron says yes when answer should be no,
decrease the weights on all units that “fired” by α
– If perceptron says no when answer should be yes,
increase the weights on all units that “fired” by α
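The update rule above can be sketched directly in code. This is a minimal version for binary 0/1 inputs; the fixed threshold, the OR training set, and the value of α are illustrative assumptions, not from the notes:

```python
def train_perceptron(data, alpha=0.1, max_epochs=100):
    """data: list of (inputs, target) pairs; inputs is a 0/1 tuple, target is 0 or 1."""
    n = len(data[0][0])
    w = [0.0] * n          # set all weights to 0 (one option from the notes)
    threshold = 0.5        # assumed fixed threshold; no bias unit in this sketch
    for _ in range(max_epochs):
        changed = False
        for x, target in data:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) > threshold else 0
            if out == target:
                continue   # correct answer: do nothing
            changed = True
            # wrong answer: adjust only the weights of units that "fired"
            # (xi == 1); increase if the answer should have been yes,
            # decrease if it should have been no
            delta = alpha if target == 1 else -alpha
            for i, xi in enumerate(x):
                if xi:
                    w[i] += delta
        if not changed:    # no weight changed this cycle: stop
            break
    return w

# Illustrative run: learn logical OR, which is linearly separable.
OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = train_perceptron(OR)
```

For a linearly separable target like OR, this cycling is guaranteed to stop with weights that classify every example correctly.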