Rule-Based Methods
Lecture Notes for Chapter 5
Introduction to Data Mining
by
Tan, Steinbach, Kumar
Rule-Based Classifier
Classify records by using a collection of "if…then…" rules
Rule: (Condition) → y
– where Condition is a conjunction of attribute tests
  and y is the class label
– LHS: rule antecedent or condition
– RHS: rule consequent
– Examples of classification rules:
  (Blood Type=Warm) ∧ (Lay Eggs=Yes) → Birds
  (Taxable Income < 50K) ∧ (Refund=Yes) → Evade=No
Rule-based Classifier (Example)
Name           Blood Type  Give Birth  Can Fly  Live in Water  Class
human          warm        yes         no       no             mammals
python         cold        no          no       no             reptiles
salmon         cold        no          no       yes            fishes
whale          warm        yes         no       yes            mammals
frog           cold        no          no       sometimes      amphibians
komodo         cold        no          no       no             reptiles
bat            warm        yes         yes      no             mammals
pigeon         warm        no          yes      no             birds
cat            warm        yes         no       no             mammals
leopard shark  cold        yes         no       yes            fishes
turtle         cold        no          no       sometimes      reptiles
penguin        warm        no          no       sometimes      birds
porcupine      warm        yes         no       no             mammals
eel            cold        no          no       yes            fishes
salamander     cold        no          no       sometimes      amphibians
gila monster   cold        no          no       no             reptiles
platypus       warm        no          no       no             mammals
owl            warm        no          yes      no             birds
dolphin        warm        yes         no       yes            mammals
eagle          warm        no          yes      no             birds
R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
Application of Rule-Based Classifier
A rule r covers an instance x if the attributes of
the instance satisfy the condition of the rule
R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
Name          Blood Type  Give Birth  Can Fly  Live in Water  Class
hawk          warm        no          yes      no             ?
grizzly bear  warm        yes         no       no             ?
The rule R1 covers a hawk => Bird
The rule R3 covers the grizzly bear => Mammal
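To make the covering test concrete, here is a minimal Python sketch (not from the slides): each rule is a dictionary of attribute tests plus a class label, and a rule covers an instance when every test is satisfied.

```python
# Minimal sketch: rules as (condition, label) pairs; the dict-based
# representation is an illustrative assumption, not the book's notation.
RULES = [
    ({"Give Birth": "no",  "Can Fly": "yes"},       "Birds"),      # R1
    ({"Give Birth": "no",  "Live in Water": "yes"}, "Fishes"),     # R2
    ({"Give Birth": "yes", "Blood Type": "warm"},   "Mammals"),    # R3
    ({"Give Birth": "no",  "Can Fly": "no"},        "Reptiles"),   # R4
    ({"Live in Water": "sometimes"},                "Amphibians"), # R5
]

def covers(condition, instance):
    """A rule covers an instance if every attribute test is satisfied."""
    return all(instance.get(attr) == value for attr, value in condition.items())

hawk    = {"Blood Type": "warm", "Give Birth": "no",  "Can Fly": "yes", "Live in Water": "no"}
grizzly = {"Blood Type": "warm", "Give Birth": "yes", "Can Fly": "no",  "Live in Water": "no"}

for name, x in [("hawk", hawk), ("grizzly bear", grizzly)]:
    fired = [label for cond, label in RULES if covers(cond, x)]
    print(name, "->", fired)   # hawk -> ['Birds'], grizzly bear -> ['Mammals']
```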
Rule Coverage and Accuracy
Coverage of a rule:
– Fraction of records that satisfy the antecedent of a rule
Accuracy of a rule:
– Fraction of records that satisfy both the antecedent and consequent of a rule

Tid  Refund  Marital Status  Taxable Income  Class
1    Yes     Single          125K            No
2    No      Married         100K            No
3    No      Single          70K             No
4    Yes     Married         120K            No
5    No      Divorced        95K             Yes
6    No      Married         60K             No
7    Yes     Divorced        220K            No
8    No      Single          85K             Yes
9    No      Married         75K             No
10   No      Single          90K             Yes

(Status=Single) → No
Coverage = 40%, Accuracy = 50%
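A small sketch of the computation, assuming the table above stored as a list of tuples:

```python
# Coverage and accuracy of (Status=Single) -> No on the 10-record table.
records = [
    # (Tid, Refund, Marital Status, Taxable Income in K, Class)
    (1, "Yes", "Single",   125, "No"),
    (2, "No",  "Married",  100, "No"),
    (3, "No",  "Single",    70, "No"),
    (4, "Yes", "Married",  120, "No"),
    (5, "No",  "Divorced",  95, "Yes"),
    (6, "No",  "Married",   60, "No"),
    (7, "Yes", "Divorced", 220, "No"),
    (8, "No",  "Single",    85, "Yes"),
    (9, "No",  "Married",   75, "No"),
    (10, "No", "Single",    90, "Yes"),
]

covered = [r for r in records if r[2] == "Single"]   # antecedent holds
correct = [r for r in covered if r[4] == "No"]       # consequent also holds

print("Coverage =", len(covered) / len(records))  # 4/10 = 0.4
print("Accuracy =", len(correct) / len(covered))  # 2/4  = 0.5
```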
How does Rule-based Classifier Work?
R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
Name           Blood Type  Give Birth  Can Fly  Live in Water  Class
lemur          warm        yes         no       no             ?
turtle         cold        no          no       sometimes      ?
dogfish shark  cold        yes         no       yes            ?
A lemur triggers rule R3, so it is classified as a mammal
A turtle triggers both R4 and R5
A dogfish shark triggers none of the rules
Characteristics of Rule-Based Classifier
Mutually exclusive rules
– Classifier contains mutually exclusive rules if
the rules are independent of each other
– Every record is covered by at most one rule
Exhaustive rules
– Classifier has exhaustive coverage if it
accounts for every possible combination of
attribute values
– Each record is covered by at least one rule
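Both properties can be checked mechanically. A minimal sketch (the helper names are illustrative, not from the slides): a rule set is mutually exclusive if no record triggers more than one rule, and exhaustive if every record triggers at least one.

```python
def covers(condition, instance):
    return all(instance.get(a) == v for a, v in condition.items())

def check_properties(rules, records):
    # For each record, count how many rules it triggers.
    counts = [sum(covers(cond, x) for cond, _ in rules) for x in records]
    return {"mutually_exclusive": max(counts) <= 1,   # at most one rule per record
            "exhaustive": min(counts) >= 1}           # at least one rule per record

rules = [({"Give Birth": "no", "Can Fly": "no"}, "Reptiles"),    # R4
         ({"Live in Water": "sometimes"}, "Amphibians")]         # R5
turtle = {"Give Birth": "no", "Can Fly": "no", "Live in Water": "sometimes"}
print(check_properties(rules, [turtle]))
# {'mutually_exclusive': False, 'exhaustive': True} -- turtle triggers both rules
```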
From Decision Trees To Rules
[Decision tree: Refund=Yes → NO; Refund=No → Marital Status:
{Single, Divorced} → Taxable Income (< 80K → NO, > 80K → YES); {Married} → NO]

Classification Rules:
(Refund=Yes) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income<80K) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income>80K) ==> Yes
(Refund=No, Marital Status={Married}) ==> No

Rules are mutually exclusive and exhaustive
Rule set contains as much information as the tree
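The conversion itself is a walk over root-to-leaf paths: each path becomes one rule whose antecedent is the conjunction of the test outcomes along it. A hedged sketch, using an ad-hoc tuple encoding of the tree above (the encoding is an assumption for illustration):

```python
def tree_to_rules(node, path=()):
    """node is either a class label (leaf) or (attribute, {outcome: subtree})."""
    if not isinstance(node, tuple):            # leaf: emit one rule for this path
        return [(path, node)]
    attr, branches = node
    rules = []
    for outcome, child in branches.items():    # extend the path for each branch
        rules += tree_to_rules(child, path + ((attr, outcome),))
    return rules

tree = ("Refund", {
    "Yes": "No",
    "No": ("Marital Status", {
        "{Single,Divorced}": ("Taxable Income", {"<80K": "No", ">80K": "Yes"}),
        "{Married}": "No",
    }),
})

for antecedent, label in tree_to_rules(tree):
    print(", ".join(f"{a}={v}" for a, v in antecedent), "==>", label)
```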
Rules Can Be Simplified
Tid  Refund  Marital Status  Taxable Income  Cheat
1    Yes     Single          125K            No
2    No      Married         100K            No
3    No      Single          70K             No
4    Yes     Married         120K            No
5    No      Divorced        95K             Yes
6    No      Married         60K             No
7    Yes     Divorced        220K            No
8    No      Single          85K             Yes
9    No      Married         75K             No
10   No      Single          90K             Yes

[Same decision tree as before: Refund=Yes → NO; Refund=No → Marital Status:
{Single, Divorced} → Taxable Income (< 80K → NO, > 80K → YES); {Married} → NO]

Initial Rule: (Refund=No) ∧ (Status=Married) → No
Simplified Rule: (Status=Married) → No
(every Married record has Cheat=No regardless of Refund, so the Refund test is redundant)
Effect of Rule Simplification
Rules are no longer mutually exclusive
– A record may trigger more than one rule
– Solution?
Ordered rule set
Unordered rule set – use voting schemes
Rules are no longer exhaustive
– A record may not trigger any rules
– Solution?
Use a default class
Ordered Rule Set
Rules are rank ordered according to their priority
– An ordered rule set is known as a decision list
When a test record is presented to the classifier
– It is assigned to the class label of the highest ranked rule it has
triggered
– If none of the rules fired, it is assigned to the default class
R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
R6: Default = mammals

Name    Blood Type  Give Birth  Can Fly  Live in Water  Class
turtle  cold        no          no       sometimes      ?
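A decision list is straightforward to implement: scan the rules in priority order and fall back to the default class. A sketch using the dictionary rule representation from earlier:

```python
def covers(condition, instance):
    return all(instance.get(a) == v for a, v in condition.items())

def classify(decision_list, instance, default="mammals"):
    for condition, label in decision_list:   # rules in priority order
        if covers(condition, instance):
            return label
    return default                           # R6: no rule fired

decision_list = [
    ({"Give Birth": "no",  "Can Fly": "yes"},       "Birds"),      # R1
    ({"Give Birth": "no",  "Live in Water": "yes"}, "Fishes"),     # R2
    ({"Give Birth": "yes", "Blood Type": "warm"},   "Mammals"),    # R3
    ({"Give Birth": "no",  "Can Fly": "no"},        "Reptiles"),   # R4
    ({"Live in Water": "sometimes"},                "Amphibians"), # R5
]

turtle = {"Blood Type": "cold", "Give Birth": "no",
          "Can Fly": "no", "Live in Water": "sometimes"}
print(classify(decision_list, turtle))  # 'Reptiles' -- R4 outranks R5
```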
Rule Ordering Schemes
Rule-based ordering
– Individual rules are ranked based on their quality
Class-based ordering
– Rules that belong to the same class appear together, with the classes
  themselves ordered (e.g., by class proportion)
Rule-based Ordering:
(Refund=Yes) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income<80K) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income>80K) ==> Yes
(Refund=No, Marital Status={Married}) ==> No

Class-based Ordering:
(Refund=Yes) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income<80K) ==> No
(Refund=No, Marital Status={Married}) ==> No
(Refund=No, Marital Status={Single,Divorced}, Taxable Income>80K) ==> Yes
Building Classification Rules
Direct Method:
Extract rules directly from data
e.g.: RIPPER, CN2, Holte’s 1R
Indirect Method:
Extract rules from other classification models
(e.g., decision trees, neural networks)
e.g.: C4.5rules
Holte’s 1R method
– For each attribute:
   For each value of the attribute:
   – Form a rule: predict the class that maximizes the accuracy
     (the most frequent class among records with that value)
   Calculate the average training error of the attribute
– Select the attribute with the maximum accuracy (or minimum error)

Example: for the attributes Outlook, Temp, Humidity, Windy:
– Outlook=Sunny, choose class=N, error=2/5
– Outlook=Overcast, choose class=P, error=0
– …
– Outlook’s average error = (5/14)·(2/5) + 0 + (5/14)·(2/5) = 4/14
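A hedged sketch of 1R: for each attribute, predict the majority class for each of its values, then keep the attribute with the lowest total training error. The toy records are illustrative stand-ins (the slide's full weather table is not reproduced here).

```python
from collections import Counter, defaultdict

def one_r(records, attributes, target):
    best = None
    for attr in attributes:
        by_value = defaultdict(Counter)
        for r in records:                      # count classes per attribute value
            by_value[r[attr]][r[target]] += 1
        # One rule per value: predict the majority class for that value.
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(sum(c.values()) - max(c.values()) for c in by_value.values())
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best

records = [
    {"Outlook": "Sunny",    "Windy": "no",  "Play": "N"},
    {"Outlook": "Sunny",    "Windy": "yes", "Play": "N"},
    {"Outlook": "Overcast", "Windy": "no",  "Play": "P"},
    {"Outlook": "Rain",     "Windy": "no",  "Play": "P"},
    {"Outlook": "Rain",     "Windy": "yes", "Play": "N"},
]
attr, rule, errors = one_r(records, ["Outlook", "Windy"], "Play")
print(attr, rule, errors)
# Outlook {'Sunny': 'N', 'Overcast': 'P', 'Rain': 'P'} 1  (tie on Rain broken arbitrarily)
```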
Direct Method: Sequential Covering
1. Start from an empty rule
2. Grow a rule using the Learn-One-Rule function
3. Remove training records covered by the rule
4. Repeat steps (2) and (3) until the stopping criterion is met
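The loop structure can be sketched directly. Here learn_one_rule is an assumed placeholder for any rule grower (e.g., one driven by FOIL's gain, covered later); it returns a (condition, label) pair in the dict representation used earlier, or None when no acceptable rule can be grown.

```python
def covers(condition, instance):
    return all(instance.get(a) == v for a, v in condition.items())

def sequential_covering(records, learn_one_rule):
    rules = []
    remaining = list(records)                 # step 1: start with all records
    while remaining:
        rule = learn_one_rule(remaining)      # step 2: grow one rule
        if rule is None:                      # stopping criterion met
            break
        rules.append(rule)
        condition, _ = rule
        remaining = [r for r in remaining     # step 3: drop covered records
                     if not covers(condition, r)]
    return rules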
Example of Sequential Covering
[Figure: (i) original data; (ii) Step 1: the first rule R1 is grown]
Example of Sequential Covering…
[Figure: (iii) Step 2: R1 covers part of the data, and the records it covers are removed; (iv) Step 3: a second rule R2 is grown on the remaining records]
Aspects of Sequential Covering
Rule Growing
Instance Elimination
Rule Evaluation
Stopping Criterion
Rule Pruning
Rule Growing
Two common strategies:

(a) General-to-specific: start with an empty rule {} (Yes: 3, No: 4) and
greedily add the conjunct that best improves rule quality, e.g.:
   Refund=No        (Yes: 3, No: 4)
   Status=Single    (Yes: 2, No: 1)
   Status=Divorced  (Yes: 1, No: 0)
   Status=Married   (Yes: 0, No: 3)
   ...
   Income>80K       (Yes: 3, No: 1)

(b) Specific-to-general: start from specific rules such as
   (Refund=No, Status=Single, Income=85K) → (Class=Yes)
   (Refund=No, Status=Single, Income=90K) → (Class=Yes)
and generalize them, e.g. to (Refund=No, Status=Single) → (Class=Yes)
Rule Growing (Examples)
RIPPER Algorithm:
– Start from an empty rule: {} => class
– Add the conjunct that maximizes FOIL’s information gain measure:
  R0: {} => class (initial rule)
  R1: {A} => class (rule after adding conjunct)
  Gain(R0, R1) = p1 × [ log2(p1/(p1+n1)) - log2(p0/(p0+n0)) ]
  where p0: number of positive instances covered by R0
        n0: number of negative instances covered by R0
        p1: number of positive instances covered by R1
        n1: number of negative instances covered by R1
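A direct transcription of the gain formula, with an illustrative example (the counts are made up):

```python
from math import log2

def foil_gain(p0, n0, p1, n1):
    """p/n: positive/negative instances covered before (0) and after (1)
    adding the conjunct. Gain is large when R1 is much purer than R0
    while still covering many positives."""
    return p1 * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0)))

# e.g. R0 = {} covers 100 pos / 400 neg; adding conjunct A leaves 4 pos / 1 neg:
print(foil_gain(100, 400, 4, 1))  # 4 * (log2(0.8) - log2(0.2)) = 8.0 bits
```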
Instance Elimination
Why do we need to eliminate instances?
– Otherwise, the next rule is identical to the previous rule
– Suppose R1 is generated first: we remove all instances covered by R1

Why do we remove positive instances?
– Ensure that the next rule is different
– Prevent overestimating the accuracy of the next rule (e.g., R3)

Why do we remove negative instances?
– Prevent underestimating the accuracy of the next rule (e.g., R3)
– Compare rules R2 and R3 in the diagram

[Figure: instances of class + and class - with rule regions R1, R2, and R3.
After removing the instances covered by R1:
accuracy of R3 = 6/8 = 0.75, accuracy of R2 = 7/10 = 0.7]
Stopping Criterion and Rule Pruning
Stopping criterion
– Compute the gain
– If the gain is not significant (below some threshold θ), discard the new rule

Rule Pruning: check whether a condition (conjunct) can be removed from a rule
while improving the generalization error
– Similar to post-pruning of decision trees

Reduced Error Pruning (assumes a validation data set):
– Remove one of the conjuncts in the rule
– Compare the error rate on the validation set before and after pruning
– If the error decreases, prune the conjunct

Measure for pruning: v = (p - n)/(p + n)
– p: number of positive examples covered by the rule in the validation set
– n: number of negative examples covered by the rule in the validation set
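A hedged sketch of reduced error pruning driven by the v metric (helper names and the dict record format with a "Class" key are illustrative assumptions): greedily drop a conjunct whenever the rule scores no worse on the validation set without it.

```python
def covers(condition, instance):
    return all(instance.get(a) == v for a, v in condition.items())

def v_metric(condition, label, validation):
    cov = [r for r in validation if covers(condition, r)]
    p = sum(1 for r in cov if r["Class"] == label)   # positives covered
    n = len(cov) - p                                  # negatives covered
    return (p - n) / (p + n) if cov else float("-inf")

def prune_rule(condition, label, validation):
    improved = True
    while improved and len(condition) > 1:
        improved = False
        base = v_metric(condition, label, validation)
        for attr in list(condition):
            trial = {a: v for a, v in condition.items() if a != attr}
            if v_metric(trial, label, validation) > base:  # error decreases
                condition, improved = trial, True
                break
    return condition
```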
Summary of Direct Method
Grow a single rule
Remove instances covered by the rule
Prune the rule (if necessary)
Add the rule to the current rule set
Repeat
Indirect Methods
[Decision tree: P=No → Q (Q=No → -, Q=Yes → +);
P=Yes → R (R=No → +, R=Yes → Q (Q=No → -, Q=Yes → +))]

Rule Set:
r1: (P=No, Q=No) ==> -
r2: (P=No, Q=Yes) ==> +
r3: (P=Yes, R=No) ==> +
r4: (P=Yes, R=Yes, Q=No) ==> -
r5: (P=Yes, R=Yes, Q=Yes) ==> +
Indirect Method: C4.5rules
Extract rules from an unpruned decision tree
For each rule, r: A → y,
– consider an alternative rule r′: A′ → y, where A′
  is obtained by removing one of the conjuncts in A
– Compare the pessimistic error rate for r
  against all r′s
– Prune if one of the r′s has a lower pessimistic
  error rate
– Repeat until we can no longer improve the
  generalization error
Indirect Method: C4.5rules
Instead of ordering the rules, order subsets of
rules (class ordering)
– Each subset is a collection of rules with the
same rule consequent (class)
– Compute description length of each subset
Description length = L(error) + g × L(model)
g is a parameter that takes into account the
presence of redundant attributes in a rule set
(default value = 0.5)
Example
Name           Give Birth  Lay Eggs  Can Fly  Live in Water  Have Legs  Class
human          yes         no        no       no             yes        mammals
python         no          yes       no       no             no         reptiles
salmon         no          yes       no       yes            no         fishes
whale          yes         no        no       yes            no         mammals
frog           no          yes       no       sometimes      yes        amphibians
komodo         no          yes       no       no             yes        reptiles
bat            yes         no        yes      no             yes        mammals
pigeon         no          yes       yes      no             yes        birds
cat            yes         no        no       no             yes        mammals
leopard shark  yes         no        no       yes            no         fishes
turtle         no          yes       no       sometimes      yes        reptiles
penguin        no          yes       no       sometimes      yes        birds
porcupine      yes         no        no       no             yes        mammals
eel            no          yes       no       yes            no         fishes
salamander     no          yes       no       sometimes      yes        amphibians
gila monster   no          yes       no       no             yes        reptiles
platypus       no          yes       no       no             yes        mammals
owl            no          yes       yes      no             yes        birds
dolphin        yes         no        no       yes            no         mammals
eagle          no          yes       yes      no             yes        birds
C4.5 versus C4.5rules versus RIPPER
[C4.5 decision tree: Give Birth? Yes → Mammals; No → Live In Water?
Yes → Fishes, Sometimes → Amphibians, No → Can Fly? Yes → Birds, No → Reptiles]

C4.5rules:
(Give Birth=No, Can Fly=Yes) → Birds
(Give Birth=No, Live in Water=Yes) → Fishes
(Give Birth=Yes) → Mammals
(Give Birth=No, Can Fly=No, Live in Water=No) → Reptiles
( ) → Amphibians

RIPPER:
(Live in Water=Yes) → Fishes
(Have Legs=No) → Reptiles
(Give Birth=No, Can Fly=No, Live In Water=No) → Reptiles
(Can Fly=Yes, Give Birth=No) → Birds
( ) → Mammals
Introduction to Data Mining
4/18/2004
‹#›
C4.5 versus C4.5rules versus RIPPER
C4.5 and C4.5rules:

                        PREDICTED CLASS
                  Amphibians  Fishes  Reptiles  Birds  Mammals
ACTUAL Amphibians     2         0        0        0       0
CLASS  Fishes         0         2        0        0       1
       Reptiles       1         0        3        0       0
       Birds          1         0        0        3       0
       Mammals        0         0        1        0       6

RIPPER:

                        PREDICTED CLASS
                  Amphibians  Fishes  Reptiles  Birds  Mammals
ACTUAL Amphibians     0         0        0        0       2
CLASS  Fishes         0         3        0        0       0
       Reptiles       0         0        3        0       1
       Birds          0         0        1        2       1
       Mammals        0         2        1        0       4
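As a quick check on the matrices above, overall accuracy is the trace (sum of the diagonal) divided by the total number of instances; with the matrices as reconstructed, C4.5/C4.5rules gets 16/20 and RIPPER 12/20:

```python
def accuracy(matrix):
    correct = sum(matrix[i][i] for i in range(len(matrix)))  # diagonal = correct
    return correct / sum(sum(row) for row in matrix)

# Rows/columns: Amphibians, Fishes, Reptiles, Birds, Mammals
c45 = [[2, 0, 0, 0, 0],
       [0, 2, 0, 0, 1],
       [1, 0, 3, 0, 0],
       [1, 0, 0, 3, 0],
       [0, 0, 1, 0, 6]]

ripper = [[0, 0, 0, 0, 2],
          [0, 3, 0, 0, 0],
          [0, 0, 3, 0, 1],
          [0, 0, 1, 2, 1],
          [0, 2, 1, 0, 4]]

print(accuracy(c45))     # 16/20 = 0.8
print(accuracy(ripper))  # 12/20 = 0.6
```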
Advantages of Rule-Based Classifiers
As highly expressive as decision trees
Easy to interpret
Easy to generate
Can classify new instances rapidly
Performance comparable to decision trees