Transcript Document

Artificial Intelligence
Knowledge Representation Problem
Knowledge bases

Knowledge base = set of sentences in a
formal language
Stages of Knowledge Use

Acquisition
 - structure of facts
 - integration of old & new knowledge

Retrieval (recall)
 - roles of linking and chunking
 - means of improving recall efficiency

Representation
Set of syntactic and semantic conventions which make it possible to describe things.
 Syntax
   - the specific symbols allowed and the rules for arranging them
 Semantics
   - how meaning is associated with the symbol arrangements allowed by the syntax
Knowledge Representation Schemas
 Logic based representation – first order
predicate logic, Prolog
 Procedural representation – rules, production
system
 Network representation – semantic networks,
conceptual graphs
 Structural representation – scripts, frames,
objects
Conceptual Graphs
 Each concept has a type and an instance.
 General concept – a concept with a wildcard instance, e.g. [dog:*X] → (colour) → [brown]
 Specific concept – a concept with a concrete instance, e.g. [dog:Emma] → (colour) → [brown]
 There exists a hierarchy of types (subtypes): concept w is a specialisation of concept v if type(v) > type(w) or instance(w)::type(v), e.g. animal > dog, animal > cat.
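To make the specialisation test concrete, here is a minimal Python sketch (not part of the lecture); the type hierarchy, the (type, instance) encoding of concepts, and the helper names are assumptions made for the example.

# A minimal, illustrative encoding of conceptual graphs (not from the lecture).
# Concepts are (type, instance) pairs; '*' marks a wildcard (general concept).
# Relations link concepts, e.g. [dog:Emma] -(colour)-> [brown].

# Assumed type hierarchy: child -> parent
TYPE_PARENT = {"dog": "animal", "cat": "animal", "animal": None, "colour": None}

def is_subtype(t, ancestor):
    """True if type t is the same as, or a descendant of, ancestor."""
    while t is not None:
        if t == ancestor:
            return True
        t = TYPE_PARENT.get(t)
    return False

def is_specialisation(w, v):
    """Concept w = (type, instance) specialises concept v if type(v) is a
    strict supertype of type(w), or w's concrete instance conforms to type(v)."""
    tw, iw = w
    tv, _ = v
    strictly_more_general = is_subtype(tw, tv) and tw != tv
    instance_conforms = iw != "*" and is_subtype(tw, tv)
    return strictly_more_general or instance_conforms

# A tiny graph as (concept, relation, concept) triples
graph = {(("dog", "Emma"), "colour", ("colour", "brown"))}

print(is_specialisation(("dog", "Emma"), ("dog", "*")))     # True: dog:Emma specialises dog:*X
print(is_specialisation(("dog", "Emma"), ("animal", "*")))  # True: dog is a subtype of animal
print(is_specialisation(("animal", "*"), ("dog", "*")))     # False: animal is more general than dog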
Types of Knowledge

Objects
 - both physical and conceptual
Events
 - usually involve time
 - may involve cause & effect relationships
Performance
 - how to do things
META Knowledge
 - knowledge about how to use knowledge
Propositional Logic
Basic connectives and truth tables
Statements (propositions): declarative sentences that are either true or false, but not both.
E.g. Ahmed Hassan wrote Gone with the Wind.
     2 + 3 = 5.
Not statements:
     What a beautiful morning!
     Get up and do your exercises.
Fundamentals of Logic
"The number x is an integer."
is not a statement because its truth value cannot
be determined until a numerical value is
assigned for x.
Propositional logic
 Logical constants: true, false
 Propositional symbols: P, Q, S, ... (atomic sentences)
 Sentences are combined by connectives:
   ∧  ...and            [conjunction]
   ∨  ...or             [disjunction]
   →  ...implies        [implication / conditional]
   ↔  ...is equivalent  [biconditional]
   ¬  ...not            [negation]
 Literal: atomic sentence or negated atomic sentence
Truth Tables

p q | ¬p | ¬q | p∧q | p∨q | p→q | p↔q
0 0 |  1 |  1 |  0  |  0  |  1  |  1
0 1 |  1 |  0 |  0  |  1  |  1  |  0
1 0 |  0 |  1 |  0  |  1  |  0  |  0
1 1 |  0 |  0 |  1  |  1  |  1  |  1
Examples of PL sentences
 P means "It is hot."
 Q means "It is humid."
 R means "It is raining."
 (P ∧ Q) → R
   "If it is hot and humid, then it is raining."
 Q → P
   "If it is humid, then it is hot."
 A better way:
   Hot = "It is hot"
   Humid = "It is humid"
   Raining = "It is raining"
Example
 s: Aya goes out for a walk.
 t: The moon is out.
 u: It is snowing.
 (t ∧ ¬u) → s : If the moon is out and it is not snowing, then Aya goes out for a walk.
 (u ∧ ¬t) → ¬s : If it is snowing and the moon is not out, then Aya will not go out for a walk.
Logical Equivalence

p q | ¬p | ¬p ∨ q | p → q
0 0 |  1 |   1    |   1
0 1 |  1 |   1    |   1
1 0 |  0 |   0    |   0
1 1 |  0 |   1    |   1

s1 = ¬p ∨ q and s2 = p → q take the same truth value in every row, so s1 ⇔ s2.
Logical equivalence
 Two sentences are logically equivalent iff they are true in the same models:
 α ≡ β iff α ⊨ β and β ⊨ α
Tables of Logical Equivalences

 Identity laws: p ∧ T ≡ p,  p ∨ F ≡ p
   Like adding 0
 Domination laws: p ∨ T ≡ T,  p ∧ F ≡ F
   Like multiplying by 0
 Idempotent laws: p ∨ p ≡ p,  p ∧ p ≡ p
   Delete redundancies
 Double negation: ¬(¬p) ≡ p
   "I don't like you, not"
 Commutativity: p ∨ q ≡ q ∨ p,  p ∧ q ≡ q ∧ p
   Like "x+y = y+x"
 Associativity: (p ∨ q) ∨ r ≡ p ∨ (q ∨ r)
   Like "(x+y)+z = x+(y+z)"
 Distributivity: p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
   Like "(x+y)z = xz+yz"
 De Morgan's laws: ¬(p ∧ q) ≡ ¬p ∨ ¬q,  ¬(p ∨ q) ≡ ¬p ∧ ¬q
Tables of Logical Equivalences

 Excluded middle: p ∨ ¬p ≡ T
 Negating creates the opposite: p ∧ ¬p ≡ F
 Definition of implication in terms of Not and Or: p → q ≡ ¬p ∨ q
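Each law above can be verified mechanically. Here is a small sketch (mine, not from the slides) that exhaustively checks De Morgan's law and the implication law over all truth assignments.

from itertools import product

# Exhaustively check a claimed equivalence over every truth assignment.
def equivalent(f, g, n_vars=2):
    return all(f(*vals) == g(*vals) for vals in product([False, True], repeat=n_vars))

# De Morgan's law: not(p and q)  ==  (not p) or (not q)
print(equivalent(lambda p, q: not (p and q),
                 lambda p, q: (not p) or (not q)))   # True

# Definition of implication: p -> q  ==  (not p) or q,
# where p -> q is encoded by its truth table as not(p and not q).
print(equivalent(lambda p, q: not (p and not q),
                 lambda p, q: (not p) or q))         # True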
Fundamentals of Logic
A compound statement is called a tautology (T0) if it is true for all truth value assignments for its component statements.
If a compound statement is false for all such assignments, then it is called a contradiction (F0).
 p ∨ ¬(p ∧ q) : tautology
 p ∧ ¬(p ∨ q) : contradiction
Propositional Logic - two more definitions
A tautology is a proposition that's always TRUE.
A contradiction is a proposition that's always FALSE.

p | ¬p | p ∨ ¬p | p ∧ ¬p
T |  F |   T    |   F
F |  T |   T    |   F
Tautology example
Demonstrate that
  [¬p ∧ (p ∨ q)] → q
is a tautology in two ways:
1. Using a truth table – show that [¬p ∧ (p ∨ q)] → q is always true
2. Using a proof (we will get to this later).
Tautology by truth table

p q | ¬p | p ∨ q | ¬p ∧ (p ∨ q) | [¬p ∧ (p ∨ q)] → q
T T |  F |   T   |      F       |         T
T F |  F |   T   |      F       |         T
F T |  T |   T   |      T       |         T
F F |  T |   F   |      F       |         T

The final column is T in every row, so [¬p ∧ (p ∨ q)] → q is a tautology.
Derivational Proof Techniques
E.g. consider the compound proposition
  (p ∨ ¬p) ∨ ((s → r) ∧ t) ∨ (q ∧ r)
Q: Why is this a tautology?

Derivational Proof Techniques
A: Part of it is a tautology (p ∨ ¬p), and the disjunction of True with any other compound proposition is still True:
  (p ∨ ¬p) ∨ ((s → r) ∧ t) ∨ (q ∧ r)
  ⇔ T ∨ ((s → r) ∧ t) ∨ (q ∧ r)
  ⇔ T
Derivational techniques formalize the intuition of this example.
Tautology by proof

  [¬p ∧ (p ∨ q)] → q
⇔ [(¬p ∧ p) ∨ (¬p ∧ q)] → q        Distributive
⇔ [F ∨ (¬p ∧ q)] → q               ULE
⇔ [¬p ∧ q] → q                     Identity
⇔ ¬[¬p ∧ q] ∨ q                    ULE
⇔ [¬(¬p) ∨ ¬q] ∨ q                 De Morgan
⇔ [p ∨ ¬q] ∨ q                     Double Negation
⇔ p ∨ [¬q ∨ q]                     Associative
⇔ p ∨ [q ∨ ¬q]                     Commutative
⇔ p ∨ T                            ULE
⇔ T                                Domination
Examples
1. "I don't both study well and fail" is logically equivalent to "If I study well, then I don't fail".
2. Write a C program that represents the compound proposition (p ∧ q) ∨ r.
Use a truth table to find:
-PQPR
P  -Q  R  P  R  - Q
A  B  -C  D  E  F
Limitations of propositional logic

So far we studied propositional logic

Some English statements are hard to model in
propositional logic:

“If your roommate is wet because of rain, your
roommate must not be carrying any umbrella”

Pathetic attempt at modeling this:

RoommateWetBecauseOfRain =>
(NOT(RoommateCarryingUmbrella0) AND
NOT(RoommateCarryingUmbrella1) AND
NOT(RoommateCarryingUmbrella2) AND …)
Problems with propositional logic

No notion of objects

No notion of relations among objects

RoommateCarryingUmbrella0 is instructive to us,
suggesting


there is an object we call Roommate,

there is an object we call Umbrella0,

there is a relationship Carrying between these two objects
Formally, none of this meaning is there

Might as well have replaced RoommateCarryingUmbrella0 by
P
First-Order Logic
Syntax
Constants
 Constants refer to objects, functions and relationships.
   Ahmed, Mona, loves, happy
 Simple sentences express relationships among objects.
   loves(Ahmed, Mona)
   They are called atoms.
 Compound sentences capture relationships among relations.
   loves(x,y) → loves(y,x)
   loves(x,y) ∧ loves(y,x) → happy(x)
 Relations can be unary as well.
   tall(Tomy)
Elements of first-order logic

Objects: can give these names such as Umbrella0,
Person0, John, Earth, …

Relations: Carrying(., .), IsAnUmbrella(.)


Carrying(Person0, Umbrella0),
IsUmbrella(Umbrella0)

Relations with one object = unary relations =
properties
Functions: Roommate(.)


Roommate(Person0)
Equality: Roommate(Person0) = Person1
Example with Functions
E.g. Mona loves her dog:
  loves(Mona, dog_of(Mona))
E.g. How about saying that Ahmed has a big nose?
  Ahmed is an object, and nose_of(Ahmed) is a function that constructs an object from the argument object.
  Then, we can write: big(nose_of(Ahmed))
Note: We are allowed to relate sentences only.
So, we can say:
  loves(Mona, dog_of(Mona)) ∧ loves(Mona, cat_of(Mona))
But not:
  loves(Mona, dog_of(Mona) ∧ cat_of(Mona))
First-Order Logic: ∀, ∃
 The language that we have described so far, consisting of atoms and the connectives (¬, ∧, ∨, →, ↔), is typically called predicate logic.
 To extend it to first-order logic, we need to add quantifiers.
 The purpose of quantifiers is to allow us to say things about sets of objects.
 To say that Heba loves everything we write:
   ∀x. loves(Heba, x)
 We can think of ∀ as a big conjunction. For example, if there are only three objects Heba, dog, and cat, what the above asserts is:
   loves(Heba, dog) ∧ loves(Heba, cat) ∧ loves(Heba, Heba)
 To say that Hassan loves something we write:
   ∃x. loves(Hassan, x)
 We can think of ∃ as a big disjunction. For example, if there are only three objects as above, then what we are asserting is:
   loves(Hassan, dog) ∨ loves(Hassan, cat) ∨ loves(Hassan, Hassan)
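As an illustration (not from the slides), the "big conjunction / big disjunction" reading can be made literal over a finite domain; the objects and the loves relation below are invented for the example.

# Assumed tiny domain and relation, mirroring the slide's three-object reading.
objects = ["Heba", "Hassan", "dog", "cat"]
loves = {("Heba", "dog"), ("Heba", "cat"), ("Heba", "Heba"),
         ("Hassan", "cat")}

# "forall x. loves(Heba, x)"  ==  big conjunction over the domain
print(all(("Heba", x) in loves for x in objects))    # False (Heba does not love Hassan)

# "exists x. loves(Hassan, x)"  ==  big disjunction over the domain
print(any(("Hassan", x) in loves for x in objects))  # True (Hassan loves the cat)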
 First Order Predicate Logic –
enriched by variables, predicates, functions
 quantifiers ∀, ∃
   friends(father(david), father(andrew))
   ∃Y friends(Y, petr)
   ∀X likes(X, ice_cream)
   ∀X ∀Y ∀Z parent(X,Y) ∧ parent(X,Z) → siblings(Y,Z)

Reasoning about many objects at once
 Variables: x, y, z, ... can refer to multiple objects
 New operators "for all" and "there exists" (the universal quantifier and the existential quantifier)
 for all x: CompletelyWhite(x) => NOT(PartiallyBlack(x))
   Completely white objects are never partially black
 there exists x: PartiallyWhite(x) AND PartiallyBlack(x)
   There exists some object in the world that is partially white and partially black
Practice converting English to first-order logic






“John has an umbrella”
there exists y: (Has(John, y) AND IsUmbrella(y))
“Anything that has an umbrella is not wet”
for all x: ((there exists y: (Has(x, y) AND
IsUmbrella(y))) => NOT(IsWet(x)))
“Any person who has an umbrella is not wet”
for all x: (IsPerson(x) => ((there exists y: (Has(x, y)
AND IsUmbrella(y))) => NOT(IsWet(x))))
More practice converting English
to first-order logic




“John has at least two umbrellas”
there exists x: (there exists y: (Has(John, x) AND IsUmbrella(x) AND Has(John, y) AND IsUmbrella(y) AND NOT(x=y)))
“John has at most two umbrellas”
for all x, y, z: ((Has(John, x) AND IsUmbrella(x) AND
Has(John, y) AND IsUmbrella(y) AND Has(John, z)
AND IsUmbrella(z)) => (x=y OR x=z OR y=z))
Even more practice converting
English to first-order logic…




“Duke’s basketball team defeats any other
basketball team”
for all x: ((IsBasketballTeam(x) AND
NOT(x=BasketballTeamOf(Duke))) =>
Defeats(BasketballTeamOf(Duke), x))
“Every team defeats some other team”
for all x: (IsTeam(x) => (there exists y: (IsTeam(y)
AND NOT(x=y) AND Defeats(x,y))))
Reverse translation
• Translate the following into English.
• ∀x hesitates(x) → lost(x)
  • He who hesitates is lost.
• ¬∃x business(x) ∧ like(x, Showbusiness)
  • There is no business like show business.
• ¬∀x glitters(x) → gold(x)
  • Not everything that glitters is gold.
• ∃x ∀t person(x) ∧ time(t) → canfool(x,t)
  • You can fool some of the people all the time.
Translating English to FOL
Every gardener likes the sun.
  ∀x gardener(x) → likes(x,Sun)
You can fool some of the people all of the time.
  ∃x ∀t person(x) ∧ time(t) → can-fool(x,t)
You can fool all of the people some of the time.
  ∀x ∃t (person(x) → time(t) ∧ can-fool(x,t))
  Equivalent
  ∀x (person(x) → ∃t (time(t) ∧ can-fool(x,t)))
All purple mushrooms are poisonous.
  ∀x (mushroom(x) ∧ purple(x)) → poisonous(x)
No purple mushroom is poisonous.
  ¬∃x purple(x) ∧ mushroom(x) ∧ poisonous(x)
  Equivalent
  ∀x (mushroom(x) ∧ purple(x)) → ¬poisonous(x)
There are exactly two purple mushrooms.
  ∃x ∃y mushroom(x) ∧ purple(x) ∧ mushroom(y) ∧ purple(y) ∧ ¬(x=y) ∧ ∀z (mushroom(z) ∧ purple(z)) → ((x=z) ∨ (y=z))
Clinton is not tall.
  ¬tall(Clinton)
X is above Y iff X is directly on top of Y or there is a pile of one or more other objects directly on top of one another starting with X and ending with Y.
  ∀x ∀y above(x,y) ↔ (on(x,y) ∨ ∃z (on(x,z) ∧ above(z,y)))
Resolution for first-order logic

for all x: (NOT(Knows(John, x)) OR IsMean(x) OR
Loves(John, x))


John loves everything he knows, with the possible exception of mean
things
for all y: (Loves(Jane, y) OR Knows(y, Jane))

Jane loves everything that does not know her

What can we unify? What can we conclude?

Use the substitution: {x/Jane, y/John}

Get: IsMean(Jane) OR Loves(John, Jane) OR Loves(Jane,
John)

Complete (i.e., if not satisfiable, will find a proof of this), if we
can remove literals that are duplicates after unification

Also need to put everything in canonical form first
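The unification step used here can be sketched compactly. The following is an illustrative, simplified unifier (my own; real implementations also perform an occurs check and standardize variables apart). Variables are strings starting with '?', and compound terms are tuples.

# Simplified first-order unification: variables are strings starting with '?',
# constants are plain strings, predicates/functions are tuples (name, arg1, ...).

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def substitute(t, theta):
    if is_var(t):
        return substitute(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):
        return tuple(substitute(a, theta) for a in t)
    return t

def unify(a, b, theta=None):
    """Return a substitution making a and b equal, or None if impossible."""
    theta = dict(theta or {})
    a, b = substitute(a, theta), substitute(b, theta)
    if a == b:
        return theta
    if is_var(a):
        theta[a] = b
        return theta
    if is_var(b):
        theta[b] = a
        return theta
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for ai, bi in zip(a, b):
            theta = unify(ai, bi, theta)
            if theta is None:
                return None
        return theta
    return None

# Unify Knows(John, ?x) with Knows(?y, Jane)  ->  {'?y': 'John', '?x': 'Jane'}
print(unify(("Knows", "John", "?x"), ("Knows", "?y", "Jane")))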
Properties of quantifiers
 ∀x ∀y is the same as ∀y ∀x
 ∃x ∃y is the same as ∃y ∃x
 ∃x ∀y is not the same as ∀y ∃x
   ∃x ∀y Loves(x,y)
     "There is a person who loves everyone in the world"
   ∀y ∃x Loves(x,y)
     "Everyone in the world is loved by at least one person"
 Quantifier duality: each can be expressed using the other
   ∀x Likes(x,IceCream)  ≡  ¬∃x ¬Likes(x,IceCream)
   ∃x Likes(x,Broccoli)  ≡  ¬∀x ¬Likes(x,Broccoli)
Using FOL
 Brothers are siblings
   ∀x,y Brother(x,y) → Sibling(x,y)
 One's mother is one's female parent
   ∀m,c Mother(c) = m ↔ (Female(m) ∧ Parent(m,c))
 "Sibling" is symmetric
   ∀x,y Sibling(x,y) ↔ Sibling(y,x)
 A first cousin is a child of a parent's sibling
   ∀x,y FirstCousin(x,y) ↔ ∃p,ps Parent(p,x) ∧ Sibling(ps,p) ∧ Parent(ps,y)
An example
1. Sameh is a lawyer.
2. Lawyers are rich.
3. Rich people have big houses.
4. Big houses are a lot of work.
We would like to conclude that Sameh's house is a lot of work.
Natural languages are ambiguous, so we can have different axiomatizations.

Axiomatization 1
1. lawyer(Sameh)
2. ∀x lawyer(x) → rich(x)
3. ∀x rich(x) → ∃y house(x,y)
4. ∀x,y rich(x) ∧ house(x,y) → big(y)
5. ∀x,y (house(x,y) ∧ big(y) → work(y))
3 and 4 say that rich people do have at least one house and all their houses are big.
Conclusion we want to show:
  house(Sameh, S_house) ∧ work(S_house)
Or, do we want to conclude that Sameh has at least one house that needs a lot of work? I.e.
  ∃y house(Sameh,y) ∧ work(y)
Amir and the cat





Everyone who loves all animals is loved by
someone.
Anyone who kills an animal is loved by no
one.
Mohamed loves all animals.
Either Mohamed or Amir killed the cat,
who is named SoSo.
Did Amir kill the cat?
first-order logic

for all x: (NOT(Knows(John, x)) OR IsMean(x) OR
Loves(John, x))


John loves everything he knows, with the possible exception of mean
things
for all y: (Loves(Jane, y) OR Knows(y, Jane))

Jane loves everything that does not know her
Converting sentences to CNF
1. Eliminate all ↔ connectives
   (P ↔ Q) ⇒ ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives
   (P → Q) ⇒ (¬P ∨ Q)
3. Reduce the scope of each negation symbol to a single predicate
   ¬¬P ⇒ P
   ¬(P ∧ Q) ⇒ (¬P ∨ ¬Q)
   ¬(P ∨ Q) ⇒ (¬P ∧ ¬Q)
   ¬(∀x)P ⇒ (∃x)¬P
   ¬(∃x)P ⇒ (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable name
Converting sentences
5. Eliminate existential quantification by introducing Skolem constants/functions
   (∃x)P(x) ⇒ P(c)
     c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence)
   (∀x)(∃y)P(x,y) ⇒ (∀x)P(x, f(x))
     Since ∃ is within the scope of a universally quantified variable, use a Skolem function f to construct a new value that depends on the universally quantified variable.
     f must be a brand-new function name not occurring in any other sentence in the KB.
     E.g., (∀x)(∃y)loves(x,y) ⇒ (∀x)loves(x,f(x))
     In this case, f(x) specifies the person that x loves.
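Steps 1-3 of the conversion can be written as a small recursive rewrite. The sketch below (not from the slides) works on propositional formulas encoded as nested tuples and leaves out quantifier handling and Skolemization.

# Formulas as nested tuples: ("not", f), ("and", f, g), ("or", f, g),
# ("->", f, g), ("<->", f, g); atoms are strings.

def eliminate_arrows(f):
    """Step 1 and 2: rewrite <-> and -> in terms of not/and/or."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_arrows(a) for a in args]
    if op == "<->":
        a, b = args
        return ("and", ("or", ("not", a), b), ("or", ("not", b), a))
    if op == "->":
        a, b = args
        return ("or", ("not", a), b)
    return (op, *args)

def push_negation(f):
    """Step 3: move each negation inward until it sits on an atom."""
    if isinstance(f, str):
        return f
    op, *args = f
    if op == "not":
        g = args[0]
        if isinstance(g, str):
            return f                                       # already on an atom
        gop, *gargs = g
        if gop == "not":
            return push_negation(gargs[0])                 # ~~P  =>  P
        if gop == "and":                                   # ~(P ^ Q) => ~P v ~Q
            return ("or", *[push_negation(("not", a)) for a in gargs])
        if gop == "or":                                    # ~(P v Q) => ~P ^ ~Q
            return ("and", *[push_negation(("not", a)) for a in gargs])
    return (op, *[push_negation(a) for a in args])

f = ("->", "P", ("<->", "Q", "R"))
print(push_negation(eliminate_arrows(f)))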
Types of Inference

Model Checking

Forward chaining with modus ponens

Backward chaining with modus ponens
Model Checking

Enumerate all possible worlds

Restrict to possible worlds in which the KB
is true

Check whether the goal is true in those
worlds or not
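A tiny illustrative sketch of this procedure for a propositional KB (my own; the Sunday/TeachingAI sentences mirror the Modus Ponens example that appears a little later in these notes).

from itertools import product

def entails(symbols, kb, goal):
    """KB |= goal iff goal is true in every possible world where KB is true.
    kb and goal are functions taking a dict {symbol: bool}."""
    worlds = [dict(zip(symbols, values))
              for values in product([False, True], repeat=len(symbols))]
    kb_worlds = [w for w in worlds if kb(w)]        # restrict to worlds where KB holds
    return all(goal(w) for w in kb_worlds)          # check the goal in those worlds

# Example KB: Sunday, and (Sunday -> TeachingAI).  Goal: TeachingAI.
symbols = ["Sunday", "TeachingAI"]
kb = lambda w: w["Sunday"] and ((not w["Sunday"]) or w["TeachingAI"])
goal = lambda w: w["TeachingAI"]
print(entails(symbols, kb, goal))   # True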
Inference as Search




State: current set of sentences
Operator: sound inference rules to derive new
entailed sentences from a set of sentences
Can be goal directed if there is a particular goal
sentence we have in mind
Can also try to enumerate every entailed sentence
Generalized Modus Ponens
Modus Ponens - a special case of Resolution
  p → q
  p
  ∴ q
Using the tricks:
  p → q  ≡  ¬p ∨ q
  p
  Resolving p with ¬p ∨ q leaves q, i.e. q follows.
Sunday → Dr Yasser is teaching AI
Sunday
∴ Dr Yasser is teaching AI
Sound rules of inference
 Each can be shown to be sound using a truth table

RULE              | PREMISE            | CONCLUSION
Modus Ponens      | A, A → B           | B
And Introduction  | A, B               | A ∧ B
And Elimination   | A ∧ B              | A
Double Negation   | ¬¬A                | A
Unit Resolution   | A ∨ B, ¬B          | A
Resolution        | A ∨ B, ¬B ∨ C      | A ∨ C
An example
(∀x)(P(x) → ((∀y)(P(y) → P(f(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
6. Drop universal quantification symbols
(¬P(x) ∨ ((¬P(y) ∨ P(f(x,y))) ∧ (Q(x,g(x)) ∧ ¬P(g(x)))))
Two broad kinds of rule system



forward chaining systems, and backward
chaining systems.
In a forward chaining system you start with the
initial facts, and keep using the rules to draw new
conclusions (or take certain actions) given those
facts
In a backward chaining system you start with
some hypothesis (or goal) you are trying to prove,
and keep looking for rules that would allow you
to conclude that hypothesis, perhaps setting new
subgoals to prove as you go.
Forward chaining


Proofs start with the given
axioms/premises in KB, deriving new
sentences until the goal/query sentence is
derived
This defines a forward-chaining
inference procedure because it moves
“forward” from the KB to the goal
[eventually]
Forward chaining

Idea: fire any rule whose premises are satisfied in the KB,

add its conclusion to the KB, until query is found
Forward chaining example
Backward chaining




Proofs start with the goal query, find
rules with that conclusion, and then prove
each of the antecedents in the implication
Keep going until you reach premises
Avoid loops: check if new sub-goal is
already on the goal stack
Avoid repeated work: check if new subgoal


Has already been proved true
Has already failed
Backward Chaining
1. Tortoise(x) ∧ Slug(y) → Faster(x,y)
2. Slimy(z) ∧ Creeps(z) → Slug(z)
3. Tortoise(Tom)
4. Slimy(Steve)
5. Creeps(Steve)
Is Tom faster than someone?
Forward chaining example
KB:
  allergies(X) → sneeze(X)
  cat(Y) ∧ allergic-to-cats(X) → allergies(X)
  cat(Felix)
  allergic-to-cats(Lise)
Goal:
  sneeze(Lise)
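A minimal forward chainer for this example, using already-ground rules (my own sketch; a full version would unify on the variables X and Y rather than hard-coding Felix and Lise).

# Ground (already instantiated) version of the example KB.
facts = {"cat(Felix)", "allergic-to-cats(Lise)"}
rules = [
    ({"allergies(Lise)"}, "sneeze(Lise)"),
    ({"cat(Felix)", "allergic-to-cats(Lise)"}, "allergies(Lise)"),
]

def forward_chain(facts, rules, goal):
    """Fire any rule whose premises are all known, add its conclusion,
    repeat until the goal appears or nothing new can be added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
        if goal in facts:
            return True
    return goal in facts

print(forward_chain(facts, rules, "sneeze(Lise)"))   # True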
Exercise




You go to the doctor and for insurance
reasons they perform a test for a horrible
disease
You test positive
The doctor says the test is 99% accurate
Do you worry?
Reduction to propositional inference
Suppose the KB contains just the following:
  ∀x King(x) ∧ Greedy(x) → Evil(x)
  King(Ali)
  Greedy(Ali)
  Brother(Saad, Ali)
Instantiating the universal sentence in all possible ways, we have:
  King(Ali) ∧ Greedy(Ali) → Evil(Ali)
  King(Saad) ∧ Greedy(Saad) → Evil(Saad)
  King(Ali)
  Greedy(Ali)
  Brother(Saad, Ali)
The new KB is propositionalized: the proposition symbols are King(Ali), Greedy(Ali), Evil(Ali), King(Saad), etc.
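The instantiation step can be mimicked mechanically; here is an illustrative sketch that substitutes every known ground term for the universally quantified variable (rule_template and the term list are assumptions made for the example).

# Propositionalize  forall x: King(x) ^ Greedy(x) -> Evil(x)
# by plugging in every known ground term for x.
ground_terms = ["Ali", "Saad"]
rule_template = "King({t}) ^ Greedy({t}) -> Evil({t})"

propositional_kb = [rule_template.format(t=t) for t in ground_terms]
propositional_kb += ["King(Ali)", "Greedy(Ali)", "Brother(Saad, Ali)"]

for sentence in propositional_kb:
    print(sentence)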
An example
1. Sameh is a lawyer.
2. Lawyers are rich.
3. Rich people have big houses.
4. Big houses are a lot of work.
We would like to conclude that Sameh's house is a lot of work.

Axiomatization 1
1. lawyer(Sameh)
2. ∀x lawyer(x) → rich(x)
3. ∀x rich(x) → ∃y house(x,y)
4. ∀x,y rich(x) ∧ house(x,y) → big(y)
5. ∀x,y (house(x,y) ∧ big(y) → work(y))
3 and 4 say that rich people do have at least one house and all their houses are big.
Conclusion we want to show:
  house(Sameh, S_house) ∧ work(S_house)
Or, do we want to conclude that Sameh has at least one house that needs a lot of work? I.e.
  ∃y house(Sameh,y) ∧ work(y)
Hassan and the cat






Every animal owner is an animal lover
Everyone who loves all animals is loved by
someone.
Anyone who kills an animal is loved by no
one.
Mustafa owns a dog.
Either Mustafa or Hassan killed the cat,
who is named SoSo.
Did Hassan kill the cat?
Practice example: Did Hassan kill the cat?
 Mustafa owns a dog. Every dog owner is an animal lover. No animal lover kills an animal. Either Hassan or Mustafa killed the cat, who is named SoSo. Did Hassan kill the cat?
 These can be represented as follows:
A. (∃x) Dog(x) ∧ Owns(Mustafa, x)
B. (∀x) ((∃y) Dog(y) ∧ Owns(x, y)) → AnimalLover(x)
C. (∀x) AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x,y))
D. Kills(Mustafa, SoSo) ∨ Kills(Hassan, SoSo)
E. Cat(SoSo)
F. (∀x) Cat(x) → Animal(x)
G. Kills(Hassan, SoSo)    GOAL

Convert to clause form
A1. (Dog(D))
A2. (Owns(Mustafa, D))
B. (¬Dog(y), ¬Owns(x, y), AnimalLover(x))
C. (¬AnimalLover(a), ¬Animal(b), ¬Kills(a,b))
D. (Kills(Mustafa, SoSo), Kills(Hassan, SoSo))
E. Cat(SoSo)
F. (¬Cat(z), Animal(z))

Add the negation of the query:
¬G: (¬Kills(Hassan, SoSo))
The resolution refutation proof
R1: ¬G, D, {}                    (Kills(Mustafa, SoSo))
R2: R1, C, {a/Mustafa, b/SoSo}   (¬AnimalLover(Mustafa), ¬Animal(SoSo))
R3: R2, B, {x/Mustafa}           (¬Dog(y), ¬Owns(Mustafa, y), ¬Animal(SoSo))
R4: R3, A1, {y/D}                (¬Owns(Mustafa, D), ¬Animal(SoSo))
R5: R4, A2, {}                   (¬Animal(SoSo))
R6: R5, F, {z/SoSo}              (¬Cat(SoSo))
R7: R6, E, {}                    FALSE

The proof tree
(Diagram: the same resolution steps shown as a tree, resolving ¬G with D, then with C {a/Mustafa, b/SoSo}, B {x/Mustafa}, A1 {y/D}, A2, F {z/SoSo}, and finally E, yielding FALSE at R7.)
Umbrellas in first-order logic

You know the following things:








You have exactly one other person living in your house, who is wet
If a person is wet, it is because of the rain, the sprinklers, or both
If a person is wet because of the sprinklers, the sprinklers must be on
If a person is wet because of rain, that person must not be carrying
any umbrella
There is an umbrella that “lives in” your house, which is not in its
house
An umbrella that is not in its house must be carried by some person
who lives in that house
You are not carrying any umbrella
Can you conclude that the sprinklers are on?
Example knowledge base

The law says that it is a crime for an
American to sell weapons to hostile nations.
The country Nono, an enemy of America, has
some missiles, and all of its missiles were sold
to it by Colonel West, who is American.

Prove that Col. West is a criminal
Example knowledge base
... it is a crime for an American to sell weapons to hostile nations:
  American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) → Criminal(x)
Nono ... has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
  Owns(Nono,M1) and Missile(M1)
... all of its missiles were sold to it by Colonel West:
  Missile(x) ∧ Owns(Nono,x) → Sells(West,x,Nono)
Missiles are weapons:
  Missile(x) → Weapon(x)
An enemy of America counts as "hostile":
  Enemy(x,America) → Hostile(x)
West, who is American ...
  American(West)
The country Nono, an enemy of America ...
  Enemy(Nono,America)
Resolution proof: definite clauses

Rule-Based Systems
Also known as “production systems” or
“expert systems”
 Rule-based systems are one of the most
successful AI paradigms
 Used for synthesis (construction) type systems
 Also used for analysis (diagnostic or
classification) type systems

Rule-Based Systems


Instead of representing knowledge in a relatively
declarative, static way (as a bunch of things that
are true), rule-based system represent knowledge
in terms of a bunch of rules that tell you what you
should do or what you could conclude in different
situations.
A rule-based system consists of a bunch of IF-THEN rules, a bunch of facts, and some interpreter controlling the application of the rules, given the facts.
1. IF (lecturing X) AND (marking-practicals X) THEN ADD (overworked X)
2. IF (month february) THEN ADD (lecturing ali)
3. IF (month february) THEN ADD (marking-practicals ali)
4. IF (overworked X) OR (slept-badly X) THEN ADD (bad-mood X)
5. IF (bad-mood X) THEN DELETE (happy X)
6. IF (lecturing X) THEN DELETE (researching X)
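A small illustrative interpreter for rules of this shape (my own sketch: each rule is hand-translated into a function over the working memory, and the interpreter keeps firing rules until nothing changes).

# Working memory holds facts as (predicate, argument) pairs.
wm = {("month", "february"), ("happy", "ali"), ("researching", "ali")}

def holds(pred, wm):
    """All arguments X for which (pred X) is in the working memory."""
    return {arg for p, arg in wm if p == pred}

# Each rule maps the working memory to (additions, deletions).
def rule1(wm):  # IF lecturing X AND marking-practicals X THEN ADD overworked X
    xs = holds("lecturing", wm) & holds("marking-practicals", wm)
    return {("overworked", x) for x in xs}, set()

def rule2(wm):  # IF month february THEN ADD lecturing ali
    return ({("lecturing", "ali")} if ("month", "february") in wm else set()), set()

def rule3(wm):  # IF month february THEN ADD marking-practicals ali
    return ({("marking-practicals", "ali")} if ("month", "february") in wm else set()), set()

def rule4(wm):  # IF overworked X OR slept-badly X THEN ADD bad-mood X
    xs = holds("overworked", wm) | holds("slept-badly", wm)
    return {("bad-mood", x) for x in xs}, set()

def rule5(wm):  # IF bad-mood X THEN DELETE happy X
    return set(), {("happy", x) for x in holds("bad-mood", wm)}

def rule6(wm):  # IF lecturing X THEN DELETE researching X
    return set(), {("researching", x) for x in holds("lecturing", wm)}

rules = [rule1, rule2, rule3, rule4, rule5, rule6]

# The interpreter: keep firing rules until the working memory stops changing.
changed = True
while changed:
    before = set(wm)
    for rule in rules:
        add, delete = rule(wm)
        wm |= add
        wm -= delete
    changed = wm != before

print(sorted(wm))   # ali ends up overworked and in a bad mood; happy/researching are gone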
Rule Based Reasoning

The advantages of rule-based approach:




The ability to use
Good performance
Good explanation
The disadvantages are


Cannot handle missing information
Knowledge tends to be very task dependent
Other Reasoning

There exist some other approaches as:



Case-Based Reasoning
Model-Based Reasoning
Hybrid Reasoning
Rule-based + case-based
 Rule-based + model-based
 Model-based + case-based
