Knowledge
For HW 4, “Approximately how many games of tic-tac-toe are
there?”
Many of you answered by copying down a detailed and exact
formula from the web (mostly without acknowledgement of your
source).
I did ask “Approximately”. (Brown M&Ms)
I was hoping for something like:
• The maximum branching factor is 9
• The game lasts at most 9 moves
• So a very loose upper bound is 9^9 = 387,420,489
• This is a weak upper bound, as the branching factor decreases as you play the game.
• Here is a better bound. Every state has 9 cells, each filled with one of three possibilities B, X or O, so there are 3^9 = 19,683 states
• Of course, this assumes…
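For what it is worth, the exact number of distinct games can be checked by brute force. Below is a minimal sketch (my own illustration, not the homework solution): it compares the loose bounds above with a full enumeration in which play stops as soon as someone wins.

```python
# Sketch: check the bounds above against a brute-force enumeration of
# tic-tac-toe, where a game ends as soon as someone wins (or the board
# is full).
WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
        (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
        (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WINS:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def count_games(board=None, player='X'):
    if board is None:
        board = [' '] * 9
    if winner(board) or ' ' not in board:
        return 1                            # leaf: one complete game
    total = 0
    for i in range(9):
        if board[i] == ' ':
            board[i] = player
            total += count_games(board, 'O' if player == 'X' else 'X')
            board[i] = ' '
    return total

print(9 ** 9)          # loose upper bound: 387420489
print(3 ** 9)          # number of board states (B, X or O per cell): 19683
print(count_games())   # exact number of distinct games: 255168
```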
Predicates:
isTypeOfGrain(rice_seed)
isLocatedIn(the_atlantic, new_york)
isDirector(alan_parker, bugsy_malone)
isGreat(god)
Functions:
locationOf(the_atlantic)
Suppose you know that you are going to fail or pass CS 170. Further,
since this is the only class you need to graduate, you know that you are
going to either not pass the class or graduate on time.
Then you can derive the fact that you will either fail or graduate
on time.
Fail ∨ Pass,  ¬Pass ∨ GraduateOnTime
Therefore Fail ∨ GraduateOnTime
F = Fail
P = Pass
G = Graduate on time

F ∨ P,  ¬P ∨ G
Therefore F ∨ G
Horn Form
A sentence is in Horn form if and only if it is an implication whose
left side is a conjunction of atomic sentences and whose right side is
a single atom,
i.e.
P ∧ Q ∧ R → X
A ∧ B ∧ P ∧ Q ∧ R → X
Overview
We plan to represent the world (or some subset thereof, i.e.,
our domain) in FOL, and to TELL this information into our
Knowledge Base KB.
We will represent our problem (i.e., the statement we want to
prove) in FOL. We will sometimes call the statement to be
proved a goal.
So proving a statement is equivalent to seeing if the goal is
in the KB, or, if it is not in it, whether it can be deduced (entailed) from it.
The simplest example
Suppose our KB consists of
P
Q
R
A simple goal is … Q. Literally, we are asking: is it the
case that Q is true? In this case, the answer is yes!
Another simple example
Suppose our KB consists of
P
Q
R
Now suppose the goal is … P ∧ Q.
We are literally asking: is it the case that P is true and Q
is true? In this case, the fact P ∧ Q is not in our KB. But
we can infer it by using the And-Introduction rule.
A more interesting example
Suppose our KB consists of facts about containment.
General facts about containment:
∀x,y,z in(x,y) ∧ in(y,z) → enclosedby(x,z)
∀x,y in(x,y) → enclosedby(x,y)
Note: “in” means directly inside, with nothing in between;
“enclosedby” means inside, possibly with other items in between.
Specific facts for our particular problem:
in(A,B)
in(B,C)
in(F,C)

[Figure: box C contains boxes B and F; box B contains box A]

A goal might be
in(A,F) (i.e., is A inside F?)
or
∃x enclosedby(B,x) (i.e., is B inside anything?)
or
∃x,y,z enclosedby(x,y) ∧ enclosedby(y,z)
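As an aside, the two general rules are simple enough to run mechanically over the specific facts. Here is a toy Python sketch (the names are my own) that applies each rule directly:

```python
# The two general containment rules applied to the specific facts.
in_facts = {("A", "B"), ("B", "C"), ("F", "C")}

# Rule: in(x,y) -> enclosedby(x,y)
enclosedby = set(in_facts)

# Rule: in(x,y) and in(y,z) -> enclosedby(x,z)
for (x, y) in in_facts:
    for (y2, z) in in_facts:
        if y == y2:
            enclosedby.add((x, z))

print(("A", "C") in enclosedby)  # True: A is enclosed by C
print(("A", "F") in enclosedby)  # False: A is not inside F
```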
First Order Logic is Powerful!
We can use FOL to represent many important domains of commercial
and scientific interest.
However, it does have some limitations…
∀x isHumanHand(x) → equals(NumberOfFingersOf(x), 5)
• Dealing with rare exceptions
• Dealing with uncertainty (Markov Logic)
• Dealing with time (Temporal Logic, Allen's logic)
Models I
A model is simply a “world” in which a sentence is true.
Consider the sentence P ∧ Q, given that
P means “is tall”
Q means “is rich”
We can now ask if the following interpretations of P ∧ Q are
models…
“Bill Gates” is not a model of P ∧ Q
“Eamonn Keogh” is not a model of P ∧ Q
“Michael Jordan” is a model of P ∧ Q
Models II
If we have the sentence P ∧ Q, and
P means “is astronaut”
Q means “is female”
Which of the following interpretations of P ∧ Q are models?
“Neil Armstrong”, “Eamonn Keogh”, “Sally Ride”, “Dolly
Parton”, “Bill Clinton”, “Valentina Tereshkova”
Models III
Models are important because they allow a simple
definition of entailment.
A Knowledge Base KB entails a sentence α if and only if
all models of KB are also models of α.
For example, the KB
P ∧ Q
…can be said to entail the sentence P ∨ Q, since all models
of P ∧ Q are also models of P ∨ Q.
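This definition of entailment can be checked directly by enumerating models. A minimal model-checking sketch (the function names are my own invention):

```python
# KB = P AND Q entails P OR Q, because every assignment that
# satisfies the KB also satisfies the sentence.
from itertools import product

def entails(kb, alpha):
    # kb, alpha: functions from an assignment dict to bool
    for P, Q in product([False, True], repeat=2):
        m = {"P": P, "Q": Q}
        if kb(m) and not alpha(m):
            return False     # a model of KB that is not a model of alpha
    return True

kb    = lambda m: m["P"] and m["Q"]   # the KB: P AND Q
alpha = lambda m: m["P"] or m["Q"]    # the sentence: P OR Q
print(entails(kb, alpha))  # True
print(entails(alpha, kb))  # False: P OR Q does not entail P AND Q
```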
In propositional logic, proof is easy
Suppose we had P and P → Q in our KB,
and we have a goal of Q.

KB:
P → Q
P

We can always build a truth table with an
entry for each symbol and each item from
our KB:

P      Q      P → Q
FALSE  FALSE  TRUE
FALSE  TRUE   TRUE
TRUE   FALSE  FALSE
TRUE   TRUE   TRUE

Goal: Q?
We need to look at all rows where the statements in our KB are true.
Since in every case where our KB statements are true, Q is also true,
we have proved Q! (I.e., our KB entails Q.)
All models of {P → Q, P} are also models of Q.
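The truth-table proof above can be mechanized in a few lines; this sketch simply filters the rows where both KB sentences hold:

```python
# Prove Q from KB = {P -> Q, P} by enumerating the truth table
# and keeping only the rows where every KB sentence is true.
from itertools import product

def implies(a, b):
    return (not a) or b

kb_rows = [(P, Q)
           for P, Q in product([False, True], repeat=2)
           if implies(P, Q) and P]   # both KB sentences hold

# In every model of the KB, Q is true, so the KB entails Q.
print(all(Q for _, Q in kb_rows))   # True
```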
In FOL, proof is harder
Suppose we had P(A) and ∀x P(x) → Q(x)
in our KB, and we have a goal of Q(A).
We can build a truth table but we cannot
fill it in.

KB:
∀x P(x) → Q(x)
P(A)
Goal: Q(A)?

P(A)   Q(A)   ∀x P(x) → Q(x)
FALSE  FALSE  ?
FALSE  TRUE   ?
TRUE   FALSE  ?
TRUE   TRUE   ?

I.e., the general sentence never follows from any of these instances.
How are we to know if the KB entails the sentence Q(A)?
In the next few slides we will look at some tools that
will allow us to prove facts in First Order Logic.
These tools are three flavors of substitution:
• Universal Elimination
• Existential Elimination
• Existential Introduction
We can think of these as three operators in our search
space.
Substitution (Binding, Instantiation)
A substitution is a variable (or set of variables) paired with a term.
Example: SUBST{x/Harpo} is a substitution indicating that Harpo
should be substituted for x.
Brother(x,Groucho)
becomes
Brother(Harpo,Groucho)
Example: SUBST{x/YoungestSonOf(Minnie)} is a substitution
indicating that YoungestSonOf(Minnie) should be substituted for x.
Brother(x,Groucho)
becomes
Brother(YoungestSonOf(Minnie),Groucho)
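A substitution is easy to implement if terms are represented as nested tuples. A toy sketch (this representation is my own choice, not the book's):

```python
# Toy substitution: terms are nested tuples ("Brother", "x", "Groucho"),
# where lowercase strings are variables.
def subst(theta, term):
    if isinstance(term, tuple):   # compound term: (functor, args...)
        return (term[0],) + tuple(subst(theta, t) for t in term[1:])
    return theta.get(term, term)  # variables get replaced, constants stay

t = ("Brother", "x", "Groucho")
print(subst({"x": "Harpo"}, t))
# -> ('Brother', 'Harpo', 'Groucho')
print(subst({"x": ("YoungestSonOf", "Minnie")}, t))
# -> ('Brother', ('YoungestSonOf', 'Minnie'), 'Groucho')
```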
Universal Elimination
(Universal Instantiation)

∀x α
SUBST({x/g}, α)

For any sentence α, variable x, and term g.

For example, from
∀x loves(x, Beatles)
we can use the substitution {x/Nick} to infer
loves(Nick, Beatles)
or we can use the substitution {x/MotherOf(Joe)} to infer
loves(MotherOf(Joe), Beatles)

Terms
(A term is an expression that refers to an object.) A term is either
• A symbol
• A function application
Symbols can be constants: John, Mary, 17, 56
or variables: F_name, L_name, X, Y
Function applications are usually written as SomethingOf:
MotherOf(John)
AgeOf(Mary)
EyeColorOf(MotherOf(John))
Existential Elimination
(Existential Instantiation)

∃x α
SUBST({x/k}, α)

For any sentence α, variable x, and constant term k,
where k does not already exist in our KB.

For example, from
∃x President(x, USA)
we can use the substitution {x/someoldperson} to infer
President(someoldperson, USA)
Later on, if we happen to learn the president is
Trump, we can assign
someoldperson = Trump
Existential Introduction

α
∃x SUBST({g/x}, α)

For any sentence α, term g that occurs in α, and
variable x that does not occur in α.

For example, from
President(Obama, USA)
we can infer
∃x President(x, USA)
We can now take these new tools:
• Universal Elimination
• Existential Elimination
• Existential Introduction
combined with some tools we have already defined,
to prove facts in First Order Logic.
A sample proof in FOL
Bob is a buffalo
Pat is a pig
Buffaloes outrun pigs
Can Bob outrun Pat?

Buffalo(Bob)
Pig(Pat)
∀x,y Buffalo(x) ∧ Pig(y) → Faster(x,y)
Faster(Bob,Pat)?

Note: I could have written the predicates as isPig, isBuffalo and
isFasterThan, but I wanted to be consistent with the R&N book.
A sample proof in FOL
Can Bob outrun Pat? Faster(Bob,Pat)?

Buffalo(Bob)
Pig(Pat)
∀x,y Buffalo(x) ∧ Pig(y) → Faster(x,y)

1. We can use And-Introduction to add the following to our database:
Buffalo(Bob) ∧ Pig(Pat)
2. We can do Universal Elimination with {x/Bob, y/Pat} to get:
Buffalo(Bob) ∧ Pig(Pat) → Faster(Bob, Pat)
3. We then use Modus Ponens to derive:
Faster(Bob, Pat)
The need for a better proof mechanism
The proof mechanism we have just seen can be imagined as search.
The operators are inference rules (Existential Introduction, Universal Elimination, etc.)
The states are sets of sentences (the KB itself).
The goal test is to see if a state contains the query sentence.
This idea works, but…
It seriously slows down when the KB gets larger, because the
number of possible substitutions grows exponentially.
One possible solution is to reduce the number of operators…
This leads us to Generalized Modus Ponens (GMP).
Generalized Modus Ponens GMP
GMP (shown in the next slide) looks very complicated.
Here is the way to think about it. For Farmer-Wolf-Goat-Corn we could have
had operators such as:
• Untie the boat
• Place the goat into the boat
• Step into the boat
• Grab hold of the oars
• Row the boat
• Tie up the boat
• Remove the goat from the boat
• Place the corn into the boat
• Etc
But this would have made our search tree both deeper and bushier. Instead, we had
just four operators:
•Move goat to other side
•Move corn to other side
•Move wolf to other side
•Move self (only) to other side
This makes the search much more efficient.
Generalized Modus Ponens
Combines And-Introduction, Universal Elimination, and Modus Ponens.
For atomic sentences pi′, pi, and q, where there is a
substitution θ such that SUBST(θ, pi′) = SUBST(θ, pi) for all i:

p1′, p2′, …, pn′,  (p1 ∧ p2 ∧ … ∧ pn → q)
SUBST(θ, q)

Example:
p1′ = Faster(Bob,Pat)
p2′ = Faster(Pat,Steve)
p1 ∧ p2 → q = Faster(x,y) ∧ Faster(y,z) → Faster(x,z)
θ = {x/Bob, y/Pat, z/Steve}
Then using GMP we can conclude SUBST(θ, q) = Faster(Bob,Steve).

Note that GMP only works for a KB of definite clauses (exactly one positive literal): either
a single atom, or (conjunction of atomic sentences) → atomic sentence.
Sentences in this form are known as Horn clauses.
Note also that all sentences are assumed to be universally quantified.
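GMP can be sketched for ground facts with a simple pattern matcher; real systems use full unification, but this hypothetical helper (the tuple representation and the name `match` are my own) is enough for the Faster example:

```python
# Rough GMP sketch for ground premises: pattern-match the rule's
# premises against known facts, accumulate bindings in theta, and
# apply them to the conclusion.
def match(pattern, fact, theta):
    # pattern/fact: tuples like ("Faster", "x", "y"); lowercase = variable
    if len(pattern) != len(fact) or pattern[0] != fact[0]:
        return None
    theta = dict(theta)
    for p, f in zip(pattern[1:], fact[1:]):
        if p.islower():                 # variable: bind, or check old binding
            if theta.get(p, f) != f:
                return None
            theta[p] = f
        elif p != f:                    # constant mismatch
            return None
    return theta

facts = [("Faster", "Bob", "Pat"), ("Faster", "Pat", "Steve")]
premises = [("Faster", "x", "y"), ("Faster", "y", "z")]
conclusion = ("Faster", "x", "z")

theta = {}
for premise, fact in zip(premises, facts):
    theta = match(premise, fact, theta)

print(tuple(theta.get(a, a) for a in conclusion))
# -> ('Faster', 'Bob', 'Steve')
```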
Forward Chaining
Proofs start with the given axioms/premises in the KB, deriving new sentences using GMP until
the goal/query sentence is derived. This defines a forward-chaining inference procedure,
because it moves "forward" from the KB to the goal.
Example: KB = All cats like fish, cats eat everything they like, and
Ziggy is a cat. In FOL, KB =
1. ∀x cat(x) → likes(x, Fish)
2. ∀x,y (cat(x) ∧ likes(x,y)) → eats(x,y)
3. cat(Ziggy)
Goal query: Does Ziggy eat fish?
eats(Ziggy, Fish)?
Proof:
4. Use GMP with (1) and (3) to derive: likes(Ziggy, Fish)
5. Use GMP with (3), (4) and (2) to derive: eats(Ziggy, Fish)
So, yes, Ziggy eats fish.
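The forward-chaining procedure can be sketched as a fixed-point loop over the Ziggy KB (the tuple representation and helper names are my own, not the textbook's):

```python
# Forward-chaining sketch: repeatedly apply the rules to the currently
# known facts until nothing new can be derived.
facts = {("cat", "Ziggy")}
rules = [  # (premises, conclusion); lowercase arguments are variables
    ([("cat", "x")], ("likes", "x", "Fish")),
    ([("cat", "x"), ("likes", "x", "y")], ("eats", "x", "y")),
]

def bindings(premises, theta, known):
    # yield every substitution matching all premises against known facts
    if not premises:
        yield theta
        return
    head, rest = premises[0], premises[1:]
    for f in known:
        if f[0] != head[0] or len(f) != len(head):
            continue
        t = dict(theta)
        ok = True
        for p, c in zip(head[1:], f[1:]):
            if p.islower():             # variable: bind or check binding
                if t.get(p, c) != c:
                    ok = False
                    break
                t[p] = c
            elif p != c:                # constant mismatch
                ok = False
                break
        if ok:
            yield from bindings(rest, t, known)

changed = True
while changed:
    changed = False
    snapshot = set(facts)               # don't mutate while matching
    for premises, conclusion in rules:
        for theta in bindings(premises, {}, snapshot):
            new = (conclusion[0],) + tuple(theta.get(a, a) for a in conclusion[1:])
            if new not in facts:
                facts.add(new)
                changed = True

print(("eats", "Ziggy", "Fish") in facts)  # True: Ziggy eats fish
```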
Backward Chaining
Backward-chaining deduction using GMP is complete for KBs containing only Horn clauses. Proofs start
with the goal query, find implications that would allow you to prove it, and then prove each of the
antecedents in the implication, continuing to work "backwards" until we get to the axioms, which we know
are true.
Example: Does Ziggy eat fish?
To prove eats(Ziggy, Fish), first see if this is known from one of the axioms directly. Here it is
not known, so see if there is a Horn clause that has the consequent (i.e., right-hand side) of the
implication matching the goal. Here,
1. ∀x cat(x) → likes(x, Fish)
2. ∀x,y (cat(x) ∧ likes(x,y)) → eats(x,y)
3. cat(Ziggy)
Proof:
1. Goal matches RHS of Horn clause (2), so try to prove the new sub-goals
cat(Ziggy) and likes(Ziggy, Fish) that correspond to the LHS of (2)
2. cat(Ziggy) matches axiom (3), so we've "solved" that sub-goal
3. likes(Ziggy, Fish) matches the RHS of (1), so try to prove cat(Ziggy)
4. cat(Ziggy) matches (as it did earlier) axiom (3), so we've solved this sub-goal
5. There are no unsolved sub-goals, so we're done. Yes, Ziggy eats fish.
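Backward chaining for the same KB can be sketched as a recursive procedure. For brevity this toy version (my own representation and helper names) handles only ground goals, whereas a real backward chainer unifies goals containing variables:

```python
# Backward-chaining sketch: to prove a goal, either find it among the
# facts, or find a rule whose conclusion matches the goal and
# recursively prove that rule's premises.
facts = {("cat", "Ziggy")}
rules = [  # (premises, conclusion); lowercase arguments are variables
    ([("cat", "x")], ("likes", "x", "Fish")),
    ([("cat", "x"), ("likes", "x", "y")], ("eats", "x", "y")),
]

def unify_ground(pattern, ground):
    # match a pattern with variables against a ground (variable-free) atom
    if len(pattern) != len(ground) or pattern[0] != ground[0]:
        return None
    theta = {}
    for p, g in zip(pattern[1:], ground[1:]):
        if p.islower():                 # variable
            if theta.get(p, g) != g:
                return None
            theta[p] = g
        elif p != g:                    # constant mismatch
            return None
    return theta

def prove(goal):
    if goal in facts:
        return True
    for premises, conclusion in rules:
        theta = unify_ground(conclusion, goal)
        if theta is None:
            continue
        subgoals = [(p[0],) + tuple(theta.get(a, a) for a in p[1:])
                    for p in premises]
        if all(prove(g) for g in subgoals):
            return True
    return False

print(prove(("eats", "Ziggy", "Fish")))  # True: Ziggy eats fish
```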
We have seen that we can use Generalized Modus Ponens
(GMP) combined with search to see if a fact is entailed from a
Knowledge Base.
Unfortunately, there are some facts that we cannot prove using GMP.
Suppose we have in our KB
GoToSchool(Joe) ∨ GoToSchool(Sue)
¬GoToSchool(Joe) ∨ GoToSchool(Sue)
And we want to know if
GoToSchool(Sue)
Although humans can look at such a KB and immediately see that
GoToSchool(Sue) is true, GMP cannot entail that fact!!
So GMP is not complete.
Is there any complete inference
procedure for FOL?
Yes! It is called resolution refutation (or just resolution).
• It is complete.
• It is sound.
• It is optimally efficient.
• It works for a KB in any format, unlike GMP, which only works
for KBs in Horn clause format (although we generally have to reorganize the format).
How resolution refutation works I
Suppose our KB consists of the following sentences:
P ∨ Q
¬P ∨ R
¬Q ∨ R
And the goal is to prove R ∨ S.
We negate the goal, so R ∨ S becomes ¬(R ∨ S), which becomes
¬R ∧ ¬S (by De Morgan's law)
¬R, ¬S (And-Elimination)
Now we add ¬R, ¬S to our knowledge base
and look for a contradiction:
P ∨ Q
¬P ∨ R
¬Q ∨ R
¬R
¬S
How resolution refutation works II
P ∨ Q and ¬P ∨ R resolve to give Q ∨ R
Q ∨ R and ¬Q ∨ R resolve to give R
R and ¬R give R ∧ ¬R. Contradiction!
That's all there is to resolution for propositional logic. For first order logic we
just have to put the KB in the right format; otherwise it works the same.
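For propositional logic, the whole refutation loop fits in a few lines. In this sketch (the representation is my own), a clause is a frozenset of literals and "~" marks a negated atom:

```python
# Propositional resolution refutation for the example above.
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    # yield every resolvent of the two clauses
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

def refute(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:               # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:              # nothing new: no refutation possible
            return False
        clauses |= new

kb = [frozenset({"P", "Q"}), frozenset({"~P", "R"}), frozenset({"~Q", "R"})]
negated_goal = [frozenset({"~R"}), frozenset({"~S"})]
print(refute(kb + negated_goal))  # True: the KB entails R v S
```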
In order for resolution to work for FOL, the KB must be in Conjunctive Normal Form (CNF).
Conjunctive Normal Form: the KB is a conjunction of disjunctions.

¬P(w) ∨ Q(w)
P(x) ∨ R(x)
¬Q(y) ∨ S(y)
¬R(z) ∨ S(z)

The goal is S(A), so negate it and add ¬S(A) to the KB.
Now let's look for a contradiction…

¬P(w) ∨ Q(w) and ¬Q(y) ∨ S(y) resolve with SUBST{y/w} to give ¬P(w) ∨ S(w)
¬P(w) ∨ S(w) and P(x) ∨ R(x) resolve with SUBST{w/x} to give S(x) ∨ R(x)
S(x) ∨ R(x) and ¬R(z) ∨ S(z) resolve with SUBST{x/A, z/A} to give S(A)
S(A) and ¬S(A) give S(A) ∧ ¬S(A). Contradiction!

(We will address the problem of translating an arbitrary KB into
CNF in the next few slides.)
We have seen that we can do resolution when the KB is in CNF.
But most KBs are not in that format.
Theorem: We can convert any KB to CNF
• Eliminate Implications
• Move ¬ inwards
• Standardize Variables
• Move Quantifiers left
• Skolemize {Eliminate Existential Quantifiers, Drop Universal Quantifiers}
• Distribute ∨ over ∧
• Flatten nested conjunctions and disjunctions.
Gloss over!
Note that we must do the steps in order!
Eliminate Implications
Change all implication sentences to disjunctions.
So
A → B becomes ¬A ∨ B
C → D becomes ¬C ∨ D
etc.
Gloss over!
Move ¬ inwards
Negation is only allowed on atoms, not whole sentences.
So
¬(A ∨ B) becomes ¬A ∧ ¬B
¬(A ∧ B) becomes ¬A ∨ ¬B
¬∀x P becomes ∃x ¬P
¬∃x P becomes ∀x ¬P
¬¬P becomes P
Gloss over!
Standardize Variables
Make sure that you are not using the same variable name twice in
a single sentence (unless you really meant to).
So
( ∀x P(x) ) ∨ ( ∃x Q(x) )
becomes
( ∀x P(x) ) ∨ ( ∃z Q(z) )
Gloss over!
Move Quantifiers left
Move all quantifiers left, but keep them in order!
So
∀x P(x) ∨ ∃y Q(y)
becomes
∀x ∃y P(x) ∨ Q(y)
Gloss over!
Skolemize {Eliminate Existential Quantifiers, Drop Universal Quantifiers}
Existential quantifiers can be eliminated by the introduction of a new
constant that does not appear elsewhere in the database.
∃x P(x) ∧ Q(x) becomes (with SUBST{x/A}) P(A) ∧ Q(A)
Gloss over!
One possible complication occurs if we also have universal quantifiers… Consider
∀x person(x) → ∃y heart(y) ∧ has(x,y)
By SUBST{y/H} this becomes
∀x person(x) → heart(H) ∧ has(x,H) !!!
(which wrongly says that everyone has the same heart H)
Instead we have to create a new (Skolem) function F(x) to map from a person to their heart:
∀x person(x) → heart(F(x)) ∧ has(x,F(x))
Then drop the universal quantifiers.
Distribute ∨ over ∧
(A ∧ B) ∨ C becomes (A ∨ C) ∧ (B ∨ C)
Just like distribution in arithmetic:
(5 + 4) * 6 becomes (5 * 6) + (4 * 6)
Gloss over!
Flatten nested conjunctions and disjunctions.
(A ∨ B) ∨ C
becomes
(A ∨ B ∨ C)
(A ∧ B) ∧ C
becomes
(A ∧ B ∧ C)
Gloss over!
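Three of these conversion steps (eliminate implications, move ¬ inwards, distribute ∨ over ∧) can be sketched for the propositional case. The nested-tuple formula representation below is my own, not the textbook's:

```python
# Toy propositional CNF converter. Formulas are nested tuples:
# ("->", a, b), ("and", a, b), ("or", a, b), ("not", a), or a string atom.
def to_cnf(f):
    return distribute(push_not(elim_imp(f)))

def elim_imp(f):
    # A -> B becomes (not A) or B
    if isinstance(f, str):
        return f
    if f[0] == "->":
        return ("or", ("not", elim_imp(f[1])), elim_imp(f[2]))
    return (f[0],) + tuple(elim_imp(a) for a in f[1:])

def push_not(f):
    # De Morgan's laws and double-negation elimination
    if isinstance(f, str):
        return f
    if f[0] == "not":
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == "not":
            return push_not(g[1])
        if g[0] == "and":
            return ("or", push_not(("not", g[1])), push_not(("not", g[2])))
        if g[0] == "or":
            return ("and", push_not(("not", g[1])), push_not(("not", g[2])))
    return (f[0],) + tuple(push_not(a) for a in f[1:])

def distribute(f):
    # (A and B) or C becomes (A or C) and (B or C)
    if isinstance(f, str) or f[0] == "not":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "or":
        if not isinstance(a, str) and a[0] == "and":
            return ("and", distribute(("or", a[1], b)), distribute(("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":
            return ("and", distribute(("or", a, b[1])), distribute(("or", a, b[2])))
    return (f[0], a, b)

print(to_cnf(("->", "A", "B")))
# -> ('or', ('not', 'A'), 'B')
print(to_cnf(("or", ("and", "A", "B"), "C")))
# -> ('and', ('or', 'A', 'C'), ('or', 'B', 'C'))
```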
Done!!
• Eliminate Implications
• Move ¬ inwards
• Standardize Variables
• Move Quantifiers left
• Skolemize {Eliminate Existential Quantifiers, Drop Universal Quantifiers}
• Distribute ∨ over ∧
• Flatten nested conjunctions and disjunctions.
Once we have done all the above steps, our KB is in CNF. We can now do
resolution refutation.
There is still an element of search: we have to decide which pairs of
sentences to resolve, and which substitutions we should use. The goal is
simply to find a contradiction.
There are many heuristics for the search (our textbook lists Unit preference,
Set of support, and Input resolution).
Bad News
• Even with all the heuristics, resolution is exponential in general.
• Resolution is complete, but only semidecidable.
If our KB implies A (or it implies ¬A), resolution will find the
proof in finite time. But if the KB implies neither A nor ¬A,
resolution will simply run forever.
Suppose our KB consists of facts about containment.
General facts about containment:
∀x,y,z in(x,y) ∧ in(y,z) → enclosedby(x,z)
∀x,y in(x,y) → enclosedby(x,y)
Specific facts for our particular problem:
in(A,B)
in(B,C)
in(F,C)
Containment Revisited: Exercise
I will not collect or grade these, but expect a quiz soon and/or examples
on the final.
General facts about containment:
∀x,y,z in(x,y) ∧ in(y,z) → enclosedby(x,z)
∀x,y in(x,y) → enclosedby(x,y)
Let us add some predicates and functions to the containment world:
VolumeOf(x): returns the volume of x
SumOf(x,y): returns the sum of two numbers
isLargerThan(x,y): TRUE iff x > y
You can add more as needed. If you do, try to keep your set minimal.
Given the above, encode (TELL) the following facts:
• If something is inside another thing, the second thing must have a greater volume
• If x is in z, and y is in z, then the sum of the volumes of x and y is not larger than that of z
• (At least in our world) there exists a box that contains all the boxes (except itself)
• (Encode some novel fact of your own for this domain)
Given the above, encode specific facts for our particular problem:
• Encode the setup on the next page. Use only 'in', not 'enclosedby'.
Given the above, how do you ASK the following?
• Is there anything inside of F?
• Is there anything inside of G?
• Is there any box that has at least two things in it?
• (Tricky) Is F the outermost box?
[Figure: nested boxes]
In case the image is fuzzy…
(C((B(K,A)) (F(G))))
i.e., C contains B and F; B contains K and A; F contains G.