Transcript Logic 2
CSM6120
Introduction to Intelligent Systems
Propositional and Predicate Logic 2
Syntax and semantics
Syntax
Rules for constructing legal sentences in the logic
Which symbols we can use (English: letters, punctuation)
How we are allowed to combine symbols
Semantics
How we interpret (read) sentences in the logic
Assigns a meaning to each sentence
Example: “All lecturers are seven foot tall”
A valid sentence (syntax)
And we can understand the meaning (semantics)
This sentence happens to be false (there is a counterexample)
Propositional logic
Syntax
Propositions, e.g. “it is wet”
Connectives: and, or, not, implies, iff (equivalent)
Brackets (), T (true) and F (false)
Semantics (Classical/Boolean)
Define how connectives affect truth
“P and Q” is true if and only if P is true and Q is true
Use truth tables to work out the truth of statements
Important concepts
Soundness, completeness, validity (tautologies)
Logical equivalences
Models/interpretations
Entailment
Inference
Clausal forms (CNF, DNF)
Resolution algorithm
Given a formula in conjunctive normal form, repeat:
Find two clauses with complementary literals,
Apply resolution,
Add the resulting clause (if not already there)
If the empty clause results, the formula is not satisfiable
(The empty clause must have been obtained from P and NOT(P))
Otherwise, if we get stuck (and we will eventually), the
formula is guaranteed to be satisfiable
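A minimal Python sketch of this resolution loop (clauses as sets of string literals, with "~" marking negation; this representation and the function names are my own, not from the lecture):

def negate(lit):
    # "~P" <-> "P"
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return every clause obtainable by resolving c1 with c2
    on a pair of complementary literals."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

def resolution_satisfiable(clauses):
    """Repeatedly resolve pairs of clauses; report False as soon as the
    empty clause appears, True once no new clause can be added."""
    clauses = set(frozenset(c) for c in clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                if c1 == c2:
                    continue
                for r in resolve(c1, c2):
                    if not r:            # empty clause derived
                        return False
                    new.add(r)
        if new.issubset(clauses):        # stuck: nothing new to add
            return True
        clauses |= new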
Example
Our knowledge base:
1) FriendWetBecauseOfSprinklers
2) NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
Can we infer SprinklersOn? We add:
3) NOT(SprinklersOn)
From 2) and 3), get
4) NOT(FriendWetBecauseOfSprinklers)
From 4) and 1), we get the empty clause, so SprinklersOn is entailed
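Running the resolution_satisfiable sketch above on this knowledge base plus the negated query (symbols shortened for readability; the encoding is illustrative):

kb = [
    {"FriendWet"},                   # 1) FriendWetBecauseOfSprinklers
    {"~FriendWet", "SprinklersOn"},  # 2) NOT(FriendWet) OR SprinklersOn
    {"~SprinklersOn"},               # 3) negated query
]
print(resolution_satisfiable(kb))    # False, so SprinklersOn is entailed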
Horn clauses
A Horn clause is a CNF clause with at most one positive literal
Horn clauses form the basis of forward and backward chaining
The positive literal is called the head, negative literals are the body
Prolog: head:- body1, body2, body3 …
English: “To prove the head, prove body1, …”
Implication: If (body1, body2 …) then head
The Prolog language is based on Horn clauses
Modus ponens: complete for Horn KBs
Deciding entailment with Horn clauses is linear in the size of the knowledge
base
Reasoning with Horn clauses
Chaining: simple methods used by most inference engines to
produce a line of reasoning
Forward chaining (FC)
For each new piece of data, generate all new facts, until the desired fact
is generated
Data-directed reasoning
Backward chaining (BC)
To prove the goal, find a clause that contains the goal as its head, and
prove the body recursively
(Backtrack when the wrong clause is chosen)
Goal-directed reasoning
Forward chaining algorithm
Read the initial facts
Begin
Filter_phase: find the fireable rules (those whose conditions match the facts)
While fireable rules not empty AND not end DO
Choice_phase: analyse the conflicts, choose the most appropriate rule
Apply the chosen rule
End do
End
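A minimal forward-chaining sketch in Python, assuming rules are (premises, conclusion) pairs of proposition names (my own representation, not from the lecture); it simply fires every applicable rule rather than performing the conflict-resolution choice phase above.

def forward_chain(rules, facts, goal=None):
    """Fire rules whose premises are all known, adding their conclusions,
    until nothing new can be added (or the goal appears)."""
    facts = set(facts)
    changed = True
    while changed and (goal is None or goal not in facts):
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)   # this rule fires
                changed = True
    return facts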
Forward chaining example (1)
Suppose we have three rules:
R1: If A and B then D (= A ∧ B → D)
R2: If B then C
R3: If C and D then E
If facts A and B are present, we infer D from R1 and infer
C from R2
With D and C inferred, we now infer E from R3
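Continuing with the forward_chain sketch above, the same three rules and the facts A and B (the encoding is my own):

rules = [({"A", "B"}, "D"),   # R1: If A and B then D
         ({"B"}, "C"),        # R2: If B then C
         ({"C", "D"}, "E")]   # R3: If C and D then E
print(forward_chain(rules, {"A", "B"}))   # {'A', 'B', 'C', 'D', 'E'}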
Forward chaining example (2)
Rules
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler
Facts
• alarm-beeps
• hot
First cycle: R2 holds, adding the fact smoky
Second cycle: R1 holds, adding the fact fire
Third cycle: R3 holds, giving the action switch-sprinkler
Forward chaining example (3)
Fire any rule whose premises are satisfied in the KB
Add its conclusion to the KB until the query is found
Forward chaining
AND-OR Graph
Multiple links joined by an arc indicate conjunction – every link must be
proved
Multiple links without an arc indicate disjunction – any link can be
proved
Forward chaining
If we want to apply forward chaining to this graph, we first
process the known leaves A and B, then allow inference to
propagate up the graph as far as possible
Backward chaining
Idea - work backwards from the query q
To prove q by BC,
Check if q is known already, or
Prove by BC all premises of some rule concluding q (i.e. try to prove each of that rule’s conditions)
Avoid loops
Check if new subgoal is already on the goal stack
Avoid repeated work: check if new subgoal
Has already been proved true, or
Has already failed
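A minimal backward-chaining sketch in the same (premises, conclusion) rule representation used earlier (an illustration, not the lecture's exact procedure); the stack argument implements the "avoid loops" check above.

def backward_chain(rules, facts, goal, stack=None):
    """True if goal is a known fact, or if all premises of some rule
    concluding goal can themselves be proved recursively."""
    stack = set() if stack is None else stack
    if goal in facts:
        return True
    if goal in stack:                  # avoid looping on this subgoal
        return False
    for premises, conclusion in rules:
        if conclusion == goal:
            if all(backward_chain(rules, facts, p, stack | {goal})
                   for p in premises):
                return True            # otherwise backtrack to another rule
    return False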
Backward chaining example
The same three rules:
R1: If A and B then D
R2: If B then C
R3: If C and D then E
If E is our goal (hypothesis), then by R3 we must prove C
and D
R2 reduces the subgoal C to proving B, and R1 reduces the
subgoal D to proving A and B, which are given as facts
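Using the backward_chain sketch above on the same three rules, with E as the goal and A, B as the known facts:

rules = [({"A", "B"}, "D"), ({"B"}, "C"), ({"C", "D"}, "E")]
print(backward_chain(rules, {"A", "B"}, "E"))   # True: E can be proved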
Example
Facts
• alarm-beeps
• hot
Rules
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler
Hypothesis: Should I switch the sprinklers on?
Working backwards through the evidence:
Use R3: need to prove fire
Use R1: need to prove hot and smoky (hot is already a fact)
Use R2: need to prove alarm-beeps, which is a fact
All subgoals are supported by the evidence, so the sprinklers should be switched on
Comparison
Forward chaining
From facts to valid conclusions
Good when:
Less clear hypothesis
Very large number of possible conclusions
True facts known at start
Backward chaining
From hypotheses to relevant facts
Good when:
Limited number of (clear) hypotheses
Determining truth of facts is expensive
Large number of possible facts, mostly irrelevant
Forward vs. backward chaining
FC is data-driven
Automatic, unconscious processing
E.g. routine decisions
May do lots of work that is irrelevant to the goal
BC is goal-driven
Appropriate for problem solving
E.g. “Where are my keys?”, “How do I start the car?”
The complexity of BC can be much less than linear in the size of the KB
Application
Wide use in expert systems
Backward chaining: diagnosis systems
Start with a set of hypotheses and try to prove each one, asking additional
questions of the user when a fact is unknown
Forward chaining: design/configuration systems
See what can be done with available components
Checking models
Sometimes we just want to find any model that satisfies
the KB
Propositional satisfiability: Determine if an input
propositional logic sentence (in CNF) is satisfiable
We use a backtracking search to find a model
DPLL, WalkSAT, etc – lots of algorithms out there!
This is similar to finding solutions in constraint
satisfaction problems
More about CSPs in a later module
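As an illustration of finding a model by backtracking search, here is a bare-bones sketch in the spirit of DPLL (without the unit-propagation and pure-literal steps that real DPLL adds); clauses are lists of signed integers, a common SAT convention, and the example at the end is made up.

def sat_backtrack(clauses, assignment=None):
    """Return a satisfying {variable: bool} assignment, or None."""
    assignment = assignment or {}
    remaining = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                                  # clause already satisfied
        unassigned = [l for l in clause if abs(l) not in assignment]
        if not unassigned:
            return None                               # clause falsified
        remaining.append(unassigned)
    if not remaining:
        return assignment                             # model found
    var = abs(remaining[0][0])                        # branch on a variable
    for value in (True, False):
        result = sat_backtrack(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

# e.g. (x1 OR NOT x2) AND (x2 OR x3) is satisfiable:
print(sat_backtrack([[1, -2], [2, 3]]))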
Topic #3 – Predicate Logic
Predicate logic
Predicate logic is an extension of propositional logic that
permits reasoning concisely about whole classes of entities
Also termed Predicate Calculus or First Order Logic
The language Prolog is built on a subset of this
Propositional logic treats simple propositions (sentences) as
atomic entities
In contrast, predicate logic distinguishes the subject of a
sentence from its predicate…
Topic #3 – Predicate Logic
Subjects and predicates
In the sentence “The dog is sleeping”:
The phrase “the dog” denotes the subject - the object or
entity that the sentence is about
The phrase “is sleeping” denotes the predicate - a property that
is true of the subject
In predicate logic, a predicate is modelled as a function
P(·) from objects to propositions
i.e. a function that returns TRUE or FALSE
P(x) = “x is sleeping” (where x is any object)
or, Is_sleeping(dog)
Tree(a) is true if a = oak, false if a = daffodil
Topic #3 – Predicate Logic
More about predicates
Convention: Lowercase variables x, y, z... denote
objects/entities; uppercase variables P, Q, R… denote
propositional functions (predicates)
Keep in mind that the result of applying a predicate P to
an object x is the proposition P(x)
But the predicate P itself (e.g. P=“is sleeping”) is not a
proposition (not a complete sentence)
E.g. if P(x) = “x is a prime number”,
P(3) is the proposition “3 is a prime number”
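In code terms, a predicate is just a Boolean-valued function; a small Python illustration (is_prime is my own helper, not from the slides):

def is_prime(x):
    # P(x) = "x is a prime number"
    return x > 1 and all(x % d for d in range(2, int(x ** 0.5) + 1))

print(is_prime(3))   # True  -- the proposition "3 is a prime number"
print(is_prime(4))   # False -- the proposition "4 is a prime number"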
Topic #3 – Predicate Logic
Propositional functions
Predicate logic generalizes the grammatical notion of a
predicate to also include propositional functions of any
number of arguments, each of which may take any
grammatical role that a noun can take
E.g. let P(x,y,z) = “x gave y the grade z”,
then if
x=“Mike”, y=“Mary”, z=“A”
P(x,y,z) = “Mike gave Mary the grade A”
Reasoning
KB:
(1) student(S)∧studies(S,ai) → studies(S,prolog)
(2) student(T)∧studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
(1) and (2) are rules, (3) and (4) are facts
With the information in (3) and (4), rule (2) can fire (but
rule (1) can't), by matching (unifying) joe with T
This gives a new piece of knowledge, studies(joe, ai)
With this new knowledge, rule (1) can now fire
joe is unified with S
Reasoning
KB:
(1) student(S)∧studies(S,ai) → studies(S,prolog)
(2) student(T)∧studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
We can apply modus ponens to this twice (FC), to get
studies(joe, prolog)
This can then be added to our knowledge base as a new
fact
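A minimal sketch of the matching (unification) step used above, with variables as uppercase strings, constants as lowercase strings, and atoms as tuples; this is a simplified illustration, not a full unification algorithm (no nested terms, no occurs check).

def is_var(t):
    return isinstance(t, str) and t[0].isupper()

def match(pattern, fact, subst):
    """Try to extend substitution subst so that pattern equals fact."""
    if len(pattern) != len(fact):
        return None
    subst = dict(subst)
    for p, f in zip(pattern, fact):
        p = subst.get(p, p)        # apply any existing binding
        if is_var(p):
            subst[p] = f           # bind the variable
        elif p != f:
            return None            # constants clash
    return subst

# Rule (2): student(T) and studies(T, expsys) -> studies(T, ai)
facts = {("student", "joe"), ("studies", "joe", "expsys")}
s = match(("student", "T"), ("student", "joe"), {})
s = match(("studies", "T", "expsys"), ("studies", "joe", "expsys"), s)
print(s)   # {'T': 'joe'} -- so studies(joe, ai) can be added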
Clause form
We can express any predicate calculus statement in
clause form
This enables us to work with just simple OR (disjunct: ∨)
operators rather than having to deal with implication (→)
and AND (∧)
thus allowing us to work towards a resolution proof
Example
Let's put our previous example in clause form:
(1) ¬student(S) ∨ ¬studies(S, ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
Is there a solution to studies(S, prolog)?
= “is there someone who studies Prolog?”
Negate it... ¬studies(S, prolog)
Example
(1) ¬student(S) ∨ ¬studies(S, ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
¬studies(S, prolog)
Resolve with clause (1): ¬student(S) ∨ ¬studies(S,ai)
Resolve with clause (2) (S = T): ¬student(S) ∨ ¬studies(S,expsys)
Resolve with clause (4) (S = joe): ¬student(joe)
Finally, resolve this with clause (3), and we have nothing left –
the empty clause
Example
These are all the same logically...
(1) student(S) ∧ studies(S,ai) → studies(S,prolog)
(2) student(T) ∧ studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
(1) ¬student(S) ∨ ¬studies(S, ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)
(1) studies(S,prolog) ← student(S) ∧ studies(S,ai)
(2) studies(T,ai) ← student(T) ∧ studies(T,expsys)
(3) student(joe) ←
(4) studies(joe,expsys) ←
Note the last two...
(3) student(joe) ←
(4) studies(joe,expsys) ←
Joe is a student, and is studying expsys, so it's not
dependent on anything – it's a statement/fact
So there is nothing to the right of the ←
Topic #3 – Predicate Logic
Universes of discourse
The power of distinguishing objects from predicates is
that it lets you state things about many objects at once
E.g., let P(x)=“x+1>x”. We can then say,
“For any number x, P(x) is true” instead of
(0+1>0) ∧ (1+1>1) ∧ (2+1>2) ∧ ...
The collection of values that a variable x can take is called
x’s universe of discourse (or simply ‘universe’)
Topic #3 – Predicate Logic
Quantifier expressions
Quantifiers provide a notation that allows us to quantify
(count) how many objects in the universe of discourse
satisfy a given predicate
“∀” is the FORALL or universal quantifier
∀x P(x) means for all x in the universe, P holds
“∃” is the EXISTS or existential quantifier
∃x P(x) means there exists an x in the universe (that is, 1 or
more) such that P(x) is true
Topic #3 – Predicate Logic
The universal quantifier
Example:
Let the universe of x be parking spaces at AU
Let P(x) be the predicate “x is full”
Then the universal quantification of P(x), ∀x P(x), is the
proposition:
“All parking spaces at AU are full”
i.e., “Every parking space at AU is full”
i.e., “For each parking space at AU, that space is full”
Topic #3 – Predicate Logic
The existential quantifier
Example:
Let the universe of x be parking spaces at AU
Let P(x) be the predicate “x is full”
Then the existential quantification of P(x), ∃x P(x), is the
proposition:
“Some parking space at AU is full”
“There is a parking space at AU that is full”
“At least one parking space at AU is full”
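Over a finite universe the two quantifiers correspond to Python's all and any; a small illustration with made-up parking data:

spaces_full = {"space1": True, "space2": True, "space3": False}

def P(x):                                 # P(x) = "x is full"
    return spaces_full[x]

print(all(P(x) for x in spaces_full))     # forall x P(x)  -> False
print(any(P(x) for x in spaces_full))     # exists x P(x)  -> True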
Topic #3 – Predicate Logic
Nesting of quantifiers
Example: Let the universe of x & y be people
Let L(x,y)=“x likes y”
Then ∃y L(x,y) = “There is someone whom x likes”
Then ∀x (∃y L(x,y)) =
“Everyone has someone whom they like”
What does this mean?
∀C (owned-by(C,O) ∧ cat(C) → contented(C))
∀P (person(P) ∧ lives-in(P, Wales) →
∃H (harp(H) ∧ plays(P,H)))
Topic #3 – Predicate Logic
Quantifier exercise
If R(x,y)=“x relies upon y” (x and y are people), express
the following in unambiguous English:
∀x(∃y R(x,y)) = Everyone has someone to rely on
∃y(∀x R(x,y)) =
There’s a person whom everyone relies upon
(including himself)!
∃x(∀y R(x,y)) =
There’s some needy person who relies upon
everybody (including himself)
∀y(∃x R(x,y)) =
Everyone has someone who relies upon them
∀x(∀y R(x,y)) =
Everyone relies upon everybody (including
themselves)!
More fun with sentences
“Every dog chases its own tail”
– ∀d, Chases(d, Tail-of(d))
– Alternative statement: ∀d, ∃t, Tail-of(t, d) ∧ Chases(d, t)
“Every dog chases its own (unique) tail”
– ∀d, ∃1 t, Tail-of(t, d) ∧ Chases(d, t)
– ∀d, ∃t, Tail-of(t, d) ∧ Chases(d, t) ∧ [∀t’, Chases(d, t’) → t’ = t]
“Only the wicked flee when no one pursues”
– ∀x, Flees(x) ∧ [¬∃y, Pursues(y, x)] → Wicked(x)
Propositional vs First-Order Inference
Inference rules for quantifiers
∀x King(x) ∧ Greedy(x) → Evil(x)
Infer the following sentences:
King(John) ∧ Greedy(John) → Evil(John)
King(Richard) ∧ Greedy(Richard) → Evil(Richard)
King(Father(John)) ∧ Greedy(Father(John)) → Evil(Father(John))
…
Inference in First-Order Logic
Need to add new inference rules beyond those in propositional logic
Universal Elimination
From ∀x Likes(x, Muse), infer Likes(Liz, Muse)
Substitute Liz for x
Existential Elimination
From ∃x Likes(x, Muse), infer Likes(Person1, Muse)
(where Person1 does not exist elsewhere in the KB)
Existential Introduction
From Likes(Glenn, Muse), infer ∃x Likes(x, Muse)
Example of inference rules
“It is illegal for students to copy music”
“Joe is a student”
“Every student copies music”
Is Joe a criminal?
Knowledge Base:
(1) ∀x ∀y Student(x) ∧ Music(y) ∧ Copies(x, y) → Criminal(x)
(2) Student(Joe)
(3) ∀x ∃y Student(x) ∧ Music(y) ∧ Copies(x, y)
Example cont...
From (3): ∀x ∃y Student(x) ∧ Music(y) ∧ Copies(x, y)
Universal Elimination (substitute Joe for x):
∃y Student(Joe) ∧ Music(y) ∧ Copies(Joe, y)
Existential Elimination (introduce SomeSong for y):
Student(Joe) ∧ Music(SomeSong) ∧ Copies(Joe, SomeSong)
Modus Ponens with (1):
Criminal(Joe)
Human reasoning
Analogical
We solved one problem one way – so maybe we can solve
another one the same (or similar) way
Nonmonotonic/default
Handling contradictions, retracting previous conclusions
Commonsense
E.g., we know that humans are generally less than 2.2m tall
We have lots of this knowledge – computers don’t!
Temporal
Things may change over time
Inductive
Induce new knowledge from observations
Beyond true and false
Multi-valued logics
More than two truth values
e.g., true, false & unknown
Fuzzy logic - truth value in [0,1]
Modal logics
Modal operators define a mode for propositions
e.g. necessity, possibility
Epistemic logics (belief)
Temporal logics (time)
e.g. always, eventually
Types of Logic
Language             | What exists                       | Belief of agent
Propositional Logic  | Facts                             | True/False/Unknown
First-Order Logic    | Facts, Objects, Relations         | True/False/Unknown
Temporal Logic       | Facts, Objects, Relations, Times  | True/False/Unknown
Probability Theory   | Facts                             | Degree of belief 0..1
Fuzzy Logic          | Degree of truth                   | Degree of belief 0..1