Transcript chapter7
CS 2710, ISSP 2610
Chapter 7
Propositional Logic Reasoning
Knowledge Based Agents
• Central component: knowledge base, or KB
• A set of sentences in a knowledge representation language
• Generic functions:
  – TELL (add a fact to the knowledge base)
  – ASK (get the next action based on info in the KB)
• [Inference?]
Note
• We’re all familiar with logic from mathematics
• In AI, we need to think about things we usually take for granted
• To use the notion of prime number, you don’t have to think about what prime means … or number, for that matter
• In AI, we create knowledge representation schemes, which define the objects, predicates, and functions that exist
The Wumpus World
• The game is played on an M×N grid
• One player, one wumpus, one or more pits
• The wumpus eats anyone who enters its room
• A pit traps anyone who moves into its room
• (The wumpus can be shot by an agent, but the agent has only one arrow; this and some other details won’t come up in these notes)
• The player starts in (0,0)
• Actions: forward, turn left, turn right
• If the player tries to move forward and bumps into a wall, the player does not move
• The player can grab the gold if they are in the same room
• Goal: find the gold while avoiding the wumpus and pits
Sensors
• 5 sensors, each of which gives one bit of information
  – Stench (if the wumpus and player are in adjacent, not diagonal, squares)
  – Breeze (if a pit and the player are in adjacent squares)
  – Glitter (if the gold and player are in the same room)
  – Bump (when the agent walks into a wall)
  – Scream (when the wumpus is killed; the sound can be perceived everywhere)
• Percepts: 5-tuples
  – [stench, breeze, glitter, bump, scream]
Environment?
• Discrete, static, single-agent (the wumpus doesn’t move)
• Sequential (the player’s decisions can affect future ones)
• Partially observable (the locations of pits and the wumpus are unobservable)
[Diagram: a grid showing the start square (0,0), the wumpus with stench in adjacent squares, the gold with glitter, and a pit with breeze in adjacent squares]
Wumpus
• Main difficulty: the player doesn’t know the locations of the pits, gold, and wumpus
• Reason about the configuration
• Knowledge evolves as new percepts arrive and actions are taken
Examples of reasoning [diagram in lecture]
1. The player starts in (0,0). No stench or breeze. So, no pit or wumpus in (1,0) or (0,1). The player is still alive, so no pit or wumpus in (0,0). The player moves to (1,0)
2. Breeze in (1,0). So, there must be a pit in (0,0), (2,0), or (1,1). The player already ruled out (0,0)
3. Since there is danger in moving to (2,0) and (1,1), the player moves back to (0,0)
4. The player moves to (0,1). No positive percepts. Now the player knows there is no pit in (1,1) (yeah!) and that there is a pit in (2,0) (pit-def in diagram)
The player combines knowledge gained at different times to make inferences
Fundamental Concepts of logical representation and reasoning
• Information is represented in sentences, which must have correct syntax
  – ( 1 + 2 ) * 7 = 21 vs. 2 ) + 7 = * ( 1 21
• The semantics of a sentence defines its truth with respect to each possible world – an interpretation assigning T or F to all propositions
• W is a model of S means that sentence S is true under interpretation W
• [What do the following mean?]
  – X |= Y
  – X entails Y
  – Y logically follows from X
Which are true?
Which are not true but useful?
1. {Man, Man -> Mortal} |= Mortal
2. {Raining, Dog -> Mammal} |= Mammal
3. {Raining, Raining -> Wet} |= Wet
4. {Smoke, Fire -> Smoke} |= Fire
5. {Tall ^ Silly} |= Tall
6. {Tall v Silly} |= Silly
7. {Tall, Silly} |= Tall ^ Silly
[Wumpus world EG illustrating possible worlds and entailment]
Entailment (reminder)
• A |= B
• Under all interpretations in which A is true, B is true as well
• All models of A are models of B
• Whenever A is true, B is true as well
• A entails B
• B logically follows from A
Inference
• KB |-i A
• Inference algorithm i can derive A from KB
• A can be inferred from KB using i
Inference Algorithm Examples
1. {A, B} |- (^-intro) A ^ B
2. {A, A -> B} |- (MP) B
3. {A, B -> A} |- (abduction) B
4. Internet says P |- (gullibility) P
5. Professor says P |- (trust in authority) P
Inference Algorithms
Notes:
• -> : implication, a piece of syntax
• |= : entailment, used to describe semantics
• |- : “can be derived from”*
• Inference algorithm, inference procedure, rule of inference, inference rule: a procedure that derives sentences from sentences
• [Definition of soundness of inference algorithm i]
• [Definition of completeness of inference algorithm i]
* Though it can mean different things in different math/logic contexts; see “turnstile” in Wikipedia
Monotonicity
• [A logic is monotonic if…]
Propositional Logic Syntax
• Sentence -> AtomicSent | ComplexSent
  AtomicSent -> true | false | P, Q, R, …
  ComplexSent -> ~Sentence
    | ( Sentence ^ Sentence )
    | ( Sentence v Sentence )
    | ( Sentence -> Sentence )
    | ( Sentence <-> Sentence )
    | ( Sentence )
• [no predicate or function symbols]
Propositional Logic Sentences
• If there is a pit at [1,1], there is a breeze at [1,0]
  – P11 -> B10
• There is a breeze at [2,2] if and only if there is a pit in the neighborhood
  – B22 <-> ( P21 v P23 v P12 v P32 )
• There is no breeze at [2,2]
  – ~B22
Semantics of Prop Logic
• In model-theoretic semantics, an interpretation assigns elements of the world to sentences, and defines the truth values of sentences
• Propositional logic: easy! Assign T or F to each proposition symbol; then assign truth values to complex sentences in the obvious way
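The "obvious way" can be made concrete with a short recursive evaluator. This is a minimal sketch, not from the slides: the nested-tuple sentence encoding and the function name are illustrative assumptions.

```python
# Sketch: truth value of a propositional sentence under an interpretation.
# Sentences are nested tuples such as ('->', 'P11', 'B10'); atoms are strings.

def evaluate(sentence, interp):
    """Return True/False for `sentence` under `interp` (symbol -> bool)."""
    if isinstance(sentence, str):          # atomic sentence: P, Q, R, ...
        return interp[sentence]
    op, *args = sentence
    if op == '~':
        return not evaluate(args[0], interp)
    a = evaluate(args[0], interp)
    b = evaluate(args[1], interp)
    if op == '^':
        return a and b
    if op == 'v':
        return a or b
    if op == '->':
        return (not a) or b
    if op == '<->':
        return a == b
    raise ValueError('unknown operator: %r' % op)
```

For example, `evaluate(('->', 'P11', 'B10'), {'P11': True, 'B10': False})` evaluates to False: the implication fails exactly when the antecedent is true and the consequent false.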
Logical Equivalences
• Sentences A and B are logically equivalent if:
  – they are true under exactly the same interpretations
  – A |= B and B |= A
Validity
• A sentence (or set of sentences) is valid if
  – it is true under all interpretations
• Example: P v ~P
Satisfiability
• A sentence (or set of sentences) is satisfiable if there exists some interpretation that makes it true
• An interpretation satisfies a set of sentences if it makes them true
Entailment
• A |= B
• All interpretations that satisfy A also satisfy B
Propositional Logic Inference
• Question: Does KB entail S?
• Method 1: Truth Table Entailment
• Method 2: Proof
  – Proof by deduction, induction, contradiction, etc.
 A | B | C | A^B | A^C | B^C
---+---+---+-----+-----+-----
 F | F | F |  F  |  F  |  F
 F | F | T |  F  |  F  |  F
 F | T | F |  F  |  F  |  F
 F | T | T |  F  |  F  |  T
 T | F | F |  F  |  F  |  F
 T | F | T |  F  |  T  |  F
 T | T | F |  T  |  F  |  F
 T | T | T |  T  |  T  |  T

{A ^ C} does not entail B ^ C (in the row A=T, B=F, C=T, the premise is true but B ^ C is false)
{A, B} entails A ^ B (in every row where A and B are both true, A ^ B is true)
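Truth-table entailment can be mechanized by enumerating every interpretation, as in this sketch. The tuple-based sentence encoding and all function names are illustrative assumptions; a compact evaluator is included so the sketch is self-contained.

```python
from itertools import product

def symbols_in(s):
    """Collect proposition symbols from a nested-tuple sentence."""
    if isinstance(s, str):
        return {s}
    return set().union(*(symbols_in(x) for x in s[1:]))

def evaluate(s, interp):
    """Truth value of `s` under `interp` (symbol -> bool)."""
    if isinstance(s, str):
        return interp[s]
    op, *args = s
    v = [evaluate(x, interp) for x in args]
    return {'~':   lambda: not v[0],
            '^':   lambda: v[0] and v[1],
            'v':   lambda: v[0] or v[1],
            '->':  lambda: (not v[0]) or v[1],
            '<->': lambda: v[0] == v[1]}[op]()

def entails(premises, conclusion):
    """KB |= S iff S is true under every interpretation satisfying all premises."""
    syms = sorted(set().union(symbols_in(conclusion),
                              *(symbols_in(p) for p in premises)))
    for values in product([False, True], repeat=len(syms)):
        interp = dict(zip(syms, values))
        if all(evaluate(p, interp) for p in premises) and not evaluate(conclusion, interp):
            return False             # found a model of the premises that falsifies S
    return True
```

Matching the table above: `entails(['A', 'B'], ('^', 'A', 'B'))` is True, while `entails([('^', 'A', 'C')], ('^', 'B', 'C'))` is False.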
Example Proof by Deduction
• Knowledge
  – S1: B22 <-> ( P21 v P23 v P12 v P32 )   (rule)
  – S2: ~B22   (observation)
• Inferences
  – S3: (B22 -> (P21 v P23 v P12 v P32)) ^ ((P21 v P23 v P12 v P32) -> B22)   (S1, biconditional elim)
  – S4: ((P21 v P23 v P12 v P32) -> B22)   (S3, and elim)
  – S5: (~B22 -> ~(P21 v P23 v P12 v P32))   (S4, contrapositive)
  – S6: ~(P21 v P23 v P12 v P32)   (S2, S5, MP)
  – S7: ~P21 ^ ~P23 ^ ~P12 ^ ~P32   (S6, De Morgan)
Proofs
• A derivation
• A sequence of applications of rules of inference
• Reasoning by search
• Successor function: all possible applications of inference rules
• Monotonicity means search can be local, and more efficient
Resolution
• Resolution allows a complete inference mechanism (search-based) using only one rule of inference
Resolution
• winter v summer
• ~winter v cold
• Either winter or ~winter is true, so we know that summer or cold is true
• Resolution rule:
  – Given: P1 v P2 v P3 v … v Pn, and ~P1 v Q1 v … v Qm
  – Conclude: P2 v P3 v … v Pn v Q1 v … v Qm
• Complementary literals P1 and ~P1 “cancel out”
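Over clauses represented as sets of string literals (an assumed encoding, with a leading "~" marking negation), the rule can be sketched as:

```python
def negate(lit):
    """Complement of a literal: 'P' <-> '~P'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    resolvents = set()
    for lit in c1:
        if negate(lit) in c2:
            # Cancel the complementary pair, union the remaining literals.
            resolvents.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return resolvents
```

On the example above, `resolve(frozenset({'winter', 'summer'}), frozenset({'~winter', 'cold'}))` yields the single resolvent `frozenset({'summer', 'cold'})`.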
Resolution in Wumpus World
• There is a pit at 2,1 or 2,3 or 1,2 or 3,2
  – P21 v P23 v P12 v P32
• There is no pit at 2,1
  – ~P21
• Therefore (by resolution) the pit must be at 2,3 or 1,2 or 3,2
  – P23 v P12 v P32
Resolution
• Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any KB in propositional logic.
Proof using Resolution
• To try to prove P from KB:
  – Convert KB and ~P into CNF
  – To prove P, show that KB ^ ~P is contradictory (derives the empty clause)
  – Specifically, apply resolution using a complete search algorithm until:
    • No new clauses can be added (KB does not entail P), or
    • The empty clause is derived (KB does entail P)
Conversion to CNF
B22 <-> ( P21 v P23 v P12 v P32 )
1. Eliminate <->, replacing with two implications
   (B22 -> (P21 v P23 v P12 v P32)) ^ ((P21 v P23 v P12 v P32) -> B22)
2. Replace implication (A -> B) by ~A v B
   (~B22 v P21 v P23 v P12 v P32) ^ (~(P21 v P23 v P12 v P32) v B22)
3. Move ~ “inwards” (unnecessary parens removed)
   (~B22 v P21 v P23 v P12 v P32) ^ ((~P21 ^ ~P23 ^ ~P12 ^ ~P32) v B22)
4. Distributive Law
   (~B22 v P21 v P23 v P12 v P32) ^ (~P21 v B22) ^ (~P23 v B22) ^ (~P12 v B22) ^ (~P32 v B22)
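The four steps can be sketched directly on a nested-tuple sentence encoding. This is a simplified illustration under stated assumptions: binary connectives only, strings as atoms, and all function names invented here.

```python
def to_cnf(s):
    """Convert a nested-tuple sentence to CNF, following the slide's steps."""
    return distribute(push_not(eliminate(s)))

def eliminate(s):
    """Steps 1-2: remove <-> and -> in favor of ~, ^, v."""
    if isinstance(s, str):
        return s
    op, *a = s
    a = [eliminate(x) for x in a]
    if op == '<->':      # A <-> B  becomes  (~A v B) ^ (~B v A)
        return ('^', ('v', ('~', a[0]), a[1]), ('v', ('~', a[1]), a[0]))
    if op == '->':       # A -> B  becomes  ~A v B
        return ('v', ('~', a[0]), a[1])
    return (op, *a)

def push_not(s, neg=False):
    """Step 3: move ~ inwards (De Morgan, double negation). Assumes
    eliminate() has already run, so only ~, ^, v remain."""
    if isinstance(s, str):
        return ('~', s) if neg else s
    op, *a = s
    if op == '~':
        return push_not(a[0], not neg)
    if neg:              # De Morgan: flip ^/v and negate the children
        op = '^' if op == 'v' else 'v'
        return (op, *(push_not(x, True) for x in a))
    return (op, *(push_not(x) for x in a))

def distribute(s):
    """Step 4: distribute v over ^."""
    if isinstance(s, str) or s[0] == '~':
        return s
    op, l, r = s
    l, r = distribute(l), distribute(r)
    if op == 'v':
        if not isinstance(l, str) and l[0] == '^':
            return distribute(('^', ('v', l[1], r), ('v', l[2], r)))
        if not isinstance(r, str) and r[0] == '^':
            return distribute(('^', ('v', l, r[1]), ('v', l, r[2])))
    return (op, l, r)
```

For instance, `to_cnf(('<->', 'B', ('v', 'P', 'Q')))` produces a conjunction equivalent to (~B v P v Q) ^ (~P v B) ^ (~Q v B), mirroring the B22 derivation above at a smaller scale.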
Previous Slide: Sentence is in CNF
• Next step (with a simpler example):
  (P1 v P2 v ~P3) ^ P4 ^ ~P5 ^ (P2 v P3)
• Create a separate clause corresponding to each conjunct
  – P1 v P2 v ~P3
  – P4
  – ~P5
  – P2 v P3
Finally…
– Add the negation of the goal to the set of clauses, and perform resolution. If you reach the empty clause, you have proved the goal
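The whole refutation loop can be sketched as follows, with clauses as frozensets of string literals ("~" marks negation). The representation and all names are illustrative assumptions; the helpers are defined here so the sketch is self-contained.

```python
from itertools import combinations

def negate(lit):
    """Complement of a literal: 'P' <-> '~P'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def resolution_entails(kb_clauses, goal_literal):
    """Does the KB entail the goal? Add the negated goal, then resolve
    until the empty clause appears or no new clauses can be added."""
    clauses = {frozenset(c) for c in kb_clauses}
    clauses.add(frozenset({negate(goal_literal)}))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:                 # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:                # fixpoint: no entailment
            return False
        clauses |= new
```

On the example from the next slide, the KB in clause form is {~B11 v P12 v P21, ~P12 v B11, ~P21 v B11, ~B11}, and `resolution_entails` confirms that ~P12 follows.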
Simple Resolution EG
• When the agent is in 1,1, there is no breeze, so there can be no pits in neighboring squares
• (B11 <-> (P12 v P21)); ~B11
• Prove: ~P12. [lecture]
• [What if P and ~P are both in the KB?]
Horn Clauses
• A Horn clause is a CNF clause with at most one positive literal
• Horn clauses form the basis of forward and backward chaining
• The Prolog language is based on Horn clauses
• Deciding entailment with Horn clauses is linear in the size of the knowledge base
Reasoning with Horn Clauses
• Forward Chaining
  – For each new piece of data, generate all new facts, until the desired fact is generated
  – Data-directed reasoning
• Backward Chaining
  – To prove the goal, find a clause that contains the goal as its head, and prove the body recursively
  – Goal-directed reasoning
• The state space is an AND-OR graph; see 7.5.4
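Forward chaining over definite (Horn) clauses can be sketched as below. The rule representation and the function name are illustrative assumptions, not from the lecture.

```python
def forward_chain(facts, rules, goal):
    """Forward chaining over definite clauses.
    `rules` is a list of (body, head) pairs meaning body1 ^ ... ^ bodyn -> head;
    `facts` is an iterable of known atoms. Returns True if `goal` is derived."""
    known = set(facts)
    changed = True
    while changed and goal not in known:
        changed = False
        for body, head in rules:
            if head not in known and all(b in known for b in body):
                known.add(head)       # rule fires: body satisfied, add the head
                changed = True
    return goal in known
```

For example, with rules A ^ B -> C and C -> D, the facts {A, B} let the chain reach D; the single fact {A} does not.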
Wrap-up
• You are responsible for everything in Chapter 7 through 7.6-2
  – Note: we didn’t cover 7.5.4 or 7.6 in lecture
• We will return to topics in 7.7 when we cover planning