Logical Agents
ECE457 Applied Artificial Intelligence
Spring 2007
Lecture #6
Outline
Logical reasoning
Propositional Logic
Wumpus World
Inference
Russell & Norvig, chapter 7
Logical Reasoning
Recall: Game-playing with imperfect information
Partially-observable environment
Need to infer about hidden information
Two new challenges
How to represent the information we have (knowledge representation)
How to use the information we have to infer new information and make decisions (knowledge reasoning)
Knowledge Representation
Represent facts about the environment
Language
Many ways: ontologies, mathematical functions, …
Statements that are either true or false
To write the statements
Syntax: symbols (words) and rules to combine them (grammar)
Semantics: meaning of the statements
Expressiveness vs. efficiency
Knowledge base (KB)
Contains all the statements
Agent can TELL it new statements (update)
Agent can ASK it for information (query)
Knowledge Representation
Example: Language of arithmetic
Syntax describes well-formed formulas (WFF)
X + Y > 7 (WFF)
X 7 @ Y + (not a WFF)
Semantics describes meanings of formulas
“X + Y > 7” is true if and only if the sum of the values of X and Y is greater than 7
Knowledge Reasoning
Inference
Discovering new facts and drawing conclusions based on existing information
During ASK or TELL
“All humans are mortal” and “Socrates is human”, therefore “Socrates is mortal”
Entailment
A sentence α is inferred from a set of sentences β if α is true whenever the sentences of β are true
β entails α (β ⊨ α)
Propositional Logic
Sometimes called “Boolean Logic”
Words of the syntax include propositional symbols…
Sentences are true (T) or false (F)
P, Q, R, …
P = “I’m hungry”, Q = “I have money”, R = “I’m going to a restaurant”
… and logical connectives
¬  negation (NOT)
∧  conjunction (AND)
∨  disjunction (OR)
→  implication (IF-THEN)
↔  biconditional (IF AND ONLY IF)
Propositional Logic
Atomic sentences
Propositional symbols
True or false
Complex sentences
Groups of propositional symbols joined with connectives, and parentheses if needed
(P ∧ Q) → R
Well-formed formulas following the grammar rules of the syntax
Propositional Logic
Complex sentences evaluate to true or false
Semantics: using truth tables

P  Q  R  |  P∧Q  |  (P∧Q)→R
T  T  T  |   T   |     T
F  T  T  |   F   |     T
T  F  T  |   F   |     T
F  F  T  |   F   |     T
T  T  F  |   T   |     F
F  T  F  |   F   |     T
T  F  F  |   F   |     T
F  F  F  |   F   |     T
Propositional Logic Semantics
Truth tables for all connectives
Given each possible truth value of each propositional symbol, we can get the possible truth values of the expression

P  Q  |  ¬P  |  P∧Q  |  P∨Q  |  P→Q  |  P↔Q
T  T  |  F   |   T   |   T   |   T   |   T
F  T  |  T   |   F   |   T   |   T   |   F
T  F  |  F   |   F   |   T   |   F   |   F
F  F  |  T   |   F   |   F   |   T   |   T
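A minimal Python sketch (not part of the original slides) that reproduces this connective truth table; the helper names implies and iff are our own:

```python
from itertools import product

# Sketch: reproduce the truth table of the five connectives for P and Q.
def implies(a, b):   # a -> b is false only when a is true and b is false
    return (not a) or b

def iff(a, b):       # a <-> b is true exactly when both sides agree
    return a == b

print("P      Q      ¬P     P∧Q    P∨Q    P→Q    P↔Q")
for p, q in product([True, False], repeat=2):
    row = [p, q, not p, p and q, p or q, implies(p, q), iff(p, q)]
    print("  ".join(f"{str(v):<5}" for v in row))
```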
Propositional Logic Example
Propositional symbols:
A = “The car has gas”
B = “I can go to the store”
C = “I have money”
D = “I can buy food”
E = “The sun is shining”
F = “I have an umbrella”
G = “I can go on a picnic”
If the car has gas, then I can go to the store: A → B
I can buy food if I can go to the store and I have money: (B ∧ C) → D
If I can buy food and either the sun is not shining or I have an umbrella, I can go on a picnic: (D ∧ (¬E ∨ F)) → G
D  E  F  G  |  ¬E  |  ¬E∨F  |  D∧(¬E∨F)  |  (D∧(¬E∨F))→G
T  T  T  T  |  F   |   T    |     T      |       T
F  T  T  T  |  F   |   T    |     F      |       T
T  F  T  T  |  T   |   T    |     T      |       T
F  F  T  T  |  T   |   T    |     F      |       T
T  T  F  T  |  F   |   F    |     F      |       T
F  T  F  T  |  F   |   F    |     F      |       T
T  F  F  T  |  T   |   T    |     T      |       T
F  F  F  T  |  T   |   T    |     F      |       T
T  T  T  F  |  F   |   T    |     T      |       F
F  T  T  F  |  F   |   T    |     F      |       T
T  F  T  F  |  T   |   T    |     T      |       F
F  F  T  F  |  T   |   T    |     F      |       T
T  T  F  F  |  F   |   F    |     F      |       T
F  T  F  F  |  F   |   F    |     F      |       T
T  F  F  F  |  T   |   T    |     T      |       F
F  F  F  F  |  T   |   T    |     F      |       T
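The rows of the table above can be generated mechanically. A small sketch (ours, not from the lecture) that enumerates every assignment of D, E, F, G and evaluates the picnic sentence:

```python
from itertools import product

# Sketch: enumerate all truth assignments of D, E, F, G and evaluate
# the picnic sentence (D ∧ (¬E ∨ F)) → G from the example above.
for d, e, f, g in product([True, False], repeat=4):
    antecedent = d and ((not e) or f)
    sentence = (not antecedent) or g      # material implication
    print(d, e, f, g, antecedent, sentence)
```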
Wumpus World
2D cave divided into rooms
Gold: glitters; the agent has to pick it up
Pits: the agent falls in and dies; the agent feels a breeze near a pit
Wumpus: the agent gets eaten and dies if the Wumpus is alive; the agent can kill the Wumpus with an arrow; the agent smells a stench near the Wumpus (alive or dead)
[Figure: the 4×4 cave grid]
Wumpus World
Initial state: (1,1)
Goal: get the gold and get back to (1,1)
Actions: turn 90°, move forward, shoot arrow, pick up gold
Cost: +1000 for getting gold, -1000 for dying, -1 per action, -10 for shooting the arrow
Exploring the Wumpus World
[Figure: 4×4 grid; from the percepts so far, explored squares are marked OK and neighbouring squares are marked Pit? or Wumpus?]
Wumpus World Logic
Propositional symbols
Pi,j = “there is a pit at (i,j)”
Bi,j = “there is a breeze at (i,j)”
Si,j = “there is a stench at (i,j)”
Wi,j = “there is a Wumpus at (i,j)”
Ki,j = “(i,j) is ok”
Rules
Pi,j → (Bi+1,j ∧ Bi-1,j ∧ Bi,j+1 ∧ Bi,j-1)
Wi,j → (Si+1,j ∧ Si-1,j ∧ Si,j+1 ∧ Si,j-1)
Bi,j ↔ (Pi+1,j ∨ Pi-1,j ∨ Pi,j+1 ∨ Pi,j-1)
Si,j ↔ (Wi+1,j ∨ Wi-1,j ∨ Wi,j+1 ∨ Wi,j-1)
Ki,j ↔ (¬Wi,j ∧ ¬Pi,j)
Have to be written out for every (i,j)
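Because the rules must be instantiated for every square, they are usually generated rather than written by hand. A hedged sketch (the 4×4 bounds and symbol strings such as "B2,3" are our own assumptions, not the course's notation):

```python
# Sketch: generate the breeze rule B_i,j <-> (P at each in-bounds neighbour)
# for every square of a 4x4 cave.
N = 4

def neighbours(i, j):
    cells = [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return [(a, b) for a, b in cells if 1 <= a <= N and 1 <= b <= N]

for i in range(1, N + 1):
    for j in range(1, N + 1):
        pits = " ∨ ".join(f"P{a},{b}" for a, b in neighbours(i, j))
        print(f"B{i},{j} ↔ ({pits})")
```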
Wumpus World KB
1. K1,1
2. ¬B1,1
3. ¬S1,1
a. B1,1 ↔ (P2,1 ∨ P1,2)
b. S1,1 ↔ (W2,1 ∨ W1,2)
c. K2,1 ↔ (¬W2,1 ∧ ¬P2,1)
d. K1,2 ↔ (¬W1,2 ∧ ¬P1,2)
Wumpus World Inference
1. K1,1
2. ¬B1,1
3. ¬S1,1
4. ¬P1,2
5. ¬P2,1
B1,1  P1,2  P2,1  |  ¬B1,1  |  P1,2∨P2,1  |  B1,1↔(P1,2∨P2,1)
T     T     T     |   F     |     T       |        T
T     F     T     |   F     |     T       |        T
T     T     F     |   F     |     T       |        T
T     F     F     |   F     |     F       |        F
F     T     T     |   T     |     T       |        F
F     F     T     |   T     |     T       |        F
F     T     F     |   T     |     T       |        F
F     F     F     |   T     |     F       |        T
Wumpus World Inference
1. K1,1
2. ¬B1,1
3. ¬S1,1
4. ¬P1,2
5. ¬P2,1
6. ¬W1,2
7. ¬W2,1
S1,1  W1,2  W2,1  |  ¬S1,1  |  W1,2∨W2,1  |  S1,1↔(W1,2∨W2,1)
T     T     T     |   F     |     T       |        T
T     F     T     |   F     |     T       |        T
T     T     F     |   F     |     T       |        T
T     F     F     |   F     |     F       |        F
F     T     T     |   T     |     T       |        F
F     F     T     |   T     |     T       |        F
F     T     F     |   T     |     T       |        F
F     F     F     |   T     |     F       |        T
Wumpus World Inference
1. K1,1
2. ¬B1,1
3. ¬S1,1
4. ¬P1,2
5. ¬P2,1
6. ¬W1,2
7. ¬W2,1
8. K1,2
9. K2,1
P1,2  W1,2  K1,2  |  ¬P1,2  |  ¬W1,2  |  ¬W1,2∧¬P1,2  |  K1,2↔(¬W1,2∧¬P1,2)
T     T     T     |   F     |   F     |      F        |         F
T     F     T     |   F     |   T     |      F        |         F
T     T     F     |   F     |   F     |      F        |         T
T     F     F     |   F     |   T     |      F        |         T
F     T     T     |   T     |   F     |      F        |         F
F     F     T     |   T     |   T     |      T        |         T
F     T     F     |   T     |   F     |      F        |         T
F     F     F     |   T     |   T     |      T        |         F
Wumpus World KB
1. K1,1
2. ¬B1,1
3. ¬S1,1
4. ¬P1,2
5. ¬P2,1
6. ¬W1,2
7. ¬W2,1
8. K1,2
9. K2,1
10. B2,1
11. P2,2 ∨ P3,1
12. ¬S2,1
13. ¬W2,2
14. ¬W3,1
15. ¬B1,2
16. ¬P1,3
17. ¬P2,2
18. S1,2
19. W1,3 ∨ W2,2
20. K2,2
[Figure: 4×4 grid with explored squares marked OK and candidate squares marked Pit? or Wumpus?]
Inference with Truth Tables
Sound: only infers true conclusions from true premises
Complete: finds all facts entailed by the KB
Time complexity = O(2^n): checks all truth values of all n symbols
Space complexity = O(n)
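A minimal model-checking sketch of this truth-table method, with sentences encoded as Python functions over an assignment dictionary (an encoding of our own choosing): the KB entails a query exactly when the query holds in every model that satisfies the KB.

```python
from itertools import product

# Sketch: truth-table entailment by model checking.
# O(2^n) time (all assignments of the n symbols), O(n) space (one model at a time).
def tt_entails(kb, query, symbols):
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False      # a model satisfies the KB but not the query
    return True

# Wumpus example from above: KB = ¬B1,1 ∧ (B1,1 ↔ (P1,2 ∨ P2,1)) entails ¬P1,2
kb = lambda m: (not m["B11"]) and (m["B11"] == (m["P12"] or m["P21"]))
query = lambda m: not m["P12"]
print(tt_entails(kb, query, ["B11", "P12", "P21"]))   # True
```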
Inference with Rules
Speed up inference by using inference rules
Use along with logical equivalences
No need to enumerate and evaluate every truth value
Rules and Equivalences
Logical equivalences
(α ∧ β) ≡ (β ∧ α)
(α ∨ β) ≡ (β ∨ α)
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))
¬(¬α) ≡ α
(α → β) ≡ (¬β → ¬α)
(α → β) ≡ (¬α ∨ β)
(α ↔ β) ≡ ((α → β) ∧ (β → α))
¬(α ∧ β) ≡ (¬α ∨ ¬β)
¬(α ∨ β) ≡ (¬α ∧ ¬β)
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))
Inference rules
Modus Ponens: from (α → β) and α, infer β
And-elimination: from (α ∧ β), infer α
And-introduction: from α and β, infer (α ∧ β)
Unit resolution: from (α ∨ β) and ¬β, infer α
Resolution: from (α ∨ β) and (¬β ∨ γ), infer (α ∨ γ)
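Each of these equivalences can be verified exhaustively. A tiny sketch (ours) checking De Morgan's rule and contraposition over all truth values:

```python
from itertools import product

# Sketch: verify two of the listed equivalences by enumerating all truth values.
def implies(a, b):
    return (not a) or b

for a, b in product([True, False], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))    # ¬(α ∧ β) ≡ (¬α ∨ ¬β)
    assert implies(a, b) == implies(not b, not a)     # (α → β) ≡ (¬β → ¬α)
print("both equivalences hold for every assignment")
```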
Wumpus World & Inference Rules
KB: ¬B1,1
1. B1,1 ↔ (P2,1 ∨ P1,2)
Biconditional elimination
2. (B1,1 → (P2,1 ∨ P1,2)) ∧ ((P2,1 ∨ P1,2) → B1,1)
And elimination
3. (P2,1 ∨ P1,2) → B1,1
Contraposition
4. ¬B1,1 → ¬(P2,1 ∨ P1,2)
Modus Ponens
5. ¬(P2,1 ∨ P1,2)
De Morgan’s Rule
6. ¬P2,1 ∧ ¬P1,2
Resolution
Inference with rules is sound, but only complete if we have all the rules
Resolution rule is both sound and complete
(α ∨ β), (¬β ∨ γ)
(α ∨ γ)
But it only works on disjunctions!
Conjunctive normal form (CNF)
1. Eliminate biconditionals: (α ↔ β) ≡ ((α → β) ∧ (β → α))
2. Eliminate implications: (α → β) ≡ (¬α ∨ β)
3. Move/eliminate negations: ¬(¬α) ≡ α, ¬(α ∧ β) ≡ (¬α ∨ ¬β), ¬(α ∨ β) ≡ (¬α ∧ ¬β)
4. Distribute ∨ over ∧: (α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))
CNF Example
1. B1,1 ↔ (P2,1 ∨ P1,2)
Eliminate biconditionals
2. (B1,1 → (P2,1 ∨ P1,2)) ∧ ((P2,1 ∨ P1,2) → B1,1)
Eliminate implications
3. (¬B1,1 ∨ P2,1 ∨ P1,2) ∧ (¬(P2,1 ∨ P1,2) ∨ B1,1)
Move/eliminate negations
4. (¬B1,1 ∨ P2,1 ∨ P1,2) ∧ ((¬P2,1 ∧ ¬P1,2) ∨ B1,1)
Distribute ∨ over ∧
5. (¬B1,1 ∨ P2,1 ∨ P1,2) ∧ (¬P2,1 ∨ B1,1) ∧ (¬P1,2 ∨ B1,1)
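The same conversion can be reproduced with a symbolic-logic library. For instance, assuming sympy is available (it is not part of the course material), its to_cnf function yields the clauses of step 5:

```python
# Sketch: check the CNF conversion of B1,1 ↔ (P2,1 ∨ P1,2) with sympy,
# an external library (not part of the course material).
from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P21, P12 = symbols("B11 P21 P12")
print(to_cnf(Equivalent(B11, P21 | P12)))
# prints a conjunction equivalent to
# (¬B1,1 ∨ P2,1 ∨ P1,2) ∧ (¬P2,1 ∨ B1,1) ∧ (¬P1,2 ∨ B1,1)
```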
Resolution Algorithm
Given a KB
Need to answer a query α
KB ⊨ α ?
Proof by contradiction
Show that (KB ∧ ¬α) is unsatisfiable
i.e. leads to a contradiction
If (KB ∧ ¬α) is unsatisfiable, then (KB ⊨ α) must be true
Resolution Algorithm
Convert (KB ∧ ¬α) into CNF
For every pair of clauses that contain complementary symbols
Apply resolution to generate a new clause
Add new clause to the sentence
End when
Resolution gives the empty clause (KB ⊨ α)
No new clauses can be added (fail)
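A hedged sketch of this loop, with clauses encoded as frozensets of literal strings ("P12", "~P12"); the encoding and function names are our own, not the course's:

```python
from itertools import combinations

# Sketch: propositional resolution refutation over clauses represented as
# frozensets of literals; "~X" denotes the negation of symbol "X".
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def resolution_entails(kb_clauses, negated_query_clauses):
    clauses = set(kb_clauses) | set(negated_query_clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:                 # empty clause: contradiction found
                    return True
                new.add(r)
        if new.issubset(clauses):         # no new clauses: query not entailed
            return False
        clauses |= new

# Wumpus example: KB = CNF of B1,1 ↔ (P2,1 ∨ P1,2) plus ¬B1,1; the query is
# ¬P1,2, so its negation P1,2 is added as a clause.
kb = [frozenset({"~B11", "P12", "P21"}), frozenset({"~P12", "B11"}),
      frozenset({"~P21", "B11"}), frozenset({"~B11"})]
print(resolution_entails(kb, [frozenset({"P12"})]))   # True: KB ⊨ ¬P1,2
```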
Wumpus World & Resolution
CNF form of B1,1 ↔ (P2,1 ∨ P1,2): (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
KB also contains ¬B1,1
Query: ¬P1,2
Negate the query and add it to the clauses: (¬B1,1 ∨ P1,2 ∨ P2,1), (¬P1,2 ∨ B1,1), (¬P2,1 ∨ B1,1), ¬B1,1, P1,2
Resolving (¬P2,1 ∨ B1,1) with ¬B1,1 gives ¬P2,1; resolving (¬P1,2 ∨ B1,1) with ¬B1,1 gives ¬P1,2, which resolves with P1,2 to give the empty clause!
KB ⊨ ¬P1,2
Resolution Algorithm
Sound
Complete
Not efficient
Horn Clauses
Resolution algorithm can be further improved by using Horn clauses
Disjunction with at most one positive symbol
Can be rewritten as an implication
¬α ∨ ¬β ∨ γ ≡ (α ∧ β) → γ
Inference in linear time!
Using Modus Ponens
Forward or backward chaining
Forward Chaining
Data-driven reasoning
Start with known symbols
Infer new symbols and add to KB
Use new symbols to infer more new symbols
Repeat until query proven or no new symbols can be inferred
Work forward from known data, towards proving the goal
1. KB: α, β, δ, ε
2. (α ∧ β) → γ
3. (δ ∧ ε) → λ
4. (λ ∧ γ) → q
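A small forward-chaining sketch over the Horn KB above, with rules encoded as (premises, conclusion) pairs (a representation of our own choosing):

```python
# Sketch: forward chaining on Horn clauses. A rule is (set_of_premises, conclusion).
def forward_chain(facts, rules, query):
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)     # fire the rule, add the new symbol
                changed = True
                if conclusion == query:
                    return True
    return query in known

facts = {"a", "b", "d", "e"}              # α, β, δ, ε
rules = [({"a", "b"}, "g"),               # (α ∧ β) → γ
         ({"d", "e"}, "l"),               # (δ ∧ ε) → λ
         ({"l", "g"}, "q")]               # (λ ∧ γ) → q
print(forward_chain(facts, rules, "q"))   # True
```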
Backward Chaining
Goal-driven reasoning
Start with query, try to infer it
If there are unknown symbols in the premise of the query, infer them first
If there are unknown symbols in the premises of these symbols, infer those first
Repeat until query proven or its premise cannot be inferred
Work backwards from the goal, to prove needed information
1. KB: α, β, δ, ε
2. (λ ∧ γ) → q
3. (δ ∧ ε) → λ
4. (α ∧ β) → γ
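A matching backward-chaining sketch on the same KB, recursing from the goal back to known facts (again our own encoding):

```python
# Sketch: backward chaining on Horn clauses, working from the goal to the facts.
def backward_chain(goal, facts, rules, seen=None):
    seen = seen or set()
    if goal in facts:
        return True
    if goal in seen:                      # avoid looping on the same subgoal
        return False
    seen = seen | {goal}
    # try every rule that concludes the goal; prove all of its premises first
    return any(all(backward_chain(p, facts, rules, seen) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

facts = {"a", "b", "d", "e"}              # α, β, δ, ε
rules = [({"l", "g"}, "q"),               # (λ ∧ γ) → q
         ({"d", "e"}, "l"),               # (δ ∧ ε) → λ
         ({"a", "b"}, "g")]               # (α ∧ β) → γ
print(backward_chain("q", facts, rules))  # True
```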
Forward vs. Backward
Forward chaining
Proves everything
Goes to work as soon as new information is available
Expands the KB a lot
Improves understanding of the world
Typically used for proving a world model
Backward chaining
Proves only what is needed for the goal
Does nothing until a query is asked
Expands the KB as little as needed
More efficient
Typically used for proofs by contradiction
Assumptions
Utility-based agent
Environment
Fully observable / Partially observable
(approximation)
Deterministic / Strategic / Stochastic
Sequential
Static / Semi-dynamic
Discrete / Continuous
Single agent / Multi-agent
Assumptions Updated
Learning agent
Environment
Fully observable / Partially observable
Deterministic / Strategic / Stochastic
Sequential
Static / Semi-dynamic
Discrete / Continuous
Single agent / Multi-agent