Knowledge Representation
Lecture # 17, 18 & 19
Motivation (1)
• Up to now, we concentrated on search methods in worlds that can be relatively easily represented by states and actions on them:
  • a few objects, rules, relatively simple states
  • problem-specific heuristics to guide the search
  • complete knowledge: we know everything that's needed
  • no new knowledge is deduced or added
  • well-defined start and goal states
• Appropriate for accessible, static, discrete problems.
Motivation (2)
• What about other types of problems?
  • More objects, more complex relations
  • Not all knowledge is explicitly stated
  • Dynamic environments: the rules change!
  • Agents change their knowledge
  • Deduction: how to derive new conclusions
• Examples:
  1. queries on family relations
  2. credit approval
  3. diagnosis of circuits
Knowledge Representation
• A complete intelligent agent needs to be able to perform several tasks:
  – Perception: what is my state?
  – Deliberation: what action should I take?
  – Action: how do I execute the action?
• State recognition implies some form of knowledge representation
• Figuring out the right action implies some form of inference
Two levels to think about:
• Knowledge level: what does the agent know?
• Implementation level: how is the knowledge represented?
Key question: What is a good representation?
Knowledge Bases
The golden dream:
• Tell the agent what it needs to know
• The agent uses rules of inference to deduce consequences
This is the declarative approach to building agents.
Agents have two different parts:
• A knowledge base, which contains a set of facts
expressed in some formal, standard language
• An inference engine, with general rules for deducing
new facts
Knowledge Base
A knowledge base is a set of sentences, represented in a knowledge representation language, that makes assertions about the world.
The user interacts with the KB via TELL and ASK.
Inference rule: when one ASKs a question of the KB, the answer should follow from what has been TELLed to the KB previously.
CONTD…
The Knowledge Base is a set of sentences:
• Syntactically well-formed
• Semantically meaningful
A user can perform two actions on the KB:
• Tell the KB a new fact
• Ask the KB a question
Syntax of Sentences
• English acceptable an one is sentence This
vs.
• This English sentence is an acceptable one.
• ∨ P ¬ ∧ Q R
vs.
• P ∨ ¬Q ∧ R
Semantics of Sentences
This hungry classroom is a jobless moon.
• Why is this syntactically correct sentence not meaningful?
P ∨ ¬Q ∧ R
• Represents a world where either P is true, or Q is not true and R is true.
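That reading can be checked mechanically; a minimal Python sketch, assuming the standard precedence where ¬ binds tightest, then ∧, then ∨ (the function name `sentence` is illustrative):

```python
# Evaluate P ∨ ¬Q ∧ R under a given interpretation, assuming the usual
# precedence: ¬ binds tightest, then ∧, then ∨, so it reads P ∨ (¬Q ∧ R).
def sentence(p: bool, q: bool, r: bool) -> bool:
    return p or (not q and r)

print(sentence(True, True, False))    # True: P is true
print(sentence(False, False, True))   # True: Q is false and R is true
print(sentence(False, True, True))    # False: neither disjunct holds
```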
Logical Agents
• Reflex agents find their goal state by dumb luck
• Logic (knowledge-based) agents combine general knowledge with current percepts to infer hidden aspects of the current state prior to selecting actions
• Crucial in partially observable environments
Knowledge-Based Agent
[Figure: the agent sits between sensors and actuators in the environment; inside the agent, a knowledge base (domain-specific content) is coupled with an inference engine (domain-independent algorithms).]
A Knowledge-Based Agent
• A knowledge-based agent consists of a knowledge base (KB) and an inference engine (IE).
• A knowledge base is a set of representations of what one knows about the world (objects and classes of objects, facts about objects, relationships among objects, etc.)
• Each individual representation is called a sentence.
Abilities of a KB Agent
• The agent must be able to:
  • Represent states and actions
  • Incorporate new percepts
  • Update its internal representation of the world
  • Deduce hidden properties of the world
  • Deduce appropriate actions
Knowledge-Based Agents
• The sentences are expressed in a knowledge representation language.
• Examples of sentences:
  • The moon is made of green cheese
  • If A is true then B is true
  • A is false
  • All humans are mortal
  • Confucius is a human
• The inference engine derives new sentences from the input and the KB
• The inference mechanism depends on the representation in the KB
• The agent operates as follows:
  1. It receives percepts from the environment
  2. It computes what action it should perform (via the IE and KB)
  3. It performs the chosen action (some actions simply insert inferred new facts into the KB).
[Figure: input from the environment flows into the inference engine, which produces output (actions); learning updates the knowledge base, which the inference engine draws on.]
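The three-step operating loop can be sketched in Python; `KBAgent`, `tell`, `ask`, and `choose_action` are illustrative names, and the "inference engine" here is a trivial lookup rather than a real prover:

```python
# Minimal knowledge-based agent skeleton (illustrative names, not from the lecture).
class KBAgent:
    def __init__(self, background_knowledge):
        self.kb = set(background_knowledge)  # sentences TELLed so far
        self.t = 0                           # time step

    def tell(self, sentence):
        self.kb.add(sentence)

    def ask(self, query):
        # Stand-in inference engine: a sentence "follows" only if it
        # was TELLed verbatim; a real IE would derive new sentences.
        return query in self.kb

    def step(self, percept):
        self.tell(("percept", percept, self.t))  # 1. record the percept
        action = self.choose_action()            # 2. compute an action (IE + KB)
        self.tell(("action", action, self.t))    # 3. record the chosen action
        self.t += 1
        return action

    def choose_action(self):
        # Placeholder deliberation; real agents derive this via inference.
        return "noop"

agent = KBAgent({"rules of the environment"})
print(agent.step("breeze"))  # noop
```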
The Wumpus World
The Wumpus computer game:
• The agent explores a cave consisting of rooms connected by passageways.
• Lurking somewhere in the cave is the Wumpus, a beast that eats any agent that enters its room.
• Some rooms contain bottomless pits that trap any agent that wanders into the room.
• Occasionally, there is a heap of gold in a room.
• The goal is to collect the gold and exit the world without being eaten.
Wumpus PEAS description
• Performance measure: gold +1000, death -1000, -1 per step, -10 for using the arrow
• Environment:
  • Squares adjacent to the wumpus are smelly
  • Squares adjacent to a pit are breezy
  • Glitter iff gold is in the same square
  • Bump iff the agent moves into a wall
  • Woeful scream iff the wumpus is killed
  • Shooting kills the wumpus if you are facing it
  • Shooting uses up the only arrow
  • Grabbing picks up the gold if in the same square
  • Releasing drops the gold in the same square
• Sensors: Stench, Breeze, Glitter, Bump, Scream
• Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Exploring the Wumpus World
[1,1] The KB initially contains the rules of the environment.
The first percept is [None, None, None, None, None];
move to a safe cell, e.g. [2,1].
Exploring the Wumpus World
[2,1] = breeze,
which indicates that there is a pit in [2,2] or [3,1];
return to [1,1] to try the next safe cell.
Exploring the Wumpus World
[1,2] Stench in cell which means that wumpus is in [1,3] or [2,2]
YET … not in [1,1]
YET … not in [2,2] or stench would have been detected in
[2,1]
(this is relatively sophisticated reasoning!)
THUS … wumpus is in [1,3]
THUS [2,2] is safe because of lack of breeze in [1,2]
THUS pit in [3,1] (again a clever inference)
move to next safe cell [2,2]
Exploring the Wumpus World
[2,2] move to [2,3]
[2,3] detect glitter, smell, breeze
THUS pick up the gold
THUS pit in [3,3] or [2,4]
Representation, Reasoning, and Logic
• The objective of knowledge representation is to express knowledge in a computer-tractable form, so that agents can perform well.
• A knowledge representation language is defined by:
  • Its syntax, which defines all possible sequences of symbols that constitute sentences of the language (grammar to form sentences)
Representation, Reasoning, and Logic
  • Its semantics, which determines the facts in the world to which the sentences refer (meaning of sentences)
    • Each sentence makes a claim about the world.
  • Its proof theory (inference rules and proof procedures)
Logic in general
• Logics are formal languages for representing information such that conclusions can be drawn
• Syntax defines the sentences in the language
• Semantics defines the "meaning" of sentences; i.e., defines the truth of a sentence in a world
• E.g., the language of arithmetic:
  • x+2 ≥ y is a sentence; x2+y > {} is not a sentence
  • x+2 ≥ y is true iff the number x+2 is no less than the number y
  • x+2 ≥ y is true in a world where x = 7, y = 1
  • x+2 ≥ y is false in a world where x = 0, y = 6
Types of Logic
• Logics are characterized by what they commit to as "primitives"
• Ontological commitment: what exists (facts? objects? time? beliefs?)
• Epistemological commitment: what states of knowledge?
Interpretations
• We want to have a rule for generating (or testing) new sentences that are always true
• But the truth of a sentence may depend on its interpretation!
• Formally, an interpretation is a way of matching objects in the world with symbols in the sentence (or in the knowledge base)
• A sentence may be true in one interpretation and false in another
• Terminology:
  • A sentence is valid if it is true in all interpretations
  • A sentence is satisfiable if it is true in at least one interpretation
  • A sentence is unsatisfiable if it is false in all interpretations
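These three definitions can be tested directly by enumerating interpretations; a Python sketch where a sentence is a function from an interpretation (a dict of truth values) to a truth value, and `classify` is an illustrative name:

```python
from itertools import product

def classify(sentence, symbols):
    """Classify a propositional sentence by enumerating all interpretations."""
    truths = [sentence(dict(zip(symbols, values)))
              for values in product([True, False], repeat=len(symbols))]
    if all(truths):
        return "valid"          # true in all interpretations
    if any(truths):
        return "satisfiable"    # true in at least one interpretation
    return "unsatisfiable"      # false in all interpretations

print(classify(lambda m: m["P"] or not m["P"], ["P"]))    # valid
print(classify(lambda m: m["P"] and m["Q"], ["P", "Q"]))  # satisfiable
print(classify(lambda m: m["P"] and not m["P"], ["P"]))   # unsatisfiable
```

Note that a valid sentence is also satisfiable; the function reports the strongest applicable label.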
Entailment
• Entailment means that one thing follows from another:
  KB ⊨ α
• Knowledge base KB entails sentence α if and only if α is true in all worlds where KB is true
• For example: (P ∧ Q) ⊨ (P ∨ R)
• Entailment is a relationship between sentences (i.e., syntax) that is based on semantics
Models
• Models are formal definitions of possible states of the world
• We say m is a model of a sentence α if α is true in m
• M(α) is the set of all models of α
• Then KB ⊨ α if and only if M(KB) ⊆ M(α)
[Figure: Venn diagram with M(KB) contained inside M(α)]
Inference
• KB ⊢i α : sentence α can be derived from KB by inference procedure i
• For example: ¬(A ∨ B) ⊢ (¬A ∧ ¬B)
• Soundness: KB ⊢ f → KB ⊨ f
  i.e. all conclusions arrived at via the proof procedure are correct: they are logical consequences.
• Completeness: KB ⊨ f → KB ⊢ f
  i.e. every logical consequence can be generated by the proof procedure.
Propositional Logic: Syntax
Propositional Logic: Semantics
Truth Tables
A      B      ¬A     A∧B    A∨B    A⇒B
True   True   False  True   True   True
True   False  False  False  True   False
False  False  True   False  False  True
False  True   True   False  True   True
Propositional Inference: Truth Table Method
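The truth-table method can be sketched directly from the definition of entailment: KB ⊨ α iff α holds in every model of KB. A minimal Python version, assuming sentences are given as functions over a model dict (`tt_entails` is an illustrative name):

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """KB ⊨ α iff α is true in every model of KB (truth-table enumeration)."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # found a model of KB where α fails
    return True

# (P ∧ Q) ⊨ (P ∨ R), as in the entailment example earlier:
kb = lambda m: m["P"] and m["Q"]
alpha = lambda m: m["P"] or m["R"]
print(tt_entails(kb, alpha, ["P", "Q", "R"]))  # True
```

This enumeration is sound and complete for propositional logic, at the cost of checking 2^n models.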
Logical equivalence
• Two sentences are logically equivalent iff both are true in the same models:
  α ≡ β iff α ⊨ β and β ⊨ α
Wumpus world sentences
• Let Pi,j be true if there is a pit in [i,j]
• Let Bi,j be true if there is a breeze in [i,j]
• R1: ¬P1,1
• "Pits cause breezes in adjacent squares":
  • R2: B1,1 ⇔ (P1,2 ∨ P2,1)
  • R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
• Include breeze percepts for the first 2 moves:
  • R4: ¬B1,1
  • R5: B2,1
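Under these five rules, plain enumeration reproduces the agent's reasoning. A Python sketch (names like P12 encode Pi,j; the helper names are illustrative):

```python
from itertools import product

symbols = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

def kb(m):
    """R1–R5 from the slide, conjoined."""
    return (not m["P11"]                                        # R1
            and m["B11"] == (m["P12"] or m["P21"])              # R2
            and m["B21"] == (m["P11"] or m["P22"] or m["P31"])  # R3
            and not m["B11"]                                    # R4
            and m["B21"])                                       # R5

def entails(query):
    """KB ⊨ query iff query holds in every model of the KB."""
    models = (dict(zip(symbols, vals))
              for vals in product([True, False], repeat=len(symbols)))
    return all(query(m) for m in models if kb(m))

print(entails(lambda m: not m["P12"]))  # True: no pit in [1,2]
print(entails(lambda m: not m["P22"]))  # False: a pit in [2,2] is consistent with R1–R5
```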
Normal Forms
Horn Sentences
Example: Conversion to CNF
Any KB can be converted into CNF.
B1,1 ⇔ (P1,2 ∨ P2,1)
1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
   (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
3. Move ¬ inwards using De Morgan's rules and double negation:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
4. Apply the distributive law (∨ over ∧) and flatten:
   (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
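The conversion can be sanity-checked by confirming that the final CNF agrees with the original biconditional in all eight interpretations; a quick Python check (variable names are illustrative):

```python
from itertools import product

original = lambda b11, p12, p21: b11 == (p12 or p21)   # B1,1 ⇔ (P1,2 ∨ P2,1)
cnf = lambda b11, p12, p21: ((not b11 or p12 or p21)   # ¬B1,1 ∨ P1,2 ∨ P2,1
                             and (not p12 or b11)      # ¬P1,2 ∨ B1,1
                             and (not p21 or b11))     # ¬P2,1 ∨ B1,1

# Equivalent iff they agree in all 8 interpretations:
print(all(original(*vals) == cnf(*vals)
          for vals in product([True, False], repeat=3)))  # True
```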
Validity and Satisfiability
Complementary Literals
• A literal is either an atomic sentence or a negated atomic sentence, e.g.:
  P, ¬P
• Two literals are complementary if one is the negation of the other, e.g.:
  P and ¬P
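Complementary literals are exactly what the resolution rule cancels when combining CNF clauses. A small Python sketch, where a clause is a frozenset of (symbol, polarity) literals (this representation and the name `resolve` are illustrative, not from the lecture):

```python
def resolve(c1, c2):
    """Return all resolvents of two clauses over complementary literal pairs."""
    resolvents = []
    for (name, polarity) in c1:
        if (name, not polarity) in c2:
            # Drop the complementary pair and union the remaining literals.
            resolvents.append((c1 - {(name, polarity)})
                              | (c2 - {(name, not polarity)}))
    return resolvents

# (P ∨ Q) and (¬P ∨ R) resolve on the pair P, ¬P to give (Q ∨ R):
c1 = frozenset({("P", True), ("Q", True)})
c2 = frozenset({("P", False), ("R", True)})
print(resolve(c1, c2) == [frozenset({("Q", True), ("R", True)})])  # True
```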
Reasoning Patterns/Rules of Inference
Proofs using Inference in Wumpus World
Proof Methods
Proof methods divide into (roughly) two kinds:
• Model checking:
  • Truth table enumeration (sound and complete for propositional logic)
  • Heuristic search in model space (sound but incomplete)
• Application of inference rules:
  • Legitimate (sound) generation of new sentences from old
  • A proof is a sequence of inference rule applications
  • Inference rules can be used as operators in a standard search algorithm!
PL is Too Weak a Representational Language
• Consider the problem of representing the following information:
  • Every person is mortal. (S1)
  • Confucius is a person. (S2)
  • Confucius is mortal. (S3)
• S3 is clearly a logical consequence of S1 and S2. But how can these sentences be represented using PL so that we can infer the third sentence from the first two?
• We can use symbols P, Q, and R to denote the three propositions, but this leads us nowhere, because the knowledge needed to infer R from P and Q (i.e., the relationship between being a human and mortality, and the membership relation between Confucius and the class of humans) is not expressed in a way that can be used by inference rules.
Weakness of PL
Summary
• Looked at the requirements of a representation language
• Knowledge is expressed as sentences in a knowledge representation language and stored in a knowledge base.
• A knowledge-based agent is composed of a knowledge base and an inference mechanism.
• Inference is the process of deriving new sentences from old ones.
• Propositional logic is a very simple language consisting of propositional symbols and logical connectives.