Week 12: (one class) Resolution Theorem Proving. Uncertainty start
Lecture of 11/13
Resolution theorem proving (end)
Propositional Probabilistic Logic (start)
Announcements:
1. Homework 4 socket closed; Due next week
2. Homework 5 and Project 5 made available
for perusal
(these will be the only other assignments)
Rest of the semester:
~3 lectures on Uncertainty (project 5)
~3 lectures on learning
1 last class (loose ends + interactive review)
Example of FOPC Resolution..
Everyone is loved by someone
  ∀x ∃y loves(y, x); Skolemized CNF: loves(SK(x'), x')          [1]
If y loves x, y will give a valentine card to x
  ∀x ∀y loves(y, x) => GV(y, x); CNF: ~loves(y, x) V GV(y, x)   [2]
Will anyone give Rao a valentine card?
  ∃z GV(z, Rao); negated: ∀z ~GV(z, Rao); CNF: ~GV(z, Rao)      [3]
Resolve [3] with [2] under {y/z, x/Rao}: ~loves(z, Rao)         [4]
Resolve [4] with [1] under {z/SK(Rao), x'/Rao}: the empty clause,
so someone does give Rao a valentine card.
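To make the two unifications above concrete, here is a minimal Python sketch of the unification step that drives them. The encoding (variables as strings starting with '?', compound terms such as SK(x') as tuples) and all function names are my own illustration, not anything from the lecture; the occurs check is omitted for brevity.

# Terms: a variable is a string starting with '?', a constant is any other
# string, and a compound term is a tuple (functor, arg1, arg2, ...).

def substitute(term, theta):
    """Apply the substitution theta (a dict from variables to terms) to term."""
    if isinstance(term, tuple):
        return (term[0],) + tuple(substitute(a, theta) for a in term[1:])
    if isinstance(term, str) and term.startswith('?') and term in theta:
        return substitute(theta[term], theta)
    return term

def unify(x, y, theta=None):
    """Return a most general unifier of x and y extending theta, or None."""
    if theta is None:
        theta = {}
    x, y = substitute(x, theta), substitute(y, theta)
    if x == y:
        return theta
    if isinstance(x, str) and x.startswith('?'):
        return {**theta, x: y}           # bind variable x to y (no occurs check)
    if isinstance(y, str) and y.startswith('?'):
        return {**theta, y: x}
    if (isinstance(x, tuple) and isinstance(y, tuple)
            and len(x) == len(y) and x[0] == y[0]):
        for a, b in zip(x[1:], y[1:]):   # same functor and arity: unify arguments
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None                          # constant/functor mismatch

# Step 1: clause [2]'s GV(y, x) against the negated query's GV(z, Rao).
print(unify(('GV', '?y', '?x'), ('GV', '?z', 'Rao')))
# -> {'?y': '?z', '?x': 'Rao'}   (the slide's y/z, x/Rao); resolvent: ~loves(z, Rao)

# Step 2: clause [1]'s loves(SK(x'), x') against loves(z, Rao); '?xp' stands for x'.
print(unify(('loves', ('SK', '?xp'), '?xp'), ('loves', '?z', 'Rao')))
# -> {'?z': ('SK', '?xp'), '?xp': 'Rao'}, i.e. z/SK(Rao), x'/Rao: the empty clause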
Finding where you left your key..
Atkey(Home) V Atkey(Office) 1
Where is the key?
Exists x Atkey(x)
Negate
Forall x ~Atkey(x)
CNF ~Atkey(x) 2
Resolve 2 and 1 with x/Home
You get Atkey(Office) 3
Resolve 3 and 2 with x/Office
You get the empty clause
So resolution refutation “found”
that there does exist a place
where the key is…
Where is it?
what is x bound to?
x is bound to Office once
and Home once,
so x is either at Home or at Office (see the sketch below)
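To see the answer-extraction point mechanically, here is a tiny Python sketch that re-runs the refutation above and records what x is bound to at each resolution step. The list encoding and names are my own illustration, not from the lecture.

# Clause 1: Atkey(Home) V Atkey(Office); the negated query is ~Atkey(?x).
resolvent = [('Atkey', 'Home'), ('Atkey', 'Office')]
bindings = []

# ~Atkey(?x) resolves against any positive Atkey literal, binding ?x to that
# literal's argument and dropping the literal from the resolvent.
while resolvent:
    predicate, place = resolvent.pop(0)
    bindings.append(place)               # this step's substitution: ?x / place

print("empty clause derived; ?x was bound to:", bindings)
# -> empty clause derived; ?x was bound to: ['Home', 'Office']
# Resolution proves the key is *somewhere*, but the bindings only support the
# disjunctive answer "at Home or at Office", not a single location.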
Existential proofs..
• The previous example shows that resolution
refutation is powerful enough to model existential
proofs. In contrast, generalized modus ponens is
only able to model constructive proofs..
• (We also discussed a cute example of an existential
proof: can an irrational number raised to the power of
another irrational number be a rational number? We
proved that it is possible, without actually giving an
example; the classic argument is sketched below.)
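For reference, the classic non-constructive argument (presumably the one discussed in class) fits in a few lines:

Let a = \sqrt{2}^{\sqrt{2}}.
  Case 1: a is rational. Then \sqrt{2}^{\sqrt{2}} is an irrational raised to an irrational power, and it is rational: done.
  Case 2: a is irrational. Then a^{\sqrt{2}} = \sqrt{2}^{\sqrt{2}\cdot\sqrt{2}} = \sqrt{2}^{2} = 2, which is rational, so again an irrational raised to an irrational power is rational: done.
Either way such a pair exists, yet the proof never settles which case holds, so it exhibits no concrete example.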
GMP vs. Resolution Refutation
• While resolution refutation is a complete inference procedure for
FOPC, it is computationally only semi-decidable, which is a far
cry from the polynomial-time property of GMP inferences.
• So, most common uses of FOPC involve doing GMP-style
reasoning rather than the full theorem-proving..
• There is a controversy in the community as to whether the
right way to handle the computational complexity is to
– a. Develop “tractable subclasses” of languages and require the
expert to write all their knowledge in the Procrustean beds of those
sub-classes (so we can claim “complete and tractable inference”
for that class), OR
– b. Let users write their knowledge in the fully expressive FOPC, but
just do incomplete (but sound) inference.
– See Doyle & Patil’s “Two Theses of Knowledge Representation”
Probabilistic Logic
Need for modeling uncertainty
• Consider a simple scenario: You know that rain makes grass wet. Sprinklers
also make grass wet. Wet grass means a wet newspaper. You woke up one
morning and found that your newspaper (brought in by your trusted
dog/kid/spouse) was wet.
– Can we say something about whether it rained the previous day?
– Will logic allow you to do it?
• You hear the whooshing sound of the sprinklers outside the window.
– Does your belief in rain-the-previous-night change?
– Will logic capture this?
• No: our “belief” in rain has reduced… that makes it a “non-monotonic” change
• Standard logic is MONOTONIC
• (By the way, this is a form of inference called “explaining away”: increased belief in one
explanation for an observed effect reduces the belief in the competing explanations. A small
numeric sketch follows.)
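A small numeric sketch of the explaining-away effect mentioned above. All of the numbers are made-up assumptions for illustration, not from the lecture: hypothetical priors P(Rain)=0.2 and P(Sprinkler)=0.3 for two independent causes, plus an assumed conditional model for the newspaper being wet.

from itertools import product

p_rain, p_sprinkler = 0.2, 0.3                            # assumed priors
p_wet_given = {(True, True): 0.99, (True, False): 0.90,
               (False, True): 0.80, (False, False): 0.05}  # P(Wet | Rain, Sprinkler)

# Build the full joint P(Rain, Sprinkler, Wet) from the assumed model.
joint = {}
for r, s, w in product([True, False], repeat=3):
    p = (p_rain if r else 1 - p_rain) * (p_sprinkler if s else 1 - p_sprinkler)
    p *= p_wet_given[(r, s)] if w else 1 - p_wet_given[(r, s)]
    joint[(r, s, w)] = p

def prob(pred):
    """Probability of the event described by pred(rain, sprinkler, wet)."""
    return sum(p for event, p in joint.items() if pred(*event))

# Belief in rain after seeing the wet newspaper...
p_rain_given_wet = prob(lambda r, s, w: r and w) / prob(lambda r, s, w: w)
# ...and after *also* hearing the sprinkler.
p_rain_given_wet_and_sprinkler = (prob(lambda r, s, w: r and s and w)
                                  / prob(lambda r, s, w: s and w))
print(round(p_rain_given_wet, 3), round(p_rain_given_wet_and_sprinkler, 3))
# -> 0.457 0.236 (with these made-up numbers): the sprinkler "explains away"
# the wet paper, so the belief in rain drops non-monotonically.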
“Monotonic Logics”
• Standard logic is monotonic
– Given a database D and fact f, such that D|=f
• Adding new knowledge to D doesn’t reverse the entailment
– D+d |= f if D|=f
• Plausible reasoning doesn’t have this property
– Told that Tweety is a bird, we believe it will fly. Told that it is an
ostrich, we believe it doesn’t. Told that it is a magical ostrich, we
believe it does…
– Probabilistic reasoning (effectively) allows non-monotonicity
– (So does a class of logics called “default logics”—Chitta Baral is
the Big Cheese in the default logic community).
Qualification problem
--impossible to enumerate all preconditions
Ramification problem
--impossible to enumerate all effects
Frame problem
--impossible to enumerate all that stays
unchanged
Prob. Prop logic: The Gameplan
• We will review elementary “discrete variable” probability
• We will recall that the joint probability distribution is all we need to answer
any probabilistic query over a set of discrete variables (see the sketch after this list).
• We will recognize that the hardest part here is not the cost of inference
(which is really only O(2^n), no worse than that of (deterministic) prop logic).
• The real problem is assessing probabilities.
– You could need as many as 2^n numbers (if all variables are dependent on all
other variables), or just n numbers if each variable is independent of all other
variables. Generally, you are likely to need somewhere between these two
extremes.
– The challenge is to
• Recognize the “conditional independences” between the variables, and exploit them
to get by with as few input probabilities as possible and
• Use the assessed probabilities to compute the probabilities of the user queries efficiently.
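A concrete sketch of the “joint is all we need” point above: with n boolean variables the full joint is a table of 2^n numbers, and any marginal or conditional query can be answered by summing its rows. The variable names and the randomly generated joint below are purely illustrative assumptions, not anything from the lecture.

from itertools import product
import random

variables = ('A', 'B', 'C')                      # n = 3 boolean variables

# Full joint distribution: one number per assignment, 2**n entries in all.
random.seed(0)
weights = {bits: random.random()
           for bits in product([False, True], repeat=len(variables))}
total = sum(weights.values())
joint = {bits: w / total for bits, w in weights.items()}   # normalize to 1

def p(query, evidence=None):
    """P(query | evidence); both arguments map variable names to truth values."""
    evidence = evidence or {}
    def matches(bits, assignment):
        return all(bits[variables.index(v)] == val for v, val in assignment.items())
    numer = sum(pr for bits, pr in joint.items() if matches(bits, {**query, **evidence}))
    denom = sum(pr for bits, pr in joint.items() if matches(bits, evidence))
    return numer / denom

print(p({'A': True}))                            # a marginal query, P(A)
print(p({'A': True}, {'B': True, 'C': False}))   # a conditional query, P(A | B, ~C)
# Every query is answered from the same 2**n-entry table; the assessment burden
# (filling in those 2**n numbers) is the real bottleneck the bullets point to.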