Introduction to CLIPS (Chapter 7)
Tues. March 9, 1999
The Logic of Probability Theory
– Foundations, Notation and Definitions
– Axioms and Theorems
– Conditional Probability, Independence
– Chain Rule
– Probability Trees
– Bayes’ Theorem
Thursday: Discrete Random Events
Differing Views of Probabilities
Frequentist View
P = W/N is the probability of event W, where in an experiment or trial:
– W = number of times the event W occurs
– N = total number of trials
Examples: roll of a die; toss of a thumbtack
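The frequentist W/N ratio can be sketched with a short simulation. The die, the "even roll" event, and the trial count below are illustrative assumptions, not part of the lecture:

```python
import random

def frequentist_estimate(event, trial, n_trials, seed=0):
    """Estimate P(event) as W/N, where W counts trials in which
    the event occurred and N is the total number of trials."""
    rng = random.Random(seed)
    w = sum(1 for _ in range(n_trials) if event(trial(rng)))
    return w / n_trials

# Roll of a fair die: estimate Pr(roll is even), which should
# approach 1/2 as N grows.
p_even = frequentist_estimate(
    event=lambda roll: roll % 2 == 0,
    trial=lambda rng: rng.randint(1, 6),
    n_trials=100_000,
)
print(p_even)  # close to 0.5
```

The same harness applied to a thumbtack toss would require empirical trial data, since no symmetry argument fixes the point-up probability in advance.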
Subjective or Bayesian Definition
A probability is a measure assigned to an individual's subjective assessment of the likelihood of an event based on his state of information. The probability measure:
– (1) depends on the state of information,
– (2) may change with new information,
– (3) may vary among individuals, and
– (4) corresponds to the areas on a Venn diagram.
Conditional Probability Notation
I will use the following notation to represent probabilities:
Pr(A|B,S) — the probability of an event A given that B has occurred, for a given state of information S.
Axioms of Probability Theory
(1) For any event A, the probability measure is a real non-negative number such that Pr(A|S) ≥ 0.
(2) Pr(I|S) = 1, where I = the universe of all events.
(3) If AB = ø (i.e., if A and B are mutually exclusive), then Pr(A+B|S) = Pr(A|S) + Pr(B|S).
Axioms in Terms of Venn Diagrams
[Venn diagram: universe I containing events A and B]
(1) Pr(A|S), Pr(B|S) ≥ 0
(2) Pr(I|S) = 1
(3) Pr(A+B|S) = Pr(A|S) + Pr(B|S) if A and B are mutually exclusive
(1) restricts the area in any region to be non-negative. (2) requires that the area of the entire diagram, corresponding to the universe of all events "I", be normalized to one. (3) implies that the area contained in two nonoverlapping regions be the sum of their individual areas.
Fundamental Theorems
(1) Pr(A'|S) = 1 − Pr(A|S)
(2) Pr(ø|S) = 0
(3) Pr(A + B|S) = Pr(A|S) + Pr(B|S) − Pr(AB|S)
Conditional Probability
Pr(B|A,S) = Pr(AB|S) / Pr(A|S)
Pr(A|B,S) = Pr(AB|S) / Pr(B|S)
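These ratio definitions can be computed directly on a small sample space. The die-roll events below are illustrative assumptions; exact fractions are used so the ratios come out clean:

```python
from fractions import Fraction

I = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}      # "roll is even"
B = {4, 5, 6}      # "roll is greater than 3"

def pr(event):
    """Exact probability on the equally likely universe I."""
    return Fraction(len(event), len(I))

# Pr(B|A,S) = Pr(AB|S) / Pr(A|S): given an even roll, how likely > 3?
pr_B_given_A = pr(A & B) / pr(A)
# Pr(A|B,S) = Pr(AB|S) / Pr(B|S)
pr_A_given_B = pr(A & B) / pr(B)

print(pr_B_given_A)  # 2/3
print(pr_A_given_B)  # 2/3
```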
Venn Diagrams and Conditional Probabilities
The conditional probability can be shown in the Venn diagram as the ratio of the area shared by both events (intersection) to the area of the conditioned event. We can look at conditional probability as a means of "renormalizing" the probability measure when new information shows that an event is known with certainty (i.e., with a probability equal to one).
[Venn diagrams: conditioning on A rescales the universe to I = A, giving Pr(AB|A,S); conditioning on B rescales it to I = B, giving Pr(AB|B,S)]
Independence
If the probability of the product of two events is the product of their probabilities, then the two events are independent.
– Pr(AB|S) = Pr(A|S) Pr(B|S) implies independence
In other words, A is probabilistically independent of B if having knowledge about B gives you no new information about A, and vice versa.
Independence and Conditional Probability
Pr(B|A,S) = Pr(AB|S) / Pr(A|S) = Pr(A|S) Pr(B|S) / Pr(A|S) = Pr(B|S)
Pr(A|B,S) = Pr(AB|S) / Pr(B|S) = Pr(A|S) Pr(B|S) / Pr(B|S) = Pr(A|S)
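Both consequences of independence can be verified numerically. The two-dice sample space below is an illustrative assumption: A depends only on the first die and B only on the second, so the product rule holds exactly:

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent rolls of a fair die.
I = set(product(range(1, 7), repeat=2))
A = {(d1, d2) for (d1, d2) in I if d1 % 2 == 0}   # first die even
B = {(d1, d2) for (d1, d2) in I if d2 > 4}        # second die > 4

def pr(event):
    return Fraction(len(event), len(I))

# Product rule: Pr(AB|S) = Pr(A|S) Pr(B|S), so A and B are independent...
assert pr(A & B) == pr(A) * pr(B)
# ...and conditioning carries no new information:
assert pr(A & B) / pr(A) == pr(B)   # Pr(B|A,S) = Pr(B|S)
assert pr(A & B) / pr(B) == pr(A)   # Pr(A|B,S) = Pr(A|S)
print("independence verified")
```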
Influence Diagrams and Independence
Conditional probabilistic influence and independence
can be explicitly represented as expansions of joint
probability distributions in influence diagrams.
The strong piece of information is the lack of an
influence or the identification of (conditional)
independence.
We must assume that influences may be present if no
information indicating independence is provided.
[Influence diagrams over nodes x and y: (a) arc from y to x: Pr(x,y|S) = Pr(x|y,S) Pr(y|S); (b) no arc: Pr(x,y|S) = Pr(x|S) Pr(y|S)]
Definition: Independence
Consider two (uncertain) state variables x
and y. y does not influence x if x is
independent of y given S, i.e., if Pr(x|y,S) =
Pr(x|S).
Chain Rule
The concept of conditional probability can be expanded to accommodate any number of events. For example, with three events A, B, and C:
Pr(ABC|S) = Pr(AB|C,S) Pr(C|S) = Pr(A|BC,S) Pr(B|C,S) Pr(C|S)   [1]
or
Pr(ABC|S) = Pr(BC|A,S) Pr(A|S) = Pr(B|CA,S) Pr(C|A,S) Pr(A|S)   [2]
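Expansion [1] can be checked against an explicit joint distribution. The joint probabilities below are illustrative assumptions (arbitrary weights summing to one over three binary events):

```python
from fractions import Fraction
from itertools import product

# A small joint distribution Pr(ABC|S) over three binary events;
# weights are arbitrary assumed values that sum to 36/36 = 1.
outcomes = list(product([0, 1], repeat=3))          # (a, b, c) triples
weights = [1, 2, 3, 4, 5, 6, 7, 8]
joint = {o: Fraction(w, 36) for o, w in zip(outcomes, weights)}

def pr(pred):
    """Probability of the set of outcomes satisfying pred."""
    return sum(p for o, p in joint.items() if pred(o))

# Pieces of expansion [1]:
pr_abc = pr(lambda o: o == (1, 1, 1))               # Pr(ABC|S)
pr_bc  = pr(lambda o: o[1] == 1 and o[2] == 1)      # Pr(BC|S)
pr_c   = pr(lambda o: o[2] == 1)                    # Pr(C|S)

pr_a_given_bc = pr_abc / pr_bc                      # Pr(A|BC,S)
pr_b_given_c  = pr_bc / pr_c                        # Pr(B|C,S)

# Chain rule [1]: Pr(ABC|S) = Pr(A|BC,S) Pr(B|C,S) Pr(C|S)
assert pr_abc == pr_a_given_bc * pr_b_given_c * pr_c
print("chain rule verified")
```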
Three Influence Diagram Expansions for Pr(ABC|S)
There are six different expansion orderings possible if conditional independence is not considered. If conditional independence exists, any of the three arcs may be deleted. For example, the joint expansion of Pr(ABC|S) where B is independent of C is as described in equation [3] below:
Pr(ABC|S) = Pr(AB|C,S) Pr(C|S) = Pr(A|BC,S) Pr(B|S) Pr(C|S)   [3]
[Influence diagrams over nodes A, B, C: (a) expansion [eqn. 1], (b) expansion [eqn. 2], (c) expansion [eqn. 3]]
Probability Trees
A probability tree is a succession of circular
nodes (uncertain state variables) with
branches.
The branches emanating from each node
represent the different possible values of the
uncertain variables associated with the
node.
Probability Tree: Pr(ABC|S) = Pr(A|BC,S) Pr(B|C,S) Pr(C|S)
[Probability tree: the root branches on C (C1, C2) with probabilities Pr(C|S); each C branch splits on B (B1, B2) with Pr(B|C,S); each B branch splits on A (A1, A2, A3) with Pr(A|BC,S). Each leaf is a joint event, e.g. A1B1C1, A2B1C1, A3B1C1, with joint probabilities Pr(A1B1C1|S), Pr(A2B1C1|S), Pr(A3B1C1|S), and so on.]
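A leaf probability is the product of the branch probabilities along its path from the root. A minimal sketch of the tree above, with all branch probabilities chosen as illustrative assumptions:

```python
from fractions import Fraction as F

# Assumed branch probabilities for the tree C -> B -> A.
pr_c = {"C1": F(3, 10), "C2": F(7, 10)}                      # Pr(C|S)
pr_b_given_c = {                                             # Pr(B|C,S)
    "C1": {"B1": F(1, 2), "B2": F(1, 2)},
    "C2": {"B1": F(1, 4), "B2": F(3, 4)},
}
pr_a_given_bc = {                                            # Pr(A|BC,S)
    (b, c): {"A1": F(1, 3), "A2": F(1, 3), "A3": F(1, 3)}
    for b in ("B1", "B2") for c in ("C1", "C2")
}

# Each leaf is one path C -> B -> A; its probability is the product
# of the branch probabilities along the path.
leaves = {
    (a, b, c): pr_a_given_bc[(b, c)][a] * pr_b_given_c[c][b] * pr_c[c]
    for c in pr_c
    for b in pr_b_given_c[c]
    for a in pr_a_given_bc[(b, c)]
}

assert sum(leaves.values()) == 1        # the leaves exhaust all events
print(leaves[("A1", "B1", "C1")])       # Pr(A1 B1 C1 | S) = 1/20
```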
Bayes’ Theorem
Bayes' theorem is the single most important relation in probabilistic expert systems. It forms the basis for probabilistic inference.
Consider N mutually exclusive and collectively exhaustive events A1, A2, . . ., AN:
AiAj = ø, i ≠ j, i, j ∈ [1, N]   (mutually exclusive)
A1 + A2 + . . . + AN = I   (collectively exhaustive)
Bayes’ Question
Consider another event B that may overlap several events Ai, i = 1, 2, . . ., N.
If we know all the probabilities Pr(Ai|S) and Pr(B|Ai,S) (i.e., we know all of the corresponding area relationships in the Venn diagram above), then we know
Pr(AiB|S) = Pr(Ai|S) Pr(B|Ai,S)
Bayes’ Question
We know:
Pr(Ai|S) and Pr(B|Ai,S) and thus
Pr(AiB|S) = Pr(Ai|S)Pr(B|Ai,S)
What is Pr(Ai|B,S)?
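Bayes' theorem answers this question: Pr(Ai|B,S) = Pr(Ai|S) Pr(B|Ai,S) / Σj Pr(Aj|S) Pr(B|Aj,S), where the denominator is Pr(B|S) by total probability. A minimal numeric sketch, with the priors and likelihoods below chosen as illustrative assumptions:

```python
from fractions import Fraction as F

# Assumed priors Pr(Ai|S) for three mutually exclusive, collectively
# exhaustive events, and assumed likelihoods Pr(B|Ai,S).
prior = {"A1": F(1, 2), "A2": F(3, 10), "A3": F(1, 5)}
likelihood = {"A1": F(1, 10), "A2": F(1, 2), "A3": F(9, 10)}

# Pr(B|S) by total probability over the exhaustive partition.
pr_b = sum(prior[a] * likelihood[a] for a in prior)

# Bayes' theorem: Pr(Ai|B,S) = Pr(Ai|S) Pr(B|Ai,S) / Pr(B|S)
posterior = {a: prior[a] * likelihood[a] / pr_b for a in prior}

assert sum(posterior.values()) == 1     # posteriors form a distribution
print(posterior["A3"])                  # 9/19
```

Note how observing B shifts belief toward A3, the event under which B was most likely, even though A3 had the smallest prior.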