A Hybrid Algorithm to Compute Marginal and Joint Beliefs in Bayesian Networks and Its Complexity
Mark Bloemeke
Marco Valtorta
Artificial Intelligence Laboratory
University of South Carolina

Presentation by Sreeja Vallabhan
December 4, 2003
Instructor: Marco Valtorta
Abstract
Methods (algorithms) for updating probability in a Bayesian network:
• Using a structure (a clique tree) and performing local, message-based calculations to extract the belief in each variable.
• Using non-serial dynamic programming techniques to extract the belief in some desired group of variables.
Goal
Present a hybrid algorithm based on non-serial dynamic programming techniques that retains the ability to retrieve the belief in every single variable.
Symbolic Probabilistic Inference (SPI)
Consider a Bayesian network with DAG $G = (V, E)$ and conditional probability tables $P(v_i \mid \pi(v_i))$, where $\pi(v_i)$ denotes the parents of $v_i$ in $G$.
Total joint probability, using the chain rule of Bayesian networks:
$$P(V) = \prod_{v_i \in V} P(v_i \mid \pi(v_i)) \qquad (1)$$
Using marginalization, the belief in any subset of variables $V'$ is retrieved as
$$P(V') = \sum_{v \in V \setminus V'} P(V) \qquad (2)$$
SPI is based on these two equations.
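A minimal Python sketch of these two equations (not from the paper), using a hypothetical two-variable network A → B with made-up tables: the chain rule assembles the joint, and marginalization recovers the belief in B.

```python
# Hypothetical two-node network A -> B; the CPT numbers below are made up.
p_a = {0: 0.6, 1: 0.4}                           # P(A)
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,
               (0, 1): 0.2, (1, 1): 0.8}         # P(B | A), keyed by (b, a)

def joint(a, b):
    """Equation (1): the chain rule, P(A, B) = P(A) * P(B | A)."""
    return p_a[a] * p_b_given_a[(b, a)]

def marginal_b(b):
    """Equation (2): sum the joint over the variables outside V' = {B}."""
    return sum(joint(a, b) for a in (0, 1))

for b in (0, 1):
    print(b, marginal_b(b))   # the two values sum to 1
```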
Symbolic Probabilistic Inference (SPI)
In SPI, to maintain control over the size and time complexity of the resulting tables:
• Variables are ordered before calculations.
• Summations are pushed down into the products (a small worked example follows).
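As a small illustration of the second point (the tables and operation counts here are illustrative, not from the slides): for tables $f(A)$ and $g(B)$ over binary variables,
$$\sum_{B} f(A)\, g(B) = f(A) \sum_{B} g(B),$$
and evaluating the left-hand side as written costs four multiplications and two additions (one product and one sum for each value of $A$), while the right-hand side costs one addition followed by two multiplications.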
Symbolic Probabilistic Inference (SPI)
Consider the following Bayesian network: a five-node DAG over $A, B, C, D, E$ with edges $A \to B$, $A \to C$, $B \to D$, $C \to D$, and $C \to E$ (the structure implied by the factorization below).
From equations (1) and (2), the joint probability of the variables $A$ and $C$ is
$$P(A, C) = \sum_{B, D, E} P(E \mid C)\, P(D \mid B, C)\, P(C \mid A)\, P(B \mid A)\, P(A)$$
Assuming that each variable has two states, computing $P(A, C)$ in this way requires a total of 92 significant operations.
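As a hedged sketch of this brute-force evaluation (the CPT values below are invented; only the network structure and the summation order follow the slide), the full five-way product is formed for every joint state of B, D, and E:

```python
from itertools import product

# Invented CPTs for the binary network A -> B, A -> C, {B, C} -> D, C -> E.
p_a    = {0: 0.3, 1: 0.7}                                            # P(A)
p_b_a  = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.4, (1, 1): 0.6}        # P(B | A), key (b, a)
p_c_a  = {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.1, (1, 1): 0.9}        # P(C | A), key (c, a)
p_d_bc = {(d, b, c): 0.5 for d, b, c in product((0, 1), repeat=3)}   # P(D | B, C), key (d, b, c)
p_e_c  = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.25, (1, 1): 0.75}      # P(E | C), key (e, c)

def p_ac_naive(a, c):
    """P(A=a, C=c): sum the full product over B, D, E exactly as written above."""
    return sum(
        p_e_c[(e, c)] * p_d_bc[(d, b, c)] * p_c_a[(c, a)] * p_b_a[(b, a)] * p_a[a]
        for b, d, e in product((0, 1), repeat=3)
    )

# Sanity check: the four entries of P(A, C) sum to 1.
print(sum(p_ac_naive(a, c) for a, c in product((0, 1), repeat=2)))
```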
Symbolic Probabilistic Inference (SPI)
With a single re-ordering of the terms combined by equation (1), followed by distribution of the summations from (2):
$$P(A, C) = P(A) \Big[ P(C \mid A) \sum_{E} \big[ P(E \mid C) \sum_{B} \big[ P(B \mid A) \sum_{D} P(D \mid B, C) \big] \big] \Big]$$
This requires only 32 significant operations.
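Continuing the same sketch (reusing the CPT dictionaries and p_ac_naive defined above), the re-ordered factoring pushes each summation as far inward as possible while producing the identical table:

```python
def p_ac_factored(a, c):
    """The re-ordered factoring from the slide, with summations pushed inward."""
    inner_b = sum(p_b_a[(b, a)] * sum(p_d_bc[(d, b, c)] for d in (0, 1))
                  for b in (0, 1))
    inner_e = sum(p_e_c[(e, c)] * inner_b for e in (0, 1))
    return p_a[a] * p_c_a[(c, a)] * inner_e

# Both orderings produce the same values; only the operation count differs.
for a in (0, 1):
    for c in (0, 1):
        assert abs(p_ac_factored(a, c) - p_ac_naive(a, c)) < 1e-12
```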
Factor Trees
A two-stage method for deriving the desired joint and single beliefs:
• Creation of the factor tree.
• A passing algorithm on the factor tree to retrieve the desired joint and single beliefs.
Factor Trees Algorithm
• Start by calculating the optimal factoring order for the network, given the target set of variables whose joint is desired.
• Construct a binary tree showing the combination of the initial probability tables into conformal tables.
• Label the edges between tables along which variables are marginalized with the variables marginalized out before the combination.
• Add an additional head above the current root (a conformal table labeled with the target set of variables); this head contains no variables and its edge carries an empty label. (A minimal data-structure sketch follows below.)
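A minimal data-structure sketch of the tree just described (class and field names are mine, not the paper's; the tables themselves are left abstract): each node records the variables labeling its table, its children, and the variables summed out of each child along the connecting edge.

```python
from dataclasses import dataclass, field

@dataclass
class FactorNode:
    # Variables labeling this table (an initial CPT at the leaves,
    # a conformal table at internal nodes).
    variables: frozenset
    children: list = field(default_factory=list)
    # For each child, the variables marginalized out of it before the
    # combination, i.e. the label on the edge down to that child.
    sum_out: list = field(default_factory=list)

def combine(left, right, sum_out_left=frozenset(), sum_out_right=frozenset()):
    """Build the conformal parent of two tables and record the edge labels."""
    parent_vars = (left.variables - sum_out_left) | (right.variables - sum_out_right)
    return FactorNode(parent_vars, [left, right], [sum_out_left, sum_out_right])

# One combination step for the network above: join P(B | A) with P(D | B, C),
# marginalizing D out of the latter before the combination.
leaf_b = FactorNode(frozenset({"A", "B"}))       # table for P(B | A)
leaf_d = FactorNode(frozenset({"B", "C", "D"}))  # table for P(D | B, C)
print(combine(leaf_b, leaf_d, frozenset(), frozenset({"D"})).variables)
```

Under the same assumptions, the additional head from the last step would simply be a FactorNode(frozenset()) placed above the final root, joined by an edge with an empty label.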
Factor Trees Algorithm