Bayesian Probabilistic Reasoning and Learning


Bayesian Probabilistic Reasoning and Learning
Tang Ying
State Key Lab of CAD&CG, Zhejiang University
03/10/04
Outline
• Probability axioms
• The meaning of probability
• Forward probabilities and inverse probabilities
• Facial modeling example
Probability Axioms
• Marginal Probability – obtained by summing the joint probability over the other variables
• Conditional Probability
• Product Rule (chain rule)
– Obtained from the definition of conditional probability
• Sum Rule
– A rewriting of the marginal probability definition
• Bayes' Theorem
– Obtained from the product rule
• Independence
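For concreteness, the rules above in compact form for two variables x and y (standard statements; the notation is mine, not copied from the slides):

P(x) = \sum_{y} P(x, y)                          (marginal probability / sum rule)
P(x \mid y) = P(x, y) / P(y)                     (conditional probability)
P(x, y) = P(x \mid y)\, P(y)                     (product rule)
P(y \mid x) = \frac{P(x \mid y)\, P(y)}{P(x)}    (Bayes' theorem)
P(x, y) = P(x)\, P(y)                            (independence)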
Example
The Meaning of Probability (1)
• Describes frequencies of outcomes in random experiments
– Probability = the fraction of trials in which an event occurs, in the limit of infinitely many trials
– Tied to random variables
– Used in medicine, biology, etc., where random experiments can be repeated
The Meaning of Probability (2)
• Describes degrees of belief in propositions that do not involve random variables
– "the probability that Mr. S. was the murderer of Mrs. S., given the evidence"
– "the probability that Shakespeare's plays were written by Francis Bacon"
– "the probability that a particular signature on a particular cheque is genuine"
Belief
• Let B(X) = "belief in X", B(¬X) = "belief in not X"
1. An ordering of beliefs exists
2. B(X) = f(B(¬X))
3. B(X) = g(B(X|Y), B(Y))
Cox axioms
R.T. Cox, "Probability, frequency, and reasonable expectation," American Journal of Physics, 14(1):1-13, 1946
Bayesian Viewpoint
• You cannot do inference without making assumptions
• The real world is uncertain
– Don't have perfect information
– Don't really know the model
– Model is non-deterministic
Forward Probabilities
• Forward probability problems involve a generative model that describes a process assumed to give rise to some data; the task is to compute the probability distribution or expectation of some quantity that depends on the data.
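A minimal sketch of a forward problem (my own illustration, not from the slides): given a coin with known bias f tossed N times, compute the distribution of the number of heads k.

from math import comb

def heads_distribution(N, f):
    """Forward problem: P(k | f, N) for k = 0..N (the binomial distribution)."""
    return [comb(N, k) * f**k * (1 - f)**(N - k) for k in range(N + 1)]

# Example: N = 10 tosses of a coin with bias f = 0.3
probs = heads_distribution(10, 0.3)
expected_heads = sum(k * p for k, p in enumerate(probs))  # equals N * f = 3.0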
Inverse Probabilities
• Like forward probability problems, inverse probability problems involve a generative model of a process; but instead of computing the probability distribution of some quantity produced by the process, we compute the conditional probability of one or more of the unobserved variables in the process, given the observed variables.
• This invariably requires the use of Bayes' theorem.
Terminology of inverse probability
P(B \mid A) = \frac{P(A \mid B)\, P(B)}{P(A)}

\text{posterior} = \frac{\text{likelihood} \times \text{prior}}{\text{evidence}}

P(B): the prior probability of B
P(A \mid B): the likelihood of B
P(B \mid A): the posterior probability of B given A
P(A): the evidence
An example
• Bill tosses a coin N times, obtaining a sequence of heads and tails; suppose k heads occur in the N tosses. We assume the coin comes up heads with probability f, but we do not know f. What is the probability distribution of f?
• Assume a uniform (subjective) prior: p(f) = 1
p(f \mid k, N) = \frac{P(k \mid f, N)\, p(f)}{P(k \mid N)} = \frac{\binom{N}{k} f^{k} (1 - f)^{N - k}}{P(k \mid N)}

\hat{f} = \arg\max_{f} p(f \mid k, N), \qquad \frac{\partial\, p(f \mid k, N)}{\partial f} = 0 \;\Rightarrow\; \hat{f} = \frac{k}{N}

Maximum a Posteriori (MAP)
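A minimal numerical sketch of this coin example using a grid approximation (my own illustration; the observed counts and grid size are arbitrary choices):

import numpy as np

N, k = 10, 7                           # observed: 7 heads in 10 tosses
f = np.linspace(0, 1, 1001)            # grid over the unknown bias f
prior = np.ones_like(f)                # uniform prior p(f) = 1
likelihood = f**k * (1 - f)**(N - k)   # binomial coefficient cancels after normalization
posterior = likelihood * prior
posterior /= np.trapz(posterior, f)    # normalize by the evidence P(k | N)

f_map = f[np.argmax(posterior)]        # MAP estimate, approximately k / N = 0.7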
Learning
• Maximum a Posteriori (MAP)
– Maximize the posterior
• Maximum Likelihood (ML)
– The MAP estimate under a uniform prior
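In symbols, for parameters θ and data D (standard definitions, not copied from the slides):

\hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} p(\theta \mid D) = \arg\max_{\theta} p(D \mid \theta)\, p(\theta), \qquad
\hat{\theta}_{\mathrm{ML}} = \arg\max_{\theta} p(D \mid \theta)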
Learning in a nutshell
• Create a mathematical model
• Get data
• Solve for unknowns
Face modeling
• Blanz and Vetter, "A Morphable Model for the Synthesis of 3D Faces," SIGGRAPH 99
Generative model
• Faces come from a Gaussian
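Concretely, in a morphable-model setting a face vector is the mean face plus a linear combination of principal components with Gaussian-distributed coefficients. A minimal sketch of that generative step (array names, shapes, and the diagonal-covariance assumption are illustrative, not taken from the paper):

import numpy as np

# Assumed shapes: mean_face (d,), basis (d, m) of principal components,
# sigma (m,) of per-component standard deviations.
def sample_face(mean_face, basis, sigma, rng=None):
    """Draw one face from the Gaussian generative model."""
    if rng is None:
        rng = np.random.default_rng()
    alpha = rng.normal(0.0, sigma)    # coefficients alpha ~ N(0, diag(sigma^2))
    return mean_face + basis @ alpha  # face = mean + linear combination of components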
Learning
Bayes Rule
Learning a Gaussian
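The standard maximum-likelihood estimates for a Gaussian fitted to example faces x_1, ..., x_M (textbook formulas, not copied from the slides):

\hat{\mu} = \frac{1}{M} \sum_{i=1}^{M} x_i, \qquad
\hat{\Sigma} = \frac{1}{M} \sum_{i=1}^{M} (x_i - \hat{\mu})(x_i - \hat{\mu})^{\top}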
Maximization trick
• Maximizing the posterior is equivalent to minimizing its negative log: maximize p(θ | data) ↔ minimize −log p(θ | data)
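For a zero-mean Gaussian prior on the coefficients \alpha (the same illustrative notation as in the sketch above), the negative log is a quadratic penalty:

-\log p(\alpha) = \frac{1}{2}\, \alpha^{\top} \Sigma^{-1} \alpha + \mathrm{const}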
Fitting a face to an image
• Generative Model
Fitting a face to an image
• Maximize the posterior probability of the model parameters given the image
• Equivalently, minimize the negative log posterior
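A minimal sketch of such a fit under the linear-Gaussian assumptions used above (the rendering step, parameter names, and optimizer choice are illustrative assumptions, not the paper's actual pipeline):

import numpy as np
from scipy.optimize import minimize

def fit_face(image, mean_face, basis, sigma, noise_std=1.0):
    """MAP fit: minimize image-matching error plus the Gaussian prior penalty.
    image is assumed already flattened to a vector of the same length as mean_face."""
    def neg_log_posterior(alpha):
        rendered = mean_face + basis @ alpha             # assumed linear "rendering"
        data_term = np.sum((image - rendered) ** 2) / (2 * noise_std ** 2)
        prior_term = np.sum(alpha ** 2 / (2 * sigma ** 2))
        return data_term + prior_term
    result = minimize(neg_log_posterior, x0=np.zeros(basis.shape[1]))
    return result.x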
General features
• Models uncertainty
• Applies to any generative model
• Merges multiple sources of information
• Learns all the parameters
Caveats
• Still need to understand the model
• Not necessarily tractable
• Potentially more involved than ad hoc methods
My current work on texture compression
• Find the most reused parts in the texture image
• Use these parts as samples; for each block in the original image, find the most similar block among the samples
• If no sufficiently similar block is found within the given threshold, cut the current block and paste it into our codebook
• For each block in the original image, record its position in the codebook
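A rough sketch of this block-matching codebook idea (block size, distance measure, threshold, and function names are my own illustrative choices):

import numpy as np

def build_codebook(image, block_size=8, threshold=100.0):
    """Assign each block of the image to a codebook entry, adding new entries as needed."""
    codebook, index_map = [], {}
    h, w = image.shape[:2]
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            block = image[y:y + block_size, x:x + block_size].astype(np.float64)
            # Find the most similar existing codebook entry (sum of squared differences).
            errors = [np.sum((block - entry) ** 2) for entry in codebook]
            if errors and min(errors) <= threshold:
                index_map[(y, x)] = int(np.argmin(errors))
            else:
                codebook.append(block)                # no good match: add the block itself
                index_map[(y, x)] = len(codebook) - 1
    return codebook, index_map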
Thank you!