Fields of Experts: A Framework for Learning Image Priors
2006. 7. 10 (Mon)
Young Ki Baik, Computer Vision Lab.

Fields of Experts
• References
  • On the Spatial Statistics of Optical Flow, Stefan Roth and Michael J. Black (ICCV 2005)
  • Fields of Experts: A Framework for Learning Image Priors, Stefan Roth and Michael J. Black (CVPR 2005)
  • Products of Experts, G. Hinton (ICANN 1999)
  • Training Products of Experts by Minimizing Contrastive Divergence, G. Hinton (Neural Computation 2002)
  • Sparse Coding with an Over-Complete Basis Set, B. Olshausen and D. Field (Vision Research 1997)

Fields of Experts
• Contents
  • Introduction
  • Products of Experts
  • Fields of Experts
  • Application: Image denoising
  • Summary

Fields of Experts
• Introduction (Image denoising)
  • Spatial filters
    • Gaussian, Mean, Median, ...

Fields of Experts
• Introduction (Image denoising)
  • Given a noisy image $Y$, recover the true image $X$ by maximizing the posterior:

$$p(X \mid Y) \propto p(Y \mid X)\, p(X)$$

Fields of Experts
• Introduction
  • Target
    • Develop a framework for learning rich, generic image priors (potential functions) that capture the statistics of natural scenes.
  • Special features
    • Combines sparse coding methods and Products of Experts.
    • An extended version of Products of Experts.
    • An MRF (Markov Random Field) model with a learned potential function, designed to solve the problems of the conventional PoE.

Fields of Experts
• Sparse Coding
  • Sparse coding represents an image patch as a linear combination of learned filters (or bases):

$$\min_{a,J} E(a, J) = \sum_j \Big\| x_j - \sum_i a_{i,j} J_i \Big\|^2$$

where $j$ indexes the patch vectors, $i$ indexes the filters, $a_{i,j}$ are the coefficients, and $J_i$ are the filters (or bases).

  • The aim is to express the image probability with a small number of parameters.
  • It can be viewed as an example of a mixture model (a minimal sketch of the objective follows).

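To make the objective concrete, here is a minimal NumPy sketch that evaluates the reconstruction error for a batch of flattened patches. The array names and shapes are illustrative only, not from the paper; real filters would be learned rather than random.

```python
import numpy as np

def sparse_coding_energy(patches, coeffs, filters):
    """Reconstruction error of patches as linear combinations of filters.

    patches: (num_patches, patch_dim), one flattened patch x_j per row
    coeffs:  (num_filters, num_patches), coefficients a_{i,j}
    filters: (num_filters, patch_dim), learned bases J_i
    """
    recon = coeffs.T @ filters          # row j holds sum_i a_{i,j} J_i
    return np.sum((patches - recon) ** 2)

# Toy usage with random data (shapes only; real bases come from learning)
rng = np.random.default_rng(0)
patches = rng.normal(size=(100, 25))    # 100 patches of 5x5 pixels
filters = rng.normal(size=(24, 25))     # 24 basis filters
coeffs = rng.normal(size=(24, 100))
print(sparse_coding_energy(patches, coeffs, filters))
```
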
Fields of Experts
• Products of Experts
  • Mixture model
    • Builds a model of a complicated data distribution by combining several simple models.
    • Mixture models take a weighted sum of the distributions (toy sketch below):

$$p(x) = \sum_m \alpha_m\, p_m(x), \qquad \alpha_m: \text{mixture proportion}$$

[Figure: mixture model. Scale each distribution down and add them together.]

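A minimal 1-D illustration of this weighted-sum construction; the means, variances, and weights are toy values, not from the slides:

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Weighted sum of two component densities; the weights sum to 1,
# so the mixture is already a normalized density.
x = np.linspace(-6.0, 6.0, 1001)
alphas = (0.3, 0.7)
p_mix = alphas[0] * gaussian(x, -2.0, 1.0) + alphas[1] * gaussian(x, 2.0, 1.5)
```
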
Fields of Experts
• Products of Experts
  • Mixture model
    • Mixture models are very inefficient in high-dimensional spaces.

Fields of Experts
• Products of Experts
  • PoE model
    • Builds a model of a complicated data distribution by combining several simple models.
    • Multiplies the distributions together and renormalizes.
    • The product is much sharper than the individual distributions.

[Figure: product model. Multiply the two densities together at every point and then renormalize.]

Fields of Experts
• Products of Experts
  • PoE model
    • PoEs work well on high-dimensional distributions.
    • A normalization term is needed to convert the product of the individual densities into a combined density (toy sketch below):

$$p(x) = \frac{1}{Z} \prod_m p_m(x)$$

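Continuing the 1-D toy example, a minimal sketch of the product construction, with $Z$ computed numerically on the grid:

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Multiply two densities pointwise, then renormalize; the product is
# sharper (lower variance) than either individual Gaussian.
x = np.linspace(-6.0, 6.0, 1001)
p_prod = gaussian(x, -1.0, 2.0) * gaussian(x, 1.0, 2.0)
p_prod /= np.trapz(p_prod, x)   # the 1/Z normalization
```
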
Fields of Experts
• Products of Experts
  • Geoffrey E. Hinton: Products of Experts
    • Most perceptual systems produce a sharp posterior distribution on a high-dimensional manifold.
    • The PoE model is therefore well suited to vision problems.

Fields of Experts
• Products of Experts
  • PoE framework for vision problems:

$$p(x) = \frac{1}{Z(\Theta)} \prod_{i=1}^{N} \phi_i\!\left(J_i^T x;\, \alpha_i\right), \qquad \Theta = \{\theta_1, \ldots, \theta_N\}, \quad \theta_i = \{\alpha_i, J_i\}$$

Learning Sparse Topographic Representations with Products of Student-t Distributions, M. Welling, G. Hinton, and S. Osindero (NIPS 2003)

Fields of Experts
• Products of Experts
  • PoE framework for vision problems
    • Experts: Student-t distribution
    • Responses of linear filters applied to natural images typically resemble Student-t experts (sketch below):

$$\phi_i\!\left(J_i^T x;\, \alpha_i\right) = \left(1 + \frac{1}{2}\left(J_i^T x\right)^2\right)^{-\alpha_i}$$

Learning Sparse Topographic Representations with Products of Student-t Distributions, M. Welling, G. Hinton, and S. Osindero (NIPS 2003)

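A minimal sketch of this expert as a function of a filter response; the α value here is arbitrary, whereas in the model it is learned:

```python
import numpy as np

def student_t_expert(response, alpha):
    """Student-t expert: phi_i(J_i^T x; alpha_i) = (1 + 0.5 r^2)^(-alpha)."""
    return (1.0 + 0.5 * np.asarray(response) ** 2) ** (-alpha)

# Heavy tails: a response of 5 is penalized far less severely than it
# would be under a Gaussian model of filter responses.
print(student_t_expert([0.0, 1.0, 5.0], alpha=1.0))
```
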
Fields of Experts
• Products of Experts
  • PoE framework for vision problems
    • Probability density in Gibbs form (sketch below):

$$p(x) = \frac{1}{Z(\Theta)} \prod_{i=1}^{N} \phi_i\!\left(J_i^T x;\, \alpha_i\right) = \frac{1}{Z(\Theta)} \exp\!\left(-E_{\mathrm{PoE}}(x, \Theta)\right)$$

$$E_{\mathrm{PoE}}(x, \Theta) = -\sum_{i=1}^{N} \log \phi_i\!\left(J_i^T x;\, \alpha_i\right)$$

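Under the Student-t experts above, the energy of a single patch reduces to a few lines. A minimal sketch with illustrative array shapes; the filters and α values are assumed given:

```python
import numpy as np

def poe_energy(patch, filters, alphas):
    """E_PoE(x, Theta) = -sum_i log phi_i(J_i^T x; alpha_i) for one patch.

    patch:   flattened patch vector x, shape (d,)
    filters: rows are the filters J_i, shape (N, d)
    alphas:  expert parameters alpha_i, shape (N,)
    """
    responses = filters @ patch                      # J_i^T x for every i
    # -log phi_i = alpha_i * log(1 + 0.5 * response^2)
    return np.sum(alphas * np.log1p(0.5 * responses ** 2))
```
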
Fields of Experts
• Products of Experts
  • Problems
    • PoE is a patch-based method.
    • To cover the whole image region, the patch must be set either to the entire image or to a collection of patches at specific locations.

Fields of Experts
• Products of Experts
  • Problems
    • The number of parameters to learn would be too large.
    • The model would only work for one specific image size and would not generalize to other image sizes.
    • The model would not be translation invariant, which is a desirable property for generic image priors.

Fields of Experts
• Fields of Experts
  • Key idea
    • Combine the PoE with MRF models:

$$G = (V, E)$$

where $V$ is the set of nodes (the pixels in an image) and $E$ is the set of edges connecting the nodes.

[Figure: graph with nodes $V$ connected by edges $E$.]

Fields of Experts
• Fields of Experts
  • Key idea
    • Define a neighborhood system that connects all nodes in an m × m rectangular region.
    • This defines the maximal cliques $x^{(k)}$ in the graph, $k = 1, \ldots, K$.

[Figure: overlapping m × m cliques on the node grid.]

Fields of Experts
• Fields of Experts
  • The Hammersley-Clifford theorem
    • Write the probability density of the graphical model as a Gibbs distribution:

$$p(x) = \frac{1}{Z} \exp\!\left(-\sum_k V_k\!\left(x^{(k)}\right)\right)$$

where $x$ is an image and $V_k(x^{(k)})$ is the potential function for clique $x^{(k)}$.

  • Translation invariance of an MRF model
    • Assume the potential function is the same for all cliques:

$$V_k\!\left(x^{(k)}\right) = V\!\left(x^{(k)}\right)$$

Fields of Experts
• Fields of Experts
  • Potential function $V(x^{(k)})$
    • Learned from training images:

$$V\!\left(x^{(k)}\right) = E_{\mathrm{PoE}}\!\left(x^{(k)}, \Theta\right)$$

    • Probability density of a full image under the FoE (sketch below):

$$p(x) = \frac{1}{Z(\Theta)} \exp\!\left(-E_{\mathrm{FoE}}(x, \Theta)\right), \qquad E_{\mathrm{FoE}}(x, \Theta) = -\sum_k \sum_{i=1}^{N} \log \phi_i\!\left(J_i^T x^{(k)};\, \alpha_i\right)$$

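Because every clique is an m × m window, evaluating $J_i^T x^{(k)}$ for all cliques $k$ amounts to filtering the image with $J_i$. A minimal sketch of the FoE energy under the Student-t experts; the filters and α values would come from learning and are just function arguments here:

```python
import numpy as np
from scipy.signal import correlate2d

def foe_energy(image, filters, alphas):
    """E_FoE(x, Theta): Student-t PoE energy summed over all m x m cliques."""
    energy = 0.0
    for J, alpha in zip(filters, alphas):
        # 'valid' mode yields exactly one response per clique x^(k);
        # correlation (not convolution) matches the inner product J_i^T x^(k)
        responses = correlate2d(image, J, mode='valid')
        energy += np.sum(alpha * np.log1p(0.5 * responses ** 2))
    return energy
```
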
Fields of Experts
• Learning
  • The parameters $\alpha_i$ and filters $J_i$ can be learned from a set of training images by maximizing the likelihood.
  • Maximizing the likelihood for the PoE and the FoE model is equivalent.
  • Perform gradient ascent on the log-likelihood (sketch below):

$$\delta \theta_i = \eta \left[ \left\langle \frac{\partial E_{\mathrm{FoE}}}{\partial \theta_i} \right\rangle_{p} - \left\langle \frac{\partial E_{\mathrm{FoE}}}{\partial \theta_i} \right\rangle_{X} \right]$$

where $\eta$ is a user-defined learning rate, $\langle \cdot \rangle_p$ is the expectation with respect to the model distribution $p(x)$, and $\langle \cdot \rangle_X$ is the average over the training data $X$.

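The update itself is a one-liner once the two expectations are estimated. A minimal sketch; how the model-side expectation $\langle \cdot \rangle_p$ is approximated (e.g., by the contrastive-divergence sampling cited in the references) is elided here:

```python
def gradient_ascent_step(theta, grad_model, grad_data, eta=0.01):
    """One step: theta += eta * (<dE/dtheta>_p - <dE/dtheta>_X).

    grad_model: estimated expectation of dE/dtheta under the model p(x)
    grad_data:  average of dE/dtheta over the training images X
    """
    return theta + eta * (grad_model - grad_data)
```
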
Fields of Experts
• Application
  • Image denoising
    • Given an observed noisy image $y$, find the true image $x$ that maximizes the posterior probability:

$$p(x \mid y) \propto p(y \mid x)\, p(x)$$

    • Assumption: the true image has been corrupted by additive, i.i.d. Gaussian noise with zero mean and known standard deviation:

$$p(y \mid x) \propto \prod_j \exp\!\left(-\frac{1}{2\sigma^2}\left(y_j - x_j\right)^2\right)$$

Fields of Experts
• Application
  • Image denoising
    • To maximize the posterior probability, gradient ascent on the logarithm of the posterior is used.
    • The gradient of the log-likelihood:

$$\nabla_x \log p(y \mid x) = \frac{1}{\sigma^2}(y - x)$$

    • The gradient of the log-prior:

$$\nabla_x \log p(x) = \sum_{i=1}^{N} J_i^{-} * \psi_i\!\left(J_i * x\right)$$

where $J_i * x$ is the convolution of image $x$ with filter $J_i$, $J_i^{-}$ is the filter obtained by mirroring $J_i$ around its center, and $\psi_i(y) = \frac{\partial}{\partial y} \log \phi_i(y; \alpha_i)$.

Fields of Experts
• Application
  • Image denoising
    • The gradient ascent denoising algorithm (sketch below):

$$x^{(t+1)} = x^{(t)} + \eta \left[ \sum_{i=1}^{N} J_i^{-} * \psi_i\!\left(J_i * x^{(t)}\right) + \frac{\lambda}{\sigma^2}\left(y - x^{(t)}\right) \right]$$

The two bracketed terms are $\nabla_x \log p(x)$ and $\lambda\, \nabla_x \log p(y \mid x)$; $t$ is the iteration index, $\eta$ the update rate, and $\lambda$ an optional weight.

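A minimal end-to-end sketch of this update loop, assuming the Student-t experts from earlier. For the paper's learned filters, substitute the real $J_i$ and $\alpha_i$; the 'same'-mode padding is a simplification of the paper's boundary handling:

```python
import numpy as np
from scipy.signal import convolve2d

def psi(r, alpha):
    """psi_i(y) = d/dy log phi_i(y; alpha) for a Student-t expert."""
    return -alpha * r / (1.0 + 0.5 * r ** 2)

def denoise(y, filters, alphas, sigma, eta=0.05, lam=1.0, n_iters=200):
    """Gradient ascent on the log-posterior, as in the update rule above."""
    x = y.copy()
    for _ in range(n_iters):
        prior_grad = np.zeros_like(x)
        for J, alpha in zip(filters, alphas):
            resp = convolve2d(x, J, mode='same')             # J_i * x
            J_mirror = J[::-1, ::-1]                         # J_i^-
            prior_grad += convolve2d(psi(resp, alpha), J_mirror, mode='same')
        x = x + eta * (prior_grad + (lam / sigma ** 2) * (y - x))
    return x
```
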
Fields of Experts
• Application
  • Image denoising

[Figure: (a) original image, (b) noisy image, (c) denoised image.]

Fields of Experts
• Summary
  • Contributions
    • Points out the limitations of the conventional Products of Experts: PoE focuses on modeling small image patches rather than defining a prior model over an entire image.
    • Proposes the FoE, which models the prior probability of an entire image in terms of a random field with overlapping cliques, whose potentials are represented as a PoE.