OSU-MURI-Srivastava_2008ver1


A BAYESIAN APPROACH FOR FUSION OF FEATURES IN IMAGES FOR SHAPE CLASSIFICATION
ANUJ SRIVASTAVA
Department of Statistics
Florida State University
PROBLEM MOTIVATION
Data
Front-End Processing: Feature Detection
An important problem is:
HOW TO COMBINE FEATURES INTO HYPOTHESES OF INTEREST?
SHREK WISDOM
"Man, that ain't nothing but a bunch of little dots."
"You know, Donkey, sometimes things are more than they appear."
LIST OF TOPICS
A. Problem Introduction and Motivation
B. Classification via Bayesian Formulation:
1. Nuisance integral involving shape, pose and sampling
2. Stochastic models for shapes (past work)
3. Stochastic models for sampling shapes (current work)
C. Estimation of Class Posterior
1. Synthesize possible configurations from prior
2. Joint transformation-registration.
D. Experimental Results (Simulated and Real Data)
TOY PROBLEM
TOY PROBLEM: ANOTHER EXAMPLE
Ordering is lost and clutter points are added.
Two-Dimensional Point Cloud
PROBLEM FORMULATION
Is there a shape of interest in this cloud?
Given:
-- A list of 16 shape classes (Kimia database), e.g. bottle, cat, glass, fountain, mickey
-- Training shapes (exemplars) for each class
OUR CHALLENGES
Clutter Rejection: m = 40 observed points, n = 20 object points, giving on the order of 10^11 possible subsets
Ordering: on the order of 10^18 possible orderings of the selected points
Classification
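As a rough sanity check on these counts, here is a minimal sketch; the pairing of 10^11 with subset selection and 10^18 with orderings is an assumption based on m = 40 and n = 20.

    from math import comb, factorial

    m, n = 40, 20
    subsets = comb(m, n)        # ways to choose which n of the m points lie on the object
    orderings = factorial(n)    # ways to order the n selected points along the curve
    print(f"{subsets:.2e} subsets, {orderings:.2e} orderings")  # ~1.38e+11, ~2.43e+18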
PROBLEM MOTIVATION
Image → Edge Detection → Thinning → Random Selection
A simple, fast pre-processing of images will generate point clouds.
Typical scenario: the scene is dominated by a single object, so a large proportion of points falls on the object's boundary.
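A minimal sketch of such a pre-processing step, assuming OpenCV for edge detection; the thresholds, the point budget n, and the use of Canny (whose output is already thin) are illustrative choices, not the exact pipeline of the talk.

    import cv2
    import numpy as np

    def image_to_point_cloud(image_path, n=100, seed=0):
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        edges = cv2.Canny(gray, 100, 200)               # edge detection (thin edges)
        ys, xs = np.nonzero(edges)                       # coordinates of edge pixels
        pts = np.column_stack([xs, ys]).astype(float)
        rng = np.random.default_rng(seed)
        keep = rng.choice(len(pts), size=min(n, len(pts)), replace=False)
        return pts[keep]                                 # random selection of ~n points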
LIST OF TOPICS
A. Problem Introduction and Motivation
B. Classification via Bayesian Formulation
1. Nuisance integral involving shape, pose and sampling
2. Stochastic models for shapes (past work)
3. Stochastic models for sampling shapes (current work)
C. Estimation of Class Posterior
1. Synthesize possible configurations from prior
2. Joint transformation-registration.
D. Experimental Results (Simulated and Real Data)
OUR APPROACH
Analysis by Synthesis: Grenander's Pattern Theory
• Select a shape class
• Generate a shape in that class
• Sample that shape using points
• Match this point set with the data set
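A schematic version of the analysis-by-synthesis loop above, written as a sketch: sample_shape, sample_points and likelihood are hypothetical callables standing in for the class-conditional shape model, the sampling model, and the data likelihood described later.

    import numpy as np

    def class_posteriors(data, classes, sample_shape, sample_points, likelihood,
                         n_synth=100, prior=None):
        prior = prior or {c: 1.0 / len(classes) for c in classes}
        scores = {}
        for c in classes:
            # Monte Carlo estimate of P(data | class c): average the likelihood of
            # point configurations synthesized from that class's shape/sampling models.
            lik = np.mean([likelihood(data, sample_points(sample_shape(c)))
                           for _ in range(n_synth)])
            scores[c] = prior[c] * lik
        total = sum(scores.values())
        return {c: s / total for c, s in scores.items()}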
BAYESIAN CLASSIFICATION
MAP Estimation of Shape Class
Posterior Probability, where
q is the shape of the curve,
g is the placement (pose) of the curve,
γ is the sampling function.
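A sketch of the posterior in the assumed notation above (y denoting the observed point cloud; the exact factorization on the slide may differ): the class posterior involves a nuisance integral over shape, placement, and sampling,

    P(k \mid y) \;\propto\; P(k) \int P(y \mid q, g, \gamma)\, P(q \mid k)\, P(\gamma \mid k)\; dq \, dg \, d\gamma,
    \qquad \hat{k} = \arg\max_k P(k \mid y).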
BAYESIAN CLASSIFICATION
• Develop probability models for shapes in each class.
• Develop probability models for sampling curves in each class.
• Synthesize point configurations from joint shape-sampling models.
• Compare synthesized configurations with given data and evaluate them under a likelihood function.
• Use this to estimate the posterior for each class.
MODELING VARIABILITY IN SHAPES
Given exemplars (training shapes) for each class:
We want to develop a conditional probability density on shapes, given the class.
PAST WORK ON SHAPE ANALYSIS
1. Klassen et al., Analysis of Shapes Using Geodesic Paths on Shape Spaces, IEEE Transactions on PAMI, 2004.
2. Srivastava et al., Statistical Analysis of Shapes: Clustering, Learning and Testing, IEEE Transactions on PAMI, 2005.
3. Mio et al., On Shape of Plane Elastic Curves, International Journal of Computer Vision, 2007.
4. Kaziska and Srivastava, Classification of Cyclostationary Processes on Shape Manifolds for Gait Recognition, JASA, 2007.
5. Joshi and Srivastava, Intrinsic Bayesian Active Contours, International Journal of Computer Vision, accepted for publication, 2008.
SQUARE-ROOT ELASTIC (SRE) FRAMEWORK
For a parameterized curve β(t), define

    q(t) = \dot{\beta}(t) / \sqrt{|\dot{\beta}(t)|}

Properties:
• Translations are already removed (q depends only on the derivative of β)
• Curves are scaled to unit length – removes scaling
  Unit sphere! (since \int_0^1 |q(t)|^2 dt equals the curve length = 1, q lies on the unit hypersphere in L^2)
• Only rotation and reparametrization are left
  (the action of the re-parameterization group is an isometry)
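A minimal numerical sketch of this representation for a discretely sampled curve (an (N, 2) array of points); the finite-difference details are illustrative.

    import numpy as np

    def sre_representation(beta):
        """q(t) = beta'(t) / sqrt(|beta'(t)|), rescaled so the curve has unit length."""
        t = np.linspace(0.0, 1.0, len(beta))
        vel = np.gradient(beta, t, axis=0)                   # beta'(t)
        speed = np.linalg.norm(vel, axis=1)
        q = vel / np.sqrt(np.maximum(speed, 1e-12))[:, None]
        length = np.trapz(speed, t)                          # integral of |q|^2 equals the length
        return q / np.sqrt(length)                           # now q lies on the unit sphere in L^2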
STATISTICAL SUMMARIES OF SHAPES
Karcher Mean:
Shape variations are studied in the tangent space.
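For reference, the Karcher mean used here is, in assumed notation (d the geodesic distance on the shape space, q_1, ..., q_n the training shapes),

    \mu = \arg\min_{q} \sum_{i=1}^{n} d(q, q_i)^2,

i.e., the point on the shape manifold minimizing the sum of squared geodesic distances to the exemplars; shape variations are then encoded in the tangent space at μ.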
STATISTICAL SUMMARIES OF SHAPES
Mean Shape
Eigen modes of shape variations
WRAPPED NORMAL DISTRIBUTION
Random samples from a WRAPPED NORMAL distribution
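A sketch of how such samples can be drawn, assuming the mean shape mu lies on a unit sphere (as in the SRE framework) with an orthonormal tangent basis and per-direction standard deviations sigma estimated from the eigen modes; the helper below is illustrative.

    import numpy as np

    def sample_wrapped_normal(mu, basis, sigma, rng=None):
        """Draw a Gaussian tangent vector at mu and wrap it onto the sphere via the exponential map."""
        rng = rng or np.random.default_rng()
        v = (rng.standard_normal(len(sigma)) * sigma) @ basis   # tangent vector at mu
        nv = np.linalg.norm(v)
        if nv < 1e-12:
            return mu
        return np.cos(nv) * mu + np.sin(nv) * (v / nv)          # exponential map on the unit sphere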
THREE SOURCES OF VARIABILITY
• Shapes • Pose (placement) • Sampling
SAMPLING A CURVE
Consider a unit-length curve β, parameterized by arc length.
Uniform sampling of the curve: take points at equally spaced parameter values.
Non-uniform sampling of the curve: take a positive diffeomorphism γ of [0,1] and sample at the warped parameter values γ(i/n).
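A small sketch of the two sampling schemes, where beta is a callable arc-length parameterization on [0,1] and gamma is a sampling function (an increasing map of [0,1] onto itself); both names are placeholders.

    import numpy as np

    def sample_curve(beta, n, gamma=None):
        t = np.linspace(0.0, 1.0, n)
        if gamma is not None:
            t = gamma(t)                     # non-uniform: warp the parameter values
        return np.array([beta(ti) for ti in t])

For example, gamma(t) = t**2 concentrates samples near the start of the curve, while gamma = None gives uniform arc-length sampling.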
SAMPLING A CURVE
• The variability in sampling a curve is represented by the sampling function γ.
• To model sampling, we need to impose a probability model on γ.
• Note that γ is a cumulative distribution function on [0,1]; its derivative is a probability density on [0,1].
• We can use the Fisher-Rao metric on probability densities to impose a Riemannian structure on the space of sampling functions.
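A standard device in this setting, stated here as background with assumed notation, is the square-root map, under which the Fisher-Rao metric becomes the ordinary L^2 metric:

    \psi = \sqrt{\dot{\gamma}}, \qquad \int_0^1 \psi(t)^2 \, dt = \gamma(1) - \gamma(0) = 1,

so each sampling function maps to a point on the unit sphere in L^2([0,1]), where geodesics and means are straightforward to compute.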
A PRIOR MODEL ON SAMPLING FUNCTION
Goal: Develop a probability model on the sampling functions γ using the given exemplars in that shape class.
How can we learn such a model from the training shapes?
• Step 1: Compute a preferred sampling function for each given curve.
• Step 2: Compute an average (Karcher mean) of these sampling functions.
• Step 3: Define a "wrapped normal" density on the space of sampling functions using the estimated mean and variance.
A PREFERRED SAMPLING FUNCTION
For a curve β, let κ(s) denote its curvature function.
We prefer a sampling function whose density is inversely proportional to the exponential of the curvature:

    \gamma : [0,1] \to [0,1], \qquad \gamma(t) = \frac{1}{Z} \int_0^t \exp\!\big(-|\kappa(s)|/\sigma\big)\, ds,

where σ is a scale parameter and Z is the normalizing constant making γ(1) = 1.
For each training shape, we can compute such a sampling function.
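A numerical sketch of this construction for a discretely sampled, arc-length parameterized curve ((N, 2) array); sigma is the scale parameter in the formula above, and the finite-difference curvature estimate is illustrative.

    import numpy as np

    def preferred_sampling_function(beta, sigma=1.0):
        t = np.linspace(0.0, 1.0, len(beta))
        vel = np.gradient(beta, t, axis=0)
        acc = np.gradient(vel, t, axis=0)
        speed = np.maximum(np.linalg.norm(vel, axis=1), 1e-12)
        kappa = np.abs(vel[:, 0] * acc[:, 1] - vel[:, 1] * acc[:, 0]) / speed**3  # planar curvature
        density = np.exp(-kappa / sigma)            # inversely proportional to exp(|kappa| / sigma)
        gamma = np.cumsum(density)                  # discrete version of the integral
        gamma -= gamma[0]
        return gamma / gamma[-1]                    # normalize so that gamma(1) = 1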
KARCHER MEAN OF OBSERVED SAMPLING
FUNCTIONS
For three different shape classes (origins kept fixed)
LIST OF TOPICS
A. Problem Introduction and Motivation
B. Classification via Bayesian Formulation:
1. Nuisance integral involving shape, pose and sampling
2. Stochastic models for shapes (past work)
3. Stochastic models for sampling shapes (current work)
C. Estimation of Class Posterior
1. Synthesize possible configurations from prior
2. Joint transformation-registration.
D. Experimental Results (Simulated and Real Data)
BAYESIAN CLASSIFICATION
MAP Estimation of Shape Class
Posterior Probability, where
q is the shape of the curve,
ĝ is the optimal placement of the curve (the MLE, replacing the integral over placements),
γ is the sampling function.
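In the assumed notation, the change from the earlier formulation is that the integral over placements is replaced by a maximization, roughly

    P(k \mid y) \;\propto\; P(k) \int P(y \mid q, \hat{g}, \gamma)\, P(q \mid k)\, P(\gamma \mid k)\; dq \, d\gamma,
    \qquad \hat{g} = \arg\max_{g} P(y \mid q, g, \gamma).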
OBSERVATION MODEL / LIKELIHOOD
Observation Model:
• For observations coming from an object: each observed point is a (transformed) point on the synthesized curve plus additive Gaussian noise.
• For observations coming from background clutter, we assume an independent homogeneous Poisson process.
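A concrete instance of this two-part model, with assumed notation (x_i the synthesized boundary points, π the registration, σ the noise level, λ the clutter intensity over the image window A):

    y_j = x_{\pi(j)} + \epsilon_j, \qquad \epsilon_j \sim \mathcal{N}(0, \sigma^2 I_2) \quad \text{(object points)},
    \{ y_j \}_{\text{clutter}} \sim \text{homogeneous Poisson process on } A \text{ with intensity } \lambda.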
OPTIMAL MATCHING OF POINTS
• Need to search over all rotations, translations, and scalings of the synthesized point set
• Solve for the optimal transformation and registration
• Evaluate the likelihood function
JOINT TRANSFORMATION & REGISTRATION
Iterative optimization:
• Fix the transformation and register points using the Hungarian algorithm
• Fix the registration and transform points using the Procrustes method
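A sketch of this alternation, assuming SciPy's Hungarian solver (linear_sum_assignment) and a similarity Procrustes step (rotation, scale, translation), with the reflection check omitted for brevity; model is the (n, 2) synthesized point set and data the (m, 2) observed cloud with m >= n.

    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    def register_and_align(model, data, n_iters=10):
        aligned = np.asarray(model, dtype=float).copy()
        data = np.asarray(data, dtype=float)
        cols = None
        for _ in range(n_iters):
            # Fix the transformation; register points with the Hungarian algorithm.
            _, cols = linear_sum_assignment(cdist(aligned, data))
            matched = data[cols]
            # Fix the registration; align with Procrustes (rotation, scale, translation).
            mu_a, mu_b = aligned.mean(axis=0), matched.mean(axis=0)
            A, B = aligned - mu_a, matched - mu_b
            U, S, Vt = np.linalg.svd(A.T @ B)
            R = Vt.T @ U.T                          # optimal rotation (2 x 2)
            s = S.sum() / (A ** 2).sum()            # optimal scale
            aligned = s * (A @ R.T) + mu_b          # transformed model points
        return aligned, cols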
LIST OF TOPICS
A. Problem Introduction and Motivation
B. Classification via Bayesian Formulation:
1. Nuisance integral involving shape, pose and sampling
2. Stochastic models for shapes (past work)
3. Stochastic models for sampling shapes (current work)
C. Estimation of Class Posterior
1. Synthesize possible configurations from prior
2. Joint transformation-registration.
D. Experimental Results (Simulated and Real Data)
EXPERIMENTAL RESULTS
Simulated Data (examples shown: chopper, bottle, glass)
EXPERIMENTAL RESULTS
Real Image Data
SUMMARY
• Presented a framework for finding shapes in point clouds
• Developed mathematical representations and probability models on shapes
• Developed mathematical representations and probability models for sampling functions
• Used random realizations from these models to synthesize point configurations
• Used the Hungarian algorithm to compute the likelihood function
• Estimated the posterior probability of having a particular shape class in the given point cloud