ppt - TAMU Computer Science Faculty Pages
CPSC 643, Presentation 1
Random Sample Consensus: A Paradigm for
Model Fitting with Application to Image
Analysis and Automated Cartography
Martin A. Fischler, Robert C. Bolles
Artificial Intelligence Center
SRI International, CA
Communications of the ACM (Graphics and Image Processing),
Volume 24, Number 6, June 1981.
Martin A. Fischler
Research Focus
Artificial Intelligence, Machine Vision, Switching Theory,
Computer Organization, Information Theory
B.E.E. degree – City College of New York, NY
M.S. and Ph.D. – Stanford University, CA
Computer Scientist – SRI International since 1977
• Published the RANSAC paper first as an SRI International technical report in 1980
• Published the RANSAC paper in Graphics and Image Processing in 1981
• Currently working on Visual Odometry and Visual SLAM
Back to 1981
Computer Vision in 1981
• Focused on classification and recognition
• Science-based (hadn't gotten to applications yet)
• Initially focused largely on artificial worlds
• Images were hard to come by
• 3-D range sensing was almost viewed as cheating
• Research was driven by sponsors' interests
IBM first PC, 1981
4.77MHz
Apple II-Plus, 1981
Max of 64K RAM
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
Motivation
Least Squares Algorithm
Optimize the fit of a functional description to ALL of the
presented data:

\min_{\beta} \sum_{i=1}^{n} \Big( \sum_{j=1}^{m} X_{ij}\,\beta_j - y_i \Big)^2
Adapted from http://en.wikipedia.org/wiki/Linear_least_squares
Motivation
Least Squares Algorithm
Least squares is an averaging technique that considers all of
the presented data, and is therefore sensitive to outliers.
Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
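As a quick illustration of this sensitivity (my own sketch, not from the slides), here is a closed-form least-squares line fit in pure Python, where a single corrupted point drags the whole estimate away from the true line:

```python
# Minimal sketch: closed-form least-squares fit of a line y = a*x + b,
# showing how one gross outlier biases the estimate.

def fit_line(xs, ys):
    """Return (a, b) minimizing sum((a*x + b - y)^2)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

xs = list(range(10))
ys = [2.0 * x + 1.0 for x in xs]   # exact points on y = 2x + 1
print(fit_line(xs, ys))            # (2.0, 1.0)

ys[9] = 100.0                      # one gross outlier (true value is 19)
print(fit_line(xs, ys))            # slope and intercept badly biased
```

Because every residual enters the sum of squares, the single outlier shifts the slope by several units even though nine of the ten points are perfect.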
Motivation
Robust Estimator
• The robust function ρ behaves like squared distance for
small residuals r_i but saturates for large r_i, where r_i is the
residual of point i w.r.t. model parameters θ and σ is a scale
parameter:

\min_{\theta} \sum_{i} \rho\big(r_i(x_i, \theta);\, \sigma\big)

• Nonlinear optimization that must be solved iteratively.
• The least squares solution can be used for initialization.
Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
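The slides only state the qualitative shape of ρ; one common concrete choice (my pick for illustration, the Geman-McClure function) makes the behavior easy to see:

```python
# Sketch of one common robust function (Geman-McClure):
#   rho(r; sigma) = r^2 / (r^2 + sigma^2)
# Roughly quadratic for small residuals r, saturating toward 1 for
# large ones, so a gross outlier contributes only bounded cost.

def rho(r, sigma=1.0):
    return r * r / (r * r + sigma * sigma)

print(rho(0.1))    # ~0.0099, close to r^2 = 0.01 (quadratic regime)
print(rho(100.0))  # ~0.9999, saturated: the outlier's influence is capped
```

This bounded influence is exactly why the resulting optimization is nonlinear and is typically solved iteratively from a least-squares initialization.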
Motivation
Two types of error
• Measurement error – inliers
• Classification error – outliers
Existing Problems
• Least squares and robust estimators (at initialization) treat
inliers and outliers equally, as a whole.
• A robust estimator only discounts outliers in later
iterations, whereas fitting the inliers and rejecting the outliers
should happen in the same process.
• Why not randomly choose data subsets to fit? – RANSAC.
RANSAC
Notation
U = {x_i}   set of data points, |U| = N
p           model parameters
f           function that computes model parameters p given a sample S from U
ρ(p, x)     cost function for a single data point x
k           number of iterations
Algorithm
• Select a random sample S from U, |S| = m
• Compute model parameters p = f(S)
• Compute the cost C_k
• Stop if C_k < C* or k > k*
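The steps above can be sketched as a short program; this is a minimal illustrative implementation for 2-D line fitting (sample size m = 2), where the cost of a model is the number of points farther than a threshold from it. The threshold, iteration cap, and test data are my choices for illustration, not values from the paper:

```python
# Minimal RANSAC sketch: fit y = a*x + b to points with gross outliers.
import random

def ransac_line(points, k_max=100, thresh=0.5, seed=0):
    """Return ((a, b), cost): best line and its number of non-fitting points."""
    rnd = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(k_max):
        (x1, y1), (x2, y2) = rnd.sample(points, 2)  # random minimal sample, m = 2
        if x1 == x2:
            continue                                # degenerate sample, resample
        a = (y2 - y1) / (x2 - x1)                   # model parameters p = f(S)
        b = y1 - a * x1
        # Cost C_k: count points NOT within thresh of the current model.
        cost = sum(1 for x, y in points if abs(a * x + b - y) > thresh)
        if cost < best_cost:                        # keep the best model so far
            best, best_cost = (a, b), cost
    return best, best_cost

# Ten exact inliers on y = 2x + 1, plus two gross outliers.
pts = [(x, 2.0 * x + 1.0) for x in range(10)] + [(3.0, 40.0), (7.0, -20.0)]
(a, b), outliers = ransac_line(pts)
print(a, b, outliers)   # ~2.0, ~1.0, with the 2 outliers rejected
```

Any sample of two exact inliers recovers the true line, so with enough iterations the best model fits all ten inliers and leaves only the two outliers outside the threshold.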
RANSAC
Example
• Select data subset
• Calculate model parameters p
• Calculate cost for each data point
• Select the data that fit the current model
• Repeat sampling
• Stop when C_k < C* or k > k*
Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC
How many iterations?
• The expected number of iterations k is a function of the
sample size m and the fraction of outliers ε:

E(k) = (1 - \varepsilon)^{-m}

• Choose k so that, with probability p, at least one random
sample is free from outliers:

\big(1 - (1 - \varepsilon)^m\big)^k = 1 - p

k = \log(1 - p) \,/\, \log\big(1 - (1 - \varepsilon)^m\big)
Required k by sample size m and proportion of outliers ε (p = 0.99):

 m     5%   10%   20%   25%   30%   40%    50%
 2      2    3     5     6     7    11     17
 3      3    4     7     9    11    19     35
 4      3    5     9    13    17    34     72
 5      4    6    12    17    26    57    146
 6      4    7    16    24    37    97    293
 7      4    8    20    33    54   163    588
 8      5    9    26    44    78   272   1177
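The formula for k can be evaluated directly; this small sketch reproduces representative table entries. For example, with p = 0.99, m = 2 and 50% outliers it gives 17 samples:

```python
# Sketch of the iteration bound: k = log(1 - p) / log(1 - (1 - eps)^m),
# the number of samples needed so that, with probability p, at least
# one sample of size m is outlier-free (eps = outlier fraction).
import math

def num_iterations(m, eps, p=0.99):
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - eps) ** m))

print(num_iterations(2, 0.50))  # 17
print(num_iterations(8, 0.50))  # 1177
print(num_iterations(2, 0.05))  # 2
```

Note how k grows roughly exponentially in the sample size m when the outlier fraction is high, which is why RANSAC favors minimal sample sizes.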
RANSAC
Application: Location Determination Problem
• Existence proofs of multiple solutions for the P3P, P4P, and
P5P problems.
• An algorithm for solving the general P3P.
• An algorithm for solving the planar P4P problem.
• An automatic gross-error filtering technique (RANSAC).
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
RANSAC
Results: Location Determination Problem
Final result (deviations)
X: 0.1 ft    Heading: 0.01°
Y: 0.1 ft    Pitch: 0.10°
Z: 0.1 ft    Roll: 0.12°
Adapted from http://www.ai.sri.com/people/fischler/
RANSAC
Other Applications
Adapted from http://graphics.cs.cmu.edu/courses/15-463/2006_fall/www/463.html
RANSAC
Other Applications
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
RANSAC
Pros
• Simple and general.
• Applicable to many different problems.
• Often works well in practice.
Cons
• Sometimes too many iterations are required.
• Can fail for extremely low inlier ratios.
• Lots of parameters to tune.
• Can't always get a good initialization of the model.
• We can often do better than brute-force sampling.
END
For more information please visit the website of 25 Years
RANSAC Workshop: http://cmp.felk.cvut.cz/ransac-cvpr2006/
1981 Ford F150
Adapted from http://www.lmctruck.com/Fordcust_photos.htm