Transcript M 1

Shape Analysis and Retrieval
(600.658)
(Michael) Misha Kazhdan
Short Bio
• Undergraduate degree in mathematics
• Started Ph.D. in mathematics
• Switched to computer graphics
Research
Research Focus
+ Methods for automatically analyzing 3D models
− Not methods for visualization
Past research
• Shape representations
• Shape alignment
• Shape matching
• Symmetry detection
Seminar
Shape matching:
Given a database of 3D models and a query
shape, determine which database models are
most similar to the query.
Applications
• Entertainment
• Medicine
• Chemistry/Biology
• Archaeology
• Etc.
Applications
• Entertainment
– Model generation
• Medicine
• Chemistry/Biology
• Archaeology
• Etc.
Movie Courtesy of Summoner
Applications
• Entertainment
• Medicine
– Automated diagnosis
• Chemistry/Biology
• Archaeology
• Etc.
Images courtesy of NLM
Applications
• Entertainment
• Medicine
• Chemistry/Biology
– Docking and binding
• Archaeology
• Etc.
Image Courtesy of PDB
Applications
• Entertainment
• Medicine
• Chemistry/Biology
• Archaeology
  – Reconstruction
• Etc.
Image Courtesy of Stanford
Seminar
• Whole shape matching
– How do you test if two models are similar?
• Alignment
• Partial shape matching
Seminar
• Whole shape matching
• Alignment
– How do you match across transformations that
do not change the shape of a model?
• Partial shape matching
Seminar
• Whole shape matching
• Alignment
• Partial shape matching
– How do you test if one model is a subset of
another model?

Course Structure
Paper presentation:
• Two papers a week
• Everybody reads
• Students present
Final project:
• New method / implementation of existing ones
• Proposals due October 19th
• Presented December 6th, 7th (last week of classes)
About you
Background:
– Graphics?
– Mathematics?
– Coding?
Specific interests?
Undergrad/Masters/Ph.D.?
Year?
Shape Matching
General approach:
Define a function that takes in two models
and returns a measure of their proximity.
D(M1, M2) < D(M1, M3): M1 is closer to M2 than it is to M3
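As a concrete illustration of this idea, here is a minimal sketch in Python. It assumes each model has already been summarized by a fixed-length shape descriptor (how that descriptor is computed is not specified here), and the name shape_distance is illustrative, not from the slides.

import numpy as np

def shape_distance(desc_a, desc_b):
    # Illustrative stand-in for D(M1, M2): the L2 distance between two
    # fixed-length shape descriptors. Smaller values mean the models are
    # considered more similar.
    return float(np.linalg.norm(np.asarray(desc_a) - np.asarray(desc_b)))

# If shape_distance(d1, d2) < shape_distance(d1, d3),
# model 1 is treated as closer to model 2 than to model 3.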
Database Retrieval
• Compute the distance from the query to
each database model
[Figure: distances D(Q, Mi) from the 3D query Q to each of the database models M1, ..., Mn]
Database Retrieval
• Sort the database models by proximity
Sorted models M̃1, ..., M̃n satisfy D(Q, M̃i) ≤ D(Q, M̃j) whenever i ≤ j
[Figure: the 3D query Q, the database models M1, ..., Mn, and the sorted models M̃1, ..., M̃n]
Database Retrieval
• Return the closest matches
D(Q, M̃i) ≤ D(Q, M̃j) whenever i ≤ j
[Figure: the 3D query Q, the database models, the sorted models M̃1, ..., M̃n, and the best match(es) returned]
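The three retrieval steps above fit in a few lines. The sketch below reuses the illustrative shape_distance from earlier; the function and parameter names are assumptions for this example, not part of the slides.

def retrieve(query_desc, database_descs, k=3):
    # Step 1: compute the distance from the query to each database model.
    dists = [(shape_distance(query_desc, d), i) for i, d in enumerate(database_descs)]
    # Step 2: sort the database models by proximity,
    # so that D(Q, M~i) <= D(Q, M~j) whenever i <= j.
    dists.sort()
    # Step 3: return the closest match(es).
    return [i for _, i in dists[:k]]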
Evaluation
Classify models:
– Retrieval is good if the closest matches in the
database are in the same class as the query
[Figure: a query model and its nine ranked matches]
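A small sketch of this evaluation criterion, assuming the class labels of the database models are known; the names here are illustrative.

def top_k_in_class(ranked_ids, in_class_ids, k=3):
    # Fraction of the k closest matches that belong to the same class
    # as the query; higher is better under the criterion above.
    in_class = set(in_class_ids)
    return sum(1 for i in ranked_ids[:k] if i in in_class) / k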
Similarity Matrix
Given a database of models {M1,…,Mn}:
Generate the n×n matrix whose (i,j)-th entry is equal to D(Mi, Mj).
– Darkness represents similarity
– If models are sorted by class, good results give dark diagonal blocks
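A minimal sketch of building such a matrix from precomputed descriptors, again reusing the illustrative shape_distance; how it is rendered is up to whatever plotting tool is at hand.

def similarity_matrix(descs):
    # n x n matrix whose (i, j)-th entry is D(Mi, Mj).
    n = len(descs)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = shape_distance(descs[i], descs[j])
    return D

# With a grayscale colormap that maps low values to black (e.g. matplotlib's
# imshow with cmap="gray"), small distances render dark, so sorting the models
# by class should produce the dark diagonal blocks described above.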
Precision vs. Recall
A graph giving the accuracy of the retrieval.
Answers the question:
How easy is it to get back n% of the models in the query's class?
[Figure: a query model and its nine ranked matches]
Precision vs. Recall
• Precision-recall curves
– Recall = retrieved_in_class / total_in_class
– Precision = retrieved_in_class / total_retrieved
[Figure: precision-recall plot (precision vs. recall, both ranging from 0 to 1) alongside the query and its nine ranked matches]
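The sketch below computes these two quantities each time an in-class model is retrieved, which is how the worked example on the following slides proceeds; the function and argument names are illustrative.

def precision_recall(ranked_ids, in_class_ids):
    # One (recall, precision) point per in-class model retrieved.
    in_class = set(in_class_ids)
    hits = 0
    curve = []
    for rank, idx in enumerate(ranked_ids, start=1):
        if idx in in_class:
            hits += 1
            recall = hits / len(in_class)   # retrieved_in_class / total_in_class
            precision = hits / rank         # retrieved_in_class / total_retrieved
            curve.append((recall, precision))
    return curve

For instance, if the five in-class models land at ranks 1, 3, 5, 7, and 9 of the returned list, the curve is (1/5, 1/1), (2/5, 2/3), (3/5, 3/5), (4/5, 4/7), (5/5, 5/9), which matches the values on the slides that follow.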
Precision vs. Recall
• Precision-recall curves
– Recall = 0 / 5
– Precision = 0 / 0
Precision vs. Recall
• Precision-recall curves
– Recall = 1 / 5
– Precision = 1 / 1
Precision vs. Recall
• Precision-recall curves
– Recall = 2 / 5
– Precision = 2 / 3
Precision vs. Recall
• Precision-recall curves
– Recall = 3 / 5
– Precision = 3 / 5
Precision vs. Recall
• Precision-recall curves
– Recall = 4 / 5
– Precision = 4 / 7
Precision vs. Recall
• Precision-recall curves
– Recall = 5 / 5
– Precision = 5 / 9
Precision vs. Recall
Average the p/r plots over all the queries
• Recall normalizes for class size
• Graphs that are shifted up correspond to
better retrieval
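One simple way to do this averaging, sketched below, is to sample each query's curve at fixed recall levels and average the precisions. The choice of levels and the interpolation rule (best precision at or beyond each recall level) are common conventions assumed here, not something the slides specify.

def average_pr(curves, recall_levels=(0.2, 0.4, 0.6, 0.8, 1.0)):
    # curves: one (recall, precision) list per query, as returned by precision_recall.
    averaged = []
    for r in recall_levels:
        precisions = []
        for curve in curves:
            # Best precision achieved at recall >= r (0 if that recall is never reached).
            reached = [p for rec, p in curve if rec >= r]
            precisions.append(max(reached) if reached else 0.0)
        averaged.append((r, sum(precisions) / len(precisions)))
    return averaged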