Transcript: Tracking

Tracking
Image Processing Seminar
2008
Oren Mor
1
What is Tracking?
• Estimating pose
• Possible from a variety of measured sensors:
o Electrical
o Mechanical
o Inertial
o Optical
o Acoustic
o Magnetic
2
Tracking applications – Examples
• Tracking missiles
• Tracking heads/hands/drumsticks
• Extracting lip motion from video
• Lots of computer vision applications
• Economics
• Navigation
SIGGRAPH 2001
3
Noise
• Each sensor has fundamental limitations related to the associated physical medium
• Information obtained from any sensor is part of a sequence of estimates
• One of the most well-known mathematical tools for stochastic estimation from noisy sensor measurements is the Kalman filter
City of Vancouver
4
Part I – Kalman Filter
5
Rudolf Emil Kalman
• Born in 1930 in Hungary
• BS and MS from MIT
• PhD in 1957 from Columbia
• Filter developed in 1960–61
• Now retired
SIGGRAPH 2001
6
What is the Kalman Filter?
• Just some applied math
• A linear system: f(a+b) = f(a) + f(b)
• Noisy data in → hopefully less noisy data out
• But delay is the price for filtering…
• A predictor-corrector estimator that minimizes the estimated error covariance
• We'll talk about the discrete Kalman filter
SIGGRAPH 2001
7
Recursive Filters
• Sequential update of the previous estimate
• Allow on-line processing of data
• Rapid adaptation to changing signal characteristics
• Consist of two steps:
o Prediction step: project the current estimate forward in time
o Update step: correct the projection with the new measurement
8
General Idea
9
The process to be estimated
• A discrete-time controlled process governed by the linear stochastic difference equation
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
o A is the state transition model applied to the previous state x_{k-1}
o B is the control-input model applied to the control vector u_{k-1}
o w is the process noise
wikipedia
10
The process to be estimated
• With a measurement z ∈ R^m that is
z_k = H x_k + v_k
• H is the observation model, which maps the state space into the observed space
wikipedia
11
The process to be estimated
• w_k and v_k are the process and measurement noise respectively
• We assume p(w) ~ N(0, Q) and p(v) ~ N(0, R)
• The noise vectors are assumed to be mutually independent
wikipedia
12
Computational Origins of the filter
• x̂_k^- is the a priori state estimate at step k given knowledge of the process prior to step k
• x̂_k is the a posteriori state estimate at step k given measurement z_k
Kalman filter web site
13
Computational Origins of the filter
• The a priori estimate error: e_k^- = x_k − x̂_k^-
• The a posteriori estimate error: e_k = x_k − x̂_k
• The a priori estimate error covariance: P_k^- = E[e_k^- (e_k^-)^T]
• The a posteriori estimate error covariance: P_k = E[e_k e_k^T]
Kalman filter web site
14
Computational Origins of the filter
• Compute the a posteriori state estimate as a linear combination of the a priori estimate and a weighted residual:
x̂_k = x̂_k^- + K (z_k − H x̂_k^-)
• The residual (z_k − H x̂_k^-) reflects the discrepancy between the predicted measurement H x̂_k^- and the actual measurement z_k
• The gain K is chosen to minimize the a posteriori error covariance P_k:
K_k = P_k^- H^T (H P_k^- H^T + R)^{-1}
Kalman filter web site
15
Interesting Observations
• As the measurement error covariance R approaches zero, K weights the residual more heavily: the actual measurement z_k is "trusted" more, while the predicted measurement H x̂_k^- is "trusted" less
• As the a priori estimate error covariance P_k^- approaches zero, K weights the residual less: the actual measurement is "trusted" less, while the predicted measurement is "trusted" more
Kalman filter web site
16
The Discrete Kalman Filter Algorithm


Time update equations are responsible for
projecting forward the current state and
error covariance – predictor equations
Measurement update equations are
responsible for the feedback – corrector
equations
Kalman filter web site
17
Predict → Correct
• Predicting the new state and its uncertainty
• Correcting with the new measurement
Kalman filter web site
18
The Discrete Kalman Filter Algorithm
• Time update (predict):
x̂_k^- = A x̂_{k-1} + B u_{k-1}
P_k^- = A P_{k-1} A^T + Q
• Measurement update (correct):
K_k = P_k^- H^T (H P_k^- H^T + R)^{-1}
x̂_k = x̂_k^- + K_k (z_k − H x̂_k^-)
P_k = (I − K_k H) P_k^-
Kalman filter web site
19
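The two update stages map directly onto code. Below is a minimal NumPy sketch of the predict/correct pair; the function and variable names are my own, not from the slides:

```python
import numpy as np

def kf_predict(x, P, A, B, u, Q):
    # Time update: project the state estimate and its covariance forward.
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    # Measurement update: correct the prediction with measurement z.
    S = H @ P_pred @ H.T + R             # residual covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)    # a posteriori state estimate
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```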
Example
• A truck on a straight, frictionless, endless road
• Starts from position 0
• Has random acceleration
• Is measured every Δt, imprecisely
• We'll derive a model from which we create a Kalman filter
wikipedia
20
Example – Model
• There is no control, so B and u are ignored
• The position and velocity are described by the linear state x_k = (x, ẋ)^T
• We assume the acceleration a_k is normally distributed with mean 0 and standard deviation σ_a
• From Newton's laws of motion:
x_k = A x_{k-1} + w_k
where
A = [1 Δt; 0 1], w_k = G a_k, G = (Δt²/2, Δt)^T
so the process noise covariance is Q = G G^T σ_a²
wikipedia
21
Example – Measurement
• At each time step we take a noisy measurement of the true position. Assume the noise is normally distributed with standard deviation σ_z:
z_k = H x_k + v_k
where H = [1 0] and R = E[v_k v_k^T] = σ_z²
wikipedia
22
Example – Initialization
• We know the starting state with perfect precision, so
x̂_0 = (0, 0)^T
and the covariance matrix is zero: P_0 = 0
• And we're ready to run the KF iterations!
wikipedia
23
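Putting the truck model together with the kf_predict/kf_update sketch above, a minimal run might look as follows; the values of Δt, σ_a, and σ_z are illustrative choices, not taken from the slides:

```python
import numpy as np

dt, sigma_a, sigma_z = 1.0, 0.2, 1.0        # illustrative parameters
A = np.array([[1.0, dt], [0.0, 1.0]])       # state transition (position, velocity)
G = np.array([[0.5 * dt**2], [dt]])         # maps acceleration into the state
Q = G @ G.T * sigma_a**2                    # process noise covariance
H = np.array([[1.0, 0.0]])                  # we observe position only
R = np.array([[sigma_z**2]])                # measurement noise covariance

x_est = np.zeros((2, 1))                    # known start: position 0, velocity 0
P = np.zeros((2, 2))                        # perfect initial knowledge
B, u = np.zeros((2, 1)), np.zeros((1, 1))   # no control input

rng = np.random.default_rng(0)
x_true = np.zeros((2, 1))
for k in range(50):
    x_true = A @ x_true + G * rng.normal(0.0, sigma_a)  # simulate the truck
    z = H @ x_true + rng.normal(0.0, sigma_z)           # imprecise position reading
    x_est, P = kf_predict(x_est, P, A, B, u, Q)
    x_est, P = kf_update(x_est, P, z, H, R)
```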
Extended Kalman filter
• Most non-trivial systems are non-linear
• In the extended Kalman filter the state transition and observation models need not be linear; they may be (differentiable) non-linear functions, which the filter linearizes around the current estimate
• The EKF is not an optimal estimator
• If the initial state or the process model is wrong, the filter quickly diverges
• The de facto standard in navigation systems and GPS
wikipedia
24
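For concreteness, a minimal sketch of one EKF iteration, assuming the caller supplies the non-linear transition and observation functions f and h together with functions returning their Jacobians (all names are my own):

```python
import numpy as np

def ekf_step(x, P, z, f, h, F_jac, H_jac, Q, R):
    # Predict: propagate the state through the non-linear model f,
    # but propagate the covariance with the local linearization F.
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Correct: use the observation model h linearized at the prediction.
    Hk = H_jac(x_pred)
    S = Hk @ P_pred @ Hk.T + R
    K = P_pred @ Hk.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new
```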
Part II – The Condensation Algorithm
25
Introduction
• We talked about the Kalman filter, which relied on a Gaussian model
• Another family of algorithms uses nonparametric models
• No restrictions to linear processes or Gaussian noise
26
Dynamic System
• State transition: x_k = f_k(x_{k-1}, v_{k-1})
• The state only depends on the previous state
• Measurement equation: z_k = h_k(x_k, n_k)
• Both functions are not necessarily linear
• The noise terms v and n on both equations are usually not Gaussian
27
Propagation in one time step
Isard and Blake, IJCV
28
General Prediction-Update Framework
• Assume that p(x_{k-1} | z_{1:k-1}) is available at time k-1
• Prediction step (using the Chapman-Kolmogorov equation):
p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}
• This is the prior on the state at time k, without knowing the measurement
• Update step: compute the posterior from the predicted prior and the new measurement:
p(x_k | z_{1:k}) ∝ p(z_k | x_k) p(x_k | z_{1:k-1})
29
Particle Filter – general concept
• If we cannot solve the integrals required for a Bayesian recursive filter analytically, we represent the posterior probabilities by a set of randomly chosen weighted samples
Vampire-project
30
Sequential Importance Sampling
• Let {x_{0:k}^i} be a set of support points (samples, particles), keeping the whole trajectory for each particle
• Let {w_k^i} be the associated weights, normalized so that Σ_i w_k^i = 1
• Then
p(x_{0:k} | z_{1:k}) ≈ Σ_i w_k^i δ(x_{0:k} − x_{0:k}^i)
(a discrete weighted approximation to the true posterior)
31
Sequential Importance Sampling
• Usually we cannot draw samples from p(x_{0:k} | z_{1:k}) directly. Assume we sample instead from a different importance function q(x_{0:k} | z_{1:k}). Our approximation is still correct if the weights are chosen as
w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k})
• The trick: we can choose q freely!
32
Factored sampling
Isard and Blake, IJCV
33
Probability distribution example
• Each sample is shown as a curve
• Thickness is proportional to the weight
• The weighted mean gives an estimator of the distribution mean
Isard and Blake, IJCV
Movie
34
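As a small aside (not on the slide), the mean estimator over such a weighted sample set is a one-line weighted average:

```python
import numpy as np

def posterior_mean(particles, weights):
    # E[x] ≈ Σ_i w_i x_i over the weighted particle set.
    return np.average(particles, weights=weights, axis=0)
```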
Sequential Importance Sampling
• If the importance function is chosen to factorize such that
q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})
then one can augment the old particles x_{0:k-1}^i by
x_k^i ~ q(x_k | x_{0:k-1}^i, z_{1:k})
to get the new particles x_{0:k}^i
35
Sequential Importance Sampling
• Weight update (after some lengthy computations):
w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, z_{1:k})
• Furthermore, if
q(x_k | x_{0:k-1}, z_{1:k}) = q(x_k | x_{k-1}, z_k)
(the importance function only depends on the last state and observation), then the weights depend only on the latest states, and we don't need to preserve the trajectories and the histories
36
SIS algorithm – Pseudo code
• For each particle i = 1, …, N_s:
o Draw x_k^i ~ q(x_k | x_{k-1}^i, z_k)
o Assign the weight w_k^i according to the weight update above
• Normalize the weights
37
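A minimal sketch of one SIS step, under the common assumption that the transition prior is used as the importance function, i.e. q(x_k | x_{k-1}, z_k) = p(x_k | x_{k-1}); propagate and likelihood are placeholders for whatever model is being tracked:

```python
import numpy as np

def sis_step(particles, weights, z, propagate, likelihood):
    # Draw x_k^i ~ p(x_k | x_{k-1}^i) for every particle.
    particles = propagate(particles)
    # With q equal to the transition prior, the weight update reduces to
    # w_k^i ∝ w_{k-1}^i * p(z_k | x_k^i).
    weights = weights * likelihood(z, particles)
    return particles, weights / weights.sum()
```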
Problem – Degeneracy
• Problem with the SIS approach: after a few iterations, most particles have negligible weight (the weight is concentrated on a few particles only)
• Countermeasures:
o Brute force: many samples
o Good choice of the importance density
o Resampling
38
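A standard way to quantify this degeneracy (not spelled out on the slide) is the effective sample size:

```python
import numpy as np

def effective_sample_size(weights):
    # N_eff = 1 / Σ w_i^2: close to N for uniform weights,
    # close to 1 when a single particle carries all the weight.
    return 1.0 / np.sum(weights**2)
```

Resampling is then typically triggered when N_eff drops below a threshold such as N/2.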
Resampling Approaches
• Whenever degeneracy rises above a threshold: replace the old set of samples (+ weights) with a new set of samples (+ weights), such that the sample density better reflects the posterior
• This eliminates particles with low weight and chooses more particles in more probable regions
39
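One widely used scheme is systematic resampling (multinomial, stratified, and residual resampling are alternatives); a minimal sketch:

```python
import numpy as np

def systematic_resample(particles, weights, rng=None):
    # One uniform offset, then evenly spaced positions across [0, 1):
    # high-weight particles are duplicated, low-weight ones die out.
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    indices = np.searchsorted(cumulative, positions)
    return particles[indices], np.full(n, 1.0 / n)  # weights reset to uniform
```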
General Particle filter – Pseudo code
• One SIS step per frame, followed by resampling whenever the degeneracy measure crosses its threshold
Movie
40
Problems
• Particles with high weight are selected more and more often, while others die out slowly (sample impoverishment)
• Resampling limits the ability to parallelize the algorithm
41
Advantages of PF
• Can deal with non-linear models
• Can deal with non-Gaussian noise
• Can be implemented in O(N_s)
• Mostly parallelizable
• Easy to implement
42
Questions?
43
Part III – Multiple Object Tracking
44
Introduction
• Difficulties for multiple object tracking:
o Objects may interact
o The objects may present more or less the same appearance
• BraMBLe: A Bayesian Multiple-Blob Tracker, M. Isard and J. MacCormick, ICCV'01
Qi Zhao
45
Problem
• The goal is to track an unknown number of blobs in video from a static camera
Qi Zhao
46
Solution
• The Bayesian Multiple-BLob (BraMBLe) tracker is a Bayesian solution
• It estimates
p(X_t | Z_t, Z_{t-1}, ..., Z_1)
where X_t is the state at frame t (the number, positions, shapes, velocities, … of the blobs) and Z_1, ..., Z_t is the image sequence
Qi Zhao
47
Modeling idea
• Bayes' rule:
p(X | Z) ∝ p(Z | X) p(X)
(posterior state distribution ∝ observation likelihood × prior)
• Sequential Bayes:
p(X_t | Z_{1:t}) ∝ p(Z_t | X_t) ∫ p(X_t | X_{t-1}) p(X_{t-1} | Z_{1:t-1}) dX_{t-1}
• Instead of modeling p(X_t | Z_{1:t}) directly, BraMBLe models the likelihood p(Z_t | X_t) and the prediction model p(X_t | X_{t-1})
Qi Zhao
48
Object Model X
• The blob configuration is
X = (m, x^1, ..., x^m)
where m is the number of objects and x^i is the state of object i
(figure: a scene with m = 3 objects, states x^1, x^2, x^3)
Qi Zhao
49
Object Model X
• X = (m, x^1, ..., x^m)
• Each object state is
x = (i, l, v, s)
where i is the identity, l the location, v the velocity, and s the shape
Qi Zhao
50
Person Model
• A calibrated camera and a generalized-cylinder person model
Qi Zhao
51
Prediction Model p(X_t | X_{t-1})
• The number of objects can change:
o Each object has a constant probability r of remaining in the scene
o There is a constant probability i that a new object will enter the scene
Qi Zhao
52
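A toy sketch of the birth/death part of this prediction model; the probability values and the spawn callable are hypothetical stand-ins, purely for illustration:

```python
import numpy as np

def predict_object_count(objects, spawn, r=0.95, i=0.05, rng=None):
    # Each existing object survives with constant probability r;
    # with constant probability i, spawn() creates a new object.
    rng = rng or np.random.default_rng()
    survivors = [obj for obj in objects if rng.random() < r]
    if rng.random() < i:
        survivors.append(spawn())
    return survivors
```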
Model Review
p(X_t | Z_{1:t}) ∝ p(Z_t | X_t) ∫ p(X_t | X_{t-1}) p(X_{t-1} | Z_{1:t-1}) dX_{t-1}
• The observation likelihood p(Z_t | X_t) is fast to compute for different hypotheses X_t
• The prediction model p(X_t | X_{t-1}) allows generation of X_t from X_{t-1}
• Estimating p(X_t | Z_{1:t}) requires an efficient way of:
o Representing p(X_t | Z_{1:t})
o Computing the multiplications and integrations
Qi Zhao
53
Efficient Representation
• p(X_t | Z_{1:t}) is represented by a set of N weighted particles {(X_t^i, π_t^i)}:
N points: X_t^1, X_t^2, ..., X_t^N
N weights: π_t^1, π_t^2, ..., π_t^N
• Sampling from the set using the weights approximates sampling from p(X_t | Z_{1:t})
Qi Zhao
54
Multiplication by likelihood
• Given a particle set representing p(X_t | Z_{1:t-1}), compute
p(X_t | Z_{1:t}) ∝ p(Z_t | X_t) p(X_t | Z_{1:t-1})
• Weight the particles {(X_t^i, π_t^i)}, setting
π_t^i ∝ p(Z_t | X_t^i)
Qi Zhao
55
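In code, this reweighting is a single pass over the particle set; likelihood here stands in for BraMBLe's blob observation model:

```python
import numpy as np

def multiply_by_likelihood(particles, weights, likelihood, z):
    # π_t^i ∝ p(Z_t | X_t^i): weight each predicted particle by
    # how well it explains the current image Z_t.
    weights = weights * np.array([likelihood(z, x) for x in particles])
    return weights / weights.sum()
```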
People Tracking
Qi Zhao
56
Weaknesses
• The algorithm is sensitive to reflections
• The algorithm sometimes switches the labels when one object passes in front of another
• The multi-object dynamical model does not model any form of interaction
• The static-camera assumption may not be valid
57
Questions?
msn movies group
58
References
• Wikipedia – Kalman filter; Particle filter
• Matthias Mühlich – "Particle Filters: An Overview"
• M. Isard and A. Blake – "CONDENSATION: Conditional Density Propagation for Visual Tracking", IJCV, Oxford, 1998
• Kalman filter web site (http://www.cs.unc.edu/~welch/kalman/)
• Qi Zhao, May 2006
59
Math Intro - Probabaility
60
Math Intro - Statistics
61
Math Intro – Normal or Gaussian Distribution
62
The Observer Design Problem
• Estimate the internal states of a linear system, given access to the system's output
• Can be represented as a linear stochastic difference equation:
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
• The measurement model:
z_k = H x_k + v_k
• w_k and v_k are the process and measurement noise respectively
63