
Calibrating Optical Images and Gamma Camera Images for Motion Detection

Michael A. Gennert1,2, Philippe P. Bruyant1, Manoj V. Narayanan1, Michael A. King1
1University of Massachusetts Medical School, Worcester, MA • 2Worcester Polytechnic Institute, Worcester, MA
Abstract
Objectives: One approach to motion detection in SPECT is to observe the patient using
optical cameras. Patient motion is estimated from changes in the images and is used to
modify the reconstruction algorithm. An important subproblem is calibrating the optical
cameras and the gamma camera. That is, it is necessary to determine the transformation
from the gamma camera coordinate system to the optical camera coordinate system such
that given a gamma camera point, one may compute the corresponding optical camera
point. Conversely, given an optical camera point, one may compute the corresponding
patient ray. Methods: We have devised a calibration phantom that can be imaged using
both optical and gamma cameras. The phantom comprises a set of Lucite disks; each disk
supports 2 low-intensity light bulbs and a 0.8mm diameter hole centered between the bulbs
to hold a 99mTc point source. The radioactive source location for each disk in image
coordinates is taken to be the midpoint of the bulbs. The radioactive source location in
gamma camera coordinates is found by segmenting the reconstructed source distribution
and computing the centroid of the activity of each source. At least 6 such point pairs are
needed, although 7 are used in practice to provide increased accuracy. Using procedure
PROJ_MAT_CALIB of Trucco & Verri, Introductory Techniques for 3-D Computer Vision, we
compute the 11 parameters of the coordinate transformation and the residual error.
Because we do not know in advance which optical camera points match which gamma
camera points, an exhaustive search is used to find lowest-error matches. Results: We
have been able to match optical and gamma camera points and determine the
transformation. Tomographic reconstruction and segmentation take up most of the
processing time; point matching and parameter calculation take less than 14 seconds of
processor time on a Digital Alpha 433au workstation. Conclusions: A calibration phantom
can be imaged simultaneously by the optical and gamma cameras to calibrate them, and the
coordinate transformation computed with no other input required.
Algorithm
Read image → Threshold → Segment → Select regions (Optical: manually;
Gamma: largest regions) → Compute centroids

Optical Blobs
X Centroid    Y Centroid
235.57628      57.86440
248.13846      60.53846
463.48648     108.54054
476.22018     113.28440
341.06384     237.17021
351.514       240.28038
335.50632     329.51898
347.50485     336.35922
173.41379     329.89655
184.41176     334.80392
294.75        385.03906
303.3356      394.97260
153.42073     431.84756
164.48685     439.45395

Gamma Blobs
X Centroid    Y Centroid    Z Centroid
88.3193       39.7305       62.2723
66.9417       87.3680       79.6488
42.0171       40.2948       71.6796
53.7263       86.8387       29.7913
70.5912       86.4228        6.88985
53.8215       87.5115       48.3729
82.4415       87.1382       31.0288

Figure 4. Blob Detection. Processing is similar for optical and gamma blob
detection. For optical blobs, final centroids are computed by taking midpoints of
pairs of closest blobs.
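As an illustration of this step, here is a minimal Python sketch using scipy.ndimage. It
keeps the largest regions automatically (the gamma approach; the optical regions were
selected manually in the original work), and the threshold value, region count, and the
nearest-neighbour pairing helper are assumptions for illustration, not the original code.

import numpy as np
from scipy import ndimage

def detect_blobs(image, threshold, n_regions):
    """Threshold, segment, and return centroids of the n_regions largest blobs.
    Centroids are returned in array-index order (row, column[, slice])."""
    mask = image > threshold                          # Threshold
    labels, n = ndimage.label(mask)                   # Segment into connected regions
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.argsort(sizes)[::-1][:n_regions] + 1    # Select the largest regions
    return np.array(ndimage.center_of_mass(image, labels, keep))  # Compute centroids

def bulb_midpoints(centroids):
    """Pair each optical bulb blob with its closest unused neighbour and
    return the midpoint of each pair (one midpoint per disk)."""
    pts = [np.asarray(p, dtype=float) for p in centroids]
    midpoints, used = [], set()
    for i, p in enumerate(pts):
        if i in used:
            continue
        dists = [np.linalg.norm(p - q) if j != i and j not in used else np.inf
                 for j, q in enumerate(pts)]
        j = int(np.argmin(dists))
        used.update({i, j})
        midpoints.append((p + pts[j]) / 2.0)
    return np.array(midpoints)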
Match Generation
• Generate all possible image/world point matches
• If N points, generate N! permutations
• Compute camera parameters for each possible match
• For each parameter set, calculate residual error defined as
  $\sum_{j=1}^{N} \left( x^i_{j,\mathrm{actual}} - x^i_{j,\mathrm{predicted}} \right)^2$
• Select parameter set with lowest residual error
Figure 5. Match Generation. Best match is found by exhaustive search.
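A minimal Python sketch of this search, assuming a calibrate(world_pts, image_pts)
routine like the one outlined after Figure 6 that returns the camera parameters and the
residual error for one candidate correspondence; the names here are illustrative.

from itertools import permutations
import numpy as np

def best_match(image_pts, world_pts, calibrate):
    """Try every ordering of the world points against the fixed image-point list
    and keep the correspondence whose calibration gives the lowest residual."""
    best_residual, best_perm, best_params = np.inf, None, None
    for perm in permutations(range(len(world_pts))):
        ordered = [world_pts[k] for k in perm]
        params, residual = calibrate(ordered, image_pts)
        if residual < best_residual:
            best_residual, best_perm, best_params = residual, perm, params
    return best_perm, best_params, best_residual

With the 7 disks used here this is 7! = 5040 candidate calibrations, consistent with
point matching and parameter calculation taking less than 14 seconds of processor time.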
1. Introduction
One approach to motion detection in SPECT is to observe the patient
using optical cameras. Patient motion is estimated from changes in the
images and is used to modify the reconstruction algorithm. In order to
relate changes in patient position as observed by the optical cameras
to SPECT data as observed by the gamma camera, it is necessary to
determine the camera parameters. This is the calibration problem:
determining the transformation from the gamma camera coordinate
system to an optical camera coordinate system and vice versa.
2. Procedure
We designed a calibration phantom, comprising a set of Lucite disks,
that can be imaged by an optical camera and a gamma camera. The
disk arrangement is non-coplanar and asymmetric, guaranteeing a
unique solution for the calibration parameter equations.
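The non-coplanarity requirement can be checked numerically before calibration. Below is a
small Python sketch, written for illustration rather than taken from the original software,
that tests whether the source positions span all three dimensions by examining the smallest
singular value of the centered coordinates.

import numpy as np

def is_noncoplanar(points, tol=1.0):
    """Points are non-coplanar if the centered coordinate matrix has rank 3,
    i.e. its smallest singular value is well above zero (tol is in the same
    units as the coordinates)."""
    P = np.asarray(points, dtype=float)            # shape (N, 3)
    centered = P - P.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)  # singular values, descending
    return s[-1] > tol

# The 7 source positions in gamma camera coordinates (Figure 7 world point list):
disks = [(88.3, 39.7, 62.2), (66.9, 87.3, 79.6), (82.4, 87.1, 31.0),
         (53.8, 87.5, 48.3), (42.0, 40.2, 71.6), (53.7, 86.8, 29.7),
         (70.5, 86.4, 6.8)]
print(is_noncoplanar(disks))                       # True: the disks span 3-D space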
Figure 1. Calibration Processing Flow. The Calibration Phantom (or Patient) is
imaged along two paths: Acquire Optical Image → Detect Blobs → image point
list, and Acquire Gamma Image → Detect Blobs → world point list. The two lists
feed Generate Matches → possible matches → Calculate Parameters → camera
parameters → Select Best. Figures 2–7 show module details.

Parameter Calculation
• Trucco & Verri, "Introductory Techniques for 3-D Computer Vision", procedure PROJ_MAT_CALIB
• Coordinate systems
  Gamma camera / World coordinates $x^w$
  Optical camera coordinates $x^c$: rotation & translation from world $x^w$,
      $x^c = R^c_w x^w + T^c_w$
  Image coordinates $x^i$: projection from camera $x^c$,
      $x^i = c + \dfrac{x^c f^2}{x^c \cdot f}$
  where $f$ is the camera focal vector along the optical axis and $c$ is the camera center offset
• Write world→image transformation equations: world point $x^w = (x^w, y^w, z^w)$
  projects to $x^i = (x^i, y^i)$ where
      $x^i = \dfrac{m_{11} x^w + m_{12} y^w + m_{13} z^w + m_{14}}{m_{31} x^w + m_{32} y^w + m_{33} z^w + m_{34}}$,
      $y^i = \dfrac{m_{21} x^w + m_{22} y^w + m_{23} z^w + m_{24}}{m_{31} x^w + m_{32} y^w + m_{33} z^w + m_{34}}$
  Rewritten to be linear in $m_{ij}$, each world / image point pair gives
      $(m_{31} x^w + m_{32} y^w + m_{33} z^w + m_{34})\,x^i = m_{11} x^w + m_{12} y^w + m_{13} z^w + m_{14}$
      $(m_{31} x^w + m_{32} y^w + m_{33} z^w + m_{34})\,y^i = m_{21} x^w + m_{22} y^w + m_{23} z^w + m_{24}$
• 2 equations in 12 unknowns per point pair
• Use enough point pairs to determine the $m_{ij}$
• Solve the system of equations using SVD
• Extract parameters from the SVD solution
In matrix form, each world / image point pair gives

$\begin{bmatrix}
x^w & y^w & z^w & 1 & 0 & 0 & 0 & 0 & -x^i x^w & -x^i y^w & -x^i z^w & -x^i \\
0 & 0 & 0 & 0 & x^w & y^w & z^w & 1 & -y^i x^w & -y^i y^w & -y^i z^w & -y^i
\end{bmatrix}
\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix}
=
\begin{bmatrix} 0 \\ 0 \end{bmatrix}$

N point pairs yield 2N equations, so ≥ 6 point pairs are needed to compute the $m_{ij}$.

Solve $Am = 0$ where

$A = \begin{bmatrix}
x^w_1 & y^w_1 & z^w_1 & 1 & 0 & 0 & 0 & 0 & -x^i_1 x^w_1 & -x^i_1 y^w_1 & -x^i_1 z^w_1 & -x^i_1 \\
0 & 0 & 0 & 0 & x^w_1 & y^w_1 & z^w_1 & 1 & -y^i_1 x^w_1 & -y^i_1 y^w_1 & -y^i_1 z^w_1 & -y^i_1 \\
\vdots & & & & & & & & & & & \vdots \\
x^w_N & y^w_N & z^w_N & 1 & 0 & 0 & 0 & 0 & -x^i_N x^w_N & -x^i_N y^w_N & -x^i_N z^w_N & -x^i_N \\
0 & 0 & 0 & 0 & x^w_N & y^w_N & z^w_N & 1 & -y^i_N x^w_N & -y^i_N y^w_N & -y^i_N z^w_N & -y^i_N
\end{bmatrix},
\qquad
m = \begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix}$

Find $m$ in $A$'s nullspace using Singular Value Decomposition.
Figure 6. Parameter Calculation. Parameters R, T, f, and c are found from mij.
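A compact NumPy sketch of this calculation, assuming the world and image points are
already in matched order; the function name and interface are illustrative rather than
the authors' original software.

import numpy as np

def calibrate(world_pts, image_pts):
    """Estimate the 3x4 projection matrix M from matched world/image points
    by solving A m = 0 with SVD, and return M plus the residual error."""
    A = []
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        A.append([xw, yw, zw, 1, 0, 0, 0, 0, -xi*xw, -xi*yw, -xi*zw, -xi])
        A.append([0, 0, 0, 0, xw, yw, zw, 1, -yi*xw, -yi*yw, -yi*zw, -yi])
    A = np.asarray(A, dtype=float)

    # m is the right singular vector of A with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    M = Vt[-1].reshape(3, 4)

    # Residual: sum of squared differences between actual and predicted image points
    residual = 0.0
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        p = M @ np.array([xw, yw, zw, 1.0])
        residual += (xi - p[0] / p[2])**2 + (yi - p[1] / p[2])**2
    return M, residual

This is the calibrate routine assumed by the exhaustive-search sketch after Figure 5;
the residual it returns is the quantity minimized when selecting the best match.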
Figure 2. Calibration Phantom comprising 7 Lucite disks (left), each holding 2
light bulbs and a well for a 99mTc source (approx 0.1 mCi). Complete phantom is
shown at right.

Sample Output
IPL = ImagePointList[
ImagePoint(177.5, 82.0), ImagePoint(222.5, 211.0),
ImagePoint(193.5, 349.0), ImagePoint(293.5, 289.5),
ImagePoint(359.0, 83.5), ImagePoint(312.0, 347.0),
ImagePoint(269.5, 437.0)]
WPL = WorldPointList[
WorldPoint[88.3, 39.7, 62.2], WorldPoint[66.9, 87.3, 79.6],
WorldPoint[82.4, 87.1, 31.0], WorldPoint[53.8, 87.5, 48.3],
WorldPoint[42.0, 40.2, 71.6], WorldPoint[53.7, 86.8, 29.7],
WorldPoint[70.5, 86.4, 6.8]]
Res = 9.05
CPs = Camera Parameters[
T:[79.8, -19.6, 104.2],
R:[[-0.923, -0.262, -0.280],
[-0.010, 0.747, -0.664],
[-0.383, 0.610, 0.692]],
IC:[[317.2], [238.6]],
fx:650.7, fy:672.4]
Figure 7. Sample Output showing image points and world points in correct
correspondence, with residual error=9.05. Camera parameters are computed
from matrix entries mij. For a 640x480 image the expected camera center offset
is [319.5,239.5]. Note the close agreement with image center IC=[317.2,238.6].
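The last step, recovering R, T, the focal lengths, and the image center from the entries
m_ij, can be sketched as follows. This follows the standard decomposition described in
Trucco & Verri (Chapter 6), with sign handling simplified, so treat it as an outline
rather than the authors' exact code.

import numpy as np

def extract_parameters(M):
    """Recover camera parameters from the 3x4 projection matrix M.
    Outline of the standard decomposition (cf. Trucco & Verri, ch. 6)."""
    Q, q = M[:, :3], M[:, 3]
    scale = np.linalg.norm(Q[2])           # third row of Q is the unit vector r3 up to scale
    Q, q = Q / scale, q / scale
    if q[2] < 0:                           # fix the overall sign so the world origin has Tz > 0
        Q, q = -Q, -q

    ox, oy = Q[0] @ Q[2], Q[1] @ Q[2]      # image center (principal point) offset
    fx = np.sqrt(Q[0] @ Q[0] - ox**2)      # focal lengths in pixel units
    fy = np.sqrt(Q[1] @ Q[1] - oy**2)

    r3 = Q[2]                              # rows of the rotation matrix (T&V place -fx, -fy
    r1 = (ox * Q[2] - Q[0]) / fx           # in the intrinsic matrix, hence these signs)
    r2 = (oy * Q[2] - Q[1]) / fy
    R = np.vstack([r1, r2, r3])

    Tz = q[2]
    T = np.array([(ox * Tz - q[0]) / fx, (oy * Tz - q[1]) / fy, Tz])
    return R, T, fx, fy, (ox, oy)

In this notation, IC in Figure 7 corresponds to the center offset (ox, oy), and fx, fy
are the focal lengths in pixel units along the two image axes.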
Figure 3. Optical (left) and gamma (right) images of the phantom. The 7 pairs
of light bulbs are clearly visible in the optical image. The reconstructed gamma
image shows 64 of the 128 slices, with 5 of the 7 gamma source blobs visible.

3. Conclusions
We have successfully calibrated optical and gamma cameras. The residual error of
the best match is < 10 pixel², while the next-lowest residual error is > 1000 pixel²,
giving confidence that the correct match of optical and gamma points has been found.