Multi-Sensor Image Fusion


MULTI-SENSOR IMAGE FUSION
Sponsored by
The Northrop Grumman Corporation
Student Participants:
Phu Kieu, Keenan Knaur, and Chris Kolodin
Advisor:
Dr. Eun-Young (Elaine) Kang
PROJECT GOAL

Develop an algorithm to fuse two images of the same scene from two space sensors.

Outline of the algorithm:

1. Analysis of Input Data
2. Image Rectification
3. Image Segmentation and Object Extraction
4. Matching and Disparity Extraction
5. Altitude Calculations
6. Data Visualization
ANALYSIS OF INPUT DATA

The input data was thoroughly analyzed to select the method(s) that would best achieve the desired results.
SATELLITE INFORMATION

Acquired test data from NOAA (National Oceanic and Atmospheric Administration):

- GOES 11 (west) and GOES 12 (east)
- Geostationary Operational Environmental Satellites
- Used for weather/cloud monitoring
GOES DATA CONTAINER

- Grayscale image with pixels stored in raster order.
  - The grayscale value represents measured AOD (Aerosol Optical Depth).
- Ten channels of information.
  - The first channel is used as input to our program; the nine other channels are not used directly.
  - A cloud mask channel is used solely to check the accuracy of our cloud detection algorithm results; it is not a regular part of the program.
- Latitude and longitude location information per pixel.
  - One file for latitude and another for longitude, stored as floating-point values in raster order.
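The container format suggests a straightforward loader. Below is a minimal sketch, assuming raw binary files in raster order; the file names, dtype choices, and default dimensions are illustrative assumptions, not the actual NOAA layout.

```python
# Hypothetical loader for one GOES scene; file names, dtypes, and the
# 2500x912 default dimensions (GOES 11 per the slides) are assumptions.
import numpy as np

def load_goes_scene(prefix, width=2500, height=912):
    # Channel 1 (the only channel used directly): grayscale AOD values,
    # one per pixel, stored in raster order.
    image = np.fromfile(f"{prefix}_ch1.raw", dtype=np.uint8).reshape(height, width)

    # Per-pixel geolocation: one file for latitude, another for longitude,
    # stored as floating-point values in raster order.
    lat = np.fromfile(f"{prefix}_lat.raw", dtype=np.float32).reshape(height, width)
    lon = np.fromfile(f"{prefix}_lon.raw", dtype=np.float32).reshape(height, width)
    return image, lat, lon
```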
INHERENT PROPERTIES OF INPUT DATA

Images are taken by different satellites from different locations, and an overlap exists.

Scan durations differ:

- The west satellite takes 5 min.
- The east satellite takes 15 min.

Complications:

- Cloud movement and morphing from the differing scan times
- Artifacts caused by satellite geometry
[Figure: GOES 11 (west) input image, 2500×912]

[Figure: GOES 12 (east) input image, 2000×850]

[Figures: the image overlap area highlighted on both inputs]
IMAGE RECTIFICATION

Image rectification is a preprocessing step that allows us to perform matching techniques to calculate the disparity between the two images more easily.
EQUIRECTANGULAR MAP PROJECTION

- A map projection is used to rectify the two images to a common coordinate system.
- The equirectangular map projection is used: latitude and longitude lines are parallel and equidistant from each other.
- After projection, pixels are scaled to the desired resolution.
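As a rough illustration of this step, here is a minimal sketch assuming the per-pixel latitude/longitude arrays from the data container; the output bounds, resolution, and nearest-neighbor resampling strategy are assumptions, not the project's exact method.

```python
# Sketch: resample an image onto an equirectangular grid using per-pixel
# lat/lon arrays. In this projection x is linear in longitude and y is
# linear in latitude, so each pixel maps by simple scaling.
import numpy as np

def project_equirectangular(image, lat, lon, lat_range, lon_range, out_w, out_h):
    lat_min, lat_max = lat_range
    lon_min, lon_max = lon_range

    # Linear mapping from (lat, lon) to output pixel coordinates.
    x = ((lon - lon_min) / (lon_max - lon_min) * (out_w - 1)).round().astype(int)
    y = ((lat_max - lat) / (lat_max - lat_min) * (out_h - 1)).round().astype(int)

    out = np.zeros((out_h, out_w), dtype=image.dtype)
    valid = (x >= 0) & (x < out_w) & (y >= 0) & (y < out_h)
    out[y[valid], x[valid]] = image[valid]  # nearest-neighbor splat; may leave holes
    return out
```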
[Figure: map projection illustration]

[Figure: overlap area projection, west (1143×735)]

[Figure: overlap area projection, east (1143×735)]
IMAGE SEGMENTATION AND OBJECT EXTRACTION

This step distinguishes cloud pixels from other pixels, then clusters cloud pixels by proximity to identify cloud objects.
K-MEANS CLUSTERING

- K-means clustering labels image pixels into k groups of similar intensity.
- Clouds are generally bright, so they are identified according to their brightness (see the sketch below).
- Note:
  - The cloud mask channel is used here to verify the accuracy of this method.
  - Our method yielded an accuracy of roughly 85-90%.
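A minimal sketch of this step, assuming a plain Lloyd's-iteration k-means on raw intensities with k = 3 and the brightest cluster taken as cloud (both assumptions; the slides do not specify k or the initialization):

```python
# Sketch: 1-D k-means on pixel intensities; the brightest cluster is cloud.
import numpy as np

def kmeans_cloud_mask(image, k=3, iters=20):
    pixels = image.reshape(-1).astype(np.float64)
    # Initialize centroids spread evenly across the intensity range.
    centroids = np.linspace(pixels.min(), pixels.max(), k)

    for _ in range(iters):
        # Assign each pixel to the nearest centroid.
        labels = np.abs(pixels[:, None] - centroids[None, :]).argmin(axis=1)
        # Move each centroid to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = pixels[labels == j].mean()

    # Clouds are bright: keep the cluster with the highest centroid.
    return (labels == centroids.argmax()).reshape(image.shape)
```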
[Figure: cloud identification result, west]

[Figure: cloud identification result, east]
CONNECTED COMPONENT ANALYSIS (CCA)

- After the k-means clustering, a CCA routine groups each identified cloud pixel with its spatially neighboring/connected cloud pixels to form a cloud object, as sketched below.
- After CCA, clouds with an area smaller than 500 pixels are ignored.
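A minimal sketch of the CCA step using scipy.ndimage.label, with 4-connectivity assumed and the 500-pixel area threshold from the slide:

```python
# Sketch: label connected cloud components, then drop small ones.
import numpy as np
from scipy import ndimage

def extract_cloud_objects(cloud_mask, min_area=500):
    # Label each 4-connected group of cloud pixels with a unique integer;
    # label 0 is the background.
    labels, num = ndimage.label(cloud_mask)

    # Count pixels per label and zero out components under the threshold.
    areas = np.bincount(labels.ravel())
    small = np.flatnonzero(areas < min_area)
    labels[np.isin(labels, small)] = 0
    return labels
```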
[Figure: CCA result, west]

[Figure: CCA result, east]
MATCHING ALGORITHMS

For each cloud in one image, the matching algorithm finds the best-matched cloud in the other image and then calculates the cloud's disparity. The disparity is then used to calculate the altitude.
Two matching algorithms were implemented for the project. Both methods perform object-level matching:

- The first method uses the traditional mean squared difference (MSD).
- The second performs shape histogram matching using feature vectors.

Each finds the best match over a specified search area.
OBJECT-LEVEL MSD

For each cloud in one image:

- Compute the bounding box of that cloud.
- Find the best-matched box in the other image using the matching criterion (MSD, the mean squared difference of two equal-size boxes), as sketched below.
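A minimal sketch of the MSD criterion, assuming a brute-force scan over a fixed search radius around the cloud's position (the radius and the pixel-offset parameterization are assumptions):

```python
# Sketch: slide a cloud's bounding-box template over a search window in the
# other image; keep the offset with the lowest mean squared difference.
import numpy as np

def msd_match(template, other, top, left, radius=50):
    h, w = template.shape
    tmpl = template.astype(np.float64)
    best_score, best_offset = np.inf, (0, 0)

    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > other.shape[0] or x + w > other.shape[1]:
                continue
            patch = other[y:y + h, x:x + w].astype(np.float64)
            # Mean squared difference of the two equal-size boxes.
            score = np.mean((patch - tmpl) ** 2)
            if score < best_score:
                best_score, best_offset = score, (dy, dx)

    return best_offset  # the cloud's disparity, in pixels
```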
[Figure: object-level MSD matching result]
SHAPE HISTOGRAM MATCHING

- Uses the cloud mask instead of the original image data.
- Calculates a feature vector for the bounding box being matched in the image.
  - The feature vector includes a count of the white pixels in each row and each column.
- The best match is the one that minimizes the Euclidean distance between the feature vectors.
SHAPE HISTOGRAM MATCHING—EXAMPLE

[Figure: a binary cloud shape annotated with per-row white-pixel counts]

The feature vector in this example would be (2, 3, 4, 3, 3, 2, 3, 2, 5, 3). The best match between objects occurs when the distance between the two vectors is minimal.
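A minimal sketch of the feature vector and the distance it is compared with; equal-size bounding boxes are assumed so the two vectors have matching lengths:

```python
# Sketch: shape histogram = per-row and per-column white-pixel counts of a
# binary cloud-mask box; candidates are compared by Euclidean distance.
import numpy as np

def shape_histogram(mask_box):
    # Concatenate row sums and column sums of the binary mask.
    return np.concatenate([mask_box.sum(axis=1), mask_box.sum(axis=0)])

def histogram_distance(box_a, box_b):
    # Best match = minimum Euclidean distance between feature vectors.
    return np.linalg.norm(shape_histogram(box_a) - shape_histogram(box_b))
```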
[Figure: shape histogram matching result]
ALTITUDE CALCULATIONS

The disparity found by the matching algorithms is used to derive more meaningful information. We derived a formula to calculate the altitude of an object given its disparity, and vice versa.
CALCULATING ALTITUDE

- The first ground intersect G1 is assumed to be the cloud's centroid, and the second intersect G2 is the first displaced by the disparity from the matching algorithm.
- Vectors can be constructed from each satellite to its corresponding ground intersect (G1 and G2), and the best point of intersection can be computed, as sketched below.
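A minimal sketch of the "best point of intersection," treated here as the midpoint of the shortest segment between two generally skew rays; a common Earth-centered coordinate frame for the satellite and ground-intersect positions is an assumption.

```python
# Sketch: closest approach of two rays, one from each satellite through its
# ground intersect; the cloud position is the midpoint of that segment.
import numpy as np

def best_intersection(sat1, g1, sat2, g2):
    d1 = (g1 - sat1) / np.linalg.norm(g1 - sat1)  # unit direction, ray 1
    d2 = (g2 - sat2) / np.linalg.norm(g2 - sat2)  # unit direction, ray 2
    r = sat2 - sat1

    # Solve for t1, t2 minimizing |(sat1 + t1*d1) - (sat2 + t2*d2)|.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b  # near zero when the rays are parallel
    t1 = (d * c - b * e) / denom
    t2 = (b * d - a * e) / denom

    p1 = sat1 + t1 * d1
    p2 = sat2 + t2 * d2
    return (p1 + p2) / 2  # midpoint of the closest-approach segment
```

The cloud's altitude then follows from this point's height above the Earth's surface.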
CALCULATING MOTION (1)

- The disparity computed in the previous step may be caused by a combination of cloud altitude and cloud motion.
- We deliberately alter the altitude of the cloud (reference point) and compute the disparity.
- Vectors can be constructed from the satellites through this reference point, and their intersections with the Earth are computed.
CALCULATING MOTION (2)

- With these two new intersection points, we can calculate a new disparity.
- The motion is computed as the difference between the original disparity and this new disparity.
- The velocity is computed by dividing the cloud's motion by 10 min., the time difference between the two images (see the sketch below).
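A minimal sketch of this final arithmetic step; the kilometer units and the function name are illustrative:

```python
# Sketch: attribute the residual between the observed disparity and the
# disparity predicted for an assumed altitude to cloud motion over the
# 10-minute gap between the two images.
import numpy as np

def cloud_velocity_kmh(observed_disp_km, predicted_disp_km, dt_minutes=10.0):
    # Motion = observed disparity minus the altitude-induced disparity.
    motion_km = np.asarray(observed_disp_km) - np.asarray(predicted_disp_km)
    # Convert km per dt_minutes into km/h.
    return motion_km * (60.0 / dt_minutes)
```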
DATA VISUALIZATION

To make sense of these results, a visual representation of the figures is important.
PLOTTING ALTITUDE VS. VELOCITY

[Plot: Assumed Altitude (km, 1-20) on the x-axis vs. Calculated Velocity (km/h, 0-600) on the y-axis]
Q&A