Transcript Slide 1
2009-10 CEGEG046 / GEOG3051
Principles & Practice of Remote Sensing (PPRS)
7: scanning redux, photography, lidar
Dr. Mathias (Mat) Disney
UCL Geography
Office: 113, Pearson Building
Tel: 7670 0592
Email: [email protected]
www.geog.ucl.ac.uk/~mdisney
Recap
• Last week
– storage/transmission
– pre-processing stages (raw data to products)
– sensor scanning mechanisms
• This week
– scanning mechanisms redux
– photography
– time-resolved signals (e.g. LiDAR)
2
Scanning mechanisms: examples
• Discrete detectors and scanning mirrors
– Landsat MSS, TM, ETM+, NOAA GOES, AVHRR, ATSR
• Multispectral linear arrays
– SPOT (1-3) HRV, HRVIR & SPOT-VGT, IKONOS, ASTER & MISR (both
on board NASA Terra)
• Imaging spectrometers using linear and area arrays
– AVIRIS, CASI, MODIS (on NASA Terra and Aqua)
From: http://ceos.cnes.fr:8100/cdrom/ceos1/irsd/pages/datacq4.htm & Jensen (2000)
3
Scanning mechanisms: examples
• MODIS scan mirror
– continuously rotating and double-sided
– http://modis.gsfc.nasa.gov/about/scanmirror.php
• SEVIRI (Spinning Enhanced Vis and IR Imager) on board MSG
– whole satellite rotates
– vertical scan plus rotation = image
4
Scanning mechanisms: continued
• Image frame created by scanning detector footprint across track, line by line
– n pixels per line, pixel size r x r, so line length (swath) = nr
– along-track speed v m s^-1, so footprint travels distance r in r/v secs
• One line of data must be acquired in <= r/v secs
• Typical v? (see the sketch below)
– orbital period T ~ 100 mins, Earth radius ~ 6.4x10^6 m
– v = 2π x 6.4x10^6 / (100 x 60) ≈ 6.7x10^3 m s^-1
[Figure: frame built up from scan lines of n pixels (size r) across track]
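A minimal Python sketch of this ground-track speed estimate, using the slide's values (the 30 m pixel size in the line-time check is an illustrative value, not from the slide):

```python
# Ground-footprint speed from orbital period (values from the slide:
# T ~ 100 minutes, Earth radius ~ 6.4e6 m).
import math

T = 100 * 60          # orbital period, s
R_earth = 6.4e6       # Earth radius, m

# Footprint sweeps roughly one Earth circumference per orbit (low-orbit approximation)
v = 2 * math.pi * R_earth / T
print(f"ground-track speed ~ {v:.0f} m/s")      # ~6700 m/s

# Time available to acquire one scan line of pixel size r (hypothetical 30 m pixel)
r = 30.0
print(f"line time r/v ~ {r / v * 1e3:.2f} ms")  # ~4.5 ms
```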
5
Scanning mechanisms: single detector
• Even if we obtain 1 line in r/v secs say.....
• Significant along-track displacement from start to end of x-track (whiskbroom) scan line
– platform has moved r in r/v secs
[Figure: start and end of a cross-track scan line, offset along track]
6
Scanning mechanisms: single detector
• Zig-zag mechanism
– active scan lasts r/2v secs (remainder of the cycle is flyback)
– n pixels per line, so "dwell time" (seconds per pixel) is r/2nv secs/pixel
– OK for low res e.g. AVHRR, as large r
– BUT problems for mod - high res.
– e.g. Landsat MSS, r = 70m, v = 7x10^3 m s^-1, n = 3000, so dwell time = 70/(2 x 3000 x 7x10^3) ≈ 1.7 µsecs (OK for SNR)
– BUT with single detector, required length of scan cycle r/v is 10 msecs (70/7x10^3)
– = 100 scan cycles per second (see the sketch below)
– TOO FAST!
[Figure: zig-zag scan pattern - active scan then flyback, platform moving at speed v]
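A quick numerical check of the single-detector MSS argument above, using the slide's numbers:

```python
# Dwell time and required scan rate for a single-detector zig-zag scanner
# (Landsat MSS numbers from the slide).
r = 70.0        # pixel size, m
v = 7.0e3       # platform ground speed, m/s
n = 3000        # pixels per scan line

dwell = r / (2 * n * v)            # seconds per pixel
scan_cycle = r / v                 # one line must be completed every r/v seconds

print(f"dwell time ~ {dwell * 1e6:.1f} microseconds")                               # ~1.7 us
print(f"scan cycle ~ {scan_cycle * 1e3:.0f} ms ({1 / scan_cycle:.0f} per second)")  # 10 ms, 100/s
```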
7
Scanning mechanisms: e.g. MSS
• MSS has 4x6 array of receptors - 4 bands, 6 receptors per band
• 6 lines scanned simultaneously
– 'footprint' of single receptor follows a zig-zag track
– ~30 scan cycles per second
[Figure: active scan west to east; T = 0, T = 53 ms, T = 73.4 ms; 474 m (6 lines) advance per scan; 185 km swath width]
8
Scanning mechanisms: boustrophedon
• Alternative: active scan right-to-left then left-to-right (boustrophedon)
– 2n line pixels scanned in r/v secs
– r/2nv secs/pixel
– for TM e.g. r = 30m, v = 20/3 x10^3 m s^-1, n = 6000
– dwell time ≈ 0.38 µsec (not long enough for good SNR)
– scan cycle ~4.5 msecs (~220 per second)
– way too fast i.e. single detector operation inadequate for TM
– so use 16 detectors per band (vis), scanning 16 lines at a time in vis, 4 at a time in thermal (see the sketch below)
– 100 detectors total
From: http://rst.gsfc.nasa.gov/Intro/Part2_20.html
[Figure: alternating active scans (both directions), line width r, platform speed v]
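A similar sketch for TM, showing why several detectors per band relax the scan-rate problem. The 16-detector figure is for the reflective bands as noted above; the exact dwell-time gain depends on the scan/flyback geometry, so the factor used here is only indicative:

```python
# Single-detector TM numbers (from the slide) vs. a 16-detector-per-band scanner.
r = 30.0              # TM pixel size, m
v = 20.0 / 3 * 1e3    # ~6.7e3 m/s platform ground speed
n = 6000              # pixels per line

dwell_single = r / (2 * n * v)     # ~0.38 us per pixel: too short for good SNR
cycle_single = r / v               # ~4.5 ms per scan cycle (~220 per second): too fast

n_det = 16                         # 16 lines scanned per sweep in the reflective bands
cycle_multi = n_det * r / v        # mirror only needs ~1/16 of the scan rate
dwell_multi = n_det * dwell_single # dwell time improves roughly in proportion (indicative only)

print(f"single detector: dwell {dwell_single * 1e6:.2f} us, cycle {cycle_single * 1e3:.1f} ms")
print(f"16 detectors:    dwell {dwell_multi * 1e6:.2f} us, cycle {cycle_multi * 1e3:.1f} ms")
```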
9
Photography
• Largely obsolete, superseded by electromechanical sensors
• Still used for
– some mapping and monitoring applications
• partic. aerial surveys and photogrammetry
– BUT requirement to get film back and process it
– Pan-chromatic (B&W) and colour (vis and some IR) but limited spectrally
– Radial image distortion away from focal point
• Relatively easy to correct if camera geometry known
10
Photography
– E.g. Wild RC10 aerial camera + tracker software as used by NERC
Airborne Research and Survey Facility
– www.nerc.ac.uk/arsf
– Software allows pilot to gauge coverage and overlap
11
Photography
• AP of Barton Bendish, Norfolk
• Acquired 1997 by NERC aircraft
• Scan of original
• Note flight info and fiducial marks @ corners
12
Photography: parameters
• Photographic camera uses whole-frame image capture
– near-instantaneous snapshot of projected field-of-view on ground
– i.e. IFOV == whole FOV
– imaged region (A) focused by lens/mirror system onto focal plane (C)
– spectral sensitivity from 0.3 to 0.9 µm i.e. UV/vis/NIR
From: http://www.ccrs.nrcan.gc.ca/ccrs/learn/tutorials/fundam/chapter2/chapter2_7_e.html
13
Photography: parameters
• Large and small apertures in camera system
• Aperture compared to diameter of lens
FROM: http://cdoswell.com/tips2.htm
14
Photography: parameters
• Focal length of photographic system
• Pros and cons
• Amount of light vs. depth of field
FROM: http://cdoswell.com/tips2.htm
15
Optical mechanisms: e.g. MSS
• MSS optical system uses reflecting (Cassegrain) telescope
– concave primary mirror with hole in centre
– convex focusing (secondary) mirror
– equivalent to a lens of focal length f = 82 cm
[Figure: telescope cross-section; labels: detector plane, principal plane, 23 cm, 9 cm diam mirror]
16
Photography: parameters
• Normally adjust 4 parameters
– focus - by altering position of focusing lens relative to focal plane
– F-stop (f-number), defined as f/d i.e. focal length / effective diameter of lens opening
– Shutter speed
• e.g. 1/2000, 1/1000, 1/500 .... 1/2, 1/1, 2/1, 4/1 seconds
• Faster shutter = less motion blur, but less light
– Film “speed” - exposure level over which film responds (ISO/ASA number)
• Faster film responds to lower light BUT poorer spatial resolution
• ISO 25-100 (slow), 200-1000 (faster)
17
Photography: parameters
• General film exposure equation (after Jensen, 2000): E = s d^2 t / (4 f^2) (see the sketch below)
– E = exposure in Joules (J) mm^-2, s = intrinsic scene brightness in J mm^-2 s^-1, d = diameter of lens opening in mm, t = exposure time in seconds, f = lens focal length in mm
– So E is a measure of recorded energy
• E increases with d^2, s and t
• E decreases with f^2
– Note that any lens system is diffraction limited i.e. can't resolve objects smaller than s/D
• s = distance of object from object-side focal point; D = demagnification (altitude/focal length i.e. D = 1/magnification = s/f)
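A small numerical sketch of the exposure relation above. The scene brightness and focal length values are purely illustrative; the point is simply that each larger F-stop halves E (via d^2) while f^2 in the denominator penalises long focal lengths:

```python
# Film exposure E = s * d^2 * t / (4 * f^2), with units as defined on the slide.
def exposure(s, d, t, f):
    """E in J mm^-2: s in J mm^-2 s^-1, d and f in mm, t in seconds."""
    return s * d**2 * t / (4.0 * f**2)

s = 0.5           # hypothetical scene brightness, J mm^-2 s^-1
f = 152.0         # typical aerial-camera focal length, mm (illustrative)
t = 1.0 / 500     # shutter speed, s

for f_stop in (4, 5.6, 8):
    d = f / f_stop                  # F-stop = f/d, so d = f / F-stop
    print(f"f/{f_stop}: E = {exposure(s, d, t, f):.2e} J mm^-2")
```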
18
Photography
• Historical archives of photography
– many military applications now declassified
– e.g. Surveillance (U2, Cuba, Bay of Pigs.....)
– Vietnam, N. Korea etc. etc.
From: Dr. S. Lewis, PhD thesis, 2003 UCL.
19
Time-resolved signals: LIDAR
• Light Detection And Ranging
– optical wavelength analogue of RADAR
– active remote sensing
– used for laser altimetry (height measurement) but also other information
– Why use optical???
• Velocity of light ~ 3x10^8 m s^-1
– one light year = 9.46x10^15 m (~10 trillion km)
– used for cosmological distances BUT also useful for smaller distances
– light travels ~30cm in 1 nanosecond (10^-9 s)
20
Time-resolved signals: LIDAR
Laser footprint
LIDAR – light detection and ranging - optical equivalent of RADAR
First/last (discrete) return LIDAR
Full waveform LIDAR: more information BUT harder to generate & interpret
See Baltsavias paper for lidar equations
21
Waveform LIDAR
• If we can resolve more than just first/last return
– record shape of returning waveform?
– waveform LIDAR
– contains information about e.g. vegetation canopy structure
– requires v. accurate timing information
– again, typically green or red laser wavelengths
From:http://denali.gsfc.nasa.gov/research/laser/slicer/slicer.html
22
Time-resolved signals: LIDAR
• So for LIDAR
– range of target from sensor (and source) comes from time of round trip for a pulse of light (see the sketch below)
– return pulse very weak (function of surface reflectance) & (usually) spread out
• LIDAR
– laser light from source (coherent - narrow range of wavelengths) - typically 670-700nm
– spreads out as it is a wave (e.g. 10 to 100m spots on surface)
– roughness variation within spot (IFOV) means energy returns sooner from some bits than others
– needs short, powerful laser pulses
• safety?
From: http://www.nasa.gov/offices/oce/appel/knowledge/publications/VCL.html
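A minimal sketch of the range-from-round-trip-time idea above (c ≈ 3x10^8 m s^-1; the 4 µs delay is a hypothetical example):

```python
# Lidar range from the round-trip time of a pulse: range = c * t / 2
# (divide by 2 because the pulse travels out and back).
C = 3.0e8    # speed of light, m/s

def range_from_time(t_round_trip_s):
    return C * t_round_trip_s / 2.0

print(range_from_time(4e-6))   # a 4 microsecond round trip -> 600 m range
print(C * 1e-9)                # ~0.3 m: light travels ~30 cm in 1 ns (one-way)
```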
23
Lidar signal: single birch tree
More examples at:
http://www2.geog.ucl.ac.uk/~mdisney/3Dmovies/
24
Lidar signal: single birch tree, materials
More examples at:
http://www2.geog.ucl.ac.uk/~mdisney/3Dmovies/
25
E.g. First/last return LIDAR data
• Structural information from LIDAR
• Possibly in situ laser scanning
• Information?
– canopy height
– canopy gap fraction and vertical profile of foliage
26
E.g. Waveform LIDAR data
• Canopy height AND density information
– intensity of return related to density
– from http://ltpwww.gsfc.nasa.gov/eib/projects/airborne_lidar/slicer.html
27
LIDAR missions?
• SLICER
– Scanning Lidar Imager of Canopies by Echo Recovery
– http://denali.gsfc.nasa.gov/research/laser/slicer/slicer.html
• MOLA
– Mars Orbiter Laser Altimeter on Mars Global Surveyor
– v. accurate info on Martian topography
– clues to geological formation
• GLAS
– Geoscience Laser Altimeter System on ICESat
• Altimetry uses only first and last return signal
28
ICESat (aka Laser Altimetry Mission): The Ice, Cloud, and land Elevation Satellite
• Launched Jan 12, 2003
– Jan 15, 2003 Earth pointing
• Measures
– ice sheet elevations
– changes in elevation through time
– height profiles of clouds and aerosols
– land elevations
– vegetation cover
– approximate sea-ice thickness
• Geoscience Laser Altimeter System (GLAS) - sole instrument
• Combines surface lidar with dual-wavelength cloud and aerosol lidar
Images and info from http://icesat.gsfc.nasa.gov/
29
Time-resolved signals: LIDAR
• VCL didn’t get launched
– NASA budget cuts
– http://earthobservatory.nasa.gov/Library/VCL/VCL.html
– http://www.geog.umd.edu/vcl/vcltext.html
• ASCOPE – Proposed ESA Explorer mission (didn’t get selected)
– http://www.esa.int/esaCP/SEMHQH9ATME_index_0.html
• DESDynI – Deformation, Ecosystem Structure and Dynamics of Ice
– L-band interferometric SAR
– Canopy lidar
• But being applied in airborne projects
– rapid way to generate information on standing biomass
– Wood volume per hectare
• Used in carbon studies
• useful for forestry, inventory etc. etc.
From: http://earthobservatory.nasa.gov/Library/VCL/VCL_2.html
30
E.g. Waveform LIDAR modelling
• Use Monte Carlo Ray Tracing to model LIDAR signal of GLAS / ICESat
• Images courtesy of U. Heyder
31
Simulating spaceborne LIDAR: ASCOPE
Based on field measurements in UK, Sweden, Finland
See Disney et al (2009) IEEE TGRSS, ASCOPE paper
32
Ground-based laser scanning?
• Tripod-mounted LIDAR
– developed for surveying
– BUT has uses for collecting information on forest density and structure
– Typically records a point cloud from several known locations, then uses software to reconstruct the scene in 3D
From: http://www.geospatialonline.com/geospatialsolutions/article/articleDetail.jsp?id=65014&pageID=4
33
The next generation! ECHIDNA
• Scanning (multi-beam) ground-based LIDAR
– Developed by Jupp et al. at CSIRO (Aus.) specifically for vegetation
From talk by D. Jupp at ISPMSRS, Beijing, October 17-19 2005.
34
ECHIDNA
From talk by D. Jupp at ISPMSRS, Beijing, October 17-19 2005.
35
ECHIDNA
• Generalises hemispherical information
• But much more than for photography (can discriminate canopy components)
From talk by D. Jupp at ISPMSRS, Beijing, October 17-19 2005.
36
ECHIDNA
From talk by D. Jupp at ISPMSRS, Beijing, October 17-19 2005.
37
Simulating ground-based (canopy) LIDAR
• Hemispherical full waveform terrestrial laser scanner (Abisko, Sweden)
– generates volumetric canopy data
• Echidna: ground-based full-waveform scanning (White Fir, Sierra Nevada; A. Strahler)
Jupp et al. (2009) Estimating forest LAI profiles and structural parameters using a ground-based laser called Echidna, Tree Physiology 29(2) 171-181
Steve Hancock, EPSRC / NCEO
38
LIDAR sounding (up/down)
• For studying atmospheric aerosols, clouds etc.
– use backscatter properties of atmosphere
– e.g. LITE (1994 Shuttle mission)
– upward looking? e.g. ELF
– coherence of laser gives narrow beam
– better azimuthal sampling than thermal, RADAR
From: http://alg.umbc.edu/elf/elf.html
39
Ground-based: GPR
• Ground penetrating RADAR
– gives v. accurate information on sub-surface density and structure
– used e.g. for surveying hidden pipes
– archaeology
• hidden graves
• dinosaur tracks!
– geophysics
• ice and snow density & movement
– hidden objects?
• landmines...
From: www.geomodel.com & http://www.du.edu/~lconyer/picketwire_canyonlands_dinosaur_.htm
40
Summary
• Sensor scanning mechanisms
– Limitations (dwell-time/SNR, scan rate)
– Striping of detector lines and arrays
– CCD
• Photography
– Becoming less widely-used but still some applications
• Time-resolved: LiDAR
– For altimetry AND imaging (veg. structure) – higher vertical resolution than
RADAR
• Ground-based
– Upward-looking for atmospheric studies
– GPR for sub-surface surveying: archaeology, geophysical dynamics
41
REVISION
MISCELLANEOUS EXAMPLES, TOPICS
42
Revision: orbits and swaths
• Example: polar orbiter period, if h = 705x10^3 m
– T = 2π[(6.38x10^6 + 705x10^3)^3 / (6.67x10^-11 x 5.983x10^24)]^1/2
– T = 5930.6 s = 98.8 mins
• Example: show separation of successive ground tracks ~3000 km (see the sketch below)
– Earth angular rotation rate ω = 2π/(24x60x60) = 7.27x10^-5 rad s^-1
– so in 98.8 mins, a point on the surface moves through θ = 98.8 x 60 x 7.27x10^-5 = 0.431 rads
– remember l = rθ for arc of circle radius r & θ in radians
– so l = (Earth radius + sat. altitude) x θ = (6.38x10^6 + 705x10^3) x 0.431 = 3054 km
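A short Python check of this worked example, using the same constants as the slide:

```python
# Polar orbiter period and ground-track separation (constants from the slide).
import math

G = 6.67e-11        # gravitational constant, N m^2 kg^-2
M_E = 5.983e24      # Earth mass, kg
R_E = 6.38e6        # Earth radius, m
h = 705e3           # satellite altitude, m

T = 2 * math.pi * math.sqrt((R_E + h)**3 / (G * M_E))
print(f"T = {T:.1f} s = {T / 60:.1f} min")                  # ~5931 s ~ 98.8 min

omega_earth = 2 * math.pi / (24 * 3600)                     # Earth rotation rate, rad/s
theta = T * omega_earth                                     # rotation during one orbit, rad
print(f"ground-track separation ~ {(R_E + h) * theta / 1e3:.0f} km")   # ~3050 km, as on the slide
```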
43
Revision: Planck’s Law
• Planck was able to explain energy spectrum of blackbody
• Based on quantum theory rather than classical mechanics

E(\lambda) = \frac{2\pi c^2 h}{\lambda^5} \cdot \frac{1}{e^{hc/\lambda k T} - 1}

• dE(λ)/dλ = 0 gives the constant of Wien's Law (see the sketch below)
• Integrating E(λ) over all λ results in the Stefan-Boltzmann relation
• Blackbody energy is a function of λ and T
http://www.tmeg.com/esp/e_orbit/orbit.htm
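A short Python sketch of Planck's law as written above, also using Wien's law to locate the peak for Sun-like and Earth-like temperatures (physical constants are standard values, not taken from the slide):

```python
# Spectral exitance of a blackbody:
# E(lambda) = 2*pi*h*c^2 / (lambda^5 * (exp(h*c/(lambda*k*T)) - 1))
import math

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m/s
K = 1.381e-23    # Boltzmann constant, J/K

def planck_exitance(wavelength_m, T):
    """Spectral exitance in W m^-2 m^-1 for a blackbody at temperature T (K)."""
    return (2 * math.pi * H * C**2 / wavelength_m**5
            / (math.exp(H * C / (wavelength_m * K * T)) - 1))

# Wien's law gives the peak wavelength: lambda_max = 2898 um K / T
for T in (5770, 300):                      # Sun-like and Earth-like temperatures
    lam_max = 2.898e-3 / T                 # metres
    print(f"T = {T} K: peak at {lam_max * 1e6:.2f} um, "
          f"E(peak) = {planck_exitance(lam_max, T):.3e} W m^-2 m^-1")
```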
44
Revision: Planck’s Law
• Explains/predicts shape of blackbody curve
• Used to predict how much energy lies between given wavelengths
• Crucial for remote sensing
http://hyperphysics.phy-astr.gsu.edu/hbase/bbrc.html#c1
45
Atmospheric “windows”
• As a result of strong wavelength (λ) dependence of absorption
• Some λ regions totally unsuitable for remote sensing as most radiation is absorbed
46
Revision: the atmosphere
• SCATTERING: caused by presence of particles (soot, salt, etc.) and/or large gas molecules present in the atmosphere
– Rayleigh, Mie, non-selective
• ABSORPTION: gaseous components (CO2, CO, CH4, H2O etc.)
– very strong function of wavelength
– atmospheric windows
47
Revision: the surface: BRDF
• Reflectance of most real surfaces is a function not only of λ, but of viewing and illumination angles
• Described by the Bi-Directional Reflectance Distribution Function (BRDF)
• BRDF of area A defined as: ratio of incremental radiance, dLe, leaving surface through an infinitesimal solid angle in direction Ω(θv, φv), to incremental irradiance, dEi, from illumination direction Ω'(θi, φi) i.e.

BRDF(\Omega, \Omega') = \frac{dL_e(\Omega, \Omega')}{dE_i(\Omega')} \quad [sr^{-1}]

• Ω is the viewing vector, (θv, φv) are view zenith and azimuth angles; Ω' is the illumination vector, (θi, φi) are illumination zenith and azimuth angles
• So in the sun-sensor example, Ω is the position of the sensor and Ω' is the position of the sun
After: Jensen, J. (2000) Remote Sensing of the Environment: an Earth Resources Perspective.
48
Revision: the surface: BRDF
• Note that BRDF is defined over infinitesimally small solid angles Ω, Ω' and wavelength interval δλ, so cannot be measured directly
• In practice, measure over some finite solid angle ΔΩ and Δλ and assume the BRDF is valid over these
[Figure] Configuration of viewing and illumination vectors in the viewing hemisphere, with respect to an element of surface area, A (labels: viewer, exitant solid angle, incident solid angle, incident diffuse radiation, direct irradiance (Ei) vector, surface tangent vector).
49
From: http://www.geog.ucl.ac.uk/~mdisney/phd.bak/final_version/final_pdf/chapter2a.pdf
Revision: examples
• Planck function
– Gravitational force Fg = G ME ms / RsE^2
• where G is the universal gravitational constant (6.67x10^-11 N m^2 kg^-2); ME is Earth mass (5.983x10^24 kg); ms is satellite mass (?) and RsE is distance from Earth centre to satellite i.e. 6.38x10^6 + h, where h is satellite altitude
– Centripetal (not centrifugal!) force Fc = ms vs^2 / RsE
• where vs is linear speed of satellite (= ωs RsE, where ωs is the satellite angular velocity, rad s^-1)
– for stable (constant radius) orbit Fc = Fg
– G ME ms / RsE^2 = ms vs^2 / RsE = ms ωs^2 RsE^2 / RsE
– so ωs^2 = G ME / RsE^3
From:http://csep10.phys.utk.edu/astr161/lect/history/kepler.html
50
Revision problems: Planck’s Law
• Fractional energy from 0 to λ, F0→λ? Integrate Planck function
• Note Eb(λ,T), emissive power of blackbody at λ, is a function of the product λT only, so....

F_{0 \to \lambda}(\lambda T) = \frac{E_{0 \to \lambda}(\lambda, T)}{\sigma T^4} = \int_0^{\lambda T} \frac{E_b(\lambda, T)}{\sigma T^5} \, d(\lambda T)

i.e. radiant energy from 0 to λ, divided by the total radiant energy from λ = 0 to λ = ∞
51
Revision: Planck’s Law example
• Q: what fraction of the total power radiated by a black body at 5770 K falls in the UV (0 → 0.38 µm)? (see the sketch below)
• Need a table of integral values of F0→λ
• So, λT = 0.38 µm x 5770 K = 2193 µm K
• Or 2.193x10^3 µm K, i.e. between 2 and 3 (x10^3)
• Interpolate between F0→λ(2x10^3) and F0→λ(3x10^3):

\frac{F_{0 \to 0.38}(\lambda T) - F_{0 \to 0.38}(2 \times 10^3)}{F_{0 \to 0.38}(3 \times 10^3) - F_{0 \to 0.38}(2 \times 10^3)} = \frac{2.193 - 2}{3 - 2} = 0.193

• Finally, F0→0.38 = 0.193 x (0.273 - 0.067) + 0.067 = 0.11
• i.e. ~11% of total solar energy lies in UV between 0 and 0.38 µm

λT (µm K x10^3)    F0→λ(λT) (dimensionless)
2                  0.067
3                  0.273
4                  0.481
5                  0.634
6                  0.738
8                  0.856
10                 0.914
12                 0.945
14                 0.963
16                 0.974
18                 0.981
20                 0.986
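A Python check of this worked example: the coarse table interpolation above, plus a direct numerical integration of Planck's law. The integration gives ~0.10, showing that interpolating between the widely spaced 2x10^3 and 3x10^3 table entries slightly overestimates the fraction:

```python
# Fraction of blackbody power below 0.38 um at 5770 K: table interpolation vs. integration.
import math

H, C, K, SIGMA = 6.626e-34, 2.998e8, 1.381e-23, 5.67e-8

lam_T = 0.38 * 5770 / 1e3                    # = 2.193 (in units of 10^3 um K)
F_interp = 0.067 + (lam_T - 2.0) / (3.0 - 2.0) * (0.273 - 0.067)
print(f"interpolated fraction ~ {F_interp:.3f}")     # ~0.107, i.e. ~11% as on the slide

def planck(lam, T):
    """Blackbody spectral exitance, W m^-2 m^-1."""
    x = H * C / (lam * K * T)
    if x > 700:                  # avoid overflow; the contribution there is negligible anyway
        return 0.0
    return 2 * math.pi * H * C**2 / (lam**5 * (math.exp(x) - 1))

T, n = 5770.0, 2000
dlam = 0.38e-6 / n               # integrate 0 -> 0.38 um with a simple midpoint rule
F_numeric = sum(planck((i + 0.5) * dlam, T) for i in range(n)) * dlam / (SIGMA * T**4)
print(f"integrated fraction  ~ {F_numeric:.3f}")     # ~0.10
```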
52
Orbits: examples
• Orbital period for a given instrument and height?
– Gravitational force Fg = G ME ms / RsE^2
• where G is the universal gravitational constant (6.67x10^-11 N m^2 kg^-2); ME is Earth mass (5.983x10^24 kg); ms is satellite mass (?) and RsE is distance from Earth centre to satellite i.e. 6.38x10^6 + h, where h is satellite altitude
– Centripetal (not centrifugal!) force Fc = ms vs^2 / RsE
• where vs is linear speed of satellite (= ωs RsE, where ωs is the satellite angular velocity, rad s^-1)
– for stable (constant radius) orbit Fc = Fg
– G ME ms / RsE^2 = ms vs^2 / RsE = ms ωs^2 RsE^2 / RsE
– so ωs^2 = G ME / RsE^3
From:http://csep10.phys.utk.edu/astr161/lect/history/kepler.html
53
Orbits: examples
• Orbital period T of satellite (in s) = 2π/ω
– (remember 2π = one full rotation, 360°, in radians)
– and RsE = RE + h where RE = 6.38x10^6 m
– So now T = 2π[(RE + h)^3 / (G ME)]^1/2
• Example: geostationary altitude? T = 24 hrs (see the sketch below)
– Rearranging: h = [(G ME / 4π^2) T^2]^1/3 - RE
– So h = [(6.67x10^-11 x 5.983x10^24 / 4π^2)(24 x 60 x 60)^2]^1/3 - 6.38x10^6
– h = 42.2x10^6 - 6.38x10^6 = 35.8x10^6 m (~35,800 km)
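A quick Python check of this rearrangement (same constants as the slide; a 24-hour period is used rather than the sidereal day):

```python
# Geostationary altitude from T = 2*pi*sqrt((R_E + h)^3 / (G * M_E)), rearranged for h.
import math

G, M_E, R_E = 6.67e-11, 5.983e24, 6.38e6
T = 24 * 60 * 60                                   # orbital period, s

h = (G * M_E * T**2 / (4 * math.pi**2))**(1 / 3) - R_E
print(f"h = {h / 1e6:.1f} x 10^6 m (~{h / 1e3:.0f} km)")   # ~35.9e6 m, i.e. roughly 36,000 km
```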
54