CMPUT 412
Sensing
Csaba Szepesvári
University of Alberta
1
Defining sensors and actuators
[Diagram: the controller (= agent) sends actions to the actuators, which act on the environment; the sensors return sensations (and reward) from the environment to the controller.]
2
Perception
Sensors
Uncertainty
Features
3
How are sensors used?
4
HelpMate,
Transition Research Corp.
5
B21, Real World Interface
6
Robart II, H.R. Everett
7
Savannah River Site Nuclear
Surveillance Robot
8
BibaBot, BlueBotics SA,
Switzerland
Omnidirectional Camera
Pan-Tilt Camera
IMU (Inertial Measurement Unit)
Sonar Sensors
Emergency Stop Button
Laser Range Scanner
Wheel Encoders
Bumper
9
Taxonomy of sensors
10
Classification of Sensors
Where is the information coming from?
Inside: Proprioceptive sensors
motor speed, wheel load, heading of the robot,
battery status
Outside: Exteroceptive sensors
distances to objects, intensity of the ambient light,
unique features
How does it work? Requires energy emission?
No: Passive sensors
temperature probes, microphones, CCD
Yes: Active sensors
Controlled interaction -> better performance
Interference
Simple vs. composite
(sonar vs. wheel sensor)
11
General Classification (1)
12
General Classification (2)
13
Sensor performance
14
How Do (Simple) Sensors Work?
[Diagram: environment input -> physical process -> output as an analog signal (electrical current) -> analog-to-digital conversion -> digital signal (bit stream, e.g., 00101011010100...)]
15
Mathematical Models
Signal in => signal out: response
Memoryless: V_out = S(E_in, Noise_t)
With memory: V_out = f(V_out, E_in, Noise_t) (the new output depends on the previous output)
Sampling rate, aliasing, dithering
[Diagram repeated: environment input -> physical process -> analog signal -> A/D conversion -> digital signal]
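To make the memoryless model concrete, here is a minimal Python sketch (not from the slides), assuming a linear response S(E) = gain·E with additive Gaussian noise followed by a uniform A/D converter; the gain, noise level, and converter range are invented for illustration.

```python
import random

def memoryless_sensor(e_in, gain=2.0, noise_std=0.05):
    """V_out = S(E_in, Noise_t): assumed linear response plus additive Gaussian noise."""
    return gain * e_in + random.gauss(0.0, noise_std)

def adc(v, v_min=0.0, v_max=10.0, bits=10):
    """Uniform analog-to-digital conversion: clip to the valid range, then quantize."""
    v = min(max(v, v_min), v_max)
    levels = 2 ** bits - 1
    return round((v - v_min) / (v_max - v_min) * levels)

# Sample the (noisy) analog output at a few input energies and digitize it.
readings = [adc(memoryless_sensor(e)) for e in (0.5, 1.0, 2.0, 4.0)]
print(readings)
```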
16
Nominal Sensor Performance
Valid inputs
Emin: Minimum detectable energy
Emax: Maximum detectable energy
Dynamic range = Emax/Emin, or 10 log10(Emax/Emin) [dB]
Power or voltage measurement? (V² ~ power, so use 20 log10 for voltage ratios)
Operating range (Nmin, Nmax): Emin ≤ Nmin ≤ Nmax ≤ Emax
No aliasing in the operating range (e.g., distance sensors)
Response
Sensor response: S(Ein)=?
Linear? (or non-linear)
Hysteresis
Resolution (Δ):
|E1 − E2| ≤ Δ ⇒ S(E1) ≈ S(E2); often Δ = min(Emin, Δ_A/D)
Timing
Response time (range): delay between input and output [ms]
Bandwidth: number of measurements per second [Hz]
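A small sketch of the dynamic-range arithmetic above: 10 log10 applies to power-like quantities and 20 log10 to voltage-like ones (since V² ~ power). The example values are arbitrary.

```python
import math

def dynamic_range_db(e_max, e_min, power_quantity=True):
    """Dynamic range in dB: 10*log10(Emax/Emin) for power, 20*log10 for voltage."""
    factor = 10.0 if power_quantity else 20.0
    return factor * math.log10(e_max / e_min)

# Hypothetical sensor: Emin = 1 mW, Emax = 20 W measured as power.
print(dynamic_range_db(20.0, 0.001))                          # ~43 dB
# The same ratio interpreted as a voltage measurement.
print(dynamic_range_db(20.0, 0.001, power_quantity=False))    # ~86 dB
```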
17
In Situ Sensor Performance: Sensitivity
Characteristics especially relevant for real-world environments
Sensitivity:
How much does the output change with the input?
Memoryless sensors: min{ dS/dE (E_in) | E_in }
Sensors with memory: min{ ∂f/∂E_in (V, E_in) | V, E_in }
Cross-sensitivity
sensitivity to environmental parameters that are orthogonal
to the target parameters
e.g. flux-gate compass responds to ferrous buildings,
orthogonal to magnetic north
Error: ε(t) = S(t) − S(E_in(t))
Systematic: ε(t) = D(E_in(t))
Random: ε(t) is random, e.g., ε(t) ~ N(μ, σ²)
Accuracy (systematic part): 1 − |D(E_in)|/E_in, e.g., 97.5% accuracy
Precision (reproducibility): Range_out / Var(ε(t))^(1/2)
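A hedged sketch of how accuracy and precision could be estimated from repeated readings of a known input E_in, following the definitions above and assuming a calibrated sensor with nominal response S(E_in) = E_in; the readings and the output range are invented.

```python
import statistics

def accuracy_and_precision(readings, e_in, range_out):
    """Accuracy = 1 - |D(E_in)|/E_in with D the mean (systematic) error;
    precision = Range_out / std(error)."""
    errors = [r - e_in for r in readings]       # epsilon(t), assuming S(E_in) = E_in
    d = statistics.mean(errors)                 # systematic part D(E_in)
    accuracy = 1.0 - abs(d) / e_in
    precision = range_out / statistics.stdev(errors)
    return accuracy, precision

# Hypothetical range sensor: true distance 2.00 m, five repeated readings, 5 m output range.
print(accuracy_and_precision([2.03, 2.05, 2.02, 2.06, 2.04], 2.00, 5.0))
```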
18
In Situ Sensor Performance: Errors
Characteristics especially relevant for real-world environments
Error: ε(t) = S(t) − S(E_in(t))
Systematic: ε(t) = D(E_in(t))
Predictable, deterministic
Examples:
Calibration errors of range finders
Unmodeled slope of a hallway floor
Bent stereo camera head due to an earlier collision
Random: ε(t) is random, e.g., ε(t) ~ N(μ, σ²)
Unpredictable, stochastic
Example:
Thermal noise, e.g., hue and black-level noise in a camera
Accuracy – accounts for systematic errors
1 − |D(E_in)|/E_in,
e.g., 97.5% accuracy
Precision – high precision ~ low noise
Range_out / Var(ε(t))^(1/2)
19
Challenges in Mobile Robotics
Systematic vs. random errors
Error distributions
20
Systematic vs. Random?
Sonar sensor:
Sensitivity to: material, relative positions of sensor and
target (cross-sensitivity)
Specular reflections (smooth sheetrock wall; in general
material, angle)
Systematic or random? What if the robot moves?
CCD camera:
changing illuminations
light or sound absorbing surfaces
Cross-sensitivity of robot sensor to robot pose and
robot-environment dynamics
rarely possible to model -> they appear as random errors
Systematic and random errors may be well defined in a
controlled environment. This is not the case for mobile
robots!
21
Error Distributions
A convenient assumption: ε(t) ~ N(0, Σ)
WRONG!
Sonar (ultrasonic) sensor
Sometimes accurate, sometimes overestimating
Systematic or random? “Operation modes”
Random => bimodal:
- one mode for the case that the signal returns directly
- one mode for the case that the signal returns after multipath reflections
Errors in the output of a stereo vision system
(distances)
Characteristics of error distributions
Uni- vs. Multi-modal,
Symmetric vs. asymmetric
Independent vs. dependent (decorrelated vs.
correlated)
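A minimal sketch of the bimodal sonar error model described above: with some probability the pulse returns directly (small, roughly zero-mean error), otherwise it returns via multipath and the range is overestimated. All mixture parameters are assumptions for illustration.

```python
import random

def sonar_error(p_direct=0.8, sigma_direct=0.02, multipath_bias=0.5, sigma_multipath=0.15):
    """Sample a range error [m] from a two-mode mixture: direct return vs. multipath reflection."""
    if random.random() < p_direct:
        return random.gauss(0.0, sigma_direct)            # direct return: small error
    return random.gauss(multipath_bias, sigma_multipath)  # multipath: overestimated range

samples = [sonar_error() for _ in range(10000)]
print(sum(1 for e in samples if e > 0.2) / len(samples))  # rough fraction in the multipath mode
```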
22
About Some Sensors
Wheel Encoders
Active Ranging
23
Wheel Encoders
24
Wheel/Motor Encoders (1)
Principle: Photo detection + optical grid
Direction of motion: Quadrature encoder
Output: Read values with polling or use interrupts
Resolution: 2000 (->10K) cycles per revolution (CPR).
for higher resolution: interpolation, sine waves
Accuracy: no systematic error (accuracy~100%)
[Figure: rotating optical grid and the resulting pulse train over time]
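As a concrete illustration of quadrature decoding (not from the slides): successive (A, B) channel samples are looked up in a transition table that adds +1 or −1 to the tick count, giving both position and direction of motion. The sign convention and the sample signal are assumptions.

```python
# Quadrature decoding: each valid (previous state, new state) transition of the
# two channels (A, B) contributes +1 or -1 to the tick count.
TRANSITIONS = {
    ((0, 0), (0, 1)): +1, ((0, 1), (1, 1)): +1, ((1, 1), (1, 0)): +1, ((1, 0), (0, 0)): +1,
    ((0, 0), (1, 0)): -1, ((1, 0), (1, 1)): -1, ((1, 1), (0, 1)): -1, ((0, 1), (0, 0)): -1,
}

def count_ticks(states):
    """states: sequence of (A, B) samples read by polling or via interrupts."""
    ticks = 0
    for prev, cur in zip(states, states[1:]):
        ticks += TRANSITIONS.get((prev, cur), 0)   # no-change/invalid transitions count as 0
    return ticks

# One forward electrical cycle followed by one reverse step: 4 - 1 = 3 ticks.
print(count_ticks([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0), (1, 0)]))
```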
25
Wheel/Motor Encoders (2)
Measures position or speed
of the wheels or steering
Use: odometry,
position estimation,
detect slipping/sliding of the wheels
[Figure: encoder construction with scanning reticle, fields, scale, and slits; output waveforms showing a direction change]
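A small sketch (not from the slides) of how decoded ticks could feed odometry: convert counts to wheel travel using the wheel radius and the counts per revolution. The wheel radius and the 4x-decoded 2000 CPR figure are assumptions for illustration.

```python
import math

def ticks_to_distance(ticks, wheel_radius_m=0.05, counts_per_rev=2000 * 4):
    """Distance travelled by the wheel: circumference times revolutions."""
    revolutions = ticks / counts_per_rev
    return 2.0 * math.pi * wheel_radius_m * revolutions

# 8000 counts on a 5 cm-radius wheel with a 2000 CPR encoder read in 4x mode:
print(ticks_to_distance(8000))   # one full revolution ~ 0.314 m
```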
26
Active Range Sensors
27
Range Sensors
Large range distance measurement
-> “range sensors”
Why?
Range information is key for localization and
environment modeling
Cheap
Relatively accurate
How?
Time of flight
Active sensing (sound, light)
28
Time of flight - principles
Time delay of arrival (TDOA)
TDOA – impulses
Sound, light
TDOA – phase shift
Light
Geometry
Triangulation – single light beam
Light
Triangulation – structured light
Light
Light sensor; 1D or 2D camera
29
Time Delay of Arrival
d = v·t
d – distance travelled (computed)
v – speed of propagation (known)
t – time of flight (measured)
[Figure: source & sensor at distance D from the target; the signal travels out and back, so 2D = v·t]
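The round-trip relation 2D = v·t in code form; the speeds used are the rounded figures from these slides.

```python
def tof_distance(t_seconds, v=343.0):
    """Distance to the target from a round-trip time of flight: 2D = v*t."""
    return v * t_seconds / 2.0

print(tof_distance(0.0176))              # sound (~343 m/s): ~3 m
print(tof_distance(20e-9, v=3.0e8))      # light: a 20 ns round trip ~ 3 m
```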
30
TDOA: Limitations
What distances can we measure?
Must wait for the last packet to arrive
before sending out the next one
=> Speed of propagation determines
maximum range!
Speeds
Sound: 0.3 m/ms
Electromagnetic signals (light=laser):
0.3 m/ns, 1M times faster!
3 meters takes..
Sound: 10 ms
Light: 10 ns
.. But technical difficulties => expensive and
delicate sensors
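The range limit implied by the pulse rate can be made concrete; a small sketch, assuming a new pulse may only be sent after the previous echo from the farthest target has returned. The 25 Hz pulse rate is an arbitrary example.

```python
def max_range(pulse_period_s, v):
    """The echo from the farthest target must arrive within one pulse period: 2*R_max = v*T."""
    return v * pulse_period_s / 2.0

# Firing 25 pulses per second:
print(max_range(1 / 25, v=343.0))   # sound: ~6.9 m
print(max_range(1 / 25, v=3.0e8))   # light: 6,000 km (range limited by other factors)
```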
31
TDOA: Errors
Time measurement
Exact time of arrival of the reflected signal
Time of flight measure (laser range sensors)
Opening angle of transmitted beam
(ultrasonic range sensors)
Interaction with the target (surface,
specular reflections)
Variation of propagation speed
Speed of the mobile robot and of the target (if not at a
standstill)
32
Ultrasonic Sensor
33
Ultrasonic (US) Sensor
Transmit a packet of ultrasonic pressure waves
The speed of sound v (≈340 m/s) in air is
v = √(γ·R·T / M)
γ: adiabatic index (sound wave -> compression -> heat)
R: molar gas constant [J/(mol·K)]
M: molar mass [kg/mol]
T: temperature [K]
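A numeric check of the formula above with standard values for air (γ ≈ 1.4, M ≈ 0.029 kg/mol) at room temperature.

```python
import math

def speed_of_sound(T_kelvin, gamma=1.4, R=8.314, M=0.029):
    """v = sqrt(gamma * R * T / M) for an ideal gas (air by default)."""
    return math.sqrt(gamma * R * T_kelvin / M)

print(speed_of_sound(293.15))   # ~343 m/s at 20 degrees C
```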
34
Operation
[Diagram of the measurement cycle: a wave packet (the transmitted sound) is sent out; the analog echo signal is compared against a threshold, ignoring the blanking time right after transmission; the resulting digital echo signal drives an integrator whose output encodes the time of flight.]
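A minimal sketch of the thresholding step in the diagram above: ignore samples during the blanking time (while the transducer is still ringing from transmission), then report the first sample whose amplitude exceeds the threshold as the time of flight. The sample signal and parameters are invented.

```python
def time_of_flight(echo, dt, threshold, blanking_time):
    """echo: sampled amplitude of the received signal; dt: sampling period [s]."""
    for i, amplitude in enumerate(echo):
        t = i * dt
        if t < blanking_time:
            continue                      # still inside the blanking time
        if amplitude > threshold:
            return t                      # first threshold crossing = time of flight
    return None                           # no echo detected

# Synthetic echo sampled at 100 kHz with a return around 17.6 ms (~3 m in air).
echo = [0.0] * 2000
echo[1760:1790] = [0.8] * 30
print(time_of_flight(echo, dt=1e-5, threshold=0.5, blanking_time=1e-3))  # ~0.0176 s
```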
35
Ultrasonic Sensor
Piezo transducer
Frequencies: 40 - 180 kHz
Sound source: piezo/electrostatic transducer
transmitter and receiver separated or not separated
Propagation: cone
opening angles around 20 to 40 degrees
regions of constant depth
segments of an arc (sphere for 3D)
[Figure: typical intensity distribution (amplitude [dB]) of an ultrasonic sensor with an electrostatic transducer; the measurement cone spans roughly −60° to +60°.]
36
Example
37
Imaging with an US
Issues:
Soft (sound-absorbing) surfaces
Surfaces that are far from being
perpendicular to the direction of
the sound -> specular reflection
a) 360° scan
b) results from different geometric primitives
38
Characteristics
Range: 12 cm – 5 m
Accuracy: 98% – 99.1%
Single-sensor operating speed: 50 Hz
(3 m -> 20 ms -> 50 measurements per second)
Multiple sensors:
cycle time -> 0.4 s -> 2.5 Hz
-> limits the speed of motion (collisions)
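A sketch of the timing arithmetic on this slide: the per-measurement time follows from the maximum range and the speed of sound, and firing several sensors in sequence divides the update rate accordingly. The 20-sensor ring is an assumption chosen to roughly reproduce the ~0.4 s cycle quoted above.

```python
def sonar_rates(max_range_m, n_sensors, v=343.0):
    """Update rates when each measurement must wait for the echo from maximum range."""
    t_single = 2.0 * max_range_m / v     # round trip to 3 m: ~17.5 ms (slide rounds to 20 ms)
    cycle = n_sensors * t_single         # sensors fired one after another to avoid crosstalk
    return 1.0 / t_single, 1.0 / cycle   # (single-sensor Hz, full-ring Hz)

print(sonar_rates(3.0, 1))    # ~57 Hz for one sensor (~50 Hz with the rounded 20 ms)
print(sonar_rates(3.0, 20))   # assumed 20-sensor ring: cycle ~0.35-0.4 s -> ~2.5-2.9 Hz
```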
39
Laser Range Sensor
40
Laser Range Sensor: Physics
Laser:
• low divergence
• well-defined wavelength
41
Time of flight measurement methods
Pulsed laser
Direct measurement of elapsed time
Receiver: picosecond accuracy
Accuracy: centimeters
Beat frequency between a frequency
modulated continuous wave and its
received reflection
Phase shift measurement
Technically easier than the above two
methods
42
Distance from phase-shift
[Figure: transmitted beam s(x) and reflected beam r(x), amplitude [V] plotted against position/phase [m]; the target is at distance d.]
The reflected beam is the transmitted beam folded back at the target: r(x) = s(2d − x)
r(z) = 0 ⇔ s(2d − z) = 0 ⇔ 2d − z = kλ ⇔ z = 2d + k′λ
Since d < λ/2, k′ = 0 ⇒ z = 2d ⇒ θ = 2π·2d/λ, so
d = θλ / (4π)
Ambiguity! d and d + λ/2 give the same θ
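The derivation above in code form: d = θλ/(4π), together with a check that d and d + λ/2 produce the same measured phase (the ambiguity noted on the slide).

```python
import math

def phase_to_distance(theta, wavelength):
    """d = theta * lambda / (4 * pi) for a measured phase shift theta in [0, 2*pi)."""
    return theta * wavelength / (4.0 * math.pi)

def distance_to_phase(d, wavelength):
    """theta = 2*pi * 2d / lambda, wrapped into [0, 2*pi)."""
    return (4.0 * math.pi * d / wavelength) % (2.0 * math.pi)

lam = 60.0                                                        # e.g., 5 MHz modulation (next slide)
print(phase_to_distance(math.pi / 3, lam))                        # 5.0 m
print(distance_to_phase(5.0, lam), distance_to_phase(35.0, lam))  # same phase: the ambiguity
```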
43
Laser Range Sensor
Phase-Shift Measurement
[Figure: the transmitter sends a modulated beam to the target P at distance D; a phase-measurement unit compares the transmitted and reflected beams. Together with the internal path L, the emitted light covers the total distance D′ = L + 2D.]
λ = c/f
c: speed of light (0.3 m/ns)
f: the modulating frequency
D′: distance covered by the emitted light
For f = 5 MHz (as in the AT&T sensor), λ = 60 meters
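A numeric check of λ = c/f for the example on this slide, plus the total path D′ = L + 2D and the resulting unambiguous range λ/2; the L and D values are made up.

```python
c = 3.0e8                 # speed of light [m/s] (0.3 m/ns)

def modulation_wavelength(f_hz):
    return c / f_hz

lam = modulation_wavelength(5e6)   # 5 MHz, as in the AT&T sensor
print(lam)                         # 60 m
print(lam / 2)                     # unambiguous range: 30 m
print(10.0 + 2 * 7.5)              # D' = L + 2D for L = 10 m, D = 7.5 m (made-up numbers)
```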
44
Laser Range Sensor
The uncertainty in the range (phase estimate) is inversely proportional to the square of the
received signal amplitude.
Hence dark, distant objects will not produce as good range estimates as closer, brighter objects.
45
Laser Range Sensor
Typical range image of a 2D laser range sensor with a rotating mirror. The length
of the lines through the measurement points indicates the uncertainties.
46
Triangulation Ranging
Geometry -> distance
Unknown object size: project a known
light pattern onto the environment and
use triangulation
Known object size: triangulation without
projecting light
47
Laser Triangulation (1D)
[Figure: a collimated laser beam travels to the target at distance D; the reflected beam passes through a lens with focal length f and falls on a position-sensitive device (PSD) or linear camera at offset x from the axis; L is the baseline between the laser and the lens, and by similar triangles D = f·L/x.]
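A sketch of the 1D triangulation relation suggested by the figure, D = f·L/x, where f is the lens focal length, L the baseline between laser and lens, and x the position of the spot on the PSD/linear camera; the focal length, baseline, and spot positions below are illustrative assumptions.

```python
def triangulation_distance(x_m, focal_length_m=0.02, baseline_m=0.10):
    """1D laser triangulation: D = f * L / x (similar triangles)."""
    return focal_length_m * baseline_m / x_m

print(triangulation_distance(0.002))   # spot 2 mm off-axis -> D = 1.0 m
print(triangulation_distance(0.001))   # smaller x -> larger distance (2.0 m)
```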
48
Sharp IR Rangers
49
Conclusions
Why & how?
Sensing: Essential to deal with
contingencies in the world
Sensors: Make sensing possible
Anatomy of sensors:
Physics, A/D, characteristics
Wheel encoders
Distance sensors
Time of flight
Triangulation
50