Embedded Networked Sensing Systems: motivations and challenges
Embedded Networked Sensing for Environmental Monitoring: Applications and Challenges
Deborah Estrin
Center for Embedded Networked Sensing (CENS), Director
UCLA Computer Science Department, Professor
Work summarized here is largely that of students and staff at CENS
Embedded Networked Sensing Potential
• Micro-sensors, onboard processing, and wireless interfaces feasible at very small scale--can monitor phenomena "up close"
• Enables spatially and temporally dense environmental monitoring
• Embedded Networked Sensing will reveal previously unobservable phenomena
• Application examples: ecosystems and biocomplexity, marine microorganisms, contaminant transport, seismic structure response
ENS enabled by Networked Sensor Node Developments
• LWIM III (UCLA, 1996): geophone, RFM radio, PIC, star network
• AWAIRS I (UCLA/RSC, 1998): geophone, DS/SS radio, StrongARM, multi-hop networks
• Sensor Mote (UCB, 2000): RFM radio, Atmel microcontroller
• Medusa MK-2 (UCLA NESL, 2002)
Predecessors:
• DARPA Packet Radio program
• USC-ISI Distributed Sensor Network Project (DSN)
ENS: Technology Design Themes
• Long-lived systems that can be untethered (wireless) and unattended
  • Communication will be the persistent, primary consumer of scarce energy resources (Mote: 720 nJ/bit transmit, 4 nJ/op; see the sketch after this list)
  • Autonomy requires robust, adaptive, self-configuring systems
• Leverage data processing inside the network
  • Exploit computation near the data to reduce communication and achieve scalability
  • Collaborative signal processing
  • Achieve desired global behavior with localized algorithms (distributed control)
• "The network is the sensor" (Manges & Smith, Oak Ridge National Labs, 10/98)
  • Requires robust distributed systems of hundreds of physically embedded, unattended, and often untethered devices
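To make the energy arithmetic concrete, here is a back-of-envelope sketch in Python using the quoted Mote figures (720 nJ/bit transmit, 4 nJ/op). The 30-byte raw reading, 10-byte summary, and 2,000-operation processing cost are illustrative assumptions, not measured values.

```python
# Back-of-envelope radio-vs-CPU energy comparison using the figures quoted above.
E_TX_PER_BIT_NJ = 720.0   # energy to transmit one bit (nJ)
E_PER_OP_NJ = 4.0         # energy per processor operation (nJ)

ops_per_bit = E_TX_PER_BIT_NJ / E_PER_OP_NJ
print(f"Transmitting 1 bit costs roughly as much as {ops_per_bit:.0f} CPU operations")

# Illustrative in-network aggregation: a 30-byte raw reading is reduced to a
# 10-byte summary at an assumed cost of 2,000 processor operations.
raw_bits, summary_bits, ops_spent = 30 * 8, 10 * 8, 2000
net_saving_nj = (raw_bits - summary_bits) * E_TX_PER_BIT_NJ - ops_spent * E_PER_OP_NJ
print(f"Net energy saved per reading: {net_saving_nj:.0f} nJ")
```

This is why spending computation near the data to shrink what must be transmitted usually pays off.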
ENS Architecture Drivers
DRIVERS → TECHNICAL CAPABILITIES
• Varied and variable environments → Adaptive Self-Configuring Wireless Systems
• Energy and scalability → Distributed Signal and Information Processing
• Heterogeneity of devices → Networked Info-Mechanical Systems
• Smaller component size and cost → Embeddable Microsensors
CENS Systems under design/construction
• Biology/Biocomplexity
  • Microclimate monitoring
  • Triggered image capture
  • Canopy-net (Wind River Canopy Crane Site)
• Contaminant Transport
  • County of Los Angeles Sanitation Districts (CLASD) wastewater recycling project, Palmdale, CA
• Seismic monitoring
  • 50-node ad hoc, wireless, multi-hop seismic network
  • Structure response in the USGS-instrumented Factor Building with augmented wireless sensors
Ecosystem Monitoring
• Sensor system logical components
  • Tasking, configuration (sample rates, event definition, triggering)
  • Data transport
  • Device management, sample manipulation, and caching with timing
  • Duty cycling
• Other important examples of habitat monitoring systems
  • Berkeley/Intel GDI and Botanical Gardens
Extensible Sensing System (ESS) Software*
• Tiered architecture components
  • Mica2 motes (8-bit microcontrollers running TinyOS) with a Sensor Interface Board hosting in situ sensors
  • Microservers: solar powered, 32-bit processors running Linux
  • Pub/sub bus over 802.11 to databases, visualization and analysis tools, and GUI/Web interfaces
• Data is multicast over the Internet on a publish-and-subscribe bus system (called Subject Servers) to databases, GUIs, other data analysis tools, and clients (see the sketch below)
* Osterweil, Rahimi, Mysore, Wimbrow
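A minimal sketch of the publish-and-subscribe pattern described above, assuming a toy in-process bus; `SubjectBus`, the topic names, and the callback style are hypothetical stand-ins, not the actual ESS Subject Server interface.

```python
# Toy topic-based publish/subscribe bus: microservers publish sensor samples,
# databases and GUIs subscribe. Names here are illustrative, not the ESS API.
from collections import defaultdict
from typing import Callable, Dict, List

class SubjectBus:
    def __init__(self) -> None:
        self._subs: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, subject: str, callback: Callable[[dict], None]) -> None:
        self._subs[subject].append(callback)

    def publish(self, subject: str, sample: dict) -> None:
        for callback in self._subs[subject]:
            callback(sample)

bus = SubjectBus()
# A database front end and a live visualization both subscribe to the same subject.
bus.subscribe("microclimate/temperature", lambda s: print("store:", s))
bus.subscribe("microclimate/temperature", lambda s: print("plot:", s))
# A microserver relays a reading it collected from a Mica2 mote.
bus.publish("microclimate/temperature", {"node": 17, "temp_c": 21.4, "ts": 1066})
```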
Long-lived, Self-configuring Systems
• Challenges: localization & time synchronization, calibration, information transport, aggregation and storage, programming model, event detection
• Hand configuration will fail: irregular deployment and environment, dynamic network topology, scale, variability, maintenance
• Common theme: local adaptation and redundancy
Network Architecture: Can we adapt Internet protocols and "end to end" architecture?
• The Internet routes data using IP addresses in packets and lookup tables in routers
• Humans get data by "naming data" to a search engine
• There are many levels of indirection between name and IP address
• This works well for the Internet, and for support of person-to-person communication
• Embedded, energy-constrained (untethered, small-form-factor), unattended systems can't tolerate the communication overhead of indirection
Directed Diffusion*--Data-Centric Routing
• The data-centric approach has the right scaling properties
  • Name data (not nodes) with externally relevant attributes (data type, time, location of node, SNR, etc.); see the sketch below
  • Diffuse requests and responses across the network using application-driven routing (e.g., geo-sensitive)
  • Support in-network aggregation and processing
• Not end-to-end data delivery
• Not just a database query
* Heidemann et al. SOSP '01, ** Krishnamachari et al. '02
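A minimal sketch of naming data by attributes rather than by node address: an interest is a set of attribute constraints, and a node's data matches when every constraint is satisfied. The attribute names and matching rules are illustrative assumptions, not the actual diffusion filter code.

```python
# Data-centric matching: interests describe the data wanted (type, region, ...),
# not the node that should provide it. Attribute names are illustrative.
def matches(interest: dict, data: dict) -> bool:
    for key, constraint in interest.items():
        if key not in data:
            return False
        if isinstance(constraint, tuple):        # (lo, hi) range constraint
            lo, hi = constraint
            if not (lo <= data[key] <= hi):
                return False
        elif data[key] != constraint:            # exact-match constraint
            return False
    return True

interest = {"type": "temperature", "x": (0, 50), "y": (0, 50)}   # geographic scope
data = {"type": "temperature", "x": 12, "y": 33, "value": 18.5, "snr": 9.1}
print(matches(interest, data))   # True: this node's data satisfies the interest
```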
Diffusion: One Phase Pull*
• An optimized version of general diffusion (Heidemann et al.)
• Pulls data out to only one sink at a time (saves energy)
• Used in the ecosystem application over Mica2 motes: TinyDiffusion (Osterweil et al.)
[Figure: sources deliver data to a sink; legend shows interest, gradient, and routed-data links.]
Voronoi Scoping: Restricted Floods from Multiple Sinks*
• Benefits of multiple sinks
  • Reduce average path length
  • Equalize load over multiple trees
  • Tiered architecture, redundancy
  • BUT: linear increase in interests flooded!
• Voronoi clusters partition the topology; each subset contains the nodes closest to its associated sink (see the sketch below)
• Only forward interests from the closest sink
  • No overlap between floods
  • Motes receive interests from their closest sink
  • Scalable: as both tiers grow, load per mote remains constant
*With Henri Dubois-Ferrière, EPFL
[Figure: live network (EmStar/EmView) with 3 sinks and 55 motes; clusters color-coded.]
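The sketch below illustrates the clustering step, assuming "closest" means fewest hops: each mote is assigned to the sink it can reach in the fewest hops, and it forwards interests only from that sink, so the floods never overlap. The toy topology and BFS hop metric are illustrative, not the EmStar implementation.

```python
# Voronoi scoping sketch: partition motes by hop distance to the nearest sink.
from collections import deque

def hop_counts(adj, source):
    """BFS hop distance from one sink to every reachable node."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def voronoi_clusters(adj, sinks):
    """Assign every node to the sink it can reach in the fewest hops."""
    dists = {s: hop_counts(adj, s) for s in sinks}
    return {node: min(sinks, key=lambda s: dists[s].get(node, float("inf")))
            for node in adj}

# Toy 6-node topology with two sinks, "A" and "B".
adj = {"A": ["1"], "1": ["A", "2"], "2": ["1", "3", "4"],
       "3": ["2", "B"], "4": ["2"], "B": ["3"]}
print(voronoi_clusters(adj, ["A", "B"]))
# Each mote forwards interests only from its assigned sink, so interest floods
# from different sinks stay disjoint and per-mote load stays constant.
```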
Multi-hop data extraction characteristics using TinyDiffusion
• Collected basic network characteristics to verify readiness for sensor deployment
  • Average system loss rates were analyzed over fixed intervals and related to nodes with various average, minimum, and maximum hop counts (under 3% end to end)
  • Additional nodes were deployed to augment the persistent ESS topology to study effects such as the loss experienced by nodes introduced with less ground clearance
• The UCB/Intel GDI deployment has reported good results from its fielded burrow monitoring system using the same Mote platform
Characterizing wireless channels*
• Great variability over distance (50-80% of the communication range, vertical lines)
  • Reception rate is not normally distributed around a mean and standard deviation
  • The real communication channel is not circular
• 5 to 30% asymmetric links
  • Not correlated with distance or transmission power
  • Primary cause: differences in hardware calibration (rx sensitivity, energy levels, etc.)
• Time variations correlated with mean reception rate, not distance from the transmitter
* Cerpa, Busek et al.
Research Challenge: Networked Info-Mechanical Systems (NIMS)*
• NIMS Architecture: robotic, aerial access to the full 3-D environment; enables sample acquisition
• Coordinated Mobility: enables self-awareness of sensing uncertainty
• Sensor Diversity: diversity in sensing resources, locations, perspectives, and topologies; enables reconfiguration to reduce uncertainty and to calibrate
• NIMS Infrastructure: enables speed, efficiency, and low-uncertainty mobility; provides resource transport for sustainable presence
* Kaiser, Pottie, Estrin, Srivastava, Sukhatme, Villasenor
Broadband ad hoc seismic array*
* P. Davis
• Core requirement is multi-hop time synchronization, to eliminate dependence on GPS access at every node
• GPS is the usual way to time-sync data collection, but satellites are blocked in some interesting places: under foliage, underwater, indoors, in canyons
• Sensor networks can propagate time from nodes that have a sky view to those that don't
• Enabling technology: "RBS", a new form of synchronization that exploits the nature of the wireless channel to achieve exceptional precision*
* Elson et al. OSDI 12/02
Time Synchronization in Sensor Networks
• Also crucial in many other contexts
  • Ranging, tracking, beamforming, security, MAC, aggregation, etc.
• Global time not always needed
• NTP: often not accurate or flexible enough; diverse requirements!
• New ideas
  • Local timescales
  • Receiver-receiver sync (see the sketch below)
  • Multihop time translation
  • Post-facto sync
• Mote implementation
  • ~10 µs error over a single hop
  • Error grows slowly over hops
[Figure: a sender's broadcast reaches two receivers over the shared physical medium; each receiver records only its own local receive time ("I saw it at t=4" / "I saw it at t=5"), so sender-side delay drops out. A multi-hop topology of nodes spanning several broadcast domains (A-D) illustrates translating timestamps hop by hop.]
* Elson et al. OSDI 12/02
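The receiver-receiver idea behind RBS can be sketched as follows: two receivers timestamp the same reference broadcasts with their own clocks, and a least-squares fit over those pairs recovers the offset and skew between their timescales, with the sender's send-time jitter dropping out entirely. The timestamps below are made up for illustration.

```python
# RBS-style receiver-receiver synchronization sketch: fit one receiver's clock
# against another's using shared reference broadcasts (timestamps are invented).
def fit_clock(t_a, t_b):
    """Least-squares fit t_b ≈ skew * t_a + offset from paired local timestamps."""
    n = len(t_a)
    mean_a, mean_b = sum(t_a) / n, sum(t_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(t_a, t_b))
    var = sum((a - mean_a) ** 2 for a in t_a)
    skew = cov / var
    offset = mean_b - skew * mean_a
    return skew, offset

# Local receive times (ms) of the same five broadcasts at receivers A and B.
t_a = [100.0, 200.0, 300.0, 400.0, 500.0]
t_b = [152.1, 252.2, 352.2, 452.3, 552.4]
skew, offset = fit_clock(t_a, t_b)
print(f"skew={skew:.6f}, offset={offset:.2f} ms")
# A's timestamps can now be translated into B's timescale; chaining such
# translations across overlapping broadcast domains gives multi-hop sync.
```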
Contaminant Transport Monitoring: Palmdale Pivot Study*
• Regulators require proof that the nitrate-laden treated water will not impact groundwater if used for irrigation
• A vertical array of sensors will measure the rate of diffusion of water and nitrate levels
  • Monitoring wells cost $75K each
• Observed nitrate levels and a local model will contribute to a field-wide estimate of hazardous nitrate levels
• The field-wide estimate of concentrations and trends is fed back to sprinkler quantity
* T. Harmon
Research Challenge: Distributed Representation, Storage, Processing
• In-network interpretation of spatially distributed data
• Statistical or model-based filtering
• In-network "event" detection and reporting
• Direct queries towards nodes with relevant data (see the sketch below)
• Trigger autonomous behavior based on events
  • Expensive operations: high-end sensors or sampling
  • Robotic sensing, sampling
• Support for pattern-triggered data collection
• Multi-resolution data storage and retrieval
  • Index data for easy temporal and spatial searching
• Spatial and temporal pattern matching
  • Trigger in terms of global statistics (e.g., distribution)
• Exploit tiered architectures
[Figure: key-value (K,V) pairs stored across nodes of the network over time.]
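As one hedged illustration of directing queries toward nodes with relevant data, the sketch below hashes a (data type, time window) key to a rendezvous node so that producers and queriers meet without flooding. The hashing scheme and node list are assumptions for illustration, not the specific CENS storage design.

```python
# Rendezvous-by-hashing sketch: store and look up named data at the same node.
import hashlib

NODES = [f"node-{i}" for i in range(8)]            # hypothetical node IDs

def rendezvous_node(data_type: str, hour: int) -> str:
    key = f"{data_type}:{hour}".encode()
    digest = int(hashlib.sha1(key).hexdigest(), 16)
    return NODES[digest % len(NODES)]

# A producer stores a detected event; a later query is routed to the same node.
store_at = rendezvous_node("nitrate-spike", hour=14)
query_at = rendezvous_node("nitrate-spike", hour=14)
assert store_at == query_at
print("producer and querier rendezvous at", store_at)
```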
Tiered Data Processing*
• Processing uses a two-tiered network
  • The task is divided into local computation and cluster-head computation
• The scope of local computation depends on the relative cost of local (blue-blue) and cluster-head (blue-red) communication
• Example: identify regions over which a large gradient is occurring (see the sketch below)
  • Locally, large gradients are detected and traversed (up to some scope)
  • Gradient paths over a length threshold are identified and reported
  • Each cluster head combines identification results and classifies
* T. Schoellhammer, et al.
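A minimal sketch of the gradient example under the two-tier split described above: each node compares readings with its immediate neighbors locally, and only links whose difference exceeds a threshold are reported to the cluster head, which merges them into a region. The readings, topology, and threshold are illustrative assumptions.

```python
# Two-tier gradient detection sketch (values and topology are invented).
THRESHOLD = 5.0
readings = {1: 20.1, 2: 20.4, 3: 27.9, 4: 28.3, 5: 20.2}
neighbors = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}

# Tier 1 (local computation): each node flags neighbor links with a large gradient.
local_reports = []
for node, nbrs in neighbors.items():
    for nbr in nbrs:
        if nbr > node and abs(readings[node] - readings[nbr]) > THRESHOLD:
            local_reports.append((node, nbr))

# Tier 2 (cluster head): combine the per-node reports into a region summary.
region_nodes = sorted({n for edge in local_reports for n in edge})
print("steep-gradient links:", local_reports)
print("region flagged by the cluster head:", region_nodes)
```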
Research Challenge: Calibration, or lack thereof
• Storage, forwarding, aggregation, and triggering are useless unless data values are calibrated
• Calibration = correcting systematic errors
  • Sources of error: noise, systematic error
  • Causes: manufacturing, environment, age, crud
• Traditional in-factory calibration is not sufficient
  • Must account for the coupling of sensors to the environment
• Significant concern that faulty sensors can wreak havoc on in-network processing
• Nearer term: identify faulty sensors and flag their data, discarding it for in-network processing (see the sketch below)
[Figure panels: "Un-calibrated Sensors"; "Factory Calibrated Sensors: T0", where every node reads 72º; "Factory Calibrated Sensors; Later", where readings have drifted to roughly 61º-85º, e.g., on a dust-covered sensor.]
* Bychkovskiy, Megerian, Potkonjak
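A minimal sketch of the nearer-term approach above: flag any sensor whose reading departs too far from the median of its neighbors, and exclude it from in-network aggregation. The readings, neighborhoods, and tolerance are illustrative assumptions, not the cited calibration work.

```python
# Neighborhood-median fault flagging sketch (all values are invented).
from statistics import median

TOLERANCE = 3.0   # degrees
readings = {"a": 72.1, "b": 71.8, "c": 72.4, "d": 85.0, "e": 72.0}
neighbors = {"a": ["b", "c", "e"], "b": ["a", "c", "d"], "c": ["a", "b", "e"],
             "d": ["b", "e"], "e": ["a", "c", "d"]}

flagged = []
for node, nbrs in neighbors.items():
    reference = median(readings[n] for n in nbrs)
    if abs(readings[node] - reference) > TOLERANCE:
        flagged.append(node)

print("suspect sensors (data flagged, excluded from in-network processing):", flagged)
```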
Research Challenge: Macroprogramming*
• How to specify what, where, and when? (See the sketch below.)
  • Data modality and representation, spatial/temporal resolution, frequency, and extent
• How to describe desired processing?
  • Aggregation, interpolation, model parameters
  • Triggering across modalities and nodes
  • Adaptive sampling
• Primitives
  • Annotated topology/resource discovery
  • Region identification and characterization
  • Intra-region coordination/synchronization
  • System health data, alerts
  • Topology, resources (energy, link, storage)
  • Sensor data management (buffering, timing)
  • …
* Greenstein, Culler, Kohler…
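One hedged way to picture "what, where, and when" is a declarative task specification that a macroprogramming layer could compile into per-node behavior. The field names and the task below are invented for illustration; they are not an actual CENS macroprogramming language.

```python
# Hypothetical declarative task: modality and representation (what), spatial
# extent (where), schedule (when), plus desired in-network processing.
task = {
    "what":  {"modality": "soil_moisture", "representation": "mean"},
    "where": {"region": {"x": (0, 100), "y": (0, 100)}},
    "when":  {"period_s": 300, "duration_h": 72},
    "processing": {"aggregate": "in-network mean",
                   "trigger": {"if": "mean > 0.35", "then": "raise_sample_rate"}},
}

def plan_for_node(task, node_pos):
    """Return this node's share of the task, or None if the node is out of scope."""
    x_lo, x_hi = task["where"]["region"]["x"]
    y_lo, y_hi = task["where"]["region"]["y"]
    x, y = node_pos
    if not (x_lo <= x <= x_hi and y_lo <= y <= y_hi):
        return None
    return {"sample": task["what"]["modality"], "every_s": task["when"]["period_s"]}

print(plan_for_node(task, (42, 17)))   # inside the region: receives a sampling plan
print(plan_for_node(task, (250, 17)))  # outside the region: None
```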
Lessons
• Channel models
  • Simplistic circular channel models can be very deceiving, so experimentation and emulation are critical
• Named data
  • Is the right model, but it's only a small step toward the bigger problem of macroprogramming
• Duty cycling
  • Critical from the outset… and tricky to get right: granularity, level (application or communication)
• Tiered architectures
  • One size doesn't fit all, and maybe it doesn't fit any; the distribution of resources (energy, storage, comm, CPU) across the distributed system is an interesting problem
• It's all a lot harder, and even more interesting, than it looked 5 years ago
Follow-up regarding IT aspects
• Embedded Everywhere: A Research Agenda for Networked Systems of Embedded Computers, Computer Science and Telecommunications Board, National Research Council, Washington, D.C., http://www.cstb.org/
• Conferences: ACM SenSys (Nov '03), WSNA (today), IPSN, SNPA (ICC), MobiHoc, MobiCom, MobiSys, SIGCOMM, INFOCOM, SOSP, OSDI, ASPLOS, ICASSP, …
• Who's involved:
  • Active research programs in many CS (networking, databases, systems, theory, languages) and EE (low power, signal processing, comm, information theory) departments
  • Industrial research activities at Intel, PARC, Sun, HP, Agilent, Motorola…
  • Startup activity at Crossbow, Sensicast, Dust Inc., Ember, …
• Related funding programs
  • DARPA SensIT, NEST
  • NSF ITR, Sensors and Sensor Networks