Multi-Paradigm Evaluation of Embedded
Networked Wireless Systems
Rajive L. Bagrodia
Professor
Computer Science Dept, UCLA
[email protected]
DAWN PI Meeting,
October 14, 2008
Joint work with Yi-Tao Wang and M. Varshney
Next Generation DOD Networks
Network Characteristics: Heterogeneous, Scalable, Mobile
The Multi-Paradigm Evaluation Framework
• Multi-paradigm testbed for heterogeneous wireless networks
• In-situ evaluation of applications, protocols, or sub-networks in a high-fidelity simulation environment
• Simulated components: radio, channel, or complete sub-networks
• Model heterogeneous networks
• Application-centric evaluation
[Figure: evaluation paradigms spanning Physical, Emulation, and Simulation (Design and Evaluation of Environmental Adaptive Wireless Systems)]
Embedded Networked Systems
• A large class of embedded systems are resource-constrained devices (e.g., sensors)
• Important to capture the interactions of applications and protocols with:
  – Operating systems
  – Hardware
  – Other resources such as memory, CPU, clock drift, etc.
• Typically part of heterogeneous networks
• Emulation of embedded systems:
  – Should capture the execution environment (OS & H/W)
  – Should model environmental resources with high fidelity
  – Should support heterogeneous networks (e.g., UAVs with UGS)
Current Tools
• Simulators (TOSSIM, EmTOS, etc.)
  – Validate basic application behavior
  – Lack detailed simulation models
    • Restricts the accuracy and expressiveness of their simulations
    • Cannot evaluate applications under deployment conditions
• Physical testbeds
  – Accurate
  – Lack spatial and temporal scalability
  – Difficult to run experiments under complex conditions
  – Cannot repeat experiments
Our Target Heterogeneous Network
• Deploy TinyOS motes to replace aging SOS motes
• Must ensure:
  – TinyOS motes can co-exist with SOS motes
  – The network is maintained as SOS motes die off
• Impossible to model using existing tools
  – Too complex for current simulators
  – Too large for physical testbeds
Outline
• Motivation
• Related approaches
• OS emulation
• SOS emulation via SenQ
• TinyOS emulation via TinyQ
• Heterogeneous network evaluation
Need for Operating System Emulation
Case Study: S-MAC MAC protocol
[Figure: S-MAC timing at the sender: wakeup duration 300 ms, sleep duration 1024 ms; Timer 1 = rand(0, 63 ms) backoff; Timer 2 = (300 - backoff) ms; the sender transmits to the destination during the wakeup period]
Protocol implemented as:
a) Pure simulation model
b) Emulation with the SOS operating system
Emulation vs. Simulation
• Diagnosis: sleep schedules fall out of synchronization
• Backoff timers shorter than 10 ms expire at 10 ms
• Timeouts longer than 250 ms are broken into chunks of 250 ms
Culprit: OS Timer Management
What should have happened:
[Timeline: at time t, set a timer for 7 ms; at t+7 the timer expires, the packet is transmitted, and a 293 ms timer is set; at t+300 that timer expires, a 1024 ms timer is set, and the node sleeps]
What actually happened (a code sketch of this effect follows below):
[Timeline: the SOS minimum timer latency is 10 ms, so the 7 ms timer fires at t+10 (skew = 3 ms); the SOS maximum timer interval is 250 ms, so longer timeouts are split into chunks (e.g., 256 ms = a 250 ms timer plus a 6 ms timer, the latter again rounded up to 10 ms); the wakeup timer therefore fires at t+304 instead of t+300 (skew = 4 ms)]
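The skew above can be reproduced with a very simple model of the two timer constraints. The following is a minimal C sketch, assuming only the 10 ms minimum latency and 250 ms maximum interval quoted on this slide; it is not the actual SOS timer code.

```c
/* Minimal sketch of the timer skew described above, assuming a simplified
 * model of the SOS timer constraints: a 10 ms minimum latency and a 250 ms
 * maximum interval, with long timeouts split into 250 ms chunks. */
#include <stdio.h>

#define MIN_LATENCY_MS   10
#define MAX_INTERVAL_MS 250

/* Time (ms) at which a timeout requested for 'requested_ms' actually fires. */
static unsigned actual_fire_time(unsigned requested_ms)
{
    unsigned elapsed = 0;
    while (requested_ms > MAX_INTERVAL_MS) {
        elapsed += MAX_INTERVAL_MS;        /* full 250 ms chunk */
        requested_ms -= MAX_INTERVAL_MS;
    }
    if (requested_ms < MIN_LATENCY_MS)
        requested_ms = MIN_LATENCY_MS;     /* rounded up to the minimum latency */
    return elapsed + requested_ms;
}

int main(void)
{
    unsigned requests[] = { 7, 256 };      /* the two cases from the slide */
    for (unsigned i = 0; i < sizeof requests / sizeof requests[0]; i++) {
        unsigned actual = actual_fire_time(requests[i]);
        printf("requested %3u ms -> fires after %3u ms (skew %u ms)\n",
               requests[i], actual, actual - requests[i]);
    }
    return 0;
}
```

A 7 ms request fires at 10 ms (skew 3 ms) and a 256 ms request fires at 260 ms (skew 4 ms), matching the timeline above.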
OS Emulation Approaches
• Operating System emulation
– Model the applications and the protocol stack in the context of the real operating system
– SenQ [msl.cs.ucla.edu/projects/senq]
  • For TinyOS and SOS
  • Built on the underlying QualNet network simulator
  • Supports multi-tiered heterogeneous networks
  • High-fidelity models for sensing channels, clock drift, battery, power consumption, and CPU power
– TOSSIM [www.cs.berkeley.edu/~pal/research/tossim.html]
  • For TinyOS
  • Custom network simulator
  • Logical connectivity only; lacks multi-tiered networks and environmental models
Emulation approaches (2)
• Hardware emulation
  – Avrora [compilers.cs.ucla.edu/avrora]
  – Atemu [www.hynet.umd.edu/research/atemu]
  – Instruction-cycle-level emulation
  – Hardware resources modeled
  – (+) Highest fidelity for protocol emulation
  – (-) Slow execution time (much slower than real time)
  – (-) Lacks scalability
  – (-) Lacks detailed models for channel and environment
  – Good for small-scale T&E (2-5 nodes), to be followed by OS-level emulation (10-10,000 nodes)
State-of-the-art
• Network simulators with sensor models
  – SensorSim (2001), SWAN (2001)
  – (+) No need to migrate away from familiar platforms
  – (-) No emulation support
• Emulators with networking support
  – TOSSIM (2003), EmStar (2004), EmTOS (2004)
  – (+) Easy development-debugging-deployment cycle
  – (-) Discrete-event simulation engine and channel models are not accurate
  – (-) Specific to a given OS platform
  – (-) Do not support heterogeneous networks (IP, WiMAX, etc.)
• Instruction-cycle emulation
  – Atemu (2004), Avrora (2005), Worldsens (2007)
  – (+) Greatest measure of software modeling fidelity
  – (-) Intractable for even moderately sized networks
SenQ Objectives
• Ability to emulate sensor applications and OS
• Independent of the underlying sensor operating system
• Integrate multiple sensor OSes in a single execution
• Scalable to 10,000+ radios
• Real-time or faster execution
• Support heterogeneous networks
• Flexible scenario configuration
[Figure: SenQ node model: real implementations of TinyOS and SOS over models of clock drift, battery, power consumption, and the sensing channel]
SenQ Approach
[Architecture figure: the emulated sensor node (applications, protocols, OS, hardware drivers) is attached to the QualNet network simulator through a Virtual Sensor Layer and a virtual-hardware driver; the simulator side provides the radio and channel, mobility, IP and WiMAX stacks, parallel and process-oriented simulation, and models of the battery, processor, clock, and sensing (steps 1-5 in the original diagram); a small sketch of the virtual-driver idea follows below]
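The "virtual hardware" step can be pictured as swapping the driver table the OS calls into. Below is a minimal C sketch of that idea; hw_driver_t and the vsim_* functions are illustrative assumptions, not the actual SenQ or QualNet interfaces. Each driver call is answered by a simulator model instead of a device.

```c
/* Illustrative sketch: the OS talks to hardware through a driver interface,
 * and emulation substitutes a driver backed by simulator models. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

typedef struct {
    int      (*radio_send)(const uint8_t *pkt, size_t len);
    uint32_t (*clock_read_ms)(void);
    double   (*battery_remaining_mAh)(void);
    double   (*sensor_read)(int channel);
} hw_driver_t;

/* Virtual driver: each call would be answered by a simulator model
 * (radio/channel, clock drift, battery, sensing channel). Return values
 * here are placeholders. */
static int      vsim_radio_send(const uint8_t *p, size_t n) { (void)p; printf("sim: tx %zu bytes\n", n); return 0; }
static uint32_t vsim_clock_read_ms(void)                    { return 1234;  }
static double   vsim_battery_mAh(void)                      { return 850.0; }
static double   vsim_sensor_read(int ch)                    { (void)ch; return 21.5; }

static const hw_driver_t virtual_hw = {
    vsim_radio_send, vsim_clock_read_ms, vsim_battery_mAh, vsim_sensor_read
};

int main(void)
{
    const hw_driver_t *hw = &virtual_hw;   /* on real hardware: real drivers */
    uint8_t pkt[3] = { 1, 2, 3 };
    hw->radio_send(pkt, sizeof pkt);
    printf("clock %u ms, battery %.1f mAh, sensor %.1f\n",
           hw->clock_read_ms(), hw->battery_remaining_mAh(), hw->sensor_read(0));
    return 0;
}
```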
SenQ Emulation: Key Features
• Sensor nodes appear as a "layer" in the discrete-event network simulator (QualNet)
• The network simulator masquerades as the "hardware platform" to the sensor node
• The architecture supports any OS, including multiple OSes in one execution
• Efficient implementation: ~10,000 nodes
• Supports parallel emulation
• Supports modeling heterogeneous networks
SOS Emulation Results
Clock Drift Models
• Clock = Oscillator + Counter + Zero-time
• Error in Oscillator ⇒ Clock Drift
– (Different definition of a second)
• f = f_nom + Δf_0 + a·(t − t_0) + Δf_n(t) + Δf_e(t)
  – f_nom: nominal frequency
  – Δf_0 and a·(t − t_0): long-term variations and aging rate
  – Δf_n(t): short-term variations (noise)
  – Δf_e(t): environmental factors (temperature, etc.)
  – (From a Symmetricom white paper; a simulation sketch of this model follows below)
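A drift model of this form is straightforward to apply per node in a simulator. The C sketch below integrates a drifting oscillator over 60 seconds and reports the resulting clock error; the offset, aging rate, and noise amplitude are illustrative assumptions, not SenQ parameters.

```c
/* Minimal sketch of applying the drift model above in simulation.
 * All numeric values are illustrative assumptions. */
#include <stdio.h>
#include <stdlib.h>

/* Oscillator frequency at elapsed time t seconds:
 * f(t) = f_nom + df0 + a*(t - t0) + noise(t) + env(t), with t0 = 0. */
static double osc_frequency(double f_nom, double t)
{
    double df0   = 2e-6 * f_nom;                               /* initial offset   */
    double aging = 1e-9 * f_nom * t;                           /* aging term       */
    double noise = ((rand() / (double)RAND_MAX) - 0.5) * 1e-8 * f_nom;
    double env   = 0.0;                                        /* temperature term omitted */
    return f_nom + df0 + aging + noise + env;
}

int main(void)
{
    double f_nom = 32768.0;            /* nominal 32.768 kHz watch crystal */
    double local_ticks = 0.0;
    /* Integrate the drifting frequency over 60 one-second steps. */
    for (int t = 0; t < 60; t++)
        local_ticks += osc_frequency(f_nom, (double)t);
    double local_time = local_ticks / f_nom;   /* node's notion of elapsed time */
    printf("true time 60.000000 s, local clock %f s, drift %.1f us\n",
           local_time, (local_time - 60.0) * 1e6);
    return 0;
}
```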
Sensing
[Figure: sensing model: an event is detected by several sensors at offsets Δt1, Δt2, Δt3 after time t, and an algorithm estimates the angle ϕ from these time differences]
SenQ simulation study
• Benefits of SenQ
  – Easy configuration
  – Repeatable execution
  – Study of tradeoffs
• Scenario: clock drift rand(0, 5) μs/sec; sync protocol RATS; sync beacon period 2 s or 60 s
• Result: variance of 0.64° vs. 2.64° for the two beacon periods
Power Consumption Model
• Inaccurate battery model (a sketch of this simple model follows below)
  – Reservoir of charge (U mA-sec)
  – Subtract I (mA) × t (sec) from U after each event (Tx, Rx, etc.)
• Accurate battery model
  – Non-linear discharge
  – Recovery effect
  – Model (Rakhmatov, 2002)
[Figure: ideal vs. observed battery discharge]
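For contrast with the accurate model, here is a minimal C sketch of the simple reservoir model from the first bullet: charge is a single number and each event subtracts I × t. The capacity and per-event currents and durations are illustrative assumptions; the non-linear discharge and recovery effects this ignores are exactly why the Rakhmatov model is needed.

```c
/* Minimal sketch of the simple "reservoir" battery model (the inaccurate
 * linear model, not the Rakhmatov 2002 model). Values are illustrative. */
#include <stdio.h>

static double charge_mAs = 2000.0 * 3600.0;    /* assumed 2000 mAh capacity */

/* Account for one event drawing current_mA for duration_s seconds. */
static void battery_drain(double current_mA, double duration_s)
{
    charge_mAs -= current_mA * duration_s;
    if (charge_mAs < 0.0)
        charge_mAs = 0.0;                       /* node is dead */
}

int main(void)
{
    for (int i = 0; i < 1000; i++) {
        battery_drain(600.0, 0.008);            /* Tx: 600 mA for 8 ms */
        battery_drain(400.0, 0.008);            /* Rx: 400 mA for 8 ms */
    }
    printf("remaining charge: %.1f mA-s (%.2f%% of capacity)\n",
           charge_mAs, 100.0 * charge_mAs / (2000.0 * 3600.0));
    return 0;
}
```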
Model Optimizations
• Why doesn't the model work as-is?
  – Computationally expensive
  – Large memory requirement
• Optimizations
  – For small load magnitudes and intervals, found invariants that simplify the model
  – Precompute the function in a lookup table (saves execution time); see the sketch after this list
  – Merge multiple entries into one with a correction term (saves space)
• Loss in accuracy < 0.1%
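The lookup-table optimization can be illustrated generically. The sketch below precomputes an expensive function at coarse steps and stores a per-entry slope as the correction term, so merged entries still reproduce intermediate values. Here expensive_term() is a placeholder, not the actual Rakhmatov battery-model term, and the table sizes are arbitrary.

```c
/* Generic sketch of the lookup-table optimization described above. */
#include <math.h>
#include <stdio.h>

#define TABLE_SIZE 64
#define STEP       0.25   /* spacing between table entries (arbitrary) */

static double table_value[TABLE_SIZE];
static double table_slope[TABLE_SIZE];   /* correction term per merged entry */

static double expensive_term(double x)   /* placeholder for the costly model term */
{
    return exp(-x) * (1.0 + x);
}

static void build_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++) {
        double x0 = i * STEP, x1 = (i + 1) * STEP;
        table_value[i] = expensive_term(x0);
        /* slope lets one entry stand in for all points up to the next entry */
        table_slope[i] = (expensive_term(x1) - expensive_term(x0)) / STEP;
    }
}

static double lookup(double x)
{
    int i = (int)(x / STEP);
    if (i >= TABLE_SIZE - 1) i = TABLE_SIZE - 1;
    return table_value[i] + table_slope[i] * (x - i * STEP);
}

int main(void)
{
    build_table();
    double x = 3.3;
    printf("exact %.6f, table %.6f\n", expensive_term(x), lookup(x));
    return 0;
}
```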
Impact of Processor Power Consumption
• Knowledge of the power consumed by nodes in the network is essential to avoid hot-spots and to compare performance
• Claim: power consumed by the processor is a substantial portion of the total power consumed
• The power consumed by the processor is not a constant overhead
• It is state- and context-dependent: it depends on what action is taken on each event
• Ignoring this component can predict incorrect trends, or even an inversion of results, if only the radio is considered
Processor Power Consumption
• Simulation setup
  – 49-node grid topology
  – 802.11b at 1 Mbps
  – 600 mA (Tx) and 400 mA (Rx)
  – SA-1100 processor at 133 MHz
  – 190.4 mA/instruction
  – 3 CBR sessions between random pairs
  – Pre-computed routes
[Figure: percentage of power consumed by the processor; the ratio is not constant and depends on the role the node plays in the simulation]
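As a back-of-the-envelope illustration of why the processor share varies with a node's role, the C sketch below charges each node for radio airtime and for CPU work per handled packet. The Tx/Rx currents come from this slide; the packet airtime, CPU current, and CPU time per event are assumed example values, not measurements from the study.

```c
/* Illustrative energy accounting: the processor's share of total charge
 * drawn is not a fixed overhead; it depends on how many events a node
 * handles. Durations and CPU current below are assumed example values. */
#include <stdio.h>

int main(void)
{
    const double tx_mA = 600.0, rx_mA = 400.0;   /* radio currents (slide) */
    const double cpu_mA = 190.0;                 /* assumed active CPU current */
    const double pkt_s = 0.008;                  /* assumed 8 ms airtime per packet */
    const double cpu_s_per_event = 0.002;        /* assumed 2 ms CPU work per packet */

    /* Two example roles: a source that only transmits, and a forwarder
     * that receives and retransmits every packet. */
    struct { const char *role; int tx; int rx; } nodes[] = {
        { "source",    100,   0 },
        { "forwarder", 100, 100 },
    };
    for (int i = 0; i < 2; i++) {
        double radio = nodes[i].tx * tx_mA * pkt_s + nodes[i].rx * rx_mA * pkt_s;
        double cpu   = (nodes[i].tx + nodes[i].rx) * cpu_mA * cpu_s_per_event;
        printf("%-9s: radio %.1f mA-s, cpu %.1f mA-s, cpu share %.0f%%\n",
               nodes[i].role, radio, cpu, 100.0 * cpu / (radio + cpu));
    }
    return 0;
}
```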
Incorrect Simulation Results
• Hot-spots are ignored (cases 1 and 2) or mis-predicted (case 5)
• Relative percentages incorrectly predicted (case 4: 6% vs. 21%)
• Inversion of results compared with the simple model (cases 1, 3, 5)
• Summary
  – Processor power consumption is a significant contributor
  – This overhead is not a constant that can be easily modeled
  – Ignoring this component can produce incorrect results
[Figure: white bars show power consumption by the radio only (current simulators); black bars show power consumption by both radio and processor (detailed model)]
Emulation of TinyOS
• Override drivers to communicate with QualNet (see the sketch below)
• TinyOS applications think QualNet is just another hardware platform
• Hardware interactions (e.g., sending a packet) create events in QualNet
• Each mote runs in a separate thread
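A minimal sketch of this driver-override approach is shown below in C with POSIX threads. The names (sim_event_t, sim_post_event, the stubbed mote main loop) are hypothetical stand-ins, not the actual TinyQ or QualNet interfaces; the point is that the driver keeps its application-facing signature but posts an event to the simulator, and every emulated mote gets its own thread.

```c
/* Illustrative sketch of the driver-override idea; not TinyQ/QualNet code. */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct { int mote_id; int type; uint8_t payload[32]; size_t len; } sim_event_t;
enum { SIM_EV_RADIO_TX = 1 };

/* Stand-in for the simulator's event queue: here we just log the event. */
static void sim_post_event(const sim_event_t *ev)
{
    printf("mote %d: posted event type %d (%zu bytes) to simulator\n",
           ev->mote_id, ev->type, ev->len);
}

/* Overridden radio driver: same application-facing signature, but the body
 * hands the packet to the simulator instead of real hardware. */
static int radio_driver_send(int mote_id, const uint8_t *pkt, size_t len)
{
    sim_event_t ev = { .mote_id = mote_id, .type = SIM_EV_RADIO_TX, .len = len };
    memcpy(ev.payload, pkt, len < sizeof ev.payload ? len : sizeof ev.payload);
    sim_post_event(&ev);
    return 0;
}

/* Each emulated mote runs its OS main loop in its own thread; here the
 * "main loop" just sends one packet. */
static void *mote_thread(void *arg)
{
    int id = *(int *)arg;
    uint8_t pkt[4] = { 0xDE, 0xAD, 0xBE, 0xEF };
    radio_driver_send(id, pkt, sizeof pkt);
    return NULL;
}

int main(void)
{
    enum { N = 3 };
    pthread_t threads[N];
    int ids[N] = { 0, 1, 2 };
    for (int i = 0; i < N; i++)
        pthread_create(&threads[i], NULL, mote_thread, &ids[i]);
    for (int i = 0; i < N; i++)
        pthread_join(threads[i], NULL);
    return 0;
}
```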
Accuracy of TinyQ
• Compared results from TinyQ to those from Harvard's MoteLab
  – The application routes periodic packets from one sender to one receiver
  – The accurate wireless model provides identical results
[Figure: average packet delivery ratio (%) vs. number of motes (2-30) for MoteLab and TinyQ]
Impact of Accurate Channel Models
• Multihop Oscilloscope:
  – Application distributed with TinyOS 2.x
  – Routes sensor readings to a root mote using tree routing and a CSMA MAC
  – As node density increases, MAC-layer interference should decrease the packet delivery ratio
Performance of TinyQ (cont.)
Application           Description                                         Execution time (s)
                                                                          TOSSIM      TinyQ
Blink                 Blinks LEDs                                           0.52       1.19
Blink Config          Blinks LEDs                                           0.53       1.25
DictionaryDemo        Tracks reboots                                        0.64       1.28
PriorityApp           Tests pre-emption                                     0.53       1.15
Sense                 Read and show sensor value                            0.63       1.94
RadioCountToLeds      Increment counter & broadcast                       512.13      63.89
TestFTSP              Implement FTSP                                      723.59      73.64
Wwakker               Sync wake times among all nodes before sleeping     896.90     106.55
FreqHop               Listen on 2 frequencies, transmit on the empty one  578.64      66.02
RadioSenseToLeds      Read & broadcast sensor value                       624.63      57.96
PacketParrot          Log and retransmit incoming packets                 655.63      59.96
Oscilloscope          Periodically read & broadcast sensor values         436.73      45.21
MultihopOscilloscope  Use tree routing to transmit                        725.63      74.96
Performance of TinyQ
• Compared against TOSSIM using Blink (no radio)
Performance of TinyQ (cont.)
• Compared against TOSSIM using RadioCountToLeds (uses radio)
Performance of TinyQ (cont.)
• TinyQ was able to execute most applications that TOSSIM could
• Performs worse than TOSSIM on applications that don't use the radio, due to the extra emulation overhead
• Performs 10x better on applications that use the radio
  – TOSSIM uses a connectivity graph, which leads to thrashing when the network gets large
Heterogeneous Networks
Emulation of Heterogeneous networks
Case Study 1
• Sensors: OS-level emulation
  – OS: TinyOS/SOS
  – Stack: Surge / tree routing / B-MAC
  – Radio: 100 kbps, range of tens of meters
• WiFi: true emulation
  – OS: Linux/Windows
  – Stack: IP/AODV/802.11
  – Radio: 802.11a/b
• Interoperable!
Heterogeneous Networks in SenQ
• Sensor subnet (1000 nodes)
  – SOS emulation, CC1100 radio model
• IP subnet (50 nodes)
  – QualNet simulation: IP, AODV, UDP/TCP, 802.11 radio and MAC
• Gateway nodes (1-10 nodes)
  – Each gateway batches K packets from the sensors and then transmits them (see the sketch after this list)
• SenQ supports interfacing these diverse networks
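A minimal C sketch of the gateway batching behavior is given below. The batch size K, packet size, and function names (ip_interface_send, gateway_on_sensor_packet) are illustrative assumptions, not the actual SenQ gateway code; it simply queues sensor packets and flushes them over the IP side once K have accumulated.

```c
/* Illustrative gateway batching sketch; names and sizes are assumptions. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define BATCH_K    8      /* assumed batch size K */
#define PKT_BYTES 32

static uint8_t batch[BATCH_K][PKT_BYTES];
static int     batch_count = 0;

/* Stand-in for transmitting a buffer on the gateway's IP interface. */
static void ip_interface_send(const uint8_t *buf, size_t len)
{
    (void)buf;
    printf("gateway: forwarding %zu bytes over IP subnet\n", len);
}

/* Called for every packet received on the sensor-side interface. */
static void gateway_on_sensor_packet(const uint8_t pkt[PKT_BYTES])
{
    memcpy(batch[batch_count++], pkt, PKT_BYTES);
    if (batch_count == BATCH_K) {                 /* batch full: flush it */
        ip_interface_send(&batch[0][0], (size_t)BATCH_K * PKT_BYTES);
        batch_count = 0;
    }
}

int main(void)
{
    uint8_t pkt[PKT_BYTES] = { 0 };
    for (int i = 0; i < 20; i++)                  /* 20 sensor packets -> 2 flushes */
        gateway_on_sensor_packet(pkt);
    return 0;
}
```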
Heterogeneous Network: Case Study 2
• The sensor network is emulated
• The IP network is simulated
• Gateways are modeled as nodes with two network interfaces
• 500 SOS nodes, 500 TinyOS nodes, and 50 IP nodes spread randomly over a 400 m x 400 m terrain
Results: Heterogeneous Network
• TinyOS motes boot up at 30 simulation seconds
• SOS motes die between 40 and 60 simulation seconds
Results: Heterogeneous Network
• The TinyOS application behaves correctly
• Also shows that 500 motes is not the optimal number of motes to cover the area
  – Many motes are isolated and cannot route to a gateway
Future Work
• Hybrid sensor network modeling (Yi-Tao Wang)
• Integration of transport and vehicular communication network simulators (Yi Yang)
• Cross-layer interactions between routing protocols and the MAC interface using the Multi-Paradigm Framework (Shrinivas Mohan)
[Figure: Physical, Emulation, and Simulation paradigms]