
Validation Methodologies for Netcentric Warfare Simulations
Wesley N. Colley, Ph.D.
Center for Modeling, Simulation and Analysis
University of Alabama in Huntsville
Sponsor: Naval Air Warfare Center, Weapons Division
CMSA
SPARTA
Jared Biggs
Chris Noller
Randy Harrell
Jeff Roberts
Jim Walsh
Presentation outline
• Motivation
• Validation Context
• Quantitative Validation
• InterTEC Joint Fires Exercise
Problem Statement / Project Focus
• Netcentric Warfare is new
  – Classic “face” validation by battle-experienced SMEs not generally possible
• Networks are classically difficult to characterize
  – Nonlinear/brittle behaviors mean much more careful quantitative analysis is necessary
• Focus: Develop quantitative validation techniques for NCW simulations.
Project Overview
[Diagram: project workflow relating mathematical V&V, test component V&V (stimuli), metrics gathering (outputs), post-processing (valid metric values), metrics analysis, MOP/MOE design (analysis metrics, experiment design, statistics, responses), and real-world test data.]
Scope of Validation
• Our Goal: Validate the NCW-specific aspects of systems (in constructive simulation)
• Two possibilities:
  – Inherent NCW system (Link-16)
  – NCW components of a non-NCW system (Link-16 on an F-35)
[Diagram: nested scope. The netcentric C2 environment contains the NCW system (e.g., Link-16); the weapons system (e.g., F-35) connects through its weapons system NCW interface (F-35 Link-16 hardware).]
NCW Effectiveness and Networks
• NCW Goal: Full-spectrum information dominance
[Figure: the Information Superiority Product, with axes of timeliness, relevance, and accuracy (Alberts, NCW, 2000).]
• Easiest axis: timeliness
  – “Simply” improve the network
  – The network metric is thus timeliness of messages
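One way to read the “product” framing (our gloss; the slide does not spell out the algebra) is that information superiority scales multiplicatively in the three axes, so a deficit on any one axis drags down the whole product:

\[ \mathrm{IS} \;\propto\; \mathrm{timeliness} \times \mathrm{relevance} \times \mathrm{accuracy} \]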
What Does the Network Have to Do?
[Diagram: B2B enterprise C2 logic with distributed services. An infrastructure service registry, a track identity service, and an alert service tie participants together; each participant logs on and authenticates, publishes (sensor data such as MP-RTIP and SPY, forward pass and hand-off (SM), fires (SM), PLI/position, target ID (HRR), distributed sensor CPU, alerts), and subscribes (threat, environment, blue force ID, combat ID (IFF), ROE, target ID, alerts). Weapon data links carry the flow from DETECT through weapon provider (sensor to shooter) and weapon delivery (sensor to weapon), moving from establishing services to delivering services.]
Network Composeability Table
• Communications linkages (Dial-a-Comm Link): select the software-programmable or networked radio connection and waveform. Systems: JTRS, VRC-99, EHF MDR, TC, Teleport, SHF/CA FCs, TCDL, MUOS. Compose the kind of connectivity and raw bandwidth you need to support your mission.
• Operational networking (Dial-a-Network): operational network formation to meet force and mission needs. Systems: GIG-BE, TCS, JTF Warnet, EHF MDR, MUOS, TSAT. Compose the community of interests you need to support your mission.
• Information flow management (Dial-a-Precedence): establish bandwidth allocation and priority for applications and key individuals. Systems: ADNS, BMAC QOS, TCS. Compose lanes with different speed limits and priority for your mission-related data flows.
• Distributed information processing and storage (Dial-a-Computer): establish the roles of the computers that will support the mission, their interfaces to your FORCEnet, and the information managers who run them. Systems: NCES, XTCF, DJC2, OA, RAPIDS, CORBA, GRS, IT-21, NMCI, CAS, COWAN/CENTRIX. Compose your information management environment.
• Full-spectrum military and civil affairs program applications (Dial-an-Application): establish the operational cells and specific applications that will publish information into the FORCEnet. Systems: GCCS-M, GCSS-M, OA, TMIP, IBV, RAMIS, NTCSS, TBMCS, ADOCS, JBMCS. Compose the sensing, planning, decision support, and weapons applications that will publish into the FORCEnet to support your mission.
• Virtual collaboration (Dial-a-Meeting): establish the collaborative environment that mission participants use to coordinate actions and activities. Systems: GeoViz, IWS, DCTS, NetMeeting, VoIP, IP VTC. Compose the virtual rooms, participants, schedule, and battle rhythm.
• Functional, temporal, and geospatial visualization (Dial-a-GUI): establish the standards for the form of presentation and the FORCEnet subscription rules used to support your mission. Systems: GeoViz, WebCOP, KWEB. Compose the alerting, status board, and COP views to be shared by the force.
Scads of systems with scads of uses
Recommendation
• Use Missions × Means breakdown for validation
  – Provides hierarchical decomposition of
    • Missions: the desired tasks or capabilities
    • Means: the hardware and/or software assets available
  – Successfully used by Petty and Colley for validation of JTLS in Terminal Fury ’05 (published at SIW Fall 2005)
• Select from broad NCW categories
  – Use M×M to drill down to specific systems
  – Methodologies should be similar within broad categories
Missions × Means Example
[Diagram: Missions × Means matrix for ground force communications. Missions for a single soldier: secure connection (logon, authenticate) and sending/receiving data and voice (outgoing and incoming). Means: the GIG, a JTRS LAN (bandwidth allocation), and the JTRS radio. Each matrix cell where a means supports a mission is marked as an evidence item.]
Evidence Item
• Element in the M×M matrix where a means supports a mission
• Each will have quantitative measures of performance (MOPs)
  – Latency (e.g., time to authenticate connection)
  – Bandwidth consumed
  – Total packets sent/received
  – Compliance with protocol
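As an illustration only (not the project's tooling), an evidence item and its MOPs might be carried in code like this; every name below is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceItem:
    """One M x M cell where a means supports a mission (names illustrative)."""
    mission: str   # e.g., "secure connection: authenticate"
    means: str     # e.g., "JTRS radio"
    mops: dict = field(default_factory=dict)  # MOP name -> measured value

# Example: the JTRS radio supporting soldier authentication
item = EvidenceItem(
    mission="secure connection: authenticate",
    means="JTRS radio",
    mops={
        "latency_s": 0.85,           # time to authenticate connection
        "bandwidth_kbps": 12.0,      # bandwidth consumed
        "packets_total": 42,         # total packets sent/received
        "protocol_compliant": True,  # compliance with protocol
    },
)
```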
Validation
• Compare MOPs to known standards
  – TCP/IP
  – JTRS specification
  – FORCEnet documentation, etc.
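A minimal sketch of what comparing MOPs to a standard could look like mechanically; the threshold values are placeholders, not real JTRS or FORCEnet requirements:

```python
# Placeholder spec limits -- illustrative, not from any real standard.
SPEC_LIMITS = {
    "latency_s": 1.0,        # max allowed authentication latency
    "bandwidth_kbps": 16.0,  # max allowed bandwidth consumption
}

def check_mops(measured: dict) -> dict:
    """Pass/fail per MOP: measured value must not exceed the spec limit."""
    return {name: measured[name] <= limit
            for name, limit in SPEC_LIMITS.items() if name in measured}

print(check_mops({"latency_s": 0.85, "bandwidth_kbps": 12.0}))
# -> {'latency_s': True, 'bandwidth_kbps': True}
```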
[Image: cover and table of contents of “FORCEnet Architecture & Standards, Volume I: Operational & Systems View,” Office of the Chief Engineer, SPAWAR 05, Version 1.4, 30 April 2004, Distribution D. The volume covers the FORCEnet definition (Sea Power 21 pillars, precepts), background and development process, requirements (ICD, fleet capability needs), FORCEnet drivers (DoD, naval, commercial technology, allied/coalition interoperability), architecture overview and functional architecture, mission capability packages (communications and networks; ISR; distributed services, common operational and tactical picture), and appendices A–I (AV-1, ICD FORCEnet capabilities, service category definitions, allied and coalition architecture, human systems integration, system descriptions, SP-21 pillar POR standards compliance, acronyms, references).]
Quantitative Analysis
• Experimental Design
  – Flex the system in statistically interesting ways
  – Push simulations hard to identify problem areas
• Non-linear behaviors
  – Networks are inherently susceptible to non-linear response
  – Must create mathematical means of handling nonlinearity
[Plot: message latency vs. runtime for a network in the non-linear regime.]
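To see why latency onset is so abrupt, a toy queueing calculation helps (a sketch, not the NETE model): in an M/M/1 queue the mean time in system is 1/(μ − λ), which diverges as offered load approaches capacity.

```python
# Toy illustration of non-linear latency onset (not the NETE model):
# M/M/1 mean time in system is 1/(mu - lam) and diverges at saturation.

def mm1_time_in_system(lam: float, mu: float) -> float:
    """Mean time in system for an M/M/1 queue; infinite when lam >= mu."""
    return float("inf") if lam >= mu else 1.0 / (mu - lam)

mu = 100.0  # service rate (messages/s), illustrative
for load in (0.5, 0.9, 0.99, 1.0):
    latency = mm1_time_in_system(load * mu, mu)
    print(f"offered load {load:4.2f} -> mean latency {latency:.4f} s")
```

Doubling load from 0.5 to near 1.0 multiplies latency fifty-fold and then sends it unbounded, which is the brittle regime the slides describe.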
Validation Testbed—NETE
• Netcentric End-to-End Simulation
• Built in the Extend discrete-event simulation environment
• Modeled NCW elements:
  – GIG-BE, TSATs, Link-16
  – TADIL-J messages
  – AEGIS cruisers, FBX-T (Sea of Japan)
• Thread:
  – DPRK launches missiles
  – Tracks formed, passed over the GIG to STRATCOM
NETE Features
• Realistic message processing
• Link-16 message slotting, jitter
• Latency computation
• Threat tracks based on Lincoln Lab models
• < NETE Demo >
• Metric under test = TADIL-J latency from the Link-16 hub in the Sea of Japan back to STRATCOM
Bandwidth Sensitivity
• 10.0 Gbits/sec
  – Very stable at low latency throughout the scenario
• 1.0 Gbits/sec
  – Latency grows as the number of tracks increases
  – Highly unstable run to run
[Plots: latency traces at 10.0 Gbits/sec and 1.0 Gbits/sec; the 1.0 Gbits/sec runs show wide variation run to run.]
Bandwidth Sensitivity
• 0.8 Gbits/sec
  – Latency now grows throughout the run, most of the time
• 0.1 Gbits/sec
  – Nearly zero throughput
  – Latency grows linearly as the run proceeds
[Plots: latency traces at 0.8 Gbits/sec and 0.1 Gbits/sec.]
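Quantifying that run-to-run instability is itself a statistics exercise. A hedged sketch, with a crude single-server queue standing in for a NETE run, estimates the spread of mean latency across independent replications:

```python
import random
import statistics

def one_run(load: float, n_msgs: int = 2000, seed: int = 0) -> float:
    """Crude single-server queue run; returns mean time in system (s).

    Illustrative stand-in for one NETE replication at a given bandwidth.
    """
    rng = random.Random(seed)
    t = t_free = 0.0
    sojourns = []
    for _ in range(n_msgs):
        t += rng.expovariate(load)                       # arrivals at rate `load`
        t_free = max(t, t_free) + rng.expovariate(1.0)   # unit service rate
        sojourns.append(t_free - t)                      # wait + service time
    return statistics.fmean(sojourns)

for load in (0.5, 0.95):  # well below vs. near saturation
    runs = [one_run(load, seed=s) for s in range(20)]
    print(f"load {load:4.2f}: mean {statistics.fmean(runs):6.2f} s, "
          f"run-to-run stdev {statistics.stdev(runs):6.2f} s")
```

Near saturation the replication-to-replication standard deviation becomes comparable to the mean, which is exactly the behavior the 1.0 and 0.8 Gbits/sec runs exhibit.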
Quantitative Analysis
• Onset of latency is rapid, non-linear, brittle, etc.
• Repeatability in this regime is limited
  – Statistics difficult to quantify
  – Validity assessment very difficult with only MOPs, MOEs, or a few live exercises as a guide
• Tack:
  – Quantify statistical behavior of simulations
  – Validate where possible
  – Assess likely validity breakdowns
Validity Roll-Up—Indexing
• Recent NASA V&V work provides a roll-up framework
• Index validity on a 0–5 scale
• Assign a target validity index to each component
• Measured validity vs. target validity generates a validity gap
  – How to change statistical measures into the 0–5 scale?
  – How to assign target validity?
[Figure: four components scored on the 0–5 scale against their targets; the measured-vs-target gaps across the four components are 2, 1, 3, and none.]
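The slides leave the statistical-measure-to-index question open; one purely illustrative answer (our assumption, not the NASA framework) is to bin the fraction of runs that land within tolerance of the live-test value:

```python
# One illustrative (non-authoritative) mapping from a statistical
# measure to the 0-5 validity index: bin the fraction of runs whose
# metric falls within tolerance of the live-test value.

def validity_index(frac_within_tol: float) -> int:
    """Map a pass fraction in [0, 1] to a 0-5 index via six equal bins."""
    return min(5, int(frac_within_tol * 6))

for frac in (0.05, 0.40, 0.83, 1.00):
    print(f"pass fraction {frac:.2f} -> validity index {validity_index(frac)}")
```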
Validity Roll-Up
• Assign a risk associated with each component
• Roll up validity as the sum of risk-weighted gaps
  – May use max rather than sum (weakest-link premise)
Module Validity Gap:
\[
\text{gap}_{\text{module}} \;=\;
\frac{\sum_{i\,\in\,\text{components}} (\text{risk})_i \,(\text{validity gap})_i}
     {\sum_{i\,\in\,\text{components}} (\text{risk})_i}
\qquad\text{or}\qquad
\max_{i\,\in\,\text{components}} \Big[(\text{risk})_i \,(\text{validity gap})_i\Big]
\]
[Diagram: four component boxes, each carrying a validity score, a target validity, and a risk factor, feeding the module-level roll-up.]
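A hedged sketch of the roll-up arithmetic above; the component names, scores, and risk factors are invented for illustration (the gaps happen to mirror the 2/1/3/none example earlier):

```python
# Invented components: (target validity, measured validity, risk factor).
components = {
    "message slotting": (5, 3, 0.9),
    "latency model":    (4, 3, 0.7),
    "track generator":  (5, 2, 0.4),
    "terrain":          (3, 3, 0.2),
}

gaps  = {n: max(t - m, 0) for n, (t, m, _) in components.items()}
risks = {n: r for n, (_, _, r) in components.items()}

# Risk-weighted mean of the validity gaps ...
weighted = sum(risks[n] * gaps[n] for n in components) / sum(risks.values())
# ... or the weakest-link alternative: the worst risk-weighted gap.
weakest = max(risks[n] * gaps[n] for n in components)

print(f"risk-weighted module gap: {weighted:.2f}")
print(f"weakest-link module gap:  {weakest:.2f}")
```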
InterTEC Joint Fires Exercise
[Map: RED = small enemy nation; BLUE = small friendly nation. RED forces invade a boron-rich area in BLUE lands.]
Primary Battlespace Objects
• Threat
  – Integrated Air Defenses with CAP
  – Concentrations of Ground Forces, Stationary and Mobile
  – Truck Convoy (HVT)
• Joint Force
  – C2
    • CVN
    • AOC
    • GCCS-A
    • AEGIS
    • E-2C
    • E-3 AWACS
  – Sensor Platforms
    • EC-135
    • JSTARS
  – Strike Elements
    • F-16, F-22, F-35
    • F-18
    • MLRS
    • AEGIS
  – ECM
    • EA-6B Prowler
Mission Thread Overview
• Carrier Air Wing and Joint Air Forces Conduct Coordinated Strike to Destroy HVT Truck Convoy
• Targeting Thread:
  – EW Provides Initial Indicators and Focused Search for JSTARS
  – JSTARS Provides Continuous Track for Targeting
  – E-3 and E-2 Coordinate Strike and Targeting; Also Control Strike Aircraft
  – EA-6 Provides SEAD Support
  – FA-18, F-16 Conduct Strike
  – AEGIS and MLRS also Included in Joint Strike
  – E-2 Provides SAM Location with EA-6/EC-135
  – Objective is Closely Spaced (in time) Weapon Arrival on Target
• Threat Uses IAMD to Disrupt Strike
Joint Fires Scenario in Extend (so far)
[Screenshot: Joint Fires scenario in Extend, with the enemy convoy marked.]
• Link-16 models pulled from NETE
• Ground models pulled from ESP (FCS simulation)
Possible Network Topology
[Diagram: possible network topology, with surveillance networks (Link-16, JRE; EPLRS) and a strike control network (Link-16).]
• < Joint Fires Scenario Demo >
Status of Exercise
• Currently underway at Point Mugu
• Awaiting post-exercise artifacts
  – Clearance recently granted
• Plan
  – Examine exercise in detail
    • Model closely in Extend
    • Construct particular validation methodologies
  – Compare to live artifacts
    • Assess validity
    • Identify weaknesses/gaps in methodologies
“High-Level” Validation
• Is the level of fidelity appropriate?
• Is the constructive setup appropriate?
  – If the network is analyzed and computed ahead of time:
    • Is line-of-sight fidelity compromised?
    • Do latencies change as assets move?
    • Do connections change (à la cell tower hand-offs)?
• Red Force
  – Any networking capabilities modeled here?
  – Jamming, other counter-measures on blue comm assets?
  – Blue force anti-comm efforts against red?
• Other information axes
  – Accuracy, relevance (should realistic error rates be played?)
Wrap-Up
• Robust, quantitative validation needed for NCW simulations
• Missions × Means decomposition helps structure methodology
• Quantitative analysis just beginning
  – Still must address statistics of non-linear onset, etc.
• Validation roll-up methods provide an overall validity picture
  – “Recomposition” from the M × M decomposition
• Support of live Joint Fires Exercise just underway
Contact:
Wesley N. Colley, Ph.D.
Senior Research Scientist, CMSA
[email protected]