Recommended Acceptance Testing Procedure for Network Enabled Training Simulators
Peter Ross and Peter Clark
Defence Science & Technology Organisation
AUSTRALIA
Acceptance Testing – Distributed Simulation
Objective:
- To ensure the supplier has met the requirements of the
contract
Problem:
- It is difficult to gauge fulfilment of the requirements, as
often there is no immediate need to interface the
simulator with another
- Often it is cost prohibitive to conduct a trial with another
simulator, so test equipment is used as an alternative
- Whilst there is a wide range of test equipment available to
facilitate testing, there is no standard procedure for
applying these tools
Solution:
- Establish a procedure
Platform Training Simulator – Definition
A local term for:
– Human in the loop training simulator
– Platform level representation of the
battlespace
– Simulated in real-time
– ... and often big and expensive (order of tens
of millions)
Training Simulators – ADF Examples
– AEW&C operational mission simulator
– ANZAC team trainer
– AP-3C operational mission simulator
– AP-3C advanced flight simulator
– FFG-UP onboard training system and team
trainer
– Hornet aircrew training system
– Seasprite full mission flight simulator
– ARH Trainer
– ASLAV Crew Procedural Trainer
– ABRAMS trainer
Platform Training Simulator - Components
Generalisation only
Trainer – cockpit, ops room, bridge
Control Station – simulator configuration
and scenario control
Instructor/Asset Station – management of
additional role players within the scenario
Debrief – provides performance feedback to
trainees
Simulation Computer – calculations and
display rendering
Distributed Simulation Interface – enables
simulators to participate in a shared
virtual battlespace
Distributed Simulation 101
What is distributed simulation:
– The provision of a shared virtual battlespace
The problem:
– Internally, each simulator models the virtual
battlespace differently
The solution:
1. Use the same internal model across all
simulators; or
2. Adopt a simulation interoperability standard
– ALSP: Aggregate Level Simulation Protocol
– DIS: Distributed Interactive Simulation
– HLA: High Level Architecture
– TENA: Test and Training Enabling Architecture?
Simulation Interoperability Standards
Network Model – what information is exchanged between
simulators
– DIS: ground truth, WGS84
– HLA: flexible, but often based on DIS/RPR-FOM
Network Protocol – how the information is represented digitally
– DIS: Protocol Data Units (PDUs)
– HLA: flexible
Network Transport – how the information is transported between
simulators
– DIS: flexible; but UDP/IP is almost always used
– HLA: flexible
Flexibility is not always an advantage
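To make the DIS case concrete, the fragment below is a minimal sketch (not the paper's tooling): it packs the 12-byte PDU header defined by IEEE 1278.1 and sends it over UDP. The broadcast address, port 3000 and the exercise ID are illustrative assumptions.

    import socket
    import struct

    def make_pdu_header(exercise_id, pdu_type, protocol_family, timestamp, pdu_length):
        # IEEE 1278.1 PDU header, big-endian (network byte order):
        # protocol version, exercise ID, PDU type, protocol family (1 byte each),
        # timestamp (4 bytes), total PDU length in bytes (2), padding (2).
        return struct.pack(">BBBBIHH", 6, exercise_id, pdu_type,
                           protocol_family, timestamp, pdu_length, 0)

    # Header for an Entity State PDU (type 1, family 1); a complete PDU would
    # append the 132-byte entity state body after this header.
    header = make_pdu_header(exercise_id=1, pdu_type=1, protocol_family=1,
                             timestamp=0, pdu_length=144)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(header, ("255.255.255.255", 3000))  # port 3000 is an assumption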
Distributed Simulation Interface (1)
Combination of software and hardware
Performs two tasks
Translation – translate information between
the internal and network models
– e.g. coordinate conversion
Exchange – marshal information and send it
to other simulators (and vice-versa)
– e.g. storing information within PDUs and
outputting Ethernet frames
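As a minimal sketch of the translation task just described, the fragment below converts a geodetic position from a simulator's internal model to the WGS84 geocentric (ECEF) coordinates used by the DIS network model; the example latitude, longitude and altitude are illustrative.

    import math

    # WGS84 ellipsoid constants (standard values).
    WGS84_A = 6378137.0                # semi-major axis, metres
    WGS84_F = 1.0 / 298.257223563      # flattening
    WGS84_E2 = WGS84_F * (2.0 - WGS84_F)

    def geodetic_to_geocentric(lat_deg, lon_deg, alt_m):
        # Convert latitude/longitude/altitude to Earth-centred, Earth-fixed
        # X, Y, Z in metres, as required by the DIS network model.
        lat = math.radians(lat_deg)
        lon = math.radians(lon_deg)
        n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
        x = (n + alt_m) * math.cos(lat) * math.cos(lon)
        y = (n + alt_m) * math.cos(lat) * math.sin(lon)
        z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
        return x, y, z

    # Illustrative ownship position (roughly Adelaide, Australia).
    print(geodetic_to_geocentric(-34.9, 138.6, 100.0))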
Distributed Simulation Interface (2) – Layers
Translation
– Internal Model: DIS – the simulator's internal model; HLA – defined by the 'Simulation Object Model'; ISO/OSI layer 7 (Application)
– Network Model: DIS – PDU types; HLA – defined by the 'Federation Object Model'; ISO/OSI layer 7 (Application)
Exchange
– Network Protocol: DIS – byte ordering, data structures, heartbeats, timeouts; HLA – defined by the 'Run Time Infrastructure'
– Network Transport: DIS – UDP/IP; HLA – typically IP
– ISO/OSI layers 6 (Presentation) down to 1 (Physical)
(Information flows down the layers when sending and up the layers when receiving.)
Interoperability – Three Levels
Compliant – the distributed simulation interface is
implemented in accordance with the relevant
standards
Achieved at acceptance testing stage
Interoperable – two or more simulators can
participate in a distributed training exercise
Achieved at requirements specification stage
Compatible – two or more simulators can
participate in a distributed training exercise and
achieve training objectives
Achieved at training needs analysis stage
Acceptance Testing – Technical Reasons
Distributed simulation standards are often
ambiguous; engineers will form their own
interpretations of the standard
– Two compliant simulators may not interoperate due to
these interpretations
Network protocols are intolerant of implementation
errors
– One incorrectly set bit is sufficient to prevent
interoperability
Resolving defects after the Defence Department
takes ownership of the simulator is often
expensive
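To illustrate how little it takes, the sketch below (an assumed receiver behaviour, not the paper's procedure) validates the DIS PDU header and discards anything that does not parse cleanly; a sender that packs one field with the wrong byte order is silently dropped.

    import struct

    def check_header(datagram):
        # Reject datagrams whose IEEE 1278.1 PDU header is inconsistent.
        if len(datagram) < 12:
            return "reject: truncated header"
        version, exercise, pdu_type, family, timestamp, length, _pad = \
            struct.unpack(">BBBBIHH", datagram[:12])
        if length != len(datagram):
            return "reject: declared length %d, received %d bytes" % (length, len(datagram))
        return "accept: PDU type %d, exercise %d" % (pdu_type, exercise)

    good = struct.pack(">BBBBIHH", 6, 1, 1, 1, 0, 12, 0)
    # Length field packed little-endian by mistake: reads back as 3072, not 12.
    bad = struct.pack(">BBBBI", 6, 1, 1, 1, 0) + struct.pack("<HH", 12, 0)
    print(check_header(good))   # accept
    print(check_header(bad))    # reject: declared length 3072, received 12 bytes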
Recommended Procedure - Overview
Three stages:
1) Planning
2) Test Activity
3) Documentation
Recommended Procedure - Planning
The When, Where, What, Why, Who, How …
Functionality being tested; not all of the simulator’s
capabilities are represented by the network
model
Manning: arms and legs to operate the various
components of the simulator
Data concerns
– Enumerations (e.g. platform types)
– Geographic locations
– Classification
Network media compatibility: 10base2 …
100baseFX
Test equipment availability and compatibility
Schedule
Recommended Procedure - Test Activity (1)
Black box testing
Test cases are applied to the two exposed interfaces:
– HMI – Human Machine Interface
– NIC – Network Interface Card
Recommended Procedure - Test Activity (2)
Deploy the team and equipment to the training
or contractor facility
Test cases are categorised into three types:
Configuration Testing – verify the simulator can
be (and is) configured appropriately for a
distributed training exercise
Send Testing – verify information sent by the
simulator is correct
Receive Testing – verify information received
by the simulator is correct
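A minimal sketch of a send test follows, assuming DIS Entity State PDUs on UDP port 3000 and an expected Australian air-platform enumeration (all illustrative): capture one datagram at the NIC and compare the entity type fields with the platform selected at the HMI.

    import socket
    import struct

    EXPECTED = (1, 2, 13, 1)   # kind=platform, domain=air, country=Australia, category (illustrative)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 3000))                 # port is an assumption
    datagram, _addr = sock.recvfrom(2048)

    if datagram[2] == 1:                  # byte 2 of the header is the PDU type; 1 = Entity State
        # The Entity Type record starts at byte 20 of an Entity State PDU.
        kind, domain, country, category = struct.unpack(">BBHB", datagram[20:25])
        verdict = "pass" if (kind, domain, country, category) == EXPECTED else "fail"
        print("entity type %d:%d:%d:%d -> %s" % (kind, domain, country, category, verdict))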
Recommended Procedure – Test Activity (3)
Offsite analysis
Time spent with the simulator is likely to be
“precious”
It is desirable to perform lengthy analysis of
the data elsewhere
To facilitate this:
– Relevant HMI actions and network data are
recorded in a test log
– Log entries are time-stamped to enable
correlation of events
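A minimal sketch of such a log, with the file name and entry format as illustrative assumptions: HMI actions and captured network data are written with a common time-stamp so they can be correlated during offsite analysis.

    import datetime

    def log(entry_type, detail, logfile="test_log.txt"):
        # One line per event: UTC time-stamp, source (HMI or NET), free-text detail.
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        with open(logfile, "a") as f:
            f.write("%s\t%s\t%s\n" % (stamp, entry_type, detail))

    log("HMI", "Operator set ownship IFF mode 3 code to 1200")
    log("NET", "IFF PDU captured, saved as capture_0042.bin")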
The Procedure – Documentation (1)
Report findings to Project Authority
Indicate whether the distributed simulation
component of the simulator should be
“accepted”
If not, make recommendations for change
Testing should be repeated where there are
significant problems
The Procedure – Documentation (2)
Problems are highlighted by severity. Our adopted
scheme:
FAULT
– Has potential to prevent interoperability with another
simulator
– Resolution advised
ISSUE
– Does not comply with the standard, or lacks some
functionality; however, it is unlikely to prevent
interoperability with another simulator.
– Resolution desirable
ACTION
– Test results insufficient to draw firm conclusion
– Further investigation advised
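A minimal sketch of how findings might be recorded under this scheme; the fields and the example entries are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class Finding:
        severity: str          # "FAULT", "ISSUE" or "ACTION"
        test_case: str         # identifier of the test case that raised it
        description: str
        recommendation: str

    findings = [
        Finding("FAULT", "TC-ES-04",
                "Orientation angles transmitted in degrees rather than radians",
                "Resolution advised"),
        Finding("ACTION", "TC-IFF-02",
                "Mode C response could not be exercised in the time available",
                "Further investigation advised"),
    ]
    print(sum(1 for f in findings if f.severity == "FAULT"), "fault(s) recorded")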
Test Case Development (1)
Test cases demonstrate the fulfilment of a specific
distributed simulation requirement.
Test cases must be documented and must reference the
requirement, along with any interpretations or assumptions
made by the test engineer (a sketch follows the examples below)
Requirements exist at different “layers” of the
distributed simulation. Some examples:
– Training – simulation of IFF modes 1, 2, 3 and Charlie
– Network Model – issuance and receipt of the IFF object
– Network Protocol – population of the IFF PDU
– Network Transport – network host address, port
numbers
– Network Hardware – provision of a 100baseTX NIC
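A minimal sketch of a documented test case of the kind described above, tracing to a requirement and recording the engineer's interpretation; the identifiers, requirement text and steps are illustrative assumptions.

    # Illustrative test-case record; the field names are assumptions, not a standard.
    test_case = {
        "id": "TC-IFF-01",
        "requirement": "Issuance of the IFF object for modes 1, 2, 3 and C (illustrative clause)",
        "layer": "Network Model",
        "interpretation": "Mode C altitude is taken to be pressure altitude",
        "steps": [
            "Set ownship IFF mode 3 code to 1200 at the HMI",
            "Capture traffic at the NIC and decode the IFF PDU",
            "Confirm the mode 3 code field reads 1200",
        ],
    }
    print(test_case["id"], "->", test_case["layer"])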
Testing Tools
Tool – Layer (each tool is further classified by function: Generate, Monitor, Record/Replay)
– Tcpdump / Ethereal – Transport
– PR Log – Protocol
– LZ Netdump – Protocol
– MaK Logger – Protocol
– MaK Netdump – Protocol
– PDU Generator – Protocol
– DISCommWin (Radio) – Model
– Airline Scheduler – Model
– World View – Model
– DIS Test Suite – Model
– LZ Entity Generator – Model
– MaK F18 – Model
– MaK PVD – Model
– MaK Stealth – Model
– MEG – Model
Test Case Development (2)
Emphasis is placed on testing the network
model and protocol requirements; the other
requirements are often easier to verify
For each object and interaction supported by
the simulator, test cases attempt to exercise
all relevant software execution paths
– Exercise all relevant aspects of the HMI
– Exercise all relevant fields within all
supported objects and interactions
– Exercise the relationship between the HMI and
NIC
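A minimal sketch of exercising the relevant field values for one supported object; the chosen fields and value sets are illustrative assumptions, not the library's actual test cases.

    import itertools

    # Illustrative value sets for three Entity State fields.
    force_ids = [1, 2, 3]            # friendly, opposing, neutral
    damage_states = [0, 1, 2, 3]     # appearance damage bits: none .. destroyed
    marking_texts = ["ALPHA", "BRAVO"]

    test_matrix = list(itertools.product(force_ids, damage_states, marking_texts))
    for n, (force, damage, marking) in enumerate(test_matrix, start=1):
        print("case %02d: force=%d damage=%d marking=%s" % (n, force, damage, marking))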
Test Case Development (3) - Example
Sample Faults Noted
Simulator A sends entity ID #0:0:0; Simulator B
crashes on receipt of entity ID #0:0:0
Azimuth is reported in degrees instead of radians;
power is reported in milliwatts instead of dB referenced
to one milliwatt (dBm)
Entity enumeration field is hard-coded and
indicates the ownship is subsurface life form
3:4:225:4:0:0:0 (a whale)
“Based on a true story…”
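Unit faults of this kind show up immediately when the expected conversions are applied to captured values; the numbers below are illustrative.

    import math

    azimuth_deg = 90.0
    azimuth_rad = math.radians(azimuth_deg)       # 1.5708 rad; a captured value of 90.0 is suspect
    print("azimuth: %.4f rad, not %.1f" % (azimuth_rad, azimuth_deg))

    power_mw = 2000.0
    power_dbm = 10.0 * math.log10(power_mw)       # 33.0 dB referenced to one milliwatt
    print("power: %.1f dBm, not %.1f mW" % (power_dbm, power_mw))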
Summary
DSTO has developed a library of test cases for
IEEE 1278.1/A - Distributed Interactive
Simulation Application Protocol
The library evaluates many of the common
object and interaction types
Simulators are fundamentally different;
tailoring the test cases is almost always
necessary
The procedure and library have been applied to
several Australian Defence training
simulators