IEEE 1730.1-2023
Price: $57.96
IEEE Recommended Practice for Distributed Simulation Engineering and Execution Process Multi-Architecture Overlay (Approved Draft)
Published By | Publication Date | Number of Pages |
---|---|---|
IEEE | 2023 | 95 |
Revision Standard – Active. This recommended practice describes how to apply the Distributed Simulation Engineering and Execution Process (DSEEP) to the development and execution of distributed simulation environments that include more than one distributed simulation architecture. The distributed simulation architectures to which it applies include Distributed Interactive Simulation (DIS), High Level Architecture (HLA), and the Test and Training Enabling Architecture (TENA). The DSEEP Multi-Architecture Overlay (DMAO) identifies and describes multi-architecture issues and recommends actions for simulation environment developers who face those issues. It also augments the DSEEP lists of inputs, recommended tasks, and outcomes with additional entries that apply to multi-architecture simulation environments. This document is an overlay to the DSEEP, which is published separately as IEEE Std 1730.
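To make the kind of issue the overlay addresses concrete, here is a minimal sketch, assuming a hypothetical simulation environment whose member applications natively speak DIS, HLA, and TENA. The member names and the Python structures are illustrative, not part of the standard; real gateway planning (see 4.3.2.2.1 and 4.3.2.2.6 in the contents below) also weighs gateway chaining, performance, and data-model fidelity. The sketch only enumerates which architecture pairs need some gateway translation path so that every member can exchange data.

```python
from enum import Enum
from itertools import combinations

class Architecture(Enum):
    """Distributed simulation architectures named in the DMAO."""
    DIS = "DIS"    # Distributed Interactive Simulation (IEEE 1278 family)
    HLA = "HLA"    # High Level Architecture (IEEE 1516 family)
    TENA = "TENA"  # Test and Training Enabling Architecture

# Hypothetical member applications and the architecture each one speaks.
members = {
    "flight_sim": Architecture.DIS,
    "c2_federate": Architecture.HLA,
    "range_instrumentation": Architecture.TENA,
}

def required_translation_paths(members: dict) -> set:
    """Return every pair of architectures present in the environment;
    each pair needs a gateway (or a chain of gateways) before all
    members can exchange object state and interactions."""
    used = sorted({arch.value for arch in members.values()})
    return set(combinations(used, 2))

for a, b in sorted(required_translation_paths(members)):
    print(f"gateway translation path needed: {a} <-> {b}")
```

With three architectures present, three pairwise paths fall out, and each one raises the Clause 4 design questions (object state mapping, time management, identifier uniqueness) that the overlay walks through step by step.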
PDF Catalog
PDF Pages | PDF Title |
---|---|
1 | IEEE Std 1730.1-2023 Front Cover |
2 | Title page |
4 | Important Notices and Disclaimers Concerning IEEE Standards Documents |
8 | Participants |
9 | Introduction |
14 | Contents |
15 | 1. Overview; 1.1 Scope; 1.2 Purpose; 1.3 Word usage |
16 | 2. Normative references; 3. Definitions, acronyms, and abbreviations; 3.1 Definitions |
17 | 3.2 Acronyms and abbreviations |
20 | 4. Multi-architecture issues and solutions |
22 | 4.1 Step 1: Define simulation environment objectives; 4.1.1 Activity 1.1: Identify user/sponsor needs; 4.1.1.1 Purpose |
23 | 4.1.1.2 Issues; 4.1.2 Activity 1.2: Develop objectives; 4.1.2.1 Purpose; 4.1.2.2 Issues; 4.1.3 Activity 1.3: Conduct initial planning; 4.1.3.1 Purpose; 4.1.3.2 Issues; 4.1.3.2.1 Multi-architecture initial planning |
24 | 4.1.3.2.2 Required multi-architecture simulation environment expertise |
25 | 4.1.3.2.3 Inconsistent development and execution processes |
26 | 4.1.3.2.4 VV&A for multi-architecture applications |
28 | 4.1.3.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 1.3; 4.2 Step 2: Perform conceptual analysis |
29 | 4.2.1 Activity 2.1: Develop scenario; 4.2.1.1 Purpose |
30 | 4.2.1.2 Issues; 4.2.2 Activity 2.2: Develop conceptual model; 4.2.2.1 Purpose; 4.2.2.2 Issues; 4.2.3 Activity 2.3: Develop simulation environment requirements; 4.2.3.1 Purpose |
31 | 4.2.3.2 Issues; 4.2.3.2.1 Requirements for multi-architecture simulation environment; 4.2.3.2.2 Member application requirement incompatibility |
33 | 4.2.3.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 2.3; 4.3 Step 3: Design simulation environment |
34 | 4.3.1 Activity 3.1: Select member applications; 4.3.1.1 Purpose; 4.3.1.2 Issues; 4.3.1.2.1 Member application selection criteria for multi-architecture simulation environments |
35 | 4.3.1.2.2 Non-conforming member applications |
38 | 4.3.1.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 3.1; 4.3.2 Activity 3.2: Design simulation environment; 4.3.2.1 Purpose |
39 | 4.3.2.2 Issues; 4.3.2.2.1 Gateway usage and selection decisions |
41 | 4.3.2.2.2 Object state update contents; 4.3.2.2.3 Object ownership management |
43 | 4.3.2.2.4 Time management in multi-architecture simulation environments |
45 | 4.3.2.2.5 Interest management capability differences |
46 | 4.3.2.2.6 Gateway translation paths |
47 | 4.3.2.2.7 DIS heartbeat translation; 4.3.2.2.8 Multi-architecture and inter-architecture performance |
48 | 4.3.2.2.9 Translating non-ground-truth network data |
49 | 4.3.2.2.10 Object identifier uniqueness and compatibility |
50 | 4.3.2.2.11 Cross domain solutions in multi-architecture simulation environments |
51 | 4.3.2.2.12 Multi-architecture save and restore; 4.3.2.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 3.2 |
53 | 4.3.3 Activity 3.3: Design member applications; 4.3.3.1 Purpose; 4.3.3.2 Issues; 4.3.3.2.1 New member application architecture |
54 | 4.3.3.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 3.3 |
55 | 4.3.4 Activity 3.4: Prepare detailed plan; 4.3.4.1 Purpose; 4.3.4.2 Issues; 4.3.4.2.1 Cost and schedule estimating for multi-architecture development |
56 | 4.3.4.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 3.4 |
57 | 4.4 Step 4: Develop simulation environment; 4.4.1 Activity 4.1: Develop simulation data exchange model; 4.4.1.1 Purpose |
58 | 4.4.1.2 Issues; 4.4.1.2.1 Metamodel incompatibilities |
59 | 4.4.1.2.2 SDEM content incompatibilities |
61 | 4.4.1.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 4.1 |
62 | 4.4.2 Activity 4.2: Establish simulation environment agreements; 4.4.2.1 Purpose; 4.4.2.2 Issues; 4.4.2.2.1 Agreements to address multi-architecture development |
64 | 4.4.2.2.2 Tool availability and compatibility |
66 | 4.4.2.2.3 Initialization sequencing and synchronization |
67 | 4.4.2.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 4.2 |
68 | 4.4.3 Activity 4.3: Implement member application designs; 4.4.3.1 Purpose |
69 | 4.4.3.2 Issues; 4.4.3.2.1 Nonstandard algorithms; 4.4.3.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 4.3 |
70 | 4.4.4 Activity 4.4: Implement simulation environment infrastructure; 4.4.4.1 Purpose; 4.4.4.2 Issues; 4.4.4.2.1 Network configuration |
71 | 4.4.4.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 4.4 |
72 | 4.5 Step 5: Integrate and test simulation environment; 4.5.1 Activity 5.1: Plan execution; 4.5.1.1 Purpose |
73 | 4.5.1.2 Issues; 4.5.1.2.1 Integration and test planning for multi-architecture simulation environments; 4.5.1.2.2 Multi-architecture execution planning considerations |
74 | 4.5.1.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 5.1; 4.5.2 Activity 5.2: Integrate simulation environment; 4.5.2.1 Purpose; 4.5.2.2 Issues; 4.5.2.2.1 Live entity time, space, and position information updates |
76 | 4.5.2.3 Multi-architecture specific inputs, tasks, and outcomes for integrate simulation environment; 4.5.3 Activity 5.3: Test simulation environment; 4.5.3.1 Purpose |
77 | 4.5.3.2 Issues; 4.5.3.2.1 Complexities of testing in a multi-architecture simulation environment |
78 | 4.5.3.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 5.3; 4.6 Step 6: Execute simulation |
79 | 4.6.1 Activity 6.1: Execute simulation; 4.6.1.1 Purpose; 4.6.1.2 Issues; 4.6.1.2.1 Monitoring and controlling multi-architecture simulation environment execution |
81 | 4.6.1.2.2 Multi-architecture data collection |
83 | 4.6.1.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 6.1; 4.6.2 Activity 6.2: Prepare simulation environment outputs; 4.6.2.1 Purpose |
84 | 4.6.2.2 Issues; 4.7 Step 7: Analyze data and evaluate results; 4.7.1 Activity 7.1: Analyze data; 4.7.1.1 Purpose; 4.7.1.2 Issues |
85 | 4.7.2 Activity 7.2: Evaluate and feedback results; 4.7.2.1 Purpose; 4.7.2.2 Issues; 4.7.2.2.1 Multi-architecture simulation environment assessment |
86 | 4.7.2.3 Multi-architecture specific inputs, tasks, and outcomes for Activity 7.2 |
87 | Annex A (informative) Bibliography |
95 | Back Cover |