IEEE 1730.1-2013
IEEE Recommended Practice for Distributed Simulation Engineering and Execution Process Multi-Architecture Overlay (DMAO)
Published By | Publication Date | Number of Pages |
---|---|---|
IEEE | 2013 | 91 |
New IEEE Standard – Active. A recommended practice for applying the Distributed Simulation Engineering and Execution Process (DSEEP) to the development and execution of distributed simulation environments that include more than one distributed simulation architecture is described. The distributed simulation architectures to which the recommended practice applies include Distributed Interactive Simulation (DIS), High Level Architecture (HLA), and Test and Training Enabling Architecture (TENA). The DSEEP Multi-Architecture Overlay (DMAO) identifies and describes multi-architecture issues and provides recommended actions for simulation environment developers faced with those issues. The DMAO also augments the DSEEP lists of inputs, recommended tasks, and outcomes with additional inputs, recommended tasks, and outcomes that apply to multi-architecture simulation environments. This document is an overlay to the DSEEP, which is a separate recommended practice.
PDF Catalog
PDF Pages | PDF Title |
---|---|
1 | IEEE Std 1730.1-2013 Cover |
3 | Title page |
4 | Abstract/Keywords |
5 | Important Notices and Disclaimers Concerning IEEE Standards Documents |
8 | Participants |
10 | Introduction |
14 | Contents |
15 | IMPORTANT NOTICE; 1. Overview |
16 | 1.1 Scope; 2. Normative references; 3. Definitions, abbreviations, and acronyms; 3.1 Definitions |
17 | 3.2 Acronyms and abbreviations |
18 | 4. Multi-architecture issues and solutions |
20 | 4.1 Step 1: Define simulation environment objectives; 4.1.1 Activity 1.1: Identify user/sponsor needs; 4.1.1.1 Issues |
21 | 4.1.2 Activity 1.2: Develop objectives; 4.1.2.1 Issues; 4.1.3 Activity 1.3: Conduct initial planning; 4.1.3.1 Issues; 4.1.3.1.1 Multi-architecture initial planning |
22 | 4.1.3.1.2 Required multi-architecture simulation environment expertise |
23 | 4.1.3.1.3 Inconsistent development and execution processes |
24 | 4.1.3.1.4 VV&A for multi-architecture applications |
26 | 4.1.3.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 1.3 (conduct initial planning); 4.2 Step 2: Perform conceptual analysis |
27 | 4.2.1 Activity 2.1: Develop scenario; 4.2.1.1 Issues |
28 | 4.2.2 Activity 2.2: Develop conceptual model; 4.2.2.1 Issues; 4.2.3 Activity 2.3: Develop simulation environment requirements |
29 | 4.2.3.1 Issues; 4.2.3.1.1 Requirements for multi-architecture simulation environment; 4.2.3.1.2 Member application requirement incompatibility |
30 | 4.2.3.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 2.3 (develop simulation environment requirements) |
31 | 4.3 Step 3: Design simulation environment; 4.3.1 Activity 3.1: Select member applications |
32 | 4.3.1.1 Issues; 4.3.1.1.1 Member application selection criteria for multi-architecture simulation environments |
33 | 4.3.1.1.2 Non-conforming member applications |
36 | 4.3.1.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 3.1 (select member applications); 4.3.2 Activity 3.2: Design simulation environment |
37 | 4.3.2.1 Issues; 4.3.2.1.1 Gateway usage and selection decisions |
39 | 4.3.2.1.2 Object state update contents; 4.3.2.1.3 Object ownership management |
41 | 4.3.2.1.4 Time management in multi-architecture simulation environments |
43 | 4.3.2.1.5 Interest management capability differences |
44 | 4.3.2.1.6 Gateway translation paths; 4.3.2.1.7 DIS heartbeat translation |
45 | 4.3.2.1.8 Multi-architecture and inter-architecture performance |
46 | 4.3.2.1.9 Translating non-ground-truth network data |
47 | 4.3.2.1.10 Object identifier uniqueness and compatibility |
48 | 4.3.2.1.11 Cross-domain solutions (CDSs) in multi-architecture simulation environments; 4.3.2.1.12 Multi-architecture save and restore |
49 | 4.3.2.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 3.2 (design simulation environment) |
50 | 4.3.3 Activity 3.3: Design member applications |
51 | 4.3.3.1 Issues; 4.3.3.1.1 New member application architecture |
52 | 4.3.3.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 3.3 (design member applications); 4.3.4 Activity 3.4: Prepare detailed plan |
53 | 4.3.4.1 Issues; 4.3.4.1.1 Cost and schedule estimating for multi-architecture development |
54 | 4.3.4.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 3.4 (prepare detailed plan); 4.4 Step 4: Develop simulation environment |
55 | 4.4.1 Activity 4.1: Develop simulation data exchange model |
56 | 4.4.1.1 Issues; 4.4.1.1.1 Metamodel incompatibilities |
57 | 4.4.1.1.2 SDEM content incompatibilities |
59 | 4.4.1.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 4.1 (develop simulation data exchange model); 4.4.2 Activity 4.2: Establish simulation environment agreements |
60 | 4.4.2.1 Issues; 4.4.2.1.1 Agreements to address multi-architecture development |
61 | 4.4.2.1.2 Tool availability and compatibility |
63 | 4.4.2.1.3 Initialization sequencing and synchronization |
64 | 4.4.2.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 4.2 (establish simulation environment agreements) |
65 | 4.4.3 Activity 4.3: Implement member application designs |
66 | 4.4.3.1 Issues; 4.4.3.1.1 Nonstandard algorithms; 4.4.3.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 4.3 (implement member application designs); 4.4.4 Activity 4.4: Implement simulation environment infrastructure |
67 | 4.4.4.1 Issues; 4.4.4.1.1 Network configuration |
68 | 4.4.4.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 4.4 (implement simulation environment infrastructure) |
69 | 4.5 Step 5: Integrate and test simulation environment; 4.5.1 Activity 5.1: Plan execution |
70 | 4.5.1.1 Issues; 4.5.1.1.1 Integration and test planning for multi-architecture simulation environments; 4.5.1.1.2 Multi-architecture execution planning considerations; 4.5.1.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 5.1 (plan execution) |
71 | 4.5.2 Activity 5.2: Integrate simulation environment; 4.5.2.1 Issues; 4.5.2.1.1 Live entity time, space, and position information (TSPI) updates |
72 | 4.5.2.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 5.2 (integrate simulation environment) |
73 | 4.5.3 Activity 5.3: Test simulation environment; 4.5.3.1 Issues; 4.5.3.1.1 Complexities of testing in a multi-architecture simulation environment |
75 | 4.5.3.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 5.3 (test simulation environment); 4.6 Step 6: Execute simulation; 4.6.1 Activity 6.1: Execute simulation |
76 | 4.6.1.1 Issues; 4.6.1.1.1 Monitoring and controlling multi-architecture simulation environment execution |
78 | 4.6.1.1.2 Multi-architecture data collection |
80 | 4.6.1.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 6.1 (execute simulation); 4.6.2 Activity 6.2: Prepare simulation environment outputs; 4.6.2.1 Issues; 4.7 Step 7: Analyze data and evaluate results |
81 | 4.7.1 Activity 7.1: Analyze data; 4.7.1.1 Issues; 4.7.2 Activity 7.2: Evaluate and feedback results |
82 | 4.7.2.1 Issues; 4.7.2.1.1 Multi-architecture simulation environment assessment |
83 | 4.7.2.2 Multi-architecture-specific inputs, tasks, and outcomes for Activity 7.2 (evaluate and feedback results) |
84 | Annex A (informative) Bibliography |