SWE-073 - Platform or Hi-Fidelity Simulations

1. Requirements

3.4.9 The project shall ensure that the software system is validated on the targeted platform or high-fidelity simulation.

1.1 Notes

Typically, a high-fidelity simulation has the same processor, processor performance, timing, memory size, and interfaces as the flight unit.

1.2 Applicability Across Classes

Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G    | H
Applicable? |      |       |      |       |      |       |      |       |      |       |   | P(C) |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable with details, read above for more | (blank) - Not Applicable | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

Validation is a process of evaluating work products to ensure that the right behaviors have been built into them. The right behaviors adequately describe what the system is supposed to do, including what it is supposed to do under adverse conditions; they may also describe what the system is not supposed to do.

3. Guidance

Validation is performed to assure that the specified software systems fulfill their intended use when placed on the targeted platform in the target environment (or a simulated target environment). The methods used to accomplish validation on the actual target platform or in a high-fidelity simulator may include aspects that were applied to earlier software work products (requirements, designs, prototypes, etc.); reusing these methods provides continuity of results as the system is assembled. Using the high-fidelity or targeted system allows the software developers to check systems-level interfaces, memory performance and constraints, event timing, and other characteristics that can only be evaluated properly in the real system or a near-system environment (see SWE-055). Validation activities include preparation, performance, analysis of results, and identification of corrective action. Validation at the systems level ensures that the correct product has been built.

[Figure: the basic validation process, with the steps addressed by this requirement highlighted]

Validation activities are not to be confused with verification activities; each has a specific goal. Validation confirms that the right product is being produced, while verification confirms that the product being produced meets its specified requirements correctly.

See SWE-055 for additional information on requirements validation during the concept, design, coding, and initial testing phases of the software development life cycle. Validation, as used in this requirement, addresses the system level: once the software work products have been integrated into a software system, validation activities concentrate on systems-level effects, interactions, interfaces, and the overall behavior of the system (i.e., whether the system is providing for and meeting the needs of the customer). This level of validation can be accomplished either in an actual operational environment using the targeted platform or, if that combination is not viable, on a high-fidelity simulator. Recall from the note associated with this requirement that a high-fidelity simulation typically has the same processor, processor performance, timing, memory size, and interfaces as the flight unit.

The following scenarios provide additional considerations for selecting the most appropriate validation approach at the systems level:

See Lessons Learned for other considerations related to simulated environment validation. Also, consider user-created operational scenarios, when appropriate; they can be a valuable tool in either simulated or operational environments.
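For illustration, the sketch below shows one form such a scenario-driven, system-level check might take: a user-created operational scenario is driven against the flight unit or high-fidelity simulator over its command/telemetry interface, and each response is checked against an event-timing budget. This is a minimal sketch, not a prescribed method; the UDP endpoint, command and telemetry formats, scenario steps, and timing budgets are all hypothetical assumptions rather than values from this requirement.

```python
import socket
import time

# Hypothetical command/telemetry endpoint of the flight unit or
# high-fidelity simulator; the address, port, and message formats
# are assumptions for illustration only.
TARGET_ADDR = ("192.168.1.50", 5025)
TIMEOUT_S = 2.0

# A user-created operational scenario: each step names a command to send,
# the telemetry keyword expected in the reply, and a latency budget (seconds).
SCENARIO = [
    {"cmd": b"MODE SAFE",    "expect": b"MODE=SAFE",    "budget_s": 0.100},
    {"cmd": b"HEATER ON",    "expect": b"HEATER=ON",    "budget_s": 0.050},
    {"cmd": b"DOWNLINK 1HZ", "expect": b"DOWNLINK=1HZ", "budget_s": 0.200},
]

def run_scenario(addr, scenario):
    """Drive the scenario over UDP, recording a verdict and latency per step."""
    results = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(TIMEOUT_S)
        for step in scenario:
            start = time.monotonic()
            sock.sendto(step["cmd"], addr)
            try:
                reply, _ = sock.recvfrom(1024)
            except socket.timeout:
                results.append((step["cmd"], "NO RESPONSE", None))
                continue
            latency = time.monotonic() - start
            ok = step["expect"] in reply and latency <= step["budget_s"]
            results.append((step["cmd"], "PASS" if ok else "FAIL", latency))
    return results

if __name__ == "__main__":
    for cmd, verdict, latency in run_scenario(TARGET_ADDR, SCENARIO):
        shown = "n/a" if latency is None else f"{latency * 1000:.1f} ms"
        print(f"{cmd.decode():<14} {verdict:<12} latency={shown}")
```

Scripts of this kind map onto the activities named above: building them is part of validation preparation, running them is the performance step, and their logged results feed the analysis-of-results and corrective-action steps.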
Additional guidance related to platform or high-fidelity simulations may be found in other related requirements in this Handbook, including SWE-055.

4. Small Projects

The small project does not normally involve highly complex platforms, so it is generally easier and cheaper to validate software systems on the targeted platform. However, the environment for space systems typically needs to be simulated during validation, regardless of project size. When using simulated platforms, small projects are advised to look for existing tools rather than creating their own.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to simulations:
The Recommendation states: "When it is only feasible to test 'last minute' command changes or flight software changes via simulation, instead of using the flight system that has been integrated with the launch vehicle, assure that the simulation testbed is capable of end-to-end verification of the impact on all flight software functions, including fault protection. Should the system testbed lack high fidelity features such as dual string simulation, the project should identify potential testing shortfalls and address how it will validate the test results." (578)
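One way a project might act on this recommendation is to keep an end-to-end regression set for the simulation testbed that covers the flight software functions a late change could affect, including the fault-protection responses, and that explicitly reports any case the testbed cannot represent. The sketch below is a hypothetical illustration under those assumptions; the testbed interface, case names, and fault list are invented for the example and are not drawn from the lesson.

```python
# Hypothetical end-to-end regression driver for a simulation testbed.
# It exercises nominal flight software functions and injected faults, and
# reports any case the testbed cannot represent as a testing shortfall
# that must be validated by other means.
NOMINAL_CASES = ["attitude_control", "command_uplink", "telemetry_downlink"]
FAULT_CASES = ["thruster_stuck_open", "star_tracker_dropout", "bus_undervoltage"]

def run_case(testbed, case, inject_fault=False):
    """Run one case; return 'PASS', 'FAIL', or 'NOT COVERED'."""
    if not testbed.supports(case):
        return "NOT COVERED"           # shortfall: testbed lacks this fidelity
    testbed.reset()
    if inject_fault:
        testbed.inject_fault(case)     # exercise the fault-protection response
    return "PASS" if testbed.run(case) else "FAIL"

def regression(testbed):
    """Run all nominal and fault cases; return the report and the shortfalls."""
    report = {}
    for case in NOMINAL_CASES:
        report[case] = run_case(testbed, case)
    for case in FAULT_CASES:
        report[case] = run_case(testbed, case, inject_fault=True)
    shortfalls = [c for c, r in report.items() if r == "NOT COVERED"]
    return report, shortfalls

class StubTestbed:
    """Stand-in for a project-specific testbed driver; replace with real hooks."""
    COVERED = set(NOMINAL_CASES) | {"thruster_stuck_open", "star_tracker_dropout"}

    def supports(self, case):
        return case in self.COVERED

    def reset(self):
        pass

    def inject_fault(self, case):
        pass

    def run(self, case):
        return True                    # stub always passes; a real run compares telemetry

if __name__ == "__main__":
    report, shortfalls = regression(StubTestbed())
    for case, result in report.items():
        print(f"{case:<22} {result}")
    if shortfalls:
        print("Testing shortfalls to address:", ", ".join(shortfalls))
```

Keeping the shortfall list explicit gives the project a concrete artifact for deciding how those results will be validated by other means, as the recommendation asks.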