- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.
1.1 Notes
Information regarding specific V&V techniques and the analysis of models and simulations can be found in NASA-STD-7009, Standard for Models and Simulations, NASA-HDBK-7009, Handbook for Models and Simulations, or discipline-specific recommended practice guides.
1.2 History
1.3 Applicability Across Classes
Applicability matrix for Classes A through F (per-class applicability markings are rendered graphically in the source and are not reproduced here).
2. Rationale
Performing verification and validation (V&V) to accredit software models, simulations, and analysis tools is important to ensure the credibility of the results produced by those tools. Critical decisions may be made, at least in part, based on the results produced by models, simulations, and analysis tools. Reducing the risk associated with these decisions is one reason to use accredited tools that have been properly verified and validated.
See also SWE-066 - Perform Testing.
3. Guidance
When using a model, there are many ways an analysis can produce the wrong result: a model can be incorrect for the scenario being analyzed, software can incorrectly implement a model, a user can incorrectly operate software, etc. Thus, the model, the software and hardware, and the processes to operate the model need appropriate V&V.
3.1 Accrediting Models and Simulations
The processes of V&V are key activities for accrediting all types of models and simulations. To provide a basis to determine if the models, simulations, and analysis tools are acceptable for use for a specific purpose, there are three pieces of information to capture and address from the modeling and simulation (M&S) characterizations as part of this process:
- The question(s) to be answered and the particular aspects of the problem that the M&S will be used to help address.
- The decisions that will be made based on the M&S results.
- The consequences of erroneous M&S outputs. 175
In addition, be aware of the constraints and limitations of the M&S (i.e., what it can and cannot do).
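As one illustration of how a project might capture these characterization items so they travel with the accreditation evidence, the sketch below uses a simple Python record. The class, fields, and example values are all hypothetical, not an artifact defined by NASA-STD-7009:

```python
from dataclasses import dataclass, field

@dataclass
class MSCharacterization:
    """Hypothetical record of the M&S characterization items listed above."""
    question: str                                          # what the M&S helps answer
    decisions: list[str] = field(default_factory=list)     # decisions based on M&S results
    consequences: str = ""                                 # consequence of erroneous outputs
    limitations: list[str] = field(default_factory=list)   # known constraints of the M&S

# Invented example for a thermal model used in qualification.
thermal_model = MSCharacterization(
    question="Will the avionics box stay below 70 C during ascent?",
    decisions=["Approve thermal design for CDR"],
    consequences="Undetected overheating could cause loss of mission data",
    limitations=["Validated only for 0-100 C ambient range"],
)
print(thermal_model)
```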
Information for the V&V of models, simulations, and tools used to develop software code can be found in NASA-STD-7009, Standard for Models and Simulations, 272 and NPR 7150.2. Center requirements and associated processes are flowed down from these two documents to address numerical accuracy, uncertainty analysis, sensitivity analysis, and V&V of models and simulations.
The use of NASA-STD-7009 in fulfilling the requirements of NPR 7150.2 is described in Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009. That topic also provides a list of additional resources for V&V of models and simulations used in the software development life cycle.
3.2 Verification of Models and Simulations
Per section 4.4 of NASA-STD-7009, 272 the following basic activities are to be performed for the verification of models and simulations:
Verification Steps
- Document verification techniques and the conditions under which the verification was conducted.
- Document any numerical error estimates for the results of the computational model.
- Document the verification status.
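As a minimal sketch of what documenting a numerical error estimate can look like in practice, the example below compares a toy computational model against a known analytical solution. The model, time step, and reporting format are invented for illustration and are not a NASA-STD-7009 procedure:

```python
def simulated_free_fall(t: float, dt: float = 0.01) -> float:
    """Toy computational model: integrate dv/dt = g with explicit Euler."""
    g, v, x = 9.81, 0.0, 0.0
    for _ in range(int(t / dt)):
        x += v * dt
        v += g * dt
    return x

def analytical_free_fall(t: float) -> float:
    """Closed-form solution used as the verification reference."""
    return 0.5 * 9.81 * t ** 2

t = 10.0
computed = simulated_free_fall(t)
exact = analytical_free_fall(t)
rel_error = abs(computed - exact) / exact
# Document the technique, the conditions, and the numerical error estimate.
print(f"Euler integration, dt=0.01 s, t={t} s: relative error = {rel_error:.2e}")
```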
3.3 Validation of Models and Simulations
Per section 4.4 of NASA-STD-7009, 272 the following basic activities are to be performed for the validation of models and simulations:
Validation Steps
- Document the techniques used to validate the models and simulations for their intended use.
- Document the conditions under which the validation was conducted.
- Document any validation metrics and any validation data set used.
- Document any studies conducted.
- Document validation results.
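As a minimal sketch of documenting a validation metric against a validation data set, the example below computes a root-mean-square error between hypothetical model predictions and test measurements. The data, metric choice, and acceptance threshold are assumptions for illustration only:

```python
import math

# Hypothetical validation data set: measured temperatures (deg C) from a
# flight-like thermal vacuum test, paired with model predictions.
measured = [21.0, 34.5, 48.2, 60.1, 69.8]
predicted = [20.4, 35.2, 47.1, 61.0, 71.2]

# Validation metric: root-mean-square error between model and test data.
rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured))

# The acceptance threshold would come from project requirements; 2.0 C is assumed here.
threshold = 2.0
print(f"RMSE = {rmse:.2f} C -> {'PASS' if rmse <= threshold else 'FAIL'} (threshold {threshold} C)")
```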
3.4 Reporting
To better understand the uncertainties affecting the results of the models and simulations, follow the required steps in section 4.4 of NASA-STD-7009 and the steps for assessing and reporting the credibility of model and simulation results found in sections 4.7 and 4.8 of NASA-STD-7009. 272
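The credibility assessment in NASA-STD-7009 scores a set of factors on a 0 to 4 scale. The sketch below records such scores and flags the weakest factor, since overall credibility is limited by it. The factor names follow the standard's credibility assessment scale, but consult sections 4.7 and 4.8 of NASA-STD-7009 272 for the authoritative definitions; the scores shown are invented:

```python
# Hypothetical scoring sheet for a credibility assessment per NASA-STD-7009.
cas_scores = {
    "Verification": 3,
    "Validation": 2,
    "Input Pedigree": 3,
    "Results Uncertainty": 2,
    "Results Robustness": 1,
    "Use History": 2,
    "M&S Management": 3,
    "People Qualifications": 3,
}

# Flag the weakest factor for the credibility report.
weakest = min(cas_scores, key=cas_scores.get)
print(f"Lowest-scoring factor: {weakest} = {cas_scores[weakest]} (scale 0-4)")
```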
3.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.6 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
Small projects may choose to lighten their verification and validation (V&V) and accreditation effort by using software models, simulations, and analysis tools that were verified, validated, and accredited by other NASA projects that used them in a similar manner and for a similar purpose. To determine whether this option is relevant and useful, small projects need to understand the differences between the two projects, the prior project's V&V and accreditation activities, and the versions of the models, simulations, and analysis tools on which those activities were performed.
If the project plans to use software models, simulations, and analysis tools verified, validated, and accredited by other NASA projects, it should pay particular attention to the key points listed below.
Some other considerations:
- When models are used to design, develop, analyze, test, or maintain software systems, ensure that:
- The models are kept up to date and configuration managed, incorporating changes, as needed
- The models incorporate the requirements, constraints, and design features
- Inputs are correct and complete
- The models are testable, tested, and accredited
- When using test tools, simulations, models, and environments, check:
- Use of up-to-date versions and licenses on purchased tools
- Do the models have the level of fidelity and completeness required to evaluate the intended requirements and functionality?
- Have any concerns or risks been recorded and resolved?
- Are tools, models, and simulators being operated within the parameters/limitations of their capabilities?
- Have any operations of the software system or supporting tools, models, or simulators outside known boundaries, parameters, or limitations been documented, and have the risks been entered into the risk management system?
- Are the results as expected or can they be explained?
- Is there a report on the limits, functioning, and results of all simulators, models, and tools provided along with an analysis of the level of certainty/trust from outcomes, including any concerns, risks, or issues?
Key Points
- Simulations/emulations must be kept in sync with hardware/software updates
- Unidentified coding or logic errors in simulators/emulators used by the programs could lead to flight software errors or incorrect flight software.
- Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use.
- Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify that program-to-program interfaces are correctly documented and support the hazard-cause risk ranking.
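As an illustration of the version-awareness points above, the sketch below flags tools whose installed versions differ from the versions on which a prior project's V&V and accreditation were performed. All tool names and versions are hypothetical:

```python
# Hypothetical check for reusing another project's accreditation: flag any
# tool whose installed version differs from the version that was actually
# verified, validated, and accredited.
accredited_baseline = {
    "orbit_sim": "4.2.1",
    "thermal_model": "2.0.0",
    "code_coverage_tool": "9.1",
}
installed = {
    "orbit_sim": "4.2.1",
    "thermal_model": "2.1.0",   # differs: the prior V&V may not apply
    "code_coverage_tool": "9.1",
}

for tool, accredited_version in accredited_baseline.items():
    current = installed.get(tool, "missing")
    if current != accredited_version:
        print(f"RE-ASSESS {tool}: accredited {accredited_version}, installed {current}")
```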
5. Resources
5.1 References
- (SWEREF-175) Department of Defense, MIL-STD-3022, Documentation of Verification, Validation, and Accreditation (VV&A) for Models and Simulations, 2008.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-272) NASA-STD-7009A with Change 1 (Administrative/Editorial Changes), Standard for Models and Simulations, 2016-12-07.
- (SWEREF-623) Cook, David A., and Skinner, James M., The AEgis Technologies Group, Inc., "How to Perform Credible Verification, Validation, and Accreditation for Modeling and Simulations."
5.2 Tools
NASA users can find tools supporting this requirement in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
Lessons learned for the verification and validation of models and simulations can be found in Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009 and in NASA-STD-7009 272 itself.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that any software models, simulations, and analysis tools used for the qualification of flight software or flight equipment have been validated and accredited.
2. Confirm that tools used for the qualification of flight software or flight equipment have been properly calibrated before use, if applicable.
7.2 Software Assurance Products
- None at this time.
Objective Evidence
- Validation and accreditation criteria for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
- Software test results for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
- Use of NASA-STD-7009 for accreditation of software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
7.3 Metrics
- # of Non-Conformances identified in models, simulations, and tools over time (Open, Closed, Severity)
See also Topic 8.18 - SA Suggested Metrics.
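As a sketch of how this metric might be tallied from non-conformance records, the example below counts open and closed non-conformances per month. The record format and data are invented for illustration:

```python
from collections import Counter

# Hypothetical non-conformance records for models, simulations, and tools.
# Each record: (month, status, severity).
ncs = [
    ("2024-01", "Open", "Major"),
    ("2024-01", "Closed", "Minor"),
    ("2024-02", "Open", "Minor"),
    ("2024-02", "Closed", "Major"),
    ("2024-03", "Closed", "Minor"),
]

# Count non-conformances per month by status, as the metric suggests.
trend = Counter((month, status) for month, status, _ in ncs)
for (month, status), count in sorted(trend.items()):
    print(f"{month}: {status} = {count}")
```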
7.4 Guidance
Task 1:
For this requirement, software assurance needs to confirm that any software models, simulations, and analysis tools used for the qualification of flight software or flight equipment have been validated and accredited. To be accredited means the models, simulations, and analysis tools have been officially certified as acceptable for the specific use for which they are intended.
Performing V&V to approve software models, simulations, analysis tools, and software tools is important to ensure the credibility of the results those tools produce and to understand how each tool directly affects the software being developed. Critical decisions may be made, at least in part, based on the results produced by these tools. Reducing the risk associated with these decisions is one reason to ensure that the approved tools have been properly verified and validated. Information regarding specific V&V techniques and the analysis of models and simulations not covered here can be found in NASA-STD-7009 272, Standard for Models and Simulations.
The first step in confirming the validation and accreditation of the models, simulations, and analysis tools used for the qualification of flight software or flight equipment is to obtain a list of those being used. Then check that the project has verified, validated, and approved them for use in the development, analysis, testing, or maintenance of the flight software or flight equipment.
Examples of software tools that directly affect software code development include, but are not limited to, the compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools used by a project. Another example is a linked code library, which would typically be validated and accredited before use in the executable code.
There is information in the software guidance section of this requirement on validating and accrediting these tools. Another reference is “How to Perform Credible Verification, Validation, and Accreditation for Modeling and Simulations” by Dr. David Cook and Dr. James Skinner 623.
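As a sketch of the confirmation described above, the example below audits a hypothetical tool list for recorded verification, validation, and accreditation evidence. The record format and tool names are assumptions, not a prescribed software assurance artifact:

```python
# Hypothetical audit: confirm each tool on the project's list has recorded
# verification, validation, and accreditation evidence before it is used
# for qualification.
tool_records = [
    {"name": "flight_sim", "verified": True, "validated": True, "accredited": True},
    {"name": "code_gen", "verified": True, "validated": False, "accredited": False},
]

for rec in tool_records:
    missing = [step for step in ("verified", "validated", "accredited") if not rec[step]]
    if missing:
        print(f"FINDING: {rec['name']} missing evidence for: {', '.join(missing)}")
```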
Task 2:
Software assurance also needs to confirm that tools used for the qualification of flight software or flight equipment have been properly calibrated before use, if applicable. Generally, tools that need calibration carry a calibration tag that states the last calibration date and the expiration of that calibration (the date when recalibration is required). If a tool is found to be out of calibration before a test, this should be brought to the attention of the test director or project manager, and the test should be delayed until the tool is calibrated or replaced with a calibrated tool.
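As a minimal sketch of such a calibration check, the example below flags tools whose calibration has expired before a planned test date. The log format, tool names, and dates are invented for illustration:

```python
from datetime import date

# Hypothetical calibration log: tool name -> calibration expiration date,
# mirroring the calibration tag described above.
calibration_log = {
    "logic_analyzer": date(2025, 3, 1),
    "thermal_chamber": date(2024, 11, 15),
}

test_date = date(2025, 1, 10)
for tool, expires in calibration_log.items():
    if expires < test_date:
        # Per the guidance above: raise to the test director / project manager
        # and delay the test until the tool is recalibrated or replaced.
        print(f"OUT OF CALIBRATION: {tool} expired {expires.isoformat()}")
```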
If the software tools (e.g., simulators, models, simulations, emulators, compiler libraries, built-in memory checkers, materials analysis, trajectory analysis) have an impact on the safety of the system, then determine whether they are correctly applied and used, whether they are operated within the range of their limitations, whether the results are documented, etc. Typically, once a tool is determined to have safety implications, an assessment is performed of the severity of impact if the tool produces one or more wrong outputs; that severity, together with the likelihood of occurrence, determines the level of V&V effort and the hazard mitigations to employ.
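As a sketch of the severity-and-likelihood assessment just described, the example below maps a simple severity times likelihood score to a level of V&V effort. The scales, thresholds, and effort levels are invented; in practice they would come from the project's risk management process:

```python
# Hypothetical severity x likelihood ranking for a tool with safety impact.
def vv_effort(severity: int, likelihood: int) -> str:
    """severity and likelihood on a 1 (low) to 5 (high) scale."""
    score = severity * likelihood
    if score >= 15:
        return "full independent V&V plus hazard mitigations"
    if score >= 6:
        return "targeted V&V of safety-relevant functions"
    return "nominal V&V"

print(vv_effort(severity=4, likelihood=3))  # -> targeted V&V of safety-relevant functions
```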
Software assurance may want to check on the following other considerations for software models, simulations, and analysis tools:
- When models are used to design, develop, analyze, test, or maintain software systems, ensure that:
- The models are kept up to date and configuration managed, incorporating changes, as needed
- The models incorporate the requirements, constraints, and design features
- Inputs are correct and complete
- The models are testable, tested, and accredited
- When using test tools, simulations, models, and environments, check:
- Use of up-to-date versions and licenses on purchased tools
- Do the models have the level of fidelity and completeness required to evaluate the intended requirements and functionality?
- Have any concerns or risks been recorded and resolved?
- Are tools, models, and simulators being operated within the parameters/limitations of their capabilities?
- Have any operations of the software system or supporting tools, models, or simulators outside known boundaries, parameters, or limitations been documented, and have the risks been entered into the risk management system?
- Are the results as expected or can they be explained?
- Is there a report on the limits, functioning, and results of all simulators, models, and tools provided along with an analysis of the level of certainty/trust from outcomes, including any concerns, risks, or issues?
Key points:
- Simulations/emulations must be kept in sync with hardware/software updates
- Unidentified coding or logic errors in simulators/emulators used by the programs could lead to flight software errors or incorrect flight software.
- Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use.
- Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify that program-to-program interfaces are correctly documented and support the hazard-cause risk ranking.
For auto-generated code see Topic 8.11 - Auto-Generated Code.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: