

SWE-070 - Models, Simulations, Tools

1. Requirements

4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

1.1 Notes

Information regarding specific V&V techniques and the analysis of models and simulations can be found in NASA-STD-7009, Standard for Models and Simulations, NASA-HDBK-7009, Handbook for Models and Simulations, or discipline-specific recommended practice guides.

1.2 History

SWE-070 - Last used in rev NPR 7150.2D

Rev  SWE Statement

A    3.4.6 The project shall verify, validate, and accredit software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

Difference between A and B: Changed project's responsibility to verify, validate, and accredit to just use ones that have been validated and accredited.

B    4.5.7 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

Difference between B and C: No change.

C    4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

Difference between C and D: No change.

D    4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.



1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?

Key:  - Applicable |  - Not Applicable


2. Rationale

Performing verification and validation (V&V) to accredit software models, simulations, and analysis tools is important to ensure the credibility of the results produced by those tools. Critical decisions may be made, at least in part, based on the results produced by models, simulations, and analysis tools. Reducing the risk associated with these decisions is one reason to use accredited tools that have been properly verified and validated.

See also SWE-066 - Perform Testing

3. Guidance

When using a model, there are many ways an analysis can produce the wrong result: a model can be incorrect for the scenario being analyzed, software can incorrectly implement a model, a user can incorrectly operate software, etc.  Thus, the model, the software and hardware, and the processes to operate the model need appropriate V&V.

3.1 Accrediting Models and Simulations

The processes of V&V are key activities for accrediting all types of models and simulations. To provide a basis for determining whether the models, simulations, and analysis tools are acceptable for use for a specific purpose, capture and address the following information from the modeling and simulation (M&S) characterizations as part of this process (one way to record this information is sketched after the list):

  1. The question(s) to be answered and the particular aspects of the problem that the M&S will be used to help address.
  2. The decisions that will be made based on the M&S results.
  3. The consequences resulting from erroneous M&S outputs. 175
  4. The constraints and limitations of the M&S (i.e., what it can and cannot do).
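
Neither NPR 7150.2 nor NASA-STD-7009 prescribes a format for recording this characterization. The sketch below (Python) is one minimal, hypothetical way a project might capture the items above for review during accreditation; the class and field names, and the example content, are illustrative assumptions rather than terms from the standards.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MsCharacterization:
    """Hypothetical record of the M&S characterization items listed above."""
    ms_name: str                      # model, simulation, or analysis tool
    questions_addressed: List[str]    # question(s) the M&S helps answer
    decisions_supported: List[str]    # decisions that will be based on M&S results
    consequences_of_error: str        # consequence of erroneous M&S outputs
    known_limitations: List[str] = field(default_factory=list)  # what the M&S cannot do

# Example entry (illustrative content only)
characterization = MsCharacterization(
    ms_name="Thermal balance simulation v3.2",
    questions_addressed=["Will the avionics box stay within qualification temperature limits?"],
    decisions_supported=["Accept or redesign the radiator sizing"],
    consequences_of_error="Under-predicted temperatures could mask an over-temperature hazard",
    known_limitations=["Steady-state only; transient heat-up during safe mode not modeled"],
)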

Information for the V&V of models, simulations, and tools used to develop software code can be found in NASA-STD-7009, Standard for Models and Simulations, 272 and NPR 7150.2. Center requirements and associated processes are flowed down from these two documents to address numerical accuracy, uncertainty analysis, sensitivity analysis, and V&V of models and simulations.

The use of NASA-STD-7009 in fulfilling the requirements of NPR 7150.2 is described in topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009. The Topic also provides a list of additional resources for V&V of models and simulations used in the software development life cycle.

3.2 Verification of Models and Simulations

Per section 4.4 of NASA-STD-7009, 272 the following basic activities are to be performed for the verification of models and simulations (one way to document such a result is sketched after the list):

Verification Steps

  • Document verification techniques and the conditions under which the verification was conducted.
  • Document any numerical error estimates for the results of the computational model.
  • Document the verification status.
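
NASA-STD-7009 leaves the verification techniques and the documentation mechanism to the project. The following is a minimal sketch, assuming a simple benchmark against a known analytic solution (exponential decay) as the verification technique; the toy model, the fixed-step integration, and the pass/fail tolerance are assumptions made only for illustration.

import math

def simulate_decay(y0: float, rate: float, t_end: float, dt: float) -> float:
    """Toy computational model: explicit-Euler integration of dy/dt = -rate * y."""
    y, t = y0, 0.0
    while t < t_end:
        y += dt * (-rate * y)
        t += dt
    return y

def verification_record(dt: float) -> dict:
    """Compare the toy model against its analytic solution and document the result."""
    y0, rate, t_end = 1.0, 0.5, 10.0
    computed = simulate_decay(y0, rate, t_end, dt)
    analytic = y0 * math.exp(-rate * t_end)
    rel_error = abs(computed - analytic) / abs(analytic)
    return {
        "technique": "comparison with analytic solution (exponential-decay benchmark)",
        "conditions": f"explicit Euler, dt={dt}, t_end={t_end}",
        "numerical_error_estimate": rel_error,
        "verification_status": "pass" if rel_error < 1e-2 else "fail",  # assumed tolerance
    }

print(verification_record(dt=0.001))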

3.3 Validation of Models and Simulations

Per section 4.4 of NASA-STD-7009, 272 the following basic activities are to be performed for the validation of models and simulations (one way to document a validation result is sketched after the list):

Validation Steps

  • Document the techniques used to validate the models and simulations for their intended use.
  • Document the conditions under which the validation was conducted.

  • Document any validation metrics and any validation data set used.
  • Document any studies conducted.
  • Document validation results.
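
Similarly, the choice of validation metric and referent data is left to the project. A minimal sketch follows, assuming the validation referent is a small set of measurements from flight-like hardware testing and a root-mean-square (RMS) error is used as the validation metric; the data values and acceptance threshold are purely illustrative.

import math

def rms_error(predicted, measured):
    """Root-mean-square difference between model predictions and the validation data set."""
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(measured))

# Illustrative data: predicted vs. measured temperatures (deg C) from a flight-like thermal test
predicted = [41.2, 45.8, 50.1, 54.6]
measured = [40.5, 46.3, 51.0, 55.4]

metric = rms_error(predicted, measured)
validation_record = {
    "technique": "comparison with flight-like hardware thermal test data",
    "conditions": "hot-case steady state, 4 sensor locations",
    "validation_metric": f"RMS error = {metric:.2f} deg C",
    "data_set": "thermal test run 7 (illustrative)",
    "result": "acceptable" if metric <= 1.5 else "unacceptable",  # assumed acceptance threshold
}
print(validation_record)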

3.4 Reporting

To better understand the uncertainties affecting the results of the models and simulations, follow the required steps in section 4.4 of NASA-STD-7009 and the steps for assessing and reporting the credibility of model and simulation results found in sections 4.7 and 4.8 of NASA-STD-7009. 272
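
One common way to characterize how input uncertainty affects simulation results is Monte Carlo propagation. The sketch below is a hedged illustration only: the toy model, the assumed input distribution, and the reported statistics are stand-ins, not steps required by NASA-STD-7009.

import random
import statistics

def toy_model(drag_coefficient: float) -> float:
    """Stand-in for a simulation output that depends on one uncertain input."""
    return 100.0 / drag_coefficient

def propagate_uncertainty(n_samples: int = 10_000, seed: int = 1) -> dict:
    """Monte Carlo propagation of an assumed 10% (1-sigma) input uncertainty."""
    rng = random.Random(seed)
    outputs = sorted(toy_model(rng.gauss(2.0, 0.2)) for _ in range(n_samples))
    return {
        "mean": statistics.mean(outputs),
        "std_dev": statistics.stdev(outputs),
        "5th_95th_percentile": (outputs[int(0.05 * n_samples)], outputs[int(0.95 * n_samples)]),
    }

print(propagate_uncertainty())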

Understanding the uncertainties affecting the results

Key points:

  1. Simulations/emulations must be kept in sync with hardware/software updates
  2. Unidentified coding or logic errors in simulators/emulators used by the programs could lead to flight software errors or incorrect flight software.
  3. Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use. 
  4. Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify that program-to-program interfaces are correctly documented and support the hazard-cause risk ranking.

3.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.6 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

Small projects may choose to lighten their Verification and Validation (V&V) and accreditation requirements through the use of software models, simulations, and analysis tools verified, validated, and accredited by other NASA projects that used these tools in a similar manner and for a similar purpose. To determine the relevance and usefulness of this option, small projects need to be aware of the differences between the projects, the prior project's V&V and accreditation activities, and the versions of the models, simulations, and analysis tools on which the V&V and accreditation activities were performed.
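
One concrete check that follows from this is confirming that the tool versions the small project actually uses match the versions covered by the other project's V&V and accreditation. The sketch below illustrates that comparison; the record layout, tool names, and version strings are hypothetical.

# Versions covered by the prior project's V&V and accreditation (illustrative)
accredited_versions = {
    "orbit_sim": "4.2.1",
    "thermal_model": "2.0",
    "coverage_tool": "9.1",
}

# Versions the small project actually plans to use (illustrative)
versions_in_use = {
    "orbit_sim": "4.2.1",
    "thermal_model": "2.3",   # newer than the accredited version
    "coverage_tool": "9.1",
}

for tool, version in versions_in_use.items():
    accredited = accredited_versions.get(tool)
    if accredited is None:
        print(f"{tool}: no prior accreditation found; full V&V and accreditation needed")
    elif accredited != version:
        print(f"{tool}: version {version} differs from accredited {accredited}; reassess before relying on prior V&V")
    else:
        print(f"{tool}: version matches prior accreditation ({accredited})")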

If the project plans to use software models, simulations, and analysis tools verified, validated, and accredited by other NASA projects, they should pay particular attention to the Key points listed below. 

Some other considerations: 

  • When models are used to design, develop, analyze, test, or maintain software systems, ensure that:
    • The models are kept up to date and configuration managed, incorporating changes as needed
    • The models incorporate the requirements, constraints, and design features
    • Inputs are correct and complete
    • The models are testable, tested, and accredited
  • When using test tools, simulations, models, and environments, check:
    • Are up-to-date versions and licenses being used on purchased tools?
    • Do the models have the level of fidelity and completeness required to determine the required requirements and functionality?
    • Have any concerns or risks been recorded and resolved?
    • Are tools, models, and simulators being operated within the parameters/limitations of their capabilities?
  • Have any operations of the software system or supporting tools, models, or simulators outside known boundaries, parameters, or limitations been documented, and have the associated risks been input to the risk management system?
  • Are the results as expected, or can they be explained?
  • Is a report on the limits, functioning, and results of all simulators, models, and tools provided, along with an analysis of the level of certainty/trust in the outcomes, including any concerns, risks, or issues?


Key Points

  1. Simulations/emulations must be kept in sync with hardware/software updates 
  2. Unidentified coding or logic errors in simulators/emulators used by the programs could lead to flight software errors or incorrect flight software. 
  3. Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use.  
  4. Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify that program-to-program interfaces are correctly documented and support the hazard-cause risk ranking. 

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

Topic 7.15 - Relationship Between NPR 7150.2 and NASA-STD-7009, and NASA-STD-7009 272, includes lessons learned for the verification and validation of models and simulations.

7. Software Assurance

SWE-070 - Models, Simulations, Tools
4.5.6 The project manager shall use validated and accredited software models, simulations, and analysis tools required to perform qualification of flight software or flight equipment.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that the software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment have been validated and accredited.

7.2 Software Assurance Products

  • None at this time.


    Objective Evidence

    • Validation and accreditation criteria for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
    • Software test results for software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.
    • Use of NASA-STD-7009 for accreditation of software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence, as are the examples listed above.

7.3 Metrics

  •  # of Non-Conformances identified in models, simulations, and tools over time (Open, Closed, Severity)
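
A minimal sketch of how this metric might be tallied from a non-conformance tracking export is shown below; the record fields, status values, and severities are assumptions about a hypothetical tracking system, not a prescribed format.

from collections import Counter

# Illustrative export of non-conformances logged against models, simulations, and tools
non_conformances = [
    {"id": "NC-101", "month": "2024-01", "status": "Open",   "severity": "Major"},
    {"id": "NC-102", "month": "2024-01", "status": "Closed", "severity": "Minor"},
    {"id": "NC-107", "month": "2024-02", "status": "Closed", "severity": "Major"},
]

by_month = Counter(nc["month"] for nc in non_conformances)
by_status = Counter(nc["status"] for nc in non_conformances)
by_severity = Counter(nc["severity"] for nc in non_conformances)

print("Non-conformances over time:", dict(by_month))
print("By status:", dict(by_status))
print("By severity:", dict(by_severity))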

See also Topic 8.18 - SA Suggested Metrics

7.4 Guidance

Task 1:

For this requirement, software assurance needs to confirm that any software models, simulations, and analysis tools used for the qualification of flight software or flight equipment have been validated and accredited. To be accredited means the models, simulations, and analysis tools have been officially certified as acceptable for the specific use for which they are intended.

Performing V&V to approve SW models, simulations, analysis tools, and software tools is important to ensure the credibility of those tools' results and to understand how the tools directly affect the SW being developed. Critical decisions may be made, at least in part, based on the results produced by these tools. Reducing the risk associated with these decisions is one reason to ensure that the approved tools have been properly verified and validated. Information regarding specific V&V techniques and the analysis of models and simulations not covered here can be found in NASA-STD-7009 272, Standard for Models and Simulations.

The first step in confirming the validation and accreditation of the models, simulations, and analysis tools used for the qualification of flight software or flight equipment is to obtain a list of those being used. Then check to see whether the project has verified, validated, and approved them for their use in the development, analysis, testing, or maintenance of the flight software or flight equipment.

Examples of SW tools that directly affect the SW code development can include, but are not limited to, the compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools used by a Project. Another example would be the Project's use of a linked code library, which would typically be validated and accredited before use in the executable code.
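
As a hedged illustration of this cross-check, the sketch below compares a project's tool inventory against its validation/accreditation records and flags any tool used for qualification without a completed record; the tool names and record layout are illustrative assumptions.

# Tools the project reports using for qualification (illustrative inventory)
tools_in_use = ["gcc 12.2", "gcov 12.2", "build_system 3.1", "traj_sim 5.0"]

# Tools with completed validation and accreditation records (illustrative)
accredited_tools = {"gcc 12.2", "gcov 12.2", "traj_sim 5.0"}

missing = [tool for tool in tools_in_use if tool not in accredited_tools]
if missing:
    print("Not yet validated/accredited:", ", ".join(missing))
else:
    print("All qualification tools have validation and accreditation records.")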

There is information in the software guidance section of this requirement on validating and accrediting these tools. Another reference is “How to Perform Credible Verification, Validation, and Accreditation for Modeling and Simulations” by Dr. David Cook and Dr. James Skinner 623.

Task 2:

Software assurance also needs to confirm that any tools or equipment used in the qualification of flight software or flight equipment have been properly calibrated before use, if applicable. Generally, tools that need calibration have a calibration tag that states the last calibration date and the expiration of that calibration (the date when recalibration is required). If tools are found to be out of calibration before use in a test, this should be brought to the attention of the test director or project manager, and the test should be delayed until the tool is calibrated or replaced with a calibrated tool.
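
A minimal sketch of this calibration check, comparing each tool's calibration-tag expiration date against the planned test date, is shown below; the tool list and dates are hypothetical.

from datetime import date

# Calibration-tag data for tools planned for the test (illustrative)
calibration_tags = {
    "signal_generator_SN123": {"last_calibrated": date(2024, 3, 1), "expires": date(2025, 3, 1)},
    "power_meter_SN456":      {"last_calibrated": date(2023, 1, 15), "expires": date(2024, 1, 15)},
}

test_date = date(2024, 6, 10)

for tool, tag in calibration_tags.items():
    if tag["expires"] < test_date:
        print(f"{tool}: calibration expired {tag['expires']}; notify the test director before proceeding")
    else:
        print(f"{tool}: calibration valid through {tag['expires']}")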

If the software tools (e.g., simulators, models, simulations, emulators, compiler libraries, built-in memory checkers, materials analysis, trajectory analysis) have an impact on the safety of the system, then determine whether they are correctly applied and used, whether they are operated within the range of their limitations, whether the results are documented, etc. Typically, once it is determined that a tool has safety implications, an assessment is performed of the severity of the impact if the tool provides one or more wrong outputs; that severity, together with the likelihood of the error occurring, determines the level of effort or hazard mitigations to employ.
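
The sketch below shows one hypothetical way to combine the severity of an erroneous tool output with the likelihood of occurrence into a coarse ranking that drives the level of V&V effort or hazard mitigation; the scales and thresholds are illustrative assumptions, not an Agency-defined risk matrix.

SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}

def vv_effort(severity: str, likelihood: str) -> str:
    """Map a (severity, likelihood) pair for erroneous tool output to a V&V effort level."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 9:
        return "high: independent V&V of the tool plus hazard mitigations"
    if score >= 4:
        return "moderate: targeted verification and documented limitations"
    return "low: basic acceptance checks"

print(vv_effort("critical", "remote"))          # 3 * 2 = 6 -> moderate
print(vv_effort("catastrophic", "occasional"))  # 4 * 3 = 12 -> high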

Software assurance may want to check on the following other considerations for software models, simulations, and analysis tools:

  • When models are used to design, develop, analyze, test, or maintain software systems, ensure that:
    • The models are kept up to date and configuration managed, incorporating changes as needed
    • The models incorporate the requirements, constraints, and design features
    • Inputs are correct and complete
    • The models are testable, tested, and accredited
  • When using test tools, simulations, models, and environments, check:
    • Are up-to-date versions and licenses being used on purchased tools?
    • Do the models have the level of fidelity and completeness required to determine the required requirements and functionality?
    • Have any concerns or risks been recorded and resolved?
    • Are tools, models, and simulators being operated within the parameters/limitations of their capabilities?
  • Have any operations of the software system or supporting tools, models, or simulators outside known boundaries, parameters, or limitations been documented, and have the associated risks been input to the risk management system?
  • Are the results as expected, or can they be explained?
  • Is a report on the limits, functioning, and results of all simulators, models, and tools provided, along with an analysis of the level of certainty/trust in the outcomes, including any concerns, risks, or issues?

Key points:

  1. Simulations/emulations must be kept in sync with hardware/software updates
  2. Unidentified coding or logic errors in simulators/emulators used by the programs could lead to flight software errors or incorrect flight software.
  3. Projects should verify that their simulators/emulators have been tested and validated against flight or flight-like hardware before use.
  4. Projects should independently assess the accuracy of simulator/emulator design and timing controls to verify that program-to-program interfaces are correctly documented and support the hazard-cause risk ranking.

For auto-generated code see Topic 8.11 - Auto-Generated Code

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

