SWE-068 - Evaluate Test Results
1. Requirements
3.4.4 The project shall evaluate test results and document the evaluation.

1.1 Implementation Notes from Appendix D

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 Applicability Across Classes
Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class        A_SC  A_NSC  B_SC  B_NSC  C_SC  C_NSC  D_SC  D_NSC  E_SC  E_NSC  F  G     H
Applicable?  X     X      X     X      X     X      X     X      X     X      X  P(C)  X

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures | (blank) - Not Applicable

2. Rationale

Test results are the basis for confirming that the team has fulfilled the software requirements in the resulting software product. To make such decisions, test results must be reviewed and evaluated using a documented, repeatable process. The team can derive quality conclusions by capturing the actual test results, comparing them to expected results, analyzing those results against pre-established criteria, and documenting that analysis/evaluation process. It is important to document and retain the elements used to generate and analyze the results for future regression testing and related test results analysis.
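As an illustration of that compare-and-document loop, the sketch below captures actual results, checks them against expected values using a pre-established tolerance criterion, and writes a dated evaluation record. It is a minimal sketch only; the test names, values, tolerance, and output file name are assumptions, not part of the requirement.

```python
# Illustrative sketch only: compares captured actual results against expected
# results using a pre-established criterion, and writes a dated evaluation
# record. All test names, values, and the tolerance are hypothetical.
import json
import math
from datetime import datetime, timezone

TOLERANCE = 1e-6  # pre-established acceptance criterion (assumed)

expected = {"thruster_duty_cycle": 0.25, "sensor_poll_rate_hz": 50.0}
actual = {"thruster_duty_cycle": 0.25, "sensor_poll_rate_hz": 49.0}

evaluation = []
for test_id, want in expected.items():
    got = actual.get(test_id)
    passed = got is not None and math.isclose(got, want, abs_tol=TOLERANCE)
    evaluation.append({
        "test_id": test_id,
        "expected": want,
        "actual": got,
        "criterion": f"abs_tol={TOLERANCE}",
        "verdict": "pass" if passed else "fail",
    })

record = {
    "evaluated_at": datetime.now(timezone.utc).isoformat(),
    "results": evaluation,
}
# Retain the evaluation itself, not just the raw results, for later audits.
with open("test_evaluation.json", "w") as fh:
    json.dump(record, fh, indent=2)
```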
3. Guidance

Per NASA-STD-8719.13, NASA Software Safety Standard [271], and NASA-GB-8719.13, NASA Software Safety Guidebook [276], the analysis methodology for software and system test results includes the following steps:

The following from IEEE-STD-1012-2004, IEEE Standard for Software Verification and Validation [209], are also appropriate considerations when developing a test results analysis methodology:

Other elements for the evaluation methodology include:

For all levels of software testing (unit, component, integration, etc.), capture and document the items used to generate and collect the results. These items are an important part of analyzing the test results, since some anomalies could have been caused by the tests themselves. The following are captured, not only for results analysis but also for future regression testing (a sketch of such capture appears below):
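The snippet below is a minimal sketch of recording that provenance, assuming a Python-based test setup: it stores the platform, interpreter version, and content hashes of the test artifacts alongside the results, so that a later anomaly can be traced to, or ruled out as, a change in the tests themselves. The file names are illustrative assumptions.

```python
# Illustrative sketch only: records the environment and artifacts used to
# generate test results so later analysis can rule the tests themselves in
# or out as the source of an anomaly. File names are hypothetical.
import hashlib
import json
import platform
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Fingerprint a test artifact so future regression runs can confirm
    they use the same script/data that produced the baseline results."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

artifacts = [Path("run_tests.py"), Path("input_data.csv")]  # assumed names

provenance = {
    "python_version": sys.version,
    "os": platform.platform(),
    "artifacts": {str(p): sha256_of(p) for p in artifacts if p.exists()},
}

# Store the provenance record next to the test results it describes.
with open("test_provenance.json", "w") as fh:
    json.dump(provenance, fh, indent=2)
```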
In addition to the information used to generate test results, the following may be important inputs to the results analysis:

When performing the actual test results analysis/evaluation, consider the following practices [047]:

When documenting the outcome of the analysis, important items to include are:
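One way to keep such documentation uniform across test campaigns is a structured record. The dataclass below is a hedged sketch: its fields (requirement and test identifiers, expected versus actual results, the criterion applied, verdict, anomalies, evaluator, and date) are illustrative assumptions, not the Handbook's own list, and the sample values are hypothetical.

```python
# Illustrative sketch only: one possible structure for a documented test
# results evaluation. Field names are hypothetical, not the Handbook's list.
from dataclasses import dataclass, field, asdict

@dataclass
class TestEvaluationRecord:
    requirement_id: str          # requirement the test verifies
    test_case_id: str            # identifier of the executed test case
    expected_result: str         # what the test procedure said should happen
    actual_result: str           # what was actually observed
    criteria: str                # pre-established pass/fail criterion applied
    verdict: str                 # "pass", "fail", or "inconclusive"
    anomalies: list[str] = field(default_factory=list)  # discrepancy reports
    evaluator: str = ""          # who performed the evaluation
    evaluated_on: str = ""       # ISO-8601 date of the evaluation

# Hypothetical usage with made-up identifiers and values.
record = TestEvaluationRecord(
    requirement_id="SRS-101",
    test_case_id="TC-042",
    expected_result="telemetry packet every 1.0 s",
    actual_result="telemetry packet every 1.0 s",
    criteria="period within +/-0.1 s over 100 samples",
    verdict="pass",
    evaluator="J. Doe",
    evaluated_on="2015-06-01",
)
print(asdict(record))
```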
Additional guidance related to software test results may be found in the following related requirements in this Handbook:

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources
5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following: