SWE-118 - Software Test Report
1. Requirements

5.3.2.1 The Software Test Report shall include:

a. Overview of the test results:
(1) Overall evaluation of the software as shown by the test results.
(2) Remaining deficiencies, limitations, or constraints detected by testing (e.g., including description of the impact on software and system performance, the impact a correction would have on software and system design, and recommendations for correcting the deficiency, limitation, or constraint).
(3) Impact of test environment.
b. Detailed test results:
(1) Project-unique identifier of a test and test procedure(s).
(2) Summary of test results (e.g., including requirements verified).
(3) Problems encountered.
(4) Deviations from test cases/procedures.
c. Test log:
(1) Date(s), time(s), and location(s) of tests performed.
(2) Test environment, hardware, and software configurations used for each test.
(3) Date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.
d. Rationale for decisions.
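The contents listed in items a. through d. above lend themselves to a structured representation. The following is a minimal, hypothetical sketch in Python of how a project might hold this information before formatting it into its chosen report template; the class and field names are illustrative assumptions, not a NASA-prescribed format.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class TestLogEntry:
    """Item c: one entry in the test log."""
    activity: str                        # test-related activity performed
    timestamp: datetime                  # date and time of the activity
    location: str                        # where the test was performed
    environment: str                     # hardware/software configuration used
    performed_by: str                    # individual who performed the activity
    witnesses: List[str] = field(default_factory=list)


@dataclass
class DetailedResult:
    """Item b: detailed results for one test."""
    test_id: str                         # project-unique identifier of the test/procedure
    summary: str                         # summary of results, including requirements verified
    problems_encountered: List[str] = field(default_factory=list)
    deviations: List[str] = field(default_factory=list)


@dataclass
class SoftwareTestReport:
    """Items a-d: overall report contents."""
    overall_evaluation: str              # a(1): overall evaluation shown by the test results
    remaining_deficiencies: List[str]    # a(2): deficiencies, limitations, constraints
    test_environment_impact: str         # a(3): impact of the test environment
    detailed_results: List[DetailedResult]
    test_log: List[TestLogEntry]
    decision_rationale: str              # d: rationale for decisions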
1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C-E Safety Critical are labeled with "P (Center) + SO". "P (Center)" means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement, while "SO" means that the requirement applies only for safety-critical portions of the software.

Class C Not Safety Critical and class G are labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.
Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H
Applicable? | X | X | X | X | P(C) + SO | P(C) | P(C) + SO | | P(C) + SO | | X | P(C) |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable | (blank) - Not Applicable | X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures
2. Rationale

When testing software, it is important to capture the outcome of the tests used to verify requirements, functionality, safety, and other aspects of the software. It is also important to capture any decisions based on the outcome of those tests. Test reports capture that information and more.
3. Guidance

As noted above, software test reports document an assessment of the software based on the test results, the detailed test results, a log of the testing activities, and the rationale for any decisions made during the testing process.

The software test reports may be tailored by software classification. Goddard Space Flight Center's (GSFC) 580-STD-077-01, Requirements for Minimum Contents of Software Documents, 090 provides one suggestion for tailoring software test reports based on the required contents and the classification of the software being tested.

Test reports are to be written following each type of testing activity, such as a pass of unit or integration tests. Note that NASA-GB-8719.13, NASA Software Safety Guidebook 276, recommends that formal unit test reports be written for safety-critical unit tests, while reports for other unit tests may be as simple as notes in a logbook. Note also, however, that project planning includes a determination of the degree of formality for the various types of testing, including unit and integration testing, and that formality needs to be documented in the project's software development plan (SDP).

Depending on a project's defined procedures, test reports can be of multiple types:

Preliminary Test Reports
- Prepared at the end of each test session.
- Prepared by the test team on the basis of a "quick-look" evaluation of the executions.
- Provide a rapid assessment of how the software is working.
- Provide early indications of any major problems.

Detailed Test Reports
- Prepared within a week of test execution.
- Prepared by the test team by analyzing the results obtained during test sessions, using hand calculations and detailed comparisons with expected results.
- Describe problems but do not identify their sources in the code.

Include in the test report the information needed to identify and properly configuration manage the test environment used for testing, such as the hardware and software configurations used for each test. Test reports are also to provide evidence of the thoroughness of the testing. Be aware of issues that may arise related to software test reports, and apply useful processes and recommended practices when preparing them. Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to software test reports and their contents. A sketch of one way to automate capture of the test environment and test log information follows the related requirements list below.

Additional guidance related to software testing may be found in the following related requirements in this Handbook:
- Use of Commercial, Government, and Legacy Software
- Test Plans, Procedures, Reports
- Perform Testing
- Evaluate Test Results
- Document Defects and Track
- Software Test Plan
- Software Test Procedures
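Because the test log calls for the date, time, environment, and personnel for each test activity, some projects automate this capture at the start of each test session. The sketch below is one possible approach using only the Python standard library; the function name, field names, and location string are illustrative assumptions, not an Agency-defined interface. A record like this can be appended to the project's test log and later rolled into the preliminary or detailed test report.

import getpass
import json
import platform
from datetime import datetime, timezone


def capture_test_session_record(location: str, witnesses=None) -> dict:
    """Collect test log and environment data called for in the test report."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),  # date/time of the activity
        "location": location,
        "performed_by": getpass.getuser(),                    # who performed the activity
        "witnesses": witnesses or [],
        "environment": {                                      # configuration used for the test
            "os": platform.platform(),
            "machine": platform.machine(),
            "python": platform.python_version(),
        },
    }


if __name__ == "__main__":
    record = capture_test_session_record(location="integration lab")
    print(json.dumps(record, indent=2))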
4. Small Projects

For projects with small budgets or small team size, approaches that reduce the time and cost of preparing software test reports may be helpful. One such approach is sketched below.
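For example, a small project might generate a minimal preliminary test report directly from a test framework's machine-readable output so that reporting adds little extra effort. The sketch below assumes JUnit-style XML results (such as those produced by pytest with the --junitxml option); this is an illustrative approach, not a requirement of NPR 7150.2, and the file name is a placeholder.

import xml.etree.ElementTree as ET


def summarize_junit_xml(path: str) -> str:
    """Produce a short preliminary test report from a JUnit-style XML file."""
    root = ET.parse(path).getroot()
    # JUnit XML may have a <testsuites> wrapper or a single <testsuite> root.
    suites = root.findall("testsuite") if root.tag == "testsuites" else [root]
    lines = []
    for suite in suites:
        tests = int(suite.get("tests", "0"))
        failures = int(suite.get("failures", "0"))
        errors = int(suite.get("errors", "0"))
        skipped = int(suite.get("skipped", "0"))
        lines.append(
            f"{suite.get('name', 'suite')}: {tests} run, "
            f"{failures} failed, {errors} errors, {skipped} skipped"
        )
        # List each failed or errored case as a problem encountered.
        for case in suite.iter("testcase"):
            for problem in list(case.findall("failure")) + list(case.findall("error")):
                lines.append(
                    f"  PROBLEM {case.get('classname', '')}.{case.get('name', '')}: "
                    f"{problem.get('message', 'no message')}"
                )
    return "\n".join(lines)


if __name__ == "__main__":
    print(summarize_junit_xml("results.xml"))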
5. Resources
5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned

There are currently no Lessons Learned identified for this requirement.