Overall evaluation of the software as shown by the test results.
Remaining deficiencies, limitations, or constraints detected by testing (e.g., including description of the impact on software and system performance, the impact a correction would have on software and system design, and recommendations for correcting the deficiency, limitation, or constraint).
Impact of test environment.
b. Detailed test results:
Project-unique identifier of a test and test procedure(s).
Summary of test results (e.g., including requirements verified).
Problems encountered.
Deviations from test cases/procedures.
c. Test log:
Date(s), time(s), and location(s) of tests performed.
Test environment, hardware, and software configurations used for each test.
Date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.
d. Rationale for decisions.
2. Rationale
When testing software, it is important to capture the outcome of tests used to verify requirements, functionality, safety, and other aspects of the software. It is also important to capture any decisions based on the outcome of those tests. Test reports capture that information and more for purposes including but not limited to:
Documenting what was done and how the results match or differ from the expected results.
Identification and isolation of the source of any error found in the software.
Verification that testing was completed as planned.
Verification that safety-critical elements were properly tested.
Verification that all identified hazards have been eliminated or controlled to an acceptable level of risk.
Reporting safety-critical findings that should be used to update hazard reports.
Generating data for evaluating the quality of tested products and the effectiveness of testing processes.
3. Guidance
As noted above, software test reports document an assessment of the software based on the test results, the detailed test results, a log of the testing activities, and the rationale for any decisions made during the testing process.
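These report elements are easier to keep complete and comparable across test activities when they are captured in a structured form. The following Python sketch is a hypothetical illustration of one such structure; the class and field names are assumptions for this example, not a format prescribed by the requirement.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class TestLogEntry:
    # Date/time, location, environment, and personnel for one test activity.
    timestamp: datetime
    location: str
    environment: str
    performed_by: List[str]
    witnesses: List[str] = field(default_factory=list)

@dataclass
class DetailedTestResult:
    # Project-unique test identifier, requirements verified, and outcomes.
    test_id: str
    requirements_verified: List[str]
    summary: str
    problems: List[str] = field(default_factory=list)
    deviations: List[str] = field(default_factory=list)

@dataclass
class SoftwareTestReport:
    # Overall assessment, detailed results, test log, and decision rationale.
    overall_evaluation: str
    remaining_deficiencies: List[str]
    test_environment_impact: str
    detailed_results: List[DetailedTestResult]
    test_log: List[TestLogEntry]
    rationale_for_decisions: str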
The software test reports may be tailored by software classification. Goddard Space Flight Center's (GSFC) 580-STD-077-01, Requirements for Minimum Contents of Software Documents, provides one suggestion for tailoring software test reports based on the required contents and the classification of the software being tested.
Test reports are to be written following each type of testing activity, such as a pass of unit or integration tests. Note that NASA-GB-8719.13, NASA Software Safety Guidebook, recommends that formal unit test reports be written for safety-critical unit tests, while reports for other unit tests may be as simple as notes in a logbook. Also note, however, that project planning will include a determination of the degree of formality for the various types of testing, including unit and integration testing, and that formality needs to be documented in the project's software development plan (SDP).
Depending on a project's defined procedures, test reports can be of multiple types:
Preliminary Test Reports:
Prepared at the end of each test session.
Provide rapid assessment of how software is working.
Provide early indications of any major problems.
Prepared by test team on basis of a "quick-look" evaluation of the executions.
Detailed Test Reports:
Prepared within a week of test execution.
Describe problems but do not identify their sources in the code.
Prepared by test team by analyzing results obtained during test sessions, using hand calculations and detailed comparisons with expected results.
To identify the test environment used for testing and keep it under proper configuration management, include the following information in the test report (a minimal sketch follows this list):
Version numbers of all software elements used to support testing, such as simulators and monitoring tools.
Inventory numbers and calibration dates for tools such as logic analyzers, oscilloscopes, and multimeters.
Version numbers for all FPGA (Field Programmable Gate Array) components present on the testbed.
Serial numbers of the testbed elements.
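As an illustration, the environment information above can be recorded in a small structure that travels with the report. The Python sketch below is hypothetical; all element names, version strings, inventory numbers, and serial numbers are invented for the example.

# Hypothetical record of the test environment configuration for one test session.
test_environment = {
    # Version numbers of supporting software elements (simulators, monitoring tools).
    "support_software": {
        "orbit_simulator": "3.2.1",
        "telemetry_monitor": "1.8.0",
    },
    # Inventory numbers and calibration dates for test equipment.
    "test_equipment": [
        {"item": "logic analyzer", "inventory_no": "INV-0042", "calibrated": "2024-01-15"},
        {"item": "oscilloscope", "inventory_no": "INV-0107", "calibrated": "2023-11-02"},
        {"item": "multimeter", "inventory_no": "INV-0110", "calibrated": "2024-03-09"},
    ],
    # Version numbers of FPGA components present on the testbed.
    "fpga_versions": {"io_controller_fpga": "r5"},
    # Serial numbers of the testbed elements.
    "testbed_serial_numbers": ["TB-001-SN-7731"],
}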
Test reports are to provide evidence of the thoroughness of the testing, including:
Differences between the test environment and the operational environment, and any effects those differences had on the test results.
Any test anomalies and the disposition of any related corrective actions or problem reports.
Details of the test results (see requirement text), including test case identifiers, test version, completion status, etc., along with the associated item tested, as required by the project's software test plan.
Location of original test results (output from tests, screen shots, error messages, etc., as captured during the actual testing activity).
Issues that may arise related to software test reports include:
If acquired software does not include complete test reports (i.e., reports that meet the required content), it is prudent for the acquirer to conduct additional testing upon delivery of the software from the provider.
Test reports may not be available for acquired off-the-shelf (OTS) products. SWE-027 requires the project to ensure that OTS software is verified and validated to the same level of confidence as a developed software component. Lack of existing OTS test reports does not relieve the project from performing subsequent tests and producing test reports to determine fitness for use.
Useful processes or recommended practices for software test reports include:
Review and signoff by Software Assurance and Software Safety for software safety verifications; may include NASA Independent Verification and Validation (IV&V) if those services are being used for the project.
Maintenance under configuration management.
If software is acquired, the provider needs to deliver software test reports containing the required information.
Use of templates or checklists to ensure accurate capture and recording of detailed information such as the test logs and deviations from planned test cases/test procedures.
Use of tables to summarize test results, including requirements tested, pass/fail results, identification of tests performed, etc. (see the sketch after this list).
Referencing of problem reports/change requests when documenting details of remaining deficiencies, limitations, constraints, etc., rather than duplication of the information contained in those reports/requests.
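As one way to apply the table practice noted above, a results summary can be generated from the collected data rather than assembled by hand. The Python sketch below is a minimal, hypothetical example; the test identifiers, requirement numbers, and problem report number are invented for illustration.

# Hypothetical summary of test results for inclusion in the report.
results = [
    {"test_id": "STR-001", "requirements": ["SRS-101", "SRS-102"], "status": "Pass", "problem_report": ""},
    {"test_id": "STR-002", "requirements": ["SRS-110"], "status": "Fail", "problem_report": "PR-0045"},
]

header = f"{'Test ID':<10}{'Requirements Verified':<26}{'Result':<8}{'Problem Report':<15}"
print(header)
print("-" * len(header))
for r in results:
    print(f"{r['test_id']:<10}{', '.join(r['requirements']):<26}{r['status']:<8}{r['problem_report']:<15}")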
Additional guidance related to software testing may be found in other software testing requirements in this Handbook.
4. Small Projects
For projects with small budgets or small team size, the following approaches may be helpful in reducing the time and cost of preparing software test reports:
Automate collection of detailed report data and/or test logs for the test report, particularly if the automated tools already exist at the Center (see the sketch after this list).
Use existing templates rather than create new ones.
Use less formal reporting for unit tests, if Center procedures allow.
Tailor the contents of the software test reports, where allowed and appropriate.
Reference existing content rather than duplicate it in the test report.
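For the automation approach mentioned above, many test frameworks can emit results in a JUnit-style XML format that is straightforward to collect into report data. The Python sketch below assumes that format; the file name is invented, and the attribute handling may need adjustment for a particular framework's output.

import xml.etree.ElementTree as ET

def collect_results(junit_xml_path):
    # Collect test identifiers and pass/fail status from a JUnit-style XML
    # results file for use in the detailed results section of a test report.
    rows = []
    root = ET.parse(junit_xml_path).getroot()
    for case in root.iter("testcase"):
        failed = case.find("failure") is not None or case.find("error") is not None
        rows.append({
            "test_id": f"{case.get('classname', '')}.{case.get('name', '')}",
            "status": "Fail" if failed else "Pass",
            "duration_s": case.get("time", ""),
        })
    return rows

if __name__ == "__main__":
    for row in collect_results("test_session_results.xml"):  # hypothetical file name
        print(row)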
5. Resources
5.1 References
580-STD-077-01, Requirements for Minimum Contents of Software Documents, Goddard Space Flight Center (GSFC).
NASA-GB-8719.13, NASA Software Safety Guidebook.
5.2 Tools
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.