
1. Requirements

5.3.2.1 The Software Test Report shall include:

a.    Overview of the test results:
      (1)  Overall evaluation of the software as shown by the test results.
      (2)  Remaining deficiencies, limitations, or constraints detected by testing (e.g., including description of the impact on software and system performance, the impact a correction would have on software and system design, and recommendations for correcting the deficiency, limitation, or constraint).
      (3)  Impact of test environment.
b.    Detailed test results:
      (1)  Project-unique identifier of a test and test procedure(s).
      (2)  Summary of test results (e.g., including requirements verified).
      (3)  Problems encountered.
      (4)  Deviations from test cases/procedures.
c.    Test log:
      (1)  Date(s), time(s), and location(s) of tests performed.
      (2)  Test environment, hardware, and software configurations used for each test.
      (3)  Date and time of each test-related activity, the identity of the individual(s) who performed the activity, and the identities of witnesses, as applicable.
d.    Rationale for decisions.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C-E Safety Critical are labeled with "P (Center) + SO".  "P (Center)" means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement, while "SO" means that the requirement applies only for safety-critical portions of the software.

Class C Not Safety Critical and class G are labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.


Class    Not Safety Critical    Safety Critical
A        1                      1
B        1                      1
C        P (Center)             P (Center) + SO
D        0                      P (Center) + SO
E        0                      P (Center) + SO
F        1                      -
G        P (Center)             -
H        0                      -

(1 = the requirement applies as written; 0 = the requirement does not apply; "P (Center)" and "SO" are defined in the notes above.)




2. Rationale

When testing software, it is important to capture the outcome of tests used to verify requirements, functionality, safety, and other aspects of the software. It is also important to capture any decisions based on the outcome of those tests. Test reports capture that information and more for purposes including but not limited to:

  • Documenting what was done and how the results match or differ from the expected results.
  • Identification and isolation of the source of any error found in the software.
  • Verification that testing was completed as planned.
  • Verification that safety-critical elements were properly tested.
  • Verification that all identified hazards have been eliminated or controlled to an acceptable level of risk.
  • Reporting safety-critical findings that should be used to update hazard reports.
  • Generating data for evaluating the quality of tested products and the effectiveness of testing processes.



3. Guidance

As noted above, software test reports document an assessment of the software based on the test results, the detailed test results, a log of the testing activities, and the rationale for any decisions made during the testing process.
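
The required contents listed in Section 1 above can also serve as an outline when a project sets up its report template. The sketch below shows one possible way to organize that outline as a data structure, written in Python; the class and field names are illustrative assumptions, not a format prescribed by NPR 7150.2.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TestLogEntry:
    """One entry in the test log (requirement item c)."""
    timestamp: str                 # date and time of the test-related activity
    location: str                  # where the test was performed
    environment: str               # hardware/software configuration used
    performed_by: str              # individual(s) who performed the activity
    witnesses: List[str] = field(default_factory=list)

@dataclass
class DetailedResult:
    """Detailed results for one test (requirement item b)."""
    test_id: str                   # project-unique identifier of the test/procedure
    requirements_verified: List[str]
    summary: str
    problems_encountered: List[str] = field(default_factory=list)
    deviations: List[str] = field(default_factory=list)

@dataclass
class SoftwareTestReport:
    """Top-level structure mirroring items a-d of the requirement."""
    overall_evaluation: str                  # a(1) overall evaluation of the software
    remaining_deficiencies: List[str]        # a(2) deficiencies, limitations, constraints
    test_environment_impact: str             # a(3) impact of the test environment
    detailed_results: List[DetailedResult]   # b   detailed test results
    test_log: List[TestLogEntry]             # c   test log
    decision_rationale: List[str]            # d   rationale for decisions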

The software test reports may be tailored by software classification. Goddard Space Flight Center's (GSFC) 580-STD-077-01, Requirements for Minimum Contents of Software Documents [090], provides one suggestion for tailoring software test reports based on the required contents and the classification of the software being tested.

Test reports are to be written following each type of testing activity, such as a pass of unit or integration tests. Note that NASA-GB-8719.13, NASA Software Safety Guidebook [276], recommends that formal unit test reports be written for safety-critical unit tests, while reports for other unit tests may be as simple as notes in a logbook. Also note, however, that project planning includes a determination of the degree of formality for the various types of testing, including unit and integration testing, and that formality needs to be documented in the project's software development plan (SDP).

Depending on a project's defined procedures, test reports can be of multiple types:


Preliminary Test Reports:
  • Prepared at the end of each test session.
  • Prepared by the test team on the basis of a "quick-look" evaluation of the executions.
  • Provide a rapid assessment of how the software is working.
  • Provide early indications of any major problems.

Detailed Test Reports:
  • Prepared within a week of test execution.
  • Prepared by the test team by analyzing results obtained during test sessions, using hand calculations and detailed comparisons with expected results.
  • Describe problems but do not identify their sources in the code.




To identify and properly manage the configuration of the test environment used for testing, include the following information in the test report (one way to capture this is sketched after the list):

  • Version numbers of all software elements used to support testing, such as simulators and monitoring tools.
  • Inventory numbers and calibration dates for tools such as logic analyzers, oscilloscopes, multimeters.
  • Version numbers for all FPGA (Field Programmable Gate Array) components present on the testbed.
  • Serial numbers of the testbed elements.
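
One way to capture the environment details above in a consistent, machine-readable form is a small record that accompanies the report. The sketch below is illustrative only; every tool name, version, inventory number, and serial number is a placeholder rather than project data.

# Hypothetical record of the test environment for inclusion in a test report.
# All names, versions, inventory numbers, and serial numbers below are placeholders.
test_environment = {
    "support_software": {          # version numbers of supporting software elements
        "spacecraft_simulator": "3.2.1",
        "telemetry_monitor": "1.8.0",
    },
    "test_equipment": [            # inventory numbers and calibration dates
        {"item": "logic analyzer", "inventory_no": "INV-0042", "calibrated": "2024-01-15"},
        {"item": "oscilloscope", "inventory_no": "INV-0107", "calibrated": "2024-03-02"},
    ],
    "fpga_versions": {             # FPGA component versions present on the testbed
        "communications_board": "v2.4",
    },
    "testbed_serial_numbers": ["TB-001", "TB-002"],
}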

Test reports are to provide evidence of the thoroughness of the testing, including:

  • Differences between the test environment and the operational environment, and any effects those differences had on the test results.
  • Any test anomalies and the disposition of any related corrective actions or problem reports.
  • Details of the test results (see requirement text), including test case identifications, test version, completion status, etc., along with the associated item tested as required by the project's software test plan.
  • Location of original test results (output from tests, screen shots, error messages, etc., as captured during the actual testing activity).

Issues that may arise related to software test reports include:

  • If acquired software does not include complete test reports (i.e., reports that meet the required content), it is prudent for the acquirer to conduct additional testing upon delivery of the software from the provider.
  • Test reports may not be available for acquired off-the-shelf (OTS) products. SWE-027 requires the project to ensure OTS software is verified and validated to the same level of confidence as a developed software component. The lack of existing OTS test reports does not relieve the project of the need to perform subsequent tests and produce test reports to determine fitness for use.

Useful processes or recommended practices for software test reports include:

  • Review and signoff by Software Assurance and Software Safety for software safety verifications; may include NASA Independent Verification and Validation (IV&V) if those services are being used for the project.
  • Maintenance under configuration management.
  • If software is acquired, the provider needs to deliver software test reports containing the required information.
  • Use of templates or checklists to ensure accurate capture and recording of detailed information such as the test logs and deviations from planned test cases/test procedures.
  • Use of tables to summarize test results, including requirements tested, pass/fail results, identification of tests performed, etc. (see the sketch following this list).
  • Referencing of problem reports/change requests when documenting details of remaining deficiencies, limitations, constraints, etc., rather than duplication of the information contained in those reports/requests.
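
For the summary-table practice above, the table can be generated from the detailed results rather than maintained by hand, which helps keep the summary consistent with the referenced problem reports. A minimal sketch in Python, assuming the detailed results are available as simple records (all field names and identifiers are placeholders):

# Minimal sketch: render a pass/fail summary table from detailed test results.
# The record fields and identifiers are placeholders, not a prescribed format.
results = [
    {"test_id": "TST-001", "requirements": ["SRS-010", "SRS-011"], "status": "Pass"},
    {"test_id": "TST-002", "requirements": ["SRS-012"], "status": "Fail", "problem_report": "PR-034"},
]

header = f"{'Test ID':<10} {'Requirements Verified':<25} {'Status':<8} {'Problem Report':<14}"
print(header)
print("-" * len(header))
for r in results:
    print(f"{r['test_id']:<10} {', '.join(r['requirements']):<25} "
          f"{r['status']:<8} {r.get('problem_report', ''):<14}")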


Note

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to software test reports and their contents.


Additional guidance related to software testing may be found in the following related requirements in this Handbook:


SWE-027  Use of Commercial, Government, and Legacy Software
SWE-065  Test Plans, Procedures, Reports
SWE-066  Perform Testing
SWE-068  Evaluate Test Results
SWE-069  Document Defects and Track
SWE-104  Software Test Plan
SWE-114  Software Test Procedures





4. Small Projects

For projects with small budgets or small team size, the following approaches may be helpful in reducing the time and cost of preparing software test reports:

  • Automate collection of detailed report data and/or test logs for the test report, particularly if the automated tools already exist at the Center (see the sketch following this list).
  • Use existing templates rather than create new ones.
  • Use less formal reporting for unit tests, if Center procedures allow.
  • Tailor the contents of the software test reports, where allowed and appropriate.
  • Reference existing content rather than duplicate it in the test report.
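
For the automation item above, many unit test tools already produce machine-readable results (for example, JUnit-style XML) that can be rolled up into the report automatically. A minimal sketch in Python, assuming the project's test tool writes a JUnit-style results file (the file name is a placeholder):

# Minimal sketch: pull unit test summary counts for a test report from a
# JUnit-style XML results file. The file name and format are assumptions
# about the project's test tooling, not a NASA-prescribed interface.
import xml.etree.ElementTree as ET

def summarize_junit_results(xml_path: str):
    """Return (total, failures, errors, skipped) counts from a JUnit XML file."""
    root = ET.parse(xml_path).getroot()
    # A results file may have a single <testsuite> root or a <testsuites> wrapper.
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    total = failures = errors = skipped = 0
    for suite in suites:
        total += int(suite.get("tests", 0))
        failures += int(suite.get("failures", 0))
        errors += int(suite.get("errors", 0))
        skipped += int(suite.get("skipped", 0))
    return total, failures, errors, skipped

if __name__ == "__main__":
    total, failures, errors, skipped = summarize_junit_results("unit_test_results.xml")
    print(f"Unit tests run: {total}, failures: {failures}, "
          f"errors: {errors}, skipped: {skipped}")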



5. Resources


The following references are cited in this guidance:

  • 090: 580-STD-077-01, Requirements for Minimum Contents of Software Documents, NASA Goddard Space Flight Center (GSFC).
  • 276: NASA-GB-8719.13, NASA Software Safety Guidebook.



6. Lessons Learned

There are currently no Lessons Learned identified for this requirement.