

SWE-186 - Unit Test Repeatability

1. Requirements

4.4.6 The project manager shall assure that the unit test results are repeatable.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-186 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A   |
Difference between A and B: N/A
B   |
Difference between B and C: NEW
C   | 4.4.6 The project manager shall assure that the unit test results are repeatable.
Difference between C and D: No change
D   | 4.4.6 The project manager shall assure that the unit test results are repeatable.

1.3 Applicability Across Classes

 

Class        A     B     C     D     E     F
Applicable?

Key: ✓ = Applicable | ✗ = Not Applicable


2. Rationale

Unit test procedures are to be repeatable so that future runs can confirm that identified flaws have been corrected and, for regression purposes, that new changes do not introduce new flaws into the software. As stated in SWE-062 - Unit Test, unit testing can be described as confirmation that the unit performs the capability assigned to it, correctly interfaces with other units and data, and represents a faithful implementation of the unit design.

3. Guidance

3.1 Performing Unit Tests

Unit tests are performed per SWE-062 - Unit Test. Unit tests should follow documented test procedures so that their results are repeatable.

In IEEE STD 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology, a "unit" is defined as:

  1. A separately testable element specified in the design of a computer software component. 
  2. A logically separable part of a computer program.
  3. A software component that is not subdivided into other components.

Given the low-level nature of a unit of code, the person most able to fully test that unit is the developer who created it.  
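
In practice, a unit test is repeatable when its inputs, expected results, and any randomness are fixed and captured with the test itself, so the same verdict is produced on every run. The following is a minimal sketch using Python's built-in unittest framework; the moving_average function and the test values are illustrative only and are not taken from the Handbook.

    import random
    import unittest


    def moving_average(samples, window):
        """Illustrative unit under test: trailing average of the last 'window' samples."""
        if window <= 0 or window > len(samples):
            raise ValueError("window must be between 1 and len(samples)")
        return sum(samples[-window:]) / window


    class MovingAverageTest(unittest.TestCase):
        """Repeatable because inputs are fixed and any randomness uses a fixed seed."""

        def test_known_inputs_give_expected_result(self):
            # The success criterion (expected value) is established before the test runs.
            self.assertAlmostEqual(moving_average([1.0, 2.0, 3.0, 4.0], 2), 3.5)

        def test_out_of_range_window_is_rejected(self):
            with self.assertRaises(ValueError):
                moving_average([1.0, 2.0], 5)

        def test_randomized_inputs_use_fixed_seed(self):
            # Seeding the generator makes the "random" inputs identical on every
            # run, so results captured from a prior run remain valid evidence.
            rng_a = random.Random(186)
            rng_b = random.Random(186)
            inputs_a = [rng_a.uniform(0.0, 10.0) for _ in range(100)]
            inputs_b = [rng_b.uniform(0.0, 10.0) for _ in range(100)]
            self.assertEqual(inputs_a, inputs_b)
            self.assertAlmostEqual(moving_average(inputs_a, 10),
                                   moving_average(inputs_b, 10))


    if __name__ == "__main__":
        unittest.main()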

3.2 Prepare for Unit Testing

Projects ensure that the appropriate test environment, test materials, and personnel training (SWE-017 - Project and Software Training) are in place and then conduct unit tests per the approved plans (5.10 - STP - Software Test Plan), according to the schedule (SWE-016 - Software Schedule), and with proper monitoring per the software assurance plan, making sure that:

  • Criteria for a successful test are established before the test.
  • The test environment represents the inputs, outputs, and stimuli the unit will experience in operation.
  • Weaknesses or differences between the unit test environment and the actual target environment are captured.
  • Following the approved plans for unit testing:
    • Unit test results are captured and compared with the expected results in the approved test procedures, documenting any differences (see the recording sketch after this list).
    • Issues are identified and documented (some minor issues, such as typos, as defined by the project, may simply be corrected without documentation).
    • Unit test issues are corrected; these may include:
      • Issues found in the code.
      • Issues found in test instruments (e.g., scripts, data, procedures).
      • Issues found in testing tools (e.g., setup, configuration).
    • Unit test corrections are captured (for root cause analysis, as well as proof that the unit test plans were followed).
  • Unit test results are evaluated by someone other than the tester to confirm the results, as applicable and practical; evaluation results are captured.
  • Unit test data, scripts, test cases, procedures, test drivers, and test stubs are captured for reference and any required regression testing.
  • Notes recorded in software engineering notebooks or other documents are captured for reference.
  • Objective evidence that unit tests were completed and unit test objectives met is captured in the Software Development Folders (SDFs) or other appropriate project location as called out in the project documentation (e.g., 5.06 - SCMP - Software Configuration Management Plan, 5.08 - SDP-SMP - Software Development - Management Plan).
  • Unit test metrics are captured, as appropriate and defined for the project.
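
As an illustration of capturing unit test results so they can be compared with expected results on a later run, the sketch below records each outcome along with basic environment data. It uses Python's unittest result hooks; the record fields and file name are assumptions made for illustration, not a format defined by the Handbook.

    import json
    import platform
    import unittest
    from datetime import datetime, timezone


    class RecordingResult(unittest.TextTestResult):
        """Collects per-test outcomes so they can be archived (e.g., in the SDF)."""

        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self.records = []

        def addSuccess(self, test):
            super().addSuccess(test)
            self.records.append({"test": test.id(), "outcome": "pass"})

        def addFailure(self, test, err):
            super().addFailure(test, err)
            self.records.append({"test": test.id(), "outcome": "fail"})

        def addError(self, test, err):
            super().addError(test, err)
            self.records.append({"test": test.id(), "outcome": "error"})


    def run_and_archive(suite, path="unit_test_results.json"):
        """Run the suite and write outcomes plus environment data for later comparison."""
        runner = unittest.TextTestRunner(resultclass=RecordingResult, verbosity=0)
        result = runner.run(suite)
        record = {
            "run_at": datetime.now(timezone.utc).isoformat(),
            "environment": {
                "python": platform.python_version(),
                "platform": platform.platform(),
            },
            "results": result.records,
        }
        with open(path, "w", encoding="utf-8") as fh:
            json.dump(record, fh, indent=2)
        return record

For example, run_and_archive(unittest.defaultTestLoader.discover("tests")) would archive the outcome of every test discovered under a tests/ directory (the directory name is an assumption about project layout).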

Per NASA-GB-8719.13 276, NASA Software Safety Guidebook, software assurance is to "Verify unit testing and data verification is completed before the unit is integrated." 

Unit tests are particularly important for safety critical modules since this testing often cannot be done once more of the system has been integrated. 

Documented test results, results evaluations, issues, problem reports, corrections, and tester notes can all serve as evidence that unit tests were completed. Comparing those documents to the software test plans for unit testing can ensure the tests were completed following those documented procedures. Make sure evidence of all test passes is captured. See SWE-066 - Perform Testing.

Once unit tests are completed and results are documented, the tests should be repeated to confirm that they produce the same results.
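
One way to demonstrate this is to archive the outcomes of each run (as in the recording sketch above) and compare them mechanically; any difference would be documented as a unit test issue. The comparison below is a simple sketch assuming records shaped like those written by run_and_archive().

    def results_match(first_record, second_record):
        """Return True when both archived runs produced the same outcome for every test.

        Assumes records shaped like those written by run_and_archive() above;
        any difference between runs should be documented as a unit test issue.
        """
        first = {r["test"]: r["outcome"] for r in first_record["results"]}
        second = {r["test"]: r["outcome"] for r in second_record["results"]}
        return first == second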

3.3 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.4 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

No additional guidance is available for small projects.

5. Resources

5.1 References

  • (SWEREF-013) "Code and Unit Test," HOU-EGP-310, Boeing, 2002. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.
  • (SWEREF-271) NASA STD 8719.13 (Rev C), Document Date: 2013-05-07
  • (SWEREF-276) NASA-GB-8719.13, NASA, 2004. Access NASA-GB-8719.13 directly: https://swehb.nasa.gov/download/attachments/16450020/nasa-gb-871913.pdf?api=v2


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-186 - Unit Test Repeatability
4.4.6 The project manager shall assure that the unit test results are repeatable.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that the project maintains the procedures, scripts, results, and data needed to repeat the unit testing (e.g., as-run scripts, test procedures, results).
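
A lightweight way for software assurance to spot-check this tasking is to verify that the repeatability artifacts the project has committed to keep are actually present. The directory layout and file names in the sketch below are purely illustrative; each project defines its own locations in its plans and configuration management documentation.

    from pathlib import Path

    # Artifacts software assurance might look for when confirming that unit
    # testing can be repeated. The names and layout are illustrative only.
    EXPECTED_ARTIFACTS = [
        "test_procedures.md",
        "as_run_scripts",
        "unit_test_results.json",
        "test_data",
    ]


    def missing_repeatability_artifacts(sdf_dir):
        """Return the list of expected artifacts not found under the SDF directory."""
        root = Path(sdf_dir)
        return [name for name in EXPECTED_ARTIFACTS if not (root / name).exists()]


    if __name__ == "__main__":
        gaps = missing_repeatability_artifacts("sdf/unit_tests")
        print("All repeatability artifacts present" if not gaps
              else f"Missing artifacts: {gaps}")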

7.2 Software Assurance Products

  • None at this time.


    Objective Evidence

    • Unit test results.
    • Software problem or defect report findings related to issues identified in unit testing.
    • Configuration data for unit testing.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, risks found by the SA/safety person and may be expressed in an audit or checklist record, email, memo or entry into a tracking system (e.g. Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence as well as the examples listed above.

7.3 Metrics

  • # of planned unit test cases vs. # of actual unit test cases successfully completed
  • # of Safety-Critical tests executed vs. # of Safety-Critical tests witnessed by SA

See also Topic 8.18 - SA Suggested Metrics.

7.4 Guidance

Software assurance will assure that unit test results are repeatable. To do this, they will check that the following are recorded and stored:

  • The tests and test procedures, test cases as run
  • Any input data for the tests, including any databases or other data needed for the test, and the expected outputs
  • Any stimulus the unit will experience in operation
  • Build instructions
  • Scripts or test drivers and test stubs used with the test
  • Test configurations (versions of operating systems, tools used, etc.), including any differences between the unit test environment and the planned operational environment (see the sketch after this list)
  • The version of the software being tested
  • Test report and a record of any discrepancies found in the unit
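
To make the items above checkable, the test configuration can be captured in machine-readable form alongside the results. The sketch below records a few of them; the fields are illustrative, and the use of a git commit hash for the software version is an assumption about the project's configuration management, not a Handbook requirement.

    import json
    import platform
    import subprocess
    import sys


    def capture_test_configuration(software_version=None):
        """Snapshot configuration items so a later test run can be compared to this one."""
        if software_version is None:
            # Assumption: the unit under test lives in a git working copy.
            software_version = subprocess.run(
                ["git", "rev-parse", "HEAD"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
        return {
            "software_version": software_version,
            "python": sys.version,
            "operating_system": platform.platform(),
            "command_line": sys.argv,
        }


    if __name__ == "__main__":
        print(json.dumps(capture_test_configuration(), indent=2))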

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

