SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures

1. Requirements

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

    1. Software requirements.
    2. Software plans.
    3. According to the software development plans, any design items that the project identified for software peer review or software inspections.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.

1.1 Notes

Software peer reviews or software inspections are recommended best practices for all safety and mission-success related software components. Recommended best practices and guidelines for software formal inspections are contained in NASA-STD-8739.9 277.

1.2 History

SWE-087 - Last used in rev NPR 7150.2D

Rev A

4.3.1 The project shall perform and report on software peer reviews/inspections for:

    1. Software requirements.
    2. Software Test Plan.
    3. Any design items that the project identified for software peer review/inspections according to the software development plans.
    4. Software code as defined in the software and or project plans.
Difference between A and B: Added item e, Software test procedures, to the requirement. Broadened the scope of the plans to be reviewed to include all plans, not just the Test Plan.

Rev B

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

    1. Software requirements.
    2. Software plans.
    3. Any design items that the project identified for software peer review or software inspections according to the software development plans.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.
Difference between B and C: No change.

Rev C

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:

    1. Software requirements.
    2. Software plans.
    3. According to the software development plans, any design items that the project identified for software peer review or software inspections.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.

Difference between C and D: Added cybersecurity plans to part b.

Rev D

5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for: 

a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and or project plans.
e. Software test procedures.



1.3 Applicability Across Classes

| Class | A | B | C | D | E | F |
| Applicable? |  |  |  |  |  |  |

Key:    - Applicable | - Not Applicable

2. Rationale

Software peer reviews or software inspections are performed to ensure product and process quality and to add value and reduce risk through expert knowledge infusion, confirmation of approach, identification of defects, and specific suggestions for product improvements. “Peer reviews are focused, in-depth technical reviews that support the evolving design and development of a product, including critical documentation or data packages. The participants in a peer review are the technical experts and key stakeholders for the scope of the review.” 486

3. Guidance

Peer reviews are one of the most efficient and effective ways to find and remove defects.

NASA-STD-8709.22 provides two definitions for peer reviews: 274

[1] A review of a software work product, following defined procedures, by peers of the product producers to identify defects and improvements.

[2] Independent evaluation by internal or external subject matter experts who do not have a vested interest in the work product under review. Projects can plan peer reviews as focused reviews conducted on selected work products by the producer’s peers to identify defects and issues before that work product moves into a milestone review or approval cycle.

Peer reviews can also be described as “planned, focused reviews by technical team peers on a single work product with the intent of identifying issues before that work product moves on to the next step. A peer review includes planning, preparing, conducting, analyzing outcomes, and identifying and implementing corrective actions.” 041

A key rationale for using software peer reviews or software inspections is that they are among the few verification and validation (V&V) approaches that can be applied in the early stages of software development, long before there is any code that can be run and tested. Moreover, finding and fixing defects in these early stages, rather than letting them slip into later phases, can have a huge impact on the project budget. For this reason, software peer reviews and software inspections of requirements documents are explicitly required.

As documented in NASA-GB-8719.13, NASA Software Safety Guidebook 276 , "Formal Inspections have the most impact when applied early in the life of a project, especially the requirements specification and definition stages of a project. Impact means that the defects are found earlier when it's cheaper to fix them. Formal Inspection greatly improves the communication within a project and enhances understanding of the system while scrubbing out many of the major errors and defects." 

Software plans are another key artifact on which software peer reviews or software inspections can be applied with the best return on investment. Since well-developed and appropriate plans that have buy-in from key stakeholders are important elements of critical software success, peer review or inspections are applied to improve the quality of such plans.

Software testing represents a substantial part of assuring software quality for most projects, so it is important to ensure that test cases focus on the appropriate functional areas, cover all important usage scenarios, and specify the expected system behavior adequately and accurately. The best way to ensure this is via peer review/inspection and the application of human judgment and analysis.

Code and design artifacts also benefit from peer review or inspection. However, although important, inspections of these artifacts are less crucial than those of other life-cycle activities because other effective Verification and Validation (V&V) options, such as testing or simulation, are available for code and design. A project needs to identify the most crucial design and code segments and deploy inspections to improve their quality. Projects also need to focus inspections of code and design on issues that cannot be verified using automated tools; i.e., projects need to be sure that human judgment is applied to appropriate issues and rely on automation where it is best suited.

Peer reviews/inspections provide the following additional benefits:

  • Useful for many types of products: documentation, requirements, designs, code.
  • Simple to understand.
  • Provide a way for sharing/learning good product development techniques.
  • Serve to bring together human judgment and analysis from diverse stakeholders in a constructive way.
  • Can be a very efficient method of identifying defects early in the product’s life cycle.
  • Use a straightforward, organized approach for evaluating a work product:
    • To detect potential defects in a product.
    • To methodically evaluate each defect to identify solutions and track incorporation of these solutions into the product.

Projects are required to conduct peer reviews/inspections for the following documentation based on software classification:

| Software Documentation | A | B | C | D | E |
| Software Requirements | X | X | X | X (SC only) |  |
| Software Plans | X | X | X | X (SC only) |  |
| Software Design Identified in Plans | X | X | X | X (SC only) |  |
| Software Code Identified in Plans | X | X | X | X (SC only) |  |
| Test Procedures | X | X | X | X (SC only) |  |

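When teams script compliance checks against this table, it can help to capture the applicability as a small lookup structure. The Python sketch below is illustrative only; the dictionary, the helper function, and the interpretation of "SC only" as "required for Class D only when the software is safety-critical" are assumptions made for the example, not text from NPR 7150.2.

```python
# Illustrative lookup of the peer review applicability table above.
# "SC only" is interpreted here as: required for Class D only when the
# software is safety-critical (an assumption for this sketch).
REQUIRED_PEER_REVIEWS = {
    "Software Requirements":              {"A": True, "B": True, "C": True, "D": "SC only", "E": False},
    "Software Plans":                      {"A": True, "B": True, "C": True, "D": "SC only", "E": False},
    "Software Design Identified in Plans": {"A": True, "B": True, "C": True, "D": "SC only", "E": False},
    "Software Code Identified in Plans":   {"A": True, "B": True, "C": True, "D": "SC only", "E": False},
    "Test Procedures":                     {"A": True, "B": True, "C": True, "D": "SC only", "E": False},
}

def peer_review_required(item: str, software_class: str, safety_critical: bool) -> bool:
    """Return True if a peer review/inspection of `item` is required for this class."""
    rule = REQUIRED_PEER_REVIEWS[item][software_class]
    if rule == "SC only":
        return safety_critical
    return bool(rule)

# Example: Class D, safety-critical software still requires test procedure reviews.
print(peer_review_required("Test Procedures", "D", safety_critical=True))  # True
```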
 To have an effective software peer review/inspection, be sure the following are part of the process:

  • Keep the review focused on the technical integrity and quality of the product.
  • Keep the review simple and informal.
  • Concentrate on review of the documentation and minimize presentations.
  • Use a round-table format rather than a stand-up presentation.
  • Give a full technical picture of items being reviewed.
  • Plan the review/inspection, use checklists, and include readiness and completion criteria.
  • Capture action items, monitor defects, results, and effort.
  • Run the project's static analysis tools on any source code under review before the software code review (see the sketch after this list).
  • Make sure that the code changes implement the software requirement and that the software requirement is up-to-date.
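One way to implement the static analysis item above is a small pre-review gate that refuses to proceed while the analyzer still reports findings. The sketch below is only an illustration, assuming a C/C++ code base in a src directory and the open source cppcheck tool; substitute whatever analyzer and options the project's plans actually specify.

```python
# Sketch of a pre-review gate: run static analysis and stop if findings remain.
# Assumes cppcheck is installed and the source under review lives in ./src.
import subprocess
import sys

def static_analysis_clean(source_dir: str = "src") -> bool:
    """Return True if cppcheck reports no findings for the given directory."""
    result = subprocess.run(
        ["cppcheck", "--enable=warning", "--error-exitcode=1", source_dir],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(result.stderr)  # cppcheck writes its findings to stderr
        return False
    return True

if __name__ == "__main__":
    if not static_analysis_clean():
        sys.exit("Resolve static analysis findings before the code peer review.")
    print("Static analysis clean; the code is ready for peer review.")
```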

When putting a software peer review team together, use the following best practices:

  • The team consists of a minimum of four inspectors.
    • Diverse viewpoints and objectivity are required.
  • Inspection team members are selected based on an analysis of the key stakeholders in the item under inspection.
  • The author should not be the reader, recorder, or moderator.
  • The moderator should not be a manager.
  • At a minimum, the moderator should be formally trained in the process; ideally, all participants are trained.
  • Management presence/participation is discouraged.
  • Each role has a specific responsibility, as shown in the table below:

| Role | Responsibility |
| Moderator | Conducts and controls the inspection. |
| Author | Producer of the product under inspection; answers technical questions. |
| Reader | Presents (reads, paraphrases) the inspection product to the inspection team. |
| Recorder | Documents defects identified during the inspection as well as open issues and action items. |
| Software Peers | Look for software and software coding defects in the product under inspection. |
| Hardware Engineer(s) | Look for defects in the product under inspection; ensure software control of hardware is correct. |
| System Engineer(s) | Look for defects in the product under inspection; ensure software control of the system is correct, including fault detection, fault isolation, and fault recoveries. |

The table below is from the NASA System Engineering Processes and Requirements, NPR 7123.1, and shows the entrance criteria and success criteria for a peer review activity.

Table G-19 - Peer Review Entrance and Success Criteria

Peer Review Entrance Criteria:

  1. The product to be reviewed (e.g., document, process, model, design details) has been identified and made available to the review team.
  2. Peer reviewers independent from the project have been selected for their technical background related to the product being reviewed.
  3. A preliminary agenda, success criteria, and instructions to the review team have been agreed to by the technical team and project manager.
  4. Rules have been established to ensure consistency among the team members involved in the peer-review process.
  5. *Spectrum (radio frequency) considerations addressed.

Peer Review Success Criteria:

  1. Peer review has thoroughly evaluated the technical integrity and quality of the product.
  2. Any defects have been identified and characterized.
  3. The results of the peer review are communicated to the appropriate project personnel.
  4. Spectrum-related aspects have been concurred to by the responsible Center spectrum manager.
  5. A code peer review checks to verify that the code meets the software requirements.

*Required per NPD 2570.5.


Software peer reviews are conducted using the following steps:

| Step | Description |
| Planning | Organize the inspection, inspection package contents, required support, and schedule. |
| Overview | Educational briefing at the time of package distribution to explain the materials at a high level. |
| Preparation | Inspectors individually look for and document defects and develop questions. |
| Inspection Meeting | Inspectors examine the product as a group to classify and record defects and to capture open issues and action items. |
| Third Hour | Optional informal meeting to resolve open issues and discuss solutions. |
| Rework | The author corrects major defects (others when cost and schedule allow). |
| Follow-up | The moderator verifies that all major and other dispositioned defects have been corrected, that no new defects have been introduced, and that all action items/open issues are closed. |

 When conducting the software peer review, incorporate the following best practices to ensure an effective outcome:

  • Defects found during inspections are never used to evaluate the author – the goal is to improve the product.
  • Use checklists relevant to each inspector’s perspective.
  • Verify that the product being reviewed meets the requirements.
  • Use readiness and completion criteria.
  • Limit the inspection meeting to 2 hours.
  • Track action items until they are resolved.
  • Collect and use inspection data (a record sketch follows this list):
    • The effort, number of participants, areas of expertise.
    • Defects - list, total, type.
    • Inspection outcome (pass/fail).
    • The item being inspected and type (requirements, code, etc.).
    • Date and time.
    • Meeting length, the preparation time of participants.
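To collect the data items above consistently from one inspection to the next, some teams define a simple record structure up front. The following Python sketch is a hypothetical example of such a record; the class and field names are assumptions chosen to mirror the list above, not a NASA-mandated format.

```python
# Hypothetical inspection record mirroring the data items listed above.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Defect:
    description: str
    defect_type: str          # e.g., "requirements", "logic", "interface"
    severity: str             # e.g., "major", "minor"

@dataclass
class InspectionRecord:
    item_inspected: str       # e.g., "Software Requirements Specification v1.2"
    item_type: str            # requirements, design, code, plan, test procedure
    date_time: datetime
    meeting_length_hours: float
    participants: List[str] = field(default_factory=list)
    areas_of_expertise: List[str] = field(default_factory=list)
    preparation_hours: float = 0.0
    defects: List[Defect] = field(default_factory=list)
    outcome: str = "pass"     # pass/fail

    def total_defects(self) -> int:
        """Total number of defects recorded for this inspection."""
        return len(self.defects)
```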

NASA-STD-8739.9, Software Formal Inspection Standard 277, includes lessons that practitioners have learned over the last decade. The Standard contains best practices for performing inspections on different work products, including recommendations for checklist contents, the minimum set of reference materials required, the perspectives needed on the inspection team, and reasonable page rates that can help in planning adequate time for the inspection. This guidance is specifically adapted for inspecting:

  • Requirements.
  • Design documents.
  • Source code.
  • Software plans.
  • Software test procedures.

The design and code segments selected for peer review or inspection should be those that are the most critical or complex, have key interfaces, or otherwise represent areas where the concerns of multiple stakeholders overlap.

The presence and participation of project management in peer review or inspection meetings are usually not recommended due to the potential negative impact on the effectiveness of the inspections. Typically, management only receives summary-level information on peer reviews/inspections. However, since the software and system project managers are often stakeholders of the work products examined (especially software plans), they may be included as inspection participants when necessary.

Both management and the inspectors must be aware that defects found during inspections are never used to evaluate the authors. Everyone involved in an inspection needs to have a vested interest in improving the product that is being inspected. This requires that everyone be willing to identify defects (including the author) and help identify potential solutions.

The Fraunhofer Center in Maryland maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in these requirements.

NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.

NASA-specific peer review and inspection guidance, checklists, worksheets, etc., are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook. 

Additional guidance related to peer reviews and inspections may be found in related topics and requirements in this Handbook.

4. Small Projects

While small projects are required to use peer reviews and/or inspection processes to evaluate key artifact types, they can make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise within the project team itself, consider whether the peer review/inspection team can leverage personnel from:

  • Areas that interface with the product being developed.
  • Related projects.
  • Other areas within the functional organization.
  • User organization.

Small teams should also determine whether quality assurance personnel from the Center can participate, for example, by providing a trained moderator to oversee the inspection logistics.

5. Resources

5.1 References


5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Deficiencies in Mission Critical Software Development for Mars Climate Orbiter (1999). Lesson Number 0740 521: Experience at NASA has also shown that a lack of software reviews can result in loss of spacecraft: "1) Non-compliance with preferred software review practices may lead to mission loss. 2) Identifying mission-critical software requires concurrent engineering and thorough review by a team of systems engineers, developers, and end-users. 3) For all mission-critical software (or software interfaces between two systems or two major organizations), systems engineers, developers, and end-users should participate in ... walk-throughs of requirements, design, and acceptance plans."
  • Ariane 5 - The Inquiry Board's Recommendations: 685
    • Review all flight software (including embedded software), and in particular:

      • Identify all implicit assumptions made by the code and its justification documents on the values of quantities provided by the equipment. Check these assumptions against the restrictions on the use of the equipment.
      • Verify the range of values taken by any internal or communication variables in the software.
      • Solutions to potential problems in the onboard computer software, paying particular attention to onboard computer switchover, shall be proposed by the Project Team and reviewed by a group of external experts, who shall report to the onboard-computer Qualification Board.
    • Include external (to the project) participants when reviewing specifications, code, and justification documents. Make sure that these reviews consider the substance of arguments, rather than check that verifications have been made.

6.2 Other Lessons Learned

  • A substantial body of data and experience justifies the use of inspections on requirements. Finding and fixing requirements problems during requirements analysis is cheaper than doing so later in the life cycle and is substantially cheaper than finding and fixing the same defects after delivering the software. Data from NASA and numerous other organizations (such as IBM, Toshiba, and the Data & Analysis Center for Software) all confirm this effect. 319
  • The effectiveness of inspections for defect detection and removal in any artifact has also been amply demonstrated. Data from numerous organizations have shown that a reasonable rule of thumb is that a well-performed inspection typically removes between 60 percent and 90 percent of the existing defects, regardless of the artifact type. 319
  • Ensure peer review participation by key stakeholders, including Systems Engineering and all affected Responsible Engineers.
  • Perform a requirements check as part of implementation or code peer reviews.

7. Software Assurance

SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures
5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:
    1. Software requirements.
    2. Software plans.
    3. According to the software development plans, any design items that the project identified for software peer review or software inspections.
    4. Software code as defined in the software and or project plans.
    5. Software test procedures.

7.1 Tasking for Software Assurance

  1. Confirm that software peer reviews are performed and reported on for project activities.

  2. Confirm that the project addresses the accepted software peer review findings.

  3. Perform peer reviews on software assurance and software safety plans.

  4. Confirm that the source code satisfies the conditions in the NPR 7150.2 requirement SWE-134, "a" through "l," based upon the software functionality for the applicable safety-critical requirements at each code inspection/review.

  5. For code peer reviews, confirm that all identified software safety-critical components have a cyclomatic complexity value of 15 or lower or develop a risk for the software safety-critical components with a cyclomatic complexity value over 15.
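As an illustration of the cyclomatic complexity check in task 5, the sketch below uses the open source radon package to flag Python functions whose cyclomatic complexity exceeds 15. It is only an example of the kind of automation a team might apply; the directory name is hypothetical, the threshold should follow the project's actual plans, and code in other languages needs a different analyzer.

```python
# Illustrative check: flag functions with cyclomatic complexity over 15.
# Assumes the open source "radon" package (pip install radon) and Python sources;
# the "safety_critical_src" directory name is hypothetical.
from pathlib import Path
from radon.complexity import cc_visit

THRESHOLD = 15

def complexity_findings(source_dir: str) -> list:
    """Return (file, function, complexity) tuples that exceed the threshold."""
    findings = []
    for path in Path(source_dir).rglob("*.py"):
        for block in cc_visit(path.read_text()):
            if block.complexity > THRESHOLD:
                findings.append((str(path), block.name, block.complexity))
    return findings

if __name__ == "__main__":
    for path, name, value in complexity_findings("safety_critical_src"):
        print(f"{path}: {name} has cyclomatic complexity {value} (> {THRESHOLD}); "
              "refactor the component or document a risk for it.")
```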

7.2 Software Assurance Products

  • SA peer review records (including findings for software assurance and software safety plans)


Objective Evidence

  • Peer review metrics, reports, data, or findings.
  • List of participants in the software peer reviews.

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry in a tracking system (e.g., a Risk Log).
  • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
  • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA reviewed or witnessed products or activities, or
  • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
    • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
    • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
  • The specific products listed in the Introduction of topic 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • # of software work product Non-Conformances identified by life-cycle phase over time
  • # of Peer Review Audits planned vs. # of Peer Review Audits performed
  • Time required to close review Non-Conformances
  • Total # of peer review Non-Conformances (Open, Closed)
  • # of Non-Conformances identified by software assurance during each peer review
  • # of Non-Conformances identified in each peer review
  • # of peer reviews performed vs. # of peer reviews planned
  • # of Non-Conformances and risks open vs. # of Non-Conformances and risks identified with test procedures
  • Software cyclomatic complexity # for safety-critical components
  • # of safety-related non-conformances identified by life-cycle phase over time
  • # of safety-related requirement issues (Open, Closed) over time
  • # of Non-Conformances from reviews (Open vs. Closed; # of days Open)
  • # of Non-Conformances (activities not being performed)
  • # of Non-Conformances accepted by the project
  • # of Non-Conformances (Open, Closed, Total)
  • Trends over time
  • % of Total Source Code for each Software Classification (*organizational measure)
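A few of the metrics above lend themselves to simple automation once peer review results are kept in a structured form. The Python sketch below is a hypothetical example that computes two of them (peer reviews performed vs. planned, and open vs. closed Non-Conformances); the record layout and field names are assumptions made for the example.

```python
# Hypothetical records and computation of two of the peer review metrics above.
from dataclasses import dataclass
from typing import List

@dataclass
class NonConformance:
    identifier: str
    status: str          # "Open" or "Closed"
    days_open: int

@dataclass
class PeerReview:
    name: str
    performed: bool
    nonconformances: List[NonConformance]

def review_metrics(planned: List[PeerReview]) -> dict:
    """Summarize reviews performed vs. planned and the Non-Conformance counts."""
    performed = [r for r in planned if r.performed]
    ncs = [nc for r in performed for nc in r.nonconformances]
    return {
        "peer_reviews_performed_vs_planned": f"{len(performed)}/{len(planned)}",
        "nonconformances_open": sum(1 for nc in ncs if nc.status == "Open"),
        "nonconformances_closed": sum(1 for nc in ncs if nc.status == "Closed"),
    }

# Example usage: two planned reviews, one performed, one Non-Conformance still open.
reviews = [
    PeerReview("SRS inspection", True,
               [NonConformance("NC-1", "Closed", 5), NonConformance("NC-2", "Open", 12)]),
    PeerReview("Test procedure inspection", False, []),
]
print(review_metrics(reviews))
```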

7.4 Guidance

Confirm that peer reviews are planned and are being executed on the items listed in SWE-087. Early in the project (e.g., before SRR or PDR), assurance confirms that peer reviews are planned for the requirements, the software plan, and the test procedures. In addition to these peer reviews, confirm that the project has considered whether other products need to be reviewed and has included a list of those in its software management/development plan.

Generally, the additional peer-reviewed items are design and code products for any software design or code intended to address requirements for critical software or for any of the areas of the code or design that are particularly complex. Software assurance may also choose other areas to peer review independently if they feel a particular area or product needs review.

Plan to attend any scheduled reviews listed in this requirement and any identified in the software plan. Other software assurance responsibilities associated with peer reviews are addressed in SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking. These include confirming that all issues and defects from the peer reviews are recorded and addressed; software assurance tracks all actions (issues, defects) from the peer reviews and verifies their closure before the reviews are closed out.

Software assurance products such as the SA Plan and the requirement assessments done by SA will also be peer-reviewed. In addition to addressing any issues and defects resulting from the SA product reviews, the software assurance team tracks its review metrics, including those listed in section 7.3 above. Software assurance confirms that the software team is collecting metrics on their peer reviews. It is also beneficial for software assurance to collect metrics on the issues and defects it finds in software peer reviews to show that its attendance was valuable.

Ensure peer review participation by key stakeholders, including Systems Engineering and all affected Responsible Engineers.

Ensure that a requirements check is performed as part of any code peer review activity and process.

Assure that all code peer reviews have verified that the code or code changes meet the software requirements.

See SWE-134 - Safety Critical Software Design Requirements for additional guidance associated with cyclomatic complexity assessments.


