- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:
a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and/or project plans.
e. Software test procedures.
1.1 Notes
Software peer reviews or software inspections are recommended best practices for all safety and mission-success related software components. Recommended best practices and guidelines for software formal inspections are contained in NASA-STD-8739.9, Software Formal Inspection Standard. 277
1.2 History
1.3 Applicability Across Classes
2. Rationale
Software peer reviews or inspections are performed to ensure product and process quality, add value, and reduce risk through expert knowledge infusion, confirmation of approach, identification of defects, and specific suggestions for product improvements.
G.20 Peer Reviews
“Peer reviews are focused, in-depth technical reviews that support the evolving design and development of a product, including critical documentation or data packages. The participants in a peer review are the technical experts and key stakeholders for the scope of the review.” 041
3. Guidance
Peer reviews are one of the most efficient and effective ways to find and remove defects.
3.1 Peer Review Defined
NASA-STD-8709.22 provides two definitions for peer reviews.
[1] A review of a software work product, following defined procedures, by peers of the product producers to identify defects and improvements.
[2] Independent evaluation by internal or external subject matter experts who do not have a vested interest in the work product under review. Peer reviews are planned, focused reviews conducted on selected work products by the producer’s peers to identify defects and issues before that work product moves into a milestone review or approval cycle. 274
Peer reviews can also be described as “planned, focused reviews by technical team peers on a single work product with the intent of identifying issues before that work product moves on to the next step. A peer review includes planning, preparing, conducting, analyzing outcomes, and identifying and implementing corrective actions.” 041
A key rationale for using software peer reviews is that few verification and validation (V&V) approaches can be applied in the early stages of software development, long before there is any code that can be run and tested. Moreover, finding and fixing defects in these early stages, rather than letting them slip into later phases, can substantially reduce cost to the project. For this reason, software peer reviews and software inspections of requirements documents are explicitly required.
6.5.5 Peer Reviews of Software Requirements
"Peer Reviews have the most impact when applied early in the life of a project, especially the requirements specification and definition stages of a project. Impact means that the defects are found earlier when it's cheaper to fix them... Peer Reviews greatly improves the communication within a project and enhances understanding of the system while scrubbing out many of the major errors and defects." 276
3.2 Advantages of Peer Reviews
3.2.1 Stakeholder Buy-in
Software plans are another critical artifact to which software peer reviews can be applied with a strong return on investment. Since well-developed and appropriate plans that have buy-in from key stakeholders are important elements of critical software success, peer reviews or inspections are applied to improve the quality of such plans.
3.2.2 Improving Quality
Software testing represents a substantial part of assuring software quality for most projects, so it is important to ensure that test cases are focused on the appropriate functional areas, cover all important usage scenarios, and specify the expected system behavior adequately and accurately. The best way to ensure these factors is via peer review/inspection, the application of human judgment, and analysis.
See also Topic 7.06 - Software Test Estimation and Testing Levels.
Code and design artifacts also benefit from peer review or inspection. However, although important, inspections of these artifacts are less crucial than those of other life cycle activities because other effective Verification and Validation (V&V) options are available for code and design (such as testing or simulation). A project needs to identify the most crucial design and code segments and deploy inspections to improve their quality. Projects also need to focus peer reviews of code and design on issues that cannot be verified using automated tools; i.e., projects need to be sure that human judgment is applied to appropriate issues and to rely on automation where it is best suited.
3.2.3 Additional Benefits from Peer Reviews
Peer reviews provide the following additional benefits:
- Useful for many types of products: documentation, requirements, designs, code.
- Simple to understand.
- Provide a way for sharing/learning good product development techniques.
- Serve to bring together human judgment and analysis from diverse stakeholders in a constructive way; this can result in a very efficient method of identifying defects early in the product’s life cycle.
- Use a straightforward, organized approach for evaluating a work product:
  - To detect potential defects in a product.
  - To methodically evaluate each defect to identify solutions and track the incorporation of these solutions into the product.
3.3 Required Peer Reviews
Projects are required to conduct peer reviews for the following documentation based on software classification:
| Software Documentation | A | B | C | D | E |
| --- | --- | --- | --- | --- | --- |
| Software Requirements | X | X | X | X (SC only) | |
| Software Plans | X | X | X | X (SC only) | |
| Software Design Identified in Plans | X | X | X | X (SC only) | |
| Software Code Identified in Plans | X | X | X | X (SC only) | |
| Test Procedures | X | X | X | X (SC only) | |
3.4 Preparing for a Peer Review
3.4.1 Peer Review Process Elements
When conducting a peer review, be sure the following are part of the process to have an effective software peer review:
- Keep the review focused on the technical integrity and quality of the product.
- Keep the review simple and informal, and manage time effectively.
- Concentrate on the review of the documentation and minimize presentations.
- Use a round-table format rather than a stand-up presentation.
- Give a full technical picture of the items being reviewed.
- Plan the review/inspection, use checklists, and include readiness and completion criteria.
- Capture action items, and monitor defects, results, and effort.
- Don't pass your opinion off as fact and don't ask judgmental questions.
- Before a software code review, run the project's static analysis tools on any source code under review (a brief sketch follows this list).
- Make sure that the code changes implement the software requirement and that the software requirement is up-to-date.
- Don't use emojis to point out issues and don't ghost people (let the reviewers know what you did with the comments).
- Automate when possible.
- Take advantage of the talent.
- Use the code reviews and inspections as a teaching opportunity.
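The item above about static analysis can be supported with a small amount of scripting. The sketch below is illustrative only, assuming a Python project and a generic analyzer command; the tool, flags, and file path are placeholders for whatever analyzers the project's plans actually designate:

```python
"""Illustrative sketch: run a project's static analysis tool over the files
under review and save the output with the peer review package. The tool,
arguments, and paths are placeholders, not a NASA-mandated toolchain."""

import subprocess
from pathlib import Path

# Placeholder: the command the project plan designates (e.g., an approved linter).
STATIC_ANALYZER = ["pylint", "--output-format=text"]

def analyze_for_review(files, report_path="peer_review_static_analysis.txt"):
    """Run the configured analyzer on the files under review and write a report."""
    existing = [f for f in files if Path(f).is_file()]
    if not existing:
        raise ValueError("No source files found for the review package.")
    result = subprocess.run(
        STATIC_ANALYZER + existing,
        capture_output=True,
        text=True,
        check=False,  # analyzers commonly return non-zero when findings exist
    )
    Path(report_path).write_text(result.stdout + result.stderr)
    return report_path, result.returncode

if __name__ == "__main__":
    # "flight_sw/attitude_control.py" is a hypothetical path for illustration.
    report, status = analyze_for_review(["flight_sw/attitude_control.py"])
    print(f"Static analysis report written to {report} (exit status {status})")
```

Attaching the analyzer report to the inspection package lets inspectors spend meeting time on issues that tools cannot judge, consistent with the guidance above.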
3.4.2 Peer Review Members and Roles
When putting a software peer review team together, use the following best practices:
- The team consists of a minimum of four inspectors.
- Diverse viewpoints and objectivity are required.
- Inspection team members are selected based on an analysis of the key stakeholders in the item under inspection.
- The author should not be the reader, recorder, or moderator.
- The moderator should not be a manager.
- At a minimum, the moderator should be formally trained in the inspection process; ideally, all participants are trained.
- Management presence/participation is discouraged.
- Each role has a specific responsibility, as shown in the table below:
| Role | Responsibility |
| --- | --- |
| Moderator | Conducts and controls the inspection |
| Author | The producer of the product under inspection, answers technical questions |
| Reader | Presents (reads, paraphrases) the inspection product to the inspection team |
| Recorder | Documents defects identified during the inspection as well as open issues and action items |
| Software Peers | Look for software and software coding defects in the product under inspection. |
| Hardware Engineer(s) | Look for defects in the product under inspection, and ensure software control of hardware is correct. |
| System Engineer(s) | Look for defects in the product under inspection, and ensure software control of the system is correct, including fault detection, fault isolation, and fault recoveries. |
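These staffing rules can be checked mechanically when assembling a team. The following sketch is a non-authoritative illustration; the role names mirror the table above, while the data layout and validation logic are assumptions rather than a prescribed format:

```python
"""Illustrative check of peer review team composition against the staffing
guidance above. The encoded rules paraphrase that guidance; adapt them to
your project's own inspection procedure."""

from dataclasses import dataclass, field

@dataclass
class ReviewTeam:
    author: str
    moderator: str
    reader: str
    recorder: str
    inspectors: list = field(default_factory=list)  # SW peers, HW and system engineers
    managers: set = field(default_factory=set)      # names of managers, if any

    def problems(self):
        issues = []
        if len(self.inspectors) < 4:
            issues.append("Fewer than four inspectors assigned.")
        if self.author in (self.reader, self.recorder, self.moderator):
            issues.append("Author also holds the reader, recorder, or moderator role.")
        if self.moderator in self.managers:
            issues.append("Moderator is a manager; choose a non-manager moderator.")
        return issues

if __name__ == "__main__":
    team = ReviewTeam(
        author="A. Author", moderator="M. Moderator",
        reader="R. Reader", recorder="C. Recorder",
        inspectors=["SW peer 1", "SW peer 2", "HW engineer", "System engineer"],
        managers={"P. Manager"},
    )
    for finding in team.problems() or ["Team composition looks consistent."]:
        print(finding)
```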
3.4.3 Entrance and Exit Criteria
Table G-19, Peer Review Entrance and Success Criteria, in the NASA Systems Engineering Processes and Requirements, NPR 7123.1, defines the entrance criteria and success criteria for a peer review activity; consult that table when planning and closing out a peer review.
3.4.4 Peer Review Process Steps
Software peer reviews are conducted using the following steps:
| Step | Description |
| --- | --- |
| Planning | Organize the inspection, the inspection package contents, required support, and the schedule. |
| Overview | Educational briefing at the time of package distribution to explain the materials at a high level. |
| Preparation | Inspectors individually look for and document defects and develop questions. |
| Inspection Meeting | Inspectors examine the product as a group to classify and record defects and capture open issues and action items. |
| Third Hour | Optional informal meeting to resolve open issues and discuss solutions. |
| Rework | The author corrects major defects (others when cost and schedule allow). |
| Follow-up | The moderator verifies that all major and other dispositioned defects have been corrected, that no new defects were introduced, and that all action items/open issues are closed. |
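One lightweight way to track where each inspection package stands is to model the steps above as an ordered workflow. The sketch below is purely illustrative; the step names come from the table, and everything else is assumed:

```python
"""Illustrative tracking of an inspection package through the steps above."""

from enum import Enum

class InspectionStep(Enum):
    PLANNING = 1
    OVERVIEW = 2
    PREPARATION = 3
    INSPECTION_MEETING = 4
    THIRD_HOUR = 5      # optional step
    REWORK = 6
    FOLLOW_UP = 7

def advance(current: InspectionStep, skip_third_hour: bool = False) -> InspectionStep:
    """Return the next step, optionally skipping the optional Third Hour."""
    nxt = InspectionStep(min(current.value + 1, InspectionStep.FOLLOW_UP.value))
    if nxt is InspectionStep.THIRD_HOUR and skip_third_hour:
        nxt = InspectionStep.REWORK
    return nxt

if __name__ == "__main__":
    step = InspectionStep.PLANNING
    while step is not InspectionStep.FOLLOW_UP:
        print(f"Current step: {step.name}")
        step = advance(step, skip_third_hour=True)
    print(f"Current step: {step.name}")
```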
3.4.5 Peer Review Best Practices
When conducting the software peer review, incorporate the following best practices to ensure an effective outcome:
- Defects found during inspections are never used to evaluate the author – the goal is to improve the product.
- Use checklists relevant to each inspector’s perspective.
- Verify that the product being reviewed meets the requirements.
- Use readiness and completion criteria.
- Limit inspection meeting to 2 hours.
- Track action items until they are resolved.
- Collect and use inspection data:
- The effort, number of participants, and areas of expertise.
- Defects - list, total, type.
- Inspection outcome (pass/fail).
- The item being inspected and type (requirements, code, etc.).
- Date and time.
- Meeting length, and the preparation time of participants.
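The inspection data items listed above can be captured in a simple record. The sketch below is one assumed layout, not a prescribed NASA data format; the field names simply mirror the bullets above:

```python
"""Illustrative inspection record capturing the data items listed above."""

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Defect:
    description: str
    category: str          # e.g., "logic", "interface", "data"
    severity: str          # e.g., "major" or "minor"

@dataclass
class InspectionRecord:
    item_inspected: str            # the work product examined
    item_type: str                 # requirements, design, code, test procedure
    date_time: datetime
    participants: list             # names and areas of expertise
    preparation_hours: float       # total individual preparation time
    meeting_hours: float
    outcome: str                   # "pass" or "fail"
    defects: list = field(default_factory=list)

    def defect_totals_by_category(self) -> dict:
        """Total defects per category, one of the basic inspection measures."""
        totals: dict = {}
        for d in self.defects:
            totals[d.category] = totals.get(d.category, 0) + 1
        return totals

if __name__ == "__main__":
    # All values below are invented for illustration.
    record = InspectionRecord(
        item_inspected="Example requirements document",
        item_type="requirements",
        date_time=datetime(2024, 1, 15, 9, 0),
        participants=["SW peer", "System engineer", "SA engineer"],
        preparation_hours=6.0,
        meeting_hours=1.5,
        outcome="pass",
        defects=[Defect("Ambiguous timing requirement", "requirements clarity", "major")],
    )
    print(record.defect_totals_by_category())
```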
NASA-STD-8739.9, Software Formal Inspection Standard 277 includes lessons that practitioners have learned over the last decade. The Standard contains best practices related to performing inspections on different work products, including recommendations for checklist contents, the minimum set of reference materials required, the perspectives needed on the inspection team, and reasonable page rates that help plan adequate time for the inspection, specifically adapted for inspections of:
- Requirements.
- Design documents.
- Source code.
- Software plans.
- Software test procedures.
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.
See also 5.03 - Inspect - Software Inspection, Peer Reviews, Inspections.
See SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking for additional checklists.
Some best practices related to performing peer reviews on different work products (one way to keep such checklists in a lightweight, machine-readable form is sketched after this list):
- Checklists for system requirement inspections should contain items that:
- Describe the proper allocation of functions to software, firmware, hardware, and operations.
- Address the validation of all external user interfaces.
- Check that all the software system functions are identified and broken into configuration items and that the boundary between components is well-defined.
- Check that all configuration items within the software system are identified.
- Check that the identified configuration items provide all functions required of them.
- Check that all interfaces between configuration items within the software system are identified.
- Address the correctness of the software system structure.
- Check that all quantifiable requirements and requirement attributes have been specified.
- Address the verifiability of the requirements.
- Check for the traceability of requirements from mission needs (e.g., use cases, etc.).
- Check for the traceability of requirements from system safety and reliability analyses (e.g., Preliminary Hazard Analysis (PHA), Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), hazard reports, etc.).
See also Topic 8.05 - SW Failure Modes and Effects Analysis.
- Check that the software requirements specification is complete and accurate for each of the following:
- Software functions.
- Input and output parameters.
- States and modes.
- Timing and sizing requirements for performance.
- Interfaces.
- Use Cases if available.
- Check that specifications are included for error detection and recovery, reliability, maintainability, performance, safety, and accuracy.
- Check that safety-critical modes and states, and any safety-related constraints, are identified.
- Address the traceability of requirements from higher-level documents.
- Check that the requirements provide a sufficient base for the software design.
- Check that the requirements are measurable, consistent, complete, clear, concise, and testable.
- Check that the content of the software requirement specification fulfills the NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.
- Checklists for architectural (preliminary) design should contain items that:
- Check that the design meets approved requirements.
- Address the validation of all interfaces among modules within each component.
- Address the completeness of the list of modules and the general function(s) of each module.
- Address the validation of fault detection, identification, and recovery requirements.
- Check that the component structure meets the requirements.
- Address the validation of the selection of reusable components.
- Address the traceability of the design to the approved requirements.
- Address the validation of the input and output interfaces.
- Check that each design decision is a good match to the system’s goal.
- Check that the content of the design description fulfills the NPR 7150.2 recommendation, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.
- Check that safety controls and mitigations are identified in the design document when a safety-critical system is under inspection (Review system safety analyses in supporting documentation).
- When inspecting object-oriented or other design models:
- Check that the notations used in the diagram comply with the agreed-upon model standard notation (e.g., UML notations).
- Check that the design is modular.
- Check that the cohesion and coupling of the models are appropriate.
- Check that architectural styles and design patterns are used where possible. If design patterns are applied, validate that the selected design pattern is suitable.
- Check the output of any self or external static analysis tool outputs.
- Checklists for detailed design should contain items that:
- Check that the design meets the approved requirements.
- Address the validation of the choice of data structures, logic algorithms (when specified), and relationships among modules.
- Check that the detailed design is complete for each module.
- Address the traceability of the design to the approved requirements.
- Check that the detailed design meets the requirements and is traceable to the architectural software system design.
- Check that the detailed design is testable.
- Check that the design can be successfully implemented within the constraints of the selected architecture.
- Check the output from any static analysis tools available.
- Checklists for source code should contain items that:
- Address the technical accuracy and completeness of the code concerning the requirements.
- Check that the code implements the detailed design.
- Check that all required standards (including coding standards) are satisfied.
- Check that latent errors are not present in the code, including errors such as index out-of-range errors, buffer overflow errors, or divide-by-zero errors.
- Address the traceability of the code to the approved requirements.
- Address the traceability of the code to the detailed design.
- When static or dynamic code analysis is available, check the results of these tools.
- Checklists for the test plan should contain items that:
- Check that the purpose and objectives of testing are identified in the test plan and they contribute to the satisfaction of the mission objectives.
- Check that all new and modified software functions will be verified to operate correctly within the intended environment and according to approved requirements.
- Check that the resources and environments needed to verify software functions and requirements correctly are identified.
- Check that all new and modified interfaces will be verified.
- Address the identification and elimination of extraneous or obsolete test plans.
- Check that each requirement will be tested.
- Check that the tester has determined the expected results before executing the test(s).
- For safety-critical software systems:
- Check that all software safety-critical functions or hazard controls and mitigations will be tested. This testing should include ensuring that the system will enter a safe state when unexpected anomalies occur.
- Check that safety and reliability analyses have been used to determine which failures and failure combinations to test for.
- Check that the content of the test plan fulfills NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.
- Checklists for test procedures should contain items that:
- Check that the set of test procedures meets the objective of the test plan.
- Check that each test procedure provides:
- A complete and accurate description of its purpose
- A description of how it executes
- All expected results.
- Check that each test procedure identifies which requirement(s) it is testing and correctly tests the listed requirement(s).
- Check that each test procedure identifies the required hardware and software configurations.
- Check that test procedures exist to verify the correctness of the safety-critical controls as well as any software controls or mitigations of hazards (HW, SW, or CPLD) and that the system can reach a safe state from different modes, states, and conditions.
- Check that each test procedure will objectively verify the implementation of the requirement with the expected outcome.
- Check that the content of the software test procedure fulfills NPR 7150.2 recommendations, found in NASA-HDBK-2203A, NASA Software Engineering Handbook.
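As noted above, some projects keep such checklists in a lightweight, machine-readable form so that completion can be tracked across inspections. The sketch below paraphrases a few entries from the source code checklist; the structure itself is an assumption, not a required format:

```python
"""Illustrative, machine-readable inspection checklist. The items paraphrase a
few entries from the source code checklist above; the layout is an assumption."""

SOURCE_CODE_CHECKLIST = [
    "Code is technically accurate and complete with respect to the requirements.",
    "Code implements the detailed design.",
    "All required standards, including coding standards, are satisfied.",
    "No latent errors (index out of range, buffer overflow, divide by zero).",
    "Code traces to the approved requirements and to the detailed design.",
    "Static/dynamic analysis results have been reviewed.",
]

def record_checklist(responses):
    """Pair each checklist item with the inspector's yes/no finding."""
    if len(responses) != len(SOURCE_CODE_CHECKLIST):
        raise ValueError("One response per checklist item is required.")
    return list(zip(SOURCE_CODE_CHECKLIST, responses))

if __name__ == "__main__":
    findings = record_checklist([True, True, False, True, True, True])
    for item, passed in findings:
        print(("PASS " if passed else "FAIL ") + item)
```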
The design and code segments selected for peer review or inspection should be the most critical, complex, have key interfaces, or otherwise represent areas where the concerns of multiple stakeholders overlap.
The presence and participation of project management in peer review or inspection meetings are usually not recommended due to the potential negative impact on the effectiveness of the inspections. Typically, management only receives summary-level information on peer reviews/inspections. However, since the project managers for both the software and the system are often stakeholders in the work products examined (especially in the context of software plans), they may be included as participants in the inspections only when necessary.
Both management and the inspectors must be aware that defects found during inspections are never used to evaluate the authors. Everyone involved in an inspection needs to have a vested interest in improving the product that is being inspected. This requires that everyone be willing to identify defects (including the author) and help identify potential solutions. See also SWE-089 - Software Peer Reviews and Inspections - Basic Measurements.
3.4.6 Other Guidance
The Fraunhofer Center in Maryland maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in these requirements. 421
3.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.6 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
While small projects are required to use peer reviews and inspection processes to evaluate key artifact types, they could make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise from within the project team itself, consider whether the peer review team can leverage personnel from:
- Areas that interface with the product being developed.
- Related projects.
- Other areas within the functional organization.
- User organization.
Small teams should also determine whether quality assurance personnel from the Center can participate, for example, by providing a trained moderator to oversee the inspection logistics.
5. Resources
5.1 References
- (SWEREF-041) NPR 7123.1D, Office of the Chief Engineer, Effective Date: July 05, 2023, Expiration Date: July 05, 2028
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-274) NASA-HDBK-8709.22. Baseline, 2018-03-08, Change 5: 2019-05-14
- (SWEREF-276) NASA-GB-8719.13, NASA, 2004. Access NASA-GB-8719.13 directly: https://swehb.nasa.gov/download/attachments/16450020/nasa-gb-871913.pdf?api=v2
- (SWEREF-277) NASA-STD-8739.9, NASA Office of Safety and Mission Assurance, 2013. Change Date: 2016-10-07, Change Number: 1
- (SWEREF-319) Shull, F., Basili, V. R., Boehm, B., Brown, A. W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., and Zelkowitz, M. V., Proc. IEEE International Symposium on Software Metrics (METRICS02), pp. 249-258. Ottawa, Canada, June 2002.
- (SWEREF-421) "A collection of checklists for different purposes," Fraunhofer USA, The University of Maryland. This web page contains several resources for performing software inspections.
- (SWEREF-521) Public Lessons Learned Entry: 740.
- (SWEREF-685) Ariane 5 Inquiry Board - European Space Agency
5.2 Tools
NASA users can find tool information in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
- Deficiencies in Mission Critical Software Development for Mars Climate Orbiter (1999). Lesson Number 0740 521: Experience at NASA has also shown that a lack of software reviews can result in the loss of spacecraft: "1) Non-compliance with preferred software review practices may lead to mission loss. 2) Identifying mission-critical software requires concurrent engineering and thorough review by a team of systems engineers, developers, and end-users. 3) For all mission-critical software (or software interfaces between two systems or two major organizations), systems engineers, developers, and end-users should participate in ... walk-throughs of requirements, design, and acceptance plans."
- Ariane 5 - The Inquiry Board's Recommendations: 685
Review all flight software (including embedded software), and in particular:
- Identify all implicit assumptions made by the code and its justification documents on the values of quantities provided by the equipment. Check these assumptions against the restrictions on the use of the equipment.
- Verify the range of values taken by any internal or communication variables in the software.
- Solutions to potential problems in the onboard computer software, paying particular attention to onboard computer switchover, shall be proposed by the Project Team and reviewed by a group of external experts, who shall report to the onboard-computer Qualification Board.
Include external (to the project) participants when reviewing specifications, code, and justification documents. Make sure that these reviews consider the substance of arguments, rather than check that verifications have been made.
6.2 Other Lessons Learned
- A substantial body of data and experience justifies the use of inspections on requirements. Finding and fixing requirements problems during requirements analysis is cheaper than doing so later in the life cycle and is substantially cheaper than finding and fixing the same defects after delivering the software. Data from NASA and numerous other organizations (such as IBM, Toshiba, and the Defense Analysis Center for Software) all confirm this effect. 319
- The effectiveness of inspections for defect detection and removal in any artifact has also been amply demonstrated. Data from numerous organizations have shown that a reasonable rule of thumb is that a well-performed inspection typically removes between 60 percent and 90 percent of the existing defects, regardless of the artifact type. 319
- Ensure peer review participation by key stakeholders including Systems Engineering, and all affected Responsible Engineers.
- Perform requirements checks as part of Implementation or code peer reviews.
7. Software Assurance
5.3.2 The project manager shall perform and report the results of software peer reviews or software inspections for:
a. Software requirements.
b. Software plans, including cybersecurity.
c. Any design items that the project identified for software peer review or software inspections according to the software development plans.
d. Software code as defined in the software and/or project plans.
e. Software test procedures.
7.1 Tasking for Software Assurance
1. Confirm that software peer reviews are performed and reported on for project activities.
2. Confirm that the project addresses the accepted software peer review findings.
3. Perform peer reviews on software assurance and software safety plans.
4. Confirm that the source code satisfies the conditions in the NPR 7150.2 requirement SWE-134, "a" through "l," based upon the software functionality for the applicable safety-critical requirements at each code inspection/review.
7.2 Software Assurance Products
- SA peer review records (including findings for software assurance and software safety plans)
Objective Evidence
- Peer review metrics, reports, data, or findings
- List of participants in the software peer reviews
7.3 Metrics
- # of Non-Conformances identified in each peer review
- # of peer reviews performed vs. # of peer reviews planned
- # of software work product Non-Conformances identified by life cycle phase over time
- Time required to close review Non-Conformances
- Total # of peer review Non-Conformances (Open, Closed)
- # of Non-Conformances identified by software assurance during each peer review
- # of Non-Conformances and risks open vs. # of Non-Conformances, risks identified with test procedures
- # of safety-related non-conformances identified by life cycle phase over time
- # of safety-related requirement issues (Open, Closed) over time
- # of Non-Conformances (activities not being performed)
- # of Non-Conformances accepted by the project
- # of Non-Conformances (Open, Closed, Total)
- Trends of Open vs. Closed Non-Conformances over time
- % of Total Source Code for each Software Classification (*organizational measure)
See also Topic 8.18 - SA Suggested Metrics.
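As an informal illustration of how a few of these metrics might be tallied from peer review records (the record layout below is an assumption, not a prescribed software assurance data format):

```python
"""Illustrative tally of a few software assurance peer review metrics."""

def summarize(nonconformances, reviews_planned, reviews_performed):
    """nonconformances: list of dicts with 'status' ('open'/'closed') and 'phase'."""
    open_count = sum(1 for nc in nonconformances if nc["status"] == "open")
    closed_count = sum(1 for nc in nonconformances if nc["status"] == "closed")
    by_phase = {}
    for nc in nonconformances:
        by_phase[nc["phase"]] = by_phase.get(nc["phase"], 0) + 1
    return {
        "peer reviews performed vs. planned": f"{reviews_performed}/{reviews_planned}",
        "non-conformances (open, closed, total)": (open_count, closed_count, len(nonconformances)),
        "non-conformances by life cycle phase": by_phase,
    }

if __name__ == "__main__":
    # Invented sample data for illustration only.
    sample = [
        {"status": "closed", "phase": "requirements"},
        {"status": "open", "phase": "design"},
        {"status": "open", "phase": "code"},
    ]
    for name, value in summarize(sample, reviews_planned=5, reviews_performed=4).items():
        print(f"{name}: {value}")
```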
7.4 Guidance
Confirm that peer reviews are planned and are being executed on the items listed in SWE-087. Early in the project (e.g., before SRR or PDR), assurance confirms that peer reviews are planned for the requirements, the software plan, and the test procedures. In addition to these peer reviews, confirm that the project has considered whether other products need to be reviewed and has included a list of those in its software management/development plan.
Generally, the additional peer-reviewed items are design and code products for any software design or code intended to address requirements for critical software or for any of the areas of the code or design that are particularly complex. Software assurance may also choose other areas to peer review independently if they feel a particular area or product needs review.
Plan to attend any scheduled reviews listed in this requirement and any identified in the software plan. Other software assurance responsibilities associated with peer reviews are addressed in SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking. This includes confirming that all issues and defects from the peer reviews are recorded and addressed before the review is closed out. Software assurance tracks all actions (issues, defects) from the peer reviews and verifies their closure before the reviews are closed. See also Topic 7.10 - Peer Review and Inspections Including Checklists.
Software assurance products such as the SA Plan and the requirement assessments done by SA will also be peer-reviewed. In addition to addressing any issues or defects resulting from the SA product reviews, the software assurance team tracks its review metrics, including those listed in section 7.3 above. Software assurance confirms that the software team is collecting metrics on their peer reviews. It is also beneficial for software assurance to collect metrics on the issues and defects found by software assurance in software peer reviews to show that their attendance was valuable.
Ensure peer review participation by key stakeholders including Systems Engineering, and all affected Responsible Engineers.
Ensure that peer reviews perform requirements checks as part of any code peer review activity and process.
Software Assurance personnel should verify compliance of the performed peer review to the procedures by:
- Selectively reviewing peer review packages for required materials and personnel participation.
- Participating in peer reviews, including fulfillment of any of the inspection roles.
- Providing an independent evaluation of the effectiveness of the inspection process and the product quality.
Software Assurance personnel will also:
- Ensure compliance with requirements defined in NPR 7150.2 and NASA-STD-8739.8.
- Ensure that preventive and safety measures are being implemented.
- Verify that requirements include error detection and recovery methods.
- Validate fault detection, identification, mitigation, and recovery requirements.
The following is an example of an error taxonomy for code-related defects; an invented code fragment after the list illustrates a few of these categories:
- Algorithm or method: An error in the sequence or set of steps used to solve a particular problem or computation, including mistakes in computations, incorrect implementation of algorithms, or calls to an inappropriate function for the algorithm being implemented.
- Assignment or initialization: A variable or data item that is assigned a value incorrectly or is not initialized properly or where the initialization scenario is mishandled (e.g., incorrect publish or subscribe, incorrect opening of the file, etc.)
- Checking: Software contains inadequate checking for potential error conditions or an inappropriate response is specified for error conditions.
- Data: Error in specifying or manipulating data items, incorrectly defined data structure, pointer or memory allocation errors, or incorrect type conversions.
- External interface: Errors in the user interface (including usability problems) or interfaces with other systems.
- Internal interface: Errors in the interfaces between system components, including mismatched calling sequences and incorrect opening, reading, writing, or closing of files and databases.
- Logic: Incorrect logical conditions on if, case, or loop blocks, including incorrect boundary conditions ("off by one" errors are an example) being applied, or incorrect expression (e.g., incorrect use of parentheses in a mathematical expression).
- Non-functional defects: Includes non-compliance with standards, failure to meet non-functional requirements such as portability and performance constraints, and lack of clarity of the design or code to the reader - both in the comments and the code itself.
- Timing or optimization: Errors that will cause timing (e.g., potential race conditions) or performance problems (e.g., unnecessarily slow implementation of an algorithm).
- Coding standard violation: The code does not comply with the applicable coding standards; reviewers verify the code against the project's coding standards.
- Other: Anything that does not fit any of the above categories that is logged during an inspection of a design artifact or source code.
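To make a few of these categories concrete, the invented fragment below shows the kind of small defects an inspection (or a static analysis run beforehand) is meant to catch; the functions and values are illustrative only:

```python
"""Invented fragment illustrating a few of the taxonomy categories above."""

def average_of_first_n(values, n):
    # Checking defect: no guard for n == 0 or n > len(values), so this can
    # raise ZeroDivisionError or IndexError instead of handling the error case.
    total = 0
    for i in range(n + 1):      # Logic defect: off-by-one (should be range(n))
        total += values[i]
    return total / n

def average_of_first_n_fixed(values, n):
    """Corrected version an inspection would drive toward."""
    if n <= 0 or n > len(values):
        raise ValueError("n must be between 1 and len(values)")  # adequate checking
    return sum(values[:n]) / n

if __name__ == "__main__":
    data = [1.0, 2.0, 3.0, 4.0]
    print(average_of_first_n_fixed(data, 2))   # prints 1.5
```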
Checklists for software peer reviews should contain items that:
- Address that the effort, schedule, and cost estimates for each activity (e.g., development, configuration management, maintenance, assurance, safety, security) are reasonable.
- Address the allocations of appropriate resources, including tools and personnel with the needed skills and knowledge.
- Check that all risks have been identified and documented along with their probability of occurrence, impact, and mitigation strategy.
- Assure sufficient management of the produced project data.
- Check that an appropriate and feasible plan exists for sustainment and retirement of the software, if applicable.
- Check that the plan fulfills the corresponding NPR 7150.2 recommended contents.
Assure that all code peer reviews have verified that the code or code changes meet the software requirements.
See SWE-134 - Safety-Critical Software Design Requirements for additional guidance associated with cyclomatic complexity assessments.
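As one hedged illustration of screening code for complexity before an inspection, the open source radon package (an assumption here, not a NASA-endorsed tool) can report per-function cyclomatic complexity; substitute whatever analyzer the project's plans actually name:

```python
"""Illustrative pre-review complexity screen using the open source 'radon'
package (assumed available); the threshold and file path are placeholders."""

from pathlib import Path
from radon.complexity import cc_visit

COMPLEXITY_THRESHOLD = 10  # illustrative threshold, not a NASA-mandated limit

def flag_complex_functions(path):
    """Return (name, line, complexity) for functions above the threshold."""
    source = Path(path).read_text()
    return [
        (block.name, block.lineno, block.complexity)
        for block in cc_visit(source)
        if block.complexity > COMPLEXITY_THRESHOLD
    ]

if __name__ == "__main__":
    # "flight_sw/attitude_control.py" is a hypothetical path for illustration.
    for name, line, score in flag_complex_functions("flight_sw/attitude_control.py"):
        print(f"{name} (line {line}): cyclomatic complexity {score}")
```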
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: