SWE-087 - Software Peer Reviews and Inspections for Requirements, Test Plans, Design, and Code

1. Requirements

4.3.1 The project shall perform and report on software peer reviews/inspections for:

a. Software requirements.
b. Software Test Plan.
c. Any design items that the project identified for software peer review/inspections according to the software development plans.
d. Software code as defined in the software and/or project plans.

1.1 Notes

Software peer reviews/inspections are a recommended best practice for all safety- and mission-success-related design and code software components. Guidelines for software peer reviews/inspections are contained in NASA-STD-2202-93, NASA Software Formal Inspection Standard.

1.2 Applicability Across Classes

Classes D through E, safety-critical, are labeled "SO" (safety only), meaning that the requirement applies to the safety-critical aspects of the software. Class G is labeled "P (Center)," meaning that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement. Class F is labeled "X (not OTS)," meaning that this requirement does not apply to off-the-shelf software for that class.

Class: A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H
Applicable? X X X P(C)

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X = Applicable (with details; read above) | P(C) = P (Center), follow Center requirements or procedures | (no mark) = Not Applicable

2. Rationale

A key rationale for using software peer reviews/inspections is that they are one of the few verification and validation (V&V) approaches that can be applied in the early stages of software development, long before there is any code that can be run and tested. Moreover, when defects are found and fixed in these early stages rather than allowed to slip into later phases, the savings to the project budget can be substantial. For this reason, software peer reviews/inspections of requirements documents are explicitly required.

Test plans are another key artifact on which software peer reviews/inspections offer one of the best returns on investment. Software testing represents a substantial part of the effort of assuring software quality for most projects, so it is important to ensure that test cases focus on the appropriate functional areas, cover all important usage scenarios, and specify the expected system behavior adequately and accurately. The best way to ensure these factors is via peer review/inspection: the application of human judgment and analysis.

Code and design artifacts also benefit from peer review/inspection. Although important, inspections of these artifacts are less crucial than those of earlier life cycle products, because other effective V&V options (such as testing or simulation) are available for code and design. It is important for a project to identify the most crucial design and code segments and deploy inspections on them to improve their quality. Projects also need to focus inspections of code and design on issues that cannot be verified using automated tools; that is, projects need to be sure that human judgment is applied to the appropriate issues and to rely on automation where it is best suited.
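To make the budget rationale concrete, the toy calculation below applies relative cost-to-fix multipliers to a batch of requirements defects. The specific phase names and multipliers are assumptions for illustration, drawn from commonly cited industry rules of thumb, not figures from this handbook or NPR 7150.2.

```python
# Illustrative only: assumed relative cost to fix one defect, by the phase
# in which it is found. These multipliers are commonly cited rules of
# thumb, not values from this handbook.
RELATIVE_COST_TO_FIX = {
    "requirements": 1,   # caught by a requirements peer review/inspection
    "design": 5,
    "code": 10,
    "test": 50,
    "operations": 100,   # escaped into the delivered system
}

def relative_fix_cost(defect_count: int, phase_found: str) -> int:
    """Total relative cost of fixing defect_count defects in phase_found."""
    return defect_count * RELATIVE_COST_TO_FIX[phase_found]

# 20 requirements defects: caught at inspection vs. escaping to system test.
print(relative_fix_cost(20, "requirements"))  # 20 cost units
print(relative_fix_cost(20, "test"))          # 1000 cost units, a 50x penalty
```

Under these assumed multipliers, the same 20 defects cost 50 times more to fix when they slip from the requirements inspection all the way to system test, which is why early-phase inspections of requirements and test plans pay off so strongly.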
3. Guidance

NASA-STD-2202-93, Software Formal Inspection Standard, is currently being updated and revised to include lessons that have been learned by practitioners over the last decade. Contained in the Standard are best practices related to performing inspections on different work products, including recommendations for checklist contents, the minimum set of reference materials required, the perspectives needed on the inspection team, and reasonable page rates that can help plan adequate time for the inspection, each adapted to the particular work product being inspected.

The design and code segments selected for peer review/inspection need to be those that are the most critical or complex, that have key interfaces, or that otherwise represent areas where the concerns of multiple stakeholders overlap.

The presence and participation of project management in peer review/inspection meetings is usually not recommended because of the potential negative impact on the effectiveness of the inspections. Typically, management receives only summary-level information on peer reviews/inspections. Both management and the inspectors need to be aware that defects found during inspections are never to be used for evaluating the authors. Everyone involved in an inspection needs to have a vested interest in improving the product being inspected. This requires that everyone, including the author, be willing to identify defects and to help identify potential solutions.

The Fraunhofer Center in Maryland [421] maintains a public website that collects checklists from NASA and other contexts, which can be applied to the types of work products mentioned in this requirement.

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.

Additional guidance related to peer reviews and inspections may be found in the following related requirements in this handbook:

Software Peer Reviews and Inspections - Checklist Criteria and Tracking
Software Peer Reviews and Inspections - Basic Measurements
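The page rates recommended by the Standard can be turned into a quick scheduling estimate. A minimal sketch follows; the page rates and the two-hour meeting cap below are placeholder assumptions for illustration, not values taken from NASA-STD-2202-93. Substitute the rates the Standard recommends for each work product.

```python
import math

# Hypothetical planning sketch: estimate how many inspection meetings to
# schedule from a document's size and a target page rate. The rates below
# are placeholder assumptions, NOT values from NASA-STD-2202-93.
ASSUMED_PAGE_RATE_PER_HOUR = {
    "requirements": 10,  # pages covered per meeting hour
    "design": 15,
    "test_plan": 12,
}
MAX_MEETING_HOURS = 2.0  # assumed cap so meetings stay effective

def plan_inspection(work_product: str, pages: int) -> tuple[float, int]:
    """Return (total meeting hours, number of sessions) for an inspection."""
    rate = ASSUMED_PAGE_RATE_PER_HOUR[work_product]
    total_hours = pages / rate
    sessions = math.ceil(total_hours / MAX_MEETING_HOURS)
    return total_hours, sessions

hours, sessions = plan_inspection("requirements", 60)
print(f"60-page SRS at 10 pages/hour: {hours:.1f} hours over {sessions} sessions")
# -> 6.0 hours over 3 sessions
```

The point of the calculation is simply that rushing past the recommended page rate is how inspections lose their defect-finding power; plan the number of sessions from the rate, not the other way around.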
4. Small Projects

While small projects are required to use peer review and/or inspection processes to evaluate key artifact types, they can make the task more manageable by varying the size of the inspection team, as long as key stakeholders are still represented. When it isn't possible to find all of the needed expertise within the project team itself, consider whether the peer review/inspection team can leverage personnel from outside the project. Small teams can also determine whether quality assurance personnel from the Center participate, for example by providing a trained moderator to oversee the inspection logistics.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users can find this in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

A substantial body of data and experience justifies the use of inspections on requirements. Finding and fixing requirements problems during requirements analysis is cheaper than doing so later in the life cycle, and substantially cheaper than finding and fixing the same defects after delivery of the software. Data from NASA as well as from numerous other organizations (such as IBM, Toshiba, and the Defense Analysis Center for Software) confirm this effect. [319]

The effectiveness of inspections for defect detection and removal in any artifact has also been amply demonstrated. Data from numerous organizations show that, as a reasonable rule of thumb, a well-performed inspection typically removes between 60 and 90 percent of the existing defects, regardless of the artifact type. [319]

Experience at NASA has also shown that a lack of software reviews can result in the loss of spacecraft. A documented lesson from the NASA Lessons Learned database, Deficiencies in Mission Critical Software Development for Mars Climate Orbiter (1999), Lesson Number 0740, notes the following: "1) Non-compliance with preferred software review practices may lead to mission loss. 2) To identify mission critical software, require concurrent engineering and thorough review by a team of systems engineers, developers, and end users. 3) For all mission critical software (or software interfaces between two systems or two major organizations), systems engineers, developers, and end users should participate in ... walk-throughs of requirements, design, and acceptance plans." [521]
As documented in NASA-GB-8719.13, NASA Software Safety Guidebook: "Formal Inspections have the most impact when applied early in the life of a project, especially the requirements specification and definition stages of a project. Impact means that the defects are found earlier, when it's cheaper to fix them. ... Formal Inspection greatly improves the communication within a project and enhances understanding of the system while scrubbing out many of the major errors and defects."
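A project can check the 60 to 90 percent rule of thumb above against its own peer review records. A minimal sketch follows, assuming the project simply counts defects found by inspection versus defects in the same artifact that escaped to later phases; the record format here is invented for illustration, not a NASA-defined format.

```python
from dataclasses import dataclass

@dataclass
class InspectionRecord:
    """Minimal, illustrative record of one peer review/inspection.
    These fields are assumptions for this sketch, not a NASA-defined format."""
    artifact: str          # e.g., "SRS section 3.2"
    defects_found: int     # defects found and fixed during the inspection
    defects_escaped: int   # defects in the same artifact found later

def removal_efficiency(records: list[InspectionRecord]) -> float:
    """Fraction of known defects removed by inspection. Well-performed
    inspections typically land in the 0.6-0.9 range."""
    found = sum(r.defects_found for r in records)
    escaped = sum(r.defects_escaped for r in records)
    total = found + escaped
    return found / total if total else 0.0

records = [
    InspectionRecord("SRS section 3.2", defects_found=14, defects_escaped=3),
    InspectionRecord("Test plan rev A", defects_found=9, defects_escaped=4),
]
print(f"inspection defect removal efficiency: {removal_efficiency(records):.0%}")
# -> 77%, within the 60-90 percent rule of thumb
```

Tracking even this small amount of data per inspection, as the Basic Measurements related requirement suggests, lets a project see whether its inspections are performing in the expected range or need attention (for example, page rates that are too fast or missing stakeholder perspectives).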