SWE-089 - Software Peer Reviews and Inspections - Basic Measurements
1. Requirements

4.3.4 The project shall, for each planned software peer review/inspection, record basic measurements.
1.1 Notes

The requirement describing the contents of a Software Peer Review/Inspection Report is defined in Chapter 5 [of NPR 7150.2, NASA Software Engineering Requirements, Section 5.3.3].
1.2 Applicability Across Classes

Class F is labeled with "X (not OTS)." This means that this requirement does not apply to off-the-shelf (OTS) software for this class. Class G is labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class        A_SC  A_NSC  B_SC  B_NSC  C_SC  C_NSC  D_SC  D_NSC  E_SC  E_NSC  F            G     H
Applicable?                                                                   X (not OTS)  P(C)

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable with details, read above for more | P(C) - P (Center), follow Center requirements or procedures
2. Rationale

As with other engineering practices, it is important to monitor defects, pass/fail results, and effort. Doing so ensures that peer reviews/inspections are used appropriately within the overall software development life cycle and allows the process itself to be improved over time. Moreover, key measurements are required to interpret inspection results correctly. For example, if very little effort is expended on an inspection, or key phases (such as individual preparation) are skipped altogether, it is very unlikely that the inspection will have found a majority of the existing defects.
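The kind of plausibility check these measurements enable can be sketched in a few lines. The sketch below is illustrative only: the function name, its parameters, and the one-hour preparation threshold are assumptions, not values taken from NPR 7150.2 or the inspection standard.

```python
# Illustrative sketch only: the parameter names and the minimum-preparation
# threshold are assumptions, not values prescribed by NPR 7150.2.

def inspection_result_is_credible(prep_hours_per_reviewer: float,
                                  meeting_held: bool,
                                  min_prep_hours: float = 1.0) -> bool:
    """Flag inspections whose recorded effort is too low to trust.

    If reviewers logged almost no individual preparation time, or the
    inspection meeting was skipped, the reported defect counts are
    unlikely to reflect a majority of the defects actually present.
    """
    return meeting_held and prep_hours_per_reviewer >= min_prep_hours

# An inspection with only 15 minutes of preparation per reviewer should be
# treated as inconclusive rather than as evidence of artifact quality.
print(inspection_result_is_credible(0.25, meeting_held=True))  # False
print(inspection_result_is_credible(2.0, meeting_held=True))   # True
```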
3. Guidance

NASA-STD-2202-93, Software Formal Inspection Standard, is currently being updated and revised to include lessons that practitioners have learned over the last decade. Its authors suggest several best practices related to the collection and use of inspection data.

This requirement, along with SWE-119, calls for collecting effort, number of participants, number and types of defects found, pass/fail results, and identification information, in order to ensure the effectiveness of the inspection; a minimal record structure for these measurements is sketched at the end of this section.

Where peer reviews/inspections yield less than expected results, some questions to address may include:

As with other forms of software measurement, best practices for ensuring that the collection and analysis of peer review/inspection metrics are done well include:

In an acquisition context, there are several important considerations for assuring proper inspection usage by software provider(s):

Additional guidance regarding software peer review/inspection measures can be found in the guidebook section for SWE-119. Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources, such as templates, related to peer reviews and inspections.
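As one way to picture these measurements gathered in a single record, the minimal sketch below uses hypothetical field names covering identification, effort, participants, defects by type, and pass/fail disposition; neither SWE-089 nor SWE-119 prescribes this particular schema.

```python
# Hypothetical record for the basic measurements named in SWE-089/SWE-119.
# Field names are illustrative, not a schema mandated by NPR 7150.2.
from dataclasses import dataclass, field

@dataclass
class InspectionRecord:
    artifact_id: str                  # identification of the inspected item
    inspection_date: str              # ISO date, e.g., "2009-04-17"
    participants: int                 # number of inspectors
    effort_hours: float               # total effort across all phases
    prep_hours: float                 # individual preparation effort
    defects_by_type: dict[str, int] = field(default_factory=dict)
    passed: bool = False              # pass/fail disposition

    @property
    def total_defects(self) -> int:
        return sum(self.defects_by_type.values())

record = InspectionRecord(
    artifact_id="SRS rev B, section 3.2",
    inspection_date="2009-04-17",
    participants=5,
    effort_hours=14.0,
    prep_hours=6.0,
    defects_by_type={"major": 2, "minor": 9},
    passed=False,
)
print(record.total_defects)  # 11
```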
4. Small Projects

Projects with small budgets or a limited number of personnel need not use complex or labor-intensive data collection logistics. Given the amount of data typically collected, well-known and easy-to-use tools such as Excel spreadsheets or small databases (e.g., implemented in MS Access) are usually sufficient to store and analyze the inspections performed on a project.
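For teams that want something scriptable alongside a spreadsheet, the minimal sketch below uses Python's built-in sqlite3 module; the table layout, column names, and sample statistics are assumptions for illustration, not a recommended Agency schema.

```python
# One possible lightweight store for inspection measurements, using only
# the Python standard library. Table and column names are illustrative.
import sqlite3

con = sqlite3.connect("inspections.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS inspections (
        artifact_id  TEXT,
        participants INTEGER,
        effort_hours REAL,
        defects      INTEGER,
        passed       INTEGER   -- 1 = pass, 0 = fail
    )
""")
con.execute(
    "INSERT INTO inspections VALUES (?, ?, ?, ?, ?)",
    ("SRS rev B", 5, 14.0, 11, 0),
)
con.commit()

# Two simple roll-ups: defects found per hour of effort, and pass rate.
rate, pass_rate = con.execute("""
    SELECT SUM(defects) * 1.0 / SUM(effort_hours), AVG(passed)
    FROM inspections
""").fetchone()
print(f"defects per effort-hour: {rate:.2f}, pass rate: {pass_rate:.0%}")
con.close()
```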
5. Resources
5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users can find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only; it does not represent an “approved tool list,” nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned

Over the course of hundreds of inspections and analysis of their results, the Jet Propulsion Laboratory (JPL) has identified key lessons learned that lead to more effective inspections, including: