
SWE-018 - Software Activities Review

1. Requirements

2.2.6 The project shall regularly hold reviews of software activities, status, and results with the project stakeholders and track issues to resolution.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Class G is labeled with "P (Center)." This means that an approved Center-defined process that meets a non-empty subset of the full requirement can be used to achieve this requirement.

Class:        A_SC   A_NSC   B_SC   B_NSC   C_SC   C_NSC   D_SC   D_NSC   E_SC   E_NSC    F      G      H
Applicable?    ✓      ✓       ✓      ✓       ✓      ✓       ✓      ✓       ✓      ✓       ✓    P(C)    ✓

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | ✓ - Applicable | (blank) - Not Applicable
X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures

2. Rationale

Regular reviews of software status and activities assure that all needed tasks and deliverables are managed and achieved. Regular reviews include both formal reviews (e.g., Preliminary Design Review (PDR), Critical Design Review (CDR)) and informal reviews (e.g., weekly status reviews). Issues presented or discovered during these activities are communicated to the appropriate personnel. The tracking of these issues to closure assures that errors and shortcomings in the requirements, architecture, design, and/or build of the software are corrected and prevented from recurring.

Stakeholders are typically those materially affected by the outcome of a decision or a deliverable. In the context of software work product development, a stakeholder may be inside or outside the organization doing the work. The key reason for holding regular reviews with project stakeholders is to keep them and their organizations informed, since they are both participants in the software work product development and advocates for the activity.

3. Guidance

Regular reviews are those held on a scheduled basis throughout the software development life cycle. They may be weekly, monthly, quarterly, or as needed, depending on the size and scope of the software development activity. They include the internal reviews conducted periodically to assess project status, and the major technical reviews that occur at various phases of the project (see the NASA Systems Engineering Handbook and the NASA Space Flight Program and Project Management Requirements documents). The latter are the major reviews in the software development life cycle (e.g., PDR, CDR). Take into consideration the results of the software classification process (see SWE-020) and the safety criticality determination process (see SWE-133) when choosing the review frequency.

Regular reviews cover details of work in software planning, requirements development, architecture, detailed design, coding and integration, testing plans, testing results, and overall readiness for flight. The individual project or development activity determines the specific content of each review, with consideration of the current position of the activities within the software development life cycle. Review content is based on the specific project needs. However, the major technical reviews must often show evidence that Entrance and Exit (Success) criteria have been satisfied. See 7.09 - Entrance and Exit Criteria for a listing of potential criteria to use in reviews.
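
The sketch below is purely illustrative and is not part of NPR 7150.2 or Topic 7.09: it shows one way a project might record Entrance and Exit (Success) criteria for a major review and flag the unsatisfied ones as tracked issues. The criteria text, class names, and field names are hypothetical.

    # Hypothetical sketch: recording Entrance/Exit (Success) criteria for a
    # major review; the criteria shown are examples, not the lists in 7.09.
    from dataclasses import dataclass, field

    @dataclass
    class ReviewCriteria:
        review: str                                    # e.g., "PDR" or "CDR"
        entrance: dict = field(default_factory=dict)   # criterion -> satisfied?
        exit: dict = field(default_factory=dict)

        def ready_to_hold(self):
            # All entrance criteria are satisfied before the review is held.
            return all(self.entrance.values())

        def open_items(self):
            # Unsatisfied criteria become documented, tracked issues.
            return [name for crit in (self.entrance, self.exit)
                    for name, ok in crit.items() if not ok]

    pdr = ReviewCriteria(
        review="PDR",
        entrance={"software plans baselined": True,
                  "preliminary design documented": False},
        exit={"all review item discrepancies dispositioned": False},
    )
    print(pdr.ready_to_hold())   # False
    print(pdr.open_items())      # lists the two unsatisfied criteria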

The evaluation of metrics developed from a software measures process (see SWE-091, SWE-092, SWE-093, and SWE-094) and the assessment of milestone status provide quantitative determinations of work progress. Risk identification and mitigation, safety, and problem identification and resolution are all parts of the regular reviews. Risk identification (see SWE-086) and mitigation efforts are tracked in a controlled manner, whether in a database tool or in a software package written for risk management.
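
As a hypothetical illustration only (the data and field names below are invented, not drawn from SWE-091 through SWE-094), the quantitative determination described above can be as simple as trending completed milestones against the plan and closed issues against those opened:

    # Hypothetical sketch of simple status metrics presented at a regular review.
    milestones = {"planned": 12, "completed": 9}
    issues = {"opened": 40, "closed": 31}

    milestone_progress = milestones["completed"] / milestones["planned"]
    issue_closure_rate = issues["closed"] / issues["opened"]

    print(f"Milestone progress: {milestone_progress:.0%}")   # 75%
    print(f"Issue closure rate: {issue_closure_rate:.0%}")   # 78%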

Issues identified during a regular review that cannot be closed at the review are documented and tracked until they are officially dispositioned and/or closed. Issues can be tracked in a suitable tool, such as a risk management system, a configuration management system, or a problem reporting and corrective action (PRACA) system. The configuration management and control system selected for the project is documented in the Configuration Management Plan, which records the methods and tools used for tracking issues to closure (see SWE-079 and SWE-103).
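
A minimal sketch of what tracking an issue to official disposition and closure might look like follows; it assumes a simple in-house record rather than any specific PRACA or configuration management tool, and the states and field names are hypothetical.

    # Hypothetical sketch: tracking a review issue to disposition and closure.
    # A real project would use the CM or PRACA system named in its
    # Configuration Management Plan, as noted above.
    from dataclasses import dataclass
    from enum import Enum

    class State(Enum):
        OPEN = "open"
        DISPOSITIONED = "dispositioned"   # disposition agreed; work may remain
        CLOSED = "closed"                 # verified complete

    @dataclass
    class ReviewIssue:
        ident: str
        description: str
        raised_at: str                    # review that raised it, e.g., "CDR"
        state: State = State.OPEN
        rationale: str = ""

        def disposition(self, rationale):
            self.rationale = rationale
            self.state = State.DISPOSITIONED

        def close(self):
            # Only an officially dispositioned issue may be closed.
            assert self.state is State.DISPOSITIONED
            self.state = State.CLOSED

    issue = ReviewIssue("CDR-007", "Timing margin not demonstrated", "CDR")
    issue.disposition("Re-run timing analysis with flight compiler settings")
    issue.close()
    print(issue.state)                    # State.CLOSED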

The interpretation of the term "stakeholder" for this requirement can be taken to include representatives from the following organizations:

  • Quality assurance.
  • Systems engineering.
  • Independent testing.
  • Operations.
  • Independent Verification and Validation (IV&V).
  • Project management.
  • Other organizations performing project activities.

However, other external stakeholders at the program or project level (e.g., principal investigators, the science community, the technology community, the public, the education community, and/or a Mission Directorate sponsor) are not included on a regular basis in the internal reviews for the satisfaction of this requirement. In contrast to the relevant or internal stakeholders, external project stakeholders generally participate in just the major milestone reviews.

A best practice related to the involvement of stakeholders is to determine and invite the relevant stakeholders, i.e., those who are typically involved in the review because they are engaged in the development and/or who have a vested interest in the work products being produced.

Additional guidance related to holding reviews of software activities, status, and results may be found in the following related requirements in this Handbook:

SWE-019 - Software Life Cycle
SWE-024 - Plan Tracking
SWE-037 - Software Milestones
SWE-087 - Software Peer Reviews and Inspections for Requirements, Test Plans, Design, and Code
SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking
SWE-089 - Software Peer Reviews and Inspections - Basic Measurements


4. Small Projects

This requirement applies to all projects, depending on the determination of the software classification of the project (see SWE-020). A smaller project, however, may justify less frequent reviews if the risk posed by the software product to the overall project or program is low. Periodic evaluations of the software classification and risk level may validate the use of less frequent reviews or suggest an increase in their frequency.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users can find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related or applicable to software reviews:


  1. Aero-Space Technology/X-34 In-Flight Separation from L-1011 Carrier, Lesson No. 1122: The following entry in the lessons learned database, among other things, points to a concern for holding an adequate software review of validation activities to reduce risk on a particular part of the X-34 mission. "The X-34 technology demonstrator program faces safety risks related to the vehicle's separation from the L-1011 carrier aircraft and to the validation of flight software. Moreover, safety functions seem to be distributed among the numerous contractors, subcontractors, and NASA without a clear definition of roles and responsibilities." The recommendation is that "NASA should review and assure that adequate attention is focused on the potentially dangerous flight separation maneuver, the thorough and proper validation of flight software, and the pinpointing and integration of safety responsibilities in the X-34 program." 539.
  2. Informal Design Reviews Add Value to Formal Design Review Processes (1996), Lesson No. 0582: "A JPL study of in-flight problems on Voyager I and II, Magellan, and Galileo (up to late 1993) revealed that 40% of the problems would likely have been identified by better technical penetration in reviews of the detailed designs performed well before launch. An additional 40% (for a total of 80%) might also have been found by similarly in-depth reviews." Also, "Since formal reviews emphasize verification of the project status and attainment of milestones, their chief benefit lies in the investigative work performed in preparation for the review. In contrast, informal reviews feature detailed value-added engineering analysis, problem solving, and peer review." 509.
  3. Project Management: Integrating and Managing Reviews, Lesson No. 1281: This lesson learned discusses some caveats to running reviews. "The Project underwent several reviews throughout its life cycle. In general, these reviews helped the Project maintain its course and manage its resources effectively and efficiently. However, at times these review groups would provide the Project with recommendations that were inconsistent with the Project's requirements. For example, these recommendations included the recommendations of other review teams, and past recommendations of their own team. The result was that the Project had to expend resources trying to resolve these conflicts or continually redraft its objectives." The recommendation is that "Projects should develop and maintain a matrix of all the reviews they undergo. This matrix should have information such as the review team roster, their scope, and a record of all their recommendations. Prior to each review this matrix should be given to the review team's chairperson so that the outcome of the review remains consistent and compatible with the Project's requirements. Senior Management should be represented at these reviews so that discrepancies between the review team and the Project can be resolved immediately." 547.