

SWE-018 - Software Activities Review

1. Requirements

3.3.2 The project manager shall regularly hold reviews of software schedule activities, status, performance metrics, and assessment/analysis results with the project stakeholders and track issues to resolution.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-018 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A

2.2.6 The project shall regularly hold reviews of software activities, status, and results with the project stakeholders and track issues to resolution.

Difference between A and B

No change

B

3.3.2 The project manager shall regularly hold reviews of software activities, status, and results with the project stakeholders and track issues to resolution.

Difference between B and C

Added metrics and specified "schedule" activities.

C

3.3.2 The project manager shall regularly hold reviews of software schedule activities, metrics, status, and results with the project stakeholders and track issues to resolution. 

Difference between C and D

Reworded the requirement to indicate that performance metrics and assessment/analysis results are what is included in stakeholder reviews.

D

3.3.2 The project manager shall regularly hold reviews of software schedule activities, status, performance metrics, and assessment/analysis results with the project stakeholders and track issues to resolution.



1.3 Applicability Across Classes

Class        |   A   |   B   |   C   |   D   |   E   |   F
Applicable?  |       |       |       |       |       |

Key:    - Applicable | - Not Applicable


2. Rationale

Regular reviews of software status and activities ensure that all needed tasks and deliverables are managed and achieved. The technical reviews and assessments are used to monitor the progress of the software technical effort and provide software product status information. A key aspect of the technical assessment process is the conduct of life cycle and technical reviews throughout the software life cycle. These reviews provide a periodic assessment of the program's or project's software technical status, progress, and health at key points in the life cycle.

3. Guidance

3.1 Perform Regular Life Cycle Reviews

Each program and project will perform the life cycle reviews required by its governing project management NPR, applicable Center practices, and the applicable requirements. Life cycle reviews are event-based and occur when the entrance criteria for the applicable review are satisfied. They should occur based on the maturity of the relevant technical baseline rather than on calendar milestones (e.g., the quarterly progress review, the yearly summary). The software and project management technical team needs to develop and document plans for life cycle and technical reviews for use in the software project planning process. The life cycle and technical review schedule should be documented or recorded in a planning document or record.

Regular reviews are held on a scheduled basis throughout the software development life cycle. These may be weekly, monthly, quarterly, or as needed, depending on the size and scope of the software development activity. They include the software reviews conducted to assess project status and the major software technical reviews that occur at various phases of the project (see 7.08 - Maturity of Life Cycle Products at Milestone Reviews). 7.08 - Maturity of Life Cycle Products at Milestone Reviews provides a chart that summarizes the current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews. This chart serves as guidance only, and NASA Center procedures should take precedence for projects at those Centers. The chart was constructed using the software engineering products from NPR 7150.2, the project life cycle reviews from NPR 7123.1, previous work from the NASA Software Working Group to map products to life cycle reviews, and additional information gathered from these NPRs, NPR 7120.5, and individual NASA Center procedures.

3.2 Work Covered in a Review

Regular software reviews cover details of work in software planning, requirements development, architecture, detailed design, coding and integration, testing plans, testing results, and overall readiness for flight. The individual project or development activity determines the specific content of each review, with consideration of the current position of the activities within the software development life cycle. The software review content is based on specific project needs. However, the major technical reviews that include software often must show evidence of satisfaction of entrance and exit (success) criteria. See 7.09 - Entrance and Exit Criteria for a listing of potential criteria to use in reviews. Review and status planning should take into consideration the results of the software classification process (see SWE-020 - Software Classification) and the safety criticality determination process (see NASA-STD-8739.8) when choosing review frequency. See SWE-037 - Software Milestones.

See also SWE-046 - Supplier Software Schedule

3.3 Metrics and Risk Management

The evaluation of metrics developed from a software measures process (see SWE-091 - Establish and Maintain Measurement Repository, SWE-092 - Using Measurement Data, SWE-093 - Analysis of Measurement Data, and SWE-094 - Reporting of Measurement Analysis) and the assessment of milestone status provide quantitative determinations of the work progress. Risk identification and mitigation, safety, problem identification, and resolution are parts of the regular reviews. Risk identification (see SWE-086 - Continuous Risk Management) and mitigation efforts are tracked in a controlled manner, whether in a database tool or in a software package written for risk management.

3.4 Issue Tracking

Issues identified during a regular review that cannot be closed at the review are documented and tracked until they are officially dispositioned and/or closed. Issues can be tracked in a suitable tool, such as a risk management system, a configuration management system, or a problem reporting and corrective action (PRACA) system. The configuration management and control system selected for the project is documented in the Configuration Management Plan. The plan records the methods and tools used for tracking issues to closure (see SWE-079 - Develop CM Plan and 5.06 - SCMP - Software Configuration Management Plan).
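
The following is a minimal sketch, for illustration only, of one way a project could record review issues and track them to official disposition and closure. The ReviewIssue structure, status values, and example data are hypothetical assumptions and do not represent a prescribed PRACA, risk management, or configuration management tool.

```python
# Minimal, hypothetical sketch of recording review issues and tracking them
# to closure; real projects would use their PRACA, risk management, or
# configuration management system instead of this in-memory log.
from dataclasses import dataclass
from datetime import date
from typing import Optional

OPEN, IN_WORK, CLOSED = "Open", "In Work", "Closed"  # illustrative status values

@dataclass
class ReviewIssue:
    issue_id: str
    description: str
    opened: date
    owner: str
    status: str = OPEN
    disposition: Optional[str] = None
    closed_on: Optional[date] = None

    def close(self, disposition: str, when: date) -> None:
        """Record the official disposition and closure date."""
        self.disposition = disposition
        self.status = CLOSED
        self.closed_on = when

def open_issues(log: list) -> list:
    """Return issues still awaiting disposition, e.g., for the next review."""
    return [i for i in log if i.status != CLOSED]

# Example usage: one issue raised at a monthly status review and later closed.
log = [ReviewIssue("SWE018-001", "Test plan missing entry criteria",
                   opened=date(2024, 3, 1), owner="J. Smith")]
log[0].close("Entry criteria added in test plan rev B", when=date(2024, 4, 2))
print([i.issue_id for i in open_issues(log)])  # -> [] (all issues closed)
```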

The progress between life cycle phases is marked by key decision points (KDPs). At each KDP, software engineering and project management examine the maturity of the technical aspects of the project. For example, management examines whether the resources (staffing and funding) are sufficient for the planned technical effort, whether the technical maturity has evolved, what the technical and nontechnical internal issues and risks are, and whether the stakeholder expectations have changed.

3.5 Stakeholders

The interpretation of the term ‘stakeholder’ for this requirement can be taken to include representatives from the following organizations:

  • Quality assurance.
  • Systems engineering.
  • Independent testing.
  • Operations.
  • Independent Verification and Validation.
  • Project and/or Engineering management.
  • Other organizations performing project activities.

However, other external stakeholders at the program or project level (e.g., principal investigators, the science community, technology community, public, education community, and/or a Mission Directorate sponsor) are not regularly included in the internal reviews held to satisfy this requirement. In contrast to the relevant or internal stakeholders, external project stakeholders generally participate in just the major milestone reviews.

Stakeholders typically are those materially affected by the outcome of a decision or a deliverable. In the context of software work product development, a stakeholder may be inside or outside of the organization doing the work. The key reason for holding regular reviews with project stakeholders is to keep them and their organizations informed since they are both participants in the software work product development and advocates for the activity.

A best practice related to the involvement of stakeholders is to determine and invite the relevant stakeholders, i.e., those who should be involved in the review because they are engaged in the development and/or have a vested interest in the work products being produced.

3.6 Additional Guidance

Additional guidance related to holding a review of software activities, status, and results may be found in the following related requirements in this Handbook:

3.7 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

This requirement applies to all projects, depending on the determination of the software classification of the project (see SWE-020 - Software Classification). A smaller project, however, may be able to hold less frequent reviews if the risk posed to the overall project or program by the software product is low. Periodic evaluations of the software classification and risk level may validate the use of less frequent reviews or suggest an increase in their frequency. 7.08 - Maturity of Life Cycle Products at Milestone Reviews provides a chart that summarizes the current guidance approved by the NASA Office of the Chief Engineer (OCE) for software engineering life cycle products and their maturity level at the various software project life cycle reviews. This chart serves as guidance only, and NASA Center procedures should take precedence for projects at those Centers.

5. Resources

5.1 References

5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to or applicable to the software reviews:

  • Aero-Space Technology/X-34 In-Flight Separation from L-1011 Carrier, Lesson No. 1122: The following entry in the lessons learned database, among other things, points to a concern for holding an adequate software review of validation activities to reduce risk on a particular part of the X-34 mission. "The X-34 technology demonstrator program faces safety risks related to the vehicle's separation from the L-1011 carrier aircraft and to the validation of flight software. Moreover, safety functions seem to be distributed among the numerous contractors, subcontractors, and NASA without a clear definition of roles and responsibilities." The recommendation is that "NASA should review and assure that adequate attention is focused on the potentially dangerous flight separation maneuver, the thorough and proper validation of flight software, and the pinpointing and integration of safety responsibilities in the X-34 program."
  • Informal Design Reviews Add Value to Formal Design Review Processes (1996), Lesson No. 0582: "A JPL study of in-flight problems on Voyager I and II, Magellan, and Galileo (up to late 1993) revealed that 40% of the problems would likely have been identified by better technical penetration in reviews of the detailed designs performed well before launch. An additional 40% (for a total of 80%) might also have been found by similarly in-depth reviews." Also, "Since formal reviews emphasize verification of the project status and attainment of milestones, their chief benefit lies in the investigative work performed in preparation for the review. In contrast, informal reviews feature detailed value-added engineering analysis, problem-solving, and peer review."
  • Project Management: Integrating and Managing Reviews, Lesson No. 1281: This lesson learned discusses some caveats to running reviews. "The Project underwent several reviews throughout its life cycle. In general, these reviews helped the Project maintain its course and manage its resources effectively and efficiently. However, at times these review groups would provide the Project with recommendations that were inconsistent with the Project's requirements. For example, these recommendations included the recommendations of other review teams and past recommendations of their team. The result was that the Project had to expend resources trying to resolve these conflicts or continually redraft its objectives." The Recommendation is that "Projects should develop and maintain a matrix of all the reviews they undergo. This matrix should have information such as the review team roster, their scope, and a record of all their recommendations. Before each review, this matrix should be given to the review team's chairperson so that the outcome of the review remains consistent and compatible with the Project's requirements. Senior Management should be represented at these reviews so that discrepancies between the review team and the Project can be resolved immediately."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-018 - Software Activities Review
3.3.2 The project manager shall regularly hold reviews of software schedule activities, status, performance metrics, and assessment/analysis results with the project stakeholders and track issues to resolution.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm the generation and distribution of periodic reports on software schedule activities, metrics, and status, including reports of software assurance and software safety schedule activities, metrics, and status.

2. Confirm closure of any project software schedule issues.

7.2 Software Assurance Products

  • Lists of the non-conformances found during reviews with their corrective action status


    Objective Evidence

    • Evidence of confirmations of regular project reviews, including any risks or issues. 
    • Evidence showing software assurance, IV&V (if required), and software safety (if required) reviews with the project and engineering; these reviews should occur at least once every month.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or witnessed products or activities, or
    • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of Topic 8.16 are also objective evidence, in addition to the examples listed above.

7.3 Metrics

  • Deviations of actual schedule progress vs. planned schedule progress above a defined threshold
  • # of open versus # of closed issues over time, and their latency (see metrics in SWE-024 - Plan Tracking)
  • The trend of change status over time (# of changes approved, # in implementation, # in test, # closed)

See also Topic 8.18 - SA Suggested Metrics and SWE-024 - Plan Tracking.
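
As an illustration of the metrics above, the sketch below shows one simple way the schedule deviation and the open/closed issue counts (with closure latency) might be computed from project records. The record layout, the 10% threshold, and the function names are assumptions made for the example, not part of the requirement or of any defined SA metrics tool.

```python
# Illustrative sketch only: schedule deviation against a defined threshold,
# plus open/closed issue counts and average closure latency in days.
from datetime import date

def schedule_deviation(planned_pct, actual_pct, threshold_pct=10.0):
    """Return (deviation in percentage points, True if it exceeds the threshold)."""
    deviation = planned_pct - actual_pct
    return deviation, abs(deviation) > threshold_pct

def issue_counts(issues):
    """Count open vs. closed issues and average latency (days) of closed ones."""
    open_count = sum(1 for i in issues if i["closed_on"] is None)
    closed = [i for i in issues if i["closed_on"] is not None]
    latencies = [(i["closed_on"] - i["opened"]).days for i in closed]
    return {
        "open": open_count,
        "closed": len(closed),
        "avg_latency_days": sum(latencies) / len(latencies) if latencies else 0.0,
    }

# Example usage with made-up numbers for one reporting period.
deviation, over_threshold = schedule_deviation(planned_pct=60.0, actual_pct=45.0)
issues = [
    {"opened": date(2024, 2, 1), "closed_on": date(2024, 2, 20)},
    {"opened": date(2024, 3, 5), "closed_on": None},
]
print(deviation, over_threshold)  # 15.0 True -> flag for stakeholder review
print(issue_counts(issues))       # {'open': 1, 'closed': 1, 'avg_latency_days': 19.0}
```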

7.4 Guidance

Confirm that stakeholders receive periodic reviews of the SA software schedule activities, metrics, and status. Participate in engineering and project status reviews to determine if software information is being discussed and used. Schedule periodic reviews of the SA software schedule activities, metrics, and status with engineering and project management. Software assurance can provide the status of its activities, its metrics, and an overall assessment of the project's health in several ways. These can include a software assurance section in a regular project review (status review or milestone review), a targeted SA status review, a regular status meeting with stakeholders, or a regular report. Examples of items to be discussed or reported are found below:

Examples of items on software assurance schedule activities could include:

  • Software assurance plan,
  • Software assurance audit,
  • Software assurance status reports,
  • Software assurance and software safety requirements mapping table for the SASS standard requirements,
  • The cost estimate for the project's software assurance support,
  • Software assurance reviews and software assurance review support,
  • IV&V planning and risk assessment, if required,
  • The IV&V Execution Plan, if required,
  • System hazard analysis assessment activities,
  • Software Safety Analysis,
  • Software assurance independent static code analysis for cybersecurity vulnerabilities and weaknesses,
  • Software assurance independent static code analysis, on the source code, showing that the source code follows the defined secure coding practices,
  • Software assurance analysis performed on the detailed software requirements,
  • Software assurance design analysis, SA peer reviews,
  • Software assurance metric data, reporting, and analysis activities,
  • Software assurance status reviews.

Examples of Software Assurance Metrics:

  • The number of software assurance findings (e.g., # open, closed, latency) mapped against SA activities.
  • Planned software assurance resource allocation versus actual SA resource allocation.
  • The number of audit findings per audit, including the number of findings from process non-compliances and process maturity.
  • Software cyclomatic complexity data for all identified safety-critical software components.
  • Test coverage data for all identified safety-critical software components.
  • Percentage completed for each area of traceability.
  • Software test coverage percentages, including the percentage of testing completed and the percentage of the detailed software requirements successfully tested to date.
  • The number of compliance audits planned vs. the number of compliance audits completed.
  • The number of peer reviews performed vs. the number planned, and the number of defects found in each peer review.
  • The number of root cause analyses performed, and a list of findings identified by each root cause analysis.

Content guidelines for a good Software Assurance status report are provided below (see 8.52 - Software Assurance Status Reports). See also Topic 8.12 - Basics of Software Auditing.

Software Assurance and Software Safety Status Report

The following section defines the content for a software assurance Status Report. The Status Report is a scheduled, periodic communication tool to help manage expectations between the software assurance representative(s) and project engineering and management, as well as OSMA stakeholders. It provides insight into overall status relative to value added and performance against the software assurance plan. Pre-coordinate and define the specifics/content of the status report in the software assurance Plan. The software assurance Status Report content, in no specific order, addresses:

    1. SA Project Title and Date – Identify the project and the date(s) of the reporting period.
    2. Overall Status Dashboard or Stoplight Table – Provide a high-level status of progress, risk, schedule, and whether or not assistance/awareness is required. Typically a Green/Yellow/Red scheme is used to indicate go status, no-go status, and approaching minimum thresholds or limits.
    3. Key Contributions/Accomplishments/Results (Value Added) – Identify any activities performed during the reporting period that have added value to the project. The reporting should include key SA contributions, accomplishments, and results of SA Tasking activities performed in Table 1 (SA Requirements Mapping Matrix). Examples are:
      • Analyses performed (e.g., PHAs, HAs, FMEAs, FTAs, Static Code Analysis)
      • Audits performed (e.g., process, product, PCAs, FCAs)
      • Products Reviewed (e.g., Project Plans, Requirements, Design, Code, Test docs)
      • Tests witnessed
      • Assessments performed (e.g., Safety Criticality, Software Class, Risk, Cybersecurity)
    4. Current/Slated Tasks – Identify in-work and upcoming assurance activities. Identify (planned and unplanned) software assurance and oversight activities for the next reporting period. Examples are:
      • Analyses (e.g., PHAs, HAs, FMEAs, FTAs, Static Code Analysis)
      • Audits (e.g., process, product, PCAs, FCAs)
      • Product Reviews (e.g., Project Plans, Requirements, Design, Code, Test docs)
      • Test witnessing
      • Assessments (e.g., Safety Criticality, Software Class, Risk, Cybersecurity)
    5. Issue Tracking – Record and track software issues to follow progression until resolution. Track issues by priority, status, safety, criticality, or some other combination of criteria.
    6. Metrics – Identify the set of SA metrics used and analyzed for the reporting period. At a minimum, collect and report on the list of SA metrics specified in the SA Standard. Include analysis results, trends, or updates and provide supportive descriptions of methodology and criteria.
    7. Process Improvement suggestions and observations
    8. Obstacles/Watch Items – Identify and describe any obstacles, roadblocks, and watch items for the reporting period. Obstacles/Watch Items are an informal list of potential concerns.
    9. Risk Summary – List and provide status on SA risks associated with any activities/tasks required by the SA Standard. Highlight status changes and trends.
    10. Funding (As Applicable) – Provide funding status and trending information as needed to support the SA tasking for the project. Consider how the data/information can be used for future planning and cost estimating.
    11. Schedule – Provide the necessary schedule/information to communicate the status of the SA tasking in support of the scope and timeline established for the project.
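
As a purely illustrative aid, the short sketch below assembles the content areas above into a plain-text report skeleton that could be filled in each reporting period. The function, section ordering, and output format are hypothetical and should be adapted to whatever format is pre-coordinated in the software assurance Plan.

```python
# Hypothetical sketch: generate a blank SA status report skeleton covering the
# content areas listed above; the plain-text format is illustrative only.
SA_REPORT_SECTIONS = [
    "SA Project Title and Date",
    "Overall Status Dashboard or Stoplight Table",
    "Key Contributions/Accomplishments/Results (Value Added)",
    "Current/Slated Tasks",
    "Issue Tracking",
    "Metrics",
    "Process Improvement Suggestions and Observations",
    "Obstacles/Watch Items",
    "Risk Summary",
    "Funding (As Applicable)",
    "Schedule",
]

def report_skeleton(project: str, period: str) -> str:
    """Return a plain-text skeleton to be filled in for each reporting period."""
    lines = [f"Software Assurance Status Report - {project} ({period})", ""]
    for number, section in enumerate(SA_REPORT_SECTIONS, start=1):
        lines.append(f"{number}. {section}")
        lines.append("   TBD")
        lines.append("")
    return "\n".join(lines)

# Example usage:
print(report_skeleton("Example Project", "April 2024"))
```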

See also Topic 8.05 - SW Failure Modes and Effects Analysis

7.5 Additional Guidance

Additional guidance related to holding a review of software activities, status, and results may be found in the following related requirements in this Handbook:
