


7.09 - Entrance and Exit Criteria


Background

This guidance provides the recommended life cycle review entrance and exit criteria for software projects and should be tailored for the project class.

This topic describes the recommended best practices for entrance and success criteria for the software life cycle and technical reviews required by NASA. The guidelines apply regardless of whether the review is accomplished in a one-step or two-step process. The entrance criteria and items reviewed do not provide a complete list of all products and their required maturity levels. Additional programmatic products may also be required by the appropriate governing NPRs for the project/program.

Tailoring and customizing are expected for projects and programs. The entrance and success criteria and the products required for each review should be tailored and customized appropriately for the particular program or project being reviewed and the classification of the software. Decisions made to tailor and customize life cycle review criteria should be justified to both the Engineering and Safety and Mission Assurance (SMA) Technical Authorities.

The recommended criteria in the following tables are focused on demonstrating acceptable software technical maturity; adequacy of software technical planning; credibility of budget, software schedule, and risks (as applicable); and software readiness to proceed to the next phase. Customized or tailored criteria developed by programs or projects for life-cycle reviews should also be focused on assessing these factors. The software entrance and exit criteria guidance is a collection of material from the following core documents: NPR 7150.2 083 requirements, NASA-STD-8739.8 278 requirements, and NPR 7123.1, Appendix G 041.

See also 7.08 - Maturity of Life Cycle Products at Milestone Reviews

1.1 References

1.2 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

1.3 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

Mission Concept Review (MCR)

The MCR affirms the mission need and examines the proposed mission's objectives and the concept for meeting those objectives. Key technologies are identified and assessed. It is an internal review that usually occurs in the cognizant system development organization. ROM (Rough Order of Magnitude) budget and schedules are presented. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

The Project identified the software engineering point of contact.

Project/Program Staffing requirements and plans, Initial or draft cost estimate for software engineering, software assurance support, and IV&V support (if required)

Software planning is sufficient to proceed to the next phase.

The Project identified the software assurance point of contact.

Draft software schedule

The software concept(s) meet the mission's and stakeholders' expectations.


Project-Level, System, and Subsystem Requirements

The program/project has demonstrated planned compliance with NASA NPR 7150.2 and NASA-STD-8739.8.

  • Contracts and Statement of Work Documents plan compliance with NASA NPR 7150.2 and NASA-STD-8739.8.
  • NASA has access to the software products in electronic format, including software development and management metrics.


Systems Engineering Management Plan

The staffing resource requirements include adequate resources for software engineering, software assurance, and software safety support on the mission.


Partnerships, interagency, and international agreements

Software concepts and software reuse have adequately considered the use of existing assets or products that could satisfy the mission or parts of the mission.


Program/Project Plan

Mission objectives and operational concepts are clearly defined.


Mission objectives/goals and mission success criteria

The software classifications are correct.


Initial Concept of Operations



Mission, Spacecraft, Ground, and Payload Architectures



Heritage Assessment Documentation



Industrial Base and Supply Chain Risk Management



Top technical, cost, schedule, and safety risks, cybersecurity risks, risk mitigation plans, and associated resources



Acquisition Strategy



Additional Supporting Material

  • The need for the mission is identified.
  • Concept of operations available.
  • Preliminary risk assessment available, including technologies and associated risk management/mitigation strategies and options.
  • A Mission Concept Review (MCR) agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Preliminary technical plans and a conceptual life cycle are available.
  • A top-level set of requirements is identified to meet the mission objectives.
  • The mission is feasible.
    • A solution has been identified that is technically feasible.
    • A rough cost estimate is within an acceptable cost range.
  • Draft cost and schedule estimates are available.
    • As developed by software (SW) developers and SW assurance personnel.
  • A technical search was done to identify existing assets/products that could satisfy the mission or parts of the mission.
  • Software inputs/contributions provided for:
    • Preliminary Project Plan.
    • Preliminary Systems Engineering Management Plan (SEMP).
    • Development and analysis of alternative concepts (showing at least one feasible).

Software Assurance

  • The software assurance point of contact for the project has been identified
  • Software assurance personnel have reviewed the materials available for the review:
    • The top-level requirements
    • Mission concept of operations
    • Preliminary technical plans and the conceptual life cycle
    • Preliminary risk assessment
  • Software assurance personnel confirm that the software portions of the MCR Entrance Criteria are met before the review
  • Mission goals and objectives.
  • Analysis of alternative concepts.
  • Preliminary development approaches and acquisition plans.
  • Concept of operations.
  • Risk assessments.
  • Technical plans and conceptual life cycle to achieve the next phase.
  • Preliminary requirements.
  • Draft cost and schedule estimates.
  • Conceptual system design.
  • Software Process Root cause analysis results.
  • SA analysis showing uncovered software code percentage.

Software Assurance:

  • Attends review to gain an understanding of the mission
  • Record and submit RIDs (Review Item Discrepancies)/RFAs (Requests for Action) on risks or issues identified
  • The review panel agrees that:
    • Technical planning is sufficient to proceed to the next phase.
    • Risk and mitigation strategies have been identified and are acceptable based on technical risk assessments.
    • Cost and schedule estimates are credible.
    • Mission goals and objectives are clearly defined and stated, unambiguous, and internally consistent.
    • The conceptual system design meets mission requirements, and the various system elements are compatible.
    • Technology dependencies are understood, and alternative strategies for the achievement of requirements are understood.
  • As applicable, the agreement is reached that:
    • Preliminary mission requirements are traceable to science objectives.
    • The operations concept supports the achievement of science objectives.

Software Assurance:

  • Has gained an understanding of mission goals, objectives, preliminary requirements, and operations concept
  • Confirms that all issues and risks have been recorded
  • Agrees with the resolution of RIDs/RFAs submitted

System Requirements Review (SRR)

The SRR examines the functional and performance requirements defined for the system and the preliminary Program or Project Plan and ensures that the requirements and the selected concept will satisfy the mission. (NPR 7120.5 082)

  • If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
  • For software-only projects, the SwRR serves as the SRR.
Entrance Criteria | Items Reviewed | Exit / Success Criteria

The Project identified the software points of contact.

Preliminary Software Development Plan

The software requirements reflect the software's intended operational use and represent capabilities likely to be achieved within the project's scope.

  • The maturity of the system/software requirements definition is sufficient to begin Phase B.

The software engineering, software assurance, software safety, and IV&V (if applicable) personnel for the project have been defined.

Preliminary Software Management Plan

The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project.

  • The correct software classifications have been defined.

Completed initial software classifications and safety criticality assessments

Draft Software Assurance and Software Safety Plan

Software Bi-directional traceability is complete.

  • Bi-directional traceability is complete with the system, hardware, and ICD requirements.
  • Transparent allocation of system requirements to software and software architectural elements.

Preliminary Hazard Analysis available

Preliminary Software Requirements

Content and maturity in the software plans, including the IV&V plan, are sufficient to begin Phase B.

A completed NASA-STD-8739.8 mapping matrix.

Draft requirements mapping table for the software assurance and software safety standard requirements

Quality software requirements have been established.

  • Transparent allocation and decomposition of requirements exist between the hardware, software, and operational concepts.
  • Software interfaces with external entities and between major internal elements are defined, including system security expectations.
  • The software requirements address and include software and fault management requirements and cybersecurity requirements, based on the project protection plan.

A completed NPR 7150.2 mapping matrix.

Preliminary Software Cost Estimate

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.

IV&V Project Execution Plan (if required)

Preliminary Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index)

Certifiable software development practices by the organizations developing the critical software components exist.

Completed review of software Lessons Learned from previous similar missions

CMMI-Dev rating documentation

Software risk and mitigation strategies have been identified and are acceptable.

  • Adequate planning exists for developing, inserting, or deploying any enabling new software technologies.
  • The software engineering, software assurance, and software safety cost and schedule allocations are credible to meet program/project requirements with acceptable risk.
  • Applicable Software Lessons Learned from other projects and programs have been identified and addressed.
  • NASA has access to the software products in electronic format, including software development and management metrics.
  • Software supply chain risks have been identified.

 

Preliminary IV&V Project risk assessment

Adequate IV&V planning and support are in place. (If IV&V is required)

 

Preliminary IV&V Project Execution Plan

 


Mission, Spacecraft, Ground, and Payload Architectures



Project-Level, System, and Subsystem Requirements



Top software technical, cost, schedule, and safety risks, cybersecurity risks



Preliminary Software Configuration Management Plan (SCMP)



Preliminary Software schedules



Staffing requirements and plans



Preliminary System-level interface requirements for software

 


System Security Plans



Project Protection Plans



Acquisition Strategy



Preliminary Concept of Operations



Configuration Management Plan



Systems Engineering Management Plan



Contracts and Statement of Work Documents



Program/Project Plan



Preliminary Safety and Mission Assurance Plan


 

Mission objectives/goals and mission success criteria


 

Human-Rating Certification Package (if crewed)


 

Partnerships, interagency and international agreements (MOAs/MOUs)


 

Human Systems Integration Plan (if crewed)


 

Project Schedule



Additional Supporting Material

  • Successful completion of Mission Concept Review (MCR) and responses made to all MCR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs).
  • A preliminary SRR agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Technical products required for this review were made available to participants before SRR.
  • System requirements captured in format for review.
  • System requirements allocated to the next lower level system (subsystems) – preliminary allocation completed.
  • System-level software functionality description completed.
  • System-level interface requirements for software documented.
  • Updated concept of operations available.
  • Updated mission requirements are available, if applicable.
  • Preliminary Hazards Analysis (PHA) is available.
  • Software inputs/contributions completed for:
    • Baseline Systems Engineering Management Plan (SEMP).
    • Preliminary Project Plan.
    • System safety and mission assurance plan, including software classification.
    • Risk management plan.
      • Updated risk assessment and mitigations (including Probabilistic Risk Assessment (PRA) as applicable).
    • Preliminary human rating plan, if applicable.
    • Initial document tree.

Software Assurance:

  • Confirm RFAs and RiDs from MCR have been satisfactorily resolved
  • Have identified the software assurance personnel for the project and the associated required training
  • Have reviewed materials available before milestone review for:
    • System requirements captured and allocated to the next lower-level
    • System-level software functionality description
    • System-level interface requirements for software
    • Updated mission requirements and concept of operations
  • Have reviewed the trace matrix for the top-level requirements to the next lower-level allocation
  • Has done a software classification and safety criticality assessment and coordinated with the software development/management team. If the software classification was generated by software engineering, software assurance reviews the classification, confirms it, and concurs
  • Have worked with system engineering and safety personnel to develop Preliminary Hazard Analysis
  • Have provided initial contributions to the systems safety plan
  • A preliminary software assurance plan has been developed per the SA Standard
  • Have reviewed software inputs for:
    • Systems Engineering Management Plan
    • Preliminary Project Plan (or Software Project Plan if the project is software only)
    • System safety and mission assurance plan
    • Risk management plan
    • Preliminary Human Rating Plan, if applicable
  • Before the SRR, confirm that the criteria for SRR have been met, particularly those relating to software.
  • System-level requirements and preliminary allocation to the next lower-level system (subsystems).
  • System-level software functionality description.
  • System-level interface requirements for the software.
  • Concept of operations.
  • Mission requirements.
  • Preliminary Hazard Analysis (PHA).
  • Preliminary approach for how requirements will be verified and validated down to the subsystem level.
  • Risk and mitigation strategies.
  • Acquisition strategy.
  • Preliminary Software and Software Assurance schedule
  • Software cost estimate
  • The preliminary set of software and software assurance processes

Software Assurance:

  • Attend review to gain an understanding of the mission including:
    • Requirements allocation to lower-level subsystems
    • Risk and mitigation strategies
    • System-level software functionality and interfaces
    • Acquisition strategy
    • Updated concept of operations
  • Record and submit RIDs/RFAs on any risks or issues, including the following:
    • Feasibility of overall project schedule
    • The preliminary allocation of system requirements to hardware, human, and software systems 
    • The requirements allocation and flow down to subsystems
    • Verification approaches
    • The Concept of Operations presented to satisfy the mission requirements
  • The review panel agrees that:
    • The process for allocation and control of requirements throughout all levels is deemed sound; the plan is defined to complete the definition activity within schedule constraints.
    • Requirements definition is complete for top-level mission and science requirements; interfaces with external entities and between major internal elements are defined.
    • Requirements allocation and flow down of key driving requirements are defined down to subsystems including hardware, software, and human.
    • Preliminary approaches have been determined for how requirements will be verified and validated down to the subsystem level.
    • Major risks have been identified and technically assessed, and viable mitigation strategies defined.
    • Requirements and selected concepts of operations will satisfy the mission.
    • System requirements, approved material solution, available product/process technology, and program resources are sufficient to proceed to the next life cycle phase.

Software Assurance:

  • Has gained an understanding of review material
  • Agrees with the resolution of RIDs/RFAs submitted; tracks them to closure
  • Agrees that plans reviewed/presented are satisfactory to continue with the development
  • Concurs with the review panel’s assessment of the review’s success
  • Confirms that all issues and risks have been recorded
  • Understand system Preliminary Hazard Analysis and the areas where software might be involved

Software Requirements Review (SwRR)

See the definition of SRR.

  • If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
  • For software-only projects, the SwRR serves as the SRR.
Entrance Criteria | Items Reviewed | Exit / Success Criteria

Software Requirements

Software Requirements Specification

The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B.

  • The software schedule satisfies the following conditions:
    • Coordinates with the overall project schedule.
    • Documents the interactions of milestones and deliverables between software, hardware, operations, and the rest of the system.
    • Reflects the critical dependencies for software development activities and identifies and accounts for dependencies with other projects and cross-program dependencies.
  • Adequate planning exists for developing, inserting, or deploying any enabling new software technology or avionics technology needed for software development.
  • The proposed avionics/software architecture is credible and responsive to program requirements and constraints, including supporting the single-point failure/Fault Tolerance requirements.

Bi-directional traceability

Software Assurance/Safety Plans

Quality software requirements have been established.

  • Software requirements include requirements for COTS, GOTS, MOTS, OSS, reused software components, interface requirements, cybersecurity requirements, fault management software requirements, and software-related safety constraints, controls, and mitigations.
  • The software requirements are testable.
  • The software requirements address the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, the software requirements volatility metric is less than 30%, and the percentage of TBD/TBC/TBR items in the software requirements document does not exceed 20%.
  • The software requirements quality risk score is acceptable (a quality score of 3 or higher; a score below 3 is a risk), and fewer than 20% of the requirements have a high-risk or very high-risk score.
  • A sufficient number of detailed software requirements exist.
  • Operational scenarios are defined enough to support the definition and allocation of software requirements.
  • Software cybersecurity requirements are defined.
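The TBD/TBC/TBR thresholds above lend themselves to a mechanical check. The following sketch illustrates one way to compute the placeholder percentage over a requirements export; the record format and field names are hypothetical, not a NASA-provided tool or schema:

```python
# Hedged sketch: computing the TBD/TBC/TBR percentage for a set of software
# requirements. The requirement records and the "text" field are illustrative
# assumptions; adapt them to your requirements-management tool's export.

def tbd_tbr_percentage(requirements):
    """Return the percentage of requirements containing TBD/TBC/TBR items."""
    if not requirements:
        return 0.0
    unresolved = sum(
        1 for req in requirements
        if any(tag in req["text"] for tag in ("TBD", "TBC", "TBR"))
    )
    return 100.0 * unresolved / len(requirements)

# Hypothetical sample data (identifiers are illustrative only).
reqs = [
    {"id": "SWE-001", "text": "The software shall sample the sensor at 10 Hz."},
    {"id": "SWE-002", "text": "The software shall store TBD days of telemetry."},
    {"id": "SWE-003", "text": "The software shall respond within TBR ms."},
    {"id": "SWE-004", "text": "The software shall log all mode transitions."},
]

pct = tbd_tbr_percentage(reqs)
print(f"TBD/TBC/TBR percentage: {pct:.1f}%")   # 2 of 4 -> 50.0%
print("Meets 20% criterion:", pct <= 20.0)
```

A real project would also track the trend of this percentage from review to review, since the criterion is about disposition plans as much as the snapshot value.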

Software Management/Development Plan

Software Management/Development Plan(s)

Software Bi-directional traceability is complete.

  • Bi-directional traceability is complete with the system, hardware, and ICD requirements.
  • Transparent allocation of system requirements to software and software architectural elements.
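Bi-directional traceability completeness can be spot-checked mechanically once traces are exported from the requirements tool. This is a minimal sketch under assumed data; the identifiers and the dictionary layout are hypothetical:

```python
# Hedged sketch: checking bi-directional traceability between system and
# software requirements. The trace data and identifiers are illustrative;
# real projects export traces from a requirements-management tool.

# Parent system requirements allocated to software.
system_reqs = {"SYS-10", "SYS-11", "SYS-12"}

# Software requirements mapped to the parent system requirement(s) they trace to.
software_reqs = {
    "SWE-101": {"SYS-10"},
    "SWE-102": {"SYS-11"},
    "SWE-103": set(),          # orphan: no parent trace
}

# Downward check: every allocated system requirement has at least one child.
traced_parents = set().union(*software_reqs.values())
uncovered_system = system_reqs - traced_parents

# Upward check: every software requirement traces to at least one parent.
orphan_software = {rid for rid, parents in software_reqs.items() if not parents}

print("System requirements with no child trace:", sorted(uncovered_system))
print("Software requirements with no parent trace:", sorted(orphan_software))
print("Bi-directional traceability complete:",
      not uncovered_system and not orphan_software)
```

The same two checks extend naturally to hardware and ICD requirements by adding those identifiers to the parent set.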

Software Assurance/Safety Plan


Preliminary high-level software architecture defined.

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Completed software classification and safety criticality assessments are available

IV&V Project Execution Plan (if required)

Certifiable software development practices by the organizations developing the critical software components exist.

NASA NPR 7150.2 requirements mapping matrix is complete

IV&V Project risk assessment (if required)

The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project.

  • The correct software classifications have been defined.

NASA-STD-8739.8 requirements mapping matrix is complete

Software Classification

Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Software Configuration Management Plan

Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index)

Software metrics to track the software quality and maturity are being used.

  • The software technical metric margins are defined and are acceptable.
  • Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization) and resources exist to complete the software development within budget, schedule, and known risks.
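Margin criteria like these are usually tracked in a periodic metrics report; a minimal sketch follows. The 50% reserve value and the utilization figures are illustrative assumptions, not values mandated by NPR 7150.2 or this topic:

```python
# Hedged sketch: checking technical resource margins (CPU, memory, bus
# throughput) against a project-defined margin requirement. The required
# reserve and the utilization numbers below are illustrative assumptions;
# use your program's phase-dependent margin requirements.

REQUIRED_MARGIN = 0.50   # assumed required reserve at this life cycle phase

# Current best estimates of utilization as a fraction of capacity.
utilization = {
    "cpu": 0.42,
    "memory": 0.55,
    "bus_throughput": 0.30,
}

margins = {name: 1.0 - used for name, used in utilization.items()}
violations = {name: m for name, m in margins.items() if m < REQUIRED_MARGIN}

for name, m in margins.items():
    status = "OK" if m >= REQUIRED_MARGIN else "LOW"
    print(f"{name:15s} margin {m:5.0%}  {status}")
print("All margins acceptable:", not violations)
```

Reporting the margin rather than the raw utilization keeps the review discussion anchored to the criterion being assessed.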

Completed software assurance software requirements analysis

Software risks

Software risk and mitigation strategies have been identified and are acceptable.

  • Adequate planning exists for developing, inserting, or deploying any enabling new software technologies.
  • The software engineering, software assurance, and software safety cost and schedule allocations are credible to meet program/project requirements with acceptable risk.
  • Applicable Software Lessons Learned from other projects and programs have been identified and addressed.

  • NASA has access to the software products in electronic format, including software development and management metrics.

  • Software supply chain risks have been identified.

  • Software risks, issues, and findings are documented.

  • Sufficient software workforce or software skillsets exist.

Completed the IV&V software requirements analysis

Software Assurance Requirement Analysis Results

Adequate IV&V planning, risk assessment, and support are in place. (If IV&V is required)

The software engineering, software assurance, software safety, and IV&V personnel for the project and milestone review are defined.

Software FMEAs and Fault Trees

 

Completed peer review(s) of the software requirements specification

Bidirectional traceability matrix.


Completed peer review(s) of the software plans.


Certified software development practices by the organizations developing the critical software components.

 

CMMI-Dev rating documentation

Top technical, cost, schedule, and safety risks, cybersecurity risks, risk mitigation plans, and associated resources

 

Software test facilities, needs, and capabilities identified.

IV&V concept review results (concept, reuse, architecture, ops maturity, feasibility, hazard, security)


IV&V Project Execution Plan

Results of any software and avionics architecture trade studies


Test tool requirements have been identified and plans for any necessary test development are in place.

Software Configuration Management Plans

 


Concept of Operations Documentation

 


System Security Plans

 

Additional Supporting Material

General

  • Successful completion of the previous review and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs).
  • A final Software Requirements Review (SwRR) agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair.
  • Technical products for this review were made available to participants prior to SwRR.
  • Peer reviews completed on technical products as needed.
  • Preliminary concept of operations available for review.

Software Assurance:

  • Should have reviewed the technical products (e.g. Software Management Plan, software requirements, and the verification and validation plans) prior to any peer reviews
  • Should have reviewed the preliminary Concept of Operations
  • Software Assurance personnel have received the necessary training (both SA general and project-specific)
  • May have performed audits and/or participated in peer reviews of the above material
  • Confirm that all RIDs/RFAs from the previous review have been addressed

Plans

  • Preliminary Software Development Plan (SDP)/Software Management Plan (SMP) updated for corresponding architectural design and test development activities including as appropriate the items from the 7.18 documentation guidance.
    • Preliminary cost estimate.
    • A preliminary schedule, including milestones, exists for all software to be developed.
    • Preliminary Software Configuration Management Plan
  • A preliminary Software Quality Assurance Plan (SQAP) exists.
  • A Preliminary Software Safety Plan exists (if the project has safety-critical software).
    • Overall software verification strategy.
    • Test facilities, needs, and capabilities.
    • Methodology for verifying the flow-down system requirements and acceptance criteria.
    • Test tool requirements and development plans.
  • The risk management plan is updated.
  • Independent Verification and Validation (IV&V) plan available and IV&V assessment of software requirements, if reviewed

Software Assurance:

  • Have completed reviews of the preliminary plan(s), including:
    • Software development/management/assurance plans
    • Configuration management plan
    • Risk management Plan
    • Overall verification strategy
    • Processes and metrics planned for software
    • Preliminary software safety plans
  • Have reviewed the Software Requirements Mapping Matrix and confirmed the SA TA signature.
  • Have reviewed the preliminary cost estimate and schedule – looking for feasibility.
  • Have completed the SA requirements mapping matrix for SA requirements
  • Have completed the preliminary Software Assurance Plan. See topic 7.18 in the Software Engineering Handbook for document contents of the Software Assurance Plan.
  • Have a preliminary software and software assurance schedule
  • Identify software safety-critical components by having a preliminary software hazard analysis completed

Requirements

  • The preliminary allocation of system requirements to software is available
  • Preliminary software requirements (SRS) including, as appropriate, the items from the 7.18 documentation guidance, and the following: 
    • Block diagram exists for the major software components in each functional area, their interfaces and data flows
    • Relevant software operational modes defined (e.g., nominal, critical, contingency).
    • Critical and/or controversial requirements identified, including safety-critical requirements, open issues, and areas of concern.
    • Requirements identified that need clarification or additional information.
  • Performance requirements for the software identified.
    • A description of critical timing relationships and constraints exists.
  • Software requirements and interface requirements have been analyzed and specified.
  • Quality assurance assessment of the requirements completed and ready for review.
  • Bidirectional traceability matrix.
    • Requirements are traced to higher-level requirements.
  • Includes identification of verification methodology (e.g., test, demonstration, analysis, inspection).

Software Assurance:

  • Software Assurance analysis has been completed on the software requirements (including analysis of traceability).
    • SA performs requirements analyses and prepares results for the milestone review
    • SA participates in software requirements peer reviews and prepares feedback
    • SA reviews the bidirectional traceability
    • SA safety analysis of the software safety requirements has been completed.

See also Topic 8.09 - Software Safety Analysis

  • Results of SA requirements analyses and peer reviews are ready for review, including any risks or issues found.
  • Updates to (or generation of) hazard analysis reports have been made for detailed requirements, where needed
  • SA has reported (or is ready to report) the results/findings of any audits performed by SA
  • Have verified that the entrance criteria for the SwRR have been met

Design and Analysis

  • Preliminary high-level software architecture defined.
  • Report on current computer resource estimates and margins (memory, bus, throughput) available for review.
  • Design constraints documented.
  • Design drivers exist:
    • Explanation of design drivers and preliminary investigations made during the requirements process to determine the reasonableness of the requirements, including preliminary decisions regarding software architecture, operating systems, reuse of existing software, and selection of commercial-off-the-shelf (COTS) components.
    • Resource goals and preliminary sizing estimates (including timing and database storage) in the context of available hardware allocations; strategies for measuring and tracking resource utilization.
  • Review completed for the technical and economic feasibility of allocation of functions at the (sub)system level to hardware, firmware, and software.
  • Software-related trade-offs and design decisions completed and reviewed or preliminary results available, as applicable.
  • Software Interface requirements defined.
  • Preliminary Hazard Analysis (PHA)/Software Assurance records of the Software Classification and Software Safety Criticality are available for review.
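The resource-margin reporting called for above can be made mechanical. The sketch below computes remaining margin per resource and flags shortfalls; the allocation numbers and the 20% threshold are hypothetical examples, not values mandated by any NPR.

```python
# Sketch of a resource-margin report. Allocations, current estimates, and the
# 20% flag threshold are illustrative, not mandated values.

def margin_pct(allocated: float, used: float) -> float:
    """Remaining margin as a percentage of the allocation."""
    return 100.0 * (allocated - used) / allocated

resources = {                    # resource: (allocation, current estimate)
    "memory_KiB":     (512.0, 310.0),
    "bus_Mbps":       (100.0,  62.0),
    "cpu_utilization": ( 1.0,   0.85),
}

THRESHOLD = 20.0                 # flag resources with < 20% margin remaining

for name, (alloc, used) in resources.items():
    m = margin_pct(alloc, used)
    status = "OK" if m >= THRESHOLD else "BELOW MARGIN"
    print(f"{name}: {m:.1f}% margin -> {status}")
```

Tracking these percentages at each milestone gives the trend data needed to show margins will still hold at PDR and CDR.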

Software Assurance:

  • Confirms that a high-level software architecture exists and is reasonable to satisfy requirements
  • Confirmation that early software planning has considered:
    • Documentation of design constraints and drivers
    • Knowledge of resource estimates and margins
    • Estimates of software size, timing requirements
    • Software interface (requirements)
    • Software trade studies and feasibility studies
  • Confirm that any long-lead procurement items have been selected and procurement started
  • Confirm that all applicable entry criteria for the review have been met
  • Has independently performed software classification or concurred with project determination
  • Has reviewed or participated in producing preliminary hazard analysis
  • Has independently performed determination of safety-critical software
  • Has confirmed other items listed above exist and has reviewed them in preparation for the review

Note: In many projects, these design items are not done until after the SwRR, at the beginning of the preliminary design period.

  • Concept of operations.
  • Preliminary system requirements allocation to the software.
  • All software requirements, including those in the Software Requirements Specification (SRS), the interface documents, and the system requirements.
  • Risk management plan.
  • Preliminary software verification and validation (V&V) planning.
  • Software quality assurance (QA) plan.
  • Preliminary Hazards Analysis (PHA), software classification, litmus test results.
  • Design: constraints, strategy, trade-off decisions.
  • Results of technical and economic feasibility review and associated analyses.
  • Bidirectional traceability matrix.
  • Independent verification and validation (IV&V) plan and assessment of software requirements.
  • QA assessment of requirements.
  • Computer resource estimates and margins.
  • Software Configuration Management (CM) plan.
  • Software Development Plan (SDP)/Software Management Plan (SMP).
  • Software concept of operations.
  • Peer review results.

Software Assurance:

  • Review materials prior to review, attend review
  • Be prepared to present the SA assessment of the requirements and progress so far on the project. (Include results of any audits/analysis done). If this information is not presented at the review, it should be made available to the project manager.
  • Submit RIDs/RFAs on any identified issues or areas of risk
  • Implement any RIDs/RFAs submitted on the SA Plan
  • The review panel agrees that plans and requirements are satisfactory and ready to proceed to the design phase:
    • Software requirements are determined to be clear, complete, consistent, feasible, traceable, and testable.
    • The Software Management Plan (SMP), software requirements, interface requirements, and verification and validation (V&V) planning form an adequate and feasible basis for architectural design activities and are approved, baselined, and placed under configuration management (CM).
    • Requirements and performance requirements are defined, testable, and consistent with cost, schedule, risk, technology readiness, and other constraints.
    • System requirements, approved material solutions, available product/process technology, and program resources form a satisfactory basis for proceeding into the development phase.
    • Milestones are verifiable and achievable.
    • Initial computer resource estimates are within margin limits; if not, plans for control of resource margins are deemed adequate to meet margins by preliminary design review (PDR).
  • All Software Requirements Review (SwRR) Review Item Discrepancies (RIDs) and actions are documented with resolution plans and authorization received to proceed to software architecture design.

Software Assurance:

  • Confirm that all issues and risks, RFAs, and RIDs have been recorded
  • Agree with the review panel that the project is ready to proceed into development
  • Agree with resolutions of RIDs/RFAs and consider any action plans reasonable and practical within schedule constraints. Track closure of RIDs/RFAs.
  • Review and update the SA Plan if necessary

Mission Definition Review (MDR)

The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)

  • MDR is equivalent to SDR for robotic projects.
Entrance Criteria | Items Reviewed | Exit / Success Criteria

Software Requirements

Software Requirements

The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B.

  • Adequate planning exists for developing, inserting, or deploying any enabling new software technology or avionics technology needed for software development.
  • The software schedule satisfies the following conditions:
    • Coordinates with the overall project schedule.
    • Documents the interactions of milestones and deliverables between software, hardware, operations, and the rest of the system.
    • Reflects the critical dependencies for software development activities and identifies and accounts for dependencies with other projects and cross-program dependencies.
  • The proposed avionics/software architecture is credible and responsive to program requirements and constraints, including supporting the single-point failure/Fault Tolerance requirements.
  • The planned use of software static code analysis is acceptable.
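"Acceptable planned use of static code analysis" generally implies that analyzer findings are triaged and severe, undispositioned findings gate progress. A minimal sketch of such a gate, over a hypothetical findings list (the record fields are illustrative, not the output format of any particular analyzer):

```python
# Hypothetical gate over static-analysis findings: the check fails if any
# high-severity finding lacks a disposition. Fields are illustrative only.

findings = [
    {"id": "SA-101", "severity": "high", "dispositioned": True},
    {"id": "SA-102", "severity": "low",  "dispositioned": False},
    {"id": "SA-103", "severity": "high", "dispositioned": False},
]

def gate(findings):
    """Return the IDs of high-severity findings that still need disposition."""
    return [f["id"] for f in findings
            if f["severity"] == "high" and not f["dispositioned"]]

blockers = gate(findings)
print("static-analysis gate:", "PASS" if not blockers else f"FAIL {blockers}")
```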

Bi-directional traceability

CMMI-Dev rating documentation

Quality software requirements have been established.

  • Software requirements include requirements for COTS, GOTS, MOTS, OSS, reused software components, interface requirements, cybersecurity requirements, fault management software requirements, and software-related safety constraints, controls, and mitigations.
  • The software requirements are testable.
  • The software requirements address the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, and the software requirements volatility metric is less than 30%. The percentage of TBD/TBC/TBR items in the software requirements document does not exceed 20%.
  • Software requirements quality risk scores are acceptable: the software requirements quality score is 3 or higher (a score below 3 is a risk), and fewer than 20% of the requirements have a high-risk or very high-risk score.
  • A sufficient number of detailed software requirements exist.
  • Software cybersecurity requirements are defined.
  • Operational scenarios are defined enough to support the definition and allocation of software requirements.
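The TBD/TBC/TBR ceiling above can be computed directly from the requirements set. A sketch with hypothetical requirement texts (the statements and the 20% ceiling are illustrative):

```python
# Sketch: percentage of requirements still containing TBD/TBC/TBR markers.
# Requirement texts are hypothetical examples.

import re

requirements = [
    "The FSW shall sample the IMU at 100 Hz.",
    "The FSW shall limit slew rate to TBD deg/s.",
    "The FSW shall store TBR days of telemetry.",
    "The FSW shall reject commands with invalid CRCs.",
    "The FSW shall enter safe mode on watchdog timeout.",
]

marker = re.compile(r"\bTB[DCR]\b")          # matches TBD, TBC, or TBR
open_items = [r for r in requirements if marker.search(r)]
pct = 100.0 * len(open_items) / len(requirements)
print(f"{pct:.0f}% of requirements contain TBD/TBC/TBR")
```

Here 2 of 5 requirements (40%) carry open markers, which would exceed a 20% ceiling.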

NASA NPR 7150.2 requirements mapping matrix is complete

Bidirectional traceability matrix.

Software Bi-directional traceability is complete.

  • Bi-directional traceability is complete with the system, hardware, and ICD requirements.
  • Transparent allocation of system requirements to software and software architectural elements.
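Completeness of bi-directional traceability can be checked mechanically: every system requirement allocated to software must trace down, and every software requirement must trace up. A sketch over hypothetical trace links:

```python
# Sketch: find traceability gaps in both directions. Requirement IDs and
# links are hypothetical.

sys_to_sw = {                    # system requirement -> software requirements
    "SYS-1": ["SWR-1", "SWR-2"],
    "SYS-2": ["SWR-3"],
    "SYS-3": [],                 # gap: no downward trace
}
sw_reqs = {"SWR-1", "SWR-2", "SWR-3", "SWR-4"}   # SWR-4 has no parent

downward_gaps = [s for s, kids in sys_to_sw.items() if not kids]
traced_up = {sw for kids in sys_to_sw.values() for sw in kids}
upward_gaps = sorted(sw_reqs - traced_up)

print("untraced system reqs:", downward_gaps)   # ['SYS-3']
print("orphan software reqs:", upward_gaps)     # ['SWR-4']
```

Traceability is complete only when both gap lists are empty; the same check extends to hardware and ICD requirements.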

NASA-STD-8739.8 requirements mapping matrix is complete

Preliminary Hazard Analysis

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.
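The "verified by test" rule for hazard-traced requirements lends itself to an automated audit of the verification matrix. A sketch with hypothetical verification records:

```python
# Sketch: every requirement that traces to a hazard analysis must use "test"
# as its verification method. Records are hypothetical.

requirements = [
    {"id": "SWR-10", "traces_to_hazard": True,  "method": "test"},
    {"id": "SWR-11", "traces_to_hazard": False, "method": "analysis"},
    {"id": "SWR-12", "traces_to_hazard": True,  "method": "inspection"},  # violation
]

violations = [r["id"] for r in requirements
              if r["traces_to_hazard"] and r["method"] != "test"]
print("hazard-traced reqs not verified by test:", violations)  # ['SWR-12']
```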

Completed software classification and safety criticality assessments are available

Software and avionics architectures

The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project.

  • The correct software classifications have been defined.

Completed Software Plans

IV&V Project risk assessment

Certifiable software development practices by the organizations developing the critical software components exist.

Completed the IV&V software requirements analysis

IV&V Project Execution Plan

Software metrics to track the software quality and maturity have been selected.

  • The software technical metric margins are defined and are acceptable.
  • Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization) and resources exist to complete the software development within budget, schedule, and known risks.

Confirm RFAs and RIDs from SRR have been satisfactorily resolved

Software risks

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

  • Evidence exists with Software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Identify software safety-critical components.

Software Engineering (NPR 7150.2) and Software Assurance (8739.8) Requirements Mapping matrices

All known software risks are identified and technically assessed.

  • NASA has access to the software products in electronic format, including software development and management metrics.
  • The planned use of software static code analysis is acceptable.
  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design: the risk of misunderstanding the software-hardware interfaces and of not having hardware available for use in software development.
  • Software supply chain risks have been assessed.
  • Sufficient software workforce and software skillsets exist.

Identify the software assurance, software safety, and IV&V personnel for the project and milestone review.

Software Development Plan


IV&V concept reviews are completed (concept, reuse, architecture, ops maturity, feasibility, hazard, security)

Software Management Plan

 

Completed Software assurance and software safety requirements and task plans.

Safety and Mission Assurance Plan or Software Assurance Plan


Completed IV&V Project Execution Plan

Top technical, cost, schedule, and safety risks, risk mitigation plans, and associated resources


Completed peer review(s) of the software plans

Software Configuration Management Plan


Completed peer review(s) of the Software Requirements Specification

Operational Scenarios


Completed software assurance software requirements analysis

Concept of operations documentation


Additional Supporting Material

  • Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review were made available to participants prior to MDR
  • The following items are updated, if applicable

    • System requirements document

    • Concept of operations

    • Mission requirements, goals, and objectives

    • Risk management plan

    • Risk assessment and mitigations

    • Project software classification(s)

    • Cost and schedule data (including software)

    • Software Assurance Plan

  • Preferred system solution definition exists, including major trades and options
  • Logistics documentation exists (e.g., preliminary maintenance plan)
  • Preliminary system safety analysis available
  • System requirements traced to mission goals and objectives and the concept of operations
  • A preliminary Human Rating Plan exists, if applicable

Software Assurance:

  • Confirm that all RIDs/RFAs from SwRR have been resolved
  • Coordinate with the project on any changes in software classification and software criticality
  • Review or participate in system preliminary safety analysis
  • Review updates to requirements, concept of operations, and plans; verify changes are in alignment
  • Complete contributions to system safety and mission assurance plans
  • Confirm that the entrance criteria are met
  • Confirm system requirements are traced to mission goals and objectives and the concept of operations
  • Be prepared to present the SA assessment of progress so far on the project, including the results of any audits/analyses done. If this information is not presented at the review, it should be made available to the project manager.
  • Review the materials submitted for the milestone review

  • Review and update the Preliminary SA Plan, as necessary

  • System documentation, as applicable
    • Architecture
    • Updated system requirements document
    • System-level software functionality description
    • System requirements traceability and preliminary allocation to software
    • Preliminary system safety analysis
    • Preferred system solution definition
  • Mission requirements, goals, and objectives, if applicable
  • Concept of operations, if applicable
  • SDP/SMP
  • CM plan
  • Acquisition strategy/plans
  • Cost and schedule data
  • Logistics documentation (e.g., preliminary maintenance plan)
  • Initial document tree, if applicable
  • Preliminary Human Rating Plan if applicable
  • Updated SA Plan

Software Assurance:

  • Review the documentation for the review; attend the review
  • Summarize or present any SA analysis or audit results from this phase
  • Submit any RIDs/RFAs for identified issues or risks
  • The review panel agrees that:
    • The overall concept is reasonable, feasible, complete, responsive to the mission requirements, and is consistent with system requirements and available resources (cost, schedule, mass, and power)
    • Software design approaches and operational concepts exist and are consistent with the requirements set
    • Requirements, design approaches, and conceptual design will fulfill the mission needs within the estimated costs
    • Major risks have been identified and technically assessed, and viable mitigation strategies have been defined
    • System-level requirements are clearly and logically allocated to software

Software Assurance:

  • Confirms all risks and issues are documented
  • Confirms RFAs/RIDs for SA Plan are documented
  • Agrees with the review panel that the items above have been adequately addressed
  • Agrees with resolutions or action plans for RFAs/RIDs; tracks them to closure

System Definition Review (SDR)

The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)

  • MDR is equivalent to SDR for robotic projects.
Entrance Criteria | Items Reviewed | Exit / Success Criteria

Software requirements

Requirements mapping table for the software assurance and software safety standard requirements

The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B.

  • Adequate planning exists for developing, inserting, or deploying any enabling new software technology or avionics technology needed for software development.
  • The software schedule satisfies the following conditions:
    • Coordinates with the overall project schedule.
    • Documents the interactions of milestones and deliverables between software, hardware, operations, and the rest of the system.
    • Reflects the critical dependencies for software development activities and identifies and accounts for dependencies with other projects and cross-program dependencies.
  • The proposed avionics/software architecture is credible and responsive to program requirements and constraints, including supporting the single-point failure/Fault Tolerance requirements.
  • The planned use of software static code analysis is acceptable.

Bi-directional traceability

Software requirements

Quality software requirements have been established.

  • Software requirements include requirements for COTS, GOTS, MOTS, OSS, reused software components, interface requirements, fault management software requirements, and software-related safety constraints, controls, and mitigations.
  • The software requirements are testable.
  • The software requirements address the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, and the software requirements volatility metric is less than 30%. The percentage of TBD/TBC/TBR items in the software requirements document does not exceed 20%.
  • Software requirements quality risk scores are acceptable: the software requirements quality score is 3 or higher (a score below 3 is a risk), and fewer than 20% of the requirements have a high-risk or very high-risk score.
  • A sufficient number of detailed software requirements exist.
  • Software cybersecurity requirements are defined.
  • Operational scenarios are defined enough to support the definition and allocation of software requirements.

Completed Software Plans

Interface requirements documents, including Software interfaces

Software Bi-directional traceability is complete.

  • Bi-directional traceability is complete with the system, hardware, and ICD requirements.
  • Transparent allocation of system requirements to software and software architectural elements.

Completed software classification and safety criticality assessments are available

Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index)

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Identification of software safety-critical components by having a preliminary software hazard analysis completed

Software and avionics architectures

The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project.

  • The correct software classifications have been defined.

NASA NPR 7150.2 requirements mapping matrix is complete

Software data dictionary

Certifiable software development practices by the organizations developing the critical software components exist.

NASA-STD-8739.8 requirements mapping matrix is complete

Software assurance requirement analysis results

Software metrics to track the software quality and maturity have been selected.

  • The software technical metric margins are defined and are acceptable.
  • Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization) and resources exist to complete the software development within budget, schedule, and known risks

Software data dictionary

Software classifications

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

  • Evidence exists with Software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Completed Software Safety Assessment

Software architecture

All known software risks are identified and technically assessed.

  • NASA has access to the software products in electronic format, including software development and management metrics.
  • The planned use of software static code analysis is acceptable.
  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design: the risk of misunderstanding the software-hardware interfaces and of not having hardware available for use in software development.
  • Software supply chain risks have been assessed.
  • Sufficient software workforce and software skillsets exist.

Completed software assurance software requirements analysis

Software Management Plan(s)

IV&V concurs that the project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B. (If IV&V is required)

Completed the IV&V software requirements analysis

Software process audit results

 

Completed peer review(s) of the software requirements

Software verification and validation (V&V) planning


Completed peer review(s) of the software plans

Bidirectional traceability matrix.


Preliminary Human Rating Plan, if applicable

Technical resource utilization estimates and margins


Preliminary Maintenance Plan

Software configuration management (CM) plan

 

Software configuration management (CM) plan

Software Assurance (SA) and Software Safety Plan(s), including SA Product Acceptance Criteria and Conditions.


Completed IV&V Project Execution Plan

Software Development Plan

 

Confirm RFAs and RIDs from MCR have been satisfactorily resolved

Software peer review results


Identify the software assurance, software safety, and IV&V personnel for the project and milestone review.

Software Assurance schedule

 

The software assurance point of contact for the project has been identified.

Software coding guidelines

 


Software risks



Software safety analysis results

 


Cost estimate for the project’s Software Assurance support



IV&V Project Execution Plan (if required)




IV&V Project risk assessment (if required)

 


System architecture, including avionics architecture



Top technical, cost, schedule, and safety risks, risk mitigation plans, and associated resources



Concept Documentation


Additional Supporting Material

  • Successful completion of the previous review (typically SRR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs)
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review were made available to participants prior to SDR
  • Preferred software solution defined including major tradeoffs and options
  • Baseline documentation updated, as required
  • Preliminary functional baseline (with supporting trade-off analyses and data) available
  • Preliminary system software functional requirements available
  • As applicable, the risk management plan updated (could be part of SDP/SMP)
    • Updated software risk assessment and mitigations (including Probabilistic Risk Assessment (PRA), as applicable)
  • As applicable, SDP/SMP updated
    • Updated technology development, maturity, and assessment plan
    • Updated cost and schedule data
    • Work Breakdown Structure
  • Preliminary software safety analysis available for review
  • Project software data dictionary available
  • Preliminary project software assurance plan available
  • As applicable, an updated project software maintenance plan is available
  • Software inputs/contributions completed for:
    • Flow down of system requirements to all software functional elements of the system
    • Requirements process
    • Technical approach

Software Assurance:

  • Confirm that all RIDs/RFAs from SwRR have been resolved
  • Coordinate with the project on any changes in software classification
  • Review or participate in system preliminary safety analysis
  • Confirm that:
    • The preferred software solution is defined, including major tradeoffs and options
    • The functional baseline is available; system software functional requirements exist and have been flowed down into software functional elements of the system
    • The requirements process is in place
    • SDP/SMP has been updated (as applicable), including risk management, cost, schedule, work breakdown structure, technical management, and plans for monitoring performance
  • Confirm other baselined documentation has been updated as needed; check for alignment of documentation
  • Complete contributions to system safety and mission assurance plans
  • Complete the software assurance plan
  • Confirm that the entrance criteria are met
  • Review materials submitted for the review
  • System architecture, including software
  • Preferred software solution with tradeoffs and options
  • Preliminary functional baseline
  • Preliminary system software functional requirements
  • The risk management plan, as applicable
  • SDP/SMP, as applicable
  • Preliminary software verification and validation (V&V) planning, as applicable
  • Software requirements documents
  • Interface requirements documents, including SW
  • Technical resource utilization estimates and margins
  • Software safety analysis
  • Software data dictionary
  • Software configuration management (CM) plan
  • Software quality assurance (QA) plan

Software Assurance:

  • Review materials for the review; attend the review
  • Summarize or present any SA analysis or audit results from this phase
  • Submit any RIDs/RFAs for identified issues or risks
  • The review panel agrees that:
    • Software requirements, including mission success criteria and any sponsor-imposed constraints, are defined and form the basis for the proposed conceptual design
    • System-level requirements are flowed down to the software
    • All software technical requirements are allocated and flow down to subsystems is adequate; they are verifiable and traceable to their corresponding system level requirement; requirements, design approaches, and conceptual design will fulfill the mission needs consistent with the available resources (cost, schedule, throughput, and sizing)
    • Technical plans have been updated, as necessary, including a risk management plan, SDP/SMP, V&V planning, software maintenance plan, QA plan, CM plan
    • Tradeoffs are completed, and those planned for Phase B adequately address the option space
    • Adequate planning exists for the development of any enabling new technology
    • Significant development, mission, and safety risks are identified and technically assessed, and a process and resources exist to manage the risks
    • The operations concept is consistent with the proposed design concept(s) and in alignment with the mission requirements
    • The requisite level of detail and resources are available to support the acquisition and development plan within existing constraints
    • All of these software subsystem requirements are traceable to either mission objectives, the concept of operations, or interface requirements
    • Monitoring processes/practices are in place to create a software subsystem within planned technical, schedule, cost, effort, and quality capabilities
  • Preliminary verification approaches are agreed upon

Software Assurance:

  • Confirm all risks and issues are documented.
  • RIDs/RFAs for the SA Plan implemented
  • Agree with the review panel that the items above have been adequately addressed
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Preliminary Design Review (PDR)

The PDR demonstrates that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and establishes the basis for proceeding with detailed design. It shows that the correct design option has been selected, interfaces have been identified, and verification methods have been described. Full baseline costs and schedules, as well as risk assessments, management systems, and metrics, are presented. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Software requirements have been baselined at the software level

Preliminary Software Design Description

The software's preliminary detailed design is complete and meets the software requirements with adequate margins.

  • The software's preliminary design features and architecture are represented by software requirements.
  • Adequate technical and programmatic margins (e.g., data throughput, memory) and resources exist to complete the software development within budget, schedule, and known risks.
  • Relevant software operational modes and states are defined (e.g., nominal, critical, contingency).
  • Software common-cause failures are addressed (if applicable).
  • Software design addresses the software safety-critical requirements.
  • Software design addresses unauthorized use of software applications, command issuing, memory changes, and data changes to the software configuration.
  • Software design provides automatic detection and safing for critical and catastrophic hazards.
  • Software fault management requirements, architecture, and design have been defined.
  • The software design logically isolates the safety-critical design elements and data from non-safety-critical data.
  • The software command and telemetry design has been defined.
  • Software error detections have been defined.
  • Software design, architecture, components, and interfaces are defined and understood at the level needed for code development.
  • Software design addresses the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • Cybersecurity has been addressed in the software design.

Known software hazards have been identified.

Avionics architecture

Quality software requirements have been established and baselined.

  • Adequate documentation exists to allow proceeding with software implementation and software testing.
  • All software requirements are verifiable.
  • All safety- and mission-critical software designs/requirements have been through a peer review, with the required participation.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, and the software requirements volatility metric is less than 20%. The percentage of TBD/TBC/TBR items in the software requirements document does not exceed 1%.
  • Software requirements quality risk scores are acceptable: the software requirements quality score is 3 or higher (a score below 3 is a risk), and fewer than 20% of the requirements have a high-risk or very high-risk score.
  • A sufficient number of detailed software requirements exist.
  • The detailed software requirements are agreed upon, finalized, stated clearly, and consistent with the preliminary design.
  • The software requirements reflect and include the Software Fault Tree Analysis (FTA), or Software Failure Modes and Effects Analysis results

Software architecture exists

Bidirectional traceability

Interface control documents, software definitions, and requirements are sufficiently mature to proceed to CDR.

Software classifications and safety criticality determination have been updated as necessary

Baseline software requirements

Software metrics to track the software quality and maturity are in use.

  • Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization) and resources exist to complete the software development within budget, schedule, and known risks.

Software development tools, computers, and facilities for implementation have been identified.

Software process audit results

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results

Preliminary interface control documents and interface design descriptions

Avionics architecture, components, and interfaces are defined and understood at the level needed to move to CDR.

  • The project has identified software quality attributes in the software architecture definition.

Software metrics and measurements exist.

Software design analysis results

Software risks to safety and mission success are understood and credibly assessed.

  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design: misunderstanding the software-hardware interfaces and not having the hardware available for use in software development.
  • Software supply chain risks are identified and assessed.
  • Sufficient software workforce or software skillsets exist.

Bidirectional traceability exists between the software requirements and the preliminary software design modules.
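Bidirectional traceability means every forward link (requirement to design module) has a matching backward link, and no item on either side is left untraced. A minimal sketch of such a check, assuming trace links are held as plain dictionaries of sets (an illustrative representation, not a prescribed format):

```python
# Hedged sketch: a bidirectional traceability check between requirement IDs
# and design modules. The dictionary-of-sets link format is an assumption.

def untraced_items(req_to_design, design_to_req):
    """Return (requirements with no design trace, modules with no requirement trace)."""
    orphan_reqs = {r for r, mods in req_to_design.items() if not mods}
    orphan_mods = {m for m, reqs in design_to_req.items() if not reqs}
    return orphan_reqs, orphan_mods

def is_bidirectional(req_to_design, design_to_req):
    # Every forward link must have a matching backward link, and vice versa,
    # and neither side may contain untraced (orphan) items.
    forward = {(r, m) for r, mods in req_to_design.items() for m in mods}
    backward = {(r, m) for m, reqs in design_to_req.items() for r in reqs}
    orphan_reqs, orphan_mods = untraced_items(req_to_design, design_to_req)
    return forward == backward and not orphan_reqs and not orphan_mods
```

The same shape of check applies to any of the trace pairs called out in this topic (requirements to design, requirements to hazards, requirements to tests).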

Software requirements analysis results

The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project.

Preliminary software data dictionary

Planned software development tools, computers, and facilities

The problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control).

  • Change control is established (e.g., change tracking, review, approval, disapproval, change impact analysis, resolution, re-validation, and re-verification).

Preliminary Software Design has the content prescribed in the 7.18 documentation guidance for a Software Design Document.

Preliminary software data dictionary

The project has selected the software tools and programming language, sufficient software analysis and development tools exist, and the software development organization is trained in the use of the selected language.

  • Validated and accredited software tool(s) required to develop or maintain software exist. A defined approach to the automatic generation of software source code exists.

Resource utilization estimates and margins

Software development and test facilities

Hardware resources exist to develop and test the software. The required Engineering test units, modeling, and simulations have been identified and planned.

RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete.

Baseline software architecture description

Software Bidirectional traceability is complete and accurate between the preliminary design and the software requirements.

Software configuration management processes and tools are in place

Software Fault Tree Analysis (FTA), or Preliminary Software Failure Modes and Effects Analysis results

NASA has access to the software products in electronic format, including software development and management metrics.

Software development and test facilities are planned and will be in place when needed.

Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

  • Evidence exists with Software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Identify the software assurance, software safety, and IV&V personnel for the project and milestone review.

Software metrics and measurement data

IV&V concurs that the project’s software requirements and design are sufficiently mature to proceed to CDR. (If IV&V is required)

Lessons learned captured from the software areas of the project

Preliminary software acceptance criteria and conditions


A preliminary high-level test plan identifies testing that needs to occur at each level of implementation/integration.

Baseline software configuration management plan and processes


Configuration Control Board established and Change Control Process in place.

Preliminary software test plans


Completed IV&V software design analysis

Status of software change requests


Completed peer review(s) of the software preliminary design

Baseline Software Engineering and Assurance schedules


Completed peer review(s) of the software requirements

Software Risks


Completed Software Assurance software design analysis

Prototype software, if applicable


Completed software assurance software requirements analysis

Resource utilization estimates and margins


Completed the IV&V software requirements analysis

Software Supplier documentation



Software Trade studies



Completed software requirements analysis by software assurance and IV&V (if required)



Baselined operational concepts



Baseline cost estimate for the project’s Software support



Software classifications and safety criticality determinations



Record of trade-off criteria & assessment (make/buy decision)


Additional Supporting Material

General

  • Successful completion of the SDR or MDR and responses made to all SDR or MDR Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review were made available to participants prior to PDR
  • Baseline documentation updated, as required
  • Risk assessment and mitigation updated
  • Cost and schedule data baselined
  • Peer reviews completed: SRS, software architectural design (if identified for SW peer review/inspection in SW development plans), integration test plans
  • Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)

Software Assurance:

  • Confirm that RFAs and RIDs from the previous review have been resolved and any resulting actions are complete
  • Participate in or review materials for peer reviews
  • Confirm other general entrance criteria have been completed

Plans

  • Applicable technical plans (e.g., technical performance measurement plan, payload-to-carrier integration plan, producibility/manufacturability program plan, reliability program plan, baselined quality assurance plan) available
  • SDP/SMP updated, if appropriate
    • Work Breakdown Structure
    • For the corresponding detailed design activities
  • Metrics established and gathered to measure software development progress
  • Logistics documentation (e.g., maintenance plan) updated, as required
  • Configuration management plan baselined
  • Configuration Control Board established for software (and change control procedures working)
  • Procedures and tools developed for mechanizing management and configuration management plans
  • Supplier documentation available for review, if applicable:
    • Software Data Dictionary(s)
    • Software Classification(s)
    • SDP/SMP [with verification and validation (V&V) separate]
    • Software configuration management plan(s)
    • Software assurance plan(s)
    • Software maintenance plan(s)
  • Test tools and facility requirements identified with plans and actions to ensure their availability when needed
  • Development environment ready (e.g., hardware diagram, operating system(s), compilers, DBMS, tools)
    • Developmental tools and facility requirements identified, plans made, and actions taken to ensure their availability when needed
    • Tools needed for software implementation are completed, qualified, installed, and accepted, and the team trained in their use
    • Facilities for software implementation in place, operating, and ready for use
  • Software quality assurance group formed and contributing as a team member to the design and test activities

Software Assurance:

  • Review/Assess plans including updated SDP/SMP, configuration plans, risk management plans, V&V plans, and maintenance plans; identify any issues or inconsistencies
  • Confirm issues and/or inconsistencies in software plans are addressed
  • Confirm that development tools, computers, and facilities for implementation are in place for development
  • Confirm tools for testing and facilities are planned and will be in place when needed
  • Confirm implementation team is trained on tools, and development environment and is familiar with the requirements and their allocation to components
  • Confirm configuration management processes and tools are in place
  • Conduct software assurance as an independent contributor to the quality of the design and test
  • Confirm other plan entrance criteria are in place and review items are available

Requirements

  • A preliminary traceability matrix to the CSCI (Computer Software Configuration Item) level exists, including V&V trace
    • Safety-critical requirements highlighted
    • Requirements allocated to components of the architecture (to CSCI level)
  • SRS baselined after SwRR/SRR and updated, if appropriate

Software Assurance:

  • Assess the requirements traceability to the CSCI level, to the lower level software elements, and the V&V tests, if available
  • Confirm that the requirements that flow from the hazard analysis reports are tracked and included in the architectural design
  • Confirm that all safety requirements, performance requirements, security requirements, and derived requirements, as well as requirements to be satisfied by Off-the-Shelf (OTS), are documented in the requirements baseline and included in the architectural design
  • Confirm that the Software Requirements Specification has been baselined, review updates if necessary

Design

  • Applicable standards available to the review team
  • Preliminary interface control documents available for review
  • Technical resource utilization estimates and margins ready for review
    • Storage or memory resource allocations are developed allocating those resources to each software segment in the architecture
  • Design Solutions, analysis, decision, and rationale documented
    • Inherited capabilities identified and compatible with the designs
  • Security and supportability requirements factored into the design
  • Trade studies completed
    • Addressing COTS (Commercial Off The Shelf), reuse, etc.
    • Trade-off analysis and data supporting design, as required
    • Alternative design solutions and selection criteria
  • Results of prototyping factored into the architectural design
  • Preliminary Software Design Document (SDD) including, as appropriate, the items from the 7.18 documentation guidance
  • Results available from evaluations of prototype software, if necessary to evaluate design
  • Human engineering aspects of design addressed with solutions acceptable to potential users
  • SDD and traceability matrix review by test team completed and SDD updated as needed
  • Critical components identified and trial coding scheduled
  • Confirmation exists that
    • The test group participated in requirements and design analysis
    • Interdisciplinary teams are working on design issues that cross (sub)system component boundaries (software, hardware, etc.)

Software Assurance:

  • Review or participate in design team meetings and peer reviews
  • Confirm that Preliminary Interface Control Documents, Preliminary Software Design Documents, and a completed definition of the software architecture and preliminary database design description are available
  • Review architectural and design documentation and confirm that the following have been considered:
    • Technical resource utilization
    • Storage or memory resource allocations for each software segment
    • Inherited capabilities
    • Design Solutions, analysis, decision, and rationale
    • Preliminary database design description, as applicable
    • Overview of software architecture, including context diagram
    • List of subsystems or major components
    • Functional allocations, descriptions of major modules, and internal interfaces
    • Security and supportability requirements
    • External interfaces and end-to-end data flow
    • Safety considerations in design elements and interfaces
    • Design verification approach/methods
    • Human engineering aspects of the design
  • Confirm the preliminary SDD has the content prescribed in the 7.18 documentation guidance for an SDD
  • Review other architectural documentation and participate in peer reviews to understand design and design decisions

Analysis

  • Safety analyses and plans baselined:
    • Matrix showing each subsystem/task/component's software classification (per NPR 7150.2), its safety classification (per NASA-STD-8739.8A), the rationale for the classifications, and the status of the classifications' approval by Software Assurance and Management
    • Updated PHA, Software Safety criticality determined, if necessary
    • Approved SMP/ PHA/Software Assurance Classification
  • Analyses for the following completed, as appropriate:
    • Partitioning analysis (modularity)
    • Executive control and Start/Recovery
    • Control and Data flow analysis
    • Operability
    • Preliminary failure modes and effects analyses
  • Operational Concepts revised, as applicable, and baselined
    • Normal operations scenarios
    • Fault detection, isolation, and recovery (FDIR) strategy
    • Hazard reduction strategies
  • Status of change requests available for review

Software Assurance

  • Perform a requirements analysis (see the requirements analysis description (SAANALYSIS) in Section 7.18: Documentation Guidance of this Handbook)
  • Perform a preliminary design analysis (see the description in Section 7.18: Documentation Guidance of this Handbook)
  • Review the software classifications and safety criticality determinations and update them as necessary
  • Review the updated Preliminary Hazard Analysis and Hazard Reports to identify any new safety-critical software and requirements
  • The safety-related analysis may include Fault Tree Analysis (FTA), or Preliminary Failure Modes and Effects Analysis (See Topics 8.7 and 8.5, respectively, in the SA tab of Software Topics in this Handbook)
  • Be prepared to report on the audits and analysis performed during the Preliminary Design Phase or in a report to the project management
  • Confirm that all the input criteria for the review have been completed
  • Risk assessment and mitigation
  • Safety and assurance analysis and plans
  • Cost and schedule data
  • Logistics documentation (e.g., maintenance plan)
  • Technical plans (e.g., QA plan, performance measurement plan)
  • Interface control documents
  • Software V&V planning
  • Resource utilization estimates and margins
  • SDP/SMP
  • Bidirectional traceability matrix
  • Software design documents
  • Supplier documentation
  • Requirements documents
  • Concept of operations
  • Trade studies
  • Documented solutions, analysis, decisions, and rationale
  • Completed analyses
  • Prototype software, if applicable
  • Plans for development and test tools and facilities
  • Software development progress metrics
  • CM plan
  • Peer review results/proof of completion
  • Status of change requests

Software Assurance:

  • Review materials for the review; attend the review
  • Summarize or present any SA analysis or audit results from the preliminary design phase
  • Submit any RIDs/RFAs for identified issues or risks
  • Top-level requirements, including mission success criteria, Technical Performance Measures (TPMs), and any sponsor-imposed constraints, are agreed upon, finalized, stated clearly, and consistent with the preliminary design
  • The review panel agrees that:
    • Flow down of verifiable requirements is complete and proper or, if not, an adequate plan exists for timely resolution of open items; requirements are traceable to mission goals and objectives
    • All supplier software requirements are verifiable
    • Preliminary design is expected to meet the functional and performance requirements at an acceptable level of risk
    • The definition of technical interfaces is consistent with overall technical maturity and provides an acceptable level of risk
    • Adequate technical margins exist with respect to TPMs
    • Any required new technology has been developed to an adequate state of readiness, or backup options exist and are supported to make them a viable alternative
    • Project risks are understood and credibly assessed; plans, processes, and resources exist to effectively manage them
    • The operational concept is technically sound, includes (where appropriate) human factors, and includes flow down of requirements for its execution
    • The proposed design approach has sufficient maturity to proceed to the final design
    • Subsystem requirements, subsystem preliminary design, results of peer reviews, and plans for development, testing, and evaluation form a satisfactory basis for proceeding into detailed design and test procedure development
    • The SMP, the software architectural design, and integration test plans are adequate and feasible to support software detailed design
  • All RIDs/actions are completed or have closure plans and customer approval received to proceed to the detailed design phase
  • Products from this review are approved, baselined, and placed under configuration management
  • Approval received for software inputs/contributions:
    • Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) is adequately addressed in preliminary designs and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis, and failure modes and effects analysis)
    • Management processes used by the mission team are sufficient to develop and operate the mission
    • Cost estimates and schedules indicate that the mission will be ready to launch and operate on time and within budget and that the control processes are adequate to ensure remaining within allocated resources

Software Assurance: 

  • Baseline Software Assurance Plan
  • Baseline NASA-STD-8739.8 compliance matrix
  • Confirm NPR 7150.2 compliance matrix is complete and approved
  • Confirm that cost, schedule, Software CM Plan, Software Requirements Specification, Software Design Description, and Operations Concept are baselined
  • Agree with the review panel that the items above have been adequately addressed
  • Confirm that all issues, risks, RFAs/RIDs are documented
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Critical Design Review (CDR)

The CDR demonstrates that the maturity of the design is appropriate to support proceeding with full-scale fabrication, assembly, integration, and test and that the technical effort is on track to complete the flight and ground system development and mission operations to meet mission performance requirements within the identified cost and schedule constraints. Progress against management plans, budget, and schedule, as well as risk assessments, are presented. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Bidirectional traceability exists between the software requirements and the software design modules.

Baseline software detailed design description

The software's detailed design is complete and meets the software requirements with adequate margins.

  • Adequate technical and programmatic margins (e.g., data throughput, memory) and resources exist to complete the software development within budget, schedule, and known risks.
  • Relevant software operational modes and states are defined (e.g., nominal, critical, contingency).
  • Software common cause failures are addressed (if applicable).
  • Software design addresses the software safety-critical requirements.
  • Software design addresses unauthorized use of software applications, command issuing, memory changes, and data changes to the software configuration.
  • Software design provides automatic detection and safing for critical and catastrophic hazards.
  • Software fault management requirements, architecture, and design have been defined.
  • The software design logically isolates the safety-critical design elements and data from non-safety-critical data.
  • The software command and telemetry design has been defined.
  • Software error detections have been defined.
  • Software design, architecture, components, and interfaces are defined and understood at the level needed for code development.
  • Software design addresses the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • Cybersecurity has been addressed in the software design.

Software requirements have been baselined at the software level

Avionics architecture

Interface control documents, software definitions, and requirements are sufficiently mature to proceed with software coding, software integration, and software testing, and plans are in place to manage any open items.

Known software hazards have been identified.

Baselined bidirectional traceability

The software baseline requirements and adequate documentation exist to allow proceeding with software implementation and software testing.

  • All software requirements are verifiable.
  • All safety-critical and mission-critical software designs/requirements have been through a peer review, with the required participation.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, and the software volatility metric is less than 20%. The percentage of TBD/TBC/TBR items in the software requirements document does not exceed 1%.
  • Software requirements quality risk scores are acceptable: a requirements quality score of 3 or higher indicates a high-risk requirement, and fewer than 10% of the requirements have a high-risk or very-high-risk score.
  • Software requirements volatility is less than 20%.

Completed Software Assurance software design analysis

Updated baselined software requirements

The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software.

  • Acceptable planned use of STB (Simulation Test Bed) instead of using flight hardware for flight certification.
  • The OTS software components will be verified and validated to the same level required to accept a similar developed software component for its intended use.
  • Software verification approach and regression testing approach are established for the project.
  • The software verification and validation requirements and plans are complete, including stress testing, end-to-end testing, and software regression testing plans.
  • The software testing approach is comprehensive, and the planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into the next phase.
  • Engineering test units, modeling, and simulations have been developed and certified per plan.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design: misunderstanding the software-hardware interfaces and not having the hardware available for use in software development.

Completed Software Assurance software safety analysis

Baseline software data dictionary

Software risks to safety and mission success are understood and credibly assessed.

  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design: misunderstanding the software-hardware interfaces and not having the hardware available for use in software development.

Software architecture exists

Baseline software interface control documents and software interface design description

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

  • Evidence exists with Software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Data flow diagrams

Updated software architecture

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Software data dictionary content

Software development and testing tools, computers, facilities for implementation

Software Bidirectional traceability is complete and accurate between the detailed design and the software requirements.

Software design

Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results

Software data dictionary fields are correct. More than 95% of the data dictionary data definitions are complete.
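The 95% completeness figure above can be measured mechanically. A minimal sketch, assuming each data dictionary entry is a plain dictionary and that the required fields are name, type, description, and units (the field list is an illustrative assumption; only the >95% threshold comes from the criterion above):

```python
# Hedged sketch of the data-dictionary completeness check described above.
# REQUIRED_FIELDS is an illustrative assumption, not a handbook-defined list.
REQUIRED_FIELDS = ("name", "type", "description", "units")

def completeness_percentage(entries):
    """Percent of entries with every required field present and non-empty."""
    complete = sum(1 for e in entries if all(e.get(f) for f in REQUIRED_FIELDS))
    return 100.0 * complete / len(entries)

def meets_cdr_criterion(entries):
    # Criterion above: more than 95% of the data definitions are complete
    return completeness_percentage(entries) > 95.0
```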

Procedures for deliverable software products

Software metrics and measurement data

Sufficient use of static code analysis tools on the project

Regression software testing plans are available.

Software process audit results

The software operational concept has matured, is at a CDR level of detail, and has been considered in test planning.

Resource utilization estimates and margins

Baseline software acceptance criteria and conditions

The use of secure coding practices is planned.

Results of a preliminary Software Assurance test analysis

Resource utilization estimates and margins

The program/project software cost and schedule estimates are credible and within program/project constraints. The software schedule has margin and reasonable timeframes for software development and testing.

  • Data from the software metrics and measurements shows credible cost, schedule, and technical status.

RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete

Planned software development tools, computers, and facilities

The problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control).

  • Change control is established (e.g., change tracking, review, approval, disapproval, change impact analysis, resolution, re-validation, and re-verification).

Identify the software assurance, software safety, and IV&V personnel for the project and milestone review.

Software Assurance schedule

Software products (requirements, data dictionary, and design) are approved, baselined, and placed under configuration management.

Software classifications and safety criticality determination have been updated as necessary

Software Assurance software design analysis results

IV&V concurs that the project’s software requirements, design, and test plans are sufficiently mature to proceed to Code implementation. (If IV&V is required)

Software configuration management processes and tools are in place

Software assurance software requirements analysis results

 

Confirm supplier documentation exists and review it as appropriate

Software classifications and safety criticality determinations


Software development and test facilities are planned and will be in place when needed.

Software configuration management processes


Software development tools, computers, and facilities for implementation have been identified

Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified

 

Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results

Software development and test facilities


Software metrics and measurements exist.

Software Trade studies


Software schedule status

Software test planning


Status of change requests

Status of software change requests


Technical resource utilization estimates and margins

Supplier documentation


Update safety criticality determinations and software classifications, if necessary.

The safety-related analysis may include Fault Tree Analysis (FTA) or Preliminary Failure Modes and Effects Analysis.


Lessons learned captured from the software areas of the project

Traceability between software requirements and hazards


Bidirectional traceability exists between system hazards and software requirements.

IV&V Software requirements analysis results


Completed IV&V software design analysis

Software regression testing plans


Completed peer review(s) of the software design

IV&V software design analysis results



Cost estimate for the project’s Software Assurance support



Updated operational concepts



Prototype software, if applicable



Procedures for deliverable software products 


Additional Supporting Material

General

  • Successful completion of the previous review (typically PDR) and responses made to all Requests for Actions (RFAs) and Review Item Discrepancies (RIDs), or a timely closure plan exists for those remaining open
  • Final agenda, success criteria, and charge to the board have been agreed to by the technical team, project manager, and review chair
  • Technical products for this review were made available to participants prior to CDR
  • Baseline documents updated, as required
  • Peer reviews for software and rework accomplished, as defined in the s/w and/or project plans
  • NPR 7150.2 compliance matrix baselined
  • Lessons Learned captured from software areas of the project (indicate the problem or success that generated the LL, what the LL was, and its applicability to future projects)

Software Assurance:

  • Confirm NPR 7150.2 compliance matrix is approved and baselined
  • Confirm that any lessons learned to date have been added to the LL database
  • Attend or review any software peer reviews of design material
  • Confirm that all RFAs/RIDs from PDR have been completed

Plans

  • Previously baselined software documentation updated as appropriate
    • Updated cost and schedule data
  • Progress against software management plans available for review
  • Software requirements, management process, including documents used and produced, and verification and validation (V&V) planning updated and baselined
  • Staffing-up problems are being addressed, and contingency plans are in place
  • Independent verification and validation (IV&V) plans and status are available for review, if applicable
  • Management procedures and tools for measuring and reporting progress available and working
  • Software measurements available for review (planned and actual regarding product size, cost, schedule, effort, and defect)
  • Procedures established and working for software quality assurance, with quality an integral part of the product being produced
  • The implementation process exists, including standards, review process, problem reporting, unit test, and integration
  • Supplier documentation available for review, if applicable:
    • Software Design Description(s)
    • Interface Design Description(s)
    • Updated Supplier Software V&V Plan(s)
    • Preliminary Supplier Software Test Procedure(s)
    • Systems and subsystem certification plans and requirements exist (as needed)
  • Changes since PDR are available for review:
    • Updated product assurance and software safety plans and activities
    • System safety analysis with associated verifications
    • Status of configuration management processes since PDR available, including discrepancy reporting and tracking (development and post-release)
  • A build plan exists, including
    • Test timeline and ordered list of components and requirements to be tested in each build ready for review
    • Test group trained and prepared to evaluate the code using their facilities and tools
  • Coding, integration, and test plans and procedures available for review
    • Test levels described (e.g., unit testing, integration testing, software system testing) – description, who executes, test environment, standards followed, verification methodologies
    • Testing preparation and execution activities planned, including testing of reused/heritage software, if applicable
    • Test environments described for each test level – diagram and description of tools, testbeds, facilities
  • Preliminary plans available for review:
    • Launch site operations plan
    • Checkout and activation plan
    • Disposal plan (including decommissioning or termination)
  • Delivery, installation, and maintenance processes planned
  • Preliminary system and acceptance testing defined – operational scenarios to be tested, including stress tests and recovery testing, if applicable
  • Acceptance process exists – reviews (e.g., Acceptance Test Readiness Review, Acceptance Test Review), approval, and signoff processes
  • Acceptance criteria baselined

Software Assurance:

  • Review updates to baselined software plans (e.g., SDP, SMP, V&V plans)
  • Assess whether requirements management processes and configuration management processes are in place and working
  • Assess whether management procedures and tools for assessing progress are in place and working
  • SA measurement process has been established and is working; measures are being collected, analyzed, and reported on
  • Confirm supplier documentation exists and review it as appropriate
  • Confirm that procedures and standards for implementation, review process, problem reporting, unit test, and integration are in place
  • Confirm planning for testing has been addressed including test procedures, levels of testing, test team training, test environment and facility, test tools and accreditation plans, acceptance process, acceptance criteria
  • If applicable, confirm launch site operations plan, checkout and activation plan, and decommissioning plans are in place
  • Update software assurance and software safety plans, if needed

Requirements

  • Changes to IT security requirements since PDR available for review (Mission-specific)
  • SRS updated to the Computer Software Unit (CSU) level
  • Traceability matrix updated (to CSU level)
  • Verification exists that detailed designs cover the requirements

Software Assurance:

  • Confirm any changes in IT security requirements are available; review changes
  • Assess the updates to the Software Requirements Specification
  • Assess the traceability matrix updates: Be sure SRS updates are included
  • Verify that the detailed designs cover the requirements

Design

  • Technical data package (e.g., integrated schematics, spares provisioning list, interface control documents, engineering analyses, and specifications) available for review
  • Design process exists, including methodology and standards used, design documentation produced, inspections and reviews
  • Software design document(s) baselined (including interface design documents, detailed design, and unit test)
  • Command and telemetry list available for review
  • The final design solution, evaluation, and rationale available
    • Reused/heritage software or functionality from previous projects; necessary modifications
  • Final architecture definition available
  • Subsystem/component context diagram available
  • Data flow diagrams available
  • Software subsystem design diagram available (e.g., Level 0 data flow diagram or Unified Modeling Language (UML))
  • For each task in the software subsystem design diagram
    • Design diagrams for the task
    • Description of functionality and operational modes
    • Safety considerations addressed in the design
  • Resource and utilization constraints (e.g., CPU, memory); how the software will adapt to changing margin constraints; performance estimates
  • Data storage concepts and structures
  • Input and output data and formats identified
  • Interrupts and/or exception handling available, including event, FDC, and error messages
  • IT Security features (design features) identified
  • A detailed description of software operation and flow exists
  • Operational limits and constraints identified
  • Technical resource utilization estimates and margins updated
    • Detailed timing and storage allocation compiled
  • Algorithms exist sufficient to satisfy their requirements
  • Failure detection and correction (FDC) requirements, approach, and detailed design available for review
  • The trial code was analyzed and designs were modified accordingly
  • Designs comprising the software completed, peer-reviewed, and placed under change control

Software Assurance:

  • Review/Assess all design diagrams and design documentation - Confirm existence and completeness (see next section for design analysis)
  • Attend design reviews
  • Confirm safety and security considerations are included in the design
  • Update hazard analysis reports and safety plans, based on detailed design

Analysis

  • Analyses completed:
    • Algorithm accuracy
    • Critical timing and sequence control
    • Undesired event handling
    • Operability
    • Failure modes and effects analyses
  • Final status and results of analyses ready for review
  • Hazard analysis / Software Assurance Classification updated, if necessary
  • Subsystem-level and preliminary operations safety analyses exist
  • Risk assessment and mitigation updated
  • Reliability analyses and assessments updated
  • Operational Concepts updated
  • Product build-to specifications exist for each hardware and software configuration item, along with supporting trade-off analyses and data
  • Status of change requests available for review

Software Assurance:

  • Review and update risks
  • Perform design analysis (SADESIGN) and prepare results for reporting (See Section 7.18: Documentation Guidance in this Handbook)
  • Perform safety analysis (See Software Safety Analysis Topic in SA Tab of Topics in this Handbook) and assist with security analysis; prepare results for reporting
  • Update safety criticality determinations, and software classifications, if necessary
  • Perform failure modes and effects analyses (See Topic 8.5 in the SA tab of Topics in this Handbook), if necessary
  • Review updated documentation and analysis performed by software 

Other

  • Software requirement verification recording, monitoring, and current status available for review – databases and test reports; sample test verification matrix
  • Preliminary operations handbook created
  • Programmer's manual drafted
  • User's manual drafted

Software Assurance:       

  • Confirm draft manuals above exist if planned

Items Reviewed

  • Baseline documents
  • Technical data package
  • SDP/SMP
  • Progress against software management plans
  • Plan and status for reviews
  • Documentation plan
  • NPR 7150.2 compliance matrix
  • Design and implementation processes
  • Status of management procedures and tools
  • Software measurements
  • Logistics documentation (e.g., maintenance plan)
  • Status of any staffing problems
  • Software design document(s)
  • Command and telemetry list
  • Final Design Solution, Evaluation, and Rationale
  • Final Architecture Definition
  • Software subsystem design diagram
  • Data flow diagrams
  • Identification and formats of input and output data
  • Interrupts and/or exception handling, including event, FDC, and error messages
  • IT Security requirements and features
  • A detailed description of software operation and flow
  • Operational limits and constraints
  • Technical resource utilization estimates and margins
  • Status and results of analyses
  • Algorithms sufficient to satisfy their requirements
  • Failure detection and correction (FDC) requirements, approach, and detailed design
  • Subsystem/component context diagram
  • Status of trial code analysis and design
  • Supplier documentation
  • Status of SW designs and requirements coverage verification
  • SRS
  • Bidirectional Traceability Matrix
  • Status of software QA and safety plans, procedures, activities
  • Hazard analysis / Software Assurance Classification Report (SACR), if necessary
  • Risk assessment and mitigation
  • Reliability analyses and assessments
  • IV&V plans and status
  • Systems and subsystem certification plans and requirements (as needed)
  • CM processes
  • Status of the development environment and personnel training
  • Build plan
  • Product build-to specifications along with supporting trade-off analyses and data
  • Coding, integration, and test plans and procedures
  • V&V planning
  • Build test timeline and ordered list of components and requirements to be tested in each build
  • Launch site operations plan
  • Checkout and activation plan
  • Disposal plan
  • Preliminary Operations Handbook
  • Draft of Programmer's Manual
  • Draft of User's Manual
  • Status of change requests

Software Assurance:          

  • Review the CDR review materials and updates
  • Be prepared to present SA status and results of SA analysis, audits, SA metrics, and analysis of software metrics – SA should be able to provide a general “state of the software project” assessment
  • Submit RFAs/RIDs on any identified issues, risks

Exit / Success Criteria

  • The review panel agrees that:
    • All supplier software requirements have been mapped to the software design
    • All elements of the design are compliant with functional and performance requirements (detailed design is expected to meet requirements with adequate margins at an acceptable level of risk)
    • Interface control documents are sufficiently matured to proceed with fabrication, assembly, integration, and testing, and plans are in place to manage any open items
    • Product verification and product validation requirements and plans are complete; the verification approach is viable and will confirm compliance with all requirements
    • Management processes used by the project team are sufficient to develop and operate the mission
    • The testing approach is comprehensive, and planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into the next phase
    • Adequate technical and programmatic margins and resources exist to complete development within budget, schedule, and risk constraints
    • Risks to mission success are understood and credibly assessed, and plans and resources exist to effectively manage them
    • SDP/SMP, software detailed designs, and unit test plans are an adequate and feasible basis for the implementation and test activities
    • High confidence exists in the product baseline, and adequate documentation exists or will exist on time to allow proceeding with coding, integration, and test
  • Approval received for software inputs/contributions:
    • Safety and mission assurance (e.g., safety, reliability, maintainability, quality, and EEE parts) have been adequately addressed in system and operational designs, and any applicable S&MA products (e.g., Probabilistic Risk Assessment (PRA), system safety analysis and failure modes and effects analysis) have been approved
  • High-priority RIDs against the SDD are closed/actions are completed and customer approval received to proceed to the next phase
  • Approved readiness to proceed with software implementation and test activities
  • Products from this review are approved, baselined, and placed under configuration management

Software Assurance:

  • Confirm that all issues, risks, RFAs/RIDs are documented
  • Agree with the review team that the exit criteria listed above have been met and the project is ready to move into the next phase of development
    • Agree that the review panel items above have been adequately addressed
    • Agree with resolutions or action plans for RFAs/RIDs; track them to closure

Production Readiness Review (PRR)

The PRR is held for projects developing or acquiring multiple similar or identical flight and/or ground support systems. The purpose of the PRR is to determine the readiness of the system developer(s) to efficiently produce (build, integrate, test, and launch) the required number of systems. The PRR also evaluates how well the production plans address the system's operational support requirements. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Bidirectional traceability exists between the software requirements and the software design modules.

Baseline software detailed design description

The software's detailed design is complete and meets the software requirements with adequate margins.

  • Adequate technical and programmatic margins (e.g., data throughput, memory) and resources exist to complete the software development within budget, schedule, and known risks.
  • Relevant software operational modes and states are defined (e.g., nominal, critical, contingency).
  • Software common cause failures addressed (if applicable).
  • Software design addresses the software safety-critical requirements.
  • Software design addresses unauthorized use of software applications, command issuing, memory changes, and data changes to the software configuration.
  • Software design provides automatic detection and safing for critical and catastrophic hazards.
  • Software fault management requirements, architecture, and design have been defined.
  • The software design logically isolates the safety-critical design elements and data from non-safety-critical data.
  • The software command and telemetry design has been defined.
  • Software error detection has been defined.
  • Software design, architecture, components, and interfaces are defined and understood at the level needed for code development.
  • Software design addresses the Software Fault Tree Analysis (FTA) or Software Failure Modes and Effects Analysis results.
  • Cybersecurity has been addressed in the software design.

Software requirements have been baselined at the software level

Avionics architecture

Interface control documents, software definitions, and requirements are sufficiently mature to proceed with software coding, software integration, and software testing, and plans are in place to manage any open items.

Known software hazards have been identified.

Baselined bidirectional traceability

The software baseline requirements and adequate documentation exist to allow proceeding with software implementation and software testing.

  • All software requirements are verifiable.
  • All safety- and mission-critical software designs/requirements have been through a peer review with the required participation.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition. The percentage of TBD/TBC/TBR items in the software requirements document does not exceed 1%.
  • Software requirements quality risk scores are acceptable (a quality score of 3 or higher; a score below 3 indicates risk), and fewer than 10% of the requirements have a high-risk or very high-risk score.
  • Software requirements volatility is less than 20%
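These stability thresholds are straightforward to monitor from requirements-database exports. The sketch below is a hypothetical illustration of how the volatility and TBD/TBR guidelines might be computed; the function names and counts are assumptions for the example, not handbook artifacts.

```python
# Hypothetical illustration of the requirements-stability guidelines above.
# All names and numbers are assumed for the example.

def volatility_pct(added: int, modified: int, deleted: int, total: int) -> float:
    """Requirements volatility as a percentage of the baselined total."""
    return 100.0 * (added + modified + deleted) / total

def tbd_pct(tbd_items: int, total_requirements: int) -> float:
    """Percentage of TBD/TBC/TBR items in the requirements document."""
    return 100.0 * tbd_items / total_requirements

# Example: 400 baselined requirements, 12 changed since PDR, 3 open TBDs.
vol = volatility_pct(added=4, modified=6, deleted=2, total=400)
tbd = tbd_pct(tbd_items=3, total_requirements=400)
print(f"volatility {vol:.1f}% (guideline < 20%), TBDs {tbd:.2f}% (guideline <= 1%)")
```

In practice these counts would come from the project's requirements management tool, with the thresholds taken from the tailored review criteria.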

Completed Software Assurance software design analysis

Updated baselined software requirements

The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software.

  • The planned use of an STB (Simulation Test Bed) instead of flight hardware for flight certification is acceptable.
  • The OTS software components will be verified and validated to the same level required to accept a similarly developed software component for its intended use.
  • Software verification approach and regression testing approach are established for the project.
  • The software verification and validation requirements and plans are complete, including stress testing, end-to-end testing, and software regression testing plans.
  • The software testing approach is comprehensive, and the planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into the next phase.
  • Engineering test units, modeling, and simulations have been developed and certified per plan.
  • Verify that the project has addressed the risks of developing software concurrently with hardware design, including the risk of misunderstanding the software-hardware interfaces and of not having the hardware available for use in software development.

Completed Software Assurance software safety analysis

Baseline software data dictionary

Software risks to safety and mission success are understood and credibly assessed.

  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
    • Verify that the project has addressed the risks of developing software concurrently with hardware design, including the risk of misunderstanding the software-hardware interfaces and of not having the hardware available for use in software development.

Software architecture exists

Baseline software interface control documents and software interface design description

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

    • Evidence exists from software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Data flow diagrams

Updated software architecture

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
    • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Software data dictionary content

Software development and testing tools, computers, facilities for implementation

Software Bidirectional traceability is complete and accurate between the detailed design and the software requirements.
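As an illustration of what a bidirectional-traceability check involves, the hypothetical sketch below flags requirements with no design element (forward gaps) and design elements with no requirement (backward gaps). The identifiers and data structures are assumptions for the example; real projects would export this matrix from their requirements management tool.

```python
# Hypothetical sketch of a bidirectional-traceability check between
# software requirements and detailed-design elements (CSUs).
# All identifiers and data are assumed for illustration.

requirements = {"SWE-REQ-001", "SWE-REQ-002", "SWE-REQ-003"}
design_elements = {"CSU-A", "CSU-B"}
trace = {  # requirement -> design elements that implement it
    "SWE-REQ-001": {"CSU-A"},
    "SWE-REQ-002": {"CSU-A", "CSU-B"},
}

# Forward check: every requirement maps to at least one design element.
untraced_reqs = requirements - set(trace)

# Backward check: every design element traces to at least one requirement.
covered = set().union(*trace.values()) if trace else set()
orphan_design = design_elements - covered

print("untraced requirements:", sorted(untraced_reqs))
print("design elements with no requirement:", sorted(orphan_design))
```

A complete and accurate matrix would produce empty sets in both directions; here the example deliberately leaves one requirement untraced.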

Software design

Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results

Software data dictionary fields are correct. More than 95% of the data dictionary data definitions are complete.

Procedures for deliverable software products

Software metrics and measurement data

Sufficient use of static code analysis tools on the project

Regression software testing plans are available.

Software process audit results

The software operational concept has matured, is at a CDR level of detail, and has been considered in test planning.

Resource utilization estimates and margins

Baseline software acceptance criteria and conditions

The use of secure coding practices is planned.

Results of a preliminary Software Assurance test analysis

Resource utilization estimates and margins

The program/project software cost and schedule estimates are credible and within program/project constraints. Software schedule has margin and reasonable timeframes for software development and testing.

    • Data from the software metrics and measurements supports the credibility of the cost and schedule estimates and identifies technical issues.

RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete

Planned software development tools, computers, and facilities

Problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control.)

  • Change control is established (e.g., change tracking, review, approval, disapproval, change impact analysis, resolution, re-validation, and re-verification.)

Identify the software assurance, software safety, and IV&V personnel for the project and milestone review.

Software Assurance schedule

Software products (requirements, data dictionary, and design) are approved, baselined, and placed under configuration management.

Software classifications and safety criticality determination have been updated as necessary

Software Assurance software design analysis results

IV&V concurs that the project’s software requirements, design, and test plans are sufficiently mature to proceed to Code implementation. (If IV&V is required)

Software configuration management processes and tools are in place

Software assurance software requirements analysis results

 

Confirm supplier documentation exists and review it as appropriate

Software classifications and safety criticality determinations


Software development and test facilities are planned and will be in place when needed.

Software configuration management processes


Software development tools, computers, and facilities for implementation have been identified

Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified

 

Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results

Software development and test facilities


Software metrics and measurements exist.

Software Trade studies


Software schedule status

Software test planning


Status of change requests

Status of software change requests


Technical resource utilization estimates and margins

Supplier documentation


Update safety criticality determinations and software classifications, if necessary.

The safety-related analysis may include Fault Tree Analysis (FTA) or Preliminary Failure Modes and Effects Analysis.


Lessons learned captured from the software areas of the project

Traceability between software requirements and hazards


Bidirectional traceability exists between system hazards and software requirements.

IV&V Software requirements analysis results


Completed IV&V software design analysis

Software regression testing plans


Completed peer review(s) of the software design

IV&V software design analysis results



Cost estimate for the project’s Software Assurance support



Updated operational concepts



Prototype software, if applicable



Procedures for deliverable software products

 


Additional Supporting Material

  • Significant production engineering problems encountered during development are resolved
  • Design documentation is adequate to support production
  • Production plans and preparation are adequate to begin the fabrication
  • Production-enabling products and adequate resources are available, allocated, and ready to support end-product production
  • Production risks and mitigations identified
  • The schedule reflects production activities

Software Assurance:

  • Confirm that production engineering problems encountered during development are resolved and any production risks and mitigations have been identified
  • Confirm that design documentation, production plans, and preparation are adequate to support production
  • Confirm all necessary resources are available and ready to support production
  • Confirm the schedule is reasonable for production

Items Reviewed

  • Design documentation
  • Production plans and preparation
  • Production risks and mitigations
  • Schedule

Software Assurance:         

  • Review all materials provided for the review
  • Submit RFAs/RIDs on any identified issues, risks

Exit / Success Criteria

  • The review panel agrees that:
    • System requirements are fully met in the final production configuration
    • Adequate measures are in place to support production
    • Design-for-manufacturing considerations ensure ease and efficiency of production and assembly
    • Risks are identified, credibly assessed, and characterized, and mitigation efforts defined
    • Alternate sources for resources identified, as appropriate
    • Required facilities and tools are sufficient for end-product production
    • Specified special tools and test equipment are available in proper quantities
    • Production and support staff are qualified
    • Production engineering and planning are sufficiently mature for cost-effective production
    • Production processes and methods are consistent with quality requirements
    • Qualified suppliers are available for materials that are to be procured
  • Delivery schedules are verified

Software Assurance:

  • Agree with the panel that exit criteria have been met
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure
  • Confirm that all issues, risks, RFAs/RIDs are documented

System Integration Review (SIR)

The SIR evaluates the readiness of the project to start flight system assembly, test, and launch operations. V&V Planning, integration plans, and test plans are reviewed. Test articles (hardware/software), test facilities, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Confirm software integration plans and procedures are in place and approved

Software Integration plans and procedures

High confidence exists that the software will support system integration and system integration testing.

    • Interface control documents, software definitions, and requirements are sufficiently mature to proceed with software integration and system testing, and plans are in place to manage any open items.
  • The system testing approach for software is comprehensive, and the planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into the next phase.
  • The project has addressed all Severity 1 or 2 findings.

Confirm all known software discrepancies have been identified and disposed of per the agreed-upon plan.

Software Unit Test Results

Previous software unit testing and software verification test results form a satisfactory basis for proceeding to integration.

  • The project has addressed all Severity 1 or 2 findings.

Confirm that the software assurance/control organization is prepared to support the integration

Software build plan

Adequate software resources and hardware for development and testing have been planned and budgeted, including support for system implementation and system testing. Software integration support staff are qualified.

  • TBD and TBR items are identified with acceptable plans and schedules for their dispositions.

Software test results

Software test plans and results

Program/project leadership identifies and accepts software risks as required.

Confirm segments and components are ready for integration; all facilities are ready and available

Software test preparation (facilities, tools, equipment, personnel)

The software integration procedures and workflow have been clearly defined and documented or are on schedule to be clearly defined and documented before their need date.

  • The review of the software integration plans, as well as the procedures, configuration data maturity, load data maturity, and software configuration status of the items to be integrated, provides a reasonable expectation that the integration will proceed successfully.

Confirm support personnel are adequately trained.

Software Static Analysis tool results

Software maintainability considerations have been incorporated to ensure the ease and efficiency of software changes.

Confirm that all previous design review success criteria and critical issues have been satisfied per the agreed-upon plan.

Software code coverage data

Required software facilities and tools are sufficient for system integration and testing.

  • Engineering test units and/or modeling and simulations have been developed and tested per plan.

Confirm that all electrical and mechanical interfaces have completed qualification testing before integration.

Software cyclomatic complexity data
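Cyclomatic complexity is normally reported by static analysis tools, but as background, it can be computed from the control-flow graph as M = E − N + 2P (edges, nodes, connected components). The sketch below is illustrative only; the function name and example counts are assumptions, not handbook material.

```python
# Illustrative sketch (not handbook material): cyclomatic complexity from
# control-flow-graph counts, M = E - N + 2P, where E = edges, N = nodes,
# and P = connected components (1 for a single function).

def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    return edges - nodes + 2 * components

# A function containing one if/else has 4 nodes and 4 edges -> M = 2,
# i.e., two independent paths through the code.
print(cyclomatic_complexity(edges=4, nodes=4))  # -> 2
```

Projects typically track this metric per function and flag values above a tailored threshold as candidates for refactoring or additional test coverage.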

Software system integration processes, facilities, and methods are within acceptable risk from threats and cybersecurity vulnerabilities.

Review any safety or handling issues and address any possible risks during testing.

Software metrics and measurement data

 

Adequate technical and programmatic margins (e.g., data throughput, memory) and resources exist to complete the software development within budget, schedule, and known risks.

Assess all handling and safety requirements; identify any issues or risks

Software process audit results

Risks to safety and mission success are understood and credibly assessed and plans and resources exist to effectively manage them.

Confirm that all software components and data are under configuration control.

Software data dictionary

Software data dictionary fields are complete and correct. (Guideline: more than 95% of the data dictionary data definitions are complete.)


Submit RFAs/RIDs for any identified risks or issues

System integration testing includes testing for all known software hazards.


Handling and safety requirements

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
    • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software Fault Tree results.

Functional, unit-level, subsystem, and qualification test results/proof of completion

Software products (code and data) are approved, baselined, and placed under configuration management.


Interface control documentation

IV&V concurs that the project’s software requirements, design, and testing are sufficiently mature to proceed to system integration. (If IV&V is required)

Additional Supporting Material

  • Integration plans and procedures completed and approved
  • Segments and/or components available for integration
  • Mechanical and electrical interfaces verified against the interface control documentation
  • All applicable functional, unit-level, subsystem, and qualification testing was conducted successfully
  • Integration facilities, including clean rooms, ground support equipment, electrical test equipment, and simulators ready and available
  • Support personnel adequately trained
  • Handling and safety requirements documented
  • All known system discrepancies identified and disposed of in accordance with the agreed-upon plan
  • All previous design review success criteria and key issues were satisfied following the agreed-upon plan
  • Quality control organization is ready to support the integration effort

Software Assurance:

  • Confirm integration plans and procedures are in place and approved
  • Confirm segments and components are ready for integration; all facilities are ready and available
  • Confirm all mechanical and electrical interfaces have been verified; Confirm all applicable qualification testing has been completed
  • Confirm support personnel are adequately trained
  • Assess all handling and safety requirements; identify any issues or risks
  • Confirm that all previous design review success criteria and key issues have been satisfied as per previously agreed-upon plan
  • Confirm all known system discrepancies have been identified and disposed of in accordance with the agreed-upon plan
  • Confirm software assurance/control organization is prepared to support the integration
  • Integration plans and procedures
  • Interface control documentation
  • Functional, unit-level, subsystem, and qualification test results/proof of completion
  • Test preparation (facilities, tools, equipment, personnel)
  • Handling and safety requirements
  • V&V planning, test plans

Software Assurance:         

  • Review all documentation, attend review
  • Submits RFAs/RIDs for any identified risks or issues
  • The review panel agrees that:
    • Adequate integration plans and procedures are completed and approved for the system to be integrated
    • Previous component, subsystem, and system test results form a satisfactory basis for proceeding to integration
    • Integration procedures and workflow have been clearly defined and documented
    • A review of integration plans, as well as procedures, environment, and configuration of items to be integrated, provides a reasonable expectation that integration will proceed successfully
    • Integration personnel have received the appropriate training in integration and safety procedures
  • The risk level is identified and accepted by program/project leadership, as required

Software Assurance:

  • Agrees with the panel that exit criteria have been met
  • Agree with resolutions or action plans for RFAs/RIDs; track them to closure
  • Confirm that all issues, risks, RFAs/RIDs are documented

Test Readiness Review (TRR)

The TRR ensures that the test article (hardware/software), test facility, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7123.1 041)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

All previous design review success criteria and critical issues were satisfied per the agreed-upon plan.

Completed evaluations of the software unit, verification, and integration test plans, procedures, and results

The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software.

  • Acceptable planned use of STB (Simulation Test Bed) instead of using flight hardware for flight certification.
  • The OTS software components will be verified and validated to the same level required to accept a similarly developed software component for its intended use.
  • Software verification approach and regression testing approach are established for the project.
  • The software testing approach is comprehensive, and the planning for system assembly, integration, test, and launch site and mission operations is sufficient to progress into the next phase.
  • Engineering test units, modeling, and simulations have been developed and certified per plan.
  • Verify that the project has addressed the risks of developing software concurrently with the hardware design: misunderstanding the software-hardware interfaces, and not having the hardware available for use during software development.
  • The software verification and validation requirements and plans are complete, including stress testing, end-to-end testing, fault management, unauthorized use of software applications, command issuing, memory changes, and data changes to the software configuration, and software regression testing plans.
  • Software test procedures are complete, bi-directional traceability between the software requirements and software test procedures is complete, and the software test coverage approach is approved.
  • All safety- and mission-critical software test procedures have been through a peer review with the required participation.
  • The software configuration data has been tested.
  • Informal software test runs are completed.

Completed code reviews and walkthroughs, test plan and test procedures, peer reviews

Software and Technical Metric Data and Reports

All software-related TBD and TBR items have been resolved.

  • The project has addressed all Severity 1 or 2 findings.

Confirm adherence to a secure coding standard

Bidirectional traceability matrix

Software risks to safety and mission success are understood and credibly assessed.

  • Software risks, issues, and findings are documented.
  • The project has addressed all software assurance and Severity 1 or 2 findings.
  • Verify that the project has addressed the risks of developing software concurrently with the hardware design: misunderstanding the software-hardware interfaces, and not having the hardware available for use during software development.

Configuration management of test artifacts in place

Code Analysis and Assessment Results

The software test data has been reviewed and analyzed for expected results and the results are consistent with the test plans and objectives.

Software build created from CM and ready for testing.

Software Requirements and software design information

The software test team has been trained and has demonstrated knowledge of the software test processes, configuration control processes, and defect reporting processes.

Confirm all known discrepancies are identified and addressed in an agreed-upon manner.

Results for all software testing completed to date

System Hazard Analyses address or include all known software hazards:

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analyses (FMEAs) and software fault tree results.

Confirm audits or assessments of documentation and processes have been performed throughout the implementation.

Software builds procedures for testing

All software static code analysis has been completed, and appropriate findings have been corrected.

Confirm end-to-end functional flows and database linkages will be tested

Software data dictionary

The software requirements are baselined, and adequate documentation exists to allow proceeding with software implementation and software testing.

  • All software requirements are verifiable.
  • All safety- and mission-critical software designs/requirements have been through a peer review with the required participation.
  • TBD and TBR items are identified with acceptable plans and schedules for their disposition, and the software volatility metric is less than 20%. The % of TBD/TBC/TBRs in the software requirements document does not exceed 1%.
  • The total software requirements quality risk score is above 4 with 0% of the detailed software requirements scoring at a high or very high-risk level.
  • The software requirement volatility is less than 5%.
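The stability thresholds above are simple ratios. A minimal sketch, using hypothetical counts and taking "volatility" as the fraction of baselined requirements added, deleted, or modified in the reporting period (the exact definition is project-specific):

```python
# Illustrative sketch of the requirements-stability metrics cited above.
# The counts are hypothetical; the formulas are the usual simple ratios,
# not a NASA-defined calculation.

def volatility_pct(added, deleted, modified, total_baselined):
    """Percent of baselined requirements that changed in the period."""
    return 100.0 * (added + deleted + modified) / total_baselined

def tbd_pct(tbd_tbr_tbc_count, total_requirements):
    """Percent of requirements still carrying TBD/TBR/TBC items."""
    return 100.0 * tbd_tbr_tbc_count / total_requirements

v = volatility_pct(added=2, deleted=1, modified=3, total_baselined=400)
t = tbd_pct(tbd_tbr_tbc_count=3, total_requirements=400)
print(f"volatility={v:.1f}%  TBD/TBR={t:.2f}%")  # volatility=1.5%  TBD/TBR=0.75%
assert v < 5.0 and t <= 1.0  # thresholds from the exit criteria
```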

Confirm processes and procedures are in place for testing

Software change requests

Software data dictionary fields are correct and complete and have been tested.

  • The software configuration data has been tested.

Version Description Document

Software Test Plans

Certifiable software development practices by the organizations developing the critical software components exist and are being followed.

  • Evidence exists in software process audit results.
  • Evidence that the software development organization followed the required processes for this point in the software lifecycle.

Software Test Analysis

Status of known software or system discrepancies

Software products under test and used for testing are approved, baselined, and placed under configuration management.

Confirm any waivers or deviations are approved

Test contingency planning

An acceptable approach for independent software testing is planned.

Confirm all plans have been updated as needed and are ready for testing.

Software Test Analysis results

Software Bidirectional traceability between the requirements and test procedures is complete and accurate.

Confirm any necessary verification of computations has been completed

Software defect or problem reports

All identified safety-critical software components have a cyclomatic complexity value of 15 or lower.
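Projects typically verify the complexity threshold with a qualified static analysis tool. As a rough illustration of what such a check does, the sketch below approximates cyclomatic complexity as one plus the number of decision points, using Python's standard `ast` module (the sample function is made up):

```python
# Illustrative approximation only: cyclomatic complexity ~= 1 + decision
# points. A flight project would use a qualified analysis tool; this sketch
# just shows the threshold comparison against the limit of 15.
import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
             ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_source):
    """Count 1 + decision-point nodes in the given source string."""
    tree = ast.parse(func_source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

src = """
def mode_select(a, b, c):
    if a and b:
        return 1
    for x in c:
        if x:
            return 2
    return 0
"""
cc = cyclomatic_complexity(src)   # 1 + if + boolop + for + if = 5
assert cc <= 15, "exceeds safety-critical complexity threshold"
```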

Confirm findings from all previous analyses have been resolved as agreed-upon

Software test facilities definition and fidelity

100 percent code test coverage has been achieved for all identified safety-critical software components using the Modified Condition/Decision Coverage (MC/DC) criterion.
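MC/DC requires that each condition in a decision be shown to independently affect the decision's outcome. A minimal sketch of that idea, for a made-up three-condition decision (not from any flight software): for each condition, find two test vectors that differ only in that condition and yield different outcomes.

```python
# Illustrative sketch of the MC/DC "independence pair" concept: for each
# condition, find a pair of test vectors differing only in that condition
# whose decision outcomes differ. The decision function is a hypothetical
# example; coverage tools establish this over real source code.
from itertools import product

def independence_pairs(decision, n_conditions):
    """Map each condition index to one demonstrating pair, if any exists."""
    pairs = {}
    for i in range(n_conditions):
        for vec in product([False, True], repeat=n_conditions):
            flipped = list(vec)
            flipped[i] = not flipped[i]
            if decision(*vec) != decision(*flipped):
                pairs[i] = (vec, tuple(flipped))
                break
    return pairs

# Hypothetical decision: act when enabled AND (commanded OR override)
decision = lambda a, b, c: a and (b or c)
pairs = independence_pairs(decision, 3)
assert len(pairs) == 3  # every condition independently affects the outcome
```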

Applicable functional, unit-level, subsystem, system, and qualification testing were conducted successfully.

 

Software Static Analysis tool results

The software Version Description Document is complete and accurate.

Confirm resources needed and facility are prepared for testing

Software test cases, scenarios, databases, procedures, environment, expected results, and configuration of test item(s)

Software structural quality is evident in the software products.

  • Software architecture quality exists.
  • The code is maintainable.
  • Safety-critical and secure coding standards are used.
  • The code is fault tolerant.
  • The code is secure.
  • The code is testable.

Confirm that the software build is ready for testing and that previous testing has been completed successfully.

Resources (people, facilities, tools, etc.)

IV&V concurs that the project’s software requirements, design, and testing are sufficiently mature to proceed to software testing and integration. (If IV&V is required)

Confirm that all baselined documents from previous reviews are available

Status of quality assurance (QA) activities


Confirm TRR-specific materials are available (e.g., test plans, test procedures, test cases, version description document, test schedule, test status, test data)

Submit any RFAs/RIDs on identified issues or risks


Confirms that all requirements have a corresponding test(s) in the test procedures document, including changes due to discrepancy reports, and all requirements are up to date in the traceability matrix

Supplier Software VDD(s)


Databases for integration and testing have been created and validated

Test network


Expected results defined

Test Preparation


Have an agreed-upon plan for SA review of testing (witnessing, sampling, etc.)

Test schedule


Have assessed bidirectional traceability from requirements to test procedures/cases

VDD and VDD audit results


Informal dry run completed without errors.

Severity 1 or 2 findings


Known problems, issues

Baseline documentation from previous reviews


Lessons learned captured from the software areas of the project

Current risks, issues, or requests for action (RFAs)


Objectives of tests

Risk analysis, list, and management plan


Outstanding software change requests (SCRs) ready for review

Interface definitions


Procedures in place for use for testing: capturing test results, reporting discrepancies

Requirements Analysis and Traceability Reports


Review results of testing completed to date, any outstanding software change requests

Software cost estimate and expenditures report


Safety-critical and security considerations have been identified.

Operations and user manuals


Successful functional configuration audit (FCA) of the version description document (VDD) (such as FSW), including fixes



Test network showing interdependencies among test events and planned time deviations for these activities prepared



Test Plans, Test Procedures, Test Cases with applicable data, Test Scenarios, Test Environment



Tests reusable for regression testing exist.



Validation of operations and user manuals completed.



Verification of computations using nominal and stress data



Code inspection results



Verification of end-to-end functional flows and database linkages



Verification of performance throughout the anticipated range of operating conditions, including nominal, abnormal, failure, and degraded mode situations



Completed functional configuration audit of the VDD



Confirm any problems/issues have been documented.



Additional Supporting Material

General

  • All TRR-specific materials, such as test plans, test cases, procedures, and version description documents available to all participants before TRR
  • Updated baselined documentation available (from previous reviews – SwRR, SRR, PDR, CDR)
  • Required documents are in the state/status required; any required deviations or waivers are in place and approved
  • All known system discrepancies identified and disposed of in accordance with the agreed-upon plan
  • Software cost estimate updated, and software-related expenditures collection and report by life cycle phases available
  • Test schedules are updated and are reasonable based on the results of unit testing
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)

Software Assurance:

  • Confirm that all baselined documents from previous reviews are available
  • Confirm any waivers or deviations are approved
  • Confirm TRR-specific materials are available (e.g., test plans, test procedures, test cases, version description document, test schedule, test status, test data)
  • Confirm all known discrepancies are identified and addressed in an agreed-upon manner
  • Assess schedules for reasonableness
  • Lessons learned captured from the software areas of the project
  • Confirm all input criteria for TRR have been met

Plans

  • Objectives of testing and testing approach clearly defined and documented and supported by:
    • Test plans, test cases, procedures, environment
    • Defined configuration of test item(s)
  • Interfaces placed under configuration management or defined in accordance with an agreed-to plan
  • All required test resources, people (including a designated test director), facilities, test articles, test instrumentation, and other test-enabling products identified and available to support required tests
  • Facilities and tools for integration and test ready, qualified, validated, and available for operational use including test engineering products (test cases, procedures, tools, etc.), testbeds, simulators, and models
  • The roles and responsibilities of all test participants were defined and agreed to and all personnel have been trained
  • Software Version Description(s) available
  • Metric data and reports (implementation and test) ready for review
  • IV&V report/status - if applicable
  • Any current risks, issues, or requests for action (RFAs) that require follow-up and how they will be tracked to closure ready for review
  • Risk analysis and risk list updated and associated risk management plan updated
  • The test plan includes test scenarios:
    • User-defined scenarios to test interactive or operator-oriented software
    • Safety-critical scenarios
    • Security scenarios
    • For all software/system requirements defined in the bidirectional traceability matrix
    • Performance checks at the limit of ranges specified for the requirements and operational scenarios, including test limitations and constraints
  • Test case structure established that identifies per test case:
    • Software requirements to be tested and SW entities to be exercised
    • Required inputs
    • Facilities and test tools required, setup, and required qualifications
    • Limitations of the test environment
  • Test plan updated for integration and test activities
  • Software test procedures baselined, including safety criticality and security considerations:
    • Defined CM process and procedures used for testing and problem reporting/resolution activities
    • Process for capturing test data and storing it
    • Role of Quality Assurance including redlining and QA witnessing role and responsibilities
    • Any safety and security issues relevant to the testing activity
    • All workarounds and non-functioning software components

Software Assurance:

  • Confirm all plans have been updated as needed and are ready for testing, including:
    • Test Plans, Test Procedures, Test Cases with applicable data, Test Scenarios, Test Environment
    • Version Description Document
    • Any safety-critical and security considerations
  • Confirm processes and procedures are in place for testing:
    • Configuration management of test artifacts
    • Procedures used for testing; capturing test results; reporting discrepancies
  • Confirm resources needed and facility are prepared for testing
  • Have an agreed-upon plan for SA review of testing (witnessing, sampling, etc.)
  • Have assessed bidirectional traceability from requirements to test procedures/cases

Requirements

  • All requirements included in the baselined test procedure document and uniquely identified and traceable in the updated bidirectional traceability matrix (includes necessary corrections due to discrepancy reports)

Software Assurance:          

  • Confirms that all requirements have a corresponding test(s) in the test procedures document, including changes due to discrepancy reports and all requirements are up to date in the traceability matrix.
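The traceability confirmation above amounts to a completeness check in both directions: every requirement maps to at least one test, and every test traces back to a requirement. A minimal sketch, assuming a hypothetical matrix format (the identifiers below are invented for illustration):

```python
# Illustrative sketch only: checks a bidirectional traceability matrix for
# completeness. The dict-of-lists format and the SRS-*/TP-* identifiers are
# hypothetical, not a NASA-defined structure.

def check_traceability(req_to_tests, all_tests):
    """Return (requirements with no test, tests tracing to no requirement)."""
    untested = sorted(r for r, tests in req_to_tests.items() if not tests)
    traced = {t for tests in req_to_tests.values() for t in tests}
    orphans = sorted(set(all_tests) - traced)
    return untested, orphans

matrix = {
    "SRS-001": ["TP-10", "TP-11"],
    "SRS-002": ["TP-12"],
    "SRS-003": [],                 # incomplete: no verifying test
}
tests = ["TP-10", "TP-11", "TP-12", "TP-99"]  # TP-99 traces to nothing

untested, orphans = check_traceability(matrix, tests)
print("Requirements without tests:", untested)      # ['SRS-003']
print("Tests tracing to no requirement:", orphans)  # ['TP-99']
```

Both lists must be empty before the criterion is satisfied; either finding would be written up as a discrepancy or RFA/RID.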

Design

  • All previous design review success criteria and key issues were satisfied in accordance with the agreed-upon plan

Software Assurance:

  • Confirms all previous review criteria have been satisfied and any key issues resolved in accordance with the agreed-upon plan

Analysis

  • Outstanding software change requests (SCRs) ready for review
  • Code inspection results available for review
  • Results of testing completed to date available for review:
    • Objectives of tests
    • Expected results defined
    • Known problems, issues
    • Deviations, waivers

Software Assurance:

  • Confirm findings from all previous analyses have been resolved as agreed-upon
  • Participate in code reviews and walkthroughs, test plans and test procedures peer reviews
    • Confirm adherence to standards
  • Review results of testing completed to date, any outstanding software change requests
    • Confirm any problems/issues have been documented

Other

  • Software build created from CM and ready for testing
  • Applicable functional, unit-level, subsystem, system, and qualification testing conducted successfully
  • Informal dry run completed without errors
  • Validation of operations and user manuals completed
  • Successful functional configuration audit (FCA) of the version description document (VDD) (such as FSW) including fixes
  • Tests reusable for regression testing exist
  • Databases for integration and testing have been created and validated
  • Test network showing interdependencies among test events and planned time deviations for these activities prepared
  • Verification of computations using nominal and stress data
  • Verification of performance throughout the anticipated range of operating conditions including nominal, abnormal, failure, and degraded mode situations
  • Verification of end-to-end functional flows and database linkages
  • Exercise logic switching and executive control options at least once

Software Assurance:

  • Confirm audits or assessments of documentation and processes have been performed throughout the implementation
  • Confirm software build is ready for testing and that previous testing has been completed successfully
  • Conduct or participate in a functional configuration audit of the VDD
  • Confirm any necessary verification of computations has been completed
  • Confirm end-to-end functional flows and database linkages will be tested
  • Test Preparation
    • Test plans, test cases, scenarios, databases, procedures, environment, expected results, and configuration of test item(s)
    • Software build ready for testing
    • Resources (people, facilities, tools, etc.)
    • Test schedule
    • Test contingency planning
    • Test network
  • Results for all testing completed to date
  • Interfaces
  • Software V&V Plan
  • VDD and VDD audit results
  • Software change requests
  • Bidirectional traceability matrix
  • Current risks, issues, or requests for action (RFAs)
  • Baseline documentation from previous reviews
  • Requirements and design
  • Status of quality assurance (QA) activities
  • Status of known system discrepancies
  • Software cost estimate and expenditures report
  • Supplier Software VDD(s)
  • Requirements Analysis and Traceability Reports
  • Code Analysis and Assessment Results
  • Metric Data and Reports
  • Operations and user manuals
  • Completed evaluations of the unit, integration tests
  • Risk analysis, list, and management plan

Software Assurance:          

  • Attend review and review documentation related to the review
  • Submit any RFAs/RIDs on identified issues or risks
  • The review panel agrees that:
    • Peer reviews completed for implementation and tests to be performed, as defined in the software plans
    • Adequate identification and coordination of required test resources are completed
    • Previous component, subsystem, and system test results form a satisfactory basis for proceeding into planned tests
    • All the entrance criteria have been met
    • Test cases have been reviewed and analyzed for expected results, and results are consistent with test plans and objectives
    • Test personnel have received the appropriate training in test operation and safety and security procedures
    • Provisions have been made should test levels or system response exceed established limits or if the system exceeds its expected range of response
    • The software is ready to be tested
    • Requirements, software implementations, and test plans are an adequate and feasible basis for integration and test activities
  • A formal dry test run completed
  • Adequate test plans are completed and approved to proceed for the system under test
  • The risk level associated with testing is identified (during TRR) and accepted by the appropriate program/competency leadership, as required
  • Products from this review are approved, baselined, and placed under configuration management

Software Assurance:

  • Confirm any risks, issues, RIDs/RFAs are documented
  • Agrees with the panel that exit criteria have been met by the review team
  • Agrees that all RFAs/RIDs have been satisfactorily resolved or have an action plan to close RFA/RIDs
  • Track RFAs/RIDs to closure

System Acceptance Review (SAR)

The SAR verifies the completeness of the specific end item with respect to the expected maturity level and assesses compliance with stakeholder expectations. The SAR examines the system, its end items and documentation, and test data and analyses that support verification. It also ensures that the system has sufficient technical maturity to authorize its shipment to the designated operational facility or launch site. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Completed software test results

Open items and waivers/deviations

The software is deemed acceptably safe for flight/mission operations.

  • The software safety-critical requirements were met.
  • Safety risk is accepted by stakeholders impacted by that risk.
  • The software meets the established project acceptance criteria.
  • Evidence exists that the code meets NPR 7150.2 requirement SWE-134 and the criteria in NASA SSP 50038, Computer-Based Control System Safety Requirements.
  • Software change history is understood and is acceptable.

Software version description document

Open software defect or problem reports

The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes.

Software Maintenance Plan

Baselined/Updated Software Maintenance Plan

Software testing is complete, and all known software defects have been dispositioned.

  • Software test results have been analyzed and are consistent with the operational environment for the mission.
  • All software has been verified and unit-tested against the basis for the functional software test criteria (i.e., the software requirements).
  • Operational procedures for updating the software and data loads are acceptable and have been tested.
  • Software systems are secure.
  • The software configuration data has been tested.
  • All software-related TBD and TBR items have been resolved, and all waivers/deviations and anomalies have been closed.

Software user's manual

Software Test Results

System Hazard Analyses address or include all known software hazards:

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analyses (FMEAs) and software fault tree results.

Software Data Loads

Software version description document

All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program.

  • An acceptable operational risk exists based on the number of known software defects remaining open for the flight.

Confirm that the entrance criteria for the review have been met

Software Audit Results

Software data dictionary fields and content are correct and have been tested.

  • The software configuration data has been tested.

Confirm that software assurance has signed off on the certification package

Software Quality Data

The software version description document, user manual, and maintenance plan are completed and ready to use.

Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems

Baseline software user's manual

Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered software code. Software code coverage meets the project criteria (100% code coverage for safety-critical code).

  • 100 percent code test coverage has been achieved for all identified safety-critical software components using the Modified Condition/Decision Coverage (MC/DC) criterion.

Confirm that the system and supporting elements are correctly configured and ready to support flight

Software Metric Data

Data or evidence that shows good software processes exist and were followed by the software development organizations.


Lessons learned captured from the software areas of the project

Tests, demonstrations, analyses, audits

Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software.


Status of failures and anomalies from previously completed flights and reviews

Software structural and functional quality is evident in the software products.

  • Software architecture quality exists.
  • The code is maintainable.
  • Safety-critical and secure coding standards are used.
  • The code is fault tolerant.
  • The code is secure.
  • The code is testable.
  • The code meets the requirements.

System configuration

Software products (code and data) are approved, baselined, and placed under configuration management.


System and support elements configuration confirmation

All identified safety-critical software components have a cyclomatic complexity value of 15 or lower.


Status of interface compatibility and functionality

IV&V concurs that the project’s software is sufficiently mature to proceed to operation. (If IV&V is required)


System state



Review materials prepared for review; attend review


Additional Supporting Material

  • A final agenda coordinated (nominally)
  • Technical products for this review were made available to participants prior to the SAR
  • Acceptance test readiness available for review
    • Process for Analysis of Test Results
    • Acceptance Test testbed (environment) setup (hardware)
    • Setup and use of Simulators or other Test tools and their required qualifications
    • Limitations of the testbed (environment)
    • Tests that require hardware for verification and/or human input
    • Description, at a high level, of what each test does, how long it lasts, and any special circumstances
    • IV&V report/status - if applicable
    • Preparedness for Acceptance Testing
    • Requests For Action (RFAs)
    • The decision to proceed to Acceptance Testing
  • Results available from SARs conducted at major suppliers
  • Transition to production and/or manufacturing plan exists
  • Product verification results / final test reports available
  • Product validation results available
  • Acceptance plans and acceptance criteria
  • Documentation exists to confirm that the delivered system complies with the established acceptance criteria
  • Documentation exists to confirm that the system will perform properly in the expected operational environment
  • Technical data package updated to include all test results
  • Certification package available for review
  • Risk assessment and mitigations updated
  • Previous milestone reviews completed
  • Metrics data and reports available for review
  • Remaining liens or unclosed actions and plans for closure available for review
  • Waivers and deviations available for review
  • The software build has been updated
  • Functional audit (FCA) completed
  • Software presentation prepared (for SAR):
    • Software overview
    • Project System Diagram
    • Functional software overview
    • Software products/artifacts
    • Software traceability matrix examples
    • Software Test Procedures status
    • Open RIDs
    • Open SCRs
    • Software summary and recommendations
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the Lesson Learned was, and its applicability to future projects)

Software Assurance:

  • Confirmation that all previous milestone reviews have been completed, including SARs conducted by suppliers
  • Confirm system acceptance testing has been completed with product verification and validation results available for review
  • Conduct or participate in a functional configuration audit
  • Review metrics data analysis
  • Review any open RIDs, software change reports, issues, or unclosed actions
  • Confirm software documentation has been updated, as needed and the following is available for review: project system design, functional software overview, software products, traceability matrix examples, status and results of the previous testing, technical data package, open RIDs, SCRs, lessons learned
  • Review materials in advance of the review
  • Test readiness information
  • Results of the SARs conducted at the major suppliers
  • Transition to production and/or manufacturing plan
  • Product verification results/test reports
  • Product validation results
  • Baseline Software Build
  • Certification package, if acquiring software or using COTS/GOTS/MOTS/open-source software
  • Documentation that the delivered system complies with the established acceptance criteria
  • Documentation that the system will perform properly in the expected operational environment
  • Technical data package
  • Risk assessment and mitigation
  • Hazard report
  • Results/proof of completion for previous milestone reviews
  • Remaining liens or unclosed actions and plans for closure
  • Waivers/deviations
  • Metrics Data and Reports

Software Assurance:         

  • Review materials prepared for review; attend review
  • Submit any RFAs/RIDs on identified issues or risks
  • The review panel agrees that:
    • Required tests and analyses are complete and indicate that the system will perform properly in the expected operational environment
    • Risks are known and manageable
    • Software system meets established acceptance criteria
    • Required safe shipping, handling, and checkout procedures are complete and ready for use
    • Required operational plans and procedures are complete and ready for use
    • The technical data package is complete and reflects the delivered system, including the software user's manual and version description document
    • All applicable lessons learned for organizational improvement and system operations are captured
    • The software system has sufficient technical maturity to authorize shipment to the designated operational facility or launch site

Software Assurance:

  • Confirm issues, risks, RFAs, and RIDs are documented
  • Agree with the panel that exit criteria have been met by the review team
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan for closure
  • Track RFAs/RIDs to closure

Operational Readiness Review (ORR)

The ORR examines the actual system characteristics and the procedures used in the system or product's operation and ensures that all system and support (flight and ground) hardware, software, personnel, and procedures are ready for operations and that user documentation accurately reflects the deployed state of the system. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Completed software test results

Open items and waivers/deviations

The software is deemed acceptably safe for flight/mission operations.

  • The software safety-critical requirements were met.
  • Safety risk is accepted by stakeholders impacted by that risk.
  • The software meets the established project acceptance criteria.
  • Evidence exists that the code meets NASA NPR 7150.2 requirement SWE-134 and the criteria in NASA SSP 50038, Computer-Based Control System Safety Requirements.
  • Software change history is understood and is acceptable.

Software version description document

Open software defect or problem reports

The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes.

Software Maintenance Plan

Baselined/Updated Software Maintenance Plan

Software testing is complete, and all known software defects have been dispositioned.

  • Software test results have been analyzed and are consistent with the operational environment for the mission.
  • All software has been verified and unit-tested per the basis for the functional software test criteria (i.e., the software requirements).
  • Operational procedures for updating the software and data loads are acceptable and have been tested.
  • Software systems are secure.
  • The software configuration data has been tested.
  • All software-related TBD and TBR items have been resolved, and all waivers/deviations and anomalies have been closed.

Software user's manual

Software Test Results

System Hazard Analyses address or include all known software hazards.

  • The software hazards are clearly defined.
  • Traceability between software requirements and hazards is complete.
  • Hazard controls are defined.
  • Hazard verifications are defined.
  • All software requirements that trace to a hazard analysis are to be verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software fault tree results.

Software Data Loads

Software version description document

All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program.

  • An acceptable operational risk exists based on the number of known software defects remaining open for the flight.

Confirm that the entrance criteria for the review have been met

Software Audit Results

Software data dictionary fields and content are correct and have been tested.

  • The software configuration data has been tested.

Confirm that software assurance has signed off on the certification package

Software Quality Data

The software version description document, user manual, and maintenance plan are completed and ready to use.

Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems

Baseline software user's manual

Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered code. Software code coverage meets the project criteria (100% code coverage for safety-critical code).

  • 100 percent code test coverage has been achieved for all identified safety-critical software components using the Modified Condition/Decision Coverage (MC/DC) criterion.
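The MC/DC criterion can be illustrated with a small, hypothetical example (the decision function below is an illustrative assumption, not taken from the handbook). For a decision with n conditions, MC/DC requires test vectors showing that each condition independently affects the decision outcome; a minimal set of n+1 vectors can achieve this, whereas simple decision coverage needs only two.

```python
from itertools import product

def decision(a, b, c):
    # Hypothetical 3-condition decision (e.g., a safety interlock)
    return (a or b) and c

def satisfies_mcdc(vectors):
    """Check that each of the three conditions has an 'independence pair':
    two vectors in the set differing ONLY in that condition, with
    different decision outcomes."""
    for i in range(3):
        found = False
        for v, w in product(vectors, repeat=2):
            # v and w must agree everywhere except position i
            differs_only_i = all((v[j] == w[j]) == (j != i) for j in range(3))
            if differs_only_i and decision(*v) != decision(*w):
                found = True
        if not found:
            return False
    return True

# Four vectors (n+1) suffice for MC/DC on this 3-condition decision
mcdc_set = [(True, False, True), (False, True, True),
            (False, False, True), (True, False, False)]

# Two vectors give decision coverage (one True, one False outcome)
# but do NOT satisfy MC/DC
decision_cov_only = [(True, False, True), (False, False, False)]
```

Running `satisfies_mcdc(mcdc_set)` confirms the four-vector set meets the criterion, while the two-vector set that merely exercises both decision outcomes does not.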

Confirm that the system and supporting elements are correctly configured and ready to support flight

Software Metric Data

Data or evidence shows that good software processes exist and were followed by the software development organizations.


Lessons learned captured from the software areas of the project

Tests, demonstrations, analyses, audits

Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software.


Status of failures and anomalies from previously completed flights and reviews

Software structural and functional quality is evident in the software products.

  • Software architecture quality exists.
  • The code is maintainable.
  • Safety-critical and secure coding standards are used.
  • The code is fault tolerant.
  • The code is secure.
  • The code is testable.
  • The code meets the requirements.

System configuration

Software products (code and data) are approved, baselined, and placed under configuration management.


System and support elements configuration confirmation

All identified safety-critical software components have a cyclomatic complexity value of 15 or lower.
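As a rough illustration of how a complexity threshold like this might be checked, the sketch below approximates McCabe cyclomatic complexity for Python source by counting decision points with the standard `ast` module. The counting rules and the sample function are illustrative assumptions; projects would normally rely on their own static-analysis tooling rather than this sketch.

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate cyclomatic complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    complexity = 1
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While,
                             ast.ExceptHandler, ast.IfExp, ast.Assert)):
            complexity += 1
        elif isinstance(node, ast.BoolOp):
            # each extra operand in an and/or chain adds a decision
            complexity += len(node.values) - 1
    return complexity

SAMPLE = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
# two `if` decisions -> complexity 3, well under a 15 threshold
```

A gate like the criterion above could then flag any component where `cyclomatic_complexity(src) > 15` for rework or justification.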


Status of interface compatibility and functionality

IV&V concurs that the project’s software is sufficiently mature to proceed to operation. (If IV&V is required)


System state



Review materials prepared for review; attend review


Additional Supporting Material

  • All validation testing completed
  • Test failures and anomalies from the validation testing were resolved and results were incorporated into all supporting and enabling operational products
  • All operational supporting and enabling products (e.g., facilities, equipment, documents, updated databases) that are necessary for the nominal and contingency operations have been tested and delivered/installed at the site(s) necessary to support operations
  • Operations manual complete
  • Physical configuration audit (PCA) completed
  • Software inputs/contributions completed for:
    • Training provided to users and operators on correct operational procedures for the system
    • Ground systems readiness
      • Diagram describing the main functionality of the project, how parts interact, and the main flow of data between major functional parts
      • Problem reporting and change request process for discrepancy reports (DR), enhancement reports (ER), Database change requests (DCR)
      • Current DR, ER, and DCR status, including historical trend data, and details on current open DRs, ERs, DCRs
      • Key parts of the system, their current operational readiness, and how verified
      • Any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
      • Key interactions with other systems, their operational readiness, and how verified; any issues, how they will be handled, and workarounds available including when permanent fixes will be completed
      • Outstanding items that need to be completed before readiness is achieved along with the scheduled date
  • Software maintenance plan completed
    • When software is frozen, what types of fixes will be approved for implementation under a freeze, etc.
    • How the change control board (CCB) will handle software changes or bug fixes
  • Science planning and processing system readiness are available for review, as applicable:
    • Diagram describing science data processing products and general timelines involved
    • Diagram describing science system context (relationship of main Mission Operations Center, Mission Planning Office, Science Validation Facility, Ground stations, interconnecting networks, and the main science data Instrument teams)
    • Description of these main components in high-level detail, including planning and processing functions; any special cases for launch, in-orbit checkout, end of mission, etc.; and a description of the testing done to verify and validate these components, with results and issues
    • Summary of all testing done, results, and outstanding issues for Science Data Processing
  • Safety and security issues status available for review:
    • Software issues with safety, how addressed, and current status
    • Software issues with security, how addressed, and current status
  • Simulations status available for review:
    • The number and main details for simulations by subsystem exercised, for example, Launch, Attitude Control System, Command & Data Handling, Communication, Flight Software, Power System Electronics, Mission Operations Center, Pre-Launch, and others deemed important for the project
    • Outstanding issues from Simulation testing, schedule impact, workarounds, and risks; for workarounds, when will the problem/issue be permanently fixed
  • Contingencies and constraints available for review:
    • State of Contingency Flow Chart Book and any planned updates
    • List of current constraints on the system, state of the database that details these constraints, and any outstanding actions that need to be taken
    • Audits that were done and against what areas to verify constraints
    • Operational problem escalation process
    • Operational emergency notification process including telephone numbers to be called
  • Status of documentation readiness available for review:
    • Version Description Document(s); its location, and any outstanding issues
    • Baseline Software User's Manual; its location, and any outstanding issues
    • Software Operations Plan; its location, and any outstanding issues
    • Software Maintenance Plan; its location, and any outstanding issues
    • Planned software retirement activities; location, and any outstanding issues
  • Lessons Learned captured from software areas of the project (indicating the problem or success that generated the Lesson Learned, what the LL was, and its applicability to future projects)
  • Status of work remaining available for review:
    • All critical work that needs to be completed along with the expected completion date

Software Assurance:

  • Confirm that all entrance criteria for review have been met (or are not applicable)
  • Confirm all verification/validation activities have been completed and anomalies addressed in software and documentation
  • Confirm all operational and supporting products have been tested and delivered
  • Confirm all documentation has been updated and delivered, including maintenance manual, software user guide, operations manual
  • Conduct or participate in a physical configuration audit
  • Confirm that operational processes and maintenance processes have been defined, including operational problem escalation, emergency notification, retirement activities, CCB processes, DR/PR process
  • Confirm complete system status is available for review: constraints and contingencies, status of outstanding issues, security and safety issue status, simulation status, science planning system status, training status
  • Prepare to report on the status and findings of any audits/assessments and metrics analysis done by SA
  • Validation test results/proof of completion
  • Status of test failures and anomalies from validation testing
  • Status of all testing, delivery, and installation for operational supporting and enabling products necessary for nominal and contingency operations
  • Status of software user's manual
  • Status of operations manual
  • Software Maintenance Plan
  • Science Planning and Processing System Readiness
  • Safety and Security Issues
  • The number and main details for simulations by subsystem exercised and any open issues
  • Contingencies and constraints
  • Status of documentation readiness
  • Work Remaining

Software Assurance:

  • Review materials prepared for review; attend review
  • Submit RIDs/RFAs on any identified issues, risks
  • Track RIDs/RFAs to closure
  • The review panel agrees that:
    • The system, including any enabling products, is ready to be placed in operational status
    • All applicable lessons learned for organizational improvement and systems operations have been captured
    • All waivers/deviations and anomalies have been closed
    • Systems hardware, software, personnel, and procedures are in place to support operations
    • All project and support h/w, s/w, and procedures are ready for operations and user documentation accurately reflects the deployed state of the entire system
  • RFA and review item discrepancy (RID) reports generated, as needed, as a result of this ORR

Software Assurance:

  • Confirm that all issues, risks, RIDs/RFAs have been documented
  • Agree with the panel that exit criteria have been met by the review team
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan for closure
  • Track RFAs/RIDs to closure

Flight Readiness Review (FRR)

The FRR examines tests, demonstrations, analyses, and audits that determine the system's readiness for a safe and successful flight/launch and subsequent flight operations. It also ensures that all flight and ground hardware, software, personnel, and procedures are operationally ready. (NPR 7120.5 082)

Entrance Criteria | Items Reviewed | Exit / Success Criteria

Completed software test results

Open items and waivers/deviations

The software is deemed acceptably safe for flight/mission operations.

  • The software safety-critical requirements were met.
  • Safety risk is accepted by stakeholders impacted by that risk.
  • The software meets the established project acceptance criteria.
  • Evidence exists that the code meets NASA NPR 7150.2 requirement SWE-134 and the criteria in NASA SSP 50038, Computer-Based Control System Safety Requirements.
  • Software change history is understood and is acceptable.

Software version description document

Open software defect or problem reports

The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes.

Software Maintenance Plan

Software Maintenance Plan

Software testing is complete, and all known software defects have been dispositioned.

  • Software test results have been analyzed and are consistent with the operational environment for the mission.
  • All software has been verified and unit-tested per the basis for the functional software test criteria (i.e., the software requirements).
  • Operational procedures for updating the software and data loads are acceptable and have been tested.
  • Software systems are secure.
  • The software configuration data has been tested.
  • All software-related TBD and TBR items have been resolved, and all waivers/deviations and anomalies have been closed.

Software user's manual

Software Test Results

All known software hazards have been identified, the hazards are sufficiently controlled, and all controls are comprehensively verified.

  • All software requirements that trace to a hazard analysis have been verified by test.
  • Software hazards address the software Failure Modes and Effects Analysis (FMEA) and software fault tree results.
  • All identified safety-critical software components have a cyclomatic complexity value of 15 or lower.

Software Data Loads

Software version description document

All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program.

  • An acceptable operational risk exists based on the number of known software defects remaining open for the flight.
  • Acceptable problem report summary including known errors, software functional limitations, software operational restrictions, adverse effects on safety, and justification for allowing the problem report to remain open.

Confirm that the entrance criteria for the review have been met

Software Audit Results

Software data dictionary fields and content are correct and have been tested.

  • The software configuration data has been tested.

Confirm that software assurance has signed off on the certification package

Software Quality Data

The software version description document, user manual, and maintenance plan are completed and ready to use.

Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems

Software user's manual

Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered code. Software code coverage meets the project criteria (100% code coverage for safety-critical code).

Confirm that the system and supporting elements are correctly configured and ready to support flight

Software Metric Data

Data or evidence shows that good software processes exist and are followed by software development organizations.

  • Demonstrated compliance with software criteria specified in the software plans.

Lessons learned captured from the software areas of the project

Tests, demonstrations, analyses, audits

Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software.


Status of failures and anomalies from previously completed flights and reviews

Software structural and functional quality is evident in the software products.

  • Software architecture quality exists.
  • The code is maintainable.
  • Safety-critical and secure coding standards are used.
  • The code is fault tolerant.
  • The code is secure.
  • The code is testable.
  • The code meets the requirements.

System configuration

Software products (code and data) are approved, baselined, and placed under configuration management.


System and support elements configuration confirmation

IV&V concurs that the project’s software is sufficiently mature to proceed to flight and operation. (If IV&V is required)


Status of interface compatibility and functionality



System state



Review materials prepared for review; attend review



Submit RIDs/RFAs on identified issues or risks.


Additional Supporting Material

  • Certification received that flight operations can safely proceed with acceptable risk
  • System and support elements confirmed as properly configured and ready for flight
  • Interfaces compatible and function as expected
  • The system state supports a launch "go" decision based on go/no-go criteria
  • Flight failures and anomalies from previously completed flights and reviews were resolved and results were incorporated into all supporting and enabling operational products.
  • A system configured for flight
  • Tests, demonstrations, analyses, and audits support flight readiness

Software Assurance:

  • Confirm that the entrance criteria for the review have been met
  • Confirm that software assurance has signed off on the certification package
  • Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems
  • Confirm that the system and supporting elements are properly configured and ready to support flight
  • Open items and waivers/deviations
  • System and support elements configuration confirmation
  • Status of interface compatibility and functionality
  • System state
  • Status of failures and anomalies from previously completed flights and reviews
  • System configuration
  • Tests, demonstrations, analyses, audits
  • Software user's manual

Software Assurance:

  • Review materials prepared for review; attend review
  • Submit RIDs/RFAs on identified issues or risks
  • Track RIDs/RFAs to closure
  • The review panel agrees that:
    • The flight vehicle is ready for flight
    • Software is deemed acceptably safe for flight (i.e., meeting the established acceptable risk criteria or documented as being accepted by the PM and Designated Governing Authority (DGA))
    • Flight and ground software elements are ready to support flight and flight operations
    • Interfaces are checked and found to be functional
    • Open items and waivers/deviations have been examined and found to be acceptable
    • Software contributions to all open safety and mission risk items have been addressed
    • Operators are ready and workarounds have been fully vetted
    • The software user's manual is ready and available to be used for testing

Software Assurance:

  • Confirm that all issues, risks, RIDs/RFAs have been documented
  • Agree with the panel that exit criteria have been met by the review team and sign the FRR documentation
  • Agree that all RFAs/RIDs have been satisfactorily resolved or have an action plan for closure
  • Track RFAs/RIDs to closure
