Entrance and Exit Criteria
Background
This guidance provides the recommended life cycle review entrance and exit criteria for software projects and should be tailored for the project class.
This topic describes the recommended best practices for entrance and success criteria for the software life cycle and technical reviews required by NASA. The guidelines apply regardless of whether the review is accomplished in a one-step or two-step process. The entrance criteria and items reviewed do not provide a complete list of all products and their required maturity levels. Additional programmatic products may also be required by the appropriate governing NPRs for the project/program.
Tailoring and customizing are expected for projects and programs. The entrance and success criteria and products required for each review will be tailored and customized appropriately for the particular program or project being reviewed and the classification of the software. The decisions made to tailor and customize life-cycle review criteria should be justified to both the Engineering and SMA Technical Authorities.
The recommended criteria in the following tables are focused on demonstrating acceptable software technical maturity, adequacy of software technical planning and credibility of budget, software schedule, risks (as applicable), and software readiness to proceed to the next phase. Customized or tailored criteria developed by programs or projects for life-cycle reviews should also be focused on assessing these factors. The software entrance and exit criteria guidance is a collection of material from the following core documents: NPR 7150.2 083 requirements, NASA-STD-8739.8 278 requirements, and NPR 7123.1, Appendix G 041.
See also 7.08 - Maturity of Life Cycle Products at Milestone Reviews.
1.1 References
- (SWEREF-041) NPR 7123.1D, Office of the Chief Engineer, Effective Date: July 05, 2023, Expiration Date: July 05, 2028
- (SWEREF-082) NPR 7120.5F, Office of the Chief Engineer, Effective Date: August 03, 2021, Expiration Date: August 03, 2026
- (SWEREF-083) NPR 7150.2D, Effective Date: March 08, 2022, Expiration Date: March 08, 2027, https://nodis3.gsfc.nasa.gov/displayDir.cfm?t=NPR&c=7150&s=2D Contains a link to the full-text copy in PDF format. Search for "SWEREF-083" for links to older NPR 7150.2 versions.
- (SWEREF-172) Defense Acquisition University, Production Date: 16 September 2013. Retrieved from https://at.dod.mil/sites/default/files/documents/DefenseAcquisitionGuidebook.pdf.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, Approved 2022-09-08, superseding NASA-STD-8739.8A.
1.2 Additional Guidance
Additional guidance related to this topic may be found in the following materials in this Handbook:
1.3 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
Mission Concept Review (MCR)
The MCR affirms the mission need and examines the proposed mission's objectives and the concept for meeting those objectives. Key technologies are identified and assessed. It is an internal review that usually occurs in the cognizant system development organization. ROM (Rough Order of Magnitude) budget and schedules are presented. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
The Project identified the software engineering point of contact. | Project/Program Staffing requirements and plans, Initial or draft cost estimate for software engineering, software assurance support, and IV&V support (if required) | Software planning is sufficient to proceed to the next phase. |
The Project identified the software assurance point of contact. | Draft software schedule | The software concept(s) meet the mission's and stakeholders' expectations. |
| Project-Level, System, and Subsystem Requirements | The program/project has demonstrated planned compliance with NASA NPR 7150.2 and NASA-STD-8739.8. |
| Systems Engineering Management Plan | The staffing resource requirements include adequate resources for software engineering, software assurance, and software safety support on the mission. |
| Partnerships, interagency, and international agreements | Software concepts and software reuse have adequately considered the use of existing assets or products that could satisfy the mission or parts of the mission. |
| Program/Project Plan | Mission objectives and operational concepts are clearly defined. |
| Mission objectives/goals and mission success criteria | The software classifications are correct. |
| Initial Concept of Operations | |
| Mission, Spacecraft, Ground, and Payload Architectures | |
| Heritage Assessment Documentation | |
| Industrial Base and Supply Chain Risk Management | |
| Top technical, cost, schedule, and safety risks, cybersecurity risks, risk mitigation plans, and associated resources | |
| Acquisition Strategy | |
Additional Supporting Material
System Requirements Review (SRR)
The SRR examines the functional and performance requirements defined for the system and the preliminary Program or Project Plan and ensures that the requirements and the selected concept will satisfy the mission. (NPR 7120.5 082)
- If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
- For software-only projects, the SwRR serves as the SRR.
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
The Project identified the software points of contact. | Preliminary Software Development Plan | The software reflects its intended operational use and represents capabilities likely to be achieved within the project's scope. |
The software engineering, software assurance, software safety, and IV&V (if applicable) personnel for the project have been defined. | Preliminary Software Management Plan | The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project. |
Completed initial software classifications and safety criticality assessments | Draft Software Assurance and Software Safety Plan | Software Bi-directional traceability is complete. |
Preliminary Hazard Analysis available | Preliminary Software Requirements | Content and maturity in the software plans, including the IV&V plan, are sufficient to begin Phase B. |
A completed NASA-STD-8739.8 mapping matrix. | Draft requirements mapping table for the software assurance and software safety standard requirements | Quality software requirements have been established. |
A completed NPR 7150.2 mapping matrix. | Preliminary Software Cost Estimate | System Hazard Analyses address or include all known software hazards. |
IV&V Project Execution Plan (if required) | Preliminary Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index) | Certifiable software development practices by the organizations developing the critical software components exist. |
Completed review of software Lessons Learned from previous similar missions | CMMI-Dev rating documentation | Software risk and mitigation strategies have been identified and are acceptable. |
| Preliminary IV&V Project risk assessment | Adequate IV&V planning and support are in place. (If IV&V is required) |
| Preliminary IV&V Project Execution Plan | |
| Mission, Spacecraft, Ground, and Payload Architectures | |
| Project-Level, System, and Subsystem Requirements | |
| Top software technical, cost, schedule, and safety risks, cybersecurity risks | |
| Preliminary Software Configuration Management Plan (SCMP) | |
| Preliminary Software schedules | |
| Staffing requirements and plans | |
| Preliminary System-level interface requirements for software | |
| System Security Plans | |
| Project Protection Plans | |
| Acquisition Strategy | |
| Preliminary Concept of Operations | |
| Configuration Management Plan | |
| Systems Engineering Management Plan | |
| Contracts and Statement of Work Documents | |
| Program/Project Plan | |
| Preliminary Safety and Mission Assurance Plan | |
| Mission objectives/goals and mission success criteria | |
| Human-Rating Certification Package (if crewed) | |
| Partnerships, interagency and international agreements (MOAs/MOUs) | |
| Human Systems Integration Plan (if crewed) | |
| Project Schedule |
Additional Supporting Material
Software Requirements Review (SwRR)
See the definition of SRR.
- If not performing a Software Requirements Review (SwRR), include SwRR criteria as part of SRR.
- For software-only projects, the SwRR serves as the SRR.
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Software Requirements | Software Requirements Specification | The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B. |
Bi-directional traceability | Software Assurance/Safety Plans | Quality software requirements have been established. |
Software Management/Development Plan | Software Management/Development Plan(s) | Software Bi-directional traceability is complete. |
Software Assurance/Safety Plan | Preliminary high-level software architecture defined. | System Hazard Analyses address or include all known software hazards. |
Completed software classification and safety criticality assessments are available | IV&V Project Execution Plan (if required) | Certifiable software development practices by the organizations developing the critical software components exist. |
NASA NPR 7150.2 requirements mapping matrix is complete | IV&V Project risk assessment (if required) | The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project. |
NASA-STD-8739.8 requirements mapping matrix is complete | Software Classification | Evidence that the software development organization followed the required processes for this point in the software lifecycle. |
Software Configuration Management Plan | Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index) | Software metrics to track the software quality and maturity are being used. |
Completed software assurance software requirements analysis | Software risks | Software risk and mitigation strategies have been identified and are acceptable. |
Completed the IV&V software requirements analysis | Software Assurance Requirement Analysis Results | Adequate IV&V planning, risk assessment, and support are in place. (If IV&V is required) |
The software engineering, software assurance, software safety, and IV&V personnel for the project and milestone review are defined. | Software FMEAs and Fault Trees |
Completed peer review(s) of the software requirements specification | Bidirectional traceability matrix. | |
Completed peer review(s) of the software plans. | Certified software development practices by the organizations developing the critical software components. |
CMMI-Dev rating documentation | Top technical, cost, schedule, and safety risks, cybersecurity risks, risk mitigation plans, and associated resources |
Software test facilities, needs, and capabilities identified. | IV&V concept review results (concept, reuse, architecture, ops maturity, feasibility, hazard, security) | |
IV&V Project Execution Plan | Results of any software and avionics architecture trade studies | |
Test tool requirements have been identified and plans for any necessary test development are in place. | Software Configuration Management Plans |
| Concept of Operations Documentation | |
| System Security Plans | |
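The bidirectional traceability items and criteria in the table above (the bidirectional traceability matrix and the "Software Bi-directional traceability is complete" success criterion) are often supported by an automated completeness check over the project's trace data. The sketch below is illustrative only; the CSV layout, column names (req_id, test_id), file name, and identifiers are assumptions for the example, not a format prescribed by NPR 7150.2 or this Handbook.

```python
# Illustrative sketch: flag requirements with no linked verification case and
# verification cases with no linked requirement (bidirectional trace gaps).
# Assumes a CSV export of the trace matrix with "req_id" and "test_id" columns;
# the file name, column names, and IDs below are placeholders.
import csv

def find_trace_gaps(matrix_csv, all_reqs, all_tests):
    linked_reqs, linked_tests = set(), set()
    with open(matrix_csv, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("req_id") and row.get("test_id"):
                linked_reqs.add(row["req_id"].strip())
                linked_tests.add(row["test_id"].strip())
    orphan_reqs = set(all_reqs) - linked_reqs     # requirements with no test
    orphan_tests = set(all_tests) - linked_tests  # tests with no requirement
    return orphan_reqs, orphan_tests

if __name__ == "__main__":
    reqs = {"SWE-REQ-001", "SWE-REQ-002", "SWE-REQ-003"}  # hypothetical IDs
    tests = {"TC-001", "TC-002"}                          # hypothetical IDs
    missing_reqs, missing_tests = find_trace_gaps("trace_matrix.csv", reqs, tests)
    print("Requirements without tests:", sorted(missing_reqs))
    print("Tests without requirements:", sorted(missing_tests))
```

Projects typically run an equivalent check from their requirements or test management tool; the point is that "complete" bidirectional traceability can be demonstrated with objective evidence rather than inspection alone.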
Additional Supporting Material
Mission Definition Review (MDR)
The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)
- MDR is equivalent to SDR for robotic projects.
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Software Requirements | Software Requirements | The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B. |
Bi-directional traceability | CMMI-Dev rating documentation | Quality software requirements have been established. |
NASA NPR 7150.2 requirements mapping matrix is complete | Bidirectional traceability matrix. | Software Bi-directional traceability is complete. |
NASA-STD-8739.8 requirements mapping matrix is complete | Preliminary Hazard Analysis | System Hazard Analyses address or include all known software hazards. |
Completed software classification and safety criticality assessments are available | Software and avionics architectures | The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project. |
Completed Software Plans | IV&V Project risk assessment | Certifiable software development practices by the organizations developing the critical software components exist. |
Completed the IV&V software requirements analysis | IV&V Project Execution Plan | Software metrics to track the software quality and maturity have been selected. |
Confirm RFAs and RIDs from SRR have been satisfactorily resolved | Software risks | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Identify software safety-critical components. | Software Engineering (NPR 7150.2) and Software Assurance (8739.8) Requirements Mapping matrices | All known software risks are identified and technically assessed. |
Identify the software assurance, software safety, and IV&V personnel for the project and milestone review. | Software Development Plan |
IV&V concept reviews are completed (concept, reuse, architecture, ops maturity, feasibility, hazard, security) | Software Management Plan |
Completed Software assurance and software safety requirements and task plans. | Safety and Mission Assurance Plan or Software Assurance Plan | |
Completed IV&V Project Execution Plan | Top technical, cost, schedule, and safety risks, risk mitigation plans, and associated resources | |
Completed peer review(s) of the software plans | Software Configuration Management Plan | |
Completed peer review(s) of the Software Requirements Specification | Operational Scenarios | |
Completed software assurance software requirements analysis | Concept of operations documentation |
Additional Supporting Material
System Definition Review (SDR)
The MDR (or SDR) examines the proposed requirements, the mission/system architecture, and the flow down to all functional elements of the system. (NPR 7120.5 082)
- MDR is equivalent to SDR for robotic projects.
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Software requirements | Requirements mapping table for the software assurance and software safety standard requirements | The project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B. |
Bi-directional traceability | Software requirements | Quality software requirements have been established. |
Completed Software Plans | Interface requirements documents, including Software interfaces | Software Bi-directional traceability is complete. |
Completed software classification and safety criticality assessments are available | Hazard Analysis and software controls and mitigations (Functional Hazard Analysis / Hazard Reports / Hazard Analysis Tracking Index) | System Hazard Analyses address or include all known software hazards. |
Identification of software safety-critical components by having a preliminary software hazard analysis completed | Software and avionics architectures | The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project. |
NASA NPR 7150.2 requirements mapping matrix is complete | Software data dictionary | Certifiable software development practices by the organizations developing the critical software components exist. |
NASA-STD-8739.8 requirements mapping matrix is complete | Software assurance requirement analysis results | Software metrics to track the software quality and maturity have been selected. |
Software data dictionary | Software classifications | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Completed Software Safety Assessment | Software architecture | All known software risks are identified and technically assessed. |
Completed software assurance software requirements analysis | Software Management Plan(s) | IV&V concurs that the project’s software plans, schedules, resources, and requirements are sufficiently mature to begin Phase B. (If IV&V is required) |
Completed the IV&V software requirements analysis | Software process audit results |
Completed peer review(s) of the software requirements | Software verification and validation (V&V) planning | |
Completed peer review(s) of the software plans | Bidirectional traceability matrix. | |
Preliminary Human Rating Plan, if applicable | Technical resource utilization estimates and margins | |
Preliminary Maintenance Plan | Software configuration management (CM) plan |
Software configuration management (CM) plan | Software Assurance (SA) and Software Safety Plan(s), including SA Product Acceptance Criteria and Conditions. | |
Completed IV&V Project Execution Plan | Software Development Plan |
Confirm RFAs and RIDs from MCR have been satisfactorily resolved | Software peer review results | |
Identify the software assurance, software safety, and IV&V personnel for the project and milestone review. | Software Assurance schedule |
The software assurance point of contact for the project has been identified. | Software coding guidelines |
| Software risks | |
| Software safety analysis results | |
| Cost estimate for the project’s Software Assurance support | |
| IV&V Project Execution Plan (if required) | |
| IV&V Project risk assessment (if required) | |
| System architecture, including avionics architecture | |
| Top technical, cost, schedule, and safety risks, risk mitigation plans, and associated resources | |
| Concept Documentation |
Additional Supporting Material
Preliminary Design Review (PDR)
The PDR demonstrates that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and establishes the basis for proceeding with detailed design. It shows that the correct design option has been selected, interfaces have been identified, and verification methods have been described. Full baseline costs and schedules, as well as risk assessments, management systems, and metrics, are presented. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Software requirements have been baselined at the software level | Preliminary Software Design Description | The software's preliminary detailed design is complete and meets the software requirements with adequate margins. |
Known software hazards have been identified. | Avionics architecture | Quality software requirements have been established and baselined. |
A software architecture exists | Bidirectional traceability | Interface control documents, software definitions, and requirements are sufficiently mature to proceed to CDR. |
Software classifications and safety criticality determination have been updated as necessary | Baseline software requirements | Software metrics to track the software quality and maturity are in use. |
Software development tools, computers, and facilities for implementation have been identified. | Software process audit results | System Hazard Analyses address or include all known software hazards. |
Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results | Preliminary interface control documents and interface design descriptions | Avionics architecture, components, and interfaces are defined and understood at the level needed to move to CDR. |
Software metrics and measurements exist. | Software design analysis results | Software risks to safety and mission success are understood and credibly assessed. |
Bidirectional traceability exists between the software requirements and the preliminary software design modules. | Software requirements analysis results | The software NPR 7150.2 and NASA-STD-8739.8 requirements tailoring have been completed and approved by the required technical authorities and the project. |
Preliminary software data dictionary | Planned software development tools, computers, and facilities | The problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control.) |
Preliminary Software Design has the content prescribed in the 7.18 documentation guidance for a Software Design Document. | Preliminary software data dictionary | Software tools and the programming language have been selected by the project, sufficient software analysis and development tools exist, and the software development organization is trained in the use of the software language. |
Resource utilization estimates and margins | Software development and test facilities | Hardware resources exist to develop and test the software. The required Engineering test units, modeling, and simulations have been identified and planned. |
RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete. | Baseline software architecture description | Software Bidirectional traceability is complete and accurate between the preliminary design and the software requirements. |
Software configuration management processes and tools are in place | Software Fault Tree Analysis (FTA), or Preliminary Software Failure Modes and Effects Analysis results | NASA has access to the software products in electronic format, including software development and management metrics. |
Software development and test facilities are planned and will be in place when needed. | Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Identify the software assurance, software safety, and IV&V personnel for the project and milestone review. | Software metrics and measurement data | IV&V concurs that the project’s software requirements and design are sufficiently mature to proceed to CDR. (If IV&V is required) |
Lessons learned captured from the software areas of the project | Preliminary software acceptance criteria and conditions | |
A preliminary high-level test plan identifies testing that needs to occur at each level of implementation/integration. | Baseline software configuration management plan and processes | |
Configuration Control Board established and Change Control Process in place. | Preliminary software test plans | |
Completed IV&V software design analysis | Status of software change requests | |
Completed peer review(s) of the software preliminary design | Baseline Software Engineering and Assurance schedules | |
Completed peer review(s) of the software requirements | Software Risks | |
Completed Software Assurance software design analysis | Prototype software, if applicable | |
Completed software assurance software requirements analysis | Resource utilization estimates and margins | |
Completed the IV&V software requirements analysis | Software Supplier documentation | |
Software Trade studies | ||
Completed software requirements analysis by software assurance and IV&V (if required) | ||
Baselined operational concepts | ||
Baseline cost estimate for the project’s Software support | ||
Software classifications and safety criticality determinations | ||
Record of trade-off criteria & assessment (make/buy decision) |
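Several PDR entrance items and exit criteria above refer to resource utilization estimates and margins (e.g., data throughput, memory). Margins are commonly reported as the percentage of capacity remaining beyond the current estimate. The short sketch below is a minimal illustration; the resource names, capacities, and estimates are hypothetical placeholder values, not recommended budgets.

```python
# Illustrative margin calculation for resource utilization reporting at PDR.
# All capacities and estimates below are hypothetical placeholder values.
def margin_percent(capacity, estimate):
    """Margin = (capacity - estimate) / capacity, expressed as a percentage."""
    return 100.0 * (capacity - estimate) / capacity

resources = {
    # resource name: (capacity, current estimate) -- placeholders
    "CPU (% of frame time)": (100.0, 55.0),
    "RAM (MiB)": (512.0, 300.0),
    "Nonvolatile storage (MiB)": (1024.0, 640.0),
    "Downlink (kbps)": (2000.0, 1200.0),
}

for name, (capacity, estimate) in resources.items():
    print(f"{name}: {margin_percent(capacity, estimate):.1f}% margin remaining")
```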
Additional Supporting Material
Critical Design Review (CDR)
The CDR demonstrates that the maturity of the design is appropriate to support proceeding with full-scale fabrication, assembly, integration, and test and that the technical effort is on track to complete the flight and ground system development and mission operations to meet mission performance requirements within the identified cost and schedule constraints. Progress against management plans, budget, and schedule, as well as risk assessments, are presented. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Bidirectional traceability exists between the software requirements and the software design modules. | Baseline software detailed design description | The software's detailed design is complete and meets the software requirements with adequate margins. |
Software requirements have been baselined at the software level | Avionics architecture | Interface control documents, software definitions, and requirements are sufficiently mature to proceed with software coding, software integration, and software testing, and plans are in place to manage any open items. |
Known software hazards have been identified. | Baselined bidirectional traceability | The software baseline requirements and adequate documentation exist to allow proceeding with software implementation and software testing. |
Completed Software Assurance software design analysis | Updated baselined software requirements | The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software. |
Completed Software Assurance software safety analysis | Baseline software data dictionary | Software risks to safety and mission success are understood and credibly assessed. |
A software architecture exists | Baseline software interface control documents and software interface design description | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Data flow diagrams | Updated software architecture | System Hazard Analyses address or include all known software hazards. |
Software data dictionary content | Software development and testing tools, computers, facilities for implementation | Software Bidirectional traceability is complete and accurate between the detailed design and the software requirements. |
Software design | Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results | Software data dictionary fields are correct. More than 95% of the Data dictionary data definitions are complete. |
Procedures for deliverable software products | Software metrics and measurement data | Sufficient use of static code analysis tools on the project |
Regression software testing plans are available. | Software process audit results | The software operational concept has matured, is at a CDR level of detail, and has been considered in test planning. |
Resource utilization estimates and margins | Baseline software acceptance criteria and conditions | The use of secure coding practices is planned. |
Results of a preliminary Software Assurance test analysis | Resource utilization estimates and margins | The program/project software cost and schedule estimates are credible and within program/project constraints. Software schedule has margin and reasonable timeframes for software development and testing. |
RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete | Planned software development tools, computers, and facilities | Problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control.) |
Identify the software assurance, software safety, and IV&V personnel for the project and milestone review. | Software Assurance schedule | Software products (requirements, data dictionary, and design) are approved, baselined, and placed under configuration management. |
Software classifications and safety criticality determination have been updated as necessary | Software Assurance software design analysis results | IV&V concurs that the project’s software requirements, design, and test plans are sufficiently mature to proceed to Code implementation. (If IV&V is required) |
Software configuration management processes and tools are in place | Software assurance software requirements analysis results |
Confirm supplier documentation exists and review it as appropriate | Software classifications and safety criticality determinations | |
Software development and test facilities are planned and will be in place when needed. | Software configuration management processes | |
Software development tools, computers, and facilities for implementation have been identified | Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified |
Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results | Software development and test facilities | |
Software metrics and measurements exist. | Software Trade studies | |
Software schedule status | Software test planning | |
Status of change requests | Status of software change requests | |
Technical resource utilization estimates and margins | Supplier documentation | |
Update safety criticality determinations and software classifications, if necessary. | The safety-related analysis may include Fault Tree Analysis (FTA) or Preliminary Failure Modes and Effects Analysis. | |
Lessons learned captured from the software areas of the project | Traceability between software requirements and hazards | |
Bidirectional traceability exists between system hazards and software requirements. | IV&V Software requirements analysis results | |
Completed IV&V software design analysis | Software regression testing plans | |
Completed peer review(s) of the software design | IV&V software design analysis results | |
Cost estimate for the project’s Software Assurance support | ||
Updated operational concepts | ||
Prototype software, if applicable | ||
Procedures for deliverable software products |
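One CDR success criterion above is that more than 95% of the data dictionary definitions are complete. A simple completeness metric counts the entries whose required fields are all populated. The sketch below is illustrative only; the entry structure and the set of required fields are assumptions for the example, not a mandated data dictionary schema.

```python
# Illustrative completeness check against the "more than 95% of data dictionary
# definitions are complete" guideline. The field names below are assumptions.
REQUIRED_FIELDS = ("name", "type", "units", "range", "description")

def completeness_percent(entries):
    if not entries:
        return 0.0
    complete = sum(
        1 for entry in entries
        if all(str(entry.get(field, "")).strip() for field in REQUIRED_FIELDS)
    )
    return 100.0 * complete / len(entries)

if __name__ == "__main__":
    data_dictionary = [  # hypothetical entries
        {"name": "bus_voltage", "type": "float32", "units": "V",
         "range": "0..36", "description": "Main bus voltage telemetry"},
        {"name": "mode_flag", "type": "uint8", "units": "n/a",
         "range": "0..4", "description": ""},  # incomplete: empty description
    ]
    pct = completeness_percent(data_dictionary)
    print(f"Data dictionary completeness: {pct:.1f}% (guideline: more than 95%)")
```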
Additional Supporting Material
Production Readiness Review (PRR)
The PRR is held for projects developing or acquiring multiple similar or identical flight and/or ground support systems. The purpose of the PRR is to determine the readiness of the system developer(s) to efficiently produce (build, integrate, test, and launch) the required number of systems. The PRR also evaluates how well the production plans address the system's operational support requirements. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Bidirectional traceability exists between the software requirements and the software design modules. | Baseline software detailed design description | The software's detailed design is complete and meets the software requirements with adequate margins. |
Software requirements have been baselined at the software level | Avionics architecture | Interface control documents, software definitions, and requirements are sufficiently mature to proceed with software coding, software integration, and software testing, and plans are in place to manage any open items. |
Known software hazards have been identified. | Baselined bidirectional traceability | The software baseline requirements and adequate documentation exist to allow proceeding with software implementation and software testing. |
Completed Software Assurance software design analysis | Updated baselined software requirements | The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software. |
Completed Software Assurance software safety analysis | Baseline software data dictionary | Software risks to safety and mission success are understood and credibly assessed. |
A software architecture exists | Baseline software interface control documents and software interface design description | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Data flow diagrams | Updated software architecture | System Hazard Analyses address or include all known software hazards. |
Software data dictionary content | Software development and testing tools, computers, facilities for implementation | Software Bidirectional traceability is complete and accurate between the detailed design and the software requirements. |
Software design | Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results | Software data dictionary fields are correct. More than 95% of the Data dictionary data definitions are complete. |
Procedures for deliverable software products | Software metrics and measurement data | Sufficient use of static code analysis tools on the project |
Regression software testing plans are available. | Software process audit results | The software operational concept has matured, is at a CDR level of detail, and has been considered in test planning. |
Resource utilization estimates and margins | Baseline software acceptance criteria and conditions | The use of secure coding practices is planned. |
Results of a preliminary Software Assurance test analysis | Resource utilization estimates and margins | The program/project software cost and schedule estimates are credible and within program/project constraints. Software schedule has margin and reasonable timeframes for software development and testing. |
RFAs and RIDs from the previous review have been resolved, and any resulting actions are complete | Planned software development tools, computers, and facilities | Problem reporting process is acceptable (e.g., including problem tracking, causal analysis, problem impact analysis, notification, monitoring, and resolution through change control.) |
Identify the software assurance, software safety, and IV&V personnel for the project and milestone review. | Software Assurance schedule | Software products (requirements, data dictionary, and design) are approved, baselined, and placed under configuration management. |
Software classifications and safety criticality determination have been updated as necessary | Software Assurance software design analysis results | IV&V concurs that the project’s software requirements, design, and test plans are sufficiently mature to proceed to Code implementation. (If IV&V is required) |
Software configuration management processes and tools are in place | Software assurance software requirements analysis results |
Confirm supplier documentation exists and review it as appropriate | Software classifications and safety criticality determinations | |
Software development and test facilities are planned and will be in place when needed. | Software configuration management processes | |
Software development tools, computers, and facilities for implementation have been identified | Software hazard analysis includes the list of all software safety-critical components that the system hazard analysis has identified |
Software Fault Tree Analysis (FTA) or Preliminary Software Failure Modes and Effects Analysis results | Software development and test facilities | |
Software metrics and measurements exist. | Software Trade studies | |
Software schedule status | Software test planning | |
Status of change requests | Status of software change requests | |
Technical resource utilization estimates and margins | Supplier documentation | |
Update safety criticality determinations and software classifications, if necessary. | The safety-related analysis may include Fault Tree Analysis (FTA) or Preliminary Failure Modes and Effects Analysis. | |
Lessons learned captured from the software areas of the project | Traceability between software requirements and hazards | |
Bidirectional traceability exists between system hazards and software requirements. | IV&V Software requirements analysis results | |
Completed IV&V software design analysis | Software regression testing plans | |
Completed peer review(s) of the software design | IV&V software design analysis results | |
Cost estimate for the project’s Software Assurance support | ||
Updated operational concepts | ||
Prototype software, if applicable | ||
Procedures for deliverable software products |
Additional Supporting Material
System Integration Review (SIR)
The SIR evaluates the readiness of the project to start flight system assembly, test, and launch operations. V&V Planning, integration plans, and test plans are reviewed. Test articles (hardware/software), test facilities, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Confirm software integration plans and procedures are in place and approved | Software Integration plans and procedures | High confidence exists that the software will support system integration and system integration testing. |
Confirm all known software discrepancies have been identified and disposed of per the agreed-upon plan. | Software Unit Test Results | Previous software unit testing and software verification test results form a satisfactory basis for proceeding to integration. |
Confirm that the software assurance/control organization is prepared to support the integration | Software builds plan | Adequate software resources and hardware for development and testing have been planned and budgeted, including support system implementation and system testing. Software integration support staff are qualified. |
Software test results | Software test plans and results | Program/project leadership identifies and accepts software risks as required. |
Confirm segments and components are ready for integration; all facilities are ready and available | Software test preparation (facilities, tools, equipment, personnel) | The software integration procedures and workflow have been clearly defined and documented or are on schedule to be clearly defined and documented before their need date. |
Confirm support personnel are adequately trained. | Software Static Analysis tool results | Software maintainability considerations have been incorporated to ensure the ease and efficiency of software changes. |
Confirm that all previous design review success criteria and critical issues have been satisfied per the agreed-upon plan. | Software code coverage data | Required software facilities and tools are sufficient for system integration and testing. |
Confirm that all electrical and mechanical interfaces have completed qualification testing before integration. | Software cyclomatic complexity data | Software system integration processes, facilities, and methods are within acceptable risk from threats and cybersecurity vulnerabilities. |
Review any safety or handling issues and address any possible risks during testing. | Software metrics and measurement data | Adequate technical and programmatic margins (e.g., data throughput, memory) and resources exist to complete the software development within budget, schedule, and known risks. |
Assess all handling and safety requirements; identify any issues or risks | Software process audit results | Risks to safety and mission success are understood and credibly assessed and plans and resources exist to effectively manage them. |
Confirm that all software components and data are under configuration control. | Software data dictionary | Software data dictionary fields are complete and correct. (Guideline- More than 95% of the Data dictionary data definitions are complete) |
Submit RFAs/RIDs for any identified risks or issues | System integration testing includes testing for all known software hazards. |
Handling and safety requirements | System Hazard Analyses address or include all known software hazards. |
A functional, unit-level, subsystem, and qualification test results/proof of completion | Software products (code and data) are approved, baselined, and placed under configuration management. | |
Interface control documentation | IV&V concurs that the project’s software requirements, design, and testing are sufficiently mature to proceed to system integration. (If IV&V is required) |
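The items reviewed above include software cyclomatic complexity data, and later reviews (TRR, SAR) expect safety-critical components to have a cyclomatic complexity of 15 or lower. The rough sketch below only illustrates what that data measures by counting decision points in C-like source; real projects should collect this evidence with a dedicated static analysis tool rather than a script like this.

```python
# Very rough illustration of cyclomatic complexity: count decision points in
# C-like source and add one for the entry path. Ignores comments, strings,
# and macros; use a dedicated analysis tool for real review evidence.
import re

BRANCH_TOKENS = re.compile(r"\b(?:if|for|while|case|catch)\b|&&|\|\||\?")

def approx_cyclomatic_complexity(source_text):
    return len(BRANCH_TOKENS.findall(source_text)) + 1

if __name__ == "__main__":
    sample_function = """
    int clamp(int x, int lo, int hi) {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }
    """
    complexity = approx_cyclomatic_complexity(sample_function)
    print(f"Approximate cyclomatic complexity: {complexity} "
          f"(guideline for safety-critical components: 15 or lower)")
```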
Additional Supporting Material
Test Readiness Review (TRR)
The TRR ensures that the test article (hardware/software), test facility, support personnel, and test procedures are ready for testing and data acquisition, reduction, and control. (NPR 7123.1 041)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
All previous design review success criteria and critical issues were satisfied per the agreed-upon plan. | Completed evaluations of the software unit, verification, and integration test plans, procedures, and results | The software test plans, test procedures, test cases, test environment (including simulations), and test design at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent for verification and validation of the source code and system functions allocated to the software. |
Completed code reviews and walkthroughs, test plan and test procedures, peer reviews | Software and Technical Metric Data and Reports | All software-related TBD and TBR items have been resolved. |
Confirm adherence to a secure coding standard | Bidirectional traceability matrix | Software risks to safety and mission success are understood and credibly assessed. |
Configuration management of test artifacts in place | Code Analysis and Assessment Results | The software test data has been reviewed and analyzed for expected results and the results are consistent with the test plans and objectives. |
Software build created from CM and ready for testing. | Software Requirements and software design information | The software test team has been trained and has demonstrated knowledge of the software test processes, configuration control processes, and defect reporting processes. |
Confirm all known discrepancies are identified and addressed in an agreed-upon manner. | Results for all software testing completed to date | System Hazard Analyses address or include all known software hazards. |
Confirm audits or assessments of documentation and processes have been performed throughout the implementation. | Software build procedures for testing | All software static code analysis has been completed, and appropriate findings have been corrected. |
Confirm end-to-end functional flows and database linkages will be tested | Software data dictionary | The software requirements are baselined, and adequate documentation exists to allow proceeding with software implementation and software testing. |
Confirm processes and procedures are in place for testing | Software change requests | Software data dictionary fields are correct and complete and have been tested. |
Version Description Document | Software Test Plans | Certifiable software development practices by the organizations developing the critical software components exist and are being followed. |
Software Test Analysis | Status of known software or system discrepancies | Software products under test and used for testing are approved, baselined, and placed under configuration management. |
Confirm any waivers or deviations are approved | Test contingency planning | An acceptable approach for independent software testing is planned. |
Confirm all plans have been updated as needed and are ready for testing. | Software Test Analysis results | Software Bidirectional traceability between the requirements and test procedures is complete and accurate. |
Confirm any necessary verification of computations has been completed | Software defect or problem reports | All identified safety-critical software components have a cyclomatic complexity value of 15 or lower. |
Confirm findings from all previous analyses have been resolved as agreed-upon | Software test facilities definition and fidelity | 100 percent code test coverage has been achieved for all identified safety-critical software components using the Modified Condition/Decision Coverage (MC/DC) criterion. |
Applicable functional, unit-level, subsystem, system, and qualification testing was conducted successfully. | Software Static Analysis tool results | The software Version Description Document is complete and accurate. |
Confirm resources needed and facility are prepared for testing | Software test cases, scenarios, databases, procedures, environment, expected results, and configuration of test item(s) | Software structural quality is evident in the software products. |
Confirm that the software build is ready for testing and that previous testing has been completed successfully. | Resources (people, facilities, tools, etc.) | IV&V concurs that the project’s software requirements, design, and testing are sufficiently mature to proceed to software testing and integration. (If IV&V is required) |
Confirm that all baselined documents from previous reviews are available | Status of quality assurance (QA) activities | |
Confirm TRR-specific materials are available (e.g., test plans, test procedures, test cases, version descriptions document, test schedule, test status, test data) | Submit any RFAs/RIDs on identified issues or risks | |
Confirm that all requirements have a corresponding test(s) in the test procedures document, including changes due to discrepancy reports, and that all requirements are up to date in the traceability matrix | Supplier Software VDD(s) |
Databases for integration and testing have been created and validated | Test network | |
Expected results defined | Test Preparation | |
Have an agreed-upon plan for SA review of testing (witnessing, sampling, etc.) | Test schedule | |
Have assessed bidirectional traceability from requirements to test procedures/cases | VDD and VDD audit results | |
Informal dry run completed without errors. | Severity 1 or 2 findings | |
Known problems, issues | Baseline documentation from previous reviews | |
Lessons learned captured from the software areas of the project | Current risks, issues, or requests for action (RFAs) | |
Objectives of tests | Risk analysis, list, and management plan | |
Outstanding software change requests (SCRs) ready for review | Interfaces definitions | |
Procedures in place for use for testing: capturing test results, reporting discrepancies | Requirements Analysis and Traceability Reports | |
Review results of testing completed to date, any outstanding software change requests | Software cost estimate and expenditures report | |
Safety-critical and security considerations have been identified. | Operations and user manuals | |
Successful functional configuration audit (FCA) of the version description document (VDD) (such as FSW), including fixes | ||
Test network showing interdependencies among test events and planned time deviations for these activities prepared | ||
Test Plans, Test Procedures, Test Cases with applicable data, Test Scenarios, Test Environment | ||
Tests reusable for regression testing exist. | ||
Validation of operations and user manuals completed. | ||
Verification of computations using nominal and stress data | ||
Code inspection results | ||
Verification of end-to-end functional flows and database linkages | ||
Verification of performance throughout the anticipated range of operating conditions, including nominal, abnormal, failure, and degraded mode situations | ||
Completed functional configuration audit of the VDD | ||
Confirm any problems/issues have been documented. |
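The TRR success criteria above include 100 percent Modified Condition/Decision Coverage (MC/DC) for safety-critical components. MC/DC requires that each condition in a decision be shown to independently affect the decision's outcome. The sketch below illustrates the criterion on a single hypothetical three-condition decision; it is a conceptual example only, and flight projects collect MC/DC evidence with qualified coverage tools rather than ad hoc scripts.

```python
# Illustrative MC/DC check for one decision: each condition must be shown, by a
# pair of test vectors differing only in that condition, to change the outcome.
# The decision function and test vectors are hypothetical examples.
from itertools import combinations

def decision(a, b, c):
    return a and (b or c)  # example decision with three conditions

test_vectors = [  # candidate (a, b, c) test inputs
    (True,  True,  False),
    (True,  False, False),
    (False, True,  False),
    (True,  False, True),
]

def satisfies_mcdc(decision_fn, vectors, n_conditions):
    for i in range(n_conditions):
        independently_shown = any(
            v1[i] != v2[i]
            and all(v1[j] == v2[j] for j in range(n_conditions) if j != i)
            and decision_fn(*v1) != decision_fn(*v2)
            for v1, v2 in combinations(vectors, 2)
        )
        if not independently_shown:
            return False
    return True

print("MC/DC satisfied for this decision:", satisfies_mcdc(decision, test_vectors, 3))
```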
Additional Supporting Material
System Acceptance Review (SAR)
The SAR verifies the completeness of the specific end item with respect to the expected maturity level and assesses compliance with stakeholder expectations. The SAR examines the system, its end items and documentation, and test data and analyses that support verification. It also ensures that the system has sufficient technical maturity to authorize its shipment to the designated operational facility or launch site. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Completed software test results | Open items and waivers/deviations | The software is deemed acceptably safe for flight/mission operations. |
Software version description document | Open software defect or problem reports | The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes. |
Software Maintenance Plan | Baselined/Updated Software Maintenance Plan | Software testing is complete, and all known software defects have been dispositioned. |
Software user's manual | Software Test Results | System Hazard Analyses address or include all known software hazards. |
Software Data Loads | Software version description document | All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program. |
Confirm that the entrance criterion for review has been met | Software Audit Results | Software data dictionary fields are correct, data dictionary content is correct, and has been tested. |
Confirm that software assurance has signed off on the certification package | Software Quality Data | The software version description document, user manual, and maintenance plan are completed and ready to use. |
Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems | Baseline software user's manual | Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered software code. Software code coverage meets the project criteria (100% code coverage for safety-critical code). |
Confirm that the system and supporting elements are correctly configured and ready to support flight | Software Metric Data | Data or evidence that shows good software processes exist and were followed by the software development organizations. |
Lessons learned captured from the software areas of the project | Tests, demonstrations, analyses, audits | Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software. |
Status of failures and anomalies from previously completed flights and reviews | Software structural and functional quality is evident in the software products. | |
System configuration | Software products (code and data) are approved, baselined, and placed under configuration management. | |
System and support elements configuration confirmation | All identified safety-critical software components have a cyclomatic complexity value of 15 or lower. | |
Status of interface compatibility and functionality | IV&V concurs that the project’s software is sufficiently mature to proceed to operation. (If IV&V is required) | |
System state | ||
Review materials prepared for review; attend review |
Additional Supporting Material
Operational Readiness Review (ORR)
The ORR examines the actual system characteristics and the procedures used in the system or product's operation and ensures that all system and support (flight and ground) hardware, software, personnel, and procedures are ready for operations and that user documentation accurately reflects the deployed state of the system. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Completed software test results | Open items and waivers/deviations | The software is deemed acceptably safe for flight/mission operations. |
Software version description document | Open software defect or problem reports | The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes. |
Software Maintenance Plan | Baselined/Updated Software Maintenance Plan | Software testing is complete, and all known software defects have been dispositioned. |
Software user's manual | Software Test Results | System Hazard Analyses address or include all known software hazards. |
Software Data Loads | Software version description document | All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program. |
Confirm that the entrance criterion for review has been met | Software Audit Results | Software data dictionary fields are correct, data dictionary content is correct, and has been tested. |
Confirm that software assurance has signed off on the certification package | Software Quality Data | The software version description document, user manual, and maintenance plan are completed and ready to use. |
Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems | Baseline software user's manual | Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered software code. Software code coverage meets the project criteria (100% code coverage for safety-critical code). |
Confirm that the system and supporting elements are correctly configured and ready to support flight | Software Metric Data | Data or evidence that shows good software processes exist and were followed by the software development organizations. |
Lessons learned captured from the software areas of the project | Tests, demonstrations, analyses, audits | Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software. |
Status of failures and anomalies from previously completed flights and reviews | Software structural and functional quality is evident in the software products. | |
System configuration | Software products (code and data) are approved, baselined, and placed under configuration management. | |
System and support elements configuration confirmation | All identified safety-critical software components have a cyclomatic complexity value of 15 or lower. | |
Status of interface compatibility and functionality | IV&V concurs that the project’s software is sufficiently mature to proceed to operation. (If IV&V is required) | |
System state | ||
Review materials prepared for review; attend review |
Additional Supporting Material
Flight Readiness Review (FRR)
The FRR examines tests, demonstrations, analyses, and audits that determine the system's readiness for a safe and successful flight/launch and subsequent flight operations. It also ensures that all flight and ground hardware, software, personnel, and procedures are operationally ready. (NPR 7120.5 082)
Entrance Criteria | Items Reviewed | Exit / Success Criteria |
---|---|---|
Completed software test results | Open items and waivers/deviations | The software is deemed acceptably safe for flight/mission operations. |
Software version description document | Open software defect or problem reports | The software team has been trained and demonstrated knowledge of the software operational, configuration control, and defect reporting processes. |
Software Maintenance Plan | Software Maintenance Plan | Software testing is complete, and all known software defects have been dispositioned. |
Software user's manual | Software Test Results | All known software hazards have been identified, the hazards are sufficiently controlled, and all controls are comprehensively verified. |
Software Data Loads | Software version description document | All software operational risks and known defects are understood and credibly assessed and plans and resources exist to manage them effectively or have been accepted by the program. |
Confirm that the entrance criterion for review has been met | Software Audit Results | Software data dictionary fields are correct, data dictionary content is correct, and has been tested. |
Confirm that software assurance has signed off on the certification package | Software Quality Data | The software version description document, user manual, and maintenance plan are completed and ready to use. |
Confirm that flight failures and anomalies from previous flights have been resolved and incorporated into all supporting and enabling systems | Software user's manual | Acceptable data exist showing software code test coverage, including an analysis showing covered and uncovered software code. Software code coverage meets the project criteria (100% code coverage for safety-critical code). |
Confirm that the system and supporting elements are correctly configured and ready to support flight | Software Metric Data | Data or evidence shows that good software processes exist and are followed by software development organizations. |
Lessons learned captured from the software areas of the project | Tests, demonstrations, analyses, audits | Adequate technical and programmatic margins (e.g., data throughput, memory, CPU utilization, worst-case execution time, resource limitations) exist, means used for measuring each characteristic exist, and resources exist to operate the software. |
Status of failures and anomalies from previously completed flights and reviews | Software structural and functional quality is evident in the software products. | |
System configuration | Software products (code and data) are approved, baselined, and placed under configuration management. | |
System and support elements configuration confirmation | IV&V concurs that the project’s software is sufficiently mature to proceed to flight and operation. (If IV&V is required) | |
Status of interface compatibility and functionality | ||
System state | ||
Review materials prepared for review; attend review | ||
Submit RIDs/RFAs on identified issues or risks. |