SWE-131 - Independent Verification and Validation Project Execution Plan

1. Requirements

3.6.3 If software IV&V is performed on a project, the project manager shall ensure an IPEP is developed, negotiated, approved, maintained, and executed. 

1.1 Notes

The scope of IV&V services is determined by the IV&V provider, documented in the IPEP, and approved by the NASA IV&V Program. The IPEP is developed by the IV&V provider and serves as the operational document that will be shared with the project receiving IV&V support.  

1.2 History

SWE-131 - Last used in rev NPR 7150.2D

Rev A

2.2.1.3 If a project is selected for software Independent Verification and Validation (IV&V) by the NASA Chief, Safety and Mission Assurance, the NASA IV&V program shall develop an IV&V Project Execution Plan (IPEP).

Difference between A and B: Removes the requirement for the OSMA Chief selection of projects for IV&V; otherwise, no change.

Rev B

3.6.3 If software IV&V is performed on a project, the project manager shall ensure that an IV&V Project Execution Plan (IPEP) is developed.

Difference between B and C: Added to the requirement that the IPEP is "negotiated, approved, maintained, and executed."

Rev C

3.6.3 If software IV&V is performed on a project, the project manager shall ensure an IPEP is developed, negotiated, approved, maintained, and executed.

Difference between C and D: Clarified requirement and referenced the NASA standard for software safety and assurance.

Rev D

3.6.3 If software IV&V is required for a project, the project manager, in consultation with NASA IV&V, shall ensure an IPEP is developed, approved, maintained, and executed in accordance with IV&V requirements in NASA-STD-8739.8.



1.3 Applicability Across Classes

This requirement applies to all NASA projects that have a software Independent Verification & Validation (IV&V) requirement. 


2. Rationale

The rationale for IV&V on a project is to reduce the risk of failures due to software. IV&V Project Execution Plans (IPEPs) describe the activities and processes that will be carried out and the products that will be produced to fulfill the project’s IV&V requirements for the software. This plan is created to guide the work and to increase the likelihood of meeting project objectives and goals.

3. Guidance

The purpose of the Independent Verification and Validation (IV&V) Project Execution Plan (IPEP) is two-fold. First, it communicates IV&V interactions, interfaces, roles and responsibilities, technical products, and reporting methods to the project.  Second, the IPEP serves as the operational document for the IV&V effort.  The IPEP provides a mechanism for the IV&V provider, including the IV&V Program, to coordinate with a project to identify and describe the IV&V activities that apply. Specifically, it includes the basic tenets for an agreement between the IV&V team and the project, including the roles and responsibilities, communications paths, and artifacts anticipated to be shared between the organizations.

The NASA IPEP template states that the “IPEP is prepared and maintained by the IV&V Project Manager (PM). The IV&V PM coordinates the creation and maintenance of this document with affected individuals and organizations (within the NASA IV&V Program as well as with the Project).”

IV&V is a technical discipline of software assurance that employs rigorous analysis and testing methodologies to identify objective evidence and conclusions, providing a second, independent assessment of critical products and processes throughout the life cycle. This secondary evaluation of products and processes throughout the life cycle helps demonstrate whether the software is fit for nominal operations (required functionality, safety, dependability, etc.) and for off-nominal conditions (response to faults, responses to hazardous conditions, etc.). The goal of the IV&V effort is to contribute assurance conclusions to the project and stakeholders based on evidence found in software development artifacts and on risks associated with the intended behaviors of the software.

Because the Independent Verification and Validation (IV&V) provider is responsible for developing the IV&V Project Execution Plan (IPEP), this guidance will address coordination efforts needed from projects receiving IV&V to support the development and execution of the IPEP. 

Note that the IPEP is reviewed when initially developed, not at any particular life-cycle milestone review.  The milestone review at which the IPEP may be reviewed depends on which milestone the mission is approaching when the IV&V provider gets involved with the mission.  Per the IV&V Program, IPEPs are reviewed every six months for applicability, or when there are significant changes on the mission (e.g., a milestone slips out several months) or a significant change to IV&V (e.g., a budget cut that forces an IV&V provider to reduce its work on a mission).

Software Provider Responsibilities

The responsibilities below and the means for carrying them out are documented as part of the IPEP (or other appropriate documents, in the case of contracted software efforts).

Per NASA-STD-8739.8:

  • “When IV&V has been selected for a project, the provider shall coordinate with IV&V personnel to share data and information.”
  • “When the IV&V function is required, the provider shall provide all required information to NASA IV&V Facility personnel. (This requirement includes specifying on the contracts and subcontracts, IV&V’s access to system and software products and personnel.)” This statement applies to the IV&V provider, regardless of whether that provider is the IV&V Program or another organization.
  • “The NASA IV&V Facility shall initially conduct a planning and scoping exercise to determine the specific software components to be analyzed and the tasks to be performed. The IV&V approach will be documented in an IV&V plan.” This statement is also a recommended practice for IV&V providers other than the IV&V Program.
  • “The IV&V team shall provide input to the appropriate software assurance personnel, as well as provide feedback to the project manager as agreed in the IV&V Plan.” (This helps to ensure that the IV&V team and the project’s software assurance team maintain an active and productive relationship.  Specific details of these interactions and the information to be exchanged for each project may be documented in the IPEP.)

Per NASA-STD-8739.8,  the Center SMA organizations are assumed to carry out the following:

  • “Assure that if IV&V is required on a program, project, or facility, project risk, and software criticality determinations are shared between the safety personnel and IV&V.”
  • “If this project is a candidate for IV&V, a software plan shall address, either specifically or by reference to the IV&V Memorandum of Agreement, the role of IV&V for the safety-critical software and detail how IV&V will work with the software safety program and personnel.”

NASA-GB-8719.13, NASA Software Safety Guidebook, states that for contractor-developed software, “At a minimum, management, software development, and software assurance will need to work with the IV&V or IA personnel, providing information, answers, and understanding of the software system and the development process.

“Depending on the level of IV&V levied on the project, some independent tests may be performed. These may require some contractor resources to implement.”

IV&V Responsibilities

The activities to be carried out by the IV&V provider are coordinated with the appropriate stakeholders, including the project manager. The IV&V provider captures the following types of information in the IV&V Plan:

  • Goals and objectives.
  • The high-level description of the IV&V approach.
  • IV&V activities to be performed for the project.
  • IV&V deliverables.
  • Milestones.
  • IV&V schedule coordinated with project management reviews and technical reviews.
  • The interface among IV&V, Mission Project office, and software developer.
  • Data exchange requirements.
  • IV&V access to appropriate artifact management systems, issues, and risk tracking systems, etc.
  • Need for a heritage review to identify relevant products and artifacts from previous projects.
  • Heritage review results.
  • Preliminary list of processes and products to be evaluated.
  • IV&V access rights to proprietary and classified information; related process to obtain those rights.

It is recommended that the scoping of the IV&V effort be risk-based, given that IV&V of the full software subsystem is not likely to be feasible within financial, schedule, or resource constraints.

The IV&V provider is expected to capture detailed fiscal year activity information in the appendices, since activities, risks and risk assessment results, schedules, IV&V coverage, and required resources may change over time based on project life-cycle progress and other factors.

The following actions, summarized from NASA-GB-8719.13 and other sources, may be performed (this is not an all-inclusive list):

  • IV&V – IV&V analysts conduct in-depth analyses and verifications.
  • Attend project reviews – summarize IV&V findings for software products.
  • Integrated addition to software QA – not a replacement.
  • May perform specific safety analyses – not a replacement for the safety role.
  • May perform some software engineering functions – e.g., systems engineering trade studies, such as studies used to determine if software architectures can satisfy the performance needs of the system.
  • Verify and validate concept documentation (the description of a specific implementation solution).
  • Validate requirements and testing.
  • Verify software architectures and designs.
  • Verify requirements and design.
  • Perform independent system testing.
  • Ensure consistency from requirements to testing.
  • Verify and validate test documentation.
  • Verify and validate the implementation.
  • Verify and validate operations and maintenance content.
  • Capture and report findings – tracking them to closure (a minimal record sketch follows this list).
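
The last action above, capturing findings and tracking them to closure, implies a closed-loop record for each finding (see also the related lesson learned in Section 6.2). The Python fragment below is a minimal sketch of such a record; the class, field names, and status values are illustrative assumptions only and do not represent any NASA or IV&V tracking tool.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical closed-loop finding record. Field names and status values are
# illustrative assumptions, not taken from any NASA or IV&V tracking system.
@dataclass
class Finding:
    finding_id: str                     # e.g., "IVV-042"
    description: str
    artifact: str                       # product the finding was written against
    severity: str                       # e.g., "Major" or "Minor"
    status: str = "Open"                # "Open" -> "In Work" -> "Closed"
    opened: date = field(default_factory=date.today)
    closed: Optional[date] = None
    disposition: str = ""               # rationale recorded at closure

    def close(self, disposition: str) -> None:
        """Record the closure rationale and date, completing the loop."""
        self.disposition = disposition
        self.status = "Closed"
        self.closed = date.today()

def open_findings(findings: list[Finding]) -> list[Finding]:
    """Findings that still lack a recorded disposition and closure."""
    return [f for f in findings if f.status != "Closed"]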

Table of Contents for a Typical IPEP

1          Introduction
1.1       Document Organization
1.2       Document Purpose
1.3       Intended Audience

 2         IV&V Overview
2.1       IV&V Goals and Objectives
2.2       IV&V Approach
2.2.1    Verification and Validation
2.3       IV&V Focus

3          Roles, Responsibilities, and Interfaces
3.1       IV&V Program
3.1.1    Research Support
3.1.2    IV&V Metrics Support
3.2       IV&V Team
3.3       Project Personnel

4          IV&V Products and Communication/Reporting Methods
4.1       IV&V Products
4.1.1    Analysis Reports
4.1.2    Lifecycle Review Presentations
4.1.3    Issues
4.1.4    Risks
4.1.5    Item Tracking/Monitoring and Escalation
4.2       IV&V Communication and Reporting Methods
4.2.1    Lifecycle Review Presentations
4.2.2    Agency/Mission Directorate/Center Management Briefings
4.2.3    Routine Tag-ups

Appendix A:  IV&V Portfolio Based Risk Assessment (PBRA)/Risk Based Assessment (RBA) Results
Appendix B:  IV&V Focus
Appendix D:  Technical Scope & Rigor (TS&R)
Appendix E:  Reference Documentation  
Appendix F:  Acronyms
Appendix G:  Fiscal Year <XX> IV&V Efforts

See https://www.nasa.gov/centers/ivv/ims/templates/index.html for a template for an IPEP. 

Additional guidance related to IV&V may be found in the following related requirement in this Handbook:

  • SWE-141 - Software Independent Verification and Validation

4. Small Projects

All projects that have mission- or safety-critical software and receive IV&V services are treated the same.  The same process is used for developing an IPEP, regardless of project size, and IPEPs all have the same basic elements (objectives, schedules, interfaces, etc.).  The actual content of each IPEP is, of course, different for each project.

5. Resources

5.1 References

5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lesson learned related to the IV&V plan:

  • Independent Verification and Validation of Embedded Software (Use of IV&V Procedures). Lesson Number 723:  "The use of Independent Verification and Validation (IV&V) processes ensures that computer software is developed following original specifications, that the software performs the functions satisfactorily in the operational mission environment for which it was designed, and that it does not perform unintended functions. Identification and correction of errors early in the development cycle are less costly than identification and correction of errors in later phases, and the quality and reliability of software are significantly improved."

6.2 Other Lessons Learned

  • Software IV&V
    • IV&V approaches and resources must align with the program risk posture.
    • A closed-loop, auditable, corrective action toolset and process must be used to manage all IV&V identified issues and risks.

7. Software Assurance

SWE-131 - Independent Verification and Validation Project Execution Plan
3.6.3 If software IV&V is performed on a project, the project manager shall ensure an IPEP is developed, negotiated, approved, maintained, and executed. 

7.1 Tasking for Software Assurance

  1. Confirm that the IV&V Project Execution Plan (IPEP) exists.

7.2 Software Assurance Products

  • Evidence of confirmation of the IPEP.
  • IV&V planning and risk assessment.
  • IV&V Project Execution Plan (Done by IV&V).


    Objective Evidence

    • IV&V Project Execution Plan (IPEP) for the project

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person; these may be expressed in an audit or checklist record, email, memo, or an entry in a tracking system (e.g., a Risk Log).
    • Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
    • Status report, email, or memo containing statements that the confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or SA-witnessed products or activities.
    • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that "the IV&V Project Execution Plan exists", the summary might be: the IPEP is in draft; it is expected to be complete by (some date).
      • To confirm that "traceability between software requirements and hazards with SW contributions exists", the summary might be: x% of the hazards with software contributions are traced to the requirements (a minimal computation sketch follows this list).
    • The specific products listed in the Introduction of 8.16, as well as the examples listed above, are also objective evidence.
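
To illustrate the hazard-traceability summary above, the short Python sketch below computes the percentage of hazards with software contributions that trace to at least one software requirement. The data structure and identifiers are hypothetical; in practice, the trace data would come from the project’s hazard reports and requirements tracing tool.

# Minimal sketch: percentage of software-related hazards traced to requirements.
# The hazard and requirement identifiers are hypothetical placeholders.
hazard_to_reqs = {
    "HR-001": ["SRS-0042", "SRS-0107"],   # hazard -> software requirements
    "HR-002": [],                          # not yet traced
    "HR-003": ["SRS-0311"],
}

traced = sum(1 for reqs in hazard_to_reqs.values() if reqs)
coverage = 100.0 * traced / len(hazard_to_reqs)
print(f"{coverage:.0f}% of the hazards with software contributions are traced to requirements")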

7.3 Metrics

  • None identified at this time.

7.4 Guidance

1. Confirm whether IV&V is required on the project; see section 4.4 of NASA-STD-8739.8 for the IV&V requirements.

To determine whether IV&V is required on the project, see the requirements in SWE-141 - Software Independent Verification and Validation. If IV&V is required on the project, confirm that the requirements in section 4.4 of NASA-STD-8739.8 are being followed.

The requirements in the IV&V section of NASA-STD-8739.8, section 4.4, apply to all IV&V efforts performed on a software development project; that section also serves as the definition of what NASA considers to be IV&V. IV&V is a risk mitigation activity, and as such, the application of IV&V analysis and the rigor of that analysis are driven by the IV&V Provider’s assessment of software risk.

IV&V is a technical discipline of SA that employs rigorous analysis and testing methodologies to identify objective evidence and conclusions, providing an independent assessment of products and processes throughout the life cycle. The independent assessment of products and processes throughout the life cycle demonstrates whether the software is fit for nominal operations (required functionality, safety, dependability, etc.) and for off-nominal conditions (response to faults, responses to hazardous conditions, etc.). The goal of the IV&V effort is to provide assurance conclusions to the Project based on evidence found in software development artifacts and on risks associated with the intended behaviors of the software.

The IV&V Provider performs two primary activities, often concurrently: verification and validation. Each activity provides a different perspective on the system/software. Verification is the process of evaluating a system and its software to provide objective evidence as to whether a product conforms to the build-to requirements and design specifications. Verification spans the requirements, design, and code, and extends into testing; it seeks to demonstrate that the products of a given development phase satisfy the conditions imposed at the start of or during that phase. Validation tasks, on the other hand, seek to develop objective evidence showing that the content of the engineering artifact is the right content for the developed system/software. The content is accurate and correct if the objective evidence demonstrates that it satisfies the system requirements (e.g., user needs, stakeholder needs), that it fully describes the required capability/functionality, and that it solves the right problem.

Software assurance should assess the IV&V activities being planned and performed against the seven IV&V requirements listed below:

The IV&V Provider shall: (SASS-02)

  1. Conduct an initial planning and risk assessment effort to determine the specific system/software behaviors (including the software components responsible for implementing the behaviors) to be analyzed, the IV&V tasks to be performed, and the rigor to be applied to the tasks.
  2. Develop an IV&V Execution Plan documenting the activities, methods, level of rigor, environments, and criteria to be used in performing verification and validation of in-scope system/software behaviors (including responsible software components) determined by the planning and scoping effort.
  3. Provide analysis results, risks, and assurance statements/data to all the responsible organizations’ Project management, engineering, and assurance personnel.
  4. Participate in Project reviews of software activities by providing status and results of software IV&V activities including, but not limited to, upcoming analysis tasks, artifacts needed from the Project, the results of the current or completed analysis, defects, and risks to stakeholders, customers, and development project personnel.
  5. Participate in planned software peer reviews or software inspections and record peer review measurements guided by the planning and scoping risk analysis performed by the IV&V Provider and the content of the items being reviewed or inspected.
  6. Identify, analyze, plan, track, communicate, and record risks to the software and development project following NPR 8000.4.
  7. Track, record, and communicate defects/issues and other results found during the execution of IV&V analysis during the software development effort to include issues and results found during the conducting of independent IV&V testing.

The IV&V Provider shall verify and validate that the concept documentation represents a specific implementation solution to solve the Acquirer’s problem. (SASS-03)

The IV&V Provider shall verify and validate: (SASS-04)

  1. That the project implements the requirements for the software listed in NPR 7150.2 (SWE-134) and risk-driven assessments determine the types of IV&V analyses.
  2. That the in-scope SW requirements and system requirements are, at a minimum, correct, consistent, complete, accurate, readable, traceable, and testable.
  3. The software usually provides the interface between the user and the system hardware and the interface between system hardware components and other systems. These interfaces are critical to the successful operation and use of the system.
  4. That the mitigations for identified security risks are in the software requirements.

The IV&V Provider shall verify and validate: (SASS-05)

  1. That the relationship between the in-scope system/software requirements and the associated architectural elements is traceable, consistent, and complete.
  2. That the software architecture meets the user’s safety and mission-critical needs as defined in the requirements.
  3. That the detailed design products are traceable, consistent, complete, accurate, and testable.
  4. That the interfaces between the detailed design components and the hardware, users, operators, other software, and external systems are correct, consistent, complete, accurate, and testable.
  5. That the relationship between the software requirements and the associated detailed design components is correct, consistent, and complete.

The IV&V Provider shall verify and validate: (SASS-06)

  1. That the software code products are consistent, complete, accurate, readable, and testable.
  2. That the software code meets the project software coding standard.
  3. That the security risks in the software code are identified and mitigated as necessary.
  4. Analysis to assess the source code for the presence of open-source software.
  5. That software problem reports generated by the IV&V provider have been addressed entirely by the project.
  6. That the project identifies and plans for the security risks in software systems and the security risk mitigations for these systems.
  7. That the project assesses the software systems for possible security vulnerabilities and weaknesses.
  8. That the project implements and tests the required software security risk mitigations to ensure that security objectives for software are satisfied.
  9. The software code uses analysis tools (including but not limited to static, dynamic, and formal analysis tools) as determined by the IV&V risk analysis process.
  10. That the relationship between the software design elements and the associated software units is correct, consistent, and complete.
  11. That the relationship between software code components and corresponding requirements is correct, complete, and consistent.

The IV&V Provider shall: (SASS-07)

  1. Verify and validate that in-scope test plans, design, cases, and procedures at all levels of testing (unit, integration, system, acceptance, etc.) are correct, complete, and consistent to allow for the verified implementation of software code products as well as system/software capabilities/behaviors.
  2. Verify and validate that relationships, between test plans, designs, cases, and procedures and software code products and system/software capabilities/behaviors, are correct, complete, and consistent.
  3. Verify that the test plans, designs, cases, and procedures contain objective acceptance criteria that support the verification of the associated requirements for nominal and off-nominal conditions.
  4. Validate that the test environment (including simulations) is complete, correct, and accurate concerning the intended testing objectives.
  5. Verify that the software code test results meet the associated acceptance criteria to ensure that the software correctly implements the associated requirements.

The IV&V Provider shall assess the software maintenance plan concerning software elements to support the planning of IV&V activities during the maintenance phase. (SASS-08)

Software assurance considers using an audit approach to determine that the IV&V provider performs the IV&V functions defined in the requirements.
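
One lightweight way to support such an audit is a checklist that maps each of the IV&V requirements above (SASS-02 through SASS-08) to its confirmation status and supporting evidence. The Python fragment below is a minimal sketch of that idea; the checklist structure, status flags, and example evidence entries are assumptions for illustration only, not a NASA-defined format.

# Illustrative audit checklist mapping the IV&V requirements above (SASS-02
# through SASS-08) to confirmation status and supporting evidence. The
# structure and example entries are assumptions for this sketch only.
checklist = {
    "SASS-02": {"confirmed": True,  "evidence": "IPEP and planning/scoping memo on file"},
    "SASS-03": {"confirmed": True,  "evidence": "Concept documentation analysis report"},
    "SASS-04": {"confirmed": False, "evidence": ""},   # requirements analysis not yet reviewed
    "SASS-05": {"confirmed": False, "evidence": ""},
    "SASS-06": {"confirmed": False, "evidence": ""},
    "SASS-07": {"confirmed": False, "evidence": ""},
    "SASS-08": {"confirmed": False, "evidence": ""},
}

def audit_gaps(items: dict) -> list[str]:
    """Return the requirement IDs that still lack confirmed evidence."""
    return [req for req, entry in items.items() if not entry["confirmed"]]

print("Open audit items:", ", ".join(audit_gaps(checklist)))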

2. Develop, if IV&V is required on the project, the IV&V planning and risk assessment and an IV&V Execution Plan.

If IV&V is required on the project, obtain the IV&V planning, risk assessment, and IV&V Execution Plan from the IV&V provider.  Review these documents and work with the project and the IV&V provider to confirm that the IV&V plan has been negotiated and approved by the project.  Monitor the IV&V activities to ensure the IV&V plan is executed and maintained.  Maintenance is required if the IV&V interactions, interfaces, roles and responsibilities, technical products, or reporting methods change.  Maintenance is also required if the IV&V activities change for any reason; the IV&V Execution Plan reflects the actual work performed by the IV&V provider and thus is to be updated if those activities change.

Confirm that the IV&V plan contains the items below.

See the IV&V Project Execution Plan (IPEP) Template available from the IV&V site at https://www.nasa.gov/centers/ivv/ims/templates/index.html; it is available in Word and PDF formats.

Guidelines for the content of an IV&V Execution Plan are:

Table of Contents for a Typical IPEP

1          Introduction
1.1       Document Organization
1.2       Document Purpose
1.3       Intended Audience

2          IV&V Overview
2.1       IV&V Goals and Objectives
2.2       IV&V Approach
2.2.1    Verification and Validation
2.3       IV&V Focus

3          Roles, Responsibilities, and Interfaces
3.1       IV&V Program
3.1.1    Research Support
3.1.2    IV&V Metrics Support
3.2       IV&V Team
3.3       Project Personnel

4          IV&V Products and Communication/Reporting Methods
4.1       IV&V Products
4.1.1    Analysis Reports
4.1.2    Lifecycle Review Presentations
4.1.3    Issues
4.1.4    Risks
4.1.5    Item Tracking/Monitoring and Escalation
4.2       IV&V Communication and Reporting Methods
4.2.1    Lifecycle Review Presentations
4.2.2    Agency/Mission Directorate/Center Management Briefings
4.2.3    Routine Tag-ups

Appendix A:  IV&V Portfolio Based Risk Assessment (PBRA)/Risk Based Assessment (RBA) Results
Appendix B:  IV&V Focus
Appendix D:  Technical Scope & Rigor (TS&R)
Appendix E:  Reference Documentation
Appendix F:  Acronyms
Appendix G:  Fiscal Year <XX> IV&V Efforts

See https://www.nasa.gov/centers/ivv/ims/templates/index.html for a template for an IPEP.


