7.03 - Acquisition Guidance

1. Purpose

This topic discusses guidance for projects implementing those requirements in NPR 7150.2, NASA Software Engineering Requirements, that address software acquisition. This guidance is intended for all persons responsible for the software acquisition process, from the planning stages through contract closeout. Acquisition may involve procedures and regulations external to the software community, including variations by contract type; therefore, it is important to consult Center guidance and coordinate acquisition activities among the proper stakeholders, including, but not limited to, software engineering, procurement, finance, and contracts.

1.1 Roles

  • Project Manager: Approve procurement plan.
  • Software Lead Engineer: Prepare procurement plan; prepare statement of work (SOW) software requirements and software data requirements for the contract; monitor execution of the contract; conduct trade studies and engineering analyses.
  • System Engineer: Conduct trade studies and engineering analyses.
  • Contracting Officer (CO): Prepare acquisition approach, prepare solicitation, guide proposal evaluation, prepare contracts, prepare modifications to contracts.
  • Contracting Officer's Technical Representative (COTR): Work with CO to plan acquisition approach, prepare SOW, evaluate proposals, determine the technical adequacy of the proposed approach, monitor technical implementation.
  • Software Technical Authority: Before contract release, verify that the SOW includes the complete flowdown of the Agency and Center software requirements [recommended practice].

2. Planning

Before software acquisition can be carried out, a need must be identified for which a solution is required.  During the planning stage, various options to address the identified need are evaluated. The following are possible options:

  • Acquire Off the Shelf (OTS) product.
  • Develop/perform service in-house.
  • Contract development/service.
  • Use/enhance existing product/service.

If the solution to the need will involve software, NPR 7150.2 applies, and the acquisition planning guidance below supports project success:

  1. Define the scope of the system of interest.
  2. Identify the goals and objectives for the software portion of the system.
  3. Identify key stakeholders.
  4. Perform "make or buy" market research/trade studies to determine if an OTS solution exists (a simple weighted-scoring sketch appears at the end of this section).
  5. Establish criteria (and a plan) for the studies:
    • Technical requirements (functional, operational, performance):
      • NPR 7150.2 classification.
      • Constraints and limitations (cost, schedule, resources).
      • Use past studies, known alternatives, existing "make or buy" criteria.
    • Conduct studies:
      • Assess potential products and technologies.
      • Assess how well technical requirements are addressed.
      • Assess estimated costs, including support.
      • Assess safety criticality.
      • Identify risks (delivery, safety, development practices used by supplier, supplier track record, etc.).
      • Assess provider business stability, past performance, ability to meet maintenance requirements, etc.
      • Assess commercial off-the-shelf/government off-the-shelf/military off-the-shelf products for potential use. (See SWE-027.)
    • Identify in-house capabilities to meet the need:
      • Assess availability of existing products that could meet the need or be modified to meet the need.
      • Assess availability of qualified personnel for development or modification activities.
      • Assess estimated costs (time, personnel, materials, etc.), including support.
        • Use past projects as basis, where appropriate.
      • Identify risks.
    • Determine if solution will be custom made, an existing product, or a modified existing product.
  6. Identify any acquisition risks based on requirements and "make or buy" decisions.
  7. Create at least one government software cost estimate (SWE-015) for this work.
  8. Document analysis:
    • Expected classification of the software to be acquired.
    • Availability of in-house staff and funding resources.
    • Availability of the software product(s).
    • Projected licensing and support costs.
    • List of potential suppliers.
    • Security considerations.
    • Potential risks related to supplier's viability and past performance.
  9. Document solution choice and basis for that choice:
    • Estimate of in-house versus acquisition costs (including OTS solutions and any associated costs for requirements not met by the OTS solution).
    • Comparison of cost estimates to available funding.
    • Risk assessment.
    • Assumptions, observations, rationale, determining factors.
    • Significant issues, impacts of each option.
    • If solution is in-house development/service, an acquisition is no longer required.
    • If solution is to acquire product/service, continue with this guidance as needed based on development under contract or purchase OTS solution.
    • Other planning decisions resulting in best overall value to NASA.
    • Description of chosen acquisition strategy.
  10. Identify "relevant stakeholders" based on requirements and "make or buy" decisions:
    • Those directly concerned with, or affected by, the acquisition decision.
    • May include management, the project team, procurement, customers, end users, and suppliers.
  11. Ensure the acquisition team includes an organization from NASA (the acquirer) with an appropriate (see SWE-032) non-expired CMMI (Capability Maturity Model Integration) rating, as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser.*
  12. Report analysis and resulting decision to appropriate stakeholders.
  13. Document lessons learned for future acquisition activities.
  14. Develop acquisition schedule, including solicitation, supplier selection, supplier monitoring, and product acceptance and transition to operations, as appropriate.
  15. Develop acquisition plan using Center-specific template.

*Class A software acquisition guidance - When a project seeks to acquire a system that includes Class A software, the project's acquisition team is required to have support from personnel in an organization that has been rated at a Capability Maturity Model Integration (CMMI) Level 3 or higher. Evidence that a CMMI-DEV Level 3 rated organization has participated in the acquisition activities could include direct support on the acquisition team and/or review and approval of the acquisition products by the CMMI-DEV rated organization. The extent of the CMMI-DEV Level 3 rated organization's support required for a Class A acquisition should be determined by the Center's Engineering Technical Authority responsible for the project. Identification of the appropriate personnel from an organization that has been rated at CMMI-DEV Level 3 or higher to support the project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center management.

Class B software acquisition guidance - When a project acquires a system that includes Class B software, the project's acquisition team is required to have support from organizations that have been rated at CMMI-DEV Level 2 (via a continuous or staged approach), including the process area Supplier Agreement Management. Evidence that a CMMI-DEV Level 2 rated organization has participated in the acquisition activities could include direct support on the acquisition team and/or review and approval of the acquisition products by the CMMI-DEV rated organization. The Center Engineering Technical Authority responsible for the project determines the extent of the CMMI-DEV Level 2 rated organization's support required for a Class B acquisition. Identification of the appropriate personnel from an organization that has been rated at CMMI-DEV Level 2 or higher to support the project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center management.

Classes A and B general guidance - The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA software engineering requirements are included in an acquisition. For those cases in which a Center or project desires a general exclusion from the NASA software engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester shall submit a waiver for those exclusions or alternate requirements for approval by the NASA Headquarters' Chief Engineer with appropriate justification. (See SWE-120.)
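
To make the "make or buy" trade studies in steps 4 and 5 of the planning list above more concrete, the following is a minimal weighted-scoring sketch in Python. The criteria, weights, and candidate scores are hypothetical placeholders; a real study would draw them from the project's own trade-study plan and cost estimates.

```python
# Minimal "make or buy" scoring sketch. Criteria, weights, and scores are
# hypothetical placeholders for values a project would derive from its own
# trade-study plan and cost estimates.

# Weights should sum to 1.0; each candidate is scored 1 (poor) to 5 (excellent)
# against every criterion.
criteria_weights = {
    "technical_fit": 0.30,       # how well technical requirements are addressed
    "life_cycle_cost": 0.25,     # estimated cost, including support and licensing
    "schedule_risk": 0.20,       # delivery and integration risk
    "supplier_viability": 0.15,  # business stability, past performance
    "safety_criticality": 0.10,  # ability to meet safety-critical requirements
}

candidates = {
    "in_house_development": {"technical_fit": 4, "life_cycle_cost": 3,
                             "schedule_risk": 3, "supplier_viability": 5,
                             "safety_criticality": 4},
    "ots_product": {"technical_fit": 3, "life_cycle_cost": 5,
                    "schedule_risk": 4, "supplier_viability": 3,
                    "safety_criticality": 3},
    "contracted_development": {"technical_fit": 5, "life_cycle_cost": 2,
                               "schedule_risk": 3, "supplier_viability": 4,
                               "safety_criticality": 4},
}

def weighted_score(scores, weights):
    """Return the weighted sum of criterion scores for one candidate."""
    return sum(weights[name] * scores[name] for name in weights)

# Rank candidates from highest to lowest weighted score.
ranked = sorted(candidates.items(),
                key=lambda item: weighted_score(item[1], criteria_weights),
                reverse=True)

for name, scores in ranked:
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")
```

The highest-scoring alternative is only one input to the documented "make or buy" decision; it does not replace the risk assessment, cost comparison, and rationale called for in steps 8 and 9.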

3. Solicitation, Selection, and Award

Once the planning activities for software acquisition have been completed and the decision has been made to acquire the software or software development services, a selection process needs to be followed to choose the best provider for the project. This process typically begins with development of a Statement of Work (SOW).  The following recommendations should be considered as part of this process. Additionally, a SOW checklist reference is included in the Useful Practices tab of this guidance document.

1. Develop solicitation, including Statement of Work (SOW):

  • Acceptance criteria (SWE-034).
  • Solicitation constraints.
  • Proper requirements (SWE-048) from the software development perspective:
    • Software classification (from NPR 7150.2) and safety criticality (from Software Safety Litmus Test).
    • Technical requirements.
    • Development standard to be followed, if any.
    • Development life cycle to be followed or indication that developer can choose appropriate life cycle (SWE-019).
    • Surveillance activities (and acquirer involvement), including monitoring activities, reviews, audits (SWE-045), decision points, meetings, etc. (SWE-039).
    • Management and support requirements (project management, schedule and schedule updates (SWE-046), configuration management, non-conformance and change tracking (SWE-043), risk management, metrics collection (SWE-044), Independent Verification and Validation (IV&V) support, required records, traceability records, electronic records (SWE-047) and code access (SWE-042), V&V, etc.)
    • Requirements for maintenance, support, updates, new versions, training to be included in life-cycle and cost estimates.
    • Concise task and deliverable descriptions, including delivery format (SWE-040).
    • Media format for code deliverables (SWE-040).
    • Templates or Data Item Descriptions (DIDs) for documentation deliverables.
    • Complete set of deliverables with delivery dates, review periods, and acceptance procedures for each.
    • Time period for responses to review findings, including making changes.
    • Data Requirements Documents (DRDs) for deliverables, if appropriate.
    • Government and contractor proprietary, usage, ownership, warranty, data, and licensing rights, including transfer.
    • Requirement to include notice of use of open source software (SWE-041) in developed code.
    • OTS software requirements (SWE-027) (identify which requirements are met by OTS software, provide OTS software documentation, such as usage instructions, etc.)
    • List of all mandatory NASA software development standards and DIDs, as applicable.
    • Requirements for non-expired CMMI rating as measured by a Lead Appraiser certified by the Software Engineering Institute (SEI) (SWE-032). (See the Useful Practices tab of this topic for sample text for the solicitation.)

Note: Acquisition should not simply levy NPR 7150.2 as a whole on a potential supplier, as it contains some NASA institutional requirements. If a project is acquiring software development services for Classes A through H software, the project should only levy the applicable minimal set of supplier requirements, plus additions that address specific risk concerns. Requirements that are the responsibility of the Agency, Center, or Headquarters should not be levied on a contractor, as they will cause confusion and unnecessary expense.

If the class of software and the safety-critical designation are known when the SOW is written, the project can levy, via a compliance matrix, the project-/system-specific set of NPR 7150.2 requirements to be satisfied by the contractor. If the class and/or safety-critical designation are not yet known, the SOW should list applicable requirements for each class and safety-critical designation with instructions to comply accordingly when a class and safety-critical designation are determined.

In the case of requirements marked P (Center), the responsible NASA Center and project supply the applicable elements of these requirements for inclusion in the SOW.

The full list of project-related requirements can be found in the compliance matrices in the Software Engineering Community of Practice Document Repository on the NASA Engineering Network (NEN).
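
As a notional illustration of such a compliance matrix, the sketch below maps a few NPR 7150.2 requirement identifiers to their applicability by software class. The applicability values shown are placeholders only; the authoritative matrices are those in the NEN document repository referenced above.

```python
# Notional compliance-matrix fragment. "X" = applicable, "P" = applicable with
# Center-supplied elements, "" = not applicable. The applicability values here
# are placeholders; the authoritative matrices are in the Software Engineering
# Community of Practice Document Repository on NEN.
compliance_matrix = {
    # requirement id: {software class: applicability}
    "SWE-034": {"A": "X", "B": "X", "C": "X", "D": "X"},
    "SWE-040": {"A": "X", "B": "X", "C": "X", "D": ""},
    "SWE-042": {"A": "X", "B": "X", "C": "P", "D": ""},
}

def requirements_for_class(matrix, software_class):
    """Return the requirement IDs to levy on the contractor for one class,
    with a flag for those needing Center-supplied elements ('P')."""
    levied = {}
    for req_id, by_class in matrix.items():
        applicability = by_class.get(software_class, "")
        if applicability:
            levied[req_id] = applicability
    return levied

# Example: build the set of requirements for a Class C procurement.
print(requirements_for_class(compliance_matrix, "C"))
```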

2. Ensure proper review of SOW before delivery to procurement/contracts official:

  • Technical Authority to ensure proper flowdown of NPR 7150.2 requirements.
  • Coordinate with the Safety and Mission Assurance Office to ensure all quality assurance requirements, clauses, and intended delegations are identified and included.

3. Identify potential suppliers.

4. Distribute solicitation package.

5. Evaluate proposals (typically performed by an evaluation team) based on selection criteria established during the acquisition planning phase, including:

  • Cost estimation comparisons.
  • Evaluation of how well proposed solutions meet the requirements (including interface and technology requirements, NPR 7150.2 requirements).
  • Staff available.
  • Past performance.
  • Software engineering and management capabilities.
  • Prior expertise on similar projects.
  • Available resources (facilities, hardware, software, training, etc.)
  • Capability Maturity Model Integration (CMMI) ratings.
    • Check the SEI Published Appraisal Results (PARs) to confirm a non-expired rating (http://sas.sei.cmu.edu/pars); a simple verification sketch appears at the end of this section.
    • Be sure to check the scope of the organization holding the CMMI rating to confirm that the rating is held by the specific organization submitting the proposal.
  • Other factors relevant to the project.

6. Select supplier/contractor, and document basis for selection.

7. Negotiate, finalize, and document contract:

  • Based on SOW.
  • Management reviews and meetings, such as:
    • Formal reviews, such as those found in NPR 7123.1, NASA Systems Engineering Processes and Requirements, and NPR 7120.4, NASA Engineering and Program/Project Management Policy.
    • Technical reviews.
    • Progress reviews.
    • Peer reviews (see Topic 7.10 - Peer Review and Inspections Including Checklists in this Handbook).
    • Software quality assurance meetings.
    • System integration test and verification meetings.
    • System safety meetings.
    • Configuration management meetings.
    • Other relevant reviews for this project.
  • Consider for inclusion in contract provisions (description of the method to be used) for verification of:
    • Contractor handling of requirements changes.
    • Accuracy of contractor transformation of high-level requirements into software requirements and detailed designs.
    • Interface specifications between the contractor's product and systems external to it.
    • Adequacy of contractor's risk management plan and its implementation in accordance with the required activities in the project Software Risk Management Plan.
    • Adequacy of the contractor's integration and test plan and its implementation in accordance with the required activities in the project Software Integration and Test Plan.
    • Adequacy of the contractor's configuration management plan and its implementation in accordance with the required activities in the project Software Configuration Management Plan.
  • Consider for inclusion in the contract the content and frequency of progress reports and metrics submissions.
  • Consider for inclusion in the contract identification of quality records to be maintained by the supplier.
  • Consider for inclusion in the contract the delivery process and how it will be accomplished; if incremental development and delivery are agreed upon, state how the validation process works, e.g., incremental validation, and whether it requires integration and test with software/hardware products developed by acquirer and/or other contractors or organizations (other institutes, universities, etc.).
  • Consider for inclusion in the contract a policy for maintaining the software after delivery: who is responsible for maintenance of the software, tools, testbeds, and documentation updates.
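
Item 5 above calls for checking the SEI Published Appraisal Results (PARs) to confirm a supplier's non-expired CMMI rating and the scope of the organization holding it, and the planning guidance in Section 2 places similar conditions on the acquisition team. The minimal Python sketch below shows one way a project might record and check that information; the organization name, dates, and field names are hypothetical, and the SEI PARs site remains the authoritative source.

```python
from datetime import date

# Hypothetical record of a supplier's CMMI-DEV appraisal, as a project might
# capture it from the SEI Published Appraisal Results (PARs) site.
# Field names and values are illustrative only.
appraisal = {
    "organization": "Example Flight Software Division",  # scope of the rating
    "model": "CMMI-DEV",
    "maturity_level": 3,
    "appraisal_date": date(2023, 6, 15),
    "expiration_date": date(2026, 6, 15),                # ratings expire
}

def rating_is_acceptable(record, required_level, proposing_org, as_of=None):
    """Check that the rating is non-expired, at or above the required level,
    and held by the specific organization submitting the proposal."""
    as_of = as_of or date.today()
    return (record["expiration_date"] >= as_of
            and record["maturity_level"] >= required_level
            and record["organization"] == proposing_org)

# Class A software requires CMMI-DEV Maturity Level 3 or higher (SWE-032).
print(rating_is_acceptable(appraisal, required_level=3,
                           proposing_org="Example Flight Software Division"))
```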

4. Technical Monitoring and Quality Assurance

Once the provider has been chosen, the acquisition process moves into a technical monitoring phase. The following guidance should be included when establishing the process for provider monitoring and quality assurance:

  1. Provide technical requirements interpretation for contractor.
  2. Ensure contractor requirements documents meet original intent.
  3. Evaluate contractor progress with respect to cost (an earned-value sketch appears after this list).
  4. Periodically monitor contractor skill mix to ensure agreed-upon skills and experience levels are being provided.
  5. Oversee government-furnished equipment (GFE) to ensure equipment and information are provided in a timely manner.
  6. Periodically assess contractor processes to ensure conformance to process requirements stated in the contract.
  7. Review and assess adequacy of contractor-provided documentation, and ensure contractor implementation of feedback; consider using Formal Inspections (SWE-087) to accomplish this task.
  8. Track status considering the following example questions:
    • Is the contractor meeting the staffing plan?
    • Have the project and the contractor met the user's needs?
    • Does the contractor have stable, educated staff?
    • Does the contractor's project have adequate resources, e.g., adequate staffing and computer resources?
    • Is realistic planning/budgeting in place?
    • Is the build plan being met?
    • Does the contractor have a good understanding of what is required?
    • Are the requirements stable?
    • Is the completion of designed functionality visible?
    • Is the evolving capability and performance of the contractor's product likely to impact development on the acquirer side of the interface?
    • Are integration and testing proceeding as planned?
    • Is contractor cost/schedule performance on target?
    • Is contractor developing a quality product?
  9. Provide regular status reviews to higher level management on contractor progress.
  10. Regularly assess status of identified risks and provide reports during management reviews.
  11. Software engineering should provide technical review to the level required to enhance the probability of mission success. (See the Useful Practices tab of this topic for a list of areas to consider for software engineering technical review.)
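
One common way to quantify the cost and schedule questions above (items 3 and 8) is earned value management. The sketch below computes cost and schedule performance indices (CPI and SPI) from planned value, earned value, and actual cost; the monthly figures are made up purely for illustration.

```python
# Minimal earned-value sketch for contractor cost/schedule monitoring.
# PV (planned value), EV (earned value), and AC (actual cost) are made-up
# monthly figures purely for illustration.
monthly_data = [
    {"month": "2024-01", "pv": 100_000, "ev":  95_000, "ac": 105_000},
    {"month": "2024-02", "pv": 220_000, "ev": 205_000, "ac": 230_000},
    {"month": "2024-03", "pv": 350_000, "ev": 330_000, "ac": 355_000},
]

def performance_indices(pv, ev, ac):
    """Return (CPI, SPI): cost and schedule performance indices.
    CPI < 1.0 means over cost; SPI < 1.0 means behind schedule."""
    return ev / ac, ev / pv

for row in monthly_data:
    cpi, spi = performance_indices(row["pv"], row["ev"], row["ac"])
    print(f"{row['month']}: CPI={cpi:.2f} SPI={spi:.2f}")
```

Indices persistently below 1.0 are a signal to raise the issue during the status reviews and risk reporting described in items 9 and 10 above.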

5. Contract Administration

In addition to monitoring the selected provider's progress and quality, contract administration activities are also carried out for the project. The following guidance should be included when establishing the process for contract administration:

  1. Regularly assess contractor financial data and invoices against budget.
  2. Work with Contracting Officer to ensure timely resolution of any contract-related issues.
  3. Work with the Contracting Officer to ensure that needed modifications to contract terms and conditions are addressed in a timely manner, primarily those affecting schedule, costs, services/products, resources (people, facilities), and deliverables.
  4. Periodically evaluate contractor performance in manner consistent with contract and provide documented evaluation to Contracting Officer.

6. Product Acceptance and Control

Once the provider is ready to deliver the software product, the acquirer should have a process in place for review and acceptance of the product. The following guidance should be included when establishing the process for product acceptance:

  1. Review deliverables based on agreed-upon acceptance criteria (or generally accepted standards, if specific criteria have not been established), document results, and work with contractor to resolve acceptance issues.
    • Typically, an acceptance test plan is created, addressing the following:
      • Acquirer and contractor roles and responsibilities.
      • Defined test strategy.
      • Defined test objectives.
      • Defined acceptance criteria.
      • Developed test scenarios.
      • Developed test scripts.
      • Developed test matrix.
      • Time and resources estimate.
      • Approval cycle.
      • Strategy for post-delivery problem resolutions.
    • Once approved, the test plan is executed and results are documented (a simple tracking sketch follows this list):
      • Select test tools.
      • Select and train team members.
      • Execute the test plan (manual and automated methods).
      • Track test progress.
      • Regression test.
      • Document test results.
      • Resolve problems.
  2. Complete a Physical Configuration Audit (PCA) and a Functional Configuration Audit (FCA) to ensure that traceability is complete, that necessary waivers are in place, and that all required documentation has been developed.
  3. Place formal deliverables under configuration control.
  4. After acceptance of delivered products, support transition to an operational and/or maintenance environment.
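
To illustrate how execution of the acceptance test plan might be tracked against the agreed acceptance criteria, here is a minimal sketch; the test case identifiers, criteria, and statuses are hypothetical.

```python
# Minimal acceptance-test tracking sketch. Test case identifiers, criteria,
# and statuses are hypothetical examples of what an acquirer might record
# while executing an agreed acceptance test plan.
test_results = [
    {"case": "ATC-001", "criterion": "SWE-034 acceptance criterion 1", "status": "pass"},
    {"case": "ATC-002", "criterion": "SWE-034 acceptance criterion 2", "status": "fail"},
    {"case": "ATC-003", "criterion": "SWE-034 acceptance criterion 3", "status": "not run"},
]

def acceptance_summary(results):
    """Summarize progress and flag open issues to resolve with the contractor."""
    executed = sum(1 for r in results if r["status"] != "not run")
    passed = sum(1 for r in results if r["status"] == "pass")
    open_issues = [r["case"] for r in results if r["status"] == "fail"]
    return {"executed": executed, "passed": passed,
            "total": len(results), "open_issues": open_issues}

print(acceptance_summary(test_results))
```

Open issues from such a summary feed the acceptance-issue resolution with the contractor described in item 1 above.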

7. Contract Closeout

The final acquisition step is to close out the contract. The following guidance should be included when establishing the process for contract closeout:

  1. Verify satisfaction of all contract terms and conditions, considering the following sample questions:
    • Has the contract period of performance expired (level of effort type contract)?
    • Have all deliverables been delivered (completion type contract)?
    • Have all Contract Data Requirements List (CDRL) items been delivered and accepted?
    • Was the contractor's performance of the SOW acceptable?
    • If the contract involved patent rights, has the final patent report been filed?
    • Has the final invoice been received?
  2. Verify return of all GFE, as appropriate.
  3. Complete final reports as requested by Contracting Officer.
  4. Provide final contractor performance evaluation to Contracting Officer.
  5. Capture Lessons Learned, if not captured earlier in the project life cycle.

8. Useful Practices, Activities and Templates

The documents below are tools collected from various Centers that have been deemed good practices or practices that work well and produce good results. They are included here as aids for carrying out the software acquisition process.

8.1 CMMI Rating Language for Request for Proposal (RFP)

If a project wants to procure the development of Classes A, B, or C software, the project must levy the associated requirements for which the project has responsibility and also clearly specify that the contractor meet the CMMI maturity level requirements associated with the class. Below are examples of wording that could be used in a statement of work to describe the CMMI maturity level requirements:

For Class A software:

The contractor responsible for the acquisition, development, or maintenance of Class A software shall have a non-expired Capability Maturity Model Integration for Development (CMMI-DEV) rating, as measured by a Software Engineering Institute (SEI) authorized or certified lead appraiser, of CMMI-DEV Maturity Level 3 rating or higher for software, or CMMI-DEV Capability Level 3 rating or higher in all CMMI-DEV Maturity Levels 2 and 3 process areas for software.

For Class B software:

The contractor responsible for the acquisition, development, or maintenance of Class B software shall have a non-expired CMMI-DEV rating, as measured by an SEI authorized or certified lead appraiser, of CMMI-DEV Maturity Level 2 rating or higher for software or CMMI-DEV Capability Level 2 rating or higher for software in the following process areas:

  1. Requirements management.
  2. Configuration management.
  3. Process and product quality assurance.
  4. Measurement and analysis.
  5. Project planning.
  6. Project monitoring and control.
  7. Supplier agreement management (if applicable).

For Class C software:

The project can minimally choose to pass down this requirement in accordance with the Center's procedures related to Class C, as long as the project takes some action to determine the contractor's capability to develop software in a "meets or exceeds" manner. As many NASA contractors are already at CMMI Maturity Level 2 or higher, the project may alternatively choose to simply require Maturity Level 2 in the RFP for Class C software.

If a contractor chooses to subcontract the development of Classes A, B, or C software, then the subcontractor(s) is also required to have a CMMI Maturity Level 2 (for Class B), Maturity Level 3 (for Class A), or Center-specified (for Class C) rating.

8.1.1  General Example

The contractor and its subcontractors' organizations associated with software development responsibilities shall be at SEI Software CMMI-DEV Maturity Level 3 (Staged Representation) or higher, before the Preliminary Design Review. 

8.1.2  Examples for RFP Information Technology (IT) Management Section

Example 1: IT Management

For IT applications other than mission-specific flight and non-flight software, the contractor shall use commercial off-the-shelf and existing government off-the-shelf products where cost effective to NASA. All IT applications, other than mission-specific flight and non-mission flight software, shall comply with NASA requirements as outlined in NPR 7150.2 for the appropriate software classes, limited to Classes E, F, and G, and as applicable for the project, MPR 2800.4, Marshall Operational Readiness Review (MORR) for Center Applications and Web Sites, and NPR 2830.1, NASA Enterprise Architecture (EA) Procedures.

Example 2: IT Management

For IT applications, other than mission-specific software, the contractor shall:

  • Where cost effective to NASA, use commercial off-the-shelf and existing government off-the-shelf products.
  • Ensure compatibility with existing NASA applications and systems.
  • Comply with NASA requirements for NPR 7150.2 for the appropriate software classes, limited to Classes E, F, and G, and as applicable for the project.

8.1.3  Examples for RFP Software Section

Example 1: Embedded Software (Firmware)

The contractor shall develop and maintain software in accordance with NPR 7150.2 for the appropriate software classes and as applicable for the project, and NASA-STD-8739.8 (SWEREF-278), NASA Software Assurance Standard (chapters 6 and 7).

Example 2: Software Engineering

a. The contractor shall define, design, develop, test, qualify, integrate, verify, validate, deliver, and maintain all software. The plans for accomplishing this work shall be documented in DRD, Software Development Plan.

b. The contractor shall justify the reuse of existing software, modification of existing software, and the development of new software in DRD, Software Development Plan.

c. The contractor shall, under project direction, participate in coordinating with the NASA IV&V Facility in accordance with NASA-STD-8739.8 (chapters 6 and 7) to plan for the participation of the NASA IV&V Facility in the software development life-cycle activities.

d. The contractor and its subcontractors' organizations associated with software development responsibilities shall be at SEI Software CMMI-DEV Maturity Level 3 (Staged Representation) or higher, before the Preliminary Design Review. This requirement does not apply to commercial off-the-shelf software procured for the project.

e. The contractor shall develop, update, and maintain all software and software development tools under configuration management in accordance with the DRD, Software Configuration Management Plan.

f. The contractor shall develop and maintain electronic Software Development Folders for all flight, ground, and test software in accordance with DRD, Software Development Folder.

g. The contractor shall use the following guidance document for the development of all software document deliverables:

  • Project Classification Matrix (use as guidance in interpreting flight software classification definitions in NPR 7150.2).

h. The contractor shall use the following Standards for designing, developing, and testing all software:

  • NPR 7150.2.
  • NASA-STD-8739.8 (chapters 6 and 7).

Example 3:

The contractor shall define, design, code, test, integrate, and qualify the software. The contractor shall treat the software component of firmware, which consists of computer programs and data loaded into a class of memory that cannot be dynamically modified by the computer during processing, as software for the purposes of this SOW. The scope of this activity applies to the reuse of existing software, modification of existing software, and/or development of new software. The contractor shall provide information and access to products under development to provide the Government with insight into software development and test activities, including monitoring integration and verification adequacy, auditing the software development process, and participation in all software and system reviews.  The contractor shall support the implementation of the overall risk management process, as well as program status and progress reviews for the software development process. The contractor shall support software Technical Interchange Meetings and other status meetings, as required, to facilitate Government insight into the software development. The Government insight may include Government civil servant insight, Government support contractor insight, and independent verification and validation review.  The contractor shall perform peer reviews on Software Requirements Specifications, Software Test Plans, and on selected design and code items and provide results to the Government via Software Inspection/Peer Review Reports. The contractor shall maintain software metrics and provide Software Metrics Reports in accordance with DRD.

The contractor shall provide the Government web-based electronic access (with access control) to intermediate and final software products (including code) and software process tracking information, including software development and management metrics.

The software development shall comply with NPR 7150.2 as applicable for the project by NASA software classification.

8.1.4  Example for RFP EGSE Software Section

Example 1: EGSE Software

a. The contractor shall perform the design, code, verification, validation, and delivery of all EGSE code and executables in response to EGSE Subsystem Requirements Document.

b. The contractor shall develop and maintain software in accordance with NPR 7150.2 for the appropriate software classes and as applicable for the project and NASA-STD-8739.8 (chapters 6 and 7).

8.2 Recommended Technical Review Activities List

Areas to consider for software engineering technical review consist of the following:

  • Performing independent assessment of software systems engineering, software processes, software products, software integration, and software test analyses.
  • Reviewing all mission-critical software products.
  • Performing software schedule and resource assessments and analyses.
  • Developing software technical oversight plans.
  • Coordinating any software related issues with the project.
  • Participating in reviews and Technical Interchange Meetings.
  • Performing periodic audits on pre-defined process(es).
  • Serving as chair board, as board member, or as Review Item Disposition (RID) writer, at a formal review.
  • Participating in resolution and closure of issues.
  • Checking and comparing vendor data with independent models.
  • Performing evaluations of software products (software documentation, code, etc.)
  • Serving as Software Technical Authority responsible for acquired software products.
  • Providing planning and project support:
    • Support and coordinate software trade studies.
    • Assess software development processes.
    • Support review of system-level requirements specifications.
    • Support development and review of system-level verification and validation test plans.
    • Verify compliance with Software Development Plan(s).
    • Verify compliance with Software Quality and Configuration Management Plans.
    • Participate in project documentation reviews.
    • Support risk management activities.
    • Participate in project and software developer review boards, Technical Interchange Meetings, Working Groups, and telecons.
    • Participate in developer's daily and/or weekly software development activities to maintain knowledge of software development progress.
    • Identify and track software metrics.
    • Review and assess schedule of the software development activities.
    • Provide a status of the developer's software progress, metrics, and any problems to the project.
    • Conduct periodic site visits as needed to attain knowledge of software development progress.
    • Review and assess the content and completeness of instrumentation and command control list (engineering integration database).
  • Requirements analysis:
    • Verify absence of problems and risk items associated with requirements:
      • Documentation standards used and properly applied.
      • System requirements clearly organized.
      • Even emphasis and levels of detail.
      • Consistent identification schemes.
      • Clear and concise requirement statements.
      • Good sentence structure.
      • Good word selection, unambiguous terms.
    • Track growth in size and complexity of requirements to identify positive/negative trends.
    • Estimate variances in schedule and costs based on requirements size and completeness.
    • Support software requirements problem and issue resolution.
    • Review and assess the interface specifications and data.
    • Verify software requirements traceability.
    • Support software requirements walkthroughs.
    • Support evaluation of potential requirements changes and associated impacts through the life of the project.
  • Design Phase:
    • Support review of preliminary and detailed design specifications (DDSs).
    • Support software design problem and issue resolution.
    • Verify traceability of design to software requirements.
    • Support design walk-throughs.
  • Code analysis:
    • Track growth and complexity of source code modules across builds.
    • Rank source code modules according to their relative risk (a ranking sketch follows this list), as determined by:
      • Percent of internal documentation.
      • Overly large files or modules.
      • Use of unstructured programming constructs.
      • High decision or calling complexity.
      • Unused or "dead" code.
      • Poor implementation, if applicable.
    • Comply with program coding standards.
    • Develop and maintain knowledge of code functionality.
    • Present code functionality to subsystems for validity.
    • Support code development and integration testing.
    • Support software code problem and issue resolution.
    • Support developer code walk-throughs.
  • Test Phase:
    • Support development and review of test plans, test procedures, and test cases.
    • Support Test Readiness Reviews (TRR):
      • Review and identify discrepancies in software documentation.
      • Support final closure of discrepancies.
    • Support software test problem and issue resolution.
    • Support Computer Software Configuration Items (CSCI) integration and test activities.
    • Review software test reports.
  • Software problem report and effort data analyses:
    • Analyze problem reports and present understandable graphical summaries.
    • Track error detection and correction rates.
    • Assess adequacy of test program.
    • Detect schedule risks early.
    • Predict effective completion date.
  • Software metrics:
    • Help project office identify applicable software metrics.
    • Review and assess the software metric data provided by the contractor.
    • Develop, maintain, and report software insight metric data to the project.
  • Software Independent Verification and Validation (IV&V) support:
    • Perform software criticality assessments.
    • Perform software risk assessments.
    • Develop software IV&V project plans.
    • Develop software IV&V statements of work.
    • Support projects in review of all software IV&V products.
    • Provide expertise and assistance to the projects in resolution and implementation of any software IV&V recommendations.
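
As a notional example of the module risk ranking listed under code analysis above, the sketch below combines several of the listed indicators into a single relative-risk score. The module names, metric values, and weighting are hypothetical; a real assessment would rely on the project's static-analysis and metrics tooling.

```python
# Notional source-module risk ranking sketch. Module names, metric values,
# and the weighting scheme are hypothetical; a real assessment would use
# the project's static-analysis and metrics tooling.
modules = [
    {"name": "guidance.c",  "comment_pct": 5,  "sloc": 4200, "cyclomatic": 45, "dead_code": True},
    {"name": "telemetry.c", "comment_pct": 25, "sloc": 900,  "cyclomatic": 12, "dead_code": False},
    {"name": "cmd_proc.c",  "comment_pct": 15, "sloc": 2600, "cyclomatic": 30, "dead_code": False},
]

def risk_score(m):
    """Higher score = higher relative risk. Each term roughly corresponds to
    an indicator in the list above: sparse internal documentation, overly
    large modules, high decision complexity, and presence of dead code."""
    score = 0.0
    score += max(0, 20 - m["comment_pct"]) / 20   # sparse internal documentation
    score += min(m["sloc"] / 3000, 1.0)           # overly large file or module
    score += min(m["cyclomatic"] / 40, 1.0)       # high decision/calling complexity
    score += 0.5 if m["dead_code"] else 0.0       # unused or "dead" code present
    return score

# List modules from highest to lowest relative risk.
for m in sorted(modules, key=risk_score, reverse=True):
    print(f"{m['name']}: risk={risk_score(m):.2f}")
```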

8.3 Statement of Work Checklist

Langley Research Center LMS-CP-5523, Rev. B, Statement of Work (SOW) Review Procedures, contains a useful SOW checklist. See the NASA Agency Process Asset Library (PAL) for the latest version of this checklist.

8.4 Example Templates

The following NASA DIDs are listed as sample documentation templates that can be called for during the solicitation portion of the software acquisition process. Center PALs are to be consulted for DIDs and DRDs relevant to a specific NASA Center.


8.4.1 Software Documentation Requirements

The NASA software engineering documentation requirements are provided in NPR 7150.2, chapter 5, and their applicability to specific classes is described in Appendix D of that document. Included are descriptions for:

  • Software Development or Management Plan.
  • Software Configuration Management Plan.
  • Software Test Plan.
  • Software Maintenance Plan.
  • Software Assurance Plan.
  • Software Requirements Specification.
  • Software Data Dictionary.
  • Software Design Description.
  • Software Change Request/Problem Report.
  • Software Test Procedures.
  • Software User Manual.
  • Software Version Description.
  • Software Metrics Report.
  • Software Test Report.
  • Software Peer Review/Inspection Report.

8.4.2 Center DIDs and DRDs

The following DIDs and DRDs are samples available from Center PALs. Consult your own Center PAL for templates relevant to work performed for your Center.



Marshall Space Flight Center Templates
Available from the "Templates" section of the Marshall Space Flight Center Software Process Improvement website and the individual Project Asset sections of the Marshall Space Flight Center Process Asset Library (PAL):

  • Software Configuration Management Plan.
  • Software Test Report (STR) Template.

Goddard Space Flight Center Templates

  • Software Management Plan/Product Plan (SMP/PP) for Classes A, B, and C Software.
  • ISD Software Management Plan/Product Plan (SMP/PP) for Classes D and E Software.
  • Version Description Document.
  • Template for the Software Quality Assurance Plan.
  • Configuration Management Plan Template.
  • Other templates in progress or not available publicly.

9. Resources

  • (SWEREF-007) WBS Checklist Tool, NASA Goddard Space Flight Center (GSFC), 2007. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
  • (SWEREF-062) Software Supplier Agreement Management Plan (SSAMP) Template, NASA Jet Propulsion Laboratory, 2002. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-118) Mary Catherine Ward, Joseph P. Elm, Software Engineering Institute (SEI), 2005. Download the PDF at http://www.sei.cmu.edu/library/abstracts/presentations/method2005.cfm.
  • (SWEREF-120) Suellen Eslinger, Karen L. Owens, and Mary A. Rich (The Aerospace Corporation). This 2004 presentation was delivered at the 3rd OSD Conference on the Acquisition of Software Intensive Systems by Richard J. Adams and others of The Aerospace Corporation.
  • (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, approved 2022-09-08, superseding NASA-STD-8739.8A.
  • (SWEREF-301) Polydys, M. L. and Wisseman, S. (May 2007). CrossTalk, The Journal of Defense Software Engineering, Vol. 20, No. 5 (14-18). Retrieved February 29, 2012 from http://www.crosstalkonline.org/storage/issue-archives/2007/200705/200705-0-Issue.pdf.
  • (SWEREF-327) Software Engineering Institute (SEI), architecture web site.

SWEs and Topics related to this Handbook topic:

  • SWE-015 - Cost Estimation
  • SWE-019 - Software Life Cycle
  • SWE-027 - Use of Commercial, Government, and Legacy Software
  • SWE-032 - CMMI Levels for Class A, B, and C Software
  • SWE-033 - Acquisition vs. Development Assessment
  • SWE-034 - Acceptance Criteria
  • SWE-037 - Software Milestones
  • SWE-038 - Acquisition Planning
  • SWE-040 - Access to Software Products
  • SWE-041 - Open Source Software Notification
  • SWE-042 - Source Code Electronic Access
  • SWE-043 - Track Change Request
  • SWE-044 - Supplier Metric Data
  • SWE-045 - Project Participation in Audits
  • SWE-046 - Supplier Software Schedule
  • SWE-047 - Traceability Data
  • SWE-048 - Solicitation
  • SWE-087 - Software Peer Reviews and Inspections for Requirements, Test Plans, Design, and Code
  • SWE-102 - SW Development-Management Plan
  • Topic 7.04 - Flowdown of NPR Requirements on Contracts and to Other Centers in Multi-Center Projects

9.1 Tools

Tools to aid in compliance with this Topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

