

SWE-039 - Software Supplier Insight

1. Requirements

3.1.8 The project manager shall require the software developer(s) to periodically report status and provide insight into software development and test activities; at a minimum, the software developer(s) will be required to allow the project manager and software assurance personnel to:

    1. Monitor product integration.
    2. Review the verification activities to ensure adequacy.
    3. Review trade studies and source data.
    4. Audit the software development processes and practices.
    5. Participate in software reviews and technical interchange meetings.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-039 - Last used in rev NPR 7150.2D

Rev | SWE Statement
A

2.6.1.1 The project shall require the software supplier(s) to provide insight into software development and test activities; at a minimum the following activities are required:  monitoring integration, review of the verification adequacy, review of trade study data and results, auditing the software development process, participation in software reviews and systems and software technical interchange meetings.

Difference between A and B

No change

B

3.12.8 The project manager shall require the software supplier(s) to provide insight into software development and test activities; at a minimum, the software supplier(s) will be required to allow the project manager or designate to: 

    1. Monitor product integration.
    2. Review the verification activities to ensure adequacy.
    3. Review trade studies and source data.
    4. Audit the software development process.
    5. Participate in software reviews and systems and software technical interchange meetings.

Difference between B and C

Changed "supplier(s)" to "developer(s)";
Added requirement to periodically report status;
Added requirement for SA personnel to be given the same access as the PM;
Added auditing of development practices;
Removed requirement to participate in "System" technical interchange meetings.

C

3.1.8 The project manager shall require the software developer(s) to periodically report status and provide insight into software development and test activities; at a minimum, the software developer(s) will be required to allow the project manager and software assurance personnel to:

    1. Monitor product integration.
    2. Review the verification activities to ensure adequacy.
    3. Review trade studies and source data.
    4. Audit the software development processes and practices.
    5. Participate in software reviews and technical interchange meetings.

Difference between C and D

No change

D

3.1.8 The project manager shall require the software developer(s) to periodically report status and provide insight into software development and test activities; at a minimum, the software developer(s) will be required to allow the project manager and software assurance personnel to:

    1. Monitor product integration.
    2. Review the verification activities to ensure adequacy.
    3. Review trade studies and source data.
    4. Audit the software development processes and practices.
    5. Participate in software reviews and technical interchange meetings.



1.3 Applicability Across Classes

Classes F and G are labeled with “X (not OTS)”.  This means that this requirement does not apply to off-the-shelf software for these classes. 

[Applicability table: Classes A through F, with per-class Applicable/Not Applicable markings rendered as icons in the source.]


2. Rationale

Software supplier activities are monitored to assure that the software work products are produced per the project and software requirements. Appropriate use of software project "insight" (a surveillance mode requiring monitoring of customer-identified and contracted milestones) allows NASA to detect problems early and take corrective action if necessary. The insight activities cited in this requirement comprise a minimum set that, when executed over the software product's life cycle, ensures continual knowledge of the work status achieved by the software developer/supplier.

3. Guidance

SWE-039 specifies the minimum NASA insight activities to be included in the contract with the software supplier. Although the SWE is written to address contract requirements for insight, its contents may also be applied to internal supplier organizations. The minimum activities mentioned in the requirement are to be included as part of the issued statement of work (SOW). A listing and description of the agreed-upon final activities are included in the contract SOW, for external suppliers, or in a formal agreement between organizations, for internal suppliers. NASA oversight activities (oversight is a surveillance process that implies more active supervision of a contractor's processes and decision making) are not covered in SWE-039; they are included in the overall project plan, the Software Management Plan (see 5.08 - SDP-SMP - Software Development - Management Plan), and the contract SOW. See 7.03 - Acquisition Guidance and 7.04 - Flow Down of NPR Requirements on Contracts and to Other Centers in Multi-Center Projects for guidance on what to put into a contract SOW, advice on how to flow down requirements to suppliers, and, in some instances, suggested language for the contract. Consult the Center or program planning documents for help with proposal language. (Proposal information is typically unique and is usually developed by Source Evaluation Boards (SEBs).)

Once a qualified supplier is selected for the software acquisition activity (see 7.03 - Acquisition Guidance), the software development team still needs to monitor the supplier's activities so the software development work is accomplished according to the project and software requirements. Insight into the development and testing of the software gives NASA knowledge of the progress being made by the software developer throughout the life cycle.

Insight is a continuum that can range from low intensity, such as reviewing quarterly reports, to high intensity, such as performing surveys and reviews or witnessing software development or testing activities. Insight begins with attendance at important meetings. It enables the customer to see the work, work products, and supporting processes of the supplier. The level of insight to be exercised on the project is specified in the contract. The use of the insight role to monitor the software developer's progress requires a balance between too much and too little involvement with the software developer.

While insight is defined in one case as a "surveillance mode requiring the monitoring of customer-identified metrics and contracted milestones" (NASA-STD 8709.22 274), this requirement specifies more. This SWE lists five areas that, if properly executed, will provide sufficient knowledge about the state of development of the software. These activities, in turn, enable the correct evaluation of the metrics and milestones associated with the contract. The following sections discuss the five areas.

See also Topic 8.06 - IV&V Surveillance

3.1 Monitoring Integration

Software integration proceeds from the development of individual components that have coded interfaces up to the time when the team assembles them into integrated systems of coded software and hardware.

Successful software integration depends on whether:

  • The hardware is compatible with the software.
  • There is agreement about the communications protocols to be used.
  • There is an understanding of how to transfer information among the information architectures involved.
  • The software interrelationships are well defined.
  • The information at the interface between systems is in an open format or mapped to adequate neutral-format standards.
  • The purpose of communication is mutually understood.
  • The terminology used is unambiguous in meaning.

Understanding the state of development of these integration characteristics must be a goal of the project’s insight activity.

In addition to software development, there are also basic COTS and legacy software issues that must be considered during the monitoring of software integration. Evaluation of the following characteristics usually occurs during monitoring activities.

  • Structure of the architecture and constraints.
  • Techniques for mining legacy assets.
  • Re-coding and re-hosting.
  • Long-term maintenance.
  • Translators and emulators.
  • Middleware standards and products.
  • Management of interfaces.
  • Issues resolution.

For a deeper discussion of these characteristics, see “COTS and Legacy Software Integration Issues,” 321 from the 1998 Ground System Architectures Workshop.

See also SWE-040 - Access to Software Products, SWE-042 - Source Code Electronic Access

3.2 Review of the Verification Adequacy

Software verification is the software engineering activity that confirms that software products properly reflect the requirements specified for them. Insight into the adequacy of the verification activities includes an examination of the following:

  • Objective evidence of the quality of the product from the software developer (e.g., accompanying documentation, test reports, statistical records, process control).
  • Inspections and audits of the software.
  • Reviews of the required documentation.
  • Reviews of any delegation of verification to the supplier made in the SOW.

When the NASA software team uses periodic reports (e.g., progress reports, or summary and individual test reports) to verify supplier-developed software, the requirements for the data in these reports are specified in the SOW. When software verification will be performed at the supplier's site, the requirements for the method of verification are specified in the SOW.

The NASA software team must be allowed to participate in reviews of the verification activity at the software developer's site. The intent and level of these participation activities are specified in the contract. 

3.3 Review of Trade Study Data and Results

The team exercising the "insight" role uses contract-specified access to review all applicable trade studies performed by the software developer or the project. Adequate trade studies help assure that the software development and software testing choices made during development are correct.

Trade studies compare the relative merits of alternative approaches, and so ensure that the most cost-effective system is developed. They maintain traceability of design decisions back to fundamental requirements. Trade studies do this by comparing alternatives at various levels for the system being developed. They may be applied to the concept, design, implementation, verification, support, and other areas. They provide a documented, analytical rationale for choices made in system development.

Trade studies can be used to compare several options.
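
To make the weighted-comparison idea concrete, the following minimal sketch scores alternatives against weighted criteria; the criteria, weights, and scores are hypothetical illustrations, not values from this requirement:

```python
# Minimal sketch: weighted-criteria trade study comparison.
# Criteria, weights, and scores are hypothetical examples.

criteria_weights = {"cost": 0.40, "performance": 0.35, "maintainability": 0.25}

# Raw scores per alternative on a 1-5 scale (higher is better)
alternatives = {
    "Option A (COTS)":   {"cost": 4, "performance": 3, "maintainability": 2},
    "Option B (custom)": {"cost": 2, "performance": 5, "maintainability": 4},
    "Option C (reuse)":  {"cost": 5, "performance": 3, "maintainability": 3},
}

for name, scores in alternatives.items():
    total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
    print(f"{name}: weighted score = {total:.2f}")
```

The documented weights and raw scores, together with the computed totals, provide the kind of analytical, traceable rationale for a selection decision described above.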

3.4 Auditing the Software Development Process

NASA considers a software audit to be an examination of a work product or set of work products performed by a group independent from the developers to assess compliance with specifications, standards, contractual agreements, or other criteria. This NASA definition is from NASA-STD-8739.8 "Software Assurance and Software Safety Standard," and is based on IEEE 610.12.

"Software product" mostly, but not exclusively, refers to some kind of technical document. IEEE Std. 1028, IEEE Standard for Software Reviews, 219 offers a list of 33 "examples of software products subject to audit," including source code and documentary products, such as various sorts of plans, contracts, specifications, designs, procedures, standards, and reports, but also non-documentary products, such as data, test data, and deliverable media.

Software audits are distinct from software peer reviews and software management reviews in that they are conducted by personnel external to, and independent of, the software development organization, and are concerned with compliance of products or processes, rather than with their technical content, technical quality, or managerial implications.

See also Topic 8.12 - Basics of Software Auditing

3.5 Participation in Software Reviews and Systems and Software Technical Interchange Meetings

The NASA software development team must be allowed to attend the regular and periodic meetings that include software development progress reviews, milestone reviews, and formal reviews [such as those found in NPR 7123.1 041 (Systems Engineering), NPR 7120.5 082 (Space Flight Program and Project Management), NPR 7120.7 264 (IT and Institutional Infrastructure), and NPR 7120.8 269 (Research and Technology)], systems engineering meetings regarding software activities, and regular technical interchange meetings. The material reviewed and the ideas for software development discussed at these meetings and interchanges often provide a more robust description of the software architecture, the software requirements, the design, and the major issues than is typically found in monthly or quarterly submitted reports.

The meetings, reviews, and technical interchange opportunities must be specified in the SOW. (See 7.03 - Acquisition Guidance for advice on oversight activities, specifically what to include for binding reviews, participation on review boards, and how to handle comments and assigned actions.)

If Defense Contract Management Agency 475 services are available at the software development site, consider using them for basic monitoring and verification services. Arrangements for these services are typically made by the Center's acquisition department.

See also SWE-046 - Supplier Software Schedule

3.6 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:


3.7 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki  197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

4. Small Projects

This requirement applies to all projects regardless of size.

5. Resources

5.1 References

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.


6. Lessons Learned

6.1 NASA Lessons Learned

A documented lesson from the NASA Lessons Learned database notes the following:

  • Space Shuttle Program/External Tank (ET)/Super Light Weight Tank (SLWT). Lesson Number 1048 535: A discussion on the shuttle program's certification of its super lightweight tank attests to the value of exercising insight and guidance through limited oversight activities.
  • Acquisition and Oversight of Contracted Software Development (1999). Lesson Number 0921 528: Tailorable acquisition management and oversight processes for NASA contracted software development are essential to ensure that customers receive a quality product. A documented lesson from the NASA Lessons Learned database cites as a cause of the loss of a mission "the lack of a controlled and effective process for acquisition of contractor-developed, mission-critical software." In this particular case, the quality of the contractor's product was not monitored as it would have been if the proper milestones for reviewing and auditing contractor progress had been in place.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-039 - Software Supplier Insight
3.1.8 The project manager shall require the software developer(s) to periodically report status and provide insight into software development and test activities; at a minimum, the software developer(s) will be required to allow the project manager and software assurance personnel to:
    1. Monitor product integration.
    2. Review the verification activities to ensure adequacy.
    3. Review trade studies and source data.
    4. Audit the software development processes and practices.
    5. Participate in software reviews and technical interchange meetings.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Confirm that software developer(s) periodically report status and provide insight to the project manager.

2. Monitor product integration.

3. Analyze the verification activities to ensure adequacy.

4. Assess trade studies, source data, software reviews, and technical interchange meetings.

5. Perform audits on software development processes and practices at least once every two years.

6. Develop and provide status reports.

7. Develop and maintain a list of all software assurance review discrepancies, risks, issues, findings, and concerns.

8. Confirm that the project manager provides responses to software assurance and software safety submitted issues, findings, and risks and that the project manager tracks software assurance and software safety issues, findings, and risks to closure.

7.2 Software Assurance Products

  • Software Assurance Status Reports (periodic status reports on SA involvement, including analysis and assessment results) (See Tasks 1, 2, 3, 4, 6)
  • Assessment of Software Reviews results
  • Assessment of Technical Interchange Meetings results
  • Assessment of Trade Studies and Source Data results
  • Results of SA Audits performed, including audit findings (Task 5)
  • List of review discrepancies (e.g., RIDs), risks, issues, findings, and concerns found by the software assurance team (Task 7)


Objective Evidence

  • Software assurance and software safety analysis results
  • Software assurance process audit results
  • Software assurance status reports and metrics
  • List of software assurance and software safety discrepancies, risks, issues, findings, and concerns.

Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

  • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., a risk log).
  • Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
  • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
  • Signatures on SA-reviewed or witnessed products or activities, or
  • Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a "short summary" as objective evidence of a confirmation are:
    • To confirm that "IV&V Program Execution exists," the summary might be: the IV&V Plan is in draft state and is expected to be complete by (some date).
    • To confirm that "Traceability between software requirements and hazards with SW contributions exists," the summary might be: x% of the hazards with software contributions are traced to the requirements (a minimal calculation of this kind of coverage figure is sketched after this list).
  • The specific products listed in the introduction of Topic 8.16 are also objective evidence, in addition to the examples listed above.
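
As an illustration of the traceability summary above, the following minimal sketch computes the percentage of hazards with software contributions that trace to at least one software requirement; the hazard and requirement identifiers are hypothetical:

```python
# Minimal sketch: compute the share of software-related hazards that trace
# to at least one software requirement. All identifiers and data are
# hypothetical examples.

hazards_with_sw_contribution = {"HAZ-001", "HAZ-002", "HAZ-003", "HAZ-004"}

# Hypothetical trace links: hazard ID -> software requirement IDs
trace_links = {
    "HAZ-001": {"SRS-101", "SRS-102"},
    "HAZ-002": {"SRS-205"},
    "HAZ-004": set(),  # link record exists but is empty -> not traced
}

traced = {h for h in hazards_with_sw_contribution if trace_links.get(h)}
coverage = 100.0 * len(traced) / len(hazards_with_sw_contribution)
print(f"{coverage:.0f}% of hazards with software contributions "
      f"are traced to requirements")
# -> 50% (only HAZ-001 and HAZ-002 of the four hazards are traced)
```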

7.3 Metrics

  • # of software work product Non-Conformances identified by life cycle phase over time
  • # of software process Non-Conformances by life cycle phase over time
  • # of Compliance Audits planned vs. # of Compliance Audits performed
  • # of Open vs. Closed Audit Non-Conformances over time
  • Trends of # of Non-Conformances from audits over time (include counts from process and standards audits and work product audits)
  • # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
  • # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the Project
  • Trends of # Open vs. # Closed over time
  • # of Non-Conformances (activities not being performed)
    • # of Non-Conformances accepted by the project
    • # of Non-Conformances (Open, Closed, Total)
    • Trends of Open vs. Closed Non-Conformances over time
  • # of Non-Conformances from reviews (Open vs. Closed; # of days Open)
  • # of Risks identified in each life cycle phase (Open, Closed)
  • # of Risks by Severity (e.g., red, yellow, green) over time
  • # of Risks with mitigation plans vs. total # of Risks
  • # of Risks trending up over time
  • # of Risks trending down over time
  • # of safety-related requirement issues (Open, Closed) over time
  • # of safety-related Non-Conformances identified by life cycle phase over time
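
Many of these metrics are simple counts and trends over time. As an illustration, the following minimal sketch computes an open-vs.-closed non-conformance trend by month; the dates and records are hypothetical:

```python
# Minimal sketch: trend open vs. closed audit non-conformances per month.
# All records and dates are hypothetical examples.
from collections import Counter
from datetime import date

# (opened_date, closed_date or None) for each non-conformance
nonconformances = [
    (date(2024, 1, 10), date(2024, 2, 20)),
    (date(2024, 1, 15), None),
    (date(2024, 2, 5), date(2024, 2, 25)),
    (date(2024, 3, 1), None),
]

def month_key(d: date) -> str:
    return f"{d.year}-{d.month:02d}"

opened = Counter(month_key(o) for o, _ in nonconformances)
closed = Counter(month_key(c) for _, c in nonconformances if c)

still_open = 0
for month in sorted(set(opened) | set(closed)):
    still_open += opened[month] - closed[month]
    print(f"{month}: opened={opened[month]} "
          f"closed={closed[month]} still_open={still_open}")
```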

See also Topic 8.18 - SA Suggested Metrics

7.4 Guidance

  1. Confirm that software developer(s) are periodically reporting status and providing insight to the project manager.
  2. Monitor product integration.
  3. Analyze the verification activities to ensure adequacy.
  4. Assess the trade studies and source data: look for risks or issues, and for decisions that change the software safety-criticality designation.
  5. Monitor and participate in software reviews and technical interchange meetings: participate in all major software reviews and, where possible, in technical interchange meetings. Write findings, discrepancies, and RIDs in major reviews as required by the review process.
  6. Develop and provide SA status reports to the project manager and engineering. See the guidelines below for the content of a status report. The content guidelines are optional, but status reports should be provided to engineering and the project office.
  7. Develop and maintain a list of all software assurance review discrepancies, risks, concerns, and issues.

A primary function of software assurance is to help determine the status of the software project and the quality of the products being produced. Many methods and tools can assist with this, including direct observations, spot checks, audits, analysis of measurements, use of analysis tools (static code analyzers), techniques (FTA, FMEA), reviewing results (tests, traceability matrices), and risk/issue identification and tracking. A combination of these techniques should be used to provide appropriate coverage of the engineering effort to assess product quality. One of the most common methods is conducting audits, but care must be taken not to overuse this method.

Perform audits on software development processes and practices at least once every two years: audit each software engineering process area at least once every two years, and more frequently where practical. If the project is a short-term activity, less than two years in development, audit each process used during the development cycle. Capture and record the audit findings, and communicate any findings that you consider an issue or risk.

7.4.1 Planning Software Assurance Audits

When software assurance personnel plan their audit schedule, they should consider the following types of audits:

  • Project performance milestone audits (a check of the project's progress against the planned progress)
  • Documentation audits/product audits (for example, the project development/management plan, CM plan, risk management plan, maintenance plan, safety plan, Version Description Document, etc.) (Does the plan have the necessary content? Does the product meet its requirements?)
  • Process audits (examples: CM process, change management process, unit test process, etc.) (Is the team following the documented process for the activity? Is the process working well?)
  • Physical configuration audits (Does the delivery contain all the expected contents in the correct versions?)
  • Functional configuration audits (Does the delivery meet its requirements?)
  • Project compliance audits against NPR 7150.2 requirements and Center requirements
  • Audits to confirm that software assurance is compliant with NASA-STD-8739.8  278

Classification and software criticality should be taken into consideration when planning audits. Audits should be planned on a schedule that provides the audit results when they can be of the most benefit to the project. For example, a planning process audit is probably not very useful if it is performed during the test phase of the project. However, if it is performed during the project’s planning phase, it may be able to point out some critical planning elements that have been overlooked.
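
One way to apply the phase-timing advice above is to check each planned audit against the phase where it is most useful. The following minimal sketch does this for hypothetical audit types and life cycle phases; the phase mapping is an illustrative assumption, not an official taxonomy:

```python
# Minimal sketch: check a planned audit schedule against the life cycle phase
# where each audit type is most useful. Audit types, phases, and the mapping
# are hypothetical examples.

PHASES = ["planning", "requirements", "design", "implementation", "test", "delivery"]

# Hypothetical "most useful no later than" phase for each audit type
BEST_BY = {
    "planning process audit": "planning",
    "CM process audit": "design",
    "unit test process audit": "implementation",
    "functional configuration audit": "delivery",
    "physical configuration audit": "delivery",
}

planned = [
    ("planning process audit", "test"),        # scheduled too late to help
    ("CM process audit", "design"),
    ("functional configuration audit", "delivery"),
]

for audit, phase in planned:
    if PHASES.index(phase) > PHASES.index(BEST_BY[audit]):
        print(f"WARNING: {audit} planned in '{phase}' phase, "
              f"after its most useful point ('{BEST_BY[audit]}')")
```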

Results of audits or any other analysis done during a phase should be reported at least at the milestone review points. (See the contents for a status report below). Software assurance should record any non-conformances, issues, or risks they have identified and track them to closure. Critical, time-sensitive issues or risks should be reported to the project management immediately, rather than waiting for a scheduled review.

The guidance for every task that involves performing an audit should also make clear that all audit findings are promptly shared with the project and addressed.

7.4.2 SA Status Reports

The following section defines the content of an SA Status Report. The Status Report is a scheduled, periodic communication tool that helps manage expectations among the SA representative(s), project engineering and management, and OSMA stakeholders. It provides insight into the overall status, the value SA adds to the project, and performance against the SA plan. Pre-coordinate with the stakeholders receiving the status reports and define the specifics/content of the status report in the SA Plan. Status reports should provide the information needed to satisfy the stakeholders receiving the report. The SA Status Report content, in no specific order, addresses:

SA Project Title and Date – Identify the project and the date(s) of the reporting period.

Overall Status Dashboard or Stoplight Table – Provide a high-level status of progress, risk, schedule, and whether or not assistance/awareness is required. Typically a Green/Yellow/Red scheme is used to indicate go/no-go status and whether the work is approaching minimum thresholds or limits.

Key Contributions/Accomplishments/Results (Value Added) – Identify any activities performed during the reporting period that have added value to the project. The reporting should include key SA contributions, accomplishments, and results of SA Tasking activities performed. (See Table 1 in NASA-STD-8739.8  278  (SA Requirements Mapping Matrix)). Examples are:

  • Analyses performed (e.g., PHAs, HAs, FMEAs, FTAs, Static Code Analysis)
  • Audits performed (e.g., process, product, PCAs, FCAs)
  • Products Reviewed (e.g., Project Plans, Requirements, Design, Code, Test docs)
  • Tests witnessed
  • Assessments performed (e.g., Safety-Criticality, Software Class, Risk, Cybersecurity)

See also Topic 8.05 - SW Failure Modes and Effects Analysis.

Current/Slated Tasks – Identify in-work and upcoming assurance activities. Identify (planned and unplanned) software assurance and oversight activities for the next reporting period. Examples are:

  • Analyses (e.g., PHAs, HAs, FMEAs, FTAs, Static Code Analysis)
  • Audits (e.g., process, product, PCAs, FCAs)
  • Product Reviews (e.g., Project Plans, Requirements, Design, Code, Test docs)
  • Test witnessing
  • Assessments (e.g., Safety-Criticality, Software Class, Risk, Cybersecurity)

Issue Tracking – Record and track software issues to follow their progression until resolution. Track issues by priority, status, safety criticality, or some other combination of criteria.

Metrics – Identify the set of SA metrics used and analyzed for the reporting period. At a minimum, collect and report on the list of SA metrics specified in the SA Standard. Include analysis results, trends, or updates and provide supportive descriptions of methodology and criteria.

Process Improvement – Suggestions and observations.

Obstacles/Watch Items – Identify and describe any obstacles, roadblocks, and watch items for the reporting period. Obstacles/Watch Items are an informal list of potential concerns.

Risk Summary – List and provide status on SA risks associated with any activities/tasks required by the SA Standard. Highlight status changes and trends.

Funding (As Applicable) – Provide funding status and trending information as needed to support the SA tasking for the project. Consider how the data/information can be used for future planning and cost estimating.

Schedule – Provide the necessary schedule information to communicate the status of the SA tasking in support of the scope and timeline established for the project.
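
Teams sometimes capture these report elements as a structured record so that reports stay consistent from period to period. The following minimal sketch shows one possible representation; the field names are hypothetical, derived from the content list above:

```python
# Minimal sketch: represent the SA status report fields above as a structured
# record. Field names are hypothetical, derived from the content list above.
from dataclasses import dataclass, field

@dataclass
class SAStatusReport:
    project_title: str
    reporting_period: str
    overall_status: str                  # e.g., "Green", "Yellow", "Red"
    contributions: list[str] = field(default_factory=list)
    current_and_slated_tasks: list[str] = field(default_factory=list)
    issues: list[dict] = field(default_factory=list)   # id, priority, status, ...
    metrics: dict[str, float] = field(default_factory=dict)
    process_improvement_notes: list[str] = field(default_factory=list)
    obstacles_watch_items: list[str] = field(default_factory=list)
    risk_summary: list[str] = field(default_factory=list)
    funding_notes: str = ""
    schedule_notes: str = ""

report = SAStatusReport(
    project_title="Example Project",
    reporting_period="2024-03",
    overall_status="Yellow",
    metrics={"open_nonconformances": 4, "closed_nonconformances": 9},
)
print(report.overall_status, report.metrics)
```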

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:
