Return to 8.16 - SA Products
This topic provides a definition of "objective evidence" with some examples and contains a listing of all the tasks in NASA-STD-8739.8 where "objective evidence" may be the only product. The list is formatted as a checklist, which can be used to record the type of evidence available from performing each of these tasks and the location where the evidence is stored. Items listed as major products or sub-products are also objective evidence.
2. Definition of Objective Evidence
Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence confirming the activity can take many different forms, depending on the activity in the task. Examples are:
- Observations, findings, issues, or risks found by the SA/safety person, which may be recorded in an audit or checklist record, an email, a memo, or an entry in a tracking system (e.g., a risk log).
- Meeting minutes with attendance lists, SA meeting notes, or assessments of the activities, recorded in the project repository.
- A status report, email, or memo stating that the confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation was done).
- Signatures on products or activities that SA reviewed or witnessed, or
- A status report, email, or memo containing a short summary of the information gained by performing the activity. Some examples of using a short summary as objective evidence of a confirmation are:
- To confirm that "the IV&V Project Execution Plan exists," the summary might be: "The IV&V Plan is in draft state. It is expected to be complete by (some date)."
- To confirm that "traceability between software requirements and hazards with software contributions exists," the summary might be: "x% of the hazards with software contributions are traced to the requirements."
- In addition to the examples listed above, the specific products listed in the Introduction of 8.16 are also objective evidence.
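As an illustration of how a "short summary" statistic such as the traceability percentage might be produced, here is a minimal Python sketch. The hazard and requirement IDs are invented for the example; this is not an excerpt from any NASA tool or repository.

```python
# Illustrative sketch only: computes the kind of "short summary"
# statistic described above -- the percentage of hazards with software
# contributions that trace to at least one software requirement.
# All hazard (HR-*) and requirement (SRS-*) IDs are hypothetical.

def hazard_trace_percentage(hazards_to_reqs: dict) -> float:
    """hazards_to_reqs maps each hazard ID to the set of software
    requirement IDs it traces to (an empty set means untraced)."""
    if not hazards_to_reqs:
        return 0.0
    traced = sum(1 for reqs in hazards_to_reqs.values() if reqs)
    return 100.0 * traced / len(hazards_to_reqs)

trace = {
    "HR-001": {"SRS-101", "SRS-102"},
    "HR-002": {"SRS-205"},
    "HR-003": set(),           # not yet traced -- flag for follow-up
    "HR-004": {"SRS-310"},
}
print(f"{hazard_trace_percentage(trace):.0f}% of the hazards with "
      "software contributions are traced to the requirements")
```

The untraced hazards (here, "HR-003") are exactly the items a software assurance person would record as findings alongside the summary percentage.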
3.1 PAT-033 - TASKS NEEDING OBJECTIVE EVIDENCE
Tasks Needing Objective Evidence
Note that some of these tasks may have been tailored out, and a single piece of objective evidence may apply to several tasks.
Have Evidence (Y/N) / Notes
1. Confirm that the options for software acquisition versus development have been evaluated.
2. Confirm the flow down of applicable software engineering, software assurance, and software safety requirements on all acquisition activities. (NPR 7150.2 and NASA-STD-8739.8).
1. Confirm that all plans, including security plans, are in place and have expected content for the life cycle events, with proper tailoring for the classification of the software.
2. Confirm the closure of corrective actions associated with the performance of software activities against the software plans, including the closure rationale.
3. Confirm changes to commitments are recorded and managed.
1. Confirm the following are approved, implemented, and updated per requirements:
a. Software processes, including software assurance, software safety, and IV&V processes,
b. Software documentation plans,
c. List of developed electronic products, deliverables, and
d. List of tasks required or needed for the project’s
2. Confirm that any required government actions are established and performed upon receipt of deliverables (e.g., approvals, reviews).
1. Confirm that milestones for reviewing and auditing software developer progress are defined and documented.
1. Confirm that software developer(s) periodically report status and provide insight to the project manager.
2. Monitor product integration.
8. Confirm that the project manager provides responses to software assurance and software safety submitted issues, findings, and risks and that the project manager tracks software assurance and software safety issues, findings, and risks to closure.
1. Confirm that software artifacts are available in electronic format to NASA.
1. Confirm that software developers provide NASA with electronic access to the source code generated for the project in a modifiable form.
1. Confirm that any requirement tailoring in the Requirements Mapping Matrix has the required approvals.
1. Confirm that the project maintains a requirements mapping matrix (matrices) for all requirements in NPR 7150.2.
1. Confirm that the conditions listed in "a" through "f" are complete for any COTS, GOTS, MOTS, OSS, or reused software that is acquired or used.
a. The requirements to be met by the software component are identified.
1. Confirm that the required number of software cost estimates are complete and include software assurance cost estimate(s) for the project, including a cost estimate associated with handling safety-critical software and safety-critical data.
1. Confirm that all the software planning parameters, including size and effort estimates, milestones, and characteristics, are submitted to a Center repository.
2. Confirm that all software assurance and software safety software estimates and planning parameters are submitted to an organizational repository.
1. Confirm the generation and distribution of periodic reports on software schedule activities, metrics, and status, including reports of software assurance and software safety schedule activities, metrics, and status.
2. Confirm closure of any project software schedule issues.
1. Confirm the project's schedules, including the software assurance’s/software safety’s schedules, are updated.
1. Confirm that any project-specific software training has been planned, tracked, and completed for project personnel, including software assurance and software safety personnel.
2. Confirm that software assurance and software safety personnel have completed the appropriate software assurance and/or software safety training to satisfactorily conduct assurance and safety activities.
1. Confirm that records of the software Requirements Mapping Matrix and each software classification are maintained and updated for the life of the project.
1. Confirm that IV&V requirements (section 4.4) are complete on projects required to have IV&V.
1. Confirm that the IV&V Project Execution Plan (IPEP) exists.
1. Confirm that IV&V has access to the software development artifacts, products, source code, and data required to perform the IV&V analysis efficiently and effectively.
1. Confirm that the project manager responds to IV&V submitted issues, findings, and risks and that the project manager tracks IV&V issues, findings, and risks to closure.
1. Confirm that the hazard reports or safety data packages contain all known software contributions or events where software, either by its action, inaction, or incorrect action, leads to a hazard.
4. Confirm that the traceability between software requirements and hazards with software contributions exists.
1. Confirm that the identified safety-critical software components and data have implemented the safety-critical software assurance requirements listed in this standard.
3. Confirm that the values of the safety-critical loaded data, uplinked data, rules, and scripts that affect hazardous system behavior have been tested.
6. Ensure the SWE-134 implementation supports and is consistent with the system hazard analysis.
1. Confirm that 100% code test coverage is addressed for all identified safety-critical software components or that software developers provide a technically acceptable rationale or a risk assessment explaining why the test coverage is not possible or why the risk does not justify the cost of increasing coverage for the safety-critical code component.
2. Confirm that all identified safety-critical software components have a cyclomatic complexity value of 15 or lower. If not, assure that software developers provide a technically acceptable risk assessment, accepted by the proper technical authority, explaining why the cyclomatic complexity value needs to be higher than 15 and why the software component cannot be structured to be lower than 15 or why the cost and risk of reducing the complexity to below 15 are not justified by the risk inherent in modifying the software component.
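The cyclomatic complexity check above can be spot-checked with a small script. The sketch below is a simplified approximation of McCabe's metric for Python sources (one base path per function plus one per decision point); it is illustrative only and is not a substitute for a qualified static analysis tool.

```python
import ast

# Simplified approximation of McCabe cyclomatic complexity for Python
# sources. Nested functions are counted toward the enclosing function;
# a real static analysis tool would be more precise.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.AsyncFor,
                  ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(func_node: ast.AST) -> int:
    complexity = 1  # the single path through a branch-free function
    for node in ast.walk(func_node):
        if isinstance(node, ast.BoolOp):
            # each additional and/or operand adds one branch
            complexity += len(node.values) - 1
        elif isinstance(node, DECISION_NODES):
            complexity += 1
    return complexity

def functions_over_threshold(source: str, threshold: int = 15):
    """Return (name, complexity) pairs for functions above the threshold."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            cc = cyclomatic_complexity(node)
            if cc > threshold:
                flagged.append((node.name, cc))
    return flagged
```

Any function the script flags above the threshold of 15 would need the technically acceptable risk assessment described in the task.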
1. Confirm that NASA, engineering, project, software assurance, and IV&V have electronic access to the models, simulations, and associated data used as inputs for auto-generation of software.
1. Confirm that Class A and B software is acquired, developed, and maintained by an organization with a non-expired CMMI-DEV rating, per the NPR 7150.2 requirement.
1. Confirm that the project has considered reusability for its software development activities.
1. Confirm that any project software contributed as a reuse candidate has the identified information in items “a” through “f.”
a. Software Title.
1. Confirm the project has performed a software cybersecurity assessment on the software components per the Agency security policies and the project requirements, including risks posed by the use of COTS, GOTS, MOTS, OSS, or reused software components.
1. Confirm that cybersecurity risks, along with their mitigations, are identified and managed.
1. For software products with communications capabilities, confirm that the software requirements, software design documentation, and software implementation address unauthorized access per the requirements contained in the Space System Protection Standard, NASA-STD-1006.
1. Confirm that testing is complete for the cybersecurity mitigation.
1. Confirm that the software requirements exist for collecting, reporting, and storing data relating to the detection of adversarial actions.
1. Confirm that bi-directional traceability has been completed, recorded, and maintained.
2. Confirm that the software traceability includes traceability to any hazard that includes software.
1. Confirm that all software requirements are established, captured, and documented as part of the technical specification, including requirements for COTS, GOTS, MOTS, OSS, or reused software components.
1. Confirm the software requirements changes are documented, tracked, approved, and maintained throughout the project life cycle.
1. Monitor identified differences among requirements, project plans, and software products and confirm differences are addressed and corrective actions are tracked until closure.
1. Confirm that the project software testing has shown that software will function as expected in the customer environment.
4. Confirm that the software design implements all of the required safety-critical functions and requirements.
1. Confirm that the software code implements the software designs.
2. Confirm that the code does not contain functionality not defined in the design or requirements.
1. Assure the project manager selected and/or defined software coding methods, standards, and criteria.
2. Confirm the static analysis tool(s) are used with checkers to identify security and coding errors and defects.
4. Confirm that the software code has been scanned for security defects and confirm the result.
7. Confirm that Software Quality Objectives or software quality threshold levels are defined and set for static code analysis defects, checks, or software security objectives.
1. Confirm that the project successfully executes the required unit tests, particularly those testing safety-critical functions.
2. Confirm that the project addresses or otherwise tracks to closure errors, defects, or problem reports found during unit testing.
1. Confirm that the project maintains the procedures, scripts, results, and data needed to repeat the unit testing (e.g., as-run scripts, test procedures, results).
1. Confirm that the project creates a correct software version description for each software release.
2. For each software release, confirm that the software has been scanned for security defects and coding standard compliance and confirm the results.
1. Confirm that the software tool(s) needed to create and maintain software are validated and accredited.
1. Confirm that software test plans have been established, contain correct content, and are maintained.
2. Confirm that the software test plan addresses the verification of safety-critical software, specifically the off-nominal scenarios.
1. Confirm that the test procedures have been established and are updated when changes to tests or requirements occur.
1. Confirm that the project creates and maintains any code specifically written to perform test procedures in a software configuration management system.
2. Confirm that the project records all issues and discrepancies in the code specifically written to perform test procedures.
3. Confirm that the project tracks to closure errors and defects found in the code specifically written to perform test procedures.
1. Confirm that the project creates and maintains the test reports throughout software integration and test.
2. Confirm that the project records the test report data and that the data contains the as-run test data, the test results, and required approvals.
3. Confirm that the project records all issues and discrepancies found during each test.
4. Confirm that the project tracks to closure errors and defects found during testing.
1. Confirm test coverage of the requirements through the execution of the test procedures.
3. Confirm that any newly identified software contributions to hazards, events, or conditions found during testing are in the system safety data package.
1. Confirm that software items to be tested are under configuration management before the start of testing.
2. Confirm the project maintains the software items under configuration management through the completion of testing.
1. Confirm that test results are assessed and recorded.
2. Confirm that the project documents software non-conformances in a tracking system.
3. Confirm that test results are sufficient verification artifacts for the hazard reports.
1. Confirm that the software models, simulations, and analysis tools used to achieve the qualification of flight software or flight equipment have been validated and accredited.
1. Confirm that the project validates the software components on the targeted platform or a high-fidelity simulation.
1. Confirm that code coverage measurements have been selected, performed, tracked, recorded, and communicated with each release.
1. Confirm that the project performs code coverage analysis using the results of the tests or a code coverage tool.
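Once a coverage tool has reported which executable lines the tests exercised, the code coverage analysis above reduces to a simple calculation. The following is a minimal sketch with invented line numbers, not the output of any particular coverage tool.

```python
# Illustrative sketch only: computes the statement coverage percentage
# from the set of executable lines in a file and the set of lines a
# test run actually hit (both as reported by a coverage tool).

def statement_coverage(executable: set, executed: set) -> float:
    """Percentage of executable lines exercised by the tests."""
    if not executable:
        return 100.0  # nothing executable means nothing left uncovered
    hit = executable & executed   # only count lines that are executable
    return 100.0 * len(hit) / len(executable)

executable_lines = {3, 4, 5, 8, 9, 12}   # invented example data
executed_lines = {3, 4, 5, 8}
print(f"coverage: {statement_coverage(executable_lines, executed_lines):.1f}%")
# the uncovered lines are the ones to investigate or justify:
print(sorted(executable_lines - executed_lines))
```

For safety-critical components, the uncovered lines are exactly what the earlier 100% code test coverage task requires the project to either cover or justify with a rationale or risk assessment.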
1. Confirm that the project plans regression testing and that the regression testing is adequate and includes retesting of all safety-critical code components.
2. Confirm that the project performs the planned regression testing.
4. Confirm that the regression test procedures are updated to incorporate tests that validate the correction of critical anomalies.
1. Through testing, confirm that the project verifies the software requirements which trace to a hazardous event, cause, or mitigation techniques.
1. Confirm that the project develops acceptance tests for loaded or uplinked data, rules, and code that affect software and software system behavior.
2. Confirm that the loaded or uplinked data, rules, scripts, or code that affect software and software system behavior are baselined in the software configuration system.
3. Confirm that loaded or uplinked data, rules, and scripts are verified as correct prior to operations, particularly for safety-critical operations.
1. Confirm that the project is testing COTS, GOTS, MOTS, OSS, or reused software components to the same level as developed software for its intended use.
2. Confirm that the project implements software operations, software maintenance, and software retirement plans.
1. Confirm that the correct version of the products is delivered, including as-built documentation and project records.
1. Confirm that the project has identified the software requirements to be met, the approved changes to be implemented, and defects to be resolved for each delivery.
2. Confirm that the project has met all software requirements identified for delivery.
3. Confirm that requirements once planned for delivery but no longer appearing in the delivery documentation have been dispositioned.
4. Confirm that approved changes have been implemented and tested.
5. Confirm that the approved changes to be implemented have been implemented and that the defects to be resolved have been resolved.
1. Confirm that the project has identified the records and software tools for archival.
2. Confirm that the project archives all software and records selected for archival, as planned.
2. Confirm the following:
a. The project tracks the changes.
b. The changes are approved and documented before implementation.
c. The implementation of changes is complete.
d. The project tests the changes.
3. Confirm software changes follow the software change control process.
1. Confirm that the project has identified the configuration items and their versions to be controlled.
1. Confirm that software assurance participates in software control activities.
1. Confirm that the project maintains records of the configuration status of the configuration items.
1. Confirm that the project manager performed software configuration audits to determine the correct version of the software configuration items and verify that the results of the audit conform to the records that define them.
1. Confirm that the project establishes procedures for storage, processing, distribution, release, and support of deliverable software products.
1. Confirm and assess that a risk management process includes recording, analyzing, planning, tracking, controlling, and communicating all software risks and mitigation plans.
1. Confirm that software peer reviews are performed and reported on for project activities.
2. Confirm that the project addresses the accepted software peer review findings.
4. Confirm that the source code satisfies the conditions in the NPR 7150.2 requirement SWE-134, "a" through "l," based upon the software functionality for the applicable safety-critical requirements at each code inspection/review.
1. Confirm that the project meets the NPR 7150.2 criteria in "a" through "d" for each software peer review.
2. Confirm that the project resolves the actions identified from the software peer reviews.
1. Confirm that the project records the software peer reviews and results of software inspection measurements.
1. Confirm that a measurement program establishes, records, maintains, reports, and uses software assurance, management, and technical measures.
1. Confirm software measurement data analysis conforms to documented analysis procedures.
1. Confirm that access to software measurement data, analysis, and status is provided, as requested, to the following entities, at a minimum:
- Sponsoring Mission Directorate
- NASA Chief Engineer
- Center Technical Authorities
- Headquarters SMA
1. Confirm that the project monitors and updates planned measurements to ensure the software meets or exceeds performance and functionality requirements, including satisfying constraints.
2. Monitor and track any performance or functionality requirements that are not being met or are at risk of not being met.
1. Confirm that the project collects, tracks, and reports on the software volatility metrics.
1. Confirm that all software non-conformances are recorded and tracked to resolution.
2. Confirm that accepted non-conformances include the rationale for the non-conformance.
1. Confirm that all software non-conformance severity levels are defined.
3. Confirm that the project assigns severity levels to non-conformances associated with tools, COTS, GOTS, MOTS, OSS, and reused software components.
1. Confirm the evaluations of reported non-conformances for all COTS, GOTS, MOTS, OSS, or reused software components are occurring throughout the project life cycle.
1. Perform or confirm that a root cause analysis has been completed on all identified high severity software non-conformances, and that the results are recorded and have been assessed for adequacy.
2. Confirm that the project analyzed the processes identified in the root cause analysis associated with the high severity software non-conformances.
4. Perform or confirm tracking of corrective actions to closure on high severity software non-conformances.
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.