- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. The Requirement
3.1.4 The project manager shall track the actual results and performance of software activities against the software plans.
- Corrective actions are taken, recorded, and managed to closure.
- Including changes to commitments (e.g., software plans) that have been agreed to by the affected groups and individuals.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
| Class | A | B | C | D | E | F |
|-------------|---|---|---|---|---|---|
| Applicable? |   |   |   |   |   |   |

Key: ✓ = Applicable, ✗ = Not Applicable
2. Rationale
The purpose of this requirement is to determine the status of the project and to ensure that the project performs according to its plans and schedules, stays within projected budgets, and satisfies its technical objectives. This includes redirecting project activities, as appropriate, to correct identified deviations and variations from other project management or technical processes. Redirection may include replanning as appropriate. The satisfaction of commitments in this plan, as well as in subordinate software plans, helps assure that the safety, technical integrity, performance, and mission success criteria for the project will be met.
3. Guidance
The purpose of the Planning Process is to produce and communicate effective and workable project software plans. This process determines the scope of the software management and technical activities; identifies process outputs, software tasks, and deliverables; establishes schedules for conducting software tasks, including achievement criteria; and identifies the resources required to accomplish the software tasks. The software lead generally has the responsibility for periodically evaluating the cost, schedule, risk, technical performance, and content of the software work product development activity.
The guidelines for different types of software plans are contained in 7.18 – Documentation Guidance. Whenever baselined software plans (e.g., Software Development/Management Plan, Software Test Plan, Software Independent Verification and Validation (IV&V) Plan) are changed, previous commitments and agreements are likely to be impacted. Affected parties, whether they are stakeholders or other interested parties, 041 need to be solicited for their concurrence with the new plan. With concurrence comes the commitment to support the revised work plan. Without commitment, the risk arises that not all elements of the work plan will be performed or completed on time. There is also a risk that customers and stakeholders will not accept the final software work products because they no longer meet customer needs and expectations.
The project is responsible for ensuring that commitments are met throughout the project life cycle. Tracking results and performance of software activities against software plans, including managing corrective actions and changes to those plans, is the primary method for carrying out that responsibility.
Results and Performance Tracking
The planning and requirements documentation developed during the early phases of the project guides the development of software work products. The project management team and the software development lead work together to construct a work plan that is logical and achievable within the allotted time and budget. During the early phases, key performance factors, schedules, and milestones are established. As scheduled work is performed, the results need to be reviewed to assure conformance with these plans and to assess whether the expected performance has been achieved. The CMMI Institute's capability maturity model (CMMI®-DEV) considers the evaluation of these work activities to be part of its Project Monitoring and Control process. "A project's documented plan is the basis for monitoring activities, communicating status, and taking corrective action. Progress is primarily determined by comparing actual work product and task attributes, effort, cost, and schedule to the plan prescribed milestones or control levels within the project schedule or work breakdown structure (WBS)." 157
Per the Lesson Learned Acquisition and Oversight of Contracted Software Development (1999), Lesson No. 0921 528, it is important to ensure that software plans that cross contract boundaries (as well as memorandums of understanding and other agreements) are adequately tracked by the project.
The project teams can use several tools to develop insight into the progress of the work. For tracking the progress of activities against plans, the following tools and techniques can be helpful (a simple planned-vs-achieved comparison is sketched after this list):
- Charts - comparisons of planned vs. achieved values.
- Documents - statusing the document tree.
- Schedules - baselined, updates, variances.
- Reports - monthly technical, schedule, and cost narratives; performance measures.
- Project integration meetings and telephone conferences - cross-discipline evaluations.
- Test observations - unit test and integration test activities.
- Team meetings - issue (current and forecasted) and problem reporting; resolution options and tracking completion status.
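As an illustration of the first technique, the sketch below compares planned and achieved values for a few tracked activities and computes a simple cost variance. It is a minimal, hypothetical example; the record fields and sample numbers are assumptions, not a prescribed NASA format or tool.

```python
# Minimal sketch (hypothetical fields and sample data): comparing planned vs.
# achieved values for tracked software activities, as a chart or status report
# might summarize them.

from dataclasses import dataclass
from typing import List


@dataclass
class ActivityStatus:
    name: str
    planned_units: float   # e.g., planned work packages, test cases, or SLOC
    actual_units: float    # units completed to date
    planned_cost: float    # budgeted cost for work scheduled to date
    actual_cost: float     # cost incurred to date


def variance_report(activities: List[ActivityStatus]) -> None:
    """Print planned-vs-achieved completion and cost variance per activity."""
    for a in activities:
        pct_complete = 100.0 * a.actual_units / a.planned_units if a.planned_units else 0.0
        cost_variance = a.planned_cost - a.actual_cost  # negative means over budget
        print(f"{a.name}: {pct_complete:.0f}% of planned work achieved, "
              f"cost variance {cost_variance:+.1f}")


variance_report([
    ActivityStatus("Requirements analysis", 40, 38, 120.0, 131.5),
    ActivityStatus("Unit test development", 200, 150, 80.0, 72.0),
])
```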
Results and analysis of these tracking activities can serve as the basis for reviews by stakeholders and advocates (see SWE-018 - Software Activities Review).
In addition to the software lead, software assurance personnel have a responsibility for this requirement. "Specifically, reviews, audits, and evaluations may be performed to ensure adherence to and effectiveness of approved plans and procedures. Assure that problem reports, discrepancies from reviews, and test anomalies are documented, addressed, analyzed, and tracked to resolution. Assure that software products (e.g., software requirements, preliminary design, detailed design, use cases, code, models, simulators, test data, inspection results, flow diagrams) are reviewed and software quality metrics (e.g., defect metrics) are collected, analyzed, trended, and documented." 278
The software development team uses approved engineering processes to achieve specified results and performance. Reviews, audits, and tracking of the software development team's actual use of the specified processes are a function of software assurance (see SWE-022 - Software Assurance).
Corrective Actions
Often the evaluation of actual results versus expected performance reveals issues, discrepancies, or deviations that need to be corrected. Typically, these findings require further evaluation, replanning, and additional time in the schedule to correct. The software development lead must track these issues to closure to satisfy the intent of this requirement.
Tools such as Excel®-based checklists, planning and tracking tools (such as OmniPlan® and Primavera®), and/or formal configuration management systems/change control tools are used to identify, resolve, and track to closure discrepancies and other shortfalls in project performance.
It is important to understand that the activities of "identification," "recording," and "tracking to closure" are techniques that the software development engineering team uses to address and satisfy NPR 7150.2, NASA Software Engineering Requirements, requirements related to many areas of the project, such as life cycle planning (SWE-018 - Software Activities Review), requirements development and management (SWE-054 - Corrective Action for Inconsistencies), configuration management systems (SWE-080 - Track and Evaluate Changes), and the preparation of documentation to measure and record these activities (SWE-091 - Establish and Maintain Measurement Repository, Topic 7.18 – Documentation Guidance). NPR 7150.2 uses these terms repeatedly, and users of this Handbook are expected to interpret them in the context of the SWE guidance being read.
Occasionally these reviews surface a significant discrepancy between the actual and expected results of an activity. Some discrepancies are a normal part of project development activity and are resolved through the normal course of the scheduled activity. These discrepancies are typically tracked informally until the developers establish a product baseline, after which discrepancies/problems are formally tracked (usually in the Problem Report and Corrective Action (PRACA) system), which requires evaluation, disposition, and assurance oversight of the problem. The Software Development Plan or the Software Configuration Management Plan typically defines the level of discrepancy that is required to be recorded and tracked in the formal tracking systems. Typically, a Center has an approved process for PRACA activities. This requirement does not mandate a particular approach or tool as long as the key elements of a corrective action activity described in the following paragraphs are employed.
During the software development activity, once a discrepancy is found that meets the criteria for formal reporting, the software development team clearly states the issue, its area of applicability across the software development activity, and the spectrum of relevant stakeholders it involves. As this information is obtained, the issue is documented in the approved process tool or data repository, and an analysis of the discrepancy is conducted. The results of a properly completed analysis provide a clear understanding of the discrepancy and a proposed course of action to further investigate and resolve the discrepancy as necessary. 5.01 - CR-PR - Software Change Request - Problem Report provides specific details on the information needed for documenting a problem report.
Once a corrective action activity has been approved and initiated, it is reviewed regularly for progress and for its use of planned resources. This information is used to assess whether the action itself is on course or deviating from the expected result.
An important element of the corrective action activity is the proper closeout of the action. After the activity has concluded, or when the discrepancy has been narrowed to within acceptable limits, the closeout is recorded and may include the following information:
- Description of the issue/discrepancy.
- Proposed corrective action, with acceptable limits.
- Actual results/impacts from the effort.
- Listing of required changes to requirements, schedules, resources, if any, to accommodate the result.
- Signature(s)/concurrence by all relevant stakeholders.
Once the documentation has been completed, it can be entered into a suitable repository or configuration management system; a minimal record sketch follows.
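The sketch below illustrates one way such a closeout record might be captured as a structured entry before it is placed in a repository or configuration management system. It is a hypothetical example; the field names, sample values, and JSON serialization are assumptions, not the 5.01 CR-PR format or any Center's PRACA schema.

```python
# Minimal sketch (hypothetical fields and sample values) of a corrective
# action closeout record, mirroring the closeout items listed above.

import json
from dataclasses import dataclass, field, asdict
from typing import List


@dataclass
class CorrectiveActionCloseout:
    issue_description: str                 # description of the issue/discrepancy
    proposed_action: str                   # proposed corrective action, with acceptable limits
    actual_results: str                    # actual results/impacts from the effort
    required_changes: List[str] = field(default_factory=list)  # requirement, schedule, resource changes
    concurrences: List[str] = field(default_factory=list)      # relevant stakeholder sign-offs


closeout = CorrectiveActionCloseout(
    issue_description="Timing margin below acceptable limit in command processing loop",
    proposed_action="Refactor scheduler; margin of at least 15% is acceptable",
    actual_results="Margin restored to 22%; no schedule impact",
    required_changes=["Update rationale for the affected timing requirement"],
    concurrences=["Software lead", "Software assurance representative"],
)

# Serialize the record for entry into the project's repository or CM system.
print(json.dumps(asdict(closeout), indent=2))
```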
Commitment Changes
During the software development life cycle, results can deviate from expectations, and funding or workforce levels may be altered. Changes to baselined requirements (see topic 7.08 - Maturity of Life-Cycle Products at Milestone Reviews in this Handbook for the phases of the Space Flight Project life cycle at which software plans typically become baselined) become necessary based on these and other factors. It may be necessary to update planning documentation and development schedules to account for corrective action activity (see SWE-080 - Track and Evaluate Changes). Once it becomes clear that plans need to be changed and/or schedules need to be altered, the software development lead and the relevant stakeholders recommit to the revised plans. The new commitments assure the availability of resources and unified expectations regarding the revised project.
Several avenues exist for obtaining formal commitment to changes in plans. First, the change control process requires formal evaluation, agreement, and signoff by relevant parties. The software development team must involve all the relevant stakeholders and other interested parties through the exercise of its configuration management change control system (see SWE-082 - Authorizing Changes). Less formal methods may also be used, e.g., jointly written and signed memos of understanding between or among the various parties involved. Concurrence and signatures on these documents are usually sufficient to provide binding commitments. Finally, organizational policies can also provide the required formality: software development team organizational charters may require the team to automatically support any changes, subject to resource availability, because of Center, Agency, or national priorities. Even so, it is important for the project and software development teams to strive for concurrence and commitment by the customers and stakeholders to mitigate risks engendered by unhappy and dysfunctional team members.
4. Small Projects
This requirement applies to all projects regardless of size. It's not unusual for smaller and less critical projects to utilize engineering personnel to fulfill some or all of the assurance duties (rather than personnel from the Center's Safety and Mission Assurance Organization).
Smaller projects may also consider using a work tracking system or configuration management tool that provides automatic notification and tracking when updates to documentation occur and rebaselining is necessary. Several small projects have begun using wikis to create and maintain project documentation. The use of a wiki allows those working on the project to be notified of changes as they occur.
5. Resources
5.1 References
- (SWEREF-041) NPR 7123.1D, Office of the Chief Engineer, Effective Date: July 05, 2023, Expiration Date: July 05, 2028.
- (SWEREF-082) NPR 7120.5F, Office of the Chief Engineer, Effective Date: August 03, 2021, Expiration Date: August 03, 2026.
- (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-219) IEEE Std 1028, 2008. IEEE Computer Society. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-277) NASA-STD-8739.9, NASA Office of Safety and Mission Assurance, 2013. Change Date: 2016-10-07, Change Number: 1.
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, Approved 2022-09-08, superseding NASA-STD-8739.8A.
- (SWEREF-520) Public Lessons Learned Entry: 738.
- (SWEREF-528) Public Lessons Learned Entry: 921.
- (SWEREF-529) Public Lessons Learned Entry: 938.
5.2 Tools
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to tracking against plans, corrective actions, and commitment changes:
- Acquisition and Oversight of Contracted Software Development (1999), Lesson No. 0921 528: "The loss of Mars Climate Orbiter (MCO) was attributed to, among other causes, the lack of a controlled and effective process for acquisition of contractor-developed, mission-critical software. NASA Centers should develop and implement acquisition plans for contractor-developed software and this should be described in each Project Implementation Plan. These plans must provide for Software Requirements, Software Management Planning, and Acceptance Testing and assure NASA Center verification of the adequacy of the software design approach and overall contractor implementation throughout the software life cycle."
- Probable Scenario for Mars Polar Lander (MPL) Mission Loss (1998), Lesson No. 0938 529: "Description: Neither the MPL software requirements specification nor the software, subsystem or system test plans required verification of immunity to transient signals. MPL touchdown sensors generated known transient signals at leg deployment. The full leg deployment test was not repeated after wiring corrections. Tests should be re-run after test deficiencies are corrected or hardware or software is revised unless a clear rationale exists for not doing so. Hardware operational characteristics, including transients and spurious signals, must be reflected in software requirements and verified by test."
- Problem Reporting and Corrective Action System, Lesson No. 0738 520: "This Lesson Learned is based on Reliability Practice No. PD-ED-1255; from NASA Technical Memorandum 4322A, NASA Reliability Preferred Practices for Design and Test."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
3.1.4 The project manager shall track the actual results and performance of software activities against the software plans.
- Corrective actions are taken, recorded, and managed to closure.
- Including changes to commitments (e.g., software plans) that have been agreed to by the affected groups and individuals.
7.1 Tasking for Software Assurance
- Assess plans for compliance with NPR 7150.2 requirements and NASA-STD-8739.8 278, including changes to commitments.
- Confirm that closure of corrective actions is associated with the performance of software activities against the software plans, including closure rationale.
7.2 Software Assurance Products
- Assessment of SA Compliance with NASA-STD-8739.8.
- Assessment of project compliance with NASA NPR 7150.2.
- Assessment of progress and performance against plans and requirements, including corrective actions.
Objective Evidence
- Software assurance audit findings and audit results that show that the plans are being followed.
- Software plan updates.
- Evidence that closures of the corrective actions have been confirmed.
7.3 Metrics
- # of CAs raised by SA vs. total #
  - Attributes (Type, Severity, # of days Open, Life-cycle Phase Found)
  - State (Open, In work, Closed)
  - Trends of CA closures over time - trend the # of inconsistencies or corrective actions identified and the # closed (a simple trend computation is sketched after this list).
- # of software work product Non-Conformances identified by life-cycle phase over time
- # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans)
- # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
- # of Compliance Audits planned vs. # of Compliance Audits performed
- # of Open vs. Closed Audit Non-Conformances over time
- Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)
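As an illustration of the trend metrics above, the sketch below computes open vs. closed corrective action counts at successive reporting dates. The record layout and sample dates are hypothetical assumptions, not a required metrics format.

```python
# Minimal sketch (hypothetical record layout and sample data): trending open
# vs. closed corrective actions over time for a simple chart or status report.

from datetime import date

# Each record: (id, date_opened, date_closed or None if still open)
records = [
    ("CA-001", date(2024, 1, 10), date(2024, 2, 3)),
    ("CA-002", date(2024, 1, 22), None),
    ("CA-003", date(2024, 2, 14), date(2024, 3, 1)),
]


def open_vs_closed(records, as_of):
    """Count corrective actions still open and already closed as of a date."""
    opened = [r for r in records if r[1] <= as_of]
    closed = [r for r in opened if r[2] is not None and r[2] <= as_of]
    return len(opened) - len(closed), len(closed)


# Monthly trend points.
for month_end in (date(2024, 1, 31), date(2024, 2, 29), date(2024, 3, 31)):
    open_count, closed_count = open_vs_closed(records, month_end)
    print(f"{month_end:%Y-%m}: open={open_count}, closed={closed_count}")
```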
7.4 Guidance
Task 1: To perform these tasks, the software assurance personnel need to review the software management/development plans and be aware of what project activities are planned and when they should be performed. Then SA will need to confirm that the project is tracking its activities against its plan and progress, recording any issues and concerns that need corrective actions. Typically, these are discussed in schedule reviews, regular status meetings, and periodic project reviews. The SA team makes sure that all issues and planned corrective actions are recorded and managed to closure by the project. Similarly, any issues and concerns identified from the reviews of the planning materials or the status or milestone reviews should result in corrective actions.
Task 2: Corrective actions that are planned should be tracked, and the closure of all corrective actions should be confirmed, along with the accompanying rationale. It is not sufficient to simply declare that corrective actions can be closed if there is no actual evidence of their implementation. If SA identifies areas where issues or concerns need corrective action, those should also be brought to the attention of the project manager, documented, and tracked to closure. Upfront planning should be followed throughout project development, and any changes in personnel responsibilities, project commitments, software classification, or criticality of the software need to be discussed and reported at the status reviews and updated in the planning. Software Assurance should assure that any project documentation is updated to reflect these changes and check that the project is now tracking against the revised documentation. Software Assurance tracks the open and closed action items and develops a trend over time showing the total number of action items identified and the number of closed action items at each time increment. This trend should be analyzed to determine whether corrective actions are being identified and closed as expected.