- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
5.1.3 The project manager shall track and evaluate changes to software products.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
[Applicability table: classes A through F; the per-class Applicable / Not Applicable markers are icons in the source table and are not reproduced here.]
2. Rationale
As software teams design, develop, and deploy software, it is common for multiple versions of the same software to be in use at different sites and for developers to be working simultaneously on updates. Bugs or defects are often present only in certain versions (because fixing some problems introduces others as the program develops). Therefore, to locate and fix bugs, it is important to be able to retrieve and run different versions of the software to determine in which version(s) the problem occurs. It may also be necessary to develop two versions of the software concurrently (for instance, one version with bugs fixed but no new features, while new features are developed in the other).
Change requests address not only new or changed requirements but also failures and defects in software products. Change requests are analyzed to determine the impact that the change will have on the software product, related software products, the budget, and the schedule. Tracking and evaluating changes is useful for a variety of reasons, not the least of which is maintaining documented descriptions of problems, issues, faults, etc., their impact on the software and system, and their related resolutions. Evaluating changes allows key stakeholders to determine the cost-benefit of implementing changes and to make decisions based on that information.
3. Guidance
Tracking and evaluating changes occurs throughout the project life cycle and applies to all software providers, internal and subcontracted.
The NASA Systems Engineering Handbook (SWEREF-273), NASA/SP-2016-6105, Rev 2, provides a flowchart of a "typical" change control process. The flowchart highlights key activities and roles for capturing and tracking changes that are appropriate considerations for any project establishing a new change control process. Several of these steps are addressed in other guidance in this Handbook (see Additional Guidance at the end of this section), including Configuration Status Accounting (CSA).
Guidance for key elements of this flowchart is provided below: preparing the change request, evaluating the change, and tracking the request through the change control process.
Considerations for capturing the change
- Changes can be requested for baselined software products, including specifications, requirements, design, code, databases, test plans, user documentation, and training materials. Candidate changes include:
  - Discrepancies.
  - Problems or failures.
  - Reconfiguration changes, including routine changes, to operational software.
  - Changes related to upgrades.
  - Enhancement requests.
Capturing the requested change usually involves completing a predefined change request form or problem report and may require access to a change tracking system. A problem reporting and corrective action (PRACA) system is also an option for capturing changes, particularly after the software is operational (NASA-GB-8719.13, NASA Software Safety Guidebook, SWEREF-276).
Depending on system access and project procedures, requests may be entered by developers, testers, end-users, help desk personnel, etc. See 5.01 - CR-PR - Software Change Request - Problem Report in this Handbook for guidance on change requests and problem reports. Consider the following suggestions for the change capture process (an illustrative change request record is sketched after this list):
- Require a separate change request or problem report for each change.
- Use a form/format that guides the writer through capturing all the key information needed to describe the issue and process the request.
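For illustration, the sketch below shows the kind of fields such a form might capture as a record in a change tracking system. This is a minimal sketch in Python; the `ChangeRequest` type and its field names are assumptions for illustration, not a NASA-prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical change request record; field names are illustrative only,
# chosen to mirror the capture guidance above (one request per change,
# with enough information to evaluate and process it).
@dataclass
class ChangeRequest:
    cr_id: str                    # unique identifier; one request per change
    title: str                    # short summary of the issue
    description: str              # what is wrong or what is requested
    change_type: str              # discrepancy, failure, upgrade, enhancement
    affected_products: list[str]  # requirements, design, code, tests, docs
    affected_versions: list[str]  # software versions where the issue appears
    originator: str               # developer, tester, end user, help desk
    date_submitted: date
    safety_related: bool = False  # flags the request for software safety review
    status: str = "submitted"     # tracked through disposition and closure
    resolution: Optional[str] = None
```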
Considerations for evaluating the change and suggested solution
- Project impact analysis:
  - Include the appropriate set of stakeholders, such as procurement, software assurance, risk management, relevant experts, and management (e.g., a change requested by a high-visibility customer may result in a business decision to implement it even when large numbers of end users never see the problem).
  - Evaluate the impact on schedule and cost, including making and testing the change and regression testing the software.
  - Evaluate the impact on other groups and resources, as applicable.
  - Evaluate the impact on functions and features, interfaces, and system resource requirements.
  - Evaluate the impact on other baseline products, such as design, tests, and documentation (traceability matrices are helpful here).
  - Evaluate the risk of making the change vs. not making it.
  - Evaluate the size, complexity, and criticality of the change.
  - Evaluate whether the change request is within the scope of the project.
  - Evaluate whether the change request is needed to meet project requirements and/or goals.
  - Evaluate the impact on performance, reliability, quality, etc.
  - Evaluate alternatives to making the change.
- Software safety impact analysis:
  - Include software assurance and software safety personnel in this review.
  - Look for the potential creation of new hazard contributions and impacts.
  - Look for potential modification of existing hazard controls or mitigations.
  - Look for a detrimental effect on safety-critical software or hardware.
  - Determine the effect on software safety.
  - Determine the effect on system safety.
- Capture evaluation/analysis results and related decisions, including action items.
Impact analysis, including impact on the safety of the system, may be performed by a change control board (CCB) or by experts it designates. See SWE-082 - Authorizing Changes for additional guidance on impact analysis as it relates to authorizing changes.
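As a hedged illustration of how such an impact analysis might be recorded so the CCB sees every dimension before dispositioning a change, the sketch below defines an assessment record covering the considerations listed above. The `ImpactAssessment` structure and the routing rule are assumptions, not a prescribed method.

```python
from dataclasses import dataclass

# Illustrative only: one record per evaluated change, capturing the
# impact dimensions listed above so the CCB decision is documented.
@dataclass
class ImpactAssessment:
    cr_id: str
    schedule_cost_impact: str              # effort to make, test, and regression-test
    baseline_products_affected: list[str]  # design, tests, documentation
    safety_impact: bool                    # new/changed hazard contributions or controls
    security_impact: bool
    in_scope: bool                         # within the scope of the project
    risk_of_change: str                    # risk of making vs. not making the change
    alternatives_considered: list[str]
    recommendation: str                    # e.g., "approve", "defer", "disapprove"

def needs_safety_review(assessment: ImpactAssessment) -> bool:
    """Assumed project rule: route any safety-impacting change to software
    assurance and software safety personnel before the CCB disposition."""
    return assessment.safety_impact
```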
Considerations for tracking the change
- Use a change control system that is compatible with the project environment and capable of tracking each change through completion.
- Trace safety-critical problems back to the related system-level hazard.
- Include in the tracking records the actual change request/problem reports, impact analysis, notes from evaluation/approval boards and meetings, etc.
- Track the software products and versions changed as part of implementing the change (requirements, code, specifications, etc.)
- Close change requests only after verification and approval of the implemented change and all associated documentation revisions.
Tracking a change through its disposition (approve, defer, disapprove, etc.) is made easier if the tracking can be done as part of the same system used to capture the change request/problem report. Once disposition decisions are made, the relevant stakeholders are informed of the decisions.
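One way to make that tracking auditable is to constrain status transitions in the tracking system so a request cannot, for example, be closed without passing through implementation and test. The states and transitions below are an assumed example workflow sketched in Python, not a mandated process.

```python
# Assumed example workflow; real projects define their own states in
# their change control process.
ALLOWED_TRANSITIONS = {
    "submitted":         {"in_evaluation"},
    "in_evaluation":     {"approved", "deferred", "disapproved"},
    "approved":          {"in_implementation"},
    "in_implementation": {"in_test"},
    "in_test":           {"closed", "in_implementation"},  # reopen if tests fail
    "deferred":          {"in_evaluation"},
    "disapproved":       set(),  # terminal disposition
    "closed":            set(),  # terminal: change verified, documentation updated
}

def transition(current: str, new: str) -> str:
    """Reject any status change the workflow does not allow."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new

status = transition("submitted", "in_evaluation")  # ok
# transition("submitted", "closed") would raise ValueError
```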
When tracking and evaluating changes to software products, also consider this activity as part of data management activities. A basic description of data management is provided in SWE-079 - Develop CM Plan.
The current status of changes is presented at appropriate reviews, including project life-cycle reviews. Historical trends and details on open changes are also candidates for presentation at these reviews.
Additional Guidance
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to methods, tools, and procedures for tracking and evaluating changes.
NASA-specific configuration management planning information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to tracking and evaluating software changes may be found in related requirements in this Handbook, such as SWE-079 - Develop CM Plan and SWE-082 - Authorizing Changes, both referenced above.
4. Small Projects
Projects with limited budgets or personnel can reduce the overhead of tracking and evaluating changes, collecting metrics, etc., by using automated change request tools. Using existing tools can reduce purchase and setup costs for the project, and if the tools are familiar to team personnel, training and start-up costs may also be minimized. Some automated tools have multiple capabilities that allow the team to perform multiple change tracking and evaluation activities with a single tool.
Additionally, small team size may be conducive to less formal evaluation methods, such as incorporating impact analysis into team meetings rather than holding separate meetings or assigning separate tasks with formal reports due to an evaluation board. Even though small projects may use less formal methods of tracking and evaluating changes, it is still very important to have a record of the changes and associated decisions so the team can have confidence in the final products.
5. Resources
5.1 References
- (SWEREF-001) Software Development Process Description Document, EI32-OI-001, Revision R, Flight and Ground Software Division, Marshall Space Flight Center (MSFC), 2010. See Chapter 13. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-011) Change Request Log Template, NASA Goddard Space Flight Center, 2015. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-212) IEEE Computer Society, IEEE Std 1042-1987, IEEE Guide to Software Configuration Management, 1987. This link requires an account on the NASA START (AGCY NTSS) system (https://standards.nasa.gov). Once logged in, users can access Standards Organizations, IEEE, and then search to get to authorized copies of IEEE standards.
- (SWEREF-216) IEEE Std 828-2012, IEEE Standard for Configuration Management in Systems and Software Engineering, 2012. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-271) NASA-STD-8719.13, Rev C, NASA Software Safety Standard, 2013-05-07.
- (SWEREF-273) NASA/SP-2016-6105, Rev 2, NASA Systems Engineering Handbook.
- (SWEREF-276) NASA-GB-8719.13, NASA, 2004. Access NASA-GB-8719.13 directly: https://swehb.nasa.gov/download/attachments/16450020/nasa-gb-871913.pdf?api=v2
- (SWEREF-343) This NASA-specific information and resource is available at the System for Administration, Training, and Educational Resources for NASA (SATERN), accessible to NASA users at https://saterninfo.nasa.gov/.
- (SWEREF-431) Software Program Managers Network (SPMN), Lessons Learned Reference, 2012.
- (SWEREF-520) Public Lessons Learned Entry: 738.
- (SWEREF-576) Public Lessons Learned Entry: 3377.
5.2 Tools
NASA users can find tools for tracking and evaluating changes in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
Documented lessons from the NASA Lessons Learned database, including one from the Space Shuttle program, note the following directly related to tracking and evaluating software changes:
- Problem Reporting and Corrective Action System. Lesson Number 0738 (SWEREF-520): "The information provided by Problem Reporting and Corrective Action System (PRACAS) allows areas in possible need of improvement to be highlighted to engineering for the development of corrective action if deemed necessary. With this system in place in the early phases of a program, means are provided for the early elimination of the causes of failures. This contributes to reliability growth and customer satisfaction. The system also allows trending data to be collected for systems that are in place. Trend analysis may show areas in need of design or operational changes."
- Software Requirements Management. Lesson Number 3377 (SWEREF-576): "The ability to manage and trace software requirements is critical to achieving success in any software project and to produce software products in a cost-effective and timely fashion. Conversely, incomplete, incorrect, or changing software requirements result in cost and schedule impacts that increase the later they occur or are discovered in the software life cycle. Current software technology, processes, and tools provide innovative, automated methods to facilitate optimum management of software requirements."
6.2 Other Lessons Learned
Additionally, the Software Program Managers Network (SWEREF-431) documents the following relevant lesson as one of its configuration management lessons learned following "visits with many different software-intensive development programs in all three Services. It describes problems uncovered ... on several Department of Defense (DoD) software-intensive programs."
- "The change control board's structure used by software projects is often overly cumbersome. It does not adequately assess, before authorizing a change, the impacts of a proposed change or the risk and cost of making these changes."
7. Software Assurance
7.1 Tasking for Software Assurance
1. Analyze proposed software and hardware changes to software products for impacts, particularly to safety and security.
2. Confirm:
a. that the project tracks the changes,
b. that the changes are approved and documented before implementation,
c. that the implementation of changes is complete, and
d. that the project tests the changes.
3. Confirm software changes follow the software change control process.
7.2 Software Assurance Products
- Software Design Analysis
- Source Code Analysis
- Verification Activities Analysis
- Software assurance impact analysis on changes, including risks and issues.
Objective Evidence
- Software Problem reporting or defect tracking data
- Software configuration management system data
- Software assurance audit results of the change management processes.
7.3 Metrics
- # of software process Non-Conformances by life-cycle phase over time.
- The trend of change status over time (# of changes approved, # in implementation, # in test, # closed); a computation sketch follows this list.
- # of detailed software requirements tested to date vs. total # of detailed software requirements.
- # of tests successfully completed vs. total # of tests.
- # of Hazards containing software that has been successfully tested vs. total # of Hazards containing software.
- # of Requirements tested successfully vs. total # of Requirements.
- # of Non-Conformances identified during each testing phase (Open, Closed, Severity).
- # of tests executed vs. # of tests successfully completed.
- # of safety-related non-conformances identified by life-cycle phase over time.
- # of safety-related requirement issues (Open, Closed) over time.
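As an illustration of the change-status trend metric above, the sketch below counts change requests by status at successive reporting dates; the record format is an assumption about what a project's change tracking system could export, not a prescribed interface.

```python
from collections import Counter

def status_counts(records: list[tuple[str, str]]) -> Counter:
    """Count change requests in each status from (cr_id, status) pairs
    exported from the change tracking system at one reporting date."""
    return Counter(status for _, status in records)

# Hypothetical snapshots from two reporting dates show the trend over time.
march = [("CR-1", "approved"), ("CR-2", "in_test"), ("CR-3", "closed")]
april = [("CR-1", "closed"), ("CR-2", "closed"), ("CR-3", "closed"),
         ("CR-4", "in_implementation")]
print(status_counts(march))  # Counter({'approved': 1, 'in_test': 1, 'closed': 1})
print(status_counts(april))  # Counter({'closed': 3, 'in_implementation': 1})
```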
7.4 Guidance
- Software assurance should analyze all proposed changes for impacts, looking closely at any impacts the change may have on software related to safety or security. The analysis should also consider whether there will be any impacts on existing interfaces or on the use of any COTS, GOTS, MOTS, or reused software in the system, and whether the change will impact any future maintenance effort. Any identified risks should be brought up in the CCB meeting to discuss approval/rejection of the change.
- Confirm:
- That the project tracks the changes.
Software assurance checks to see that any changes submitted are properly documented and tracked through all states of resolution (including investigation, acceptance/rejection, implementation, test, closure) in the project tracking system.
- That the changes are approved and documented before implementation.
Software Assurance should track the changes from their submission to their closure or rejection, confirming that all changes follow the change management process the project has established or adopted. Initially, the change is documented and submitted to the authorizing CCB for consideration. The authorizing CCB (which will include software assurance personnel) evaluates any changes for impacts.
If the software is safety-critical, the responsible Software Assurance personnel will perform a software safety analysis to evaluate whether the proposed change could invoke a hazardous state; affect a control for a hazard, condition, or state; increase the likelihood or severity of a hazardous state; adversely affect safety-critical software; or change the safety criticality of an existing software element. Keep in mind that changes to the hardware or the software can impact the overall system’s safety; while the focus is on software changes, Software Assurance also needs to be aware of hardware changes that may affect how software controls, monitors, and analyzes inputs from that hardware. Hardware and software changes can alter the function of the software so that software previously determined to be not safety-critical is now safety-critical. It is also possible that a change in function may raise the safety risk level from moderate to critical severity.
Some other considerations for the evaluation of changes:
- Is the change an error correction or a new requirement?
- Will the change fix the problem without changes to other areas?
- If major changes to other areas are needed, are they specified, and is this change really necessary?
- If the change is a requirements change, has the new requirement been approved?
- How much effort will be required to implement the change?
- If there is an impact on safety or reliability, are there additional changes that need to be made in those areas? Note: If there is a conflict between safety and security, safety changes have priority.
When all the impacts are considered, the CCB votes on acceptance/rejection. Software Assurance is a voting member of the CCB. (See Requirement Mapping Matrix for applicability) Software Assurance verifies that the decision is recorded and is acceptable, defined as:
- When the resolution is to “accept as is”, verify that the impact of that resolution on quality, safety, reliability, and security is compatible with the Project’s risk posture and is compliant with NPR 7150.2 and other Center and Agency requirements for risk.
- When the resolution is a change to the software, verify that the change sufficiently addresses the problem and does not impact quality, safety, reliability, security, or compliance with NPR 7150.2, and that it does not introduce new discrepancies or problems or exacerbate existing ones.
- In either case, the presence of other instances of the same kind of discrepancy/problem has been sought out and, if detected, addressed accordingly.
- Verify that appropriate software severity levels are assigned and maintained. (See SWE-202 - Software Severity Levels.)
- Assure any risk associated with the change is added to the Project/facility risk management system and is addressed, as needed, in safety, reliability, or other risk systems.
- That the implementation of the changes is complete.
Software Assurance checks that the implementation of the approved changes has been coded per the change request. SA also checks that any associated documentation changes are submitted, approved, and made as needed (i.e., updates to requirements, design, test plans/procedures, etc.).
- That the project tests the changes.
Software Assurance checks that the project tests any code that has changed and runs a set of regression tests to verify that the change has not caused problems elsewhere in the software system. If the software is safety-critical, a full set of regression tests must be run to ensure that there was no impact on the safety-critical functions.
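A minimal sketch of that rule, assuming hypothetical file paths and suite names: any change touching safety-critical code triggers the full regression suite, while other changes may run a targeted subset.

```python
# Hypothetical paths marking safety-critical code; a real project would
# derive these from its safety-critical software inventory.
SAFETY_CRITICAL_PATHS = {"fsw/control/", "fsw/hazard_monitor/"}

def select_regression_suite(changed_files: list[str]) -> str:
    """Full regression is required when safety-critical code changes."""
    if any(f.startswith(p) for f in changed_files for p in SAFETY_CRITICAL_PATHS):
        return "full_regression"
    return "targeted_regression"  # changed areas plus key interfaces

print(select_regression_suite(["fsw/control/pid.c"]))  # full_regression
print(select_regression_suite(["gui/plot.py"]))        # targeted_regression
```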
3. Confirm software changes follow the software change control process.
Software Assurance checks that the software change control process has been followed throughout the handling of the submitted change and that the status of the change is recorded and confirmed as closed.