- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
4.1.5 The project manager shall track and manage changes to the software requirements.
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
SWE-053 - Last used in rev NPR 7150.2D
Rev A: The project shall collect and manage changes to the software requirements.
Difference between A and B: changed "collect" to "track".
Rev B: The project manager shall track and manage changes to the software requirements.
Difference between B and C: no change to the requirement text.
Rev C: 4.1.5 The project manager shall track and manage changes to the software requirements.
Difference between C and D: no change.
Rev D: 4.1.5 The project manager shall track and manage changes to the software requirements.
1.3 Applicability Across Classes
Key: ✓ = Applicable | ✗ = Not Applicable
A & B = Always Safety-Critical; C & D = Sometimes Safety-Critical; E & F = Never Safety-Critical.
Requirements change management helps ensure alignment between the software requirements and the project’s software plans and software work products. Collecting, analyzing, and managing requirements changes allow a project to control those changes and measure their impact. Requirements change management can also provide early insights into the impact of those changes to the overall project's budget and schedule, including the ability to plan how to address changes rather than simply reacting when a change is made.
Figure 1 - Typical flow down of requirements 273
Changes to requirements can be an indicator of software instability or unclear description of the end product. Regardless of the reason, requirements changes often mean increased costs, longer schedules, and changes in the technical features of the software. Changes in requirements can result in rework and can affect software safety.
Figure 2 shows a typical requirements change management process flow.
Figure 2 - Software requirements management activities 273
Requirements management also includes the management of the software data dictionary content. Examples of the content contained in a software data dictionary include:
- Channelization data (e.g., bus mapping, vehicle wiring mapping, hardware channelization).
- Input/Output (I/O) variables.
- Rate group data.
- Raw and calibrated sensor data.
- Telemetry format/layout and data.
- Data recorder format/layout and data.
- Command definition (e.g., onboard, ground, test specific).
- Effector command information.
- Operational limits (e.g., maximum/minimum values, launch commit criteria information).
Keep in mind that managing changes to requirements is an ongoing process that occurs throughout the project life cycle and needs to be planned for during project planning activities.
3.1 Capture and manage the changes
Consider using configuration management tools, requirements management tools, or change request tools to capture changes to requirements. Many of these tools will keep version histories and are able to provide reports on the changes. Some of those reports may be useful to management when analyzing the impact of the requirements changes on the project.
Regardless of the capture method, projects need to collect a minimum set of data to describe the requested change. See topic 7.18 - Documentation Guidance for more information.
As part of managing requirements changes, the team needs to keep in mind the effect of "requirements creep" on software development, including its costs and complexity. Those performing impact analyses or approving/disapproving requirements changes need to carefully scrutinize requests to avoid approving enhancements not required to accomplish the project goals and objectives.
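Whatever capture tool is used, the change request record needs a consistent minimum data set. As a rough sketch, the record might look like the data structure below; the field names are illustrative assumptions, not the Handbook's mandated set (see topic 7.18 for the actual documentation guidance):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ChangeRequest:
    """Illustrative minimum data set for a requirements change request.
    Field names are assumptions, not a NASA-mandated schema."""
    cr_id: str                 # unique identifier for traceability
    requirement_id: str        # affected requirement (e.g., an SRS number)
    description: str           # what is changing and why
    originator: str            # who requested the change
    date_submitted: date
    status: str = "submitted"  # e.g., submitted/approved/rejected/implemented/closed
    rationale: str = ""        # justification; helps guard against requirements creep

cr = ChangeRequest("CR-042", "SRS-107", "Increase telemetry rate", "J. Doe",
                   date(2024, 3, 1))
```

A rationale field captured at submission time gives the review board something concrete to scrutinize when weeding out enhancements that are not required to meet project goals.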
3.2 Analyze the changes for cost, technical, and schedule impacts
Once change requests are documented, the team analyzes them for their effect on all parts of the software system as well as their effect on the overall system and project, as applicable to the level of change. Typically, changes are reviewed by a Configuration Control Board (CCB) or other review board (see SWE-082) who can request or perform an impact analysis (see SWE-080) before determining how to disposition the change.
When performing analysis of changes, look for impacts such as:
- Impact on other parts of the software system (not just the immediate design or code where the change will occur): architecture, design, interfaces, the concept of operations, higher- and lower-level requirements.
- Impact on other parts of the overall system (e.g., system requirements, interfaces, hardware).
- Safety, reliability, performance impacts.
- Skills needed (e.g., special expertise, additional developers, consultants).
- Rework effort (e.g., requirements specification, design, code, test, user manuals).
- New effort (e.g., code development, documentation, test case development, software assurance).
- Impact to stakeholders.
- Potential to introduce errors into the software.
- The criticality of the affected software.
Traceability matrices and tools are useful when determining the impact of a change, but the team needs to update the traceability information to keep it current as changes to the requirements occur (see SWE-052). Other relevant items, from NASA/SP-2007-6105, NASA Systems Engineering Handbook 273, include:
- Performance margins – "a list of key performance margins for the system and the current status of the margin... the propellant performance margin will provide the necessary propellant available versus the propellant necessary to complete the mission."
- Configuration management top evaluators list – "appropriate persons [to evaluate] the changes and providing impacts to the change... changes need to be routed to the appropriate individuals to ensure that the change has had all impacts identified."
- Risk system and threats list – "used to identify risks to the project and the cost, schedule, and technical aspects of the risk...A threat list is normally used to identify the costs associated with all the risks for the project."
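The traceability-based part of an impact analysis amounts to walking the trace links downstream from the changed requirement. A minimal sketch, with hypothetical artifact identifiers, might look like this:

```python
from collections import deque

# Hypothetical trace links: each item maps to the artifacts derived from it.
trace_links = {
    "SYS-12":  ["SRS-107"],           # system requirement -> software requirement
    "SRS-107": ["DES-31", "TC-88"],   # software requirement -> design element, test case
    "DES-31":  ["CODE-m7"],           # design element -> code unit
}

def impacted_artifacts(changed_item, links):
    """Breadth-first walk of trace links to list everything downstream of a
    changed requirement -- a sketch of traceability-based impact analysis."""
    seen, queue = set(), deque([changed_item])
    while queue:
        item = queue.popleft()
        for child in links.get(item, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

affected = impacted_artifacts("SRS-107", trace_links)
```

This only works if the trace data is current, which is why SWE-052 requires updating the traceability information as requirements change.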
The team uses the results of the impact assessment to determine the effect the change has on the project in terms of:
- Cost including personnel and equipment costs.
- Schedule including new or revised design, development, testing, and documentation effort.
- Technical impacts on the function of the software and overall system.
3.3 Document analysis results
Document the results of the impact analysis and any decisions based on that analysis, and maintain them in the project records. Communicate the results to the appropriate stakeholders: project personnel who must implement approved changes, update documentation related to those changes, and test those changes; system-level personnel who must coordinate changes in response to the requirements change; and those affected when a requested change is not approved for implementation, including stakeholders outside the development team.
3.4 Other tasks
In addition to the activities noted above, managing changes to requirements may also include the following:
- Ensuring that relevant project plans are updated to reflect approved requirements changes.
- Ensuring change requests are created, assigned, and completed for work products affected by approved requirements changes, e.g., design documents, user manuals, interface specifications.
- Ensuring changed requirements and all related changed work products are verified and validated.
- Ensuring traceability documents are updated to reflect the requirements change.
- Ensuring that the rest of the project uses only the latest, updated project documents that reflect the requirements change.
NASA users should consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to the management of requirements changes.
SPAN - Software Processes Across NASA
SPAN contains links to all of the Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. 197
Additional guidance related to the management of requirements changes may be found in the following related requirement in this Handbook:
4. Small Projects
For projects with limited budgets or staff, spreadsheet programs may be used to manage the requirements and track changes to them. When the number of requirements is large and a project can afford it, an appropriate combination of a requirements management tool, a change management tool, and a change request tool can make managing requirements changes easier for the project.
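For a small project, even a spreadsheet-style change log kept as a CSV file can support basic tracking and reporting. The sketch below uses only the Python standard library; the column names are illustrative:

```python
import csv
import io

# A spreadsheet-style change log kept as CSV (columns are illustrative).
log = io.StringIO()  # stands in for a file on disk
writer = csv.DictWriter(log, fieldnames=["cr_id", "requirement", "change", "status"])
writer.writeheader()
writer.writerow({"cr_id": "CR-001", "requirement": "SRS-010",
                 "change": "Clarify timeout value", "status": "approved"})
writer.writerow({"cr_id": "CR-002", "requirement": "SRS-014",
                 "change": "Add limit check", "status": "submitted"})

# A simple report: every change request not yet closed.
log.seek(0)
open_changes = [row for row in csv.DictReader(log) if row["status"] != "closed"]
```

The same file can be filtered by status or requirement to produce the version-history and change reports that dedicated tools generate automatically.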
5. Resources
NASA users can find applicable tools in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
The NASA Lessons Learned database contains the following lesson learned related to managing requirements change:
- Lewis Spacecraft Mission Failure Investigation Board (Indirect contributors to loss of spacecraft.) Lesson Number 0625 512: "These indirect contributors [to the loss of the spacecraft] are to be taken in the context of implementing a program in the Faster, Better, Cheaper mode: Requirement changes without adequate resource adjustment."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
7.2 Software Assurance Products
- Issues and concerns from the requirements volatility trending.
- Document change management system results.
- Software problem reporting or defect tracking system results.
Definition of objective evidence
Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:
- Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
- Meeting minutes with attendance lists, or SA meeting notes or assessments of the activities, recorded in the project repository.
- Status report, email, or memo containing statements that confirmation has been performed, with date (a checklist of confirmations could be used to record when each confirmation has been done).
- Signatures on SA reviewed or witnessed products or activities, or
- Status report, email, or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
- To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
- To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
- The specific products listed in the Introduction of 8.16 are also objective evidence, as are the examples listed above.
7.3 Metrics
- Software Requirements volatility (# of requirements added, deleted, or modified; # of TBDs over time)
- The trend of change status over time (# of changes approved, # in implementation, # in test, # closed)
- The trend of Open vs. Closed Non-Conformances over time
- # of incorrect, missing, and incomplete requirements (i.e., # of requirements issues) vs. # of requirements issues resolved
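The volatility metric in the list above can be computed directly from the change counts. One common definition (not mandated by the Handbook) expresses changed requirements as a fraction of the baseline for each reporting period:

```python
def volatility(added, deleted, modified, baseline_total):
    """Requirements volatility for one reporting period: requirements
    added, deleted, or modified, as a fraction of the baselined total.
    This formula is one common definition, not a NASA-mandated one."""
    if baseline_total == 0:
        return 0.0
    return (added + deleted + modified) / baseline_total

# Example period: 120 baselined requirements; 3 added, 1 deleted, 8 modified.
v = volatility(added=3, deleted=1, modified=8, baseline_total=120)  # 0.10
```

Computing this per reporting period yields the trend data that the volatility analysis below depends on.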
Confirm the software requirements changes (e.g., CRs) are documented, tracked, approved, and maintained throughout the project life cycle.
- Software assurance should analyze all proposed changes for impacts, looking closely at any impacts the change may have in any of the software related to safety or security. The analysis should also consider whether there will be any impacts on existing interfaces or the use of any COTS, GOTS, MOTS, or reused software in the system and whether the change will impact any future maintenance effort. Any identified risks should be brought up in the CCB meeting to discuss approval/rejection of the change.
- That the project tracks the changes
Software assurance will check to see that any changes that are submitted are properly documented and tracked through all the states of resolution (investigation, acceptance/rejection, implementation, test, closure) in the project tracking system.
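Checking that each change moves through all the states of resolution can be framed as enforcing a small state machine. The transition table below is an illustrative assumption based on the states named above, not a prescribed process:

```python
# Allowed state transitions for a change request, following the resolution
# states named above (investigation, acceptance/rejection, implementation,
# test, closure). The transition table itself is an assumption.
TRANSITIONS = {
    "submitted":      {"investigation"},
    "investigation":  {"accepted", "rejected"},
    "accepted":       {"implementation"},
    "implementation": {"test"},
    "test":           {"closed", "implementation"},  # a failed test loops back
    "rejected":       {"closed"},
}

def advance(state, new_state):
    """Move a change request to a new state, refusing illegal jumps
    (e.g., implementing a change that was never approved)."""
    if new_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state
```

Encoding the allowed transitions this way makes the audit question concrete: any change whose history contains a transition outside the table skipped a required resolution step.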
- That the changes are approved and documented before implementation
Software assurance should track the changes from their submission to their closure or rejection, confirming that all changes follow the change management process the project has established. First, the change is documented and submitted to the authorizing CCB for consideration. The authorizing CCB (which will include a software assurance person) evaluates any changes for impacts.
If the software is safety-critical, the responsible software assurance personnel will perform a software safety change analysis to evaluate whether the proposed change could invoke a hazardous state; affect a control for a hazard, condition, or state; increase the likelihood or severity of a hazardous state; adversely affect safety-critical software; or change the safety criticality of an existing software element. Keep in mind that changes to either the hardware or the software can impact the overall system's safety: while the focus here is on software changes, the software team also needs to be aware of hardware changes that may affect how the software controls, monitors, and analyzes inputs from that hardware. Hardware and software changes can alter the role of software from non-safety-critical to safety-critical or change the severity from moderate to critical.
Some other considerations for the evaluation of changes:
- Is the change an error correction or a new requirement?
- Will the change fix the problem without major changes to other areas?
- If major changes to other areas are needed, are they specified, and is this change really necessary?
- If the change is a requirements change, has the new requirement been approved?
- How much effort will be required to implement the change?
- If there is an impact on safety or reliability, are there additional changes that need to be made in those areas? Note: If there is a conflict between safety and security, safety changes have priority.
When all the impacts are considered, the CCB votes on acceptance/rejection. Software assurance is a voting member of the CCB. Software assurance verifies that the decision is recorded and is acceptable, defined as:
- When the resolution is to “accept as is”, verify that the impact of that resolution on quality, safety, reliability, and security is compatible with the Project’s risk posture and is compliant with NPR 7150.2 and other Center and Agency requirements for risk.
- When the resolution is a change to the software, verify that the change sufficiently addresses the problem without impacting quality, safety, reliability, security, or compliance with NPR 7150.2, and that it will not introduce new discrepancies or problems or exacerbate existing ones.
- In either case, the presence of other instances of the same kind of discrepancy/problem has been sought out and, if detected, addressed accordingly.
- Verify that appropriate software severity levels are assigned and maintained.
- Assure that any risk associated with the change is added to the Project/facility risk management system and is addressed, as needed, in safety, reliability, or other risk systems.
- That the implementation of the changes is complete
Software assurance will check that the implementation of approved changes matches the change request, and that any associated documentation changes are submitted, approved, and made as needed (i.e., updates to requirements, design, test plans/procedures, etc.).
- That the project tests the changes
Software assurance will check that the project tests any code that has changed and runs a set of regression tests to confirm that the change has not caused a problem elsewhere in the software system. If the software is safety-critical, a full set of regression tests should be run to ensure that there is no impact on the safety-critical functions.
3. Confirm software changes are made through the software control process
Software assurance will check that the software control process has been followed throughout the handling of the submitted change and that the status of the change is recorded and confirmed as closed.
Develop any issues and concerns from the requirements volatility trending.
Requirements volatility is the change in requirements (added, deleted, and modified) over a given time interval. The success of software projects is dependent on the quality of requirements. Requirements are the basis for planning project schedules as well as for designing and testing specifications.
Software requirements volatility is expected during the early stages of a project (conceptualize / requirements phase). It becomes a concern when it occurs after the software requirements phase (after CDR) is complete because it is likely to result in the re-work of the software components.
Even when the initial requirements are good, software requirements may change during project development. Requirements changes occurring during the software development process are known as requirements volatility.
Requirements volatility has a great impact on the cost, the schedule, and the quality of the final product. Due to requirements volatility, many projects fail and some are completed only partially. Requirements volatility cannot be overcome fully, but it can be minimized with requirements measures or metrics. Requirements metrics are useful in identifying project risks by locating errors in the requirements documents.
Generally, the software engineers will produce trend charts of the requirements volatility. If they do not, the SA personnel should take the software requirements change data and produce the volatility trend charts. In performing an analysis of the trending data, they should follow the project or SA analysis procedures. As a general guideline, the amount of volatility should be leveling off by the end of the requirements analysis phase. If there is considerable requirements volatility continuing into development, it is cause for concern. A project that has a lot of requirements volatility when it is nearing a test or delivery period has a major problem. There may be good reasons for this increased volatility (for example, a major design change in a spacecraft), but such a level of change requires careful analysis and probably replanning of activities. The earlier these trends can be identified, the better chance the project has of correcting the problem or replanning to adjust to the requirements changes.
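The "leveling off" guideline can be turned into a simple trend check: flag any reporting period after the requirements phase in which the change count is rising. The period-over-period comparison below is an illustrative heuristic, not an SA-procedure requirement:

```python
def volatility_flags(changes_per_period, post_requirements_index):
    """Flag reporting periods at or after the end of the requirements phase
    in which the change count is rising rather than leveling off.
    Simple period-over-period comparison is an illustrative heuristic."""
    flags = []
    for i in range(1, len(changes_per_period)):
        rising = changes_per_period[i] > changes_per_period[i - 1]
        if i >= post_requirements_index and rising:
            flags.append(i)
    return flags

# Monthly change counts; suppose the requirements phase ends at index 3.
counts = [14, 18, 11, 6, 4, 9]
late_spikes = volatility_flags(counts, post_requirements_index=3)  # [5]
```

A flagged late period is exactly the situation described above: it may have a good explanation (such as a major design change), but it calls for careful analysis and probably replanning.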
Requirements evolution is due to both external and internal factors.
- External Factors:
- Government regulations
- project direction and changes
- project funding
- Internal factors
- Hardware and interface constraints and unknowns
- Lack of experience of the system requirements and software requirements development team
- Feedback from milestone reviews and peer reviews
- Customer, operational, software, and hardware changes
- Hazard identifications
- Lack of process maturity in requirements generation and requirements management
- Requirement or hardware reuse
- Environmental changes or mission profile changes
- Requirements instability (the extent of fluctuation in user requirements)
- Requirements diversity (the extent to which stakeholders disagree among themselves deciding on requirements).
- Requirements analyzability (the extent to which the process of producing a requirements specification can be reduced to an objective procedure).
- Poor communication between users and the development team.
Requirements volatility contains schedule, cost, and performance risk factors. Developing and maintaining a trending of software requirements volatility measurements is critical to determine the risks associated with the project.
Requirements volatility is not the only factor contributing to the success of a project, but it can be a key indicator of project risk and success.
Several challenges lead to Requirement Volatility.
- The change request form was not always complete. A change request form may have little information about the reason for the proposed change.
- No formal impact analysis and incomplete change effort estimation.
- Traceability between requirements and other software artifacts is not established (see SWE-052).
The impact of requirement volatility on the software project has been seen as:
- Project Schedule: if one activity's schedule slips, the schedules of all subsequent activities are disturbed.
- Project Performance: project performance decreases due to the change in requirements. Requirements volatility has a high impact on the coding and maintenance phases.
- Project Cost: project cost increases due to the change in requirements.
- S/W Maintenance: maintenance cost increases due to the change in requirements.
- S/W Quality: the quality of the software decreases due to continuous changes in requirements.
Though requirements volatility has an impact on the project schedule, project performance, project cost, software maintenance, and software quality, it may also have some positive effects, such as helping the team reach a better understanding of the user requirements.