- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
5.5.2 The project manager shall define and implement clear software severity levels for all software non-conformances (including tools, COTS, GOTS, MOTS, OSS, reused software components, and applicable ground systems).
1.1 Notes
At a minimum, severity classes should include: loss of life or loss of vehicle; impact to mission success; visible to the user with operational workarounds; and an ‘other’ class for non-conformances that do not meet the previous criteria.
1.2 History
1.3 Applicability Across Classes
2. Rationale
Severity is defined as the degree of impact a defect has on the development or operation of the software being tested. The greater the effect on system functionality, the higher the severity assigned to the defect. Severity indicates the seriousness of the defect with respect to the software's functionality. These software severity levels should be defined and implemented clearly.
3. Guidance
3.1 Severity Level Definitions
Assigning software severity levels for software non-conformances allows engineers to focus their efforts on problems with the highest potential impact. Severity levels differ for each project based on the criticality of the system to NASA’s major programs and projects; the potential risk to astronauts, employees, or the public; and the magnitude of the potential loss of government investment. Other project-unique characteristics can also affect the severity levels of particular non-conformances (e.g., non-conformances that disrupt science data operations, causing missed science opportunities). Project managers should enlist software engineers, systems engineers, subsystem engineers, and other project personnel to help define severity levels for a particular project based on the project's unique characteristics. This same set of personnel should also participate, as applicable, in reviewing the severity levels assigned to non-conformances as they are discovered. See also SWE-201 - Software Non-Conformances and SWE-080 - Track and Evaluate Changes.
See also Topic 5.01 - CR-PR - Software Change Request - Problem Report for tracking and reporting.
Severity levels are particularly important for non-conformances discovered in COTS, GOTS, MOTS, OSS, and reused software components. Most COTS and OSS products, and some GOTS, MOTS, and reused software components, share lists of known defects and non-conformances. The intent of the requirement is for the user of the software to verify whether the developing organization maintains a site or list of known non-conformances or reported bugs, and to review that list to determine whether any of the entries could or do impact the software component. Most commercial products and open-source software have a site that lists well-known bugs or non-conformances. The requirement is to research this information and determine whether any of the known bugs affect the software component being used by the project. Non-conformance tracking data can be used to assess the quality of COTS, GOTS, MOTS, OSS, and reused software components and to determine the latent risk to the system inherent in using them. See also Topic 8.08 - COTS Software Safety Considerations.
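How a project reviews a published defect list will depend on the format the developing organization provides; the sketch below is a minimal, hypothetical example (record fields, identifiers, and version strings are all illustrative) of filtering a known-issue list down to the entries that could affect the component version the project actually uses.

```python
# Hypothetical known-issue records, as they might be exported from a vendor's or
# open-source project's bug tracker; field names and values are illustrative.
known_issues = [
    {"id": "LIB-101", "affected_versions": ["2.0", "2.1"], "summary": "memory leak in parser"},
    {"id": "LIB-205", "affected_versions": ["2.2"], "summary": "incorrect CRC on large files"},
]

version_in_use = "2.1"  # version of the reused component flown or used by the project

# Keep only the issues that could impact the version in use; each surviving entry
# would then be assessed and assigned a project-defined severity level.
relevant = [issue for issue in known_issues if version_in_use in issue["affected_versions"]]

for issue in relevant:
    print(f"Review {issue['id']}: {issue['summary']}")
```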
At a minimum, software non-conformance severity levels should be defined as Major (High) or Minor (Low). Additionally, levels such as Trivial or Editorial can be used for items that do not impact normal operations. More severe non-conformances can be classified as Critical or Unacceptable. As stated earlier, these severity levels are project-specific and should be defined and documented, but keeping the number of definitions to a minimum will simplify tracking and reporting metrics.
The following example of severity definitions is used by one organization at NASA:
P1 - Critical: A critical priority change request is considered to be imperative to the success of the project, and likewise, may have a detrimental impact on the project if not addressed promptly. This type of change request is mandatory and must be completed. The timeframe for estimating the effort and cost required to implement a critical change request should be one (1) week or less. Examples of critical change requests are legal mandates, functionality to meet core business process requirements, or data integrity concerning database content.
P2 - High: A high-priority change request is considered to be important to the success of the project. The timeframe for estimating the effort and cost required to implement a high-priority change request should be two (2) to four (4) weeks. Examples of high-priority change requests are issues and problems resulting from data integrity, legal mandates, and add-ons to improve data quality.
P3 - Medium: A medium priority change request has the potential to impact the successful completion of the project but is not an immediate help or hindrance. The timeframe for estimating the effort and cost required to implement a medium priority change request should be four (4) to six (6) weeks. Examples of medium-priority change requests are requests that improve workflow processes.
P4 - Low: Low priority change requests need to be addressed if the time and budget permit. Low priority change requests are managed, as resources are available. The timeframe guideline for estimating the effort and cost required to implement a low-priority change request is more than six (6) weeks. Examples of low priority change requests are cosmetic changes or “fixes” that do not affect business functional requirements or deliverables.
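How such definitions are captured will vary with the project's tooling; the following is a minimal sketch, assuming a Python-based tracking script, of how project-defined severity levels and a non-conformance record might be encoded so the levels are applied consistently. The class names and record fields are illustrative, not part of any NASA standard.

```python
from dataclasses import dataclass
from enum import IntEnum


class Severity(IntEnum):
    """Project-defined severity levels (illustrative names; lower value = more severe)."""
    CRITICAL = 1   # P1 - imperative to project success; estimate within 1 week
    HIGH = 2       # P2 - important to project success; estimate within 2-4 weeks
    MEDIUM = 3     # P3 - potential impact, not an immediate hindrance; 4-6 weeks
    LOW = 4        # P4 - address as time and budget permit; more than 6 weeks


@dataclass
class NonConformance:
    """A single software non-conformance record (fields are illustrative)."""
    identifier: str
    description: str
    severity: Severity
    configuration_item: str
    status: str = "open"   # e.g., open, fix identified, fix implemented, closed


# Usage: record a defect and check whether it needs an expedited effort estimate.
defect = NonConformance("PR-0042", "Telemetry buffer overflow", Severity.CRITICAL, "flight-sw")
if defect.severity <= Severity.HIGH:
    print(f"{defect.identifier}: estimate effort within the P1/P2 timeframe")
```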
3.2 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.3 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN (available to NASA only): https://nen.nasa.gov/web/software/wiki (SWEREF-197)
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
No additional guidance is available for small projects.
5. Resources
5.1 References
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.
5.2 Tools
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that severity levels are defined for all software non-conformances.
2. Assess the application and accuracy of the defined severity levels to software non-conformances.
7.2 Software Assurance Products
- Assessment of Accuracy of Severity-Level Application to Non-conformances
- SA assessment of severity levels assigned to software non-conformances, including risks and issues
- Information for assessing the software non-conformances at each severity level for each software configuration item.
Objective Evidence
- Software defect or problem reporting data
- Software configuration management data
- Software assurance audit results in the change management process
- Software milestone results
- Software version description documents
- Software control board data or presentations
7.3 Metrics
- Total # of Non-Conformances over time (Open, Closed, # of days Open, and Severity of Open)
- Trend of Open versus Closed Non-Conformances over time
- # of Non-Conformances in the current reporting period (Open, Closed, Severity)
- # of Non-Conformances identified in embedded COTS, GOTS, MOTS, OSS, or reused components in ground or flight software vs. # of Non-Conformances successfully closed
- # of Non-Conformances identified in source code products used (Open, Closed)
- # of software Non-Conformances at each Severity level for each software configuration item (Open, Closed)
Note: Metrics in bold type are required by all projects.
See also Topic 8.18 - SA Suggested Metrics.
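Most of these metrics can be computed directly from the non-conformance tracking data. The sketch below assumes a hypothetical export format (field names and example records are illustrative) and shows one way to count open versus closed non-conformances by severity and to compute the age of open items.

```python
from collections import Counter
from datetime import date

# Hypothetical export of non-conformance records from the tracking tool.
records = [
    {"id": "PR-001", "severity": 1, "opened": date(2024, 1, 10), "closed": None, "ci": "flight-sw"},
    {"id": "PR-002", "severity": 3, "opened": date(2024, 2, 2), "closed": date(2024, 3, 1), "ci": "ground-sw"},
]

open_records = [r for r in records if r["closed"] is None]
closed_records = [r for r in records if r["closed"] is not None]

# Number of non-conformances by severity, split into open and closed.
open_by_severity = Counter(r["severity"] for r in open_records)
closed_by_severity = Counter(r["severity"] for r in closed_records)

# Age in days of each still-open non-conformance.
ages = {r["id"]: (date.today() - r["opened"]).days for r in open_records}

print("Open by severity:", dict(open_by_severity))
print("Closed by severity:", dict(closed_by_severity))
print("Age of open items (days):", ages)
```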
7.4 Guidance
Software assurance confirms that the project has defined and documented severity levels for non-conformances before the start of any testing. Typically, NASA severity levels run from 1 to 5 or 1 to 4, with 1 being the most severe. Some example definitions are found in the software requirement guidance for SWE-202 (this requirement). The project definitions of the severity levels can be documented in several places; in many cases, they appear in either the test plans or the test procedures. Often a project will build the definitions of the severity levels into the discrepancy tracking tool used to track the errors found during testing. Severity levels may not be used officially for unit testing, but even there it is important to know which errors are the most critical and should be fixed first. Severity levels should be used for all other levels of testing.
Once the project begins testing, software assurance needs to monitor the activity by either witnessing tests or reviewing test report documentation. Check that all non-conformances found are documented in the project discrepancy reporting system along with the severity level of each non-conformance. Review the assigned severity levels to ensure that they are being assigned per the project definitions of the categories of non-conformances.
Software assurance also needs to review the developer-maintained lists of non-conformances found or reported in COTS, MOTS, GOTS, OSS, and other reused software. These non-conformances should be assigned severity levels that match the project definitions for non-conformances occurring in the project software. (In other words, do not assume that the severity levels recorded in received lists for reused software will reflect the same severity when that software is in use in the project.) See also Topic 8.08 - COTS Software Safety Considerations.
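One way to make that re-assessment repeatable is to keep an explicit mapping from the supplier's severity labels to the project's own levels and to flag anything without a mapping for manual review. The labels and values in this sketch are illustrative assumptions, not prescribed values.

```python
from typing import Optional

# Illustrative mapping from a supplier's severity labels to project-defined levels
# (1 = most severe). Unmapped labels are flagged for manual assessment.
VENDOR_TO_PROJECT = {"blocker": 1, "major": 2, "minor": 3, "trivial": 4}


def project_severity(vendor_label: str) -> Optional[int]:
    """Return the project severity for a supplier label, or None if it needs manual review."""
    return VENDOR_TO_PROJECT.get(vendor_label.lower())


# Usage: re-assess two entries from a received defect list.
for label in ["Blocker", "cosmetic"]:
    sev = project_severity(label)
    print(label, "->", sev if sev is not None else "needs manual review")
```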
Software assurance confirms that project metrics are kept on the number of non-conformances found at each severity level for each configuration item and that non-conformances are tracked to closure. Other useful information to have is the status of each non-conformance (i.e., found, fix identified, fix implemented, fix tested/closed).
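A minimal sketch of tracking a non-conformance through that status lifecycle follows; the state names mirror the example in the preceding paragraph, and the helper function is hypothetical.

```python
from enum import Enum


class Status(Enum):
    """Illustrative lifecycle states for a non-conformance; projects may define others."""
    FOUND = "found"
    FIX_IDENTIFIED = "fix identified"
    FIX_IMPLEMENTED = "fix implemented"
    CLOSED = "fix tested/closed"


# Ordered lifecycle used to confirm that a record only moves forward toward closure.
LIFECYCLE = [Status.FOUND, Status.FIX_IDENTIFIED, Status.FIX_IMPLEMENTED, Status.CLOSED]


def advance(current: Status) -> Status:
    """Return the next lifecycle state, or the final state if the record is already closed."""
    index = LIFECYCLE.index(current)
    return LIFECYCLE[min(index + 1, len(LIFECYCLE) - 1)]


# Usage: walk a record from discovery to closure.
state = Status.FOUND
while state is not Status.CLOSED:
    state = advance(state)
print("Final state:", state.value)
```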
The SA tasking for SWE-201 - Software Non-Conformances is related to this requirement; the additional information that needs to be captured on the non-conformances is listed below:
- Software problem / software defect / software change report status:
- total number,
- number closed,
- the number opened in the current reporting period,
- age,
- severity.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: