- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
4.4.8 The project manager shall validate and accredit the software tool(s) required to develop or maintain software.
1.1 Notes
All software development tools contain some number of software defects. Validation and accreditation of the critical software development and maintenance tools ensure that the tools being used during the software development life cycle do not generate or insert errors in the software executable components. Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable. The likelihood that work products will function properly is enhanced, and the risk of error is reduced if the tools used in the development and maintenance processes have been validated and accredited themselves.
1.2 History
1.3 Applicability Across Classes
Class: A | B | C | D | E | F — Applicable?
Key: ✓ = Applicable | ✗ = Not Applicable
2. Rationale
Software development tools, including commercial ones, contain software defects. Validation and accreditation of the critical software development and maintenance tools ensure that the tools used during the software development life cycle do not generate or insert errors into the software executable components. This requirement reduces risk in the software development and maintenance areas of the software life cycle by assessing the tools against defined validation and accreditation criteria. The likelihood that work products will function properly is enhanced, and the risk of error is reduced, if the tools used in the development and maintenance processes have themselves been validated and accredited. This is particularly important for flight software (Classes A and B), which must work correctly on first use if critical mishaps are to be avoided.
3. Guidance
3.1 Tools For Developing And Maintaining Software
Project managers are to use validated and accredited software tool(s) to develop or maintain software. Software is validated against test cases using tools and test environments that have been tested and are known to produce acceptable results. Multiple techniques for validating software may be required, depending on the nature and complexity of the software being developed. This process involves comparing the results produced by the software work product(s) being developed against the results from a set of ‘reference’ programs or recognized test cases. Care must be taken to ensure that the tools and environment used for this process induce no error or bias into the software’s performance. Validation is achieved when the results agree with expectations and no major discrepancies are found. Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable; that organization may be an operational user, the program office, or a contractor, depending upon the intended purposes. Validation and accreditation of software tool(s) required to develop or maintain software are particularly necessary in cases where:
- Complex and critical interoperability is being represented.
- Reuse is intended.
- The safety of life is involved.
- Significant resources are involved.
- Flight software is running on new processors where a software development tool may not have been previously validated by an outside organization.
The targeted purpose of this requirement was the validation and accreditation of software tools that directly affect the software being developed. Examples of software tools that directly affect software code development include but are not limited to compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools. For example, if your project is using a code generation tool, the code output of that code generation tool is validated and accredited to verify that the generated code does not have any errors or timing constraints that could affect execution. Another example is the use of math libraries associated with a compiler. These linked libraries are typically validated and accredited before use in the executable code.
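The following is a minimal sketch, not an Agency-prescribed procedure, of the reference-comparison approach described above: a small set of independently derived reference cases is run through the tool-supplied function and the outputs are compared against expected values within an agreed tolerance. The reference values, the tolerance, and the use of Python's math module as a stand-in for the library under evaluation are illustrative assumptions only.

```python
# Minimal illustrative sketch: validate a tool-supplied math library by comparing
# its outputs against independently verified reference values.
import math

# Reference cases: (function name, input, independently verified expected output)
REFERENCE_CASES = [
    ("sqrt", 2.0, 1.4142135623730951),
    ("sin", math.pi / 6, 0.5),
    ("exp", 1.0, 2.718281828459045),
]

TOLERANCE = 1e-12  # acceptance criterion; agreed beforehand and recorded in the plan


def run_tool_function(name: str, value: float) -> float:
    """Stand-in for invoking the library under evaluation; Python's math module
    is used here only to keep the sketch runnable and self-contained."""
    return getattr(math, name)(value)


def validate() -> list:
    """Return a list of discrepancies; an empty list means agreement with expectations."""
    discrepancies = []
    for name, arg, expected in REFERENCE_CASES:
        actual = run_tool_function(name, arg)
        if abs(actual - expected) > TOLERANCE:
            discrepancies.append(f"{name}({arg}) = {actual}, expected {expected}")
    return discrepancies


if __name__ == "__main__":
    problems = validate()
    print("PASS" if not problems else "\n".join(problems))
```

Any discrepancy found this way would be recorded and dispositioned as part of the validation evidence, and the tolerance itself would be part of the acceptance criteria captured in the software development or maintenance plan.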
3.2 Who Performs Accreditation?
This requirement does not mean that a project needs to validate and accredit a software problem reporting tool or software defect database or a software metric reporting tool. These types of indirect tools can be validated by use. It is also recognized that formal accreditation of software development/maintenance tools is often provided by organizations outside NASA (e.g., National Institute of Standards and Technology [NIST], Association for Computing Machinery [ACM]) for many common software development tools such as compilers. It could be prohibitively expensive or impossible for NASA projects to perform equivalent accreditation activities.
Accreditation is conferred by the organization in the best position to make the judgment that the software tool in question is acceptable. Someone on the project needs to be able to determine an acceptable level of validation and accreditation for each tool being used on a software project and be willing to make a judgment that the tool is acceptable for the tool's intended use. Most of the time that decision can be, and is, made by the software technical authority. The acceptance criteria and validation approach can be captured and recorded in a software development plan, software management plan, or software maintenance plan.
3.3 Novel Environments
Often the software work products being developed are based on new requirements and techniques for use in novel environments. These may require the development of supporting systems, i.e., new tools, test systems, and original environments, none of which have themselves ever been accredited. The software development team may also plan to employ accredited tools that have not previously been used in, or adapted to, the new environment. The key is to evaluate, and then accredit, the development environment and its associated development tools against another environment/tool system that is already accredited. The more typical case in new software development is bringing previously accredited tools together in a previously accredited development environment in a combination that has never been used, or accredited, as an integrated system.
A caveat exists that for many new NASA projects, previously accredited ensembles of development environments and tools simply do not exist. In these cases, projects must develop acceptable methods to assure that their supporting environment is correct.
The flow of a software accreditation protocol or procedure typically requires an expert to conduct a demonstration and to declare that a particular software work product exhibits compatible results across a portfolio of test cases and environments.
3.4 Accreditation Tools
Software can also be accredited using industry accreditation tools. Engineers can be trained, and thus certified, in the correct use of these accreditation tools.
Many of the techniques used to validate software (see SWE-055 - Requirements Validation) can be applied by the software development team to validate and accredit their development tools, systems, and environments. These include the following:
- Functional demonstrations.
- Formal reviews.
- Peer reviews/ component inspections.
- Analysis.
- Use case simulations.
- Software testing.
One suggested flow for identifying, validating, accepting, and accrediting new development tools and environments used to develop software work products subject to this requirement is shown in Figure 3.1. While it may seem obvious, it is important to review the current and expected development tools (compilers, debuggers, code generation tools, code-coverage tools, and others planned for use), as well as the development environments and release procedures, to assure that the complete set is identified, understood, and included in the accreditation process.
Figure 3.1 – Sample Accreditation Process Flow
Once you have the complete list, the next step in the process is to review the accreditation status of each item. Ask the following questions (a simple inventory-tracking sketch follows this list):
- Is the accreditation current?
- Does the accreditation need to be re-certified?
- Does the accreditation apply to the current development task?
- Will its accreditation continue throughout the development life cycle?
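One lightweight way to keep the answers to these questions visible is a tool inventory that records each tool's accreditation status and expiry. The sketch below is a hypothetical record format and check; the fields, dates, and tool names are illustrative assumptions, and each project would define its own.

```python
# Minimal illustrative sketch: a hypothetical tool-inventory record used to flag
# tools whose accreditation is expired or not applicable to the current task.
from dataclasses import dataclass
from datetime import date


@dataclass
class ToolRecord:
    name: str
    version: str
    accredited_by: str           # organization that conferred the accreditation
    accreditation_expires: date  # trigger for re-certification
    applies_to_task: bool        # does the accreditation cover the current use?


def needs_action(tool: ToolRecord, today: date) -> list:
    """Return the open accreditation questions for one tool, mirroring the list above."""
    issues = []
    if today > tool.accreditation_expires:
        issues.append("accreditation expired; re-certification needed")
    if not tool.applies_to_task:
        issues.append("accreditation does not cover the current development task")
    return issues


if __name__ == "__main__":
    inventory = [
        ToolRecord("example-compiler", "12.3", "vendor certification body",
                   date(2026, 1, 1), True),
        ToolRecord("example-code-generator", "4.1", "project software technical authority",
                   date(2024, 6, 30), False),
    ]
    for tool in inventory:
        for issue in needs_action(tool, date.today()):
            print(f"{tool.name} {tool.version}: {issue}")
```

Whatever form the inventory takes, it can be kept with the software development or maintenance plan so that accreditation status is re-checked at each life-cycle review.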
For tools that need accreditation, use the information in the preceding paragraphs to develop a new accreditation process. Keep in mind that the goal is to show that the tools will do what the project expects them to do. The project team needs to review and accept the new process, and the software technical authority and the Center's software assurance personnel are to be informed of it. When the integration of accredited tools needs its own accreditation process, that planning is conducted using the same steps.
The actual steps and tasks in the process are not specified in this guidance because of the wide variety of tools and environments that currently exist or will have to be developed; they are left for the project and the software development team to define. Once the project team has planned and approved the new and integrated accreditation processes, the appropriate validation activities are conducted. Center software assurance organization representative(s) are informed, and their observation of the validation processes is accounted for in the activities. Results, process improvements, lessons learned, and any discrepancies are documented and retained in the project configuration management system. Any issues or discrepancies are planned for resolution and tracked to closure as part of the project's regular corrective and preventive action reporting system.
3.5 Peer Review Of Validation Activity Results
The final activity in the accreditation process is an inspection and peer review of the validation activity results. Once agreement is reached on the quality of the results (what constitutes "agreement" is decided ahead of time as part of the process acceptance planning), the identified tools and environments can be accepted as accredited. (See SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures; SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking; SWE-089 - Software Peer Reviews and Inspections - Basic Measurements; and Topic 7.10 - Peer Review and Inspections Including Checklists.)
3.6 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.7 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN. Available to NASA users only: https://nen.nasa.gov/web/software/wiki (SWEREF-197).
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
Small projects with limited resources may look to software development tools and environments that were previously accredited on other projects. Reviews of project reports, Process Asset Libraries (PALs), and technology reports may surface previous use cases and/or acceptance procedures that apply to new tools pertinent to the current project.
5. Resources
5.1 References
- (SWEREF-082) NPR 7120.5F, Office of the Chief Engineer. Effective date: August 03, 2021; expiration date: August 03, 2026.
- (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-224) ISO/IEC 12207, IEEE Std 12207-2008, 2008. IEEE Computer Society. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-271) NASA-STD-8719.13 (Rev C), document date: 2013-05-07.
- (SWEREF-273) NASA SP-2016-6105 Rev 2.
- (SWEREF-278) NASA-STD-8739.8B, NASA Technical Standard, approved 2022-09-08, superseding NASA-STD-8739.8A.
- (SWEREF-370) ISO/IEC/IEEE 15289:2017. NASA users can access ISO standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of ISO standards.
5.2 Tools
NASA users can find the tools list for this requirement in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that the software tool(s) needed to create and maintain software are validated and accredited.
7.2 Software Assurance Products
- None at this time.
Objective Evidence
- Software tool validation and accreditation criteria.
- Software tool validation and accreditation results.
7.3 Metrics
- # of Non-Conformances found in flight code, ground code, tools, and COTS products used (Open vs. Closed).
See also Topic 8.18 - SA Suggested Metrics.
7.4 Guidance
For this requirement, software assurance needs to confirm that any software tools used for development and maintenance have been validated and accredited. All software development tools contain some number of software defects. Validation and accreditation of the critical software development and maintenance tools ensure that the tools being used during the software development life cycle do not generate or insert errors in the software executable components. Software tool accreditation is the certification that a software tool is acceptable for use for a specific purpose. Accreditation is conferred by the organization best positioned to make the judgment that the software tool in question is acceptable. The likelihood that work products will function properly is enhanced, and the risk of error is reduced if the tools used in the development and maintenance processes have been validated and accredited themselves.
The first step in confirming the validation and accreditation of the software tools used for development and maintenance is to obtain a list of the tools being used. Then check to see whether the project (or another organization positioned to make the judgment that the tools are acceptable) has verified, validated, and approved them for use in the development, analysis, testing, or maintenance of the project software.
Examples of SW tools that directly affect SW code development include, but are not limited to, the compilers, code-coverage tools, development environments, build tools, user interface tools, debuggers, and code generation tools used by a project. Another example would be the project’s use of a linked code library, which would typically be validated and accredited before use in the executable code.
There is information in the software guidance section of this requirement on validating and accrediting these tools.
If the software tools (e.g., simulators, compiler libraries, built-in memory checkers, build tools, development environments, code generators, code-coverage tools, etc.) have an impact on the safety of the system, then determine whether they are correctly applied and used, operated as intended, and whether the results are documented. Typically, once it is determined that a tool has safety implications, an assessment is performed of the severity of impact if the tool provides one or more wrong outputs; the likelihood of that occurring determines the level of effort or the hazard mitigations to employ. (See also SWE-020 - Software Classification, SWE-023 - Software Safety-Critical Requirements.)
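As a purely illustrative sketch of this screening step, the snippet below maps a severity/likelihood pair for a potential wrong tool output to a level of validation effort. The category names echo common hazard-analysis terms, but the numeric scales, thresholds, and resulting effort levels are assumptions for illustration, not NASA-defined values.

```python
# Minimal illustrative sketch: screen a tool with safety impact by combining the
# severity of a wrong output with its likelihood to pick a level of effort.
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}


def screening_level(severity: str, likelihood: str) -> str:
    """Map a severity/likelihood pair to an (illustrative) validation effort level.
    The thresholds are placeholders a project would define for itself."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 9:
        return "full validation and accreditation plus hazard mitigations"
    if score >= 4:
        return "targeted validation of the affected tool functions"
    return "validation by use, with results recorded"


if __name__ == "__main__":
    print(screening_level("critical", "remote"))        # targeted validation
    print(screening_level("catastrophic", "probable"))  # full validation plus mitigations
```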
Software assurance may want to check on the following other considerations for software tools (a brief version-audit sketch follows this list):
- When software tools are used to design, develop, analyze, test, or maintain software systems, assure that:
- The tools are kept up to date and configuration managed, incorporating changes, as needed
- Inputs are correct and complete
- The models are testable, tested, and accredited
- When using software tools, check:
- Use of up-to-date versions and licenses on purchased tools
- Have any concerns or risks been recorded and resolved?
- Are the tools being operated within the parameters/limitations of their capabilities?
- Have any known errors or discrepancies been documented for the software tools in use, and if so, have the risks been input to the risk management system and evaluated?
- Are the results as expected or can they be explained?
- Is there any analysis stating the level of certainty/trust from the results of the tools, including any concerns, risks, issues?
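One way to support the version and configuration checks in the list above is to compare the tool versions actually in use against the project's approved/accredited baseline. In the sketch below, the baseline contents, tool names, and version strings are hypothetical.

```python
# Minimal illustrative sketch: flag tools whose in-use version differs from the
# project's approved (accredited) baseline, or that are missing from it.
import json

# Hypothetical approved baseline: tool name -> accredited version
APPROVED_BASELINE = {
    "example-compiler": "12.3.0",
    "example-code-coverage": "7.2.1",
}


def audit_versions(in_use: dict) -> list:
    """Return findings for any tool that is unapproved or off-baseline."""
    findings = []
    for name, version in in_use.items():
        approved = APPROVED_BASELINE.get(name)
        if approved is None:
            findings.append(f"{name}: not in the approved baseline")
        elif version != approved:
            findings.append(f"{name}: using {version}, baseline is {approved}")
    return findings


if __name__ == "__main__":
    # In practice the in-use versions would be gathered from the build environment;
    # they are hard-coded here to keep the sketch self-contained.
    in_use = {"example-compiler": "12.4.0", "example-linter": "1.0.0"}
    print(json.dumps(audit_versions(in_use), indent=2))
```

Findings from such an audit would feed the concerns, risks, and discrepancy records called out above.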
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: