- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
3.11.5 The project manager shall test the software and record test results for the required software cybersecurity mitigation implementations identified from the security vulnerabilities and security weaknesses analysis.
1.1 Notes
Include assessments for security vulnerabilities during Peer Review/Inspections of software requirements and design. Utilize automated security static analysis as well as coding standard static analyses of software code to find potential security vulnerabilities.
1.2 History
1.3 Applicability Across Classes
Class A B C D E F Applicable?
Key: - Applicable | - Not Applicable
2. Rationale
Mitigations identified to address security risks need to be confirmed as appropriate and adequate. Software verification and validation (V&V) processes are used to determine whether the software security risk mitigations, as planned and implemented, satisfy their intended use to address security risks in space flight software. The results of V&V activities serve as a documented assurance that the correct mitigations were identified, implemented, and will reduce to an acceptable level or eliminate known software security risks.
3. Guidance
3.1 Planning For V&V For Security Risk Mitigations
Project managers ensure that software security risk mitigations identified and evaluated in SWE-154 - Identify Security Risks and SWE-156 - Evaluate Systems for Security Risks are verified and validated as part of the project’s development activities. The verification and validation (V&V) of these mitigations are included in the project’s planned V&V activities, reflected in the project’s V&V plans, and carried out throughout the project life cycle.
When planning V&V for required software security risk mitigations, the following are useful source documents within the project and from the Information System Security Officer (ISSO):
- System-level risk assessment report.
- Plan of actions and milestones (POA&M). (The POA&M serves as the NASA management tool to address, report, resolve, and remediate security-related weaknesses at both the information system and program levels.)
Examples of the type of information found in these documents relevant to planning V&V for software security risk mitigations include:
- Vulnerabilities discovered during the security impact analysis or security control monitoring.
- Corrective efforts associated with the mitigation of identified security weaknesses.
- The NASA goals for vulnerability remediation identified during vulnerability scanning:
  - Remediate all vulnerabilities categorized as high within five working days of discovery.
  - Remediate all vulnerabilities categorized as moderate within thirty working days of discovery.
  - Remediate all vulnerabilities categorized as low within sixty working days of discovery.
See also SWE-154 - Identify Security Risks, SWE-156 - Evaluate Systems for Security Risks, SWE-157 - Protect Against Unauthorized Access, SWE-191 - Software Regression Testing.
3.2 V&V Activities
Specific to software security risk mitigations, V&V activities include, at a minimum:
- Security vulnerability checks during peer review/inspections of software requirements, design, code (may require updates to existing peer review/inspection checklists).
- Automated security static code analysis.
- Coding standard static analysis.
For Additional Guidance on Peer Reviews see SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures, SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking, Topic 7.10 - Peer Review and Inspections Including Checklists.
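As a minimal sketch of what an automated security static analysis does (the deny-list and CWE mappings below are illustrative only, not any real tool's rule set), a pattern scan for known-risky C library calls might look like:

```python
import re

# Illustrative deny-list of C functions commonly flagged by security
# static analyzers; a real analyzer covers far more rules than this.
RISKY_CALLS = {
    "gets": "CWE-242: inherently dangerous function",
    "strcpy": "CWE-120: buffer copy without size check",
    "sprintf": "CWE-120: unbounded format write",
    "system": "CWE-78: potential OS command injection",
}

def scan_source(source: str) -> list[tuple[int, str, str]]:
    """Return (line_number, call, finding) for each risky call found."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for call, weakness in RISKY_CALLS.items():
            if re.search(rf"\b{call}\s*\(", line):
                findings.append((lineno, call, weakness))
    return findings

sample = 'int main(void) {\n    char buf[8];\n    gets(buf);\n    return 0;\n}\n'
for lineno, call, weakness in scan_source(sample):
    print(f"line {lineno}: {call} -- {weakness}")
```

A production analyzer of the kind discussed in SWE-135 performs data-flow and control-flow analysis well beyond this sort of pattern matching; the sketch only illustrates the mapping from source constructs to weakness findings.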
Additionally, repeating previous assessments for software security vulnerabilities can confirm that previously identified vulnerabilities and weaknesses have been addressed and that no new vulnerabilities have been introduced or discovered. To keep costs down and prevent the propagation of risks through the project, mitigations are put in place as early as possible and verified throughout the life cycle.
For software systems that do not have security scanning tools available, manual methods and techniques can be applied to supplement automated analysis and provide risk reduction. Some software technologies, such as Programmable Logic Devices (PLDs), may lack security scanning tools entirely; alternative methods, including manual techniques, are then necessary. Examples of these techniques include (but are not limited to):
- Peer reviews by multiple people.
- Checking for backdoors, hardcoded credentials, and poor programming practices.
- Examination of third-party software/libraries being used.
- Security patches, supply chain risks, …
- Security practices for software development/test infrastructures (see SWE-136).
- Running malware scanners, access restrictions, …
- Plans developed for reacting to security incidents.
- Unit and formal tests to verify potential security measures are in place. These should be at multiple levels, for example:
  - Tests to show access to authorized accounts only and rejection of invalid commands.
  - Code paths are handled for all inputs.
- Sanitization of inputs (for example, restricting to only valid expected values).
- Fuzz testing (i.e., using values outside of nominals).
- Identifying the most critical and vulnerable systems and spending time focusing on those.
Project Protection Plans (PPP) contain Controlled Unclassified Information (CUI) and may be supported by a classified appendix. Identifying and holding discussions with the owner of the PPP will help identify the areas the software needs to address. Due to the sensitive nature of the PPP documents, exact details may not be given, but requirements and needs can be derived and tested to support the plan.
See also SWE-135 - Static Analysis, SWE-185 - Secure Coding Standards Verification, SWE-207 - Secure Coding Practices, Topic 8.04 - Additional Requirements Considerations for Use with Safety-Critical Software
3.3 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.4 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki (SWEREF-197)
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
Smaller projects may not have the resources to purchase security scanners; in these situations, manual methods and free, NASA-approved software may be used instead. The Office of the Chief Information Officer (OCIO) may be able to help or provide options. Software development organizations may have infrastructure tools or may be able to use tools from larger projects (with negotiated compensation). Small projects may also identify the most critical and vulnerable systems and focus their efforts on securing those. Oftentimes the interfaces are where the critical and vulnerable systems are for security.
5. Resources
5.1 References
- (SWEREF-064) NIST SP 800-27 Rev. A, Engineering Principles for Information Technology Security (A Baseline for Achieving Security).
- (SWEREF-065) NIST SP 800-64 Rev. 2, Security Considerations in the System Development Life Cycle.
- (SWEREF-082) NPR 7120.5F, NASA Space Flight Program and Project Management Requirements, Office of the Chief Engineer, Effective Date: August 03, 2021, Expiration Date: August 03, 2026.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.
- (SWEREF-273) NASA SP-2016-6105 Rev 2, NASA Systems Engineering Handbook.
- (SWEREF-664) OCE site in NASA Engineering Network. Portal that houses information for software developers to develop code in a secure fashion. Formerly known as "Secure Coding".
5.2 Tools
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
1. Confirm that testing is complete for the cybersecurity mitigation.
2. Assess the quality of the cybersecurity mitigation implementation testing and the test results.
7.2 Software Assurance Products
- Source Code Analysis
- Verification Activities Analysis
- SA assessment of the quality of cybersecurity mitigation testing.
Objective Evidence
- Software Test Procedures
- Software Test Reports
7.3 Metrics
- # of software work product Non-Conformances identified by life cycle phase over time
- Total # of tests completed vs. number of test results evaluated and signed off
- # of Safety-Critical tests executed vs. # of Safety-Critical tests witnessed by SA
- # of Cybersecurity Risks with Mitigations vs. # of Cybersecurity Risks identified
- # of Cybersecurity vulnerabilities and weaknesses identified
- # of Cybersecurity vulnerabilities and weaknesses (Open, Closed, Severity)
- Trending of Open vs. Closed over time.
- # of Cybersecurity vulnerabilities and weaknesses identified by life cycle phase
- # of Cybersecurity vulnerabilities and weaknesses identified vs. # resolved during Implementation
- # of tests executed vs. # of tests completed
- # of Non-Conformances identified during each testing phase (Open, Closed, Severity)
- # of Cybersecurity mitigation implementations identified from the security vulnerabilities and security weaknesses
- # of Cybersecurity mitigation implementations identified with associated test procedures vs. # of Cybersecurity mitigation implementations identified
- # of Cybersecurity mitigation tests completed vs. total # of Cybersecurity mitigation tests
- # of Non-Conformances identified during Cybersecurity mitigation testing (Open, Closed, Severity)
- Trends of Cybersecurity Non-Conformances over time
- # of tests completed vs. total # of tests
- # of detailed software requirements tested to date vs. total # of detailed software requirements
See also Topic 8.18 - SA Suggested Metrics.
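Several of these metrics are simple tallies over non-conformance or finding records. A minimal sketch, assuming a hypothetical record format of (id, life cycle phase, status); real projects would pull these records from their tracking tool:

```python
from collections import Counter

# Hypothetical cybersecurity non-conformance records: (id, phase, status).
records = [
    ("NC-01", "Design", "Closed"),
    ("NC-02", "Implementation", "Closed"),
    ("NC-03", "Implementation", "Open"),
    ("NC-04", "Test", "Open"),
]

# "# of Cybersecurity vulnerabilities and weaknesses (Open, Closed)"
status_counts = Counter(status for _, _, status in records)
print(dict(status_counts))  # {'Closed': 2, 'Open': 2}

# "# of Cybersecurity vulnerabilities and weaknesses identified by life cycle phase"
phase_counts = Counter(phase for _, phase, _ in records)
for phase, count in phase_counts.items():
    print(f"{phase}: {count}")
```

Trending the Open vs. Closed counts over time amounts to recomputing these tallies at each reporting interval and plotting the series.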
7.4 Guidance
The following steps can be taken to confirm the quality and completeness of the testing that addresses cybersecurity mitigation requirements:
- Confirm the quality of the testing and the results. A source code quality or weakness analyzer may be used, or manual methods.
- Analyze the project's static code analysis results, compare them against the Software Assurance independent static code analysis results, and compare both against baseline results.
- Perform a peer review with the Software Engineer and go over known vulnerabilities and weaknesses in the software and plans to address them.
Review whether the engineering team and the project have addressed any identified cybersecurity vulnerabilities and weaknesses in the software requirements, design, and code. Confirm that the requirements associated with the identified cybersecurity vulnerabilities and weaknesses have been tested or are planned to be tested. Check whether the engineering team and the project have run a static analysis tool to assess the cybersecurity vulnerabilities and weaknesses in the source code; if so, check that the findings from the static analysis tool have been addressed by the team. If manual methods are used, check the process followed and confirm that any issues identified are addressed.
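One way to cross-check the engineering team's static analysis run against the Software Assurance independent run is to compare the two finding sets directly. The sketch below assumes a hypothetical "file:line:CWE" finding identifier; real tools emit richer, tool-specific records that would need normalization first:

```python
# Hypothetical finding identifiers exported from two independent
# static analysis runs over the same source baseline.
engineering_findings = {
    "cmd.c:42:CWE-120",
    "auth.c:10:CWE-259",
}
sa_findings = {
    "cmd.c:42:CWE-120",
    "io.c:77:CWE-476",
}

confirmed = engineering_findings & sa_findings   # both runs agree
sa_only = sa_findings - engineering_findings     # missed by the project run
eng_only = engineering_findings - sa_findings    # missed by the SA run

print("confirmed:", sorted(confirmed))
print("flag for follow-up:", sorted(sa_only | eng_only))
```

Findings unique to either run merit follow-up: they may indicate gaps in tool configuration, differences in rule sets, or genuine vulnerabilities that one analysis missed.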
One method of identifying vulnerabilities is to use the National Vulnerability Database (NVD) from NIST, which is the U.S. government repository of standards-based vulnerability data. Software weaknesses can be identified using the Common Weakness Enumeration (CWE), a dictionary of software weakness types maintained by MITRE.
See the secure coding site (SWEREF-664) (NASA access only) for more information.
See also Topic 8.04 - Additional Requirements Considerations for Use with Safety-Critical Software.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: