SWE-158 - Evaluate Software for Security Vulnerabilities

1. Requirements

3.11.5 The project manager shall ensure that space flight software systems are assessed for possible cybersecurity vulnerabilities and weaknesses.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-158 - Last used in rev NPR 7150.2C

Rev | SWE Statement

A |

Difference between A and B: NEW

B | 3.16.6 The project manager shall ensure that the space flight software systems are assessed for possible security vulnerabilities and weaknesses.

Difference between B and C: Changed from "security" to "cybersecurity"

C | 3.11.5 The project manager shall ensure that space flight software systems are assessed for possible cybersecurity vulnerabilities and weaknesses.

Difference between C and D: Retired.  Functions addressed by SWEs 154, 156, 157, 159.

D |


1.3 Applicability Across Classes

Class          A      B      C      D      E      F

Applicable?

Key:    - Applicable | - Not Applicable

2. Rationale

Space flight software systems are assessed for software and communication security vulnerabilities and weaknesses to ensure such issues are known and managed during the software development life cycle and to ensure the associated risk is also identified and managed.

3. Guidance

The project manager is required to have and implement a Project Protection Plan for their system.  The software development organization is responsible for implementing mitigations to address the identified software and communication security vulnerabilities and weaknesses for space flight software systems.  It is expected that requirements for these mitigations are generated as security requirements and that those security requirements affecting software are included in the software requirements specification (see 5.09 - SRS - Software Requirements Specification).  These security requirements are then flowed down and implemented in the design and code and tested by the software development team.

A select group of personnel from the Agency's Space Protection Working Group (SPWG) writes Project Protection Plans and Program Threat Summaries.  Program Threat Summaries for NASA are based on similar standardized documents written for national security space systems. 

Project managers are to assess their software systems against identified risks (see SWE-156) and agreed-to, viable security vulnerabilities and weaknesses to confirm that changes required to mitigate or eliminate identified security risks have been implemented in the completed products.

Project managers work with software assurance and other software security experts, including the Information System Security Officer (ISSO), to have software vulnerability assessments performed.  Assessments, conducted by software security specialists, may require input from software developers.  The results of these assessments may require updates to software requirements to be sure that identified vulnerabilities that can be mitigated in the software are properly addressed.  It is important for projects to not introduce vulnerabilities and weaknesses by coding for functionality and performance at the expense of security.

Security assessments are performed throughout the project life cycle rather than being performed at the end of the life cycle when it is costly to correct programming errors, bugs, and other identified vulnerabilities.  “Continuous test and evaluation of security attributes of systems is an important part of testing as it allows software developers to identify and address vulnerabilities as part of the system architecture and design.” 485 Each project establishes assessment guidelines and frequency suitable to its development life cycle or follows any existing Center guidelines, tailored appropriately for the project.
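
As one illustration of how continuous evaluation might be automated, a project could gate its routine builds on the results of its security static analysis.  The following is a minimal, hypothetical Python sketch; the findings file name, its JSON layout, and the severity threshold are assumptions for illustration and are not defined by this requirement.

    # ci_security_gate.py -- hypothetical continuous-integration gate: fail the
    # build if the latest static-analysis run reports unresolved high-severity
    # security findings.  File name and JSON layout are assumed for illustration.
    import json
    import sys

    FINDINGS_FILE = "security_findings.json"    # assumed output of the analysis step
    BLOCKING_SEVERITIES = {"critical", "high"}  # project-defined threshold

    def main() -> int:
        with open(FINDINGS_FILE, encoding="utf-8") as fh:
            findings = json.load(fh)            # assumed: a list of finding records

        blocking = [
            f for f in findings
            if f.get("severity", "").lower() in BLOCKING_SEVERITIES
            and f.get("status", "open").lower() != "resolved"
        ]

        for f in blocking:
            print(f"BLOCKING: {f.get('id')} [{f.get('severity')}] {f.get('title')}")

        if blocking:
            print(f"{len(blocking)} unresolved high-severity security finding(s); failing the build.")
            return 1
        print("No unresolved high-severity security findings.")
        return 0

    if __name__ == "__main__":
        sys.exit(main())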

Quality, security-related vulnerability assessments/analyses:

  • Include newly developed code and, to the extent possible, off-the-shelf (OTS), open-source, and reused code.
  • May focus on 484:
    • Code that runs by default.
    • Code that runs in an elevated context.
    • Code connected to a globally accessible network interface.
    • Code written in a language whose features make it easier to introduce vulnerabilities.
    • Code with a history of vulnerabilities.
    • Code that handles sensitive data.
    • Complex code.
    • Code that changes frequently.
  • Assess for bugs, programming errors, and potential failures.
  • Confirm, to some acceptable level of rigor, that software does not do what it is not supposed to do.
  • Assess the impact and likelihood of vulnerability exploitation.
  • May check for adherence to secure architecture, design, and coding standards, including the use of secure coding practices (may be difficult to do for OTS software); a brief illustration follows this list.
  • May use automated security static code analysis as well as coding standard static analysis.
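
As a concrete illustration of the kind of weakness a secure coding practice, code review, or static analyzer is meant to catch, the hedged Python sketch below contrasts an injection-prone pattern (CWE-78, OS command injection) with a safer alternative.  The function names and the choice of Python are assumptions for illustration; this requirement does not prescribe a language or a specific weakness class.

    # Illustrative only: an injection-prone pattern versus a safer equivalent.
    # Function names are hypothetical.
    import subprocess

    def ping_host_unsafe(host: str) -> int:
        # Weakness: the untrusted 'host' string is interpolated into a shell
        # command, so input such as "127.0.0.1; rm -rf /tmp/x" is also executed.
        return subprocess.call(f"ping -c 1 {host}", shell=True)

    def ping_host_safer(host: str) -> int:
        # Safer practice: pass an argument vector with no shell, so the input is
        # treated as a single argument rather than as shell syntax.
        return subprocess.call(["ping", "-c", "1", host])

    if __name__ == "__main__":
        target = input("Host to ping: ").strip()
        # A real project would also validate 'target' (for example, against an
        # allow-list or a hostname/IP pattern) before using it at all.
        print("exit status:", ping_host_safer(target))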

NASA users should consult center Process Asset Libraries (PALs) for center-specific guidance and resources related to software security. 

Additional guidance related to software security may be found in related requirements in this Handbook.

4. Small Projects

No additional guidance is available for small projects.  

5. Resources

5.1 References


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

No Lessons Learned have currently been identified for this requirement.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-158 - Evaluate Software for Security Vulnerabilities
3.11.5 The project manager shall ensure that space flight software systems are assessed for possible cybersecurity vulnerabilities and weaknesses.

7.1 Tasking for Software Assurance

  1. Confirm that space flight software systems have been assessed for potential cybersecurity vulnerabilities and weaknesses.

  2. Perform static code analysis on the software or analyze the project's static code analysis tool results for cybersecurity vulnerabilities and weaknesses.

7.2 Products

  • Source Code Analysis
  • The results of SA independent static code analysis for cybersecurity vulnerabilities and weaknesses.


    Objective Evidence

    • Results from the static code analysis tool results for cybersecurity vulnerabilities and weaknesses.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence as well as the examples listed above.

7.3 Metrics

  • # of Cybersecurity vulnerabilities and weaknesses identified by life-cycle phase
  • # of Cybersecurity vulnerabilities and weaknesses identified vs. # resolved during Implementation
  • # of Non-Conformances identified in Cybersecurity coding standard compliance (Open, Closed)
  • Document the Static Code Analysis tools used with associated Non-Conformances
  • # of total errors and warnings identified by the tool
  • # of errors and warnings evaluated vs. # of total errors and warnings identified by the tool
  • # of Non-Conformances raised by SA vs. total # of raised Non-Conformances
  • # of static code errors and warnings identified as “positives” vs. # of total errors and warnings identified by the tool
  • # of static code errors and warnings resolved by Severity vs. # of static code errors and warnings identified by Severity by a tool
  • # of Cybersecurity vulnerabilities and weaknesses identified by tool
  • The trend of # of total errors and warnings identified per SCA Tool, Language, and SLOC size
  • Trends of Cybersecurity Non-Conformances over time
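
One way a software assurance organization might track several of these metrics is to tally the finding records exported from the project's tracking system.  The sketch below is a hypothetical Python example; the record fields (phase, severity, status) are assumptions about the export format, not something this requirement defines.

    # Hypothetical tally of cybersecurity findings by life-cycle phase and status.
    # The example records and field names are assumptions for illustration only.
    from collections import Counter

    findings = [  # a real project would read these from its tracking system
        {"id": "CYB-001", "phase": "design",         "severity": "high",   "status": "resolved"},
        {"id": "CYB-002", "phase": "implementation", "severity": "medium", "status": "open"},
        {"id": "CYB-003", "phase": "implementation", "severity": "high",   "status": "resolved"},
        {"id": "CYB-004", "phase": "test",           "severity": "low",    "status": "open"},
    ]

    # Metric: # of cybersecurity vulnerabilities and weaknesses identified by phase.
    by_phase = Counter(f["phase"] for f in findings)

    # Metric: # identified vs. # resolved during Implementation.
    impl = [f for f in findings if f["phase"] == "implementation"]
    impl_resolved = sum(1 for f in impl if f["status"] == "resolved")

    print("Identified by phase:", dict(by_phase))
    print(f"Implementation: {len(impl)} identified, {impl_resolved} resolved")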

7.4 Guidance

Assess space flight software systems for potential cybersecurity vulnerabilities and weaknesses.

For quality, security-related vulnerability assessments/analyses:

  • Include newly developed code and, to the extent possible, off-the-shelf (OTS), open-source, and reused code.
  • Focus on 484:
    • Code that runs by default.
    • Code that runs in an elevated context.
    • Code connected to a globally accessible network interface.
    • Code written in a language whose features make it easier to introduce vulnerabilities.
    • Code with a history of vulnerabilities.
    • Code that handles sensitive data.
    • Complex code.
    • Code that changes frequently.
  • Assess for bugs, programming errors, and potential failures.
  • Confirm, to some acceptable level of rigor, that software does not do what it is not supposed to do.
  • Assess the impact and likelihood of vulnerability exploitation.
  • Check for adherence to secure architecture, design, and coding standards, including the use of secure coding practices (may be difficult to do for OTS software).
  • Use automated security static code analysis as well as coding standard static analysis.

Additional SA independent assessments for cybersecurity vulnerabilities and weaknesses include dynamic code analysis, threat modeling, binary weakness analyzers, origin analyzers, secure library analyzers, and malware/virus/spyware scanners.
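
Threat modeling, in particular, is often organized around a checklist of threat categories applied to each external interface of the system.  The sketch below uses the widely known STRIDE categories as one possible starting point; the interface names are hypothetical, and STRIDE is an assumption of this example rather than something mandated by the requirement.

    # Hypothetical threat-modeling starting point: enumerate STRIDE categories for
    # each externally facing interface so each pairing can be assessed and either
    # dispositioned or carried as a risk.  Interface names are illustrative only.
    STRIDE = [
        "Spoofing",
        "Tampering",
        "Repudiation",
        "Information disclosure",
        "Denial of service",
        "Elevation of privilege",
    ]

    interfaces = ["uplink command port", "telemetry downlink", "ground maintenance port"]

    for interface in interfaces:
        for threat in STRIDE:
            # In practice each row would be assessed and given a disposition
            # (mitigated, accepted, not applicable) with supporting rationale.
            print(f"{interface}: consider {threat}")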

Perform an independent static code analysis for cybersecurity vulnerabilities and weaknesses.  SA or Engineering (working with IV&V if a tool is not available) runs the code through a static code analysis tool that checks for cybersecurity vulnerabilities and weaknesses, assesses the results, and implements the required findings in the requirements, design, and code.  If engineering runs the tool, SA should do an independent assessment of the tool results.
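
When engineering runs the tool and SA assesses the results independently, one simple cross-check is to compare the two result sets and flag findings that appear in only one of them.  The following is a minimal, assumed Python sketch; the finding fields and example data are illustrative, not the output format of any particular tool.

    # Hypothetical cross-check between engineering's and SA's static-analysis runs.
    # Each finding is reduced to a (rule, file, line) key; the key format and the
    # example data are assumptions for illustration only.

    def finding_keys(results):
        """Reduce a list of finding records to a set of comparable keys."""
        return {(r["rule"], r["file"], r["line"]) for r in results}

    engineering_run = [
        {"rule": "CWE-121", "file": "cmd.c",  "line": 88},
        {"rule": "CWE-476", "file": "gnc.c",  "line": 210},
    ]
    sa_independent_run = [
        {"rule": "CWE-121", "file": "cmd.c",  "line": 88},
        {"rule": "CWE-190", "file": "comm.c", "line": 42},
    ]

    only_engineering = finding_keys(engineering_run) - finding_keys(sa_independent_run)
    only_sa = finding_keys(sa_independent_run) - finding_keys(engineering_run)

    print("Findings seen only in engineering's run:", only_engineering)
    print("Findings seen only in SA's independent run:", only_sa)  # candidates for follow-up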

The assessment should account for network communication that the application may use to transmit data.  These ports should be considered possible cybersecurity vulnerabilities and weaknesses and documented as such.  The tasking should consider including an independent penetration test of any application that uses network ports for network communications.
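
To document which network ports an application actually exposes, a quick connection check against a list of candidate ports can feed the assessment.  The sketch below is an assumed Python example; the host and port list are illustrative, and such a check supplements rather than replaces an independent penetration test.

    # Minimal sketch: record which candidate TCP ports accept connections on a
    # development host so that open ports can be documented as potential
    # cybersecurity exposure.  Host and port list are assumptions for illustration.
    import socket

    HOST = "127.0.0.1"                     # assumed development host
    CANDIDATE_PORTS = [22, 80, 443, 5000]  # assumed ports the application might use

    open_ports = []
    for port in CANDIDATE_PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(0.5)
            if sock.connect_ex((HOST, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)

    print(f"Open TCP ports on {HOST}:", open_ports or "none of the candidates")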
