8.18 - SA Suggested Metrics

1. Introduction

This topic contains the complete list of software assurance and software safety metrics suggested for use with the SA tasks in NASA-STD-8739.8. These suggested metrics provide SA and safety personnel with much of the information they need to assess the software assurance/safety work, as well as information to help monitor software engineering progress.

The Metrics Table section below shows the set of metrics in a table and provides a way to download an Excel file. The Excel file contains the same metrics but allows the columns to be filtered. This provides an easy way to see which metrics are associated with a particular SWE, and to filter out SWEs that may be tailored out of your project. The table also indicates the potential phases in which each metric might be collected. Be sure to read the information at the top of each metrics sheet for more specific guidance on using the tables.

Each project should review the table and decide which metrics to collect, based on which metrics they believe will provide the most useful information for their activities.

2. Metrics Table

The table below provides suggested metrics for the SWE requirements listed in Section 7.3 of the SA tab of each SWE in this Handbook.

There are multiple “Metrics Types”, and each type includes optional “Measurements” by life-cycle phase for the “Associated SWE Requirements”. Many of the measurements were reworded to clarify them or to make them generic enough to apply in multiple places. For example, the word “non-conformance” is used in lieu of findings, problem reports, defects, or errors to cover all types of non-conformances. When choosing a measurement, change the word “non-conformance” to the term that best fits the situation being monitored. (Note: these may be collected by the project and analyzed by SA.) Projects should choose a set of measurements that provides information on the project being implemented. The measurements do not have to be implemented as written; they may be modified to best fit the characteristics of the project.

The life-cycle phases marked in the table indicate the recommended phases in which each measurement may be collected; this does not mean the data must be collected in every phase. Establish a set of measurements that best suits the project, taking into consideration the size and applicability of the activity. For example, if a configuration audit metric is collected, it may be appropriate to collect those measurements during most of the life-cycle phases listed, but the project may not be doing configuration audits in all of those phases.

The NPR 7150.2C SWE requirement numbers listed in the table are associated with the SA tasking in NASA-STD-8739.8, but they may not be the only SWE requirements applicable to a particular metric.

A few measurements are specifically required for analysis by NASA-STD-8739.8; they are listed in bold type. The other measurements listed should be considered when performing the tasking activities associated with the SWE numbers for the metric.

In some cases, in order to understand the full context of a metric listed in a requirement, look at the other measures in this table that contribute to it. For example, the table shows SWE-036 with the metric “# of software components (e.g., programs, modules, routines, functions, etc.) planned vs. the number actually released in each build.” SWE-036, task 1c, provides the planning information for the scheduled deliverables; however, to complete the metric, it is necessary to use the information collected under SWE-077 to get the number of components actually released in each build.
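
As a rough illustration of how those two collection points combine, the sketch below computes the planned-vs.-released comparison per build. The data and component names are hypothetical, not from the Handbook or the standard; real projects would pull these sets from their planning and configuration management records.

```python
# Hypothetical sketch: combine SWE-036 planning data (components planned per
# build) with SWE-077 release records (components actually released) to
# produce the "# planned vs. # actually released in each build" metric.

planned = {  # from the project's build plans (SWE-036, task 1c)
    "Build 1": {"cmd_parser", "telemetry", "fault_mgr"},
    "Build 2": {"guidance", "nav_filter"},
}
released = {  # from the release records (SWE-077)
    "Build 1": {"cmd_parser", "telemetry"},
    "Build 2": {"guidance", "nav_filter"},
}

for build, planned_set in planned.items():
    released_set = released.get(build, set())
    missing = sorted(planned_set - released_set)
    line = (f"{build}: {len(released_set & planned_set)} of "
            f"{len(planned_set)} planned components released")
    if missing:
        line += f"; not yet released: {', '.join(missing)}"
    print(line)
```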

For a usable MS Excel version of this table, click on the image below and download the file. 

A Word version of the full table is reproduced below if you don't want to download the Excel version.

| Metrics Type | Measurements | Plan | Reqt | Des | Imp | Test | Del | Associated SWE Reqt # |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Peer Review Metrics | # of peer reviews performed vs. # of peer reviews planned |  | X | X | X | X | X | SWE-016, SWE-087, SWE-089 |
|  | # of Non-Conformances identified in each peer review |  | X | X | X | X | X | SWE-087, SWE-089 |
|  | # of Non-Conformances identified by software assurance during each peer review |  | X | X | X | X | X | SWE-087, SWE-088, SWE-089 |
|  | Total # of peer review Non-Conformances (Open, Closed) |  | X | X | X | X | X | SWE-087, SWE-088, SWE-089 |
|  | Time each review participant spent preparing for the review |  | X | X | X | X | X | SWE-088, SWE-089 |
|  | Time required to close review Non-Conformances |  | X | X | X | X | X | SWE-087, SWE-088, SWE-089 |
|  | # of peer review participants vs. total # invited |  | X | X | X | X | X | SWE-088, SWE-089 |
|  | # of peer review Non-Conformances per work product vs. # of peer reviewers |  | X | X | X | X | X | SWE-088, SWE-089 |
| Peer Review Audit Metrics | # of audit Non-Conformances per peer review audit |  | X | X | X | X | X | SWE-088 |
|  | # of Peer Review Audits planned vs. # of Peer Review Audits performed |  | X | X | X | X | X | SWE-016, SWE-087, SWE-088 |
|  | Trends of non-conformances from audits (Open, Closed, Life-cycle Phase) |  | X | X | X | X | X | SWE-088, SWE-089 |
|  | Time required to close peer review audit Non-Conformances |  | X | X | X | X | X | SWE-088, SWE-089 |
|  | Time each audit participant spent preparing for the audit |  | X | X | X | X | X | SWE-088, SWE-089 |
| Problem/Change Report Status Metrics | Total # of Non-Conformances over time (Open, Closed, # of days Open, Severity of Open); # of Non-Conformances in the current reporting period (Open, Closed, Severity) | X | X | X | X | X | X | SWE-062, SWE-065c, SWE-065d, SWE-068, SWE-202, SWE-203, SWE-204 |
|  | # of safety-related Non-Conformances |  | X | X | X | X | X | SWE-203, SWE-068, SWE-071 |
|  | Trend of Open vs. Closed Non-Conformances over time | X | X | X | X | X | X | SWE-053, SWE-054, SWE-202, SWE-065 |
|  | Trend of change status over time (# of changes approved, # in implementation, # in test, # closed) | X | X | X | X | X | X | SWE-018, SWE-053, SWE-080 |
|  | # of Non-Conformances identified in embedded COTS, GOTS, MOTS, OSS, or reused components in ground or flight software vs. # of Non-Conformances successfully closed |  |  | X | X | X |  | SWE-136, SWE-202, SWE-203, SWE-211 |
|  | # of Non-Conformances identified in source code products used (Open, Closed) |  |  | X | X | X |  | SWE-202, SWE-203 |
|  | **# of software Non-Conformances at each Severity level for each software configuration item (Open, Closed)** (Note: metrics in bold type are required by all projects.) |  |  |  | X | X | X | SWE-202 |
|  | # of Closed action items vs. # of Open action items | X | X | X | X | X | X | SWE-062, SWE-065c |
|  | # of Root Cause Analyses performed; # of Non-Conformances identified by each root cause analysis |  |  | X | X | X | X | SWE-204 |
| SA Corrective Action (CA) Metrics (Issues, Risks) | # of CAs raised by SA vs. total # of CAs; Attributes (Type, Severity, # of days Open, Life-cycle Phase Found); State (Open, In Work, Closed); Trends of CA closures over time | X | X | X | X | X | X | SWE-024, SWE-204, SWE-054 |
|  | Trend of the # of inconsistencies or corrective actions identified vs. # closed | X | X | X | X | X |  | SWE-054, SWE-024, SWE-204 |
|  | # of open vs. closed issues over time, and latency | X | X | X | X | X | X | SWE-018 |
| Process Improvement Metrics | # of software work product Non-Conformances identified by life-cycle phase over time | X | X | X | X | X | X | SWE-013, SWE-022, SWE-024, SWE-039, SWE-051, SWE-054, SWE-057, SWE-058, SWE-062, SWE-065b, SWE-065c, SWE-068, SWE-071, SWE-075, SWE-079, SWE-084, SWE-086, SWE-087, SWE-125, SWE-134, SWE-139, SWE-146, SWE-157, SWE-158, SWE-159, SWE-184, SWE-185, SWE-187, SWE-191, SWE-194, SWE-201, SWE-204, SWE-205 |
|  | # of software process Non-Conformances by life-cycle phase over time | X | X | X | X | X | X | SWE-032, SWE-039, SWE-061, SWE-077, SWE-080, SWE-082, SWE-085, SWE-086, SWE-088, SWE-139, SWE-195, SWE-204 |
| Characteristics Metrics | Identify the specific requirements in NASA-STD-8739.8 that are being tailored by the projects (*organizational metric) | X | X | X | X | X | X | SWE-121, SWE-125, SWE-013, SWE-176 |
|  | # of projects tailoring each requirement (*organizational measure); % of requirements tailored per project (*organizational measure) | X | X | X | X |  |  | SASS-01, SASS-09, SWE-125, SWE-013, SWE-121 |
|  | % of total source code for each software classification (*organizational measure) |  |  |  | X | X | X | SWE-020, SWE-176, SWE-087 |
| Cost/Effort Metrics | Planned SA resource allocation vs. actual SA resource allocation | X | X | X | X | X | X | SWE-015, SWE-174, SWE-151 |
|  | Comparison of initial SA cost estimates vs. final cost (capturing assumptions and differences) | X |  |  |  |  | X | SWE-174, SWE-015, SWE-151 |
|  | Trend of SA cost estimates throughout the life cycle | X | X | X | X | X | X | SWE-174 |
| Training Metrics | % of required training completed for each of the project SA personnel | X | X | X | X |  |  | SWE-017 |
|  | % of project personnel that have completed project-specific training vs. planned training |  | X | X | X | X | X | SWE-017 |
| Compliance Audit Metrics | # of Compliance Audits planned vs. # of Compliance Audits performed | X | X | X | X | X | X | SWE-024, SWE-039, SWE-139, SWE-016, SWE-032, SWE-077, SWE-195, SWE-082, SWE-084, SWE-085, SWE-086, SWE-088, SWE-079 |
|  | # of Open vs. Closed audit Non-Conformances over time; Trends of # of Non-Conformances from audits over time (include counts from process and standards audits and work product audits) | X | X | X | X | X | X | SWE-024, SWE-039, SWE-139, SWE-016, SWE-032, SWE-077, SWE-079, SWE-195, SWE-082, SWE-084, SWE-085, SWE-086, SWE-088, SWE-201 |
|  | # of Non-Conformances identified in plans (e.g., SMPs, SDPs, CM Plans, SA Plans, Safety Plans, Test Plans) | X | X | X | X | X |  | SWE-024, SWE-013, SWE-075, SWE-079, SWE-139, SWE-071 |
|  | # of Non-Conformances identified in the software Configuration Management Plan; Trends of # Open vs. # Closed over time |  | X | X | X | X | X | SWE-079 |
|  | # of Configuration Management Audits conducted by the project – planned vs. actual |  | X | X | X | X | X | SWE-082, SWE-077, SWE-084 |
|  | # of Non-Conformances per audit (including findings from process and compliance audits, process maturity) | X | X | X | X | X | X | SWE-024, SWE-039, SWE-139, SWE-016, SWE-032, SWE-077, SWE-195, SWE-082, SWE-084, SWE-085, SWE-086, SWE-088, SWE-079 |
|  | # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project; Trends of # Open vs. # Closed over time | X | X | X | X | X | X | SWE-016, SWE-039, SWE-032, SWE-077, SWE-195, SWE-082, SWE-084, SWE-085, SWE-086, SWE-088 |
| Project Acceptance Metrics | # of Non-Conformances (activities not being performed); # of Non-Conformances accepted by the project; # of Non-Conformances (Open, Closed, Total); Trends over time | X | X | X | X | X | X | SWE-087, SWE-201, SWE-039, SWE-139 |
| Progress Tracking Metrics | Deviations of actual schedule progress vs. planned schedule progress above a defined threshold | X | X | X | X | X | X | SWE-016, SWE-018, SWE-046 |
|  | # of Software Requirements (e.g., Project, Application, Subsystem, System, etc.) |  | X | X | X | X | X | SWE-050, SWE-051, SWE-052, SWE-065b, SWE-066, SWE-194, SWE-027 |
|  | # of Software Requirements that do not trace to a parent requirement |  | X | X | X |  |  | SWE-033, SWE-050, SWE-051 |
|  | # of architectural issues identified vs. # closed |  |  | X |  |  |  | SWE-057, SWE-058, SWE-143 |
|  | # of planned units for implementation vs. # of units tested and implemented |  |  |  | X |  |  | SWE-060 |
|  | # of planned unit test cases vs. # of actual unit test cases successfully completed |  |  |  | X |  |  | SWE-062, SWE-186 |
|  | # of software requirements with completed test procedures over time |  |  |  | X | X |  | SWE-065b, SWE-052, SWE-066, SWE-071, SWE-191 |
|  | Total # of tests completed vs. # of test results evaluated and signed off |  |  |  |  | X | X | SWE-066, SWE-068, SWE-065c, SWE-159 |
|  | # of safety-critical tests executed vs. # of safety-critical tests witnessed by SA |  |  |  | X | X | X | SWE-066, SWE-068, SWE-065c, SWE-159, SWE-062, SWE-186 |
|  | # of software components (e.g., programs, modules, routines, functions, etc.) planned vs. # actually released in each build |  |  |  |  | X | X | SWE-194, SWE-077, SWE-073, SWE-036 |
|  | # of Non-Conformances from reviews (Open vs. Closed; # of days Open) | X | X | X | X | X | X | SWE-134, SWE-037, SWE-039, SWE-087, SWE-088, SWE-089, SWE-143 |
|  | # of Software Requirements being met via satisfactory testing vs. total # of Software Requirements |  |  |  | X | X | X | SWE-066, SWE-065b, SWE-192, SWE-071 |
|  | # of Software Requirements without associated test cases |  |  |  | X | X | X | SWE-066, SWE-065b, SWE-071 |
| Risk Management Metrics | # of Risks identified in each life-cycle phase (Open, Closed) | X | X | X | X | X | X | SWE-086, SWE-039, SWE-179, SWE-154, SWE-156, SWE-190 |
|  | # of Risks by Severity (e.g., red, yellow, green) over time | X | X | X | X | X | X | SWE-032, SWE-033, SWE-039, SWE-086, SWE-154, SWE-156, SWE-179, SWE-190, SWE-191 |
|  | # of Risks with mitigation plans vs. total # of Risks | X | X | X | X | X | X | SWE-032, SWE-033, SWE-039, SWE-086, SWE-154, SWE-156, SWE-179, SWE-190, SWE-191 |
|  | # of Risks trending up over time; # of Risks trending down over time | X | X | X | X | X | X | SWE-032, SWE-033, SWE-039, SWE-086, SWE-154, SWE-156, SWE-179, SWE-190, SWE-191 |
| Cybersecurity Risk Metrics | # of Cybersecurity Risks identified (Open, Closed, Severity) |  |  | X | X | X | X | SWE-154, SWE-156, SWE-151 |
|  | # of Cybersecurity Risks with mitigations vs. # of Cybersecurity Risks identified |  |  | X | X | X | X | SWE-154, SWE-156, SWE-151, SWE-159 |
| Traceability Metrics | % of traceability completed in each area: system-level requirements to software requirements; software requirements to design; design to code; software requirements to test procedures |  | X | X | X | X |  | SWE-052 |
|  | % of traceability completed for all hazards to software requirements and test procedures |  | X | X | X | X |  | SWE-052 |
|  | Defect trends for trace quality (# of circular traces, orphans, widows, etc.) |  | X |  |  |  |  | SWE-051, SWE-052 |
| Requirements Metrics | # of detailed software requirements vs. # of estimated SLOC to be developed by the project |  | X | X |  | X |  | SWE-050, SWE-051, SWE-151, SWE-174 |
|  | # of incorrect, missing, or incomplete requirements (i.e., # of requirements issues) vs. # of requirements issues resolved |  | X | X |  |  |  | SWE-051, SWE-053, SWE-054 |
|  | Software Requirements Volatility (# of requirements added, deleted, or modified; # of TBDs over time) |  | X | X | X | X |  | SWE-200, SWE-053 |
|  | # of TBD/TBC/TBR requirements trended over time |  |  | X | X | X | X | SWE-066 |
| Safety Metrics | # of safety-related requirement issues (Open, Closed) over time |  | X | X | X | X | X | SWE-039, SWE-139, SWE-205, SWE-023, SWE-134, SWE-184, SWE-052, SWE-051, SWE-058, SWE-065b, SWE-066, SWE-071, SWE-192, SWE-080, SWE-087 |
|  | # of safety-related non-conformances identified by life-cycle phase (over time, Open vs. Closed, # of days) |  | X | X | X | X | X | SWE-013, SWE-039, SWE-139, SWE-143, SWE-121, SWE-184, SWE-023, SWE-134, SWE-052, SWE-051, SWE-057, SWE-058, SWE-135, SWE-062, SWE-065a, SWE-065b, SWE-066, SWE-068, SWE-071, SWE-191, SWE-080, SWE-081, SWE-087, SWE-205 |
| Reuse Metrics | # of products submitted for reuse; # of developed products submitted for reuse vs. total # of developed products |  |  |  |  |  | X | SWE-147, SWE-148 |
|  | # of developed products entered in the NASA Internal Sharing & Reuse System vs. total # of developed products |  |  |  |  |  | X | SWE-147, SWE-148 |
|  | # of products submitted for reuse vs. # of products entered into the NASA Internal Sharing & Reuse System |  |  |  |  |  | X | SWE-147, SWE-148 |
| Cybersecurity Metrics | # of Cybersecurity vulnerabilities and weaknesses identified; # of Cybersecurity vulnerabilities and weaknesses (Open, Closed, Severity); Trending of Open vs. Closed over time | X | X | X | X | X | X | SWE-158, SWE-155, SWE-159, SWE-135, SWE-063 |
|  | # and type of vulnerabilities and weaknesses identified by the project | X | X | X | X | X |  | SWE-158, SWE-135, SWE-063 |
|  | # of Cybersecurity vulnerabilities and weaknesses identified by life-cycle phase |  | X | X | X | X | X | SWE-158, SWE-159, SWE-135, SWE-063 |
|  | # of Cybersecurity vulnerabilities and weaknesses identified vs. # resolved during Implementation |  |  |  | X | X | X | SWE-155, SWE-158, SWE-159, SWE-135, SWE-063 |
|  | # of Non-Conformances identified in Cybersecurity coding standard compliance (Open, Closed) |  |  |  | X | X | X | SWE-063, SWE-135, SWE-155, SWE-158, SWE-207 |
| Static Code Metrics | Document the static code analysis tools used, with associated Non-Conformances; # of total errors and warnings identified by tool; # of errors and warnings evaluated vs. # of total errors and warnings identified by tool |  |  |  | X |  |  | SWE-135, SWE-158, SWE-185 |
|  | # of Non-Conformances raised by SA vs. total # of raised Non-Conformances |  |  |  | X |  |  | SWE-135, SWE-158, SWE-185 |
|  | # of static code errors and warnings identified as “positives” vs. # of total errors and warnings identified by tool; # of static code errors and warnings resolved by Severity vs. # of static code errors and warnings identified by Severity by tool |  |  |  | X |  |  | SWE-135, SWE-158, SWE-185 |
|  | # of static code “positives” over time (Open, Closed, Severity) |  |  |  | X | X |  | SWE-135, SWE-185 |
|  | # of Cybersecurity vulnerabilities and weaknesses identified by tool |  |  |  | X | X |  | SWE-135, SWE-158, SWE-185 |
|  | # of coding standard violations identified (Open, Closed, type of violation, Severity) |  |  |  | X |  |  | SWE-061, SWE-185 |
|  | Software cyclomatic complexity # for safety-critical components |  |  |  | X | X |  | SWE-087, SWE-134 |
|  | Trend of # of total errors and warnings identified per SCA tool, language, and SLOC size |  |  |  | X |  |  | SWE-135, SWE-158 |
| Test Procedures Metrics | # of Non-Conformances and risks open vs. # of Non-Conformances and risks identified with test procedures |  |  |  | X | X |  | SWE-087, SWE-065b, SWE-071, SWE-191 |
|  | # of hazards with completed test procedures/cases vs. total # of hazards over time |  |  |  | X | X |  | SWE-065b, SWE-068, SWE-191 |
|  | # of software requirements with completed test procedures/cases over time |  |  |  | X | X |  | SWE-065b, SWE-071, SWE-066 |
|  | # of Non-Conformances identified when the approved, updated requirements are not reflected in test procedures |  |  |  | X | X |  | SWE-065b, SWE-071, SWE-191 |
|  | # of Non-Conformances identified while confirming hazard controls are verified through test plans/procedures/cases |  |  |  | X | X |  | SWE-065b, SWE-071, SWE-066, SWE-068, SWE-191 |
| Test Metrics | # of tests executed vs. # of tests successfully completed |  |  |  | X | X | X | SWE-062, SWE-066, SWE-065c, SWE-068, SWE-159, SWE-190, SWE-192, SWE-055, SWE-080, SWE-191, SWE-194, SWE-211 |
|  | # of Non-Conformances identified during each testing phase (Open, Closed, Severity) |  |  |  | X | X | X | SWE-062, SWE-065c, SWE-066, SWE-068, SWE-080, SWE-159, SWE-190, SWE-191, SWE-192, SWE-194, SWE-211 |
|  | # of Requirements tested successfully vs. total # of Requirements |  |  |  | X | X | X | SWE-062, SWE-065b, SWE-065c, SWE-066, SWE-080, SWE-191, SWE-192, SWE-194 |
|  | # of Hazards containing software that have been successfully tested vs. total # of Hazards containing software |  |  |  | X | X | X | SWE-062, SWE-066, SWE-068, SWE-080, SWE-134, SWE-192, SWE-205 |
|  | # of Non-Conformances identified in models, simulations, and tools over time (Open, Closed, Severity) |  |  |  | X | X | X | SWE-070 |
|  | # of regression test set Non-Conformances/risks over time (Open, Closed, Severity) |  |  |  | X | X | X | SWE-191 |
|  | # of Requirements successfully tested in the customer environment vs. # of Requirements |  |  |  |  | X | X | SWE-055 |
| Cybersecurity Metrics During Testing | # of Cybersecurity mitigation implementations identified from the security vulnerabilities and security weaknesses |  |  |  | X | X | X | SWE-154, SWE-155, SWE-159, SWE-191 |
|  | # of Cybersecurity mitigation implementations identified with associated test procedures vs. # of Cybersecurity mitigation implementations identified |  |  |  | X | X | X | SWE-154, SWE-155, SWE-159, SWE-191 |
|  | # of Cybersecurity mitigation tests completed vs. total # of Cybersecurity mitigation tests |  |  |  | X | X | X | SWE-159 |
|  | # of Non-Conformances identified during Cybersecurity mitigation testing (Open, Closed, Severity) |  |  |  | X | X | X | SWE-159 |
|  | Trends of Cybersecurity Non-Conformances over time |  |  |  | X | X | X | SWE-158, SWE-159, SWE-155, SWE-063, SWE-135, SWE-201 |
| Test Coverage Metrics | Software code/test coverage percentages for all identified safety-critical components (e.g., # of paths tested vs. total # of possible paths) |  |  |  | X | X | X | SWE-066, SWE-190, SWE-134, SWE-189 |
|  | # of tests successfully completed vs. total # of tests |  |  |  | X | X | X | SWE-055, SWE-062, SWE-066, SWE-065c, SWE-068, SWE-080, SWE-159, SWE-190, SWE-191, SWE-192, SWE-194, SWE-211 |
|  | # of detailed software requirements tested to date vs. total # of detailed software requirements |  |  |  | X | X | X | SWE-055, SWE-062, SWE-065b, SWE-065c, SWE-066, SWE-071, SWE-080, SWE-191, SWE-192, SWE-194, SWE-159 |
|  | # of safety-critical requirement verifications vs. total # of safety-critical requirement verifications completed; # of Open issues vs. # Closed over time |  |  |  | X | X | X | SWE-062, SWE-066, SWE-191, SWE-192, SWE-068 |
|  | # of Source Lines of Code (SLOC) tested vs. total # of SLOC |  |  |  | X | X | X | SWE-066, SWE-134, SWE-189, SWE-190 |
| Build/Release Content Metrics | # of software units planned vs. # actually built |  |  | X | X | X | X | SWE-063, SWE-194 |
|  | # of planned software requirements implemented in each build vs. # of actual software requirements implemented in each build |  |  |  | X | X | X | SWE-063, SWE-194 |
|  | # of Non-Conformances identified in release documentation (Open, Closed) |  |  |  | X | X | X | SWE-063, SWE-077, SWE-084, SWE-085 |
| Operations/Maintenance Metrics | # of Non-Conformances identified in the software after delivery |  |  |  |  |  | X | SWE-195 |
| IV&V Metrics (Kept by IV&V) | Metrics on IV&V Non-Conformances (# of issues, risks, open, closed, severity, category (requirements, code, test, documentation, etc.)); kept by the IV&V team, not by SA | X | X | X | X | X | X | SWE-179, SASS-02 |

3. Resources

3.1 References

3.2 Tools

Tools to aid in compliance with this topic, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.


