- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. The Requirement
5.4.3 The project manager shall analyze software measurement data collected using documented project-specified and Center/organizational analysis procedures.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
2. Rationale
NASA software measurement programs are now being designed (SWEREF-297) to provide the specific information necessary to manage software products, projects, and services. Center/organizational, project, and task goals (see SWE-090) are determined in advance and then measurements and metrics are selected (see SWE-091) based on those goals. These software measurements are used to make effective management decisions as they relate to established goals. Documented procedures are used to calculate and analyze metrics that indicate overall effectiveness in meeting the goals.
Typically, the effectiveness of the project in producing a quality product is characterized by measurement levels associated with the previously chosen metric. The use of measurement functions and analysis procedures that are chosen in advance helps assure that Center/organizational goals are being addressed.
3. Guidance
SWE-093 requires the analysis of the collected software measurements using the documented project-specified and Center/organizational analysis procedures. Implicit in the requirement is the need to investigate, evaluate, and select the appropriate analysis procedures and software metrics. The Software Development Plan (SDP) or Software Management Plan (see 5.08 - SDP-SMP - Software Development - Management Plan) lists software metrics as part of the SDP content, indicating the need to develop the software metrics for the project early in the software development life cycle. The evolution of the software development project and its requirements may necessitate a similar evolution in the required software measures and software metrics (see SWE-092).
Metrics can be classified as primitive metrics (or base metrics) or as derived metrics (or computed metrics). Primitive metrics are those that can be directly measured or observed, such as program size (in source lines of code (SLOC)), the number of defects observed in unit testing, or the total development time for the project. Computed metrics are those that cannot be directly measured but are computed in some manner from software measures or other metrics. Examples of computed metrics are those commonly used to measure productivity, such as SLOC produced per person-month, or the number of defects per thousand lines of code (KSLOC). Computed metrics are combinations of software measures or other metric values and are often more valuable in understanding or evaluating the software development process than primitive metrics alone. (SWEREF-252)
It is important to understand that the analysis procedure defines how we are going to calculate the project's software metrics. As stated above, the primitive metrics are measured directly and their analysis procedure may consist of converting them to a simple plot, bar chart, or table. Examples of software measures or primitive metrics may include the number of lines of code reviewed during an inspection, or the hours spent preparing for an inspection meeting.
The derived metrics (usually more complex, but not always) are calculated using the analysis procedures (e.g., mathematical combinations, equations, or algorithms) of the base software measures or other derived measures. An example of a derived metric would be the peer inspection's preparation rate, modeled as the number of lines of code reviewed divided by the number of preparation and review hours.
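To make the primitive/derived distinction concrete, the following minimal Python sketch computes two derived metrics from primitive measures. All measure names and values are hypothetical illustrations, not prescribed measures.

```python
# Minimal sketch: deriving computed metrics from primitive measures.
# All names and values below are hypothetical illustrations.

# Primitive (base) measures, captured directly from project records.
sloc_reviewed = 450    # lines of code covered by one peer inspection
prep_hours = 6.0       # hours spent preparing for the inspection
review_hours = 3.0     # hours spent in the inspection meeting
total_ksloc = 42.0     # total project size in thousands of SLOC
total_defects = 315    # defects logged across the project

# Derived (computed) metrics, produced by the documented analysis procedure.
prep_rate = sloc_reviewed / (prep_hours + review_hours)   # SLOC per hour
defect_density = total_defects / total_ksloc              # defects per KSLOC

print(f"Inspection preparation rate: {prep_rate:.1f} SLOC/hour")
print(f"Defect density: {defect_density:.1f} defects/KSLOC")
```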
Many analysis procedures produce metrics with an element of simplification. This is both the strength and weakness of using metrics. When we create a model to use as our analysis procedure, we need to be pragmatic. If we try to include all of the software measures that might affect the metric used to characterize the software product, the model can become so complicated that it is useless. Being pragmatic means not trying to create the most comprehensive analysis procedure. It means picking the aspects that are the most important. Remember that the analysis procedures can always be modified to include additional levels of detail in the future.
Ask yourself these questions:
- Does the analysis procedure produce more information than we have now?
- Is the resulting information of practical benefit?
- Does it tell us what we want to know?
- Does it help satisfy the goals and objectives of the software measurement program?
The importance of defining software measures and their associated analysis procedures can be illustrated by considering the lines of code metric. "Lines of Code" is one of the most used, and most often misused, of all software metrics. (The problems, variations, and anomalies of using lines of code were well documented many years ago (SWEREF-236).) There is no industry-accepted standard for counting lines of code. Therefore, if you are going to use a metric based on lines of code, a specific measurement method must be defined. Include a description of this metric in all reports and analyses so that stakeholders (customers and managers) understand the definition of the metric. Without this, invalid comparisons with other data are almost inevitable. (SWEREF-137)
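The sketch below illustrates why a counting convention must be documented: two simplified, hypothetical counting rules applied to the same source fragment yield very different "lines of code" values. The rules shown are assumptions for illustration, not an endorsed counting standard.

```python
# Minimal sketch: two different "lines of code" counting conventions applied
# to the same source text. The counting rules are deliberately simplified;
# a real measurement procedure would document its rules in full
# (comment handling, continuations, declarations, etc.).

def physical_sloc(source: str) -> int:
    """Count non-blank physical lines, skipping C-style '//' comment lines."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("//"):
            count += 1
    return count

def logical_sloc(source: str) -> int:
    """Approximate logical statements by counting ';' terminators."""
    return source.count(";")

sample = """
// compute absolute value
int v =
    x < 0
        ? -x
        : x;
"""

print("physical:", physical_sloc(sample))  # 4 lines
print("logical:", logical_sloc(sample))    # 1 statement
```

The same fragment counts as 4 physical lines but only 1 logical statement, which is exactly the kind of discrepancy that led to the misinterpretation described in the Lessons Learned section below.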
There are two basic approaches for selecting an analysis procedure to use for producing the software metric(s): (1) use an existing model, or (2) create a new one. In many cases, there is no need to "re-invent the wheel."
Many software analysis procedures exist that other organizations have used successfully. These are documented in the current literature and prior NASA software development projects. With a little research, you can identify many candidate analysis procedures that require little or no adaptation to match your own project needs and environments. For example, review the material in the NASA Software Measurement Guidebook (SWEREF-329) to see previously used software metrics. The analysis procedures to develop these metrics are typically straightforward.
The second approach is to create your own model. The best advice here is to talk to the people who are responsible for the software product or the software development processes and procedures. They are the experts; they know what factors are important. With their assistance, the key software measures, analysis procedures, and resulting software metrics you will need to support and meet the project's specific measurement objectives will become apparent (see SWE-090). If you create a new analysis procedure for calculating the project's software metrics, ensure the analysis procedure is intelligible to your customers and management chain. You must also show that it is a valid analysis procedure for what you are trying to measure. Sometimes this validation can occur only through the application of statistical techniques (SWEREF-137) or through application to previous software development projects in your local environment.
Good metrics facilitate the development of models that are capable of predicting process or product parameters, not just describing them.
As you analyze the collected software measurement data, keep in mind that ideal metrics are:
- Traceable to an organizational or project objective.
- Simple, precisely definable.
- Objective.
- Easily obtainable (i.e., at a reasonable cost).
- Valid (the metric effectively measures what it is intended to measure).
- Robust (the metric is relatively insensitive to insignificant changes in the process or product).
Good metrics may have different types of attributes:
- Control type: Used to monitor the software processes, products, and services; and identify areas where corrective or management action is required.
- Evaluate type: Used to examine and analyze the measurement information as part of the decision-making processes.
- Understand and predict type: Used to predict the future status of the software development activity; adds to the level of confidence in the quality of the software product.
When analyzing the collected software measures, ask the following questions (a brief data-quality sketch follows the list):
- Are the software measurement sets complete?
- Is the data objective or subjective?
- What is the integrity and accuracy of the data?
- How stable is the production process being measured?
- What is the variation in the data set?
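A minimal sketch of such data-quality checks, assuming a hypothetical weekly defect-count measure and a simplified two-standard-deviation stability screen (control charts are the fuller treatment):

```python
# Minimal sketch: simple data-quality checks on a collected measure set
# before analysis. The measure set and screening rule are hypothetical.
import statistics

weekly_defect_counts = [4, 6, 5, None, 7, 30, 5]  # None marks a missing report

reported = [v for v in weekly_defect_counts if v is not None]
completeness = len(reported) / len(weekly_defect_counts)
mean = statistics.mean(reported)
stdev = statistics.stdev(reported)

print(f"Completeness: {completeness:.0%}")
print(f"Mean: {mean:.1f}, standard deviation: {stdev:.1f}")

# Flag observations far from the mean for investigation (a simplified
# stability check, not a substitute for a proper control chart).
outliers = [v for v in reported if abs(v - mean) > 2 * stdev]
print("Values needing investigation:", outliers)
```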
Software metrics can provide the information needed by engineers for technical decisions as well as information required by management (SWEREF-355). According to the International Organization for Standardization (ISO) / International Electrotechnical Commission (IEC) 15939, Software Engineering – Software Measurement Process, decision criteria are the thresholds, targets, or patterns used to determine the need for action or further investigation. (SWEREF-378)
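As a sketch of decision criteria in this sense, the following hypothetical example maps a defect-density metric to tiered responses. The threshold values and responses are illustrative assumptions, not Agency-defined criteria.

```python
# Minimal sketch of decision criteria in the ISO/IEC/IEEE 15939 sense:
# thresholds that determine whether a metric value calls for action.
# The metric, threshold values, and responses are hypothetical.

def evaluate(defect_density: float) -> str:
    """Map a defects/KSLOC value to a management response."""
    if defect_density <= 5.0:
        return "within target: no action"
    if defect_density <= 8.0:
        return "above target: investigate contributing causes"
    return "above threshold: corrective action required"

for value in (3.2, 6.7, 9.4):
    print(f"{value:>4} defects/KSLOC -> {evaluate(value)}")
```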
Center and organization analysis procedures may include reporting and distribution functions. This includes defining the report format (tables, trend lines, bar graphs), the data extraction and reporting cycle (dates, triggers for collection, exception basis), reporting mechanisms (hard copy, online Database Management System (DBMS)), distribution (email blasts, management chain), and availability. Software measurement data collection cycles may or may not be the same as the data reporting cycles. Alternatively, the data collection and storage procedures may capture the descriptions of these reporting and distribution functions (see SWE-092).
Additional guidance related to software measurements may be found in related requirements in this Handbook (e.g., SWE-090, SWE-091, and SWE-092).
4. Small Projects
While SWE-091 may limit the type and frequency of the measures to be recorded, the project still needs to collect and analyze the selected software measurement data to develop the key software metrics. Using previously defined analysis procedures can help a project reduce the time and effort needed to develop procedures. Also, certain development environments, such as JIRA (see section 5.2, Tools) and its associated plug-ins, or configuration management systems, can help automate the collection and distribution of information associated with the analysis of development metrics.
5. Resources
5.1 References
- (SWEREF-137) Boeing Houston Site Metrics Manual, HOU-EGM-308, January 17, 2002. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
- (SWEREF-146) Capers Jones, 2007, McGraw Hill, New York.
- (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of Processes, Procedures, Job Aids, Examples, and other recommended best practices.
- (SWEREF-222) IEEE STD 610.12-1990, 1990. NASA users can access IEEE standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of IEEE standards.
- (SWEREF-236) Jones, Capers (1986), McGraw Hill, New York.
- (SWEREF-252) Mills, Everald E. (1988). Carnegie Mellon University, Software Engineering Institute. Retrieved December 2017 from http://www.sei.cmu.edu/reports/88cm012.pdf.
- (SWEREF-297) Paul Goodman (1993). London: McGraw Hill. Available in the ACM Digital Library at https://dl.acm.org/citation.cfm?id=541761
- (SWEREF-329) NASA Software Measurement Guidebook, Technical Report NASA-GB-001-94, Doc ID: 19980228474 (Acquired Nov 14, 1998), Software Engineering Program.
- (SWEREF-336) Software Technology Support Center (STSC) (1995), Hill Air Force Base. Accessed 6/25/2019.
- (SWEREF-355) Westfall, Linda, The Westfall Team (2005), Retrieved November 3, 2014 from http://www.win.tue.nl/~wstomv/edu/2ip30/references/Metrics_in_12_steps_paper.pdf
- (SWEREF-367) IEEE Computer Society, Sponsored by the Software Engineering Standards Committee, IEEE Std 982.1™-2005 (Revision of IEEE Std 982.1-1988).
- (SWEREF-378) ISO/IEC/IEEE 15939:2017(en) NASA users can access ISO standards via the NASA Technical Standards System located at https://standards.nasa.gov/. Once logged in, search to get to authorized copies of ISO standards.
- (SWEREF-567) Public Lessons Learned Entry: 1772.
5.2 Tools
NASA users can find these tools in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following related to capturing software measures in support of Center/organizational needs:
- Know How Your Software Measurement Data Will Be Used, Lesson No. 1772 (SWEREF-567): Before the Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The cost office inputs this data to its parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation, along with physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach an agreement on the correct estimate.
Before submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
- Confirm software measurement data analysis conforms to documented analysis procedures.
- Analyze software assurance measurement data collected.
7.2 Software Assurance Products
- SA metrics analysis results relating to software meeting or exceeding requirements, including any risks or issues.
Objective Evidence
- Software measurement or metric data
- Trends and analysis results on the metric set being provided
- Status presentations showing metrics and trending data
- Software assurance audit reports on software metric processes
7.3 Metrics
- Measures relating to status and performance as identified in other requirements (schedule deviations, closure of corrective actions, product and process audit results, peer review results, etc.)
7.4 Guidance
Task 1
Software assurance reviews the software development plan/software management plan or the measurement plan to confirm that the project has chosen documented analysis procedures from an Agency, Center, or project library or has developed project-specific analysis procedures for the measures they have chosen to collect. Confirm that the software measurement analysis the project has done on its collected measures has followed the documented project analysis procedures. When the project measures exceed a documented threshold, verify that the project has examined the potential root causes of the variation and has chosen a corrective action to prevent further problems.
Task 2
Software assurance will take the software assurance measurement data and analyze it using the software assurance measurement analysis procedures documented in the software assurance plan. Measurement trends, potential root causes of problem areas, and potential corrective actions should receive special attention. An in-depth analysis of the data and trends should be done to understand the causes of any undesirable trends or indicators; understanding these causes is key to determining how to make corrections and improve software assurance performance. For example, if the charts for software assurance activities performed versus software assurance activities planned show that many planned activities have not been performed on schedule, there could be many reasons why (not enough staff to perform all planned activities, the project is behind schedule so planned activities could not be completed, software assurance was focused on unplanned work, etc.). Based on the analysis, corrective actions can be planned to improve assurance work. More information on analyzing the measurement data can be found in the guidance for this software requirement.
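A minimal sketch of the planned-versus-performed comparison described above, using hypothetical monthly counts and an illustrative 85 percent investigation trigger:

```python
# Minimal sketch: trending software assurance activities performed against
# activities planned, per reporting period. Counts and the 85% trigger
# are hypothetical assumptions for illustration.

planned =   [10, 10, 12, 12, 14]   # SA activities planned per month
performed = [10,  9, 10,  8,  9]   # SA activities completed per month

for month, (plan, done) in enumerate(zip(planned, performed), start=1):
    ratio = done / plan
    flag = "  <-- investigate cause" if ratio < 0.85 else ""
    print(f"Month {month}: {done}/{plan} performed ({ratio:.0%}){flag}")
```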