1. Requirements
2.1.5.8 For Class A, B, and C software projects, the Center Director, or designee, shall utilize software measurement data for monitoring software engineering capability, improving software quality, and tracking the status of software engineering improvement activities.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
2. Rationale
What gets measured, gets managed. Software measurement programs are established to meet objectives at multiple levels and structured to satisfy particular organization, project, program, and Mission Directorate needs. The data gained from these measurement programs assist in managing projects, assuring quality, and improving overall software engineering practices.
3. Guidance
3.1 Organization's Measurement Program
Each organization is expected to develop its own measurement program (see SWE-090 - Management and Technical Measurements) that is tailored to its needs and objectives and is based on an understanding of its unique development environment. See also Topic 7.14 - Implementing Measurement Requirements and Analysis for Projects.
Some of the important features and advantages of measurements/metrics are:
- Motivation – Involving employees in the whole process of goal setting increases employee empowerment, which in turn increases job satisfaction and commitment.
- Better communication and coordination – Frequent reviews and interactions between superiors and subordinates help maintain harmonious relationships within the organization and resolve many problems.
- Clarity of goals:
– Subordinates tend to have a higher commitment to objectives they set for themselves than to those imposed on them by another person.
– Managers can ensure that the objectives of subordinates are linked to the organization's objectives.
– Everyone will have a common goal for the organization.
Software measurements are collected:
- For the benefit of the current project:
– Objective measurement data is used to plan, track, and correct the project.
- For the benefit of future projects across the Center:
– Help create a basis for planning future projects.
– Help understand what baseline performance is for similar projects.
– Provide organizational information to help improve software activities.
Software measurement data is collected and maintained:
- To force advanced, detailed planning.
- To help make development and management planning decisions consistent with the project scope and requirements.
- To provide objectivity in assessing progress, which is often difficult in the heat of the battle.
- To provide status relative to approved scope and requirements to support management control.
- To allow corrective action in time to prevent the “crisis” or to minimize the impact of the crisis.
- To improve the ability to estimate completion costs and schedule variances by analysis of data and trends.
A good measurement plan describes what data is to be collected, how it is to be collected, and how that measurement data is to be used. Components of a good measurement plan include the following (a brief sketch of how such a plan might be recorded appears after this list):
1. Measurement objectives such as:
– Organizational objectives: Improve cost estimation; improve the quality of delivered software.
– Project objectives: Deliver software on schedule and within cost, determine test completion, meet operational performance goals, deliver quality software, and identify project problems early.
2. The measures that will meet the objectives (and don’t forget measures for the process areas)
– Your project measures should include any measures that the organization needs.
– Choose project measures that help you manage your project.
– Consider the tools you are using and take advantage of metrics they may provide.
3. Descriptions of how the measures will be collected and stored
– Who collects the measures? How are they collected?
– Do we need some tools?
– Do we have a repository for the measures?
4. The analysis methods for each of the measures
– Do we have to do some additional computations on our measures to compare them?
– Do we need to look at several measures together to get a full understanding?
– What sorts of charts will show the answers to our questions?
– What are the expected values? How do we know if the measurement results indicate something bad or good?
5. Communication of the measurement results
– What results should the team be aware of?
– What are the key measurement results that need to be reported to management?
6. Commitment to the measurement plan from your team and your management
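As a concrete illustration of items 2 through 5 above, the sketch below records a measurement plan as data and applies a simple limit check during analysis. This is a minimal sketch only; the measure names, collectors, repositories, and thresholds are hypothetical examples, not values prescribed by NPR 7150.2 or this Handbook.

```python
from dataclasses import dataclass

@dataclass
class MeasureSpec:
    name: str            # what is collected
    objective: str       # the objective the measure supports
    collector: str       # who or what collects it
    repository: str      # where the measure is stored
    expected_max: float  # pre-established limit used during analysis

# Hypothetical plan entries; names and thresholds are examples only.
PLAN = [
    MeasureSpec("open_severity_1_problem_reports", "deliver quality software",
                "problem tracking tool export", "project metrics database", 0.0),
    MeasureSpec("schedule_variance_pct", "deliver software on schedule",
                "monthly status report", "project metrics database", 10.0),
]

def evaluate(spec: MeasureSpec, value: float) -> str:
    """Compare a collected value against its pre-established limit."""
    status = "within limits" if value <= spec.expected_max else "report to management"
    return f"{spec.name} = {value} ({status})"

for spec, value in zip(PLAN, [2.0, 4.5]):
    print(evaluate(spec, value))
```

A machine-readable plan like this also makes the communication step (item 5) straightforward: results flagged "report to management" can be pulled directly into status reports.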
See also SWE-094 - Reporting of Measurement Analysis, 5.05 - Metrics - Software Metrics Report.
The MSFC Flight & Ground Software Division Organizational Metrics Plan 497 provides an example of mapping organizational goals/objectives to metrics.
The specific measurement systems for particular programs and projects enable reporting on the minimum requirements categories listed in SWE-091 - Establish and Maintain Measurement Repository, and repeated here (a notional grouping in code follows the list):
- Software development tracking data.
- Software functionality achieved data.
- Software quality data.
- Software development effort and cost data.
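The grouping below is a notional illustration of how collected measures might be rolled up under these four categories. The category names come from SWE-091; the individual measure names are hypothetical examples, not a required set.

```python
# Notional grouping of example measures under the four SWE-091 reporting
# categories listed above. The measure names are illustrative only.
REPORTING_CATEGORIES = {
    "software development tracking": ["milestones_completed", "schedule_variance_pct"],
    "software functionality achieved": ["requirements_implemented", "requirements_verified"],
    "software quality": ["open_problem_reports", "peer_review_defects"],
    "software development effort and cost": ["staff_hours", "cost_variance_pct"],
}

def report_rollup(collected: dict) -> dict:
    """Group collected measure values by reporting category."""
    return {category: {name: collected[name] for name in names if name in collected}
            for category, names in REPORTING_CATEGORIES.items()}

# Example: only some measures have been collected this reporting period.
print(report_rollup({"staff_hours": 640.0, "open_problem_reports": 12}))
```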
In "12 Steps to Useful Software Metrics," Westfall describes targeting goals at multiple levels (Step 2, "Target Goals"):
"At the organizational level, we typically examine high-level strategic goals like being the low cost provider, maintaining a high level of customer satisfaction, or meeting projected revenue or profit margin target. At the project level, we typically look at goals that emphasize project management and control issues or project level requirements and objectives. These goals typically reflect the project success factors like on time delivery, finishing the project within budget or delivering software with the required level of quality or performance. At the specific task level, we consider goals that emphasize task success factors. Many times these are expressed in terms of the entry and exit criteria for the task." 355
Once these software measurement systems are established, the software measures and analysis results that emanate from them enable:
- Assessments and monitoring of the abilities and skills in the workforce.
- Identification of improvements in software quality.
- Identification of opportunities for improving software development.
- Status tracking for software engineering improvement activities.
Metrics (or indicators) are computed from measures using the approved analysis procedures (see SWE-093 - Analysis of Measurement Data). They are quantifiable indices used to compare software products, processes, or projects, or to predict their outcomes. They show trends of increasing or decreasing values relative to the previous value of the same metric, and they show containment or breaches of pre-established limits, such as allowable latent defects. Management metrics are measurements that help evaluate how well software development activities are performing across multiple development organizations or projects. Trends in management metrics support forecasts of future progress, early detection of trouble, and realistic plan adjustments, provided the candidate approaches are quantified and analyzed.
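The two analyses named above, trend relative to the previous value and containment of a pre-established limit, can be sketched as follows. This is illustrative only; the metric history and the limit value are hypothetical.

```python
# Sketch of two analyses: trend relative to the previous value of the same
# metric, and breach of a pre-established limit (e.g., allowable latent
# defects). The history values and the limit are hypothetical.

def trend(history):
    """Direction of the latest value relative to the previous one."""
    if len(history) < 2 or history[-1] == history[-2]:
        return "flat"
    return "increasing" if history[-1] > history[-2] else "decreasing"

def breaches_limit(history, limit):
    """True if the latest value exceeds the pre-established limit."""
    return bool(history) and history[-1] > limit

latent_defects = [3, 4, 4, 6]              # hypothetical per-build counts
print(trend(latent_defects))               # increasing
print(breaches_limit(latent_defects, 5))   # True: flag for corrective action
```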
Measurements useful for monitoring software engineering capability include:
- Training metrics.
- Training opportunity metrics, including cross-training opportunities.
- Workforce experience level data.
- Development life cycle model metrics (e.g., number of personnel working on Agile projects vs. traditional waterfall).
Measurements useful for improving software quality include the following (a brief computational sketch follows the list):
- The number of software Problem Reports/Change Requests (new, open, closed, severity).
- Review of item discrepancies (open, closed, and withdrawn).
- The number of software peer reviews/inspections (planned vs. actual).
- Software peer review/inspection information (e.g., effort, review rate, defect data).
- The number of software audits (planned vs. actual).
- Software audit findings information (e.g., number and classification of findings).
- Software risks and mitigations.
- The number of requirements verified or status of requirements validation.
- Results from static code analysis tools.
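The sketch below illustrates how two of the measures above might be computed: problem report status counts and peer review defect density. The record fields, sample data, and the per-page density formula are hypothetical examples, not a prescribed method.

```python
from collections import Counter

# Hypothetical problem report records; field names are examples only.
problem_reports = [
    {"id": 1, "state": "open",   "severity": 2},
    {"id": 2, "state": "closed", "severity": 4},
    {"id": 3, "state": "open",   "severity": 1},
]

def status_counts(reports):
    """Count problem reports by state (e.g., new/open/closed)."""
    return Counter(report["state"] for report in reports)

def defect_density(defects_found, pages_reviewed):
    """Peer review defect density, expressed as defects per reviewed page."""
    return defects_found / pages_reviewed if pages_reviewed else 0.0

print(status_counts(problem_reports))  # Counter({'open': 2, 'closed': 1})
print(defect_density(12, 40))          # 0.3 defects per page
```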
Measurements useful for tracking the status of software engineering improvement activities include:
- CMMI assessment findings and results.
- Productivity metric improvements.
- Defect metric improvements.
- Software cost data.
- Goals or objectives of the software engineering improvement activities that have been completed or the status of activities associated with the goals or objectives.
- Workforce metrics.
See also SWE-018 - Software Activities Review regarding the use of measurement data in software activities reviews.
See also SWE-040 - Access to Software Products regarding access to measurement data.
3.2 Software Assurance Metrics
8.18 - SA Suggested Metrics - This topic contains the complete list of software assurance/safety metrics suggested for use with the SA tasks in NASA-STD-8739.8.
These suggested metrics give SA and safety personnel much of the information they need to assess the software assurance/safety work and to help monitor software engineering progress.
The Metrics Table tab shows the set of metrics in a table and provides a downloadable Excel file. The Excel file contains the same metrics but allows filtering by column, making it easy to see which metrics are associated with a particular SWE and to filter out SWEs that may be tailored out of your project. These tables also indicate the potential phases in which each metric might be collected. Be sure to read the information at the top of each metrics sheet for more specific guidance on using the tables.
Each project should review the table and decide which metrics they want to collect, based on the metrics that they think will provide them with the most information for their activities.
3.3 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.4 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
No additional guidance is available for small projects.
5. Resources
5.1 References
- (SWEREF-157) CMMI Development Team (2010). CMMI for Development, Version 1.3, CMU/SEI-2010-TR-033, Software Engineering Institute.
- (SWEREF-252) Mills, Everald E. (1988). Software Metrics, SEI Curriculum Module SEI-CM-12-1.1, Carnegie Mellon University, Software Engineering Institute. Retrieved December 2017 from http://www.sei.cmu.edu/reports/88cm012.pdf.
- (SWEREF-355) Westfall, Linda (2005). 12 Steps to Useful Software Metrics, The Westfall Team. Retrieved November 3, 2014 from http://www.win.tue.nl/~wstomv/edu/2ip30/references/Metrics_in_12_steps_paper.pdf.
- (SWEREF-367) IEEE Computer Society, Software Engineering Standards Committee. IEEE Std 982.1™-2005 (Revision of IEEE Std 982.1-1988).
- (SWEREF-430) Basili, V. R., et al. (May 2002). Lessons Learned from 25 Years of Process Improvement: The Rise and Fall of the NASA Software Engineering Laboratory, University of Maryland, College Park, Experimental Software Engineering Group (ESEG).
- (SWEREF-497) MSFC Flight & Ground Software Division (ES50) Organizational Metrics Plan, EI32-OMP Revision D, April 29, 2013. This NASA-specific resource is available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
5.2 Tools
NASA users can find tools in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only; it does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. Its purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
- Much of NASA's software development experience gained in the NASA/GSFC Software Engineering Laboratory (SEL) is captured in "Lessons Learned from 25 Years of Process Improvement: The Rise and Fall of the NASA Software Engineering Laboratory" 430. The document describes numerous lessons learned that are applicable to the Agency's software development activities. From their early studies, the SEL was able to build models of its environment and develop profiles for the organization. One of the key lessons in the document is "Lesson 6: The accuracy of the measurement data will always be suspect, but you have to learn to live with it and understand its limitations."