SWE-094 - Reporting of Measurement Analysis

1. Requirements

4.4.5 The project shall report measurement analysis results periodically and allow access to measurement information by Center-defined organizational measurement programs.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C through E, Safety Critical, are labeled "P (Center) + SO." This means that this requirement applies to the safety-critical aspects of the software and that an approved Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class C, Not Safety Critical, and Class G are labeled "P (Center)." This means that a Center-defined process which meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class F is labeled "X (not OTS)." This means that this requirement does not apply to off-the-shelf (OTS) software for this class.

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X = Applicable with details, read above for more | P(C) = P (Center), follow Center requirements or procedures

2. Rationale

The intent of this requirement is that the software development project organizations provide or allow access to the software metric data during the project life cycle by those Center-defined organizations responsible for assessing and utilizing the metric data. Access can be provided to a number of organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner to allow usage of the information.

3. Guidance

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at the project level and also at the Mission Directorate and Mission Support levels (see SWE-095 and SWE-096) to satisfy organizational, project, program, and Directorate needs. Centers have measurement systems and record repositories that are used for records retention, and for subsequent analysis and interpretation to identify overall levels of Center competence in software engineering and future opportunities for training and software process improvements. The data gained from these measurement programs assist in the management of projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Mission Directorate/Mission Support Office-level measurement programs are designed to meet a common set of high-level goals.

Information generated to satisfy other requirements in NPR 7150.2 (see SWE-090, SWE-091, SWE-092, and SWE-093) provides the software measurement data and the analysis methods used to produce the results reported under SWE-094. The information is recorded in a Software Metrics Report.

Metrics (or indicators) are computed from measures using approved and documented analysis procedures. They are quantifiable indices used to compare software products, processes, or projects, or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as allowable latent defects.
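Since these computations are simple, a short script can make the trend and limit checks concrete. Below is a minimal sketch, assuming a latent-defect metric already computed per reporting period; the data, names, and limit value are illustrative only, not from NPR 7150.2:

```python
# Minimal sketch: report a metric's trend and check a pre-established limit.
# All names, values, and the limit are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MetricPoint:
    period: str    # reporting period, e.g. "2024-Q1"
    value: float   # metric value computed from raw measures

# Hypothetical latent-defect metric per reporting period.
latent_defects = [
    MetricPoint("2024-Q1", 12.0),
    MetricPoint("2024-Q2", 9.0),
    MetricPoint("2024-Q3", 14.0),
]

ALLOWABLE_LATENT_DEFECTS = 10.0  # pre-established limit (illustrative)

def report(points: list[MetricPoint], limit: float) -> None:
    previous = None
    for p in points:
        # Trend is relative only to the previous value of the same metric.
        if previous is None:
            trend = "n/a"
        else:
            trend = "up" if p.value > previous else "down" if p.value < previous else "flat"
        status = "BREACH" if p.value > limit else "contained"
        print(f"{p.period}: value={p.value:g} trend={trend} limit={limit:g} -> {status}")
        previous = p.value

report(latent_defects, ALLOWABLE_LATENT_DEFECTS)
```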
Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts of future progress, early detection of trouble, and realism in current plans. In addition, adjustments to software development processes can be evaluated once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center leaders for evaluating overall Center capabilities and for planning improvement activities and training opportunities for more advanced software process capabilities.

To ensure the measurement analysis results are communicated properly, on time, and to the appropriate people, the project develops reporting procedures for the specified analysis results (see SWE-091 and SWE-093) and makes these analysis results available on a regular basis (as designed) to the appropriate distribution systems. The project reports analysis results periodically according to established collection and storage procedures (see SWE-092) and the reporting procedures developed according to this SWE. These reporting procedures are contained in the Software Development or Management Plan (see SWE-102).
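Where collection and storage are automated (see SWE-092), assembling the periodic report can itself be scripted. The following is a minimal sketch, assuming measures are stored in a CSV file with metric, period, and value columns; the file names, layout, and report format are assumptions for illustration, not a prescribed Software Metrics Report format:

```python
# Minimal sketch: assemble a periodic Software Metrics Report from stored
# measures and write it where the designed distribution can pick it up.
# File locations, column names, and report layout are assumptions.

import csv
import datetime
from pathlib import Path

MEASURES_CSV = Path("measures.csv")  # columns assumed: metric,period,value
REPORT_DIR = Path("reports")         # hand-off point for distribution (assumed)

def build_report() -> str:
    rows = list(csv.DictReader(MEASURES_CSV.open()))
    lines = [f"Software Metrics Report - generated {datetime.date.today().isoformat()}", ""]
    # Group values by metric so each metric's history reads as a trend line.
    by_metric: dict[str, list[str]] = {}
    for row in rows:
        by_metric.setdefault(row["metric"], []).append(f"{row['period']}={row['value']}")
    for metric, history in sorted(by_metric.items()):
        lines.append(f"{metric}: " + ", ".join(history))
    return "\n".join(lines)

if __name__ == "__main__":
    REPORT_DIR.mkdir(exist_ok=True)
    out = REPORT_DIR / f"metrics_{datetime.date.today().isoformat()}.txt"
    out.write_text(build_report())
    print(f"wrote {out}")
```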
4. Small Projects

Data reporting activities may be restricted to measures that support safety and quality assessments and the overall organization's goals for software process improvement activities. Data reporting timing may be limited to annual or major review cycles. Small projects may consider software development environments or configuration management systems that contain automated collection, tracking, and storage of measurement data. Many projects within NASA have been using the JIRA environment (see section 5.1, Tools) with a variety of plug-ins that help capture measurement data associated with software development.
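As one illustration of such automated collection, a single measurement (an open-defect count) can be pulled from a JIRA server through its REST search endpoint. This is a sketch under assumptions: the server URL, JQL query, and credentials are placeholders, and real projects typically rely on JIRA plug-ins rather than hand-rolled scripts:

```python
# Minimal sketch: pull one measurement (open defect count) from a JIRA
# server via its REST search endpoint. Server URL, project key, and
# credentials below are placeholders, not real NASA endpoints.

import requests

JIRA_BASE = "https://jira.example.nasa.gov"  # placeholder URL
JQL = "project = FSW AND issuetype = Bug AND status != Closed"  # example query

def open_defect_count(session: requests.Session) -> int:
    resp = session.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": JQL, "maxResults": 0},  # maxResults=0: totals only
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["total"]  # JIRA returns the matching-issue total here

# Usage (credentials are placeholders):
# session = requests.Session()
# session.auth = ("user", "api-token")
# print("open defects:", open_defect_count(session))
```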
5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to reporting of measurement analysis. Consider these lessons when capturing software measures in support of Center/organizational needs:
"Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics: 577
Another lesson learned, in its Recommendation No. 8, provides this step as well as other steps to mitigate the risk from defects in the flight software (FSW) development process:
"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures include metrics such as the percentage of code, requirements, and defined faults tested, and the percentage of tests passed in both simulation and test bed environments. These measures also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and test bed environments, and where they have successfully passed all stress tests." 572
The Recommendation states that "Prior to submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's." 567