

SWE-094 - Reporting of Measurement Analysis

1. Requirements

4.4.5 The project shall report measurement analysis results periodically and allow access to measurement information by Center-defined organizational measurement programs.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes C-E and Safety Critical are labeled "P (Center) + SO." This means that the requirement applies to the safety-critical aspects of the software and that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to meet the intent of this requirement.

Class C Not Safety Critical and Class G are labeled "P (Center)." This means that a Center-defined process which meets a non-empty subset of the full requirement can be used to meet the intent of this requirement.

Class F is labeled "X (not OTS)." This means that the requirement does not apply to off-the-shelf (OTS) software for this class.

Class       | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H
Applicable? |      |       |      |       |  X   | P(C)  |  X   |       |  X   |       |  X  | P(C) |

Key:  A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ...
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

The intent of this requirement is that the software development project organizations provide, or allow access to, the software metric data during the project life cycle for those Center-defined organizations responsible for assessing and utilizing the metric data. Access can be provided to a number of organizations tasked to review or assess software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in a timely manner to allow usage of the information.

NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at the project and also at the Mission Directorate and Mission Support levels (see SWE-095 and SWE-096) to satisfy organizational, project, program and Directorate needs. Centers have measurement systems and record repositories that are used for records retention, and for subsequent analysis and interpretation to identify overall levels of Center competence in software engineering, and to identify future opportunities for training and software process improvements.

The data gained from these measurement programs assist in managing projects, in assuring and improving safety and product and process quality, and in improving overall software engineering practices. In general, the project-level and Mission Directorate/Mission Support Office-level measurement programs are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.

Information generated to satisfy other requirements in NPR 7150.2 (see SWE-090, SWE-091, SWE-092, and SWE-093) provides the software measurement data and the analysis methods used to produce the results reported under this SWE-094. The information is recorded in a Software Metrics Report.

3. Guidance

Metrics (or indicators) are computed from measures using approved and documented analysis procedures. They are quantifiable indices used to compare software products, processes, or projects or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as allowable latent defects.

Management metrics are measurements that help evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in current plans. In addition, adjustments to software development processes can be evaluated once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center leaders for evaluating overall Center capabilities and for planning improvement activities and training opportunities for more advanced software process capabilities.
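
To make the trend and limit checks described above concrete, here is a minimal sketch in Python; the metric name, limit, and monthly values are hypothetical placeholders rather than data from any NASA project:

# Minimal sketch: trend and limit checking for one management metric.
# The metric name, limit, and monthly values are hypothetical placeholders,
# not data from any NASA project or Center repository.

ALLOWABLE_OPEN_DEFECTS = 25                  # pre-established limit (hypothetical)
monthly_open_defects = [12, 15, 19, 23, 27]  # one value per reporting period

def trend(values):
    """Return the direction of the latest value relative to the previous one."""
    if len(values) < 2 or values[-1] == values[-2]:
        return "flat"
    return "increasing" if values[-1] > values[-2] else "decreasing"

def limit_breached(values, limit):
    """Report whether the latest value exceeds the pre-established limit."""
    return values[-1] > limit

print("Open-defect trend:", trend(monthly_open_defects))
if limit_breached(monthly_open_defects, ALLOWABLE_OPEN_DEFECTS):
    print("Limit breached:", monthly_open_defects[-1], ">", ALLOWABLE_OPEN_DEFECTS)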

To ensure the measurement analysis results are communicated properly, on time, and to the appropriate people, the project develops reporting procedures for the specified analysis results (see SWE-091 and SWE-093). It makes these analysis results available on a regular basis (as designed) to the appropriate distribution systems.

Things to consider when developing these reporting methods and making the results available to others include the following (a brief configuration sketch follows the list):

  • Stakeholders: Who receives which reports? What level of reporting depth is needed for particular stakeholders? Software developers and software testers (the collectors of the software measures) may only need a brief summary of the results, or maybe just a notification that results are complete and posted online where they may see them. Task and project managers will be interested in the status and trending of the project's metrics (these may be tailored to the level and responsibilities of each manager). Center leaders may need only normalized values that assist in evaluations of Center competence levels overall (which in turn provides direction for future training and process improvement activities).
  • Management chain: Does one detailed report go to every level? Will the analysis results be tailored (abbreviated, synopsized) at progressively higher levels in the chain? Are there multiple chains (engineering, projects, safety and mission assurance)?
  • Timing and periodicity: Are all results issued at the same frequency? Are weekly, monthly, or running averages reported? Are some results issued only upon request? Are results reported as deltas or cumulative?
  • Formats for reports: Are spreadsheet-type tools used (bar graphs, pie charts, trend lines)? Are statistical analyses performed? Are hardcopy, PowerPoint, or email/online tools used to report information?
  • Appropriate level of detail: Are only summary results presented? Are there criteria in place for deciding when to go to a greater level of reporting (trend line deterioration, major process change, mishap investigation)? Who approves data format and release levels?
  • Specialized reports: Are there capabilities to run specialized reports for individual stakeholders (safety-related analysis, interface-related defects)? Can reports be run outside of the normal "Timing and periodicity" cycle?
  • Correlation with project and organizational goals: Are analysis results directly traceable or relatable to specific project and Center software measurement goals? Who performs the summaries and synopses of the traceability and relatability? Are periodic reviews scheduled and held to assess the applicability of the analysis results to the software improvement objectives and goals?
  • Interfaces with organizational and Center-level data repositories: Are analysis results provided regularly to organizational and Center-level database systems? Is access open, or by permission (password) only? Is it project specific, or will only normalized data be made available? Where are the interfaces designed, maintained, and controlled?
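
As noted above, one way to capture these choices is a small stakeholder-keyed reporting configuration. The sketch below is purely illustrative; the audiences, metrics, periods, and detail levels shown are assumptions, and a real project would derive them from its documented reporting procedures (see SWE-091 and SWE-093):

# Hypothetical reporting configuration keyed by stakeholder. The audiences,
# metrics, periods, and detail levels are illustrative only; actual content
# comes from the project's documented reporting procedures.
from dataclasses import dataclass

@dataclass
class ReportSpec:
    audience: str    # who receives the report
    metrics: list    # which analysis results are included
    period: str      # reporting frequency
    detail: str      # level of reporting depth

REPORTS = [
    ReportSpec("software developers and testers", ["defect status summary"], "weekly", "summary"),
    ReportSpec("project manager", ["effort vs. plan", "defect trends", "requirements volatility"],
               "monthly", "detailed"),
    ReportSpec("Center measurement program", ["normalized productivity", "normalized defect density"],
               "quarterly", "normalized"),
]

def due_reports(period):
    """Return the report specifications due in the given reporting period."""
    return [spec for spec in REPORTS if spec.period == period]

for spec in due_reports("monthly"):
    print(f"{spec.audience}: {', '.join(spec.metrics)} ({spec.detail})")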

The project reports analysis results periodically according to established collection and storage procedures (see SWE-092) and the reporting procedures developed according to this SWE. These reporting procedures are contained in the Software Development or Management Plan (see SWE-102).

4. Small Projects

Data reporting activities may be restricted to measures that support safety and quality assessments and the overall organization's goals for software process improvement activities. Data reporting timing may be limited to annual or major review cycles. Small projects may consider software development environments or configuration management systems that provide automated collection, tracking, and storage of measurement data. Many projects within NASA have been using the JIRA environment (see section 5.1, Tools) with a variety of plug-ins that help capture measurement data associated with software development.
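
As a minimal sketch of such automated collection, the example below queries a JIRA instance for an open-defect count through JIRA's standard REST search endpoint; the server URL, project key, JQL filter, and credentials are hypothetical placeholders:

# Minimal sketch: pull an open-defect count from a JIRA instance through its
# standard REST search endpoint. The server URL, JQL filter, and credentials
# are hypothetical placeholders.
import requests

JIRA_URL = "https://jira.example.nasa.gov"                    # hypothetical server
JQL = "project = FSW AND issuetype = Bug AND status != Done"  # hypothetical filter

def open_defect_count(session):
    """Return the number of issues matching the JQL filter."""
    response = session.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": JQL, "maxResults": 0},  # maxResults=0: only the total is needed
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["total"]

with requests.Session() as session:
    session.auth = ("username", "api-token")  # placeholder credentials
    print("Open defects:", open_defect_count(session))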

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to reporting of measurement analysis:

  • Selection and use of Software Metrics for Software Development Projects, Lesson No. 3556: "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

    "Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics: 577
    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software."
  • Flight Software Engineering Lessons, Lesson No. 2218: "The engineering of flight software (FSW) for a typical NASA/Caltech Jet Propulsion Laboratory (JPL) spacecraft is a major consideration in establishing the total project cost and schedule because every mission requires a significant amount of new software to implement new spacecraft functionality."

    The lesson learned Recommendation No. 8 provides this step as well as other steps to mitigate the risk from defects in the FSW development process:

    "Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures include metrics such as the percentage of code, requirements, and defined faults tested, and the percentage of tests passed in both simulation and test bed environments. These measures also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and test bed environments, and where they have successfully passed all stress tests." 572

Consider the following lessons learned when capturing software measures in support of Center/organizational needs (a brief illustration of physical versus logical SLOC counting follows this lesson):

  • Know How Your Software Measurement Data Will Be Used, Lesson No. 1772: "Prior to Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the IPAO (Independent Program Assessment Office) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data to their parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation, and provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate."

    The Recommendation states that "Prior to submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's." 567
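
To illustrate why the physical/logical distinction matters, the simplified sketch below counts both for a small Python fragment; the counting rules are illustrative assumptions only, not the rules of any particular parametric cost model:

# Simplified illustration of physical vs. logical SLOC for a small Python
# fragment. The counting rules are illustrative assumptions only; parametric
# cost models define their own, more rigorous counting rules.
import ast

SAMPLE = (
    "# compute the combined total\n"
    "total = (alpha +\n"
    "         beta +\n"
    "         gamma)\n"
    "print(total)\n"
)

def physical_sloc(source):
    """Count non-blank, non-comment physical lines."""
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith("#"))

def logical_sloc(source):
    """Count logical statements by walking the parsed syntax tree."""
    return sum(1 for node in ast.walk(ast.parse(source)) if isinstance(node, ast.stmt))

print("physical SLOC:", physical_sloc(SAMPLE))  # 4: the wrapped assignment counts three times
print("logical SLOC:", logical_sloc(SAMPLE))    # 2: one assignment, one print statement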