SWE-091 - Establish and Maintain Measurement Repository

1. Requirements

2.1.5.7 For Class A, B, and C software projects, the Center Director, or designee, shall establish and maintain a software measurement repository for software project measurements containing at a minimum: 

a. Software development tracking data.
b. Software functionality achieved data.
c. Software quality data.
d. Software development effort and cost data.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-091 - Last used in rev NPR 7150.2D

Rev A

4.4.2 The project shall select and record the selection of specific measures in the following areas:

    1. Software progress tracking.
    2. Software functionality.
    3. Software quality.
    4. Software requirements volatility.
    5. Software characteristics.

Difference between A and B: Made the requirement based on software Class and Safety Criticality; added requirement to establish metrics repository/system from Rev A SWE-095. Removed requirement for requirements volatility and software characteristics; added development effort and cost data to the list of measures.

Rev B

2.1.3.9 For Class A, B, C, and safety-critical software projects, the Center Director shall establish and maintain a software measurement repository for software project measurements containing at a minimum:

    1. Software development tracking data.
    2. Software functionality achieved data.
    3. Software quality data.
    4. Software development effort and cost data.

Difference between B and C: Removed "safety-critical" from the requirement.

Rev C

2.1.5.9 For Class A, B, and C software projects, the Center Director, or designee, shall establish and maintain a software measurement repository for software project measurements containing at a minimum:

    1. Software development tracking data.
    2. Software functionality achieved data.
    3. Software quality data.
    4. Software development effort and cost data.

Difference between C and D: No change.

Rev D

2.1.5.7 For Class A, B, and C software projects, the Center Director, or designee, shall establish and maintain a software measurement repository for software project measurements containing at a minimum: 

a. Software development tracking data.
b. Software functionality achieved data.
c. Software quality data.
d. Software development effort and cost data.





2. Rationale

Software measurement programs are established to meet measurement objectives and goals at multiple levels. The data gained from these measurement programs is used in the management of projects, in assuring safety and quality, and in improving overall software engineering practices. Software measurement repositories help manage this data and provide the insight needed on projects. Measurement repositories should be established for collecting, storing, analyzing, and reporting measurement data based on the requirements of the projects and the Center. The repository enables the Center to assess its current software status and the engineering capabilities of providers for future work. Once these software measurement systems are established, the software measures and analysis results that emanate from them enable Center-wide assessments of workforce abilities and skills and help identify opportunities for improving software development.

3. Guidance

A software organization is expected to collect metrics in accordance with a common procedure to facilitate uniformity in the data collected.  The software organization designates a storage location so that project metric data can be viewed and used by the organization. An effective storage approach allows long-term access to metric information that can be used in trending assessments and analyses. The data gained from these repositories is used in the management of projects, in assuring safety and quality, and in improving overall software engineering practices.  Measurement repositories can be at the Center level or at an organizational level within a Center.  Each Center should decide which approach works best for it.

See also Topic 7.14 - Implementing Measurement Requirements and Analysis for Projects.

See also Topic 5.08 - SDP-SMP - Software Development - Management Plan where measurements are planned. 

3.1 Measurement System Goals

In general, Center-level measurement systems are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.

3.2 Measurement Repository

The software measurement repository stores metrics history to be used to evaluate data on current and/or future projects. The availability of past metrics data can be the primary source of information for calibration, planning estimates, benchmarking, and process improvement activities.
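
The paragraph above notes that historical repository data can feed calibration and planning estimates. As a minimal illustration (the project figures, field names, and the simple SLOC-per-staff-month calculation are assumptions for this sketch, not an Agency-prescribed estimation method), past records could be used to derive a productivity rate and a rough effort estimate for a new project:

# Hypothetical sketch: calibrating a planning estimate from repository history.
# Project figures and field names are illustrative assumptions only.
history = [
    {"project": "P1", "sloc": 40_000, "staff_months": 80},
    {"project": "P2", "sloc": 25_000, "staff_months": 55},
    {"project": "P3", "sloc": 60_000, "staff_months": 130},
]

# Average productivity observed on past projects (SLOC per staff-month).
productivity = sum(p["sloc"] for p in history) / sum(p["staff_months"] for p in history)

# Rough effort estimate for a new project assumed to be about 35,000 SLOC.
estimated_effort = 35_000 / productivity
print(f"{productivity:.0f} SLOC/staff-month, roughly {estimated_effort:.0f} staff-months")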

With the high-level goals in mind, Centers are to establish and maintain a specific measurement repository for their particular programs and projects to enable reporting on the minimum required categories (an illustrative record layout follows the category descriptions below):

  • Software development tracking data:

This data is tracked throughout the life cycle, and includes, but is not limited to, the planned and actual values for software resources, schedule, implementation status, and test status.  This information may be reported in the Software Metrics Report (see 5.05 - Metrics - Software Metrics Report).

  • Software functionality achieved data:

This is data that monitors the functionality of the software and includes, but is not limited to, the planned vs. the actual number of requirements and function points.  It also includes data on the utilization of computer resources. This information may be reported in the Software Metrics Report (see 5.05 - Metrics - Software Metrics Report).

  • Software quality data:

This data is used to determine the quality of the software produced during each phase of the software life cycle.  Software quality data includes figures and measurements regarding software problem reports/change requests, review item discrepancies, peer reviews/software inspections, software audits, software risks, and mitigations.  This information may be reported in the Software Metrics Report (see 5.05 - Metrics - Software Metrics Report).

  • Software development effort and cost:

Effort and cost data are used to monitor the cost and progress of software development.  This type of data can include progress toward accomplishing planned activities, number of schedule delays, resource and tool expenses, costs associated with test and verification facilities, and more. This information is typically captured in some type of status reporting tool. 
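
As a concrete illustration of how the four minimum categories might be organized in a repository record, the sketch below uses Python dataclasses; all class and field names are hypothetical and would in practice be defined by the Center's measurement plan and repository design:

# Hypothetical record layout for the four minimum data categories (SWE-091 a-d).
# Field names are assumptions for illustration, not a NASA-defined schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class TrackingData:                 # a. software development tracking
    as_of: date
    planned_staff_months: float
    actual_staff_months: float
    planned_units_complete: int
    actual_units_complete: int

@dataclass
class FunctionalityData:            # b. software functionality achieved
    planned_requirements: int
    implemented_requirements: int
    cpu_utilization_pct: Optional[float] = None

@dataclass
class QualityData:                  # c. software quality
    open_problem_reports: int
    closed_problem_reports: int
    peer_review_findings: int

@dataclass
class EffortCostData:               # d. development effort and cost
    planned_cost_usd: float
    actual_cost_usd: float
    schedule_delays: int

@dataclass
class MeasurementRecord:            # one reporting period for one project
    project_id: str
    reporting_period: str           # e.g., "2024-Q3"
    tracking: TrackingData
    functionality: FunctionalityData
    quality: QualityData
    effort_cost: EffortCostData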

3.3 Measurement Collection

Since this is a Center repository, measurements are collected across programs and projects.  Proper execution of data collection necessitates an agreed-to plan for the Center’s software measurement repository. This plan is based on the evaluation and interpretation of specified software measures that have been captured, validated, and analyzed according to the approved procedures in the programs/projects’ software measurement plans. The collection plan begins with clear information that includes:

  • A description of all data to be provided.
  • A precise definition of terms.
  • A description of responsibilities for data provision and analyses.
  • A description of the time frames and responsibilities for receiving the data and analyses being provided.

Each Center develops its own measurement program that is tailored to its needs and objectives and is based on an understanding of its unique development environment.

Activities within the data collection procedure include:

  • A clear description of all data to be provided. This includes a description of each item and its format, a description of the physical or electronic form to be used, and the location or address to which the data are to be sent (i.e., the measurement repository).
  • A clear and precise definition of terms. This includes a description of project- or organization-specific criteria and definitions, and a description of how to perform each step in the collection process and how to store data in the measurement repository (see the illustrative sketch after this list).
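
A minimal sketch of how such collection-plan entries might be captured so that submissions can be checked on receipt is shown below; the item names, providers, due dates, and the check function are assumptions for illustration, not an Agency-defined format:

# Hypothetical collection-plan entries and a simple receipt check.
COLLECTION_PLAN = {
    "open_problem_reports": {
        "definition": "Count of unresolved software problem reports at period end",
        "format": int,
        "provider": "Project software lead",
        "analyst": "Center measurement group",
        "due": "5 business days after period end",
    },
    "actual_staff_months": {
        "definition": "Staff-months charged to software tasks this period",
        "format": float,
        "provider": "Project resource analyst",
        "analyst": "Center measurement group",
        "due": "5 business days after period end",
    },
}

def check_submission(submission: dict) -> list[str]:
    """Return issues found when checking a project's data submission against
    the collection plan (missing items or wrong formats)."""
    issues = []
    for name, spec in COLLECTION_PLAN.items():
        if name not in submission:
            issues.append(f"missing item: {name}")
        elif not isinstance(submission[name], spec["format"]):
            issues.append(f"wrong format for {name}: expected {spec['format'].__name__}")
    return issues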

3.4 Data Storage 

Activities within the data storage procedure, which covers placing data in the measurement repository, include the following (SWEREF-329):

  • A description of the checking, validation, and verification (V&V) of the quality of the data sets collected. This includes checks for proper formats, logical consistency, missing entries, repetitive entries, and typical value ranges (expect these to be oriented toward validation at the set level, since individual data checking and validation will already have been performed at the project level). A sketch of such set-level checks follows this list.
  • A description of what data sets or intermediate analyses will be made and kept, or made and discarded (assuming the analyses can be reconstructed if needed). This includes a listing of requested analyses by Center stakeholders, lists of continuing metrics, and descriptions of changes to the analyses to reflect advances in the software development life cycle.
  • The identification of a Center’s measurement repository, site, and management steps to access, use, and control the data and its appropriate database management system (DBMS) in the repository. The use of a DBMS in the repository allows multiple projects and organizations to access the data in a format that supports their specific or organizational objectives.
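
The first bullet above describes set-level checks performed before data are placed in the repository. A minimal sketch of such checks is given below; the field names and typical ranges are illustrative assumptions:

# Hypothetical set-level checks: formats, missing entries, repetitive entries,
# and typical value ranges. Field names and ranges are assumptions.
EXPECTED_RANGES = {
    "actual_staff_months": (0.0, 500.0),
    "open_problem_reports": (0, 10_000),
}

def validate_data_set(rows: list[dict]) -> list[str]:
    """Check a batch of project records before storing them in the repository."""
    findings = []
    seen = set()
    for i, row in enumerate(rows):
        key = (row.get("project_id"), row.get("reporting_period"))
        if None in key:
            findings.append(f"row {i}: missing project_id or reporting_period")
        elif key in seen:
            findings.append(f"row {i}: repetitive entry for {key}")
        else:
            seen.add(key)
        for field_name, (low, high) in EXPECTED_RANGES.items():
            value = row.get(field_name)
            if value is None:
                findings.append(f"row {i}: missing entry {field_name}")
            elif not isinstance(value, (int, float)):
                findings.append(f"row {i}: {field_name} has a non-numeric format")
            elif not (low <= value <= high):
                findings.append(f"row {i}: {field_name}={value} outside typical range")
    return findings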

3.5 Metrics

Metrics (or indicators) are computed from measurements using the Center’s analysis procedures. They are quantifiable indices used to compare software products, processes, or projects or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric. They also show containment or breaches of pre-established limits, such as allowable latent defects.
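
As a small illustration of the trend and limit behavior described above, the sketch below compares a metric only to its previous value and checks containment against a pre-established limit; the metric and the latent-defect limit value are assumptions for this example:

# Hypothetical trend/limit view of a single metric.
LATENT_DEFECT_LIMIT = 25   # assumed pre-established limit for this sketch

def trend(previous: float, current: float) -> str:
    """Direction relative only to the previous value of the same metric."""
    if current > previous:
        return "increasing"
    if current < previous:
        return "decreasing"
    return "flat"

def limit_status(current: float, limit: float = LATENT_DEFECT_LIMIT) -> str:
    """Containment or breach of a pre-established limit."""
    return "contained" if current <= limit else "breached"

# Example: latent defects went from 18 last period to 27 this period.
print(trend(18, 27), limit_status(27))   # -> increasing breached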

Management metrics are measurements that help evaluate how well software development activities are performing across multiple Centers, development organizations, programs, or projects. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in plan adjustments of all candidate approaches that are quantified and analyzed.

See also SWE-090 - Management and Technical Measurements, SWE-092 - Using Measurement Data, SWE-093 - Analysis of Measurement Data, SWE-094 - Reporting of Measurement Analysis.

3.6 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.7 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries (PALs). Consult these PALs for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN (available to NASA users only): https://nen.nasa.gov/web/software/wiki (SWEREF-197).

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

4. Small Projects

No additional guidance is available for small projects. 

5. Resources

5.1 References

  • (SWEREF-089) Project-Type/Goal/Metric Matrix, developed by NASA Software Working Group Metrics Subgroup, 2004. This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook.
  • (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.
  • (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
  • (SWEREF-252) Mills, Everald E. (1988). Carnegie-Mellon University-Software Engineering Institute. Retrieved on December 2017 from http://www.sei.cmu.edu/reports/88cm012.pdf.
  • (SWEREF-316) Software Metrics Selection Presentation: A tutorial prepared for NASA personnel by the NASA Software Working Group and the Fraunhofer Center for Empirical Software Engineering College Park, MD. Seaman, Carolyn (2005) This NASA-specific information and resource is available in Software Processes Across NASA (SPAN), accessible to NASA-users from the SPAN tab in this Handbook. Retrieved on February 27, 2012 from https://nen.nasa.gov/web/software/nasa-software-process-asset-library-pal?.
  • (SWEREF-329) Technical Report - NASA-GB-001-94 - Doc ID: 19980228474 (Acquired Nov 14, 1998), Software Engineering Program.
  • (SWEREF-336) Software Technology Support Center (STSC) (1995), Hill Air Force Base. Accessed 6/25/2019.
  • (SWEREF-355) Westfall, Linda, The Westfall Team (2005), Retrieved November 3, 2014 from http://www.win.tue.nl/~wstomv/edu/2ip30/references/Metrics_in_12_steps_paper.pdf
  • (SWEREF-572) Public Lessons Learned Entry: 2218.
  • (SWEREF-577) Public Lessons Learned Entry: 3556.


5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to Measurement Selection:

  • Selection and use of Software Metrics for Software Development Projects. Lesson Learned Number 3556 (SWEREF-577): "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provides the driving event behind this lesson."

"Metrics or measurements provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation states: "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:

    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software."

  • Flight Software Engineering Lessons. Lesson Learned Number 2218 (SWEREF-572): "The engineering of flight software (FSW) for a typical NASA/Caltech Jet Propulsion Laboratory (JPL) spacecraft is a major consideration in establishing the total project cost and schedule because every mission requires a significant amount of new software to implement new spacecraft functionality."

Recommendation No. 8 of this lesson learned provides this step, as well as other steps, to mitigate the risk from defects in the FSW development process:

"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures include metrics such as the percentage of code, requirements, defined faults tested, and the percentage of tests passed in both simulation and testbed environments. These measures also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and testbed environments, and where they have successfully passed all stress tests."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

