
SWE-092 - Measurement Collection and Storage

1. Requirements

4.4.3 The project shall specify and record data collection and storage procedures for their selected software measures and collect and store measures accordingly.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 Applicability Across Classes

Classes D and E (Safety-Critical) are labeled "X +SO." This means that this requirement applies to the safety-critical aspects of the software.

Class F is labeled "X (not OTS)." This means that this requirement does not apply to off-the-shelf (OTS) software for this class.

Class G is labeled "P (Center)." This means that a Center-defined process that meets at least a subset of this full requirement can be used to meet the intent of this requirement.

Class:       A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC |  F  |  G   |  H
Applicable?:      |       |      |       |      |       |  X   |       |  X   |       |  X  | P(C) |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... |
X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

The collection and storage procedures called for in SWE-092 are selected and recorded to ensure they are disseminated and used effectively, and to provide a uniform methodology across the software development project and, if required, across multiple projects. Collecting metrics per a common procedure gives uniformity to the data collected, and designating a storage location lets the organization view and use the metric data. An effective storage approach also allows long-term access to the metric information for use in trending assessments and analyses.

Software measurement programs are established to meet objectives at multiple levels, and structured to satisfy particular organization, project, program and Mission Directorate needs. The data gained from these measurement programs assist in managing projects, assuring quality, and improving overall software engineering practices. In general, the organizational level, project level and Mission Directorate/Mission Support Office level measurement programs are designed to meet the following high-level goals:

  • To improve future planning and cost estimation.
  • To provide realistic data for progress tracking.
  • To provide indicators of software quality.
  • To provide baseline information for future process improvement activities.

At the organizational level, "We typically examine high-level strategic goals like being the low cost provider, maintaining a high level of customer satisfaction, or meeting projected resource allocations. At the project level, we typically look at goals that emphasize project management and control issues or project level requirements and objectives. These goals typically reflect the project success factors like on time delivery, finishing the project within budget or delivering software with the required level of quality or performance. At the specific task level, we consider goals that emphasize task success factors. Many times these are expressed in terms of the entry and exit criteria for the task." 355

With these high-level goals in mind, projects are to establish specific measurement objectives for their particular activities (see SWE-090). They are to select and document specific measures (see SWE-091).

3. Guidance

Fortunately, there are many resources to help a Center develop data collection and storage procedures for its software measurement program. The NASA "Software Measurement Guidebook" 329 is aimed at helping organizations begin or improve a measurement program. The Software Engineering Institute at Carnegie Mellon University has detailed specific practices within its CMMI-DEV, Version 1.3 157 model for collecting, interpreting, and storing data. The Software Technology Support Center (STSC), at Hill Air Force Base, has its "Software Metrics Capability Evaluation Guide." 336 Westfall's "12 Steps to Useful Software Metrics" 355 is an excellent primer for the development of useful software metrics. Other resources are suggested in section 5 [Resources].

The guidance in SWE-091 discusses recognized types of measures and provides guidance for selecting measures and recording the selection decisions. SWE-092 calls for the specification and recording of the data collection and storage procedures for those measures, and for the actual collection and storage of the measurements themselves. If a metric does not have a customer, there is no reason to produce it: software measures are expensive to collect, report, and analyze, so if no one is using a metric, producing it is a waste of time and money.

Each organization or Mission Directorate develops its own measurement program that is tailored to its needs and objectives, and is based on an understanding of its unique development environment. Typical software measurement programs have three components: Data collection, technical support, and analysis and packaging. To properly execute the collection of the data, an agreed-to procedure for this is needed within the Software Development Plan (see SWE-102).

Activities within the data collection procedure include the following (a minimal machine-readable sketch of such a procedure entry appears after this list):

  • A clear description of all data to be provided. This includes a description of each item and its format, a description of the physical or electronic form to be used, and the location or address to which the data are to be sent.
  • A clear and precise definition of terms. This includes a description of the project- or organization-specific criteria and definitions, and a description of how to perform each step in the collection process. "When we use terms like defect, problem report, size, and even project, other people will interpret these words in their own context with meanings that may differ from our intended definition. These interpretation differences increase when more ambiguous terms like quality, maintainability, and user-friendliness are used." 355
  • Who is responsible for providing which data. This may be easily expressed in matrix form, with clarifying notes appended to any particular cell of the matrix.
  • When and to whom the data are to be provided. This describes the recipient(s) and management chain for the submission of the data; it also specifies the submission dates, periodic intervals, or special events for the collection and submission of the data.
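
Taken together, these items can be captured as a simple structured record. The sketch below is illustrative only, assuming a project that keeps machine-readable procedure entries alongside its Software Development Plan; the class name, field names, and example values are hypothetical, not prescribed by SWE-092.

```python
# Hypothetical record of one data collection procedure entry; field names
# are illustrative assumptions, not SWE-092 terminology.
from dataclasses import dataclass

@dataclass
class CollectionProcedure:
    item: str          # clear description of the data item to be provided
    definitions: str   # precise definitions of project-specific terms
    form: str          # physical or electronic form/format to be used
    destination: str   # location or address the data are to be sent to
    responsible: str   # who is responsible for providing the data
    schedule: str      # submission dates, intervals, or triggering events

# Example entry for a defect-count measure (values are illustrative):
DEFECT_COUNT = CollectionProcedure(
    item="Open defect count",
    definitions="'Defect' = any confirmed nonconformance in the problem-report system",
    form="CSV export, one row per open problem report",
    destination="project measurement repository",
    responsible="software test lead",
    schedule="weekly, and at every milestone review",
)
```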

Data collection by the software development team works better if the team's time to collect the data is minimized. Software developers and software testers are the two groups responsible for collecting and submitting significant amounts of the required data. If the developers or testers see this as a non-value-added task, data collection will become sporadic and data quality will suffer, increasing the effort of the technical support staff that has to validate the collected data. Any step that automates some or all of the data collection will contribute to the efficiency and quality of the data collected (see the sketch below).
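
As one illustration of such automation, the sketch below collects a single source-size measure and appends it to a flat-file store. The directory layout, file names, and the choice of a CSV store are assumptions made for the example, not a prescribed approach.

```python
# Minimal sketch of automated measure collection and storage, assuming
# Python sources under src/ and a CSV measurement store; all paths and
# the measure definition are illustrative.
import csv
import datetime
import pathlib

SRC_ROOT = pathlib.Path("src")        # assumed source tree
STORE = pathlib.Path("measures.csv")  # assumed measurement store

def count_sloc(path: pathlib.Path) -> int:
    """Count non-blank, non-comment lines in one source file."""
    lines = path.read_text(encoding="utf-8", errors="replace").splitlines()
    return sum(1 for ln in lines if ln.strip() and not ln.lstrip().startswith("#"))

def collect() -> dict:
    """Gather one snapshot of the source-size measure."""
    files = list(SRC_ROOT.rglob("*.py"))
    return {
        "date": datetime.date.today().isoformat(),
        "measure": "source_sloc",
        "value": sum(count_sloc(f) for f in files),
        "unit": "SLOC",
    }

def store(record: dict) -> None:
    """Append the snapshot to the CSV store, writing a header on first use."""
    write_header = not STORE.exists()
    with STORE.open("a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(record))
        if write_header:
            writer.writeheader()
        writer.writerow(record)

if __name__ == "__main__":
    store(collect())
```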

Activities within the data storage procedure include:

  • A description of the checking, validation, and verification of the quality of the data collected. This includes automated analysis of the data: logic checks, missing entries, repetitive entries, and typical value ranges (a minimal validation sketch appears after this list).
  • A description of what data or intermediate analyses will be made and kept, or made and discarded (assuming the analyses can be reconstructed if needed). This includes a listing of requested analyses by stakeholder organizations, lists of continuing metrics, and changes to the analyses to reflect advances in the software development life cycle.
  • The identification of a proper storage system or site and the management steps to access, use, and control the appropriate database management system (DBMS). The use of a DBMS allows multiple projects and organizations to access the data in a format that supports their specific or organizational objectives.
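
The first item in the list above lends itself to automation. The sketch below is a minimal example, assuming records shaped like the CSV rows in the earlier collection sketch, that applies the listed checks: missing entries, repeated entries, and typical value ranges; the range bound is an assumed placeholder.

```python
# Hedged validation sketch; field names and the plausible-range bound
# are assumptions for illustration.
def validate(records: list[dict]) -> list[str]:
    """Return a list of human-readable data-quality issues."""
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        # Missing entries
        for field in ("date", "measure", "value"):
            if not rec.get(field):
                issues.append(f"record {i}: missing {field}")
        # Repetitive entries (exact duplicates of an earlier record)
        key = (rec.get("date"), rec.get("measure"), rec.get("value"))
        if key in seen:
            issues.append(f"record {i}: duplicate entry")
        seen.add(key)
        # Logic check: typical value range (assumed bound)
        try:
            value = float(rec.get("value"))
        except (TypeError, ValueError):
            issues.append(f"record {i}: non-numeric value {rec.get('value')!r}")
        else:
            if not 0 <= value <= 1_000_000:
                issues.append(f"record {i}: value {value} outside typical range")
    return issues
```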

Metrics (or indicators) are computed from measures using the approved analysis procedures (see SWE-093). They are quantifiable indices used to compare software products, processes, or projects, or to predict their outcomes. They show trends of increasing or decreasing values, relative only to the previous value of the same metric, and they show containment or breaches of pre-established limits, such as allowable latent defects. Management metrics are measurements that help evaluate how well software development activities are performing across multiple development organizations or projects. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in plan adjustments for all candidate approaches that are quantified and analyzed. The method for developing and recording these metrics is also written into the Software Development Plan (see SWE-102).
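
To make the trend and limit behavior concrete, the short sketch below computes one common management metric (defect density) from stored measures, reports the direction relative to the previous value, and checks a pre-established limit. The limit value and the sample measures are illustrative assumptions only.

```python
# Illustrative metric computation; the limit and sample measures are
# assumed values, not SWE-092 requirements.
def defect_density(defects: int, ksloc: float) -> float:
    """Defects per thousand source lines of code (KSLOC)."""
    return defects / ksloc if ksloc > 0 else float("inf")

def trend(values: list[float]) -> str:
    """Direction relative only to the previous value of the same metric."""
    if len(values) < 2:
        return "no trend yet"
    return "increasing" if values[-1] > values[-2] else "decreasing or flat"

LATENT_DEFECT_LIMIT = 0.5  # assumed allowable defects/KSLOC

# Sample (defects, KSLOC) measures for three reporting periods:
history = [defect_density(d, k) for d, k in [(12, 40.0), (9, 42.5), (25, 43.1)]]
print(trend(history))                      # "increasing" for this sample
if history[-1] > LATENT_DEFECT_LIMIT:
    print("pre-established limit breached; flag for management review")
```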

As the project objectives evolve and mature, and as requirements understanding improves and solidifies, software measures can be evaluated and updated if necessary or if beneficial. Updated measurement selections, data to be recorded, and recording procedures are documented during update and maintenance of the Software Development Plan.

4. Small Projects

While a small project (see SWE-091) may propose limited sets and relaxed time intervals for the measures to be collected and recorded, the project still needs to select and record in the Software Development Plan the procedures it will use to collect and store software measurement data. Small projects may consider software development environments or configuration management systems that provide automated collection, tracking, and storage of measurement data. Many projects within NASA have been using the JIRA environment (see section 5.1, Tools) with a variety of plug-ins that help capture measurement data associated with software development.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN).

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

Much of the NASA software development experience gained in the NASA/GSFC Software Engineering Laboratory (SEL) is captured in the reference cited below [Lessons Learned From 25 Years of Process Improvement: The Rise and Fall of the NASA Software Engineering Laboratory]. The document describes numerous lessons learned that are applicable to the Agency's software development activities. From their early studies, the SEL was able to build models of the environment and develop profiles for their organization. One of the key lessons in the document is "Lesson 6: The accuracy of the measurement data will always be suspect, but you have to learn to live with it and understand its limitations." 430