SWE-044 - Supplier Metric Data

1. Requirements

2.6.2.2 The project shall require the software supplier(s) to provide software metric data as defined in the project's Software Metrics Report.

1.1 Notes

The requirement for the content of a Software Metrics Report is defined in Chapter 5 [section 5.3.1 of NPR 7150.2, NASA Software Engineering Requirements].

1.2 Applicability Across Classes

Class D Not Safety-Critical and Class G are labeled with "P (Center)." This means that an approved Center-defined process which meets a non-empty subset of the full requirement can be used to achieve this requirement.

| Class | A_SC | A_NSC | B_SC | B_NSC | C_SC | C_NSC | D_SC | D_NSC | E_SC | E_NSC | F | G | H |
| Applicable? | | | | | | | | P(C) | | | | P(C) | |

Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable with details, read above for more | P(C) - P(Center), follow Center requirements or procedures

2. Rationale

The software development team needs to acquire software metrics data in a timely and periodic manner to assure the effectiveness of the insight and oversight activities being performed by NASA. The specification of those metrics, and when they are due, is effectively accomplished by identifying them in the software acquisition contract statement of work (SOW) clauses and instructions (see SWE-048). Effective insight into and oversight of the software development are achieved when the NASA software development team obtains and reviews these metrics and makes decisions based on them.

3. Guidance

This is a key requirement that must be addressed on all NASA software projects. Access to supplier metric data needs to be defined up front in the SOW, task agreement, software plans, or other assignment paperwork. Special care needs to be taken to clearly identify this requirement in the in-house documentation and in the prime contractor and subcontractor requirements. NASA needs direct insight into software metrics on NASA software projects.

Measurement is defined as the process of assigning numerical values to process, product, or project attributes according to defined criteria. 254 The measurement process is based on estimation or direct measurement: estimation activities result in planned or expected measures, while direct measurement activities result in actual measures. A "measure" is the result of counting or otherwise quantifying an attribute of a process, project, or product; examples of measures are size, cost, and defects. A "metric" is a measurement that provides a basis for making a decision or taking action. 254

Measurement is a key process area for successful management and is applied to all engineering disciplines. Measurement helps to define and implement more realistic plans, as well as monitor progress against those plans. Measurement data provides objective information that supports project management decisions.

Software metrics are typically used for estimation (i.e., size, effort, and cost), productivity measurements, reliability measurements, quality measurements, and project and task management. A metric quantifies a characteristic of a process or product and defines what is to be measured. Metrics help in managing and controlling software projects and in learning more about the way the organization operates and performs. They are also a tool that highlights potential problems or deficiencies in the development process or in the products themselves, providing quantitative and qualitative measures that help focus management's attention and resources, if necessary, on the prevention and/or correction of problems.
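To make the distinction between measures and metrics concrete, the short Python sketch below derives a decision-supporting metric (defect density) from raw measures (size, defect count, effort). The attribute names, threshold, and numbers are assumptions made for illustration only; they are not values taken from NPR 7150.2 or this handbook.

```python
from dataclasses import dataclass

@dataclass
class BuildMeasures:
    """Raw measures counted directly from a (hypothetical) software build."""
    source_lines: int    # size measure
    defects_found: int   # defect measure
    labor_hours: float   # effort/cost measure

def defect_density(m: BuildMeasures) -> float:
    """Metric: defects per thousand source lines, a basis for decisions."""
    return m.defects_found / (m.source_lines / 1000.0)

def needs_management_attention(m: BuildMeasures, threshold: float = 5.0) -> bool:
    """Decision rule: act when the metric exceeds a project-chosen threshold."""
    return defect_density(m) > threshold

if __name__ == "__main__":
    build = BuildMeasures(source_lines=42_000, defects_found=180, labor_hours=950.0)
    print(f"Defect density: {defect_density(build):.2f} defects/KSLOC")
    print("Needs management attention:", needs_management_attention(build))
```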
The term "indicator" is used to help an organization define and measure progress toward organizational goals. Once an organization has analyzed its mission, identified all its relevant stakeholders, and defined its objectives, it needs a way to measure progress toward those goals. Indicators are those measurements.

If data is used to make safety decisions (either by a human or the system), then the data is safety-critical, as is all the software that acquires, processes, and transmits the data. Refer to NASA-STD-8719.13, NASA Software Safety Standard 271 for direction on how to handle safety-critical data.

A successful measurement process is characterized by decision making that regularly includes data analysis results based on objective measurement. Successful implementation of a project measurement process requires a defined set of planning and implementation activities. The figure below describes a typical software metric process flow. The measurement process shown is based upon the Software Engineering Institute's Capability Maturity Model Integration (CMMI); the blue highlighting indicates the planning activities needed when new organizational goals, metrics, and/or strategy are indicated. 157

Figure 3.1 Software Metric Process Flow

WHY WE MEASURE

It is often difficult to accurately understand the status of a project or determine how well development processes are working without some measures of current performance and a baseline for comparison. Metrics support better management and control of software projects and establish greater insight into the way the organization is operating. There are four major reasons for measuring software processes, products, and resources: to Characterize, Evaluate, Predict, and Improve. Measurement is an important component of any project and product development effort and is applied to all facets of software development and engineering disciplines. Before a process can be efficiently managed and controlled, it has to be measured.

The content of the Software Metrics Report, section 5.3.1 of NPR 7150.2 (see SWE-117), is set up as a common approach to collecting and reporting software metrics. It requires that metrics information be reported on a CSCI (Computer Software Configuration Item) basis. All NASA software development follows some level of defined software processes, and the reporting processes used in a software development activity can be derived from a set of common processes defined at the Agency, Center, or organizational level. As a minimum, whichever processes are used, the following reporting categories shown in SWE-117 are required for summarizing and organizing the minimum information needed (see the sketch below):
- Software progress tracking.
- Software functionality.
- Software quality.
- Software requirements volatility.
- Software characteristics.

The NASA approach to contractor-developed software work products requires that contractor terms and deliverables be explicitly listed in the contract SOW. To be most effective, this includes the list of software metrics needed to effectively manage the insight and oversight activities. This requirement is levied on the contractor as a provision in the software acquisition agreement or contract SOW.
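The sketch below shows one possible, not mandated, way a project might organize supplier-delivered metric data by CSCI under the five minimum reporting categories listed above; the class, field, CSCI, and metric names are assumptions made for this example.

```python
from dataclasses import dataclass, field
from typing import Dict

# The five minimum reporting categories listed in SWE-117 (names shortened
# to identifier form for this example).
CATEGORIES = (
    "progress_tracking",
    "functionality",
    "quality",
    "requirements_volatility",
    "characteristics",
)

@dataclass
class CsciMetricsReport:
    """Hypothetical container for one CSCI's metric data in a reporting period."""
    csci_name: str
    reporting_period: str  # e.g., "2024-Q2" (assumed convention, not mandated)
    data: Dict[str, Dict[str, float]] = field(
        default_factory=lambda: {c: {} for c in CATEGORIES}
    )

    def record(self, category: str, metric: str, value: float) -> None:
        """File a metric value under one of the minimum reporting categories."""
        if category not in self.data:
            raise ValueError(f"unknown reporting category: {category}")
        self.data[category][metric] = value

# Example usage with invented CSCI and metric names:
report = CsciMetricsReport(csci_name="GNC_FSW", reporting_period="2024-Q2")
report.record("progress_tracking", "units_coded_and_unit_tested_pct", 72.0)
report.record("requirements_volatility", "requirements_added_or_changed", 14)
report.record("quality", "open_severity_1_problem_reports", 2)
```

In practice, the specific metrics, their definitions, and the delivery schedule would be those agreed to in the contract SOW, as discussed above.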
Additional guidance related to software measurement determination, collection, analysis, and reporting may be found in the following related requirements in this handbook:
- Measurement Objectives
- Measurement Selection
- Measurement Collection and Storage
- Analysis of Measurement Data
- Reporting of Measurement Analysis
- Directorate Measurement System
- Directorate Measurement Objectives
- Software Metrics Report

Examples of software metrics are given in Table 3.1, Software Metric Examples.

Table 3.1 Software Metric Examples

4. Small Projects

No additional guidance is available for small projects. The community of practice is encouraged to submit guidance candidates for this paragraph.

5. Resources

5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.

6. Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to software metrics:
"Metrics or measurements should be used to provide visibility into a software project's status during all phases of the software development life cycle in order to facilitate an efficient and successful project." The Recommendation is: "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will used to identify the "health" or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:
- The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing)
- The number of errors found during software verification/validation
- The number of errors found in delivered software (a.k.a., "process escapes")
- Projected versus actual labor hours expended
- Projected versus actual lines of code, and the number of function points in delivered software." 577
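As a purely illustrative companion to the example metrics quoted above, the following sketch computes requirement changes per life-cycle phase and projected-versus-actual labor hours; all phase names and values are invented.

```python
# Requirement changes per life-cycle phase (added + deleted + modified),
# and projected-versus-actual labor hours. All values are invented.
requirement_changes = {
    "design": 12,
    "development": 27,
    "testing": 9,
}
projected_hours = 1800.0
actual_hours = 2050.0

total_changes = sum(requirement_changes.values())
for phase, count in requirement_changes.items():
    share = 100.0 * count / total_changes
    print(f"{phase:<12} {count:3d} requirement changes ({share:.0f}% of total)")

variance_pct = 100.0 * (actual_hours - projected_hours) / projected_hours
print(f"Labor-hour variance versus plan: {variance_pct:+.1f}%")
```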
The lesson learned statement provides this step, as well as other steps, to mitigate the risk from defects in the flight software (FSW) development process:
"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures should include metrics such as the percentage of code, requirements, and defined faults tested, and the percentage of tests passed in both simulation and test bed environments. These measures should also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and test bed environments, and where they have successfully passed all stress tests." 572
The collection of appropriate software metrics through an insight and oversight process enables the software development team to monitor the quality of the contractor's product. 528