SWE-117 - Software Metrics Report
1. Requirements

5.3.1 The Software Metrics Report shall contain as a minimum the following information tracked on a CSCI (Computer Software Configuration Item) basis: [SWE-117]

a. Software progress tracking measures.
b. Software functionality measures.
c. Software quality measures.
d. Software requirement volatility.
e. Software characteristics.

1.1 Notes

An example set of software progress tracking measures that meet 5.3.1.a include, but are not limited to:

a. Software resources, such as budget and effort (planned vs. actual).
b. Software development schedule tasks (e.g., milestones) (planned vs. actual).
c. Implementation status information (e.g., number of computer software units in design phase, coded, unit tested, and integrated into the computer software configuration item vs. planned).
d. Test status information (e.g., number of tests developed, executed, passed).
e. Number of replans/baselines performed.

An example set of software functionality measures that meet 5.3.1.b include, but are not limited to:

a. Number of requirements included in a completed build/release (planned vs. actual).
b. Function points (planned vs. actual).

An example set of software quality measures that meet 5.3.1.c include, but are not limited to:

a. Number of software Problem Reports/Change Requests (new, open, closed, severity).
b. Review of item discrepancies (open, closed, and withdrawn).
c. Number of software peer reviews/inspections (planned vs. actual).
d. Software peer review/inspection information (e.g., effort, review rate, defect data).
e. Number of software audits (planned vs. actual).
f. Software audit findings information (e.g., number and classification of findings).
g. Software risks and mitigations.
h. Number of requirements verified or status of requirements validation.
i. Results from static code analysis tools.

An example set of software requirement volatility measures that meet 5.3.1.d include, but are not limited to:

a. Number of software requirements.
b. Number of software requirements changes (additions, modifications, deletions) per month.
c. Number of "to be determined" items.
b. Language.
c. Software domain (flight software, ground software, Web application).
d. Number of source lines of code by categories (e.g., new, modified, reuse) (planned vs. actual).
e. Computer resource utilization in percentage of capacity.

Other information may be provided at the supplier's discretion to assist in evaluating the cost, technical, and schedule performance; e.g., innovative processes and cost reduction initiatives.

1.2 Applicability Across Classes

Classes C-E and Safety Critical are labeled "P (Center) + SO." This means that this requirement applies to the safety-critical aspects of the software and that an approved Center-defined process that meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Class C and Not Safety Critical is labeled "P (Center)." This means that a Center-defined process that meets a non-empty subset of this full requirement can be used to meet the intent of this requirement.

Classes F and G are labeled "X (not OTS)." This means that this requirement does not apply to off-the-shelf software for these classes.

Class: A_SC  A_NSC  B_SC  B_NSC  C_SC  C_NSC  D_SC  D_NSC  E_SC  E_NSC  F  G  H
Applicable? X  P(C)  X  X  X  X
Key: A_SC = Class A Software, Safety-Critical | A_NSC = Class A Software, Not Safety-Critical | ... | X - Applicable with details, read above for more | P(C) - P(Center), follow center requirements or procedures

2. Rationale

The Software Metrics Report (SMR) provides data to the project for the assessment of software cost, technical, and schedule progress, and gives a project or software lead objective visibility into that progress. Measurement is a key process area for successful management and is applied to all engineering disciplines. Measurement helps to define and implement more realistic plans, as well as to monitor progress against those plans. Measurement data provides objective information that helps project management manage, assess, correct, report, and complete the software development activities; these metrics serve as the major foundation for those efforts. Since a computer software configuration item (CSCI) is a group of software that is treated as a single entity by a configuration management system, it is the lowest level of a product that can be effectively tracked. The SMR typically uses this CSCI information to aggregate management metrics for current project statusing and future project planning. The SMR captures all the information that results from exercising and completing the requirements for SWE-090, SWE-091, SWE-092, SWE-093, and SWE-094. The SMR serves as a single repository for collecting the information developed from these activities and for saving and presenting it to the appropriate stakeholders and project personnel.
3. Guidance

The measures collected for the SMR result from the chosen measurement objectives (see SWE-090); they provide a view into the types of actions or decisions that may be made based on the results of the analysis and help prioritize the areas where measures need to be collected. SWE-091 calls for the project to develop and record measures in software progress tracking, software functionality, software quality, software requirements volatility, and software characteristics. SWE-092 indicates that data collection and storage procedures are specified and that the data itself then needs to be collected and stored. According to SWE-093, the collected software measures are to be analyzed with project- and Center-approved methods. Finally, SWE-094 calls for the results to be reported periodically and for access to the measurement information to be made available. All of this information may feed into Mission Directorate measurement and metrics programs (see SWE-095 and SWE-096).
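To make that collect-store-analyze-report flow concrete, the following minimal Python sketch shows one possible shape for a per-CSCI measurement record covering the five SWE-091 measurement areas. The field names and layout are illustrative assumptions, not a NASA-defined schema.

# Illustrative only: one possible shape for a per-CSCI measurement record
# covering the five SWE-091 measurement areas.
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class CsciMeasurements:
    csci: str
    reporting_period: str                                           # e.g., "2024-03"
    progress: Dict[str, Any] = field(default_factory=dict)          # budget/effort, milestones, tests
    functionality: Dict[str, Any] = field(default_factory=dict)     # requirements per build, function points
    quality: Dict[str, Any] = field(default_factory=dict)           # PR/CR counts, peer reviews, audits
    volatility: Dict[str, Any] = field(default_factory=dict)        # requirement change counts, TBD items
    characteristics: Dict[str, Any] = field(default_factory=dict)   # language, domain, SLOC

# Hypothetical record for one reporting period.
record = CsciMeasurements(
    csci="Example FSW CSCI",
    reporting_period="2024-03",
    progress={"milestones_planned": 5, "milestones_actual": 4},
    quality={"problem_reports_open": 7, "problem_reports_closed": 12},
)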
Software Tracking Measures

Typically, the most common reason for implementing a measurement program is to track progress, which is also one of the hardest things to do effectively, so the attributes used to select tracking measures deserve careful consideration. The requirement gives an example set of measures for tracking the project:

a. Software resources, such as budget and effort (planned vs. actual): Manpower and dollar planning levels are typically found in the project plan or the Software Development or Management Plan (see SWE-102). Actuals are available from the task reports and time card charges. Tools such as Earned Value Management are useful for interpreting these measures to determine project impacts and accomplishments (a minimal sketch follows this list).

b. Software development schedule tasks (e.g., milestones) (planned vs. actual): Milestones that describe life cycle phase completions, major work product events, and code deliveries are managed by a comparison of planned vs. actual dates. Other key milestones may also be selected for tracking to manage risk and safety activities (see SWE-016).

c. Implementation status information: Quantitative measures that describe progress can be easily counted and reported. Variations of actual results from planned results indicate problem and risk areas.

d. Test status information (e.g., number of tests developed, executed, passed): Software unit testing and system testing are a strong indicator that design and coding activities are being completed. The execution and passing of tests indicate the quality of the resulting work products.

e. Number of replans/baselines performed: Increased numbers of replans or rebaselines indicate a project in flux. Lengthening times between replans and rebaselines indicate a project that has a settled set of requirements and is closer to completion.
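As a simple illustration of reducing planned-vs.-actual tracking data to progress indicators, the following Python sketch computes basic earned-value figures (planned value, earned value, schedule variance, and schedule performance index). The task names and numbers are hypothetical, and this is only one of many ways such measures can be derived.

# Illustrative only: reduce planned-vs.-actual task data to basic
# earned-value indicators for a Software Metrics Report.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    planned_value: float   # budgeted cost (or effort) for the task
    planned_done: bool     # scheduled to be complete as of the report date
    actual_done: bool      # actually complete as of the report date

def earned_value_summary(tasks):
    pv = sum(t.planned_value for t in tasks if t.planned_done)   # planned value
    ev = sum(t.planned_value for t in tasks if t.actual_done)    # earned value
    sv = ev - pv                                                  # schedule variance
    spi = ev / pv if pv else None                                 # schedule performance index
    return {"PV": pv, "EV": ev, "SV": sv, "SPI": spi}

# Hypothetical CSCI build data (planned vs. actual completion).
tasks = [
    Task("Design complete", 40.0, True, True),
    Task("Code complete", 60.0, True, False),
    Task("Unit tests passed", 30.0, False, False),
]
print(earned_value_summary(tasks))   # e.g., SPI < 1.0 flags a schedule slip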
Software Functionality Measures

"Function measurement methods rely on some definition of what constitutes software functionality." 365 "Function Point Analysis has been proven as a reliable method for measuring the size of computer software. In addition to measuring output, Function Point Analysis is extremely useful in estimating projects, managing change of scope, measuring productivity, and communicating functional requirements." 203

a. Number of requirements included in a completed build/release (planned vs. actual): The tracking of completed requirements assists in the verification and validation activities. Assigning weighted values to each requirement may provide better insight into the true level of completion of the software development activities.

b. Function points (planned vs. actual): The identification of and planned completion rates or dates for function points vs. actual results indicate the level of control being maintained over the complexity and the productivity of the software. (With function points, the functional user requirements of the software are identified and each one is categorized into one of five types: outputs, inquiries, inputs, internal files, and external interfaces.) Care must be taken to define and count function points in a common manner across all the organizations developing software. In addition, the reporting and recording need to include sufficient information for the user or reader of the SMR to understand the function point reporting methodology; a minimal counting sketch follows.
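For illustration only, the sketch below tallies an unadjusted function point count from functional requirements categorized into the five types listed above. The weights used are the commonly cited average-complexity values; projects should substitute whatever counting convention they have agreed to use.

# Illustrative only: unadjusted function point tally from categorized
# functional user requirements (five types).
AVERAGE_WEIGHTS = {          # commonly cited "average complexity" weights
    "input": 4,
    "output": 5,
    "inquiry": 4,
    "internal_file": 10,
    "external_interface": 7,
}

def unadjusted_function_points(counts):
    """counts maps each of the five types to how many were identified."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical counts for one CSCI build (planned vs. actual).
planned = {"input": 12, "output": 8, "inquiry": 5, "internal_file": 3, "external_interface": 2}
actual  = {"input": 10, "output": 8, "inquiry": 4, "internal_file": 3, "external_interface": 2}
print(unadjusted_function_points(planned), unadjusted_function_points(actual))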
Software Quality Measures

"Historically, software quality metrics have been the measurement of exactly their opposite - that is, the frequency of bugs and defects." 232 Crosby has defined software quality as conformance to a specification. 169

a. Number of software Problem Reports/Change Requests (new, open, closed, severity): The Center or project configuration management system, the change management process, and the problem reporting and corrective action systems need to be described in consistent terms. If other or additional nomenclature or measures (new, open, closed, severity) are used, explain them in the SMR. A minimal aggregation sketch follows this list.

b. Review of item discrepancies (open, closed, and withdrawn): Actual discrepancy reports and their storage procedures are the source for this metric input. Sufficient references and/or citations are needed to enable readers or users of the SMR to find individual reports.

c. Number of software peer reviews/inspections (planned vs. actual): This information comes from the Software Development or Management Plan and from periodic reports and status meetings on peer reviews (see SWE-119).

d. Software peer review/inspection information (e.g., effort, review rate, defect data): This information is defined in the Software Development or Management Plan (see SWE-102) and comes from periodic reports and status meetings (see SWE-119).

e. Number of software audits (planned vs. actual): This information is defined in the Quality Assurance Plan and comes from periodic reports and status meetings (see SWE-022).

f. Software audit findings information (e.g., number and classification of findings): This information is defined in the Quality Assurance Plan and comes from periodic reports and status meetings.

g. Software risks and mitigations: This information is typically a summary of the risk management system and related tracking reports for risk mitigation (see SWE-086).

h. Number of requirements verified or status of requirements validation: This information is typically a summary of verification and validation (V&V) activities, testing reports, and completed design reviews.

i. Results from static code analysis tools: The importance given to the analysis results from static analyzers indicates the confidence placed in this type of quality assurance activity. Methods for reporting these results are based on the tool outputs and the project's needs. Again, sufficient descriptive information needs to be included to make these metrics understandable to the user.
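The following Python sketch, using hypothetical data, shows the kind of aggregation that typically sits behind measures a and b above: counting Problem Reports/Change Requests by status and severity for a reporting period. The field names and severity scale are assumptions, not a mandated schema.

# Illustrative only: summarize Problem Reports / Change Requests
# (new, open, closed, severity) for a metrics reporting period.
from collections import Counter

# Hypothetical export from a problem reporting / change request system.
reports = [
    {"id": "PR-101", "status": "open",   "severity": 2},
    {"id": "PR-102", "status": "closed", "severity": 4},
    {"id": "PR-103", "status": "new",    "severity": 1},
    {"id": "PR-104", "status": "open",   "severity": 1},
]

by_status = Counter(r["status"] for r in reports)
by_severity = Counter(r["severity"] for r in reports)
open_high_severity = [r["id"] for r in reports
                      if r["status"] != "closed" and r["severity"] <= 2]

print(dict(by_status))        # e.g., {'open': 2, 'closed': 1, 'new': 1}
print(dict(by_severity))
print(open_high_severity)     # candidates for management attention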
Software Requirement Volatility

"The identified causes of requirements volatility include presence of inconsistencies or conflicts among requirements; evolving user/customer knowledge and priorities; project concurrent activities like defect fixing, functionality correction; technical, schedule or cost related problems; change in work environment; and process model selection decisions." 347

a. Number of software requirements: Indicate the source of the identified requirements, e.g., the Software Requirements Specification or lower level specifications. Describe whether the count is by unique paragraph (line) number or by the number of "shall" statements. (Unfortunately, not all requirements specifications practice good requirement writing techniques by having only one "shall" per paragraph or line.)

b. Number of software requirements changes (additions, modifications, deletions) per month: The SMR may include trend lines for running averages, life cycle phases, or annual totals. Describe the metric, especially if time periods other than or in addition to the requested per-month period are used. A minimal counting sketch follows this list.

c. Number of "to be determined" items: Be careful to differentiate between To Be Revised values, where an approximate or initial value exists, and To Be Determined values, where the entry is blank.
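As a minimal illustration of the per-month counting and running-average trend mentioned in item b, the following Python sketch tallies requirements changes by month from a hypothetical change log.

# Illustrative only: requirements volatility per month with a simple
# running average for trend reporting.
from collections import Counter

# Hypothetical change log: (month, change type) for one CSCI.
changes = [
    ("2024-01", "addition"), ("2024-01", "modification"),
    ("2024-02", "deletion"), ("2024-02", "modification"), ("2024-02", "addition"),
    ("2024-03", "modification"),
]

per_month = Counter(month for month, _ in changes)
months = sorted(per_month)

def running_average(values):
    out, total = [], 0
    for i, v in enumerate(values, start=1):
        total += v
        out.append(total / i)
    return out

counts = [per_month[m] for m in months]
print(list(zip(months, counts, running_average(counts))))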
Software Characteristics

Software characteristics are sometimes called software attributes. These basic characteristics are necessary for developing cost models, planning aids, and general management principles. 329

a. Project name: Consider whether there are naming conventions or whether additional informative material is needed to differentiate between similarly titled projects.

b. Language: Larger projects may use multiple languages. Identify versions and service pack information, if appropriate.

c. Software domain (flight software, ground software, Web application): Identify plans to use software in multiple domains.

d. Number of source lines of code by categories (e.g., new, modified, reuse) (planned vs. actual): Include counting methodologies and conventions to assure numbers are provided on a common basis.

e. Computer resource utilization in percentage of capacity: Allot enough space in the SMR if this utilization varies by software domain, life cycle phase, or any other discriminator.

A simple X-row by Y-column table can be used to capture and effectively display this information, as in the sketch below.
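As one hypothetical way to realize the simple row-by-column layout mentioned above, the Python sketch below writes software characteristics to a CSV file, one row per CSCI. The column names are illustrative assumptions, not a required format.

# Illustrative only: capture software characteristics per CSCI as a
# simple rows-by-columns table (CSV), one row per CSCI.
import csv

characteristics = [
    {"project": "Example Project", "csci": "GNC FSW", "language": "C++",
     "domain": "flight software", "sloc_new": 12000, "sloc_reuse": 3000,
     "cpu_utilization_pct": 62},
    {"project": "Example Project", "csci": "Ground Tools", "language": "Python",
     "domain": "ground software", "sloc_new": 8000, "sloc_reuse": 1500,
     "cpu_utilization_pct": 20},
]

with open("software_characteristics.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=characteristics[0].keys())
    writer.writeheader()
    writer.writerows(characteristics)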
Formats

The SMR can be provided in an electronic format or via access to the software metric data repository; an electronic format or method is preferred. The report does not have to be a formal hard copy Data Requirement Document if electronic access to the information is provided. It is also acceptable for an organization to provide this information as part of a monthly software review process.

The final set of metrics used by a project needs to be determined by the project and organizational needs and the software development life cycle phase. Which software metrics are reported at what point in the development or maintenance life cycle needs to be addressed in the software planning documents or organizational planning documents.

4. Small Projects

This requirement lists a minimum set of five areas of information to be provided on a CSCI (Computer Software Configuration Item) basis. Smaller projects may consider providing less information than listed in the example sets for each information item. The information included in the SMR needs to be sufficient to manage the project, manage risk, and maintain safety throughout the project's life cycle.

5. Resources
5.1 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned

No lessons learned have currently been identified for this requirement.