An example set of software progress tracking measures includes, but is not limited to:
Software resources, such as budget and effort (planned vs. actual).
Software development schedule tasks (e.g., milestones) (planned vs. actual).
Implementation status information (e.g., number of computer software units in design phase, coded, unit tested, and integrated into computer software configuration item vs. planned).
Test status information (e.g., number of tests developed, executed, passed).
Number of replans/baselines performed.
An example set of software functionality measures includes, but is not limited to:
Number of requirements included in a completed build/release (planned vs. actual).
Function points (planned vs. actual).
An example set of software quality measures includes, but is not limited to:
Number of software Problem Reports/Change Requests (new, open, closed, severity).
Review of item discrepancies (open, closed, and withdrawn).
Number of software peer reviews/inspections (planned vs. actual).
Software peer review/inspection information (e.g., effort, review rate, defect data).
Number of software audits (planned vs. actual).
Software audit findings information (e.g., number and classification of findings).
Software risks and mitigations.
Number of requirements verified or status of requirements validation.
Results from static code analysis tools.
An example set of software requirements volatility measures includes, but is not limited to:
Number of software requirements.
Number of software requirements changes (additions, modifications, deletions) per month.
Number of "to be determined" items.
An example set of software characteristics includes, but is not limited to:
Project name.
Language.
Software domain (flight software, ground software, Web application).
Number of source lines of code by categories (e.g., new, modified, reuse) (planned vs. actual).
Computer resource utilization in percentage of capacity.
Other information may be provided at the supplier's discretion to assist in evaluating the cost, technical, and schedule performance; e.g., innovative processes and cost reduction initiatives.
2. Rationale
The Software Metrics Report (SMR) provides data to the project for the assessment of software cost, technical, and schedule progress. The reports provide a project or software lead with:
Tracking measures to indicate progress achieved to date and to relate it to cost.
Functionality measures to indicate the capabilities achieved to date.
Quality measures to indicate the degree to which checks and inspections have found and removed problems and defects in the software.
Requirements volatility measures to indicate current project/product stability and the potential for future iterations and changes in the product.
Software characteristics to help uniquely identify the project and its work products, along with its major features.
Measurement is a key process area for successful management and is applied to all engineering disciplines. Measurement helps to define and implement more realistic plans, as well as to monitor progress against those plans. Measurement data provides objective information that helps project management to perform the following:
More accurately plan a project or program that is similar to one that has been completed.
Identify and correct problems early in the life cycle (more proactive than reactive).
Assess impact of problems that relate to project or program objectives.
Make proper decisions that best meet program objectives.
Defend and justify decisions.
These metrics serve as the major foundation for efforts to manage, assess, correct, report, and complete the software development activities. Since a computer software configuration item (CSCI) is a group of software that is treated as a single entity by a configuration management system, it is the lowest level of a product that can be effectively tracked. The SMR typically uses this CSCI information to aggregate management metrics for current project statusing and future project planning.
3. Guidance
The SMR captures all the information that results from exercising and completing the requirements for SWE-090, SWE-091, SWE-092, SWE-093, and SWE-094. The SMR serves as a single repository for collecting the information developed from these activities and for saving and presenting it to the appropriate stakeholders and project personnel. The measures result from the chosen measurement objectives (see SWE-090); they provide a view into the types of actions or decisions that may be made based on the results of the analysis and help prioritize the areas where measures need to be collected. SWE-091 calls for the project to develop and record measures in software progress tracking, software functionality, software quality, software requirements volatility, and software characteristics. SWE-092 indicates that data collection and storage procedures are specified and that the data itself needs to be collected and stored. According to SWE-093, the collected software measures are to be analyzed with project- and Center-approved methods. Finally, SWE-094 calls for the results to be periodically reported and for access to the measurement information to be made available. All of this information may feed into Mission Directorate measurement and metrics programs (see SWE-095).
Software Tracking Measures
Typically, the most common reason for implementing a measurement program is to track progress, one of the hardest things to do effectively. Consider the following four attributes for selecting effective tracking measures:
Objectivity: The measurement needs to be based on criteria that are observable and verifiable.
Near Real Time: The measurement reflects what is happening now in the project.
Multiple Levels: The measure needs to have both drill-down capability and be able to be rolled up to summary levels.
Prediction: The measure must support projections about future progress.
An example set of measures for tracking the project is shown in the recommended content:
Software resources, such as budget and effort (planned vs. actual). Manpower and dollar planning levels are typically found in the project plan or the Software Development or Management Plan. Actuals are available from the task reports and time card charges. Tools such as Earned Value Management are useful for interpreting these measures to determine project impacts and accomplishments (a simple earned value computation sketch follows this list).
Software development schedule tasks and milestones (planned vs. actual). Milestones that describe life-cycle phase completions, major work product events, and code deliveries are managed by a comparison of planned vs. actual dates. Other key milestones may also be selected for tracking to manage risk and safety activities (see SWE-016).
Implementation status information, e.g., number of computer software units in design phase, coded, unit tested, and integrated into CSCI vs. planned. Quantitative measures that describe progress can be easily counted and reported. Variations of actual results from planned results indicate problem and risk areas.
Test status information, e.g., number of tests developed, executed, passed. Software unit testing and systems testing are a strong indicator that design and coding activities are being completed. The execution and passing of tests indicate the quality of the resulting work products.
Number of replans/baselines performed. Increased numbers of replans or rebaselines indicate a project in flux. Lengthening intervals between replans and rebaselines indicate a project that has a settled set of requirements and one that is closer to completion.
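To illustrate how planned vs. actual budget and effort figures can be reduced to earned value indicators, the following Python sketch computes cost and schedule variances and the cost/schedule performance indices. It is a minimal illustration only; the function name and the sample figures are hypothetical, and a project would use its approved earned value tooling rather than this snippet.

    # Minimal sketch (hypothetical figures): reducing planned vs. actual
    # cost/effort data to basic earned value indicators.
    def earned_value_summary(bcws, bcwp, acwp):
        """Compute basic earned value indicators.

        bcws -- budgeted cost of work scheduled (planned value)
        bcwp -- budgeted cost of work performed (earned value)
        acwp -- actual cost of work performed (actual cost)
        """
        return {
            "cost_variance": bcwp - acwp,          # CV > 0 means under cost
            "schedule_variance": bcwp - bcws,      # SV > 0 means ahead of schedule
            "cpi": bcwp / acwp if acwp else None,  # cost performance index
            "spi": bcwp / bcws if bcws else None,  # schedule performance index
        }

    # Example with hypothetical monthly figures (labor hours or dollars).
    print(earned_value_summary(bcws=1200.0, bcwp=1100.0, acwp=1300.0))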
Software Functionality Measures
"Function measurement methods rely on some definition of what constitutes software functionality."
Swerefn
refnum
365
"Function Point Analysis has been proven as a reliable method for measuring the size of computer software. In addition to measuring output, Function Point Analysis is extremely useful in estimating projects, managing change of scope, measuring productivity, and communicating functional requirements."
Swerefn
refnum
203
Number of requirements included in a completed build/release (planned vs. actual). The tracking of completed requirements assists in the verification and validation activities. Consideration for assigning weighted values to each requirement may provide a better insight into the true level of completion of the software development activities.
Function Points (planned vs. actual). The identification of and planned completion rates or dates for function points vs. actual results indicates the level of control being maintained on the complexity and the productivity of the software. Care must be taken to define and count function points in a common manner across all the organizations developing software. In addition, the reporting and recording needs to include sufficient information for the user or reader of the SMR to understand the function point reporting methodology.
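As a hedged illustration of one common counting convention, the sketch below computes an unadjusted function point total from counts of the five IFPUG function types, assuming average complexity weights for every item. The counts are hypothetical; whichever weighting and counting convention a project actually adopts needs to be documented in the SMR so planned and actual totals are comparable across organizations.

    # Minimal sketch: unadjusted function point count using IFPUG-style
    # average complexity weights. All counts below are hypothetical.
    AVERAGE_WEIGHTS = {
        "external_inputs": 4,
        "external_outputs": 5,
        "external_inquiries": 4,
        "internal_logical_files": 10,
        "external_interface_files": 7,
    }

    def unadjusted_function_points(counts):
        """Sum each function type count multiplied by its average weight."""
        return sum(AVERAGE_WEIGHTS[k] * counts.get(k, 0) for k in AVERAGE_WEIGHTS)

    planned = {"external_inputs": 20, "external_outputs": 12,
               "external_inquiries": 8, "internal_logical_files": 5,
               "external_interface_files": 3}
    actual = {"external_inputs": 24, "external_outputs": 12,
              "external_inquiries": 9, "internal_logical_files": 5,
              "external_interface_files": 4}

    print("planned UFP:", unadjusted_function_points(planned))
    print("actual UFP:", unadjusted_function_points(actual))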
Software Quality Measures
"Historically, software quality metrics have been the measurement of exactly their opposite-that is, the frequency of bugs and defects."
Swerefn
refnum
232
Crosby has defined software quality as conformance to a specification. (SWEREF-169)
Number of software Problem Reports/Change Requests (new, open, closed, severity). The Center or project configuration management system, the change management process, and the problem reporting and corrective action systems need to be described in consistent terms. If other or additional nomenclature or measures (new, open, closed, severity) are used, explain them in the SMR.
Review of item discrepancies (open, closed, and withdrawn). Actual discrepancy reports and their storage procedures are the source for this metric input. Sufficient references and/or citations are needed to enable readers or users of the SMR to find individual reports.
Number of software peer reviews/inspections (planned vs. actual). This information comes from the Software Development or Management Plan and from periodic reports and status meetings on peer reviews.
Software peer review/inspection information, e.g., effort, review rate, defect data. This information is defined in the Software Development or Management Plan and from periodic reports and status meetings.
Number of software audits (planned vs. actual). This information is defined in the Quality Assurance Plan and from periodic reports and status meetings (see SWE-022).
Software audit findings information, e.g., number and classification of findings. This information is defined in the Quality Assurance Plan and from periodic reports and status meetings.
Software risks and mitigations. This information is typically a summary of the risk management system and related tracking reports for risk mitigation (see SWE-086).
Number of requirements verified or status of requirements validation. This information is typically a summary of verification and validation (V&V) activities, testing reports, and completed design reviews.
Results from static code analysis tools. The importance given to the analysis results from using the static analyzers indicates a confidence in this type of quality assurance activity. Methods for reporting these results are based on the tool outputs and the project's needs. Again, sufficient descriptive information needs to be included to make these metrics understandable to the user.
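Because static analyzer output formats differ by tool, the following minimal sketch assumes findings have already been exported as records with hypothetical "checker", "severity", and "status" fields; it shows one way such results might be rolled up by severity and by checker for reporting in the SMR.

    # Minimal sketch: rolling up static code analysis findings for the SMR.
    # Field names and checker IDs are assumptions; adapt them to the schema
    # of the tool actually used on the project.
    from collections import Counter

    findings = [
        {"checker": "NULL_DEREF", "severity": "high", "status": "open"},
        {"checker": "BUFFER_OVERRUN", "severity": "high", "status": "closed"},
        {"checker": "UNUSED_VALUE", "severity": "low", "status": "open"},
    ]

    by_severity = Counter(f["severity"] for f in findings)
    open_by_checker = Counter(f["checker"] for f in findings if f["status"] == "open")

    print("findings by severity:", dict(by_severity))
    print("open findings by checker:", dict(open_by_checker))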
Software Requirements Volatility
"The identified causes of requirements volatility include presence of inconsistencies or conflicts among requirements; evolving user/customer knowledge and priorities; project concurrent activities like defect fixing, functionality correction; technical, schedule or cost related problems; change in work environment; and process model selection decisions."
Swerefn
refnum
347
Number of software requirements. Indicate the source of the identified requirements, e.g., the Software Requirements Specification or lower-level specifications. Describe whether the count is by unique paragraph (line) number or by the number of "shall" statements. (Unfortunately, not all requirements specifications practice good requirement-writing techniques by having only one "shall" per paragraph (line).)
Number of software requirements changes (additions, modifications, deletions) per month. The SMR may include trend lines for running averages, life-cycle phases, or annual totals (a small trend computation sketch follows this list). Describe the metric, especially if time periods other than or in addition to the requested 'per month' period are used.
Number of To Be Determined items. Be careful to differentiate between To Be Revised (i.e., an approximate or initial condition value exists) and To Be Determined (entry is blank) values.
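The sketch below is a minimal illustration of one way to present the volatility trend: hypothetical monthly change counts reduced to a three-month running average. All months and counts are invented for illustration.

    # Minimal sketch: monthly requirements-change counts with a 3-month
    # running average (all values hypothetical).
    monthly_changes = {  # additions + modifications + deletions per month
        "2024-01": 14, "2024-02": 9, "2024-03": 11,
        "2024-04": 5, "2024-05": 4, "2024-06": 2,
    }

    months = sorted(monthly_changes)
    for i, month in enumerate(months):
        window = months[max(0, i - 2): i + 1]  # up to the 3 most recent months
        avg = sum(monthly_changes[m] for m in window) / len(window)
        print(f"{month}: {monthly_changes[month]:3d} changes, "
              f"3-month running average {avg:.1f}")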
Software Characteristics
Software characteristics are sometimes called software attributes. These basic characteristics are necessary for developing cost models, planning aids, and general management principles. (SWEREF-329)
A simple X-row by Y-column table can be used to capture and effectively display this information (a small example sketch follows this list).
Project name. Consider if there are naming conventions or if additional informative material is needed to differentiate between similarly entitled projects.
Language. Larger projects may use multiple languages. Identify versions and service pack information, if appropriate.
Software domain (flight software, ground software, Web application). Identify plans to use software in multiple domains.
Number of source lines of code by categories, e.g., new, modified, reuse (planned vs. actual). Include counting methodologies and conventions to ensure numbers are provided on a common basis.
Computer resource utilization in percentage of capacity. Allot enough space in the SMR if this utilization varies by software domain, life-cycle phase, or any other discriminator.
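As a minimal sketch of the X-row by Y-column characteristics table, the following writes one hypothetical project row as comma-separated values; the column names and figures are illustrative only and would be tailored to the project's counting conventions and domains.

    # Minimal sketch: capturing software characteristics as CSV rows.
    # Project name, language, and all figures below are hypothetical.
    import csv
    import sys

    rows = [
        {"project": "Example FSW", "language": "C++17", "domain": "flight software",
         "sloc_new_planned": 40000, "sloc_new_actual": 43500,
         "sloc_reuse_planned": 12000, "sloc_reuse_actual": 11800,
         "cpu_utilization_pct": 62},
    ]

    writer = csv.DictWriter(sys.stdout, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)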
Formats
The SMR can be provided in an electronic format or via access to the software metric data repository; providing the data electronically is the preferred method. This document does not have to be a formal hard copy Data Requirement Document if electronic access to the information is provided. It is also acceptable for an organization to provide this information as part of a monthly software review process. The final set of metrics used by a project needs to be determined by the project and organizational needs and the software development life-cycle phase. Which software metrics are reported at what point in the development or maintenance life cycle needs to be addressed in the software planning documents or organizational planning documents.
4. Small Projects
This guidance lists a minimum recommendation for providing five areas of information on a Computer Software Configuration Item (CSCI) basis. Smaller projects may consider providing less information than listed in the example sets for each information item. The information included in the SMR needs to be sufficient to manage the project, manage risk, and maintain safety throughout the project's life cycle.
5. Resources
(Resources are included from the MC Metrics page.)
5.1 Tools
(Tools are included from the TT Metrics page.)
6. Lessons Learned
No Lessons Learned have currently been identified.