- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. The Requirement
5.4.4 The project manager shall provide access to the software measurement data, measurement analyses, and software development status as requested to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA.
1.1 Notes
NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.
1.2 History
1.3 Applicability Across Classes
[Applicability table across Classes A-F; the per-class markings are keyed Applicable / Not Applicable in the online Handbook.]
2. Rationale
This requirement is intended to provide access to software metric data during the project life cycle for those Agency and Center organizations responsible for assessing and using the metric data. The software development project organizations may need to provide access to several organizations tasked with reviewing or assessing software development progress and quality. When a software effort is acquired from an industry partner, the contractor and/or subcontractors provide NASA with access to the software metric information in time to allow use of the information.
NASA established software measurement programs to meet measurement objectives at multiple levels within the Agency. In particular, measurement programs are established at both the project and Center levels to satisfy organizational, project, program, and Directorate needs. Centers maintain measurement systems and record repositories used for records retention and for subsequent analysis and interpretation, both to identify overall levels of Center competence in software engineering and to identify future opportunities for training and software process improvements.
3. Guidance
Management metrics are measurements used to evaluate how well software development activities are being conducted across multiple development organizations. Trends in management metrics support forecasts of future progress, early trouble detection, and realism in current plans. In addition, adjustments to software development processes can be evaluated once they are quantified and analyzed. The collection of Center-wide data and analysis results provides information and guidance to Center and Agency leaders for evaluating overall capabilities and for planning improvement activities and training opportunities for more advanced software process capabilities.
The data gained from Center measurement programs assist in managing projects, assuring and improving safety and the quality of products and processes, and improving overall software engineering practices. In general, project-level and Center-level measurement programs are designed to meet the following high-level goals (a brief trend-analysis sketch follows the list):
- To improve future planning and cost estimation.
- To provide realistic data for progress tracking.
- To provide indicators of software quality.
- To provide baseline information for future process improvement activities.
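The progress-tracking and trouble-detection goals above can be made concrete with a small example. The sketch below is a minimal illustration only, assuming hypothetical weekly open-defect counts; a real project would apply the same idea to the measures defined in its plans (see SWE-090).

```python
# Minimal sketch: fit a least-squares line to weekly open-defect counts
# to expose the kind of trend that supports progress tracking and early
# trouble detection. All data values are hypothetical.

def linear_trend(values: list[float]) -> tuple[float, float]:
    """Return (slope, intercept) of a least-squares line fit over
    equally spaced observations (week 0, 1, 2, ...)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

weekly_open_defects = [12, 15, 19, 18, 24, 28]  # hypothetical metric data
slope, intercept = linear_trend(weekly_open_defects)
projected_next = slope * len(weekly_open_defects) + intercept
print(f"Trend: {slope:+.1f} defects/week; projected next week: {projected_next:.0f}")
```

A persistent upward slope in open defects, for example, is the kind of early signal that prompts a closer look at plans and staffing before the problem grows.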
Other requirements (SWE-090, SWE-091, SWE-092, and SWE-093) define the software measurement data and the analysis methods used to produce the results accessed under this requirement (SWE-094). The information is stored in the Center repository (SWE-091) and recorded in a Software Metrics Report (see 5.05 - Metrics - Software Metrics Report).
Software measurement data, including software development status (see SWE-090) and measurement analyses (see SWE-093), should be accessible to the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, and Headquarters SMA, and should be capable of being captured in Center repositories.
It is the project's responsibility to ensure that the measurement data, analysis results, and development status are communicated properly, on time, and to the appropriate people. The project collects data and reports analysis results periodically according to the established collection, reporting, and storage procedures (see SWE-090). The project also makes the information available regularly (as designed) and provides access at the request of the sponsoring Mission Directorate, the NASA Chief Engineer, the Center Technical Authorities, Headquarters SMA, and personnel managing Center repositories.
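The periodic reporting and repository capture described above can be sketched in code. The example below is a minimal illustration only, packaging one reporting period's data and analysis summary as JSON; every field name, value, and path is an assumption, since actual report content and repository interfaces are defined by project and Center procedures (see SWE-090 and SWE-091).

```python
# Minimal sketch of packaging periodic measurement data, analysis
# results, and development status into a machine-readable report that a
# Center repository could capture. All fields and paths are assumptions.

import json
from datetime import date
from pathlib import Path

report = {
    "project": "EXAMPLE-PROJECT",  # hypothetical identifier
    "reporting_period_end": date.today().isoformat(),
    "development_status": "integration testing underway",  # illustrative
    "measurements": {
        "sloc_logical": 48200,
        "open_defects": 24,
        "requirements_changed_this_period": 3,
    },
    "analysis_summary": "Defect discovery rate trending upward; staffing under review.",
}

out_dir = Path("metrics_reports")  # hypothetical repository drop point
out_dir.mkdir(exist_ok=True)
out_file = out_dir / f"metrics_report_{date.today():%Y%m%d}.json"
out_file.write_text(json.dumps(report, indent=2))
print(f"Wrote {out_file}")
```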
NASA-specific software measurement access information and resources are available in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
Additional guidance related to software measurements may be found in the related requirements SWE-090, SWE-091, SWE-092, and SWE-093 in this Handbook.
4. Small Projects
No additional guidance is available for small projects.
5. Resources
5.1 References
- (SWEREF-157) CMMI Development Team (2010). CMMI for Development, Version 1.3, CMU/SEI-2010-TR-033. Software Engineering Institute.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN. SPAN is a compendium of processes, procedures, job aids, examples, and other recommended best practices.
- (SWEREF-329) Software Engineering Program (1994). Software Measurement Guidebook, Technical Report NASA-GB-001-94. Doc ID: 19980228474 (acquired Nov 14, 1998).
- (SWEREF-336) Software Technology Support Center (STSC) (1995). Hill Air Force Base. Accessed 6/25/2019.
- (SWEREF-355) Westfall, Linda, The Westfall Team (2005). 12 Steps to Useful Software Metrics. Retrieved November 3, 2014 from http://www.win.tue.nl/~wstomv/edu/2ip30/references/Metrics_in_12_steps_paper.pdf
- (SWEREF-567) Public Lessons Learned Entry: 1772.
- (SWEREF-577) Public Lessons Learned Entry: 3556.
5.2 Tools
Tools relevant to this requirement may be found in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN, accessible to NASA users.
The list is informational only and does not represent an "approved tool list," nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and Centers decide which tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
The NASA Lessons Learned database contains the following lessons learned related to access to measurement data:
- Know How Your Software Measurement Data Will Be Used, Lesson No. 1772 (SWEREF-567): "Before Preliminary Mission & Systems Review (PMSR), the Mars Science Laboratory (MSL) flight project submitted a Cost Analysis Data Requirement (CADRe) document to the Independent Program Assessment Office (IPAO) that included an estimate of source lines of code (SLOC) and other descriptive measurement data related to the proposed flight software. The IPAO input this data into its parametric cost estimating model. The project had provided qualitative parameters that were subject to misinterpretation and provided physical SLOC counts. These SLOC values were erroneously interpreted as logical SLOC counts, causing the model to produce a cost estimate approximately 50 percent higher than the project's estimate. It proved extremely difficult and time-consuming for the parties to reconcile the simple inconsistency and reach agreement on the correct estimate."
The Recommendation states that "Before submitting software cost estimate support data (such as estimates of total SLOC and software reuse) to NASA for major flight projects (over $500 million), verify how the NASA recipient plans to interpret the data and use it in their parametric cost estimating model. To further preclude misinterpretation of the data, the software project may wish to duplicate the NASA process using the same or a similar parametric model, and compare the results with NASA's."
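The physical-versus-logical SLOC distinction at the root of this lesson can be illustrated with a short sketch. The example below is an approximation only: it treats Python statements counted via the standard ast module as a stand-in for a logical count, and the sample source is hypothetical; real SLOC counters follow a published counting standard.

```python
# Minimal sketch contrasting a physical line count with a (rough)
# logical statement count. A physical count includes comments and blank
# lines; a logical count tallies statements, so the two can differ
# substantially, which is the misinterpretation behind Lesson 1772.

import ast

source = '''\
def area(w, h):
    # physical counts include this comment and the blank line below
    result = w * h

    return result
'''

physical_sloc = len(source.splitlines())
non_blank_non_comment = sum(
    1 for line in source.splitlines()
    if line.strip() and not line.strip().startswith("#")
)
logical_sloc = sum(isinstance(node, ast.stmt) for node in ast.walk(ast.parse(source)))

print(f"physical={physical_sloc}, non-blank/non-comment={non_blank_non_comment}, "
      f"logical (approx.)={logical_sloc}")
```

Feeding the physical count (5 here) into a model calibrated for logical counts (3 here) inflates the estimate, which is the kind of mismatch the lesson describes.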
- Selection and Use of Software Metrics for Software Development Projects, Lesson No. 3556 (SWEREF-577): "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.
"Metrics or measurements are used to provide visibility into a software project's status during all phases of the software development life cycle to facilitate an efficient and successful project." The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:- "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
- "The number of errors found during software verification/validation.
- "The number of errors found in delivered software (a.k.a., 'process escapes').
- "Projected versus actual labor hours expended.
- "Projected versus actual lines of code, and the number of function points in delivered software."
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
- Confirm access to software measurement data, analyses, and status, as requested, for the following entities:
- Sponsoring Mission Directorate
- NASA Chief Engineer
- Center Technical Authorities
- Headquarters SMA
7.2 Software Assurance Products
- None at this time.
Objective Evidence
- Status presentation showing metrics and trending data
7.3 Metrics
- None identified at this time
7.4 Guidance
The software assurance task in this requirement is to confirm that the four groups/individuals listed below have access to the project’s software measurement data, measurement analysis, and software development status, as requested:
- Sponsoring Mission Directorate
- NASA Chief Engineer
- Center Technical Authorities
- Headquarters SMA
In some cases, the project may include some of this information in its status reports, such as the software development status and the measurement analysis describing the major takeaways from the measurements collected. If this information is in the status reports, check for evidence that it was distributed or communicated to the entities listed above. Measurements and metrics should be identified in the project-specific management or development plans.
Most of the groups above are unlikely to see this information regularly, particularly the raw software measurement data, so it will be necessary to ask whether the information was available when it was requested. (It may be easier to ask whether any requests for software measurement data, measurement analysis, or software development status received no results or incomplete results. In other words, did the requesters get what they asked for?)
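One way to support the "did they get what they asked for?" check, assuming the project keeps a log of measurement-data requests (the record layout below is a hypothetical illustration), is to flag requests without complete responses:

```python
# Minimal sketch: flag measurement-data requests that received no
# response or incomplete results. The record layout is an assumption;
# an actual project would pull this from its own tracking system.

requests = [  # hypothetical request log
    {"requester": "Headquarters SMA", "item": "measurement analysis",
     "status": "complete"},
    {"requester": "Center Technical Authority", "item": "measurement data",
     "status": "incomplete"},
    {"requester": "NASA Chief Engineer", "item": "development status",
     "status": "no response"},
]

for req in requests:
    if req["status"] != "complete":
        print(f"Follow up: {req['requester']} requested {req['item']} "
              f"({req['status']})")
```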
See Topic 8.03 - Organizational Goals of Software Assurance Metrics for a table of software assurance metrics with their associated goals and questions.