- 1. The Requirement
- 2. Rationale
- 3. Guidance
- 4. Small Projects
- 5. Resources
- 6. Lessons Learned
- 7. Software Assurance
1. Requirements
3.9.2 The project manager shall acquire, develop, and maintain software from an organization with a non-expired CMMI-DEV rating as measured by a CMMI Institute Certified Lead Appraiser as follows:
- For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
- For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.
1.1 Notes
Organizations need to complete an official CMMI® Institute defined appraisal against the CMMI®-DEV model V2.0. Organizations are to maintain their rating and have their results posted on the CMMI® Institute Website, or provide an Appraisal Disclosure Statement so that NASA can assess the current maturity/capability rating. Software development organizations need to maintain their appraisal rating during the period they are responsible for the development and maintenance of the software. CMMI® ratings can cover a team, a group, a project, a division, or an entire organization.
For Class B software, an exception can be exercised for those cases in which NASA wishes to purchase a product from the "best in class provider," but the best in class provider does not have the required CMMI® rating. For Class B software, instead of a CMMI® rating by a development organization, the project will conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority (ETA), against the CMMI®-DEV Maturity Level 2 practices, and mitigate any risks if deficiencies are identified in the evaluation. If this approach is used, the development organization and project are responsible for correcting the deficiencies identified in the evaluation. When this exception is exercised, the OCE and Center ETA are notified of the proposition and provided the results of the evaluation. The project manager should seek guidance from the Office of Procurement (OP) for help in exercising the exception.
1.2 History
1.3 Applicability Across Classes
Class | A | B | C | D | E | F
---|---|---|---|---|---|---
Applicable? | ✓ | ✓ | ✗ | ✗ | ✗ | ✗

Key: ✓ - Applicable | ✗ - Not Applicable
2. Rationale
The CMMI® requirement is a qualifying requirement for NASA. The requirement is included to ensure that NASA projects are supported by software development organization(s) having the necessary skills and processes in place to produce reliable products within cost and schedule estimates.
3. Guidance
3.1 Capability Maturity Model Integration
The Capability Maturity Model (CMM®) and the CMMI®-DEV are internationally used frameworks for process improvement in development organizations. CMMI®-DEV is an organized collection of best practices and proven process areas. Practices cover topics that include eliciting and managing requirements, decision making, measuring performance, planning work, handling risks, and more. Using these practices, NASA can improve NASA software projects’ chances of mission success.
This requirement provides NASA with a methodology to:
- Measure software development organizations against an industry-wide set of best practices that address software development and maintenance activities applied to products and services.
- Measure and compare the maturity of an organization’s product development and acquisition processes with the industry state of the practice.
- Measure and ensure compliance with the intent of the NPR 7150.2 process-related requirements using an industry-standard approach.
- Assess internal and external software development organizations' processes.
- Identify potential risk areas within a given organization’s software development processes.
Benefits of using CMMI® include:
- Reducing the risk of software failure, thereby increasing mission safety.
- Improving the accuracy of schedule and cost estimates by requiring the use of historical data and repeatable methods.
- Helping NASA become a smarter buyer of contracted-out software.
- Increasing quality by finding and removing more defects earlier.
- Improving the potential for reuse of tools and products across multiple projects.
- Increasing the ability to meet the challenges of evolving software technology.
- Improving software development planning across the Agency.
- Improving software engineering practice across the NASA contractor community.
- Lowering software development costs.
- Improving employee morale.
- Improving customer satisfaction.
- Improving NASA and contractor community knowledge and skills.
- Providing NASA with a solid foundation and structure for developing software in a disciplined manner.
CMMI® ratings can cover a team, a workgroup, a project, a division, or an entire organization. When evaluating software suppliers, it’s important to make sure that the specific organization doing the software work on the project has the cited rating (as some parts of a company may be rated while others are not).
Many of the requirements in NPR 7150.2 are consistent with the established process areas in the CMMI®-DEV framework. The CMMI®-DEV rating and the consistent NPR 7150.2 requirements are both needed to ensure that organizations have demonstrated the capability to perform key software engineering processes and have a binding agreement to continue to execute key software engineering processes during the development of NASA’s most critical software systems. See also SWE-129 - OCE NPR Appraisals and SWE-221 - OSMA NPR Appraisals.
See also SWE-003 - Center Improvement Plans for information about CMMI in the Center Improvement Plans.
3.2 Applicability
This requirement applies to software in Classes A and B. It is recommended that projects check the status of the software development or maintenance organization's CMMI® rating at each major project life cycle review to ensure continued compliance and to identify potential risk areas in the software processes. A "check" can easily be done via the CMMI® Institute's Published Appraisals website 327.
3.3 General Software Acquisition Guidance
The content of the supplier agreement is critical to the acquisition of any software, including software embedded in a delivered system. In addition to the CMMI® Maturity Level requirements placed on the supplier by SWE-032, the supplier agreement must also specify compliance with the software contract requirements identified in NPR 7150.2. The creation and negotiation of any supplier agreement involving software need to include representatives from the Center's software engineering and software assurance organizations to ensure that the software requirements are represented in the acquisition agreement(s). The agreements identify the following aspects of the acquisition:
- Technical requirements on the software.
- Definition and documentation of all software deliverables.
- Required access to intermediate and final software work products throughout the development life cycle.
- Compliance and permissible exceptions to NPR 7150.2 and any applicable Center software engineering requirements.
- Software development status reporting, including implementation progress, technical issues, and risks.
- Definition of acceptance criteria for software and software work products.
- Non-technical software requirements, including licensing, ownership, use of third-party or Open Source Software, and maintenance agreements.
See also Topic 7.03 - Acquisition Guidance
Representatives from the Center's software engineering and assurance organizations must evaluate all software-related contract deliverables before acceptance by the Project. The deliverables must be evaluated for:
- Compliance with acceptance criteria.
- Completeness.
- Accuracy.
3.3.1 Class A software
If you acquire, develop, or maintain Class A software, the organization performing those functions is required to have a non-expired CMMI®-DEV Level 3 or higher rating.
3.3.2 Class A software acquisition guidance
To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 3 or higher or rated at CMMI®-DEV Capability Level 3 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables. The extent of the CMMI®-DEV Level 3 rated organization's support required for a Class A acquisition can be determined by the Center's Engineering Technical Authority responsible for the project.
Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 3 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.
For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver for those exclusions or alternate requirements in the form of a streamlined compliance matrix for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.
3.3.3 Class A software development or maintenance guidance
The software organizations that directly develop or maintain Class A software are required to have a valid CMMI®-DEV Level 3 or higher rating for the organization performing the activities. Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments. Project contractors and subcontractors performing Class A software development are required to have their own CMMI®-DEV Level 3 rating. NASA and primes need to pass this requirement down in contracts to ensure all subcontractors have the necessary CMMI®-DEV rating.
The CMMI®-DEV Level 3 rating is to be maintained throughout the project’s development or maintenance period. NASA requests that organizations’ CMMI® ratings be posted on the CMMI® Institute website 327. The CMMI® Institute vets the validity of the CMMI® appraisals on this list and assures the rating hasn’t expired (as of this writing, CMMI® ratings are valid for three years). In rare instances (e.g., a rating earned in a classified environment), an organization may have a current CMMI®-DEV rating that does not appear on the CMMI® Institute website. In these cases, the supplier’s claim can be checked directly with the CMMI® Institute.
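Because ratings expire, a project may want a lightweight way to flag, at each milestone review, whether a supplier's rating will lapse before the development or maintenance period ends. The sketch below is illustrative only: the function names and record fields are assumptions, the three-year validity window is the figure cited above, and the CMMI® Institute's Published Appraisals website 327 remains the authoritative source.

```python
from datetime import date, timedelta

# Approximate three-year validity window cited in the guidance above
# (as of this writing); adjust if the CMMI Institute changes its policy.
APPRAISAL_VALIDITY = timedelta(days=3 * 365)

def rating_is_current(appraisal_date: date, as_of: date | None = None) -> bool:
    """True if the rating's validity window has not yet expired."""
    as_of = as_of or date.today()
    return as_of <= appraisal_date + APPRAISAL_VALIDITY

def rating_covers_period(appraisal_date: date, period_end: date) -> bool:
    """True if the current rating stays valid through the planned end of the
    development or maintenance period (i.e., no re-appraisal is needed first)."""
    return period_end <= appraisal_date + APPRAISAL_VALIDITY

# Example: a June 2023 appraisal checked at a January 2025 review,
# for a project whose maintenance period runs through December 2026.
print(rating_is_current(date(2023, 6, 1), as_of=date(2025, 1, 15)))  # True
print(rating_covers_period(date(2023, 6, 1), date(2026, 12, 31)))    # False: re-appraisal needed
```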
3.3.4 Class B software
(except Class B software on NASA Class D payloads) - CMMI®-DEV Maturity Level 2 Rating or higher for software, or CMMI®-DEV Capability Level 2 Rating or higher for software in the following process areas:
a. Requirements Management.
b. Configuration Management.
c. Process and Product Quality Assurance.
d. Measurement and Analysis.
e. Project Planning.
f. Project Monitoring and Control.
g. Supplier Agreement Management (if applicable).
3.3.5 Class B software acquisition guidance
To ensure that the solicitation, contract, and delivered products meet the requirements of this NPR, the Project's acquisition team must be supported by representatives from a software engineering and software assurance organization that is either rated at CMMI®-DEV Maturity Level 2 or higher or rated at CMMI®-DEV Capability Level 2 in at least the process areas of Supplier Agreement Management and Process and Product Quality Assurance. This support may be in the form of direct involvement in the development of supplier agreements or review and approval of these agreements. The support must also include the review and approval of any software-related contract deliverables.
The Center Engineering Technical Authority responsible for the project determines the extent of the CMMI®-DEV Level 2 rated organization's support required (see description in the previous paragraph) for a Class B acquisition. Identification of the appropriate personnel from an organization that has been rated at a CMMI®-DEV Level 2 or higher to support the Project acquisition team is the responsibility of the designated Center Engineering Technical Authority and Center Management. The Center Engineering Technical Authority has the responsibility for ensuring that the appropriate and required NASA Software Engineering requirements are included in an acquisition.
For those cases in which a Center or project desires a general exclusion from the NASA Software Engineering requirement(s) in this NPR or desires to generically apply specific alternate requirements that do not meet or exceed the requirements of this NPR, the requester can submit a waiver in the form of a streamlined compliance matrix for those exclusions or alternate requirements for approval by the designated Engineering and SMA Technical Authorities with appropriate justification.
3.3.6 Class B software development or maintenance guidance
The software organizations that directly develop or maintain Class B software are required to have a valid CMMI®-DEV Level 2 or higher rating (via a Continuous or Staged representation) for the organization performing the activities. Support contracts supporting NASA in-house software development organizations can be included in the NASA organizational assessments. Project contractors and subcontractors performing Class B software development are required to have their own CMMI®-DEV Level 2 or higher rating. The CMMI®-DEV Level 2 rating is to be kept active throughout the development or maintenance period, and the rating is to be posted on the CMMI® Institute website 327.
3.3.7 Guidance on the exception for Class B software development and maintenance
If this option is used, the project is responsible for funding the evaluation and for addressing all risks that are identified during the evaluation. A CMMI appraisal across the process areas listed in this requirement is one method for conducting this evaluation. The Center Engineering Technical Authority is responsible for maintaining all records associated with the evaluation for the life of the project and determines who participates in the evaluation process. As recommended guidance, the “qualified evaluator” should have demonstrated appraisal experience or training.
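For projects exercising this exception, one simple way to track evaluation coverage and the resulting risks is a lightweight checklist over the Maturity Level 2 process areas listed in Section 3.3.4. The sketch below is a minimal, hypothetical example; the data structures and names are assumptions, not part of the NPR or the CMMI® model, and the actual evaluation record format is up to the Center Engineering Technical Authority.

```python
from dataclasses import dataclass, field

# Maturity Level 2 process areas from Section 3.3.4 of this requirement.
ML2_PROCESS_AREAS = [
    "Requirements Management",
    "Configuration Management",
    "Process and Product Quality Assurance",
    "Measurement and Analysis",
    "Project Planning",
    "Project Monitoring and Control",
    "Supplier Agreement Management",  # if applicable
]

@dataclass
class ProcessAreaResult:
    """Hypothetical record of the evaluator's finding for one process area."""
    area: str
    evaluated: bool = False
    deficiencies: list[str] = field(default_factory=list)

def open_risks(results: list[ProcessAreaResult]) -> list[str]:
    """List process areas that were not evaluated or had deficiencies, i.e.,
    items the project must mitigate and report along with the evaluation results."""
    issues = []
    for r in results:
        if not r.evaluated:
            issues.append(f"{r.area}: not evaluated")
        for d in r.deficiencies:
            issues.append(f"{r.area}: {d}")
    return issues

# Example usage: all areas evaluated, one deficiency noted.
results = [ProcessAreaResult(a, evaluated=True) for a in ML2_PROCESS_AREAS]
results[1].deficiencies.append("No baseline audit records for release builds")
print(open_risks(results))
```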
3.3.8 Guidance on Class B software on NASA Class D payloads (as defined in NPR 8705.4) and Class C software
While not required, it is highly recommended that providers have a Certified CMMI® Lead Appraiser conduct periodic informal evaluations against process areas chosen by the project and project engineering based on the risk associated with the project. The project determines if an assessment is needed, identifies the required areas for the assessment, and communicates this information to the provider. A sample assessment process, “Process for Evaluation in Lieu of CMMI® Appraisal,” can be found in Software Processes Across NASA (SPAN), accessible to NASA users from the SPAN tab in this Handbook.
3.4 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook:
3.5 Center Process Asset Libraries
SPAN - Software Processes Across NASA
SPAN contains links to Center managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance including processes, forms, checklists, training, and templates related to Software Development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197
See the following link(s) in SPAN for process assets from contributing Centers (NASA Only).
4. Small Projects
The National Defense Industrial Association (NDIA) CMMI® Working Group conducted a study on the use of CMMI®-DEV within small businesses in 2010 158. One of the counter-intuitive findings was that perceptions of CMMI® as too burdensome for small businesses are not supported by data on CMMI®-DEV adoption: significant numbers of organizations in the 1-20 employee range have adopted CMMI® and achieved Level ratings.
Small projects are expected to take advantage of the Agency, Center, and/or organizational assets.
5. Resources
5.1 References
- (SWEREF-153) Chrissis, M. B., Konrad, M., & Shrum, S. (2010). 3rd Edition, Addison-Wesley Professional, ISBN: 0-321-71150-5. The definitive reference for CMMI-DEV Version 1.3; describes best practices for the development and maintenance of products and services across their life cycle.
- (SWEREF-157) CMMI Development Team (2010). CMU/SEI-2010-TR-033, Software Engineering Institute.
- (SWEREF-158) NDIA Working Group, NDIA Systems Engineering Division, CMMI Technology Conference, Nov, 2010.
- (SWEREF-196) CMMI - Capability, Maturity Model, Integration, Version 2.0, See your SEPG for a NASA licensed copy.
- (SWEREF-197) Software Processes Across NASA (SPAN) web site in NEN SPAN is a compendium of Processes, Procedures, Job Aids, Examples and other recommended best practices.
- (SWEREF-327) Software Engineering Institute (SEI), architecture web site.
- (SWEREF-422) Code Q-1, NASA Headquarters, February, 2001. Contains reports from 2004 through 2018
- (SWEREF-457) The link will generate a current list of Organizational Units which have completed and reported SCAMPI Class A appraisals against the CMMI or People CMM Model. Documented authorization has been received from the sponsor of each posted appraisal for this release of information.
- (SWEREF-553) Public Lessons Learned Entry: 1414.
5.2 Tools
NASA users can find applicable tools in the Tools Library on the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.
6. Lessons Learned
6.1 NASA Lessons Learned
A documented lesson from the NASA Lessons Learned database notes the following:
- Acquisition Philosophy and Mechanism. Lesson Number 1414 553: "Since the procurement's goal is to minimize the time from start to finish, part of its philosophy is to instill efficiency into the Contractor-Government roles and relationships. Thus, it becomes paramount during the selection process to ensure that the Contractor's processes, procedures, and tools are adequate (based on some established criteria such as ISO 9001 and/or CMMI) to allow the Government to take a 'hands-off' approach during implementation. Also, any criteria to be used to verify/validate and or assess the Contractor's work after the contract award must be consistent and compatible with the performance criteria levied on the Contractor."
Additional CMM/CMMI Lessons Learned by NASA associated with implementing and maintaining this requirement are:
- Preparing for an appraisal helps you get measurable process improvement.
- CMMI process helped Centers establish a baseline of where they are.
- Organizations develop an extensive set of "tools" (i.e., templates, spreadsheets) to help projects with CMMI practices and artifacts.
- The use of a toolset helped projects reach compliance much faster.
- The use of organizational tools helps support small project development efforts.
- Software Engineers who have participated in the CMMI process can serve as mentors, helping to implement project tools and helping projects use and tailor the software development processes.
- The CMMI process helps establish sponsorship across departments and with Engineering management.
- Establish a relationship early with the CMMI Lead Appraiser.
- PIID development depends on a good artifact collection process and a data management approach.
- The CMMI workshops can be used to review the processes in-depth and reinforce the toolsets.
- The CMMI process helped establish a method of tracking progress on software development activities.
- The CMMI process improves project management and software configuration management areas.
- CMMI assessments help identify areas for process and project improvement.
- Projects appreciate systematic and analytical feedback on what they are doing.
- Measurement and analysis are a big challenge in the CMMI process.
- Improved quality and review of management plans early in the life cycle, and reuse of those plans for new projects.
- Resource planning and tracking at the individual process level provided little additional benefit to the projects.
- Smaller projects need to have lightweight processes to avoid being smothered (especially for a one-person task).
6.2 Other Lessons Learned
- As part of its annual review, the Aerospace Safety Advisory Panel included this finding in the Computer Hardware/Software section of its Annual Report for 2000 422: "NASA has initiated plans to have its critical systems processes evaluated according to the Capability Maturity Model (CMM) of the CMMI Institute and to work toward increasing the CMM level of its critical systems processes."
- Evaluate all software problem reports or software change tickets identified as ‘no impact’ to design or testing. Ensure the rationale is adequate and, if it is found inadequate, perform a delta design/code review to ensure the code and data comply with the requirements.
- Enforce policy to use controlled design artifacts (ICDs, SRSs, SDDs) for implementation and verification purposes, rather than relying on informal design information.
  - Controlled content must be sufficient for implementation and verification purposes.
  - Software problem reports or software change tickets must be closed only based on formally controlled content.
7. Software Assurance
- For Class A software: CMMI-DEV Maturity Level 3 Rating or higher for software.
- For Class B software (except Class B software on NASA Class D payloads, as defined in NPR 8705.4): CMMI-DEV Maturity Level 2 Rating or higher for software.
7.1 Tasking for Software Assurance
1. Confirm that Class A and B software is acquired, developed, and maintained by an organization with a non-expired CMMI-DEV rating, as per the NPR 7150.2 requirement.
2. Assess potential process-related issues, findings, or risks identified from the CMMI assessment findings.
7.2 Software Assurance Products
- Assessment of CMMI Assessment Findings
- Software Assurance Process Audit Report
- SW Development Processes and Practices Audit Report
- Identification of any process-related risks and findings from the CMMI appraisal.
Objective Evidence
- Evidence of confirmation that the organization has the required CMMI-Dev rating.
7.3 Metrics
- # of software process Non-Conformances by life cycle phase over time.
- # of Compliance Audits planned vs. # of Compliance Audits performed
- # of Open vs. Closed Audit Non-Conformances over time
- Trends of # of Non-Conformances from audits over time (Include counts from process and standards audits and work product audits.)
- # of Non-Conformances per audit (including findings from process and compliance audits, process maturity)
- # of process Non-Conformances (e.g., activities not performed) identified by SA vs. # accepted by the project
- Trends of # Open vs. # Closed over time
- # of Risks by Severity (e.g., red, yellow, green) over time
- # of Risks with mitigation plans vs. total # of Risks
- # of Risks trending up over time
- # of Risks trending down over time
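A few of the metrics listed above (for example, open vs. closed audit non-conformances and non-conformances by life cycle phase) can be computed directly from the project's audit records. The sketch below is a minimal illustration; the record format and field names are assumptions, and projects should use whatever non-conformance tracking system they already have.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class NonConformance:
    """Hypothetical audit non-conformance record."""
    audit_id: str
    phase: str                 # life cycle phase in which it was found
    opened: str                # e.g., "2024-03"
    closed: Optional[str] = None

def open_vs_closed(ncs: list[NonConformance]) -> tuple[int, int]:
    """# of open vs. # of closed non-conformances."""
    closed = sum(1 for nc in ncs if nc.closed is not None)
    return len(ncs) - closed, closed

def non_conformances_by_phase(ncs: list[NonConformance]) -> Counter:
    """# of software process non-conformances by life cycle phase."""
    return Counter(nc.phase for nc in ncs)

# Example usage with a few sample records.
records = [
    NonConformance("AUD-01", "Design", "2024-03", closed="2024-05"),
    NonConformance("AUD-01", "Design", "2024-03"),
    NonConformance("AUD-02", "Implementation", "2024-06"),
]
print(open_vs_closed(records))            # (2, 1)
print(non_conformances_by_phase(records))
```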
7.4 Guidance
The project is responsible for funding the required CMMI evaluation and for addressing all risks that are identified during the CMMI evaluation.
The Capability Maturity Model Integration for Development (CMMI®-DEV) is an internationally used framework for process improvement in development organizations. It is an organized collection of best practices and proven process areas. Practices cover topics that include eliciting and managing requirements, decision making, measuring performance, planning work, handling risks, and more. Using these practices, NASA can improve NASA software projects’ chances of mission success.
This requirement provides NASA with a methodology to:
- Measure software development organizations against an industry-wide set of best practices that address software development and maintenance activities applied to products and services.
- Measure and compare the maturity of an organization’s product development and acquisition processes with the industry state of the practice.
- Measure and ensure compliance with the intent of the NPR 7150.2 process-related requirements using an industry-standard approach.
- Assess internal and external software development organizations' processes.
- Identify potential risk areas within a given organization’s software development processes.
Look at the CMMI Published Appraisal Results site 327 and verify that the software organization’s appraisal results are listed.
For Class B software, an exception can be exercised for those cases in which NASA wishes to purchase a product from the "best in class provider," but the best in class provider does not have the required CMMI® rating. For Class B software, instead of a CMMI® rating by a development organization, the project will conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority, against the CMMI-DEV Maturity Level 2 practices, and mitigate any risk, if deficiencies are identified in the evaluation. If this approach is used, the development organization and project are responsible for correcting the deficiencies identified in the evaluation. When this exception is exercised, the OCE and Center Engineering Technical Authority are notified of the proposition and provided the results of the evaluation. The project manager should seek guidance from the Center Office of Procurement for help in making these determinations.
For Class B software, the project can conduct an evaluation, performed by a qualified evaluator selected by the Center Engineering Technical Authority, against the CMMI-DEV Maturity Level 2 practices, and mitigate any risks if deficiencies are identified in the evaluation. Look for the assessment report produced by that evaluator for the results, verify that the assessment was approved by the qualified evaluator selected by the Center Engineering Technical Authority, and verify that the identified risks have been addressed by the project, engineering, and SMA as needed.
Every task that involves performing an audit should also ensure that all audit findings are promptly shared with the project and will be addressed.
7.5 Additional Guidance
Additional guidance related to this requirement may be found in the following materials in this Handbook: