




1. Requirements


2.1.5.5 The Center Director, or designee, shall report on the status of the Center’s software engineering discipline, as applied to its projects, upon request by the OCE, OSMA, or OCHMO.  

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

See SWE-095 History for the history of this requirement.


Include Page: SWEHBVD:SWE 1.3 A.13


2. Rationale

The intent of this requirement is to allow the NASA Office of the Chief Engineer (OCE) and the relevant Technical Authorities (TAs) to assess the status of NASA software improvement initiatives, measurement data, and workforce data; to address internal and external questions asked of the software community; to identify efficiencies in software engineering workflows; to facilitate collaboration between Centers; and to address Agency cost savings studies and questions.



3. Guidance

3.1 Status of the Center’s Software Engineering Discipline

In addition to the reporting and access requirements for measurement data at the Center level described in SWE-094 - Reporting of Measurement Analysis, periodic Agency-level reporting of the status of the Center’s software engineering discipline as applied to its projects is also required.  This data may be requested by the NASA OCE or relevant TAs to help them obtain high-level views of the Agency’s software engineering discipline.

Requests, typically made once or twice each year, can be expected to include:

  • Project NPR 7150 compliance matrices.
  • CMMI assessment status, schedule, and findings.
  • Contractor CMMI assessment status, schedule, and findings.
  • NPR issues and suggestions.
  • Training status.
  • Software organization improvement goals and metrics.
  • Software metrics.
  • Project status and risks.

Responses to the NASA OCE or relevant TAs can take the form of email, face-to-face meetings, presentations, spreadsheets, reports, or any other format that presents the requested information in a manner the OCE or TA can understand. If a specific format is desired, the OCE representative or TA will include that information in the request.
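
Because the requirement leaves the response format open, a Center can organize the requested items however works best for the requesting OCE representative or TA. The following is a minimal sketch, assuming Python and a hypothetical record layout, of one way a Center might capture the typical request items listed above in a single structured record that could back a spreadsheet or report; the class name, fields, and example values are illustrative assumptions, not an Agency-mandated format.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CenterSoftwareStatusReport:
    # Hypothetical structure for the items typically requested by the OCE or a TA.
    center: str
    reporting_period: str                                             # e.g., "FY2024, first half"
    compliance_matrices: List[str] = field(default_factory=list)      # project NPR 7150.2 compliance matrices
    cmmi_status: str = ""                                             # Center CMMI assessment status, schedule, findings
    contractor_cmmi_status: str = ""                                  # contractor CMMI status, schedule, findings
    npr_issues_and_suggestions: List[str] = field(default_factory=list)
    training_status: str = ""
    improvement_goals_and_metrics: List[str] = field(default_factory=list)
    software_metrics: Dict[str, float] = field(default_factory=dict)  # e.g., defect density, effort variance
    project_status_and_risks: List[str] = field(default_factory=list)

# Example use: populate a report and print it for inclusion in an email or presentation.
report = CenterSoftwareStatusReport(center="Example Center", reporting_period="FY2024, first half")
report.software_metrics["open high-severity defects"] = 3
print(report)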

See also SWE-091 - Establish and Maintain Measurement Repository.

See also SWE-002 - Software Engineering Initiative for the requirement on the Software Engineering Initiative.

See also Topic 7.01 - History and Overview of the Software Process Improvement (SPI) Effort for additional details on the SPI Initiative.

3.2 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

Related Links

Include Page: SWE-095 - Related SWEs

Include Page: SWE-095 - Related SM

3.3 Center Process Asset Libraries

Excerpt Include: SPAN

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

Include Page: SPAN improvement


4. Small Projects

No additional guidance is available for small projects.


5. Resources

5.1 References

refstable



5.2 Tools


Include Page: SWEHBVD:Tools Table Statement


6. Lessons Learned

6.1 NASA Lessons Learned

  • Selection and use of Software Metrics for Software Development Projects, Lesson No. 3556 [SWEREF-577]: "The design, development, and sustaining support of Launch Processing System (LPS) application software for the Space Shuttle Program provide the driving event behind this lesson.

"Metrics or measurements should be used to provide visibility into a software project's status during all phases of the software development life cycle to facilitate an efficient and successful project."  The Recommendation states that "As early as possible in the planning stages of a software project, perform an analysis to determine what measures or metrics will be used to identify the 'health' or hindrances (risks) to the project. Because collection and analysis of metrics require additional resources, select measures that are tailored and applicable to the unique characteristics of the software project, and use them only if efficiencies in the project can be realized as a result. The following are examples of useful metrics:

    • "The number of software requirement changes (added/deleted/modified) during each phase of the software process (e.g., design, development, testing).
    • "The number of errors found during software verification/validation.
    • "The number of errors found in delivered software (a.k.a., 'process escapes').
    • "Projected versus actual labor hours expended.
    • "Projected versus actual lines of code, and the number of function points in delivered software.


  • Flight Software Engineering Lessons, Lesson No. 2218 [SWEREF-572]: "The engineering of flight software (FSW) for a typical NASA/Caltech Jet Propulsion Laboratory (JPL) spacecraft is a major consideration in establishing the total project cost and schedule because every mission requires a significant amount of new software to implement new spacecraft functionality."

Recommendation No. 8 of this lesson learned provides the following step, among others, to mitigate the risk from defects in the Flight Software (FSW) development process:

"Use objective measures to monitor FSW development progress and to determine the adequacy of software verification activities. To reliably assess FSW production and quality, these measures should include metrics such as the percentage of code, requirements, defined faults tested, and the percentage of tests passed in both simulation and testbed environments. These measures should also identify the number of units where both the allocated requirements and the detailed design have been baselined, where coding has been completed and successfully passed all unit tests in both the simulated and testbed environments, and where they have successfully passed all stress tests."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.