SWE-058 - Detailed Design

1. Requirements

4.3.2 The project manager shall develop, record, and maintain a software design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-058 - Last used in rev NPR 7150.2D

Rev | SWE Statement

A

3.2.3 The project shall develop, record, and maintain a detailed design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

Difference between A and B: Descoped from a detailed design to just a design; otherwise, no change.

B

4.3.3 The project manager shall develop, record, and maintain a design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

Difference between B and C: Changed "design" to "software design".

C

4.3.2 The project manager shall develop, record, and maintain a software design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

Difference between C and D: No change.

D

4.3.2 The project manager shall develop, record, and maintain a software design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

1.3 Applicability Across Classes

(Applicability table by software class omitted; the Applicable / Not Applicable markers did not survive extraction.)

2. Rationale

The detailed design, as an extension of the architectural design, provides detailed descriptions of the lowest level of software components. The detailed design provides enough information for someone to create code without significant interpretation. The design maintains consistency with design standards, programming language dependencies, and external interfaces. Any redesign or maintenance of the software work product anywhere in the life cycle must also conform to the detailed design to avoid software performance degradation or errors. The documentation of the design and descriptions of the lower-level components enables other members of the team to base their activities on previous versions to assure successful coding, compiling, and testing.

Recording, which may be textual, visual, or a combination, allows for the design to be inspected to ensure that the design meets the project's requirements and architectural design. The design also needs to be maintained to assure updates are completed, and to assist in future modifications to the software.

3. Guidance

NPR 7150.2 describes the software detailed design activity, which is preliminary at project Preliminary Design Review (PDR) and baselined at Critical Design Review (CDR) (topic 7.08 - Maturity of Life-Cycle Products at Milestone Reviews), as occurring after the completion of the software architectural design process. This, in turn, is the "process of defining the architecture, components, modules, interfaces, and data" of a system or component. The software work product detailed design, which flows out of the software architectural definition (see SWE-057) and the preliminary design phase, is focused on defining the lower-level components, modules, and interfaces, and bringing them to a level of maturity sufficient to begin coding activities.

3.1 Design Readiness

A review of the success criteria for Preliminary Design Reviews (PDRs) (see 7.09 - Entrance and Exit Criteria) by the software development team will assure the readiness to proceed to the detailed design phase. The software development team then decides which criteria are necessary for the project.

An example checklist for determining readiness to begin software detailed design activities is shown below. It suggests "readiness" questions the software team leader can ask before starting and continuing into and during the detailed design activity. This checklist also assists the leader in understanding the software design issues of the project. If a question cannot be answered affirmatively, the leader must determine whether the project is ready to proceed.

Consider the following before starting the detailed design:

  • Do you have a well-documented software development process (see SWE-036)?
  • Do you understand what is to be performed and produced in each phase of the design process (see 5.08 - SDP-SMP - Software Development - Management Plan)?
  • Do you have a software architecture definition document (see 5.13 - SwDD - Software Design Description)?
  • Do you have a systems requirements specification (see 5.09 - SRS - Software Requirements Specification)?
  • Are you familiar with the methods, tools, standards, and guidelines for your project (see SWE-061)?
  • Are applicable and efficient design methods being implemented on your project?
  • Are the developers trained and experienced in the chosen development process and methods (see SWE-017)?
  • Is software reuse being considered throughout the development effort (see SWE-027)?
  • Is off-the-shelf software being considered for use on the project (see SWE-027)?
  • Has an analysis of alternatives been completed?
  • Is the selection of architecture and design methods based on system operational characteristics?

Consider the topics covered in the following subsections during detailed design.

3.2 Coding Standards and Processes

The design activity proceeds from the software architectural definition (see SWE-057). The key decomposition and dependency features and the design/coding standards specified or begun during the development of the software architecture are then maintained and/or completed during the design phase of the software development life cycle.

The software development team reviews and maintains the selected coding standards (see SWE-061) that were established for all the languages being used, documented, or referenced in the project's software documentation.

The initial steps in developing the software detailed design involve the definition of functions, inputs, and outputs for each component.

In some projects that employ component-based development, the components may be pre-existing and fixed, or customizable in only a limited way. The design focus then lies in integrating existing components to form new systems.

The software development team can describe the design by using a programming design language or by using pseudocode.
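For illustration, a lower-level unit might first be sketched in pseudocode or a program design language and then implemented beneath each step. The following Python sketch is a hypothetical example; the unit name and limit semantics are assumptions, not drawn from any NASA project:

```python
# Hypothetical example: a lower-level unit first described in a PDL-style
# sketch (kept in the docstring), then implemented step by step.

def limit_command_rate(requested_rate, max_rate):
    """Clamp a commanded slew rate to the allowed maximum.

    PDL sketch:
        IF magnitude of requested rate exceeds max rate THEN
            RETURN max rate with the original sign
        ELSE
            RETURN requested rate
    """
    # Step 1: magnitude check against the design limit
    if abs(requested_rate) > max_rate:
        # Step 2: clamp while preserving the commanded direction
        return max_rate if requested_rate > 0 else -max_rate
    # Step 3: rate already within limits
    return requested_rate
```

Keeping the pseudocode alongside the implementation makes it straightforward to inspect the unit against its design description during peer review.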

Consideration can be given to the reuse of software for all or some major components, such as:

  • Application generators.
  • Database management systems.
  • Human-computer interaction utilities.
  • Mathematical utilities.
  • Graphical utilities.
  • Library modules.
  • Library modules with shells built to standardize interfaces (e.g., for error handling) and enhance portability.

Once the final functions are determined, they are coded into the selected programming language. Although the decomposition activities nominally occur during the software architecting activities, the software development team must revisit the architecture if one or more of the following decomposition criteria will not be satisfied in the proposed detailed design:

  • Will the module have too many statements?
  • Will the module be too complex?
  • Does the module have low "cohesion" (a measure of how strongly related each piece of functionality expressed by the source code of a software module is)?
  • Does the module have high "coupling" (the degree to which the module depends on other modules)?
  • Does the module contain similar processing to other modules?
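The size and complexity criteria above can be checked mechanically in a rough way. The following Python sketch uses the standard-library `ast` module to count statements and estimate cyclomatic complexity for one unit's source; the thresholds are illustrative values only, not NPR-mandated limits:

```python
import ast

# Illustrative sketch: flag a unit that violates simple decomposition
# criteria (too many statements, too complex). Thresholds are example
# values chosen for the sketch, not mandated limits.

MAX_STATEMENTS = 50
MAX_COMPLEXITY = 10

def decomposition_check(source):
    """Return (statement_count, complexity, ok) for one unit's source."""
    tree = ast.parse(source)
    statements = sum(isinstance(node, ast.stmt) for node in ast.walk(tree))
    # Crude cyclomatic complexity: one path plus one per branch point.
    branch_nodes = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)
    complexity = 1 + sum(isinstance(node, branch_nodes)
                         for node in ast.walk(tree))
    ok = statements <= MAX_STATEMENTS and complexity <= MAX_COMPLEXITY
    return statements, complexity, ok

stmts, cplx, ok = decomposition_check(
    "def f(x):\n    if x > 0:\n        return x\n    return -x"
)
```

Cohesion and coupling are harder to measure automatically, but even simple size and branching counts give the team an early signal that a module should be decomposed further.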

3.3 Design Considerations

A best practice is to use modularity in the design, with minimal coupling between components and maximum cohesion within each component. Modular designs are more understandable, reusable, and maintainable than designs consisting of components with complicated interfaces and poorly matched groupings of functions. Each level of design is best described by including only the essential aspects and omitting the inessential detail. This means the design description includes the appropriate degree of "abstraction" (abstraction captures and represents only those details about an object that are relevant to the current perspective). This enhances understandability, adaptability, and maintainability.

Component-based software engineering emphasizes the use of software components with distinct functionality that overlaps as little as possible. It is a reuse-based approach in which independent components are defined, implemented, and composed into systems. Wikipedia is a good source for top-level information on this approach.

The last step of detailed design (some may also say this is the first step of the coding phase) includes the initial steps for coding the process modules in preparation for unit testing. This transition from detailed design to the coding and test phase of the software life cycle occurs when the team begins to compile the developed work products into the programming language (see SWE-060 and SWE-061).

3.4 Detailed Design Documentation and Progress Reviews

As a part of this requirement, the software development team must record descriptions of the design down to the lower-level units. The main records or outputs of the detailed design phase are captured in the following set of project documents and work products:

When documenting elements of the detailed design, include information such as:

  • Identifier and types.
  • Purpose and function.
  • Constraints.
  • Subordinates and dependencies.
  • Interfaces.
  • Resources.
  • Processing characteristics and special needs, if any (e.g., required timing or special conditions necessary to execute).
  • Other enabling data (e.g., products from the use of the Unified Modeling Language (UML) or a tool such as Rhapsody).
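As one illustration of how such element descriptions might be captured in a structured, machine-readable form, the following Python sketch defines a record type whose fields mirror the list above. The field and element names are assumptions for the example, not a NASA schema:

```python
from dataclasses import dataclass, field

# Illustrative sketch: a record type for one detailed-design element,
# with fields mirroring the documentation items listed above.

@dataclass
class DesignElementRecord:
    identifier: str                  # unique unit name
    element_type: str                # e.g., "module", "task", "interface"
    purpose: str                     # function the unit performs
    constraints: list = field(default_factory=list)
    subordinates: list = field(default_factory=list)  # dependencies
    interfaces: list = field(default_factory=list)
    resources: list = field(default_factory=list)     # memory, bus, timing
    processing_notes: str = ""       # special timing or conditions, if any

# Example entry (names invented for the sketch).
record = DesignElementRecord(
    identifier="acs_rate_limiter",
    element_type="module",
    purpose="Clamp commanded slew rates to safe limits",
    interfaces=["acs_cmd_in", "acs_cmd_out"],
)
```

Capturing the same fields for every unit makes the design records easy to inspect, trace, and keep current as the design evolves.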

Progress against plans is monitored by project management and documented at regular intervals in progress reports. Developers verify detailed designs in design reviews, level by level. Peer reviews and inspections of the software design before coding begins are used to eliminate design errors before conducting testing. See the guidance (SWE-087 - Software Peer Reviews and Inspections for Requirements, Plans, Design, Code, and Test Procedures) related to software peer reviews and inspection of design items.

Two variations of software peer reviews/inspections are useful:

  • Design inspection.
  • Scenario-based analyses.

In a design inspection, the software development team traces the logic of a module from beginning to end. In scenario-based analyses, component behavior is examined for its responses to specific inputs.

3.5 Design Maintenance

Once these detailed designs are created, they are maintained to reflect current project status, progress, and plans, which will change over the life of the project. If requirements changes occur, or if the design approach proves faulty, the software architecture design may need to be changed before continuing with the detailed design of the software work products. When requirements change (SWE-071 - Update Test Plans and Procedures), software designs, test plans, procedures, and the resulting test reports may also need to be updated or revised to reflect the changes.

Just as the project team ensures that initial detailed designs and their reports are reviewed and approved before use, the team also ensures that updates are reviewed and approved following project procedures. (See SWE-080 - Track and Evaluate Changes)

The software development team must continue to maintain accurate and current software design descriptions throughout the operation and maintenance phases of a project.

The Software Architecture Review Board, a software engineering sub-community of practice, is a good resource for software design information, including sample documents, reference documents, and expert contacts.

Consult Center Process Asset Libraries (PALs) for Center-specific guidance and resources related to best practices, tools, and design standards for software detailed design activities.

4. Small Projects

The completion of a software development project inherently means that detailed design activities are conducted and completed. Smaller projects may benefit from limiting the development or use of original or unique tools and environments in the detailed design process. Smaller projects may consider using previously developed coding standards and documentation processes, if applicable, rather than developing their own. These standard applications may be available in the Center's Process Asset Library (PAL) or the software PALs of other Centers.

5. Resources

5.1 References

5.2 Tools

Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.

6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lessons Learned database contains the following lessons learned related to software design:

  • MER Spirit Flash Memory Anomaly (2004). Lesson Number 1483: Enforce Design Guidelines: "A severely compressed flight software development schedule may prevent the achievement of a full understanding of software functions. During the MER (Mars Exploration Rover) software development process there was a continuous reprioritization of activities and focus. One impact of this dynamic process was that only the highest priority flight software issues and problems could be addressed, and memory management problems were viewed as low risk." Recommendation No. 1 states to "Enforce the project-specific design guidelines for COTS software, as well as for NASA-developed software. Assure that the flight software development team reviews the basic logic and functions of commercial off-the-shelf (COTS) software, with briefings and participation by the vendor."
  • Software Design for Maintainability. Lesson Number 0838: Impact of Non-Practice: "Because of increases in the size and complexity of software products, software maintenance tasks have become increasingly more difficult. Software maintenance should not be a design afterthought; it should be possible for software maintainers to enhance the product without tearing down and rebuilding the majority of code."
  • Mars Observer Inertial Reference Loss. Lesson Number 0310: Design for Maintenance, Lesson Learned No. 4: "Design flexibility of the flight computer and software is critical to the ability to uplink software patches for the correction of unexpected in-flight spacecraft anomalies."
  • ADEOS-II NASA Ground Network (NGN) Development and Early Operations – Central/Standard Autonomous File Server (CSAFS/SAFS) Lessons Learned. Lesson Number 1346: Use of Commercial Off the Shelf (COTS) products: "Match COTS tools to project requirements. Deciding to use a COTS product as the basis of system software design is potentially risky, but the potential benefits include quicker delivery, less cost, and more reliability in the final product. The following lessons were learned in the definition phase of the [software] development.
    • "Use COTS products and re-use previously developed internal products.
    • "Create a prioritized list of desired COTS features.
    • "Talk with local experts having experience in similar areas.
    • "Conduct frequent peer and design reviews.
    • "Obtain demonstration versions of COTS products.
    • "Obtain customer references from vendors.
    • "Select a product appropriately sized for your application.
    • "Choose a product closely aligned with your project's requirements.
    • "Select a vendor whose size will permit a working relationship.
    • "Use vendor tutorials, documentation, and vendor contacts during the COTS evaluation period."
  • Fault-Tolerant Design. Lesson Number 0707: "Systems which do not incorporate Fault Tolerant Design (FTD) as a part of their development process will experience a higher risk of a severely degraded or prematurely terminated mission, or it may result in excessively large weight volume, or high cost to achieve an acceptable level of performance by using non-optimized redundancy or overdesign...Incorporate hardware and software features in the design of spacecraft equipment that tolerates the effects of minor failures and minimizes switching from the primary to the second string. This increases the potential availability and reliability of the primary string."

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-058 - Detailed Design
4.3.2 The project manager shall develop, record, and maintain a software design based on the software architectural design that describes the lower-level units so that they can be coded, compiled, and tested.

7.1 Tasking for Software Assurance

  1. Assess the software design against the hardware and software requirements, and identify any gaps.
  2. Assess the software design to verify that the design is consistent with the software architectural design concepts and that the software design describes the lower-level units to be coded, compiled, and tested.
  3. Assess that the design does not introduce undesirable behaviors or unnecessary capabilities.
  4. Confirm that the software design implements all of the required safety-critical functions and requirements.
  5. Perform a software assurance design analysis.

7.2 Software Assurance Products

  • Software Design Analysis
  • Results of software assurance design analysis, including assessments in Tasks 1, 2, and 3. 
  • List of any identified design risks and issues.

    Objective Evidence

    • Software design analysis results, including design implementation of the required safety-critical functions and requirements.

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, and risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email or memo containing statements that confirmation has been performed with date (a checklist of confirmations could be used to record when each confirmation has been done!).
    • Signatures on SA reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • The specific products listed in the Introduction of 8.16 are also objective evidence as well as the examples listed above.

7.3 Metrics

  • # of architectural issues identified vs. number closed.
  • # of design issues found versus the number of design issues resolved.
  • # of safety-related requirement issues (Open, Closed) over time.
  • # of safety-related non-conformances identified by life-cycle phase over time.
  • # of software work product Non-Conformances identified by life-cycle phase over time.

7.4 Guidance

Task 1: Review the guidance on design in the software guidance section for this requirement (SWE-058) and keep in mind those considerations as the detailed design is being developed. Become familiar with the architectural design (preliminary design) as it evolves into the detailed design. Assess the evolving design by reviewing the requirements traces and attending design peer reviews. The hardware and software requirements should trace through to the architectural design and into the detailed design. If there are requirements that do not trace through the architectural design into the detailed design, identify the gaps and bring them to the attention of the design team. Track any such gaps to make sure they are addressed.
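One simple way to automate the gap check described above is a set difference between the requirement identifiers and those that trace into detailed-design units. The following Python sketch is illustrative; the identifiers and trace data are invented for the example:

```python
# Illustrative sketch of the Task 1 gap check: requirements that do not
# trace into any detailed-design unit are flagged as gaps. The IDs and
# trace data below are invented for the example.

def find_trace_gaps(requirement_ids, design_trace):
    """Return requirement IDs with no corresponding detailed-design unit."""
    traced = {req for req, units in design_trace.items() if units}
    return sorted(set(requirement_ids) - traced)

requirements = ["SRS-101", "SRS-102", "SRS-103"]
trace = {"SRS-101": ["unit_rate_limiter"], "SRS-102": []}  # SRS-103 absent
gaps = find_trace_gaps(requirements, trace)
```

Each identifier returned by the check represents a requirement to bring to the design team's attention and to track until it is addressed.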

Tasks 2 & 3: Review the evolving design and assess whether the design flows down from the software architectural design that has been defined during the preliminary design phase. The evolving detailed design should contain enough detail to write the code, compile it and test it. Attend the design peer reviews to get an understanding of how the final system will work in terms of satisfying the operational scenarios. Mentally stepping through the proposed designs with different scenarios in mind may give a good feel for how the final system will behave, what possible pitfalls there might be in the design, any missing elements in the design, and whether there are any superfluous capabilities or undesirable behaviors. Any issues found should be discussed with the software design and implementation team.

Task 4: Confirm that the software detailed design implements all of the required safety-critical functions and requirements. There should be a list of all software safety-critical components that have been identified by the system hazard analysis, as per SWE-205 - Determination of Safety-Critical Software. Requirements for safety-critical functions should be captured in the software design document or the system or software hazard reports. These should all be traced down into the design at a level where they can be implemented. 

Task 5: Develop a software assurance design analysis. The guidance for performing a software assurance design analysis can be found in this Handbook in Topic 7.18: Documentation Guidance under SADESIGN. For convenience, this information is also listed below:

Software Assurance Design Analysis

In software design, software requirements are transformed into software architecture and then into a detailed software design for each software component.  The software design also includes databases and system interfaces (e.g., hardware, operator/user, software components, and subsystems).  The design addresses software architectural design and software detailed design.  The objective of doing design analysis is to ensure that the design is a correct, accurate, and complete transformation of the software requirements that will meet the operational needs under nominal and off-nominal conditions, introduces no unintended features, and that design choices do not result in unacceptable operational risk. The design should also be created with modifiability and maintainability so future changes can be made quickly without the need for significant redesign changes.

Consider the list below when evaluating the software design:

  1. Has the software design been developed at a low enough level for coding?
  2. Is the design unnecessarily complicated?
  3. Is the design adequately documented for usability and maintainability?
  4. Has system performance been considered during design?
  5. Has the level of coupling (interactivity between modules) been kept to a minimum?
  6. Has software planned for reuse and OTS software in the system been examined to see that it meets the requirements and performs appropriately within the required limits for this system?
  7. Does this software introduce any undesirable capabilities or behaviors?
  8. Has the software design been peer-reviewed?
  9. Is the design complete, and does it cover all the approved requirements?
  10. Have complex algorithms been correctly derived, do they provide the needed behavior under off-nominal and assumed conditions, and is the derivation approach known and understood well enough to support future maintenance?
  11. Has the design been examined to ensure that it does not introduce any undesirable behaviors or any capabilities not in the requirements?
  12. Have all requirements sources been considered when developing the design (for example, think about interface control requirements, databases, etc.)?
  13. Have the interfaces with COTS, MOTS, GOTS, and Open Source software been designed?
  14. Have all internal and external software interfaces been designed for all (in-scope) interfaces with hardware, users, operators, software, and other systems, and are they detailed enough to enable the development of software components that implement the interfaces?
  15. Are all safety features (mitigations, controls, barriers, must-work requirements, must-not-work requirements) in the design?
  16. Does the design provide the dependability and fault tolerance required by the system, and is the design capable of controlling identified hazards? Does the design create any hazardous conditions?
  17. Does the design adequately address the identified security requirements and security risks, both for the system itself and for the integration with external components, as well as for information and data utilized, stored, and transmitted through the system?
  18. Does the design prevent, control, or mitigate any identified security threats and vulnerabilities? Are any unmitigated threats and vulnerabilities documented and addressed as part of system and software operations?
  19. Have operational scenarios been considered in the design (for example, the use of multiple individual programs to obtain one particular result may not be operationally efficient or reasonable; transfers of data from one program to another should be electronic, etc.)?
  20. Have users/operators been consulted during design to identify any potential operational issues?
  21. Maintainability: Has maintainability been considered? Is the design modular? Can additions and changes be made quickly?
  22. Is the design easy to understand?

Some design principles to think about during design reviews:

Key Design Principles (Quoted from “Tutorialspoint Simply Easy Learning”)

The following design principles should be considered to minimize cost and maintenance requirements and to maximize the extendibility and usability of the architecture:

Separation of Concerns

Divide the components of the system into specific features so that there is no overlapping among the functionality of the components. This will provide high cohesion and low coupling. This approach avoids the interdependency among components of the system which helps in maintaining the system easily.

Single Responsibility Principle

Each module of a system should have one specific responsibility, which helps the user to clearly understand the system. It should also help with the integration of the component with other components.
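As a small illustration of single responsibility (class and method names are invented for the example), the following Python sketch gives each class one job, so the storage mechanism can change without touching the formatting logic:

```python
# Illustrative sketch: each class has a single responsibility, so either
# part can be replaced without modifying the other.

class TelemetryFormatter:
    """Responsible only for turning a sample into a record string."""
    def format(self, name, value):
        return f"{name}={value:.2f}"

class TelemetryStore:
    """Responsible only for keeping formatted records."""
    def __init__(self):
        self.records = []
    def save(self, record):
        self.records.append(record)

formatter = TelemetryFormatter()
store = TelemetryStore()
store.save(formatter.format("bus_voltage", 28.047))
```

Because formatting and storage do not overlap, a change such as writing records to a file instead of a list affects only `TelemetryStore`.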

Principle of Least Knowledge

Any component or object should not know the internal details of other components. This approach avoids interdependency and helps maintainability.

Minimize Large Design Upfront

Minimize large designs upfront if the requirements of an application are unclear. If there is a possibility of modifying requirements, then avoid making a large design for the whole system.

Do not Repeat the Functionality

Do not repeat functionality specifies that functionality of components should not be repeated and hence a piece of code should be implemented in one component only. Duplication of functionality within an application can make it difficult to implement changes, decrease clarity, and introduce potential inconsistencies.

Prefer Composition over Inheritance while Reusing the Functionality

Inheritance creates a dependency between child and parent classes and hence blocks the free use of the child classes. In contrast, composition provides a greater level of freedom and reduces inheritance hierarchies.
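A minimal Python sketch of this principle (the class names are invented for the example): the controller is handed its logger rather than inheriting from it, so either part can be replaced independently:

```python
# Illustrative sketch: the logger is supplied to the controller
# (composition) instead of being a parent class (inheritance).

class EventLogger:
    """Keeps a simple in-memory list of logged events."""
    def __init__(self):
        self.events = []
    def log(self, message):
        self.events.append(message)

class HeaterController:
    """Uses a logger it is given rather than inheriting from one."""
    def __init__(self, logger):
        self.logger = logger          # composed dependency
        self.on = False
    def enable(self):
        self.on = True
        self.logger.log("heater enabled")

logger = EventLogger()
controller = HeaterController(logger)
controller.enable()
```

Swapping in a different logger (for example, one that writes to a file) requires no change to `HeaterController`, which is the freedom composition provides.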

Identify Components and Group them in Logical Layers

Identify the components and the areas of concern needed in the system to satisfy the requirements. Then group these related components in a logical layer, which will help the user understand the structure of the system at a high level. Avoid mixing components of different types of concerns in the same layer.

Define the Communication Protocol between Layers

Understanding how components will communicate with each other requires complete knowledge of deployment scenarios and the production environment.

Define Data Format for a Layer

Various components will interact with each other through data format. Do not mix the data formats so that applications are easy to implement, extend, and maintain. Try to keep data format the same for a layer, so that various components need not code/decode the data while communicating with each other. It reduces the processing overhead.

System Service Components should be Abstract

Code related to security, communications, or system services like logging, profiling, and configuration should be abstracted into separate components. Do not mix this code with business logic; keeping it separate makes the design easier to extend and maintain.

Design Exceptions and Exception Handling Mechanism

Defining exceptions in advance helps the components to manage errors or unwanted situations in an elegant manner. The exception management will be the same throughout the system.
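For example, a project might define a small exception hierarchy up front so that every component reports and handles errors uniformly. The following Python sketch is illustrative; the fault names and the 12-bit ADC range are assumptions for the example:

```python
# Illustrative sketch: a project-wide exception hierarchy defined in
# advance, so one handler can cover every project-defined fault type.

class SystemFault(Exception):
    """Base class for all project-defined faults."""

class SensorFault(SystemFault):
    """Raised when a sensor reading is out of range."""

def read_temperature(raw_counts):
    if not 0 <= raw_counts <= 4095:      # assumed 12-bit ADC range
        raise SensorFault(f"raw counts out of range: {raw_counts}")
    return raw_counts / 10.0             # illustrative scaling

try:
    read_temperature(5000)
except SystemFault as fault:             # one handler for all fault types
    handled = str(fault)
```

Because every component raises subclasses of the same base class, error handling stays consistent throughout the system, as the principle above describes.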

Naming Conventions

Naming conventions should be defined in advance. They provide a consistent model that helps the users to understand the system easily. It is easier for team members to validate code written by others and hence will increase the maintainability.
