- 1. Introduction
- 2. Design Analysis Guidance
- 3. Safety Analysis During Design
- 4. Analysis Report Content
- 5. Resources
The Software Design Analysis product focuses on analyzing the software design that has been developed from the requirements (software, system, and/or interface). This topic describes some of the methods and techniques Software Assurance and Software Safety personnel may use to evaluate the quality of the architecture and design elements that were developed.
Since the design primarily guides the code implementation, it is important to ensure that the architecture and design are correct, safe, secure, complete, and understandable, and that they capture the intent of the requirements. The detailed design captures the low-level, component-based approach to implementing the software requirements, including the requirements associated with fault management, security, and safety. When the detailed design is complete, the analysis of the requirements traceability documents should show the relationship between the software design components and the software requirements and provide evidence that all requirements are accounted for. The information in this topic is divided into several tabs as follows:
- Tab 1 – Introduction
- Tab 2 – Software Design Analysis Guidance – provides guidance on methods and techniques for analyzing the software design
- Tab 3 – Safety Analysis During Design – provides additional guidance when safety-critical software is involved, with analysis emphasis on safety features
- Tab 4 – Analysis Reporting Content – provides guidance on the analysis report product content
- Tab 5 – Resources for this topic
The following is a list of the applicable SWE requirements that relate to the generation of the software design analysis product:
NPR 7150.2 Requirements:
- The project manager shall define and document the acceptance criteria for the software.
- The project manager shall perform a software architecture review on the following categories of projects:
- The project manager shall track and evaluate changes to software products.

NASA-STD-8739.8 Software Assurance and Software Safety Tasks:
1. Assess the software design against the hardware and software requirements, and identify any gaps.
3. Assess that the design does not introduce undesirable behaviors or unnecessary capabilities.
5. Perform a software assurance design analysis.
2. Software Design Analysis Guidance
Software Assurance and Software Safety personnel analyze the software design to confirm that it:
- is safe,
- is secure with known weaknesses and vulnerabilities mitigated,
- introduces no unintended features, and
- does not result in unacceptable operational risk.
Software Assurance and Software Safety tasks in NASA-STD-8739.8 that relate to design analysis are found in SWE-052, SWE-058, SWE-060, SWE-087, SWE-134, and SWE-157.
2.1 Use of Checklists and Known Best Practices
As part of the design analysis, Software Assurance and Software Safety personnel review the design to ensure that good general design practices have been implemented. There are several checklists in this Handbook that list some of the design best practices. The use of the SADESIGN Checklist (see below) is important when evaluating the software design as it highlights many good general design practices. Another checklist that can be used for safety-critical software is found in this Handbook, under the Programming Checklists Topic: 6.1 - Design for Safety Checklist. The “Software Design Principles” tab under Topics provides information for specific design aspects that should be considered during the analysis for both safety critical software and non-safety critical software. Teams may decide to formulate some of this information into a checklist that is applicable to their project.
Some good general design practices to be considered are:
- Begin by breaking the design into smaller chunks.
- Keep the design simple.
- Keep the design modular so it will be easier to test and maintain.
- Keep boundaries, interfaces, and constraints in mind.
- Understand how users will interact with the system.
- Include error handling in the designs.
- Don’t repeat code portions – if a code portion needs to be used repeatedly, put it into a function, package, or subroutine that can be called.
- Prototype new approaches or designs for difficult requirements.
- Peer review designs, particularly interfaces, data flows, and logic flows.
- Use design documentation, pseudo code, process diagrams, and logic diagrams to aid in evaluating the design.
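The “don’t repeat code portions” practice above can be illustrated with a short sketch; the conversion routine, telemetry names, and scale factors here are hypothetical examples, not from any project:

```python
# Sketch illustrating the "don't repeat code" practice: a conversion
# formula that several callers need is factored into one routine
# instead of being copied wherever it is used.
# All names and numbers below are hypothetical.

def raw_to_engineering(raw_count, scale, offset):
    """Convert a raw sensor count to engineering units in one place."""
    return raw_count * scale + offset

# Each caller reuses the single routine instead of repeating the formula,
# so a fix or change to the conversion is made in exactly one place.
tank_temp_c = raw_to_engineering(512, scale=0.25, offset=-40.0)
bus_voltage = raw_to_engineering(800, scale=0.01, offset=0.0)
print(tank_temp_c, bus_voltage)
```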
Additional guidance and some key design practices can be found in SWE-058, tab 7.
2.2 Use of Peer Reviews or Inspections
During design peer reviews or inspections, Software Assurance personnel should:
- Assess the software design against the hardware and identify any gaps.
- Assess the software design against the system requirements and design and identify any gaps.
- Confirm the design does not contain undesirable functionality.
- Confirm the design addresses possible unauthorized access, vulnerabilities, and weaknesses.
2.3 Review of Traceability
Review the bi-directional tracing between requirements and design and ensure it is complete. As the project moves into implementation, the bi-directional tracing between design and code should also be checked.
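A minimal sketch of how such bidirectional trace completeness checks might be automated; all requirement and component identifiers below are hypothetical, and real trace data would come from the project’s requirements management tool:

```python
# Sketch: check bidirectional tracing between requirements and design.
# The identifiers and trace data below are hypothetical examples.

# Forward trace: requirement -> design components that implement it
req_to_design = {
    "SRS-001": ["CSC-TELEM"],
    "SRS-002": ["CSC-CMD", "CSC-FDIR"],
    "SRS-003": [],            # untraced requirement -> gap
}

# Backward trace: design component -> requirements it satisfies
design_to_req = {
    "CSC-TELEM": ["SRS-001"],
    "CSC-CMD": ["SRS-002"],
    "CSC-FDIR": ["SRS-002"],
    "CSC-DEBUG": [],          # component with no requirement -> possible unintended feature
}

# Requirements with no design element (incomplete forward trace)
untraced_reqs = [r for r, designs in req_to_design.items() if not designs]

# Design elements with no requirement (candidate undesirable functionality)
orphan_designs = [d for d, reqs in design_to_req.items() if not reqs]

# The forward and backward traces should agree in both directions
inconsistent = [
    (r, d)
    for r, designs in req_to_design.items()
    for d in designs
    if r not in design_to_req.get(d, [])
]

print("Untraced requirements:", untraced_reqs)
print("Orphan design elements:", orphan_designs)
print("Trace mismatches:", inconsistent)
```

Each finding from a check like this (an untraced requirement, an orphan design element, or a mismatch between the two directions) would be written up as an issue in the analysis report.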
2.4 Analysis by Software Architecture Review Board (SARB) - applies to NASA projects only
The Software Architecture Review Board (SARB) is a NASA-wide board that engages with flight projects in the formative stages of software architecture. The objectives of the SARB are to manage and/or reduce flight software complexity through better software architecture, help improve mission software reliability, and save costs. NASA projects that meet certain criteria (for example, large projects, ones with safety-critical concerns, projects destined for considerable reuse, etc.) may request the SARB to do a review and assessment of their architecture. For more guidance on the SARB, see Tabs 3 and 7 in SWE-143 - Software Architecture Review.
2.5 Problem/Issue Tracking System
3. Safety Analysis During Design
The Safety Design Analysis is a portion of the overall Software Safety Analysis that is performed on all safety-critical software, as defined in NASA-STD-8739.8. A full Software Safety Analysis encompasses all the aspects of the development life cycle (requirements, design, implementation, and test) for safety-critical software and focuses on the safety features (safety requirements, controls, mitigations, fault identification, isolation and recovery, etc.). During the Design phase, Software Safety personnel analyze the design to ensure that it will not adversely impact the safety of the system/software. This tab discusses the Software Safety Analysis activities during design.
3.1 Review Software Design Analysis Information
- “Software Design Principles” Tab of Topics in this Handbook.
3.2 Design Peer Reviews or Walk-throughs
Safety Considerations during Design Peer Reviews/Walk-throughs. Check that the design follows these practices:
- Reduce the complexity of the software and interfaces.
- Design for user safety rather than just user-friendliness.
- Design for testability during development and integration.
- Give more design “resources” (such as time and effort) to higher risk aspects such as hazard controls.
- Include separation of commands, functions, files, and ports.
- Include design for Shutdown/Recovery/Safing.
- Plan for monitoring and detection.
- Isolate the components containing safety-critical requirements as much as possible.
- Design interfaces between safety-critical components for minimum interaction.
- Document the positions and functions of safety-critical components in the design hierarchy.
- Document how each safety-critical component can be traced back to the original safety requirements and how the requirements are implemented.
- Specify safety-related design and implementation constraints.
- Document execution control, interrupt characteristics, initialization, synchronization, and control of the components. For high risk systems, interrupts should be avoided since they may interfere with software safety controls. Any interrupts used should be priority-based.
- Specify any error detection or recovery schemes for safety-critical components.
- Consider hazardous operations scenarios.
The design of safing and recovery actions should fully consider the real-world conditions and the corresponding time to criticality. Automatic safing is often required if the time to criticality is shorter than the realistic human operator response time, or if there is no human in the loop. This can be performed by either hardware or software or a combination depending on the best system design to achieve safing.
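The time-to-criticality decision described above can be sketched as follows; the response-time threshold and example values are hypothetical illustrations, and a real system would derive them from the project’s hazard analysis:

```python
# Sketch of the automatic-safing decision described above.
# The threshold and example values are hypothetical; a real project
# would take them from its hazard analysis.

OPERATOR_RESPONSE_TIME_S = 30.0  # assumed realistic human operator response time

def requires_automatic_safing(time_to_criticality_s, human_in_loop):
    """Automatic safing is needed when no human can respond in time."""
    if not human_in_loop:
        return True
    return time_to_criticality_s < OPERATOR_RESPONSE_TIME_S

# A fault with 5 s to criticality must be safed automatically,
# even with an operator on console; one with 120 s need not be.
print(requires_automatic_safing(5.0, human_in_loop=True))    # True
print(requires_automatic_safing(120.0, human_in_loop=True))  # False
```

Whether the resulting safing action is implemented in hardware, software, or a combination is then a system design decision, as the paragraph above notes.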
Select a strategy for handling faults and failures. Some of the techniques that can be used in fault management are below:
Also check whether the software design:
- Considers any potential issues with the use of COTS, Open Source, reused, or inherited code.
- Selects sampling rates with consideration for noise levels and expected variations of control system and physical parameters.
- Identifies test and/or verification methods for each safety-critical design feature.
- Provides for testability, including ways that the internals of a component can be adequately tested to verify that they are working properly.
- Considers maintainability (for example: anticipate potential changes in the software; use a modular design, object-oriented design, uniform conventions, and naming conventions; use coding standards that support safety practices; use documentation standards and common tool sets).
A few more safety-specific design considerations are below:
- Has the design been reviewed to ensure that the design’s implementation of safety controls or processes does not compromise other system safety features or the functionality of the software?
- Have Safety reviews approved the controls, mitigations, inhibits, and safety design features to be incorporated into the design?
- Are all needed or identified safety conditions, constraints, parameters, trigger points, boundary conditions, environments, and other software circumstances for safe operation in the appropriate modes and states flowed down from the software requirements and incorporated into the design?
- Does the design maintain the system in a safe state during all modes of operation or can it transition to a safe state when and if necessary?
- Are the partitioning or isolation methods used in the design effective at logically isolating the safety-critical design elements from those that are non-safety-critical? This is particularly important with the incorporation of COTS or the integration of legacy, heritage, and reuse software. Any software that can write or provide data to safety-critical software is also considered safety critical unless isolation is built in, in which case the isolation design is considered safety critical.
- Is appropriate fault and/or failure tolerance incorporated into the software design as designated?
3.3 Other Types of Design Analysis
- Design Interface Analysis: The Design Interface Analysis verifies the proper design of a software component's interfaces with other components of the software, system, or even hardware. This analysis will verify that the software component's interfaces, especially the control and data linkages, have been properly designed. Interface requirements specifications (which may be part of the requirements or design documents, or a separate document) are the sources against which the interfaces are evaluated. Interface characteristics to be addressed should include inter-process communication methods, data encoding, error checking (e.g., data entry validity, value/range, type checks), and synchronization.
The analysis should consider the validity and effectiveness of checksums, cyclic redundancy checks (CRCs), and error correcting code. CRC is a type of error-detecting code used in digital networks and storage devices to detect unintentional changes to raw data. Blocks of data entering these systems get a short check value attached, based on the remainder of a polynomial division of their contents. When the data is retrieved, the calculation is repeated and if the check values do not match, the data is corrupt and corrective action can be taken.
The sophistication of error checking or correction that is implemented should be appropriate for the predicted bit error rate of the interface. An overall system error rate should be defined and budgeted to each interface.
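The check-value scheme described above can be sketched in a few lines. This illustration uses CRC-32 from the Python standard library; the frame contents are hypothetical, and the choice of CRC polynomial and width for a real interface would follow from its predicted bit error rate:

```python
import zlib

# Sketch of the CRC scheme described above: a check value is computed
# over a data block before transmission or storage, and recomputed on
# retrieval; a mismatch signals corruption. Frame contents are hypothetical.

def attach_crc(data: bytes) -> bytes:
    """Append a CRC-32 check value to a block of data."""
    return data + zlib.crc32(data).to_bytes(4, "big")

def verify_crc(block: bytes) -> bool:
    """Recompute the CRC over the payload and compare with the stored value."""
    payload, stored = block[:-4], block[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == stored

block = attach_crc(b"sensor frame 42")
assert verify_crc(block)          # intact data passes the check

# A single corrupted byte is guaranteed to be detected by CRC-32,
# since any error burst of 32 bits or fewer changes the check value.
corrupted = b"sensor frame 43" + block[-4:]
assert not verify_crc(corrupted)  # mismatch -> corrective action can be taken
```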
3.4 Problem/Issue Tracking System
4. Analysis Reporting Content
Documenting and Reporting of Analysis Results
When the design is analyzed, the Software Design Analysis work product is generated to document the results. It should include a detailed report of the design analysis results. Analysis results should also be reported in a high-level summary and conveyed as part of weekly or monthly SA Status Reports. The high-level summary should provide an overall evaluation of the analysis, any issues/concerns, and any associated risks. If a time-critical issue is uncovered, it should be reported to management immediately so that the affected organization may begin addressing it at once.
4.1 High-Level Analysis Content for SA Status Report
Any design analysis performed since the last SA Status Report or project management meeting should be reported to project management and the rest of the Software Assurance team. When a project has safety-critical software, any analysis done by Software Assurance should be shared with the Software Safety personnel.
- Identification of what was analyzed: Mission/Project/Application
- Period/Timeframe/Phase analysis performed during
- Summary of analysis techniques used
- Overall assessment of design, based on analysis
- Major findings and associated risk
- Current status of findings: open/closed; projection for closure timeframe
4.2 Detailed Content for the Analysis Product
The detailed results of all software design analysis activities are captured in the Software Design Analysis product. This document is placed under configuration management and delivered to the project management team as the Software Assurance record for the activity. When a project has safety-critical software, this product should be shared with the Software Safety personnel.
- Identification of what was analyzed: Mission/Project/Application
- Person(s) or group performing the analysis
- Period/Timeframe/Phase analysis performed
- Documents used in analysis (e.g., versions of the system and software requirements, interfaces document, architectural and detailed design)
- Description or identification of analysis techniques used. Include an evaluation of the techniques used.
- Overall assessment of design, based on analysis results
- Major findings and associated risk – The detailed reporting should include where the finding, issue, or concern was discovered and an assessment of the amount of risk involved with the finding.
- Minor findings
- Current status of findings: open/closed; projection for closure timeframe
- Include counts for those discovered by SA and Software Safety
- Include overall counts from the Project’s problem/issue tracking system.
5. Resources
No references have currently been identified for this Topic.
NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN.
The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool. The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.