2. Software Design Analysis Guidance

2.1 Use of Checklists
Consider the checklist below, from SADESIGN, when evaluating the software design. Another checklist that can be used for safety-critical software is found in this Handbook under the Programming Checklists Topic: 6.1 - Design for Safety Checklist.
2.2 Use of Peer Reviews or Inspections
Design items designated in the software development plans should be peer reviewed or inspected. Some of the items to look for during these meetings are:
2.3 Review of Traceability
Review the bi-directional tracing between requirements and design and ensure it is complete. As the project moves into implementation, the bi-directional tracing between design and code should also be checked.
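As a minimal sketch of one way to check bi-directional tracing for completeness (the requirement and design identifiers below are hypothetical, not from the Handbook), the example reports requirements with no design element and design elements with no parent requirement:

```python
# Sketch of a bi-directional traceability completeness check (hypothetical IDs).

# Forward trace: each requirement -> design elements that satisfy it.
req_to_design = {
    "SRS-101": ["SDD-3.1", "SDD-3.2"],
    "SRS-102": [],            # gap: requirement with no design element
}

# Backward trace: each design element -> requirements it satisfies.
design_to_req = {
    "SDD-3.1": ["SRS-101"],
    "SDD-3.2": ["SRS-101"],
    "SDD-4.7": [],            # gap: design element with no parent requirement
}

untraced_reqs = [r for r, d in req_to_design.items() if not d]
orphan_design = [d for d, r in design_to_req.items() if not r]

print("Requirements not traced to design:", untraced_reqs)
print("Design elements with no requirement:", orphan_design)
```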
2.4 Analysis by Software Architecture Review Board (SARB) - applies to NASA projects only
The Software Architecture Review Board (SARB) is a NASA-wide board that engages with flight projects in the formative stages of software architecture. The objectives of the SARB are to manage and/or reduce flight software complexity through better software architecture, help improve mission software reliability, and save costs. NASA projects that meet certain criteria (for example, large projects, projects with safety-critical concerns, or projects destined for considerable reuse) may request that the SARB review and assess their architecture. For more guidance on the SARB, see Tabs 3 and 7 in SWE-143 - Software Architecture Review.

2.5 Reporting of Results
Any design analysis done in the interim between status reports or prior to milestone reviews should be reported to management and the rest of the team. When a project has safety-critical software, any analysis done by Software Assurance should be shared with the Software Safety personnel. The results reporting should include:

2.6 Problem/Issue Tracking System
Findings, issues, and concerns from all the different software and safety design analyses performed should be documented in a problem/issue tracking system and tracked to closure. These items should be communicated to the software development personnel and possible solutions discussed. The analysis done by Software Assurance and Software Safety can be reported in one combined report if desired.
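As a purely illustrative sketch (the Handbook does not prescribe a particular tool or record schema; the field names and statuses below are assumptions), a minimal record for tracking an analysis finding to closure might look like:

```python
# Minimal sketch of a finding record for a problem/issue tracking system.
# Field names and status values are illustrative assumptions, not a prescribed schema.
from dataclasses import dataclass

@dataclass
class Finding:
    identifier: str        # e.g., "DA-2024-007" (hypothetical numbering)
    source_analysis: str   # which design analysis raised the finding
    description: str
    severity: str          # e.g., "safety-critical", "major", "minor"
    status: str = "open"   # open -> in work -> closed
    resolution: str = ""

def open_findings(findings: list[Finding]) -> list[Finding]:
    """Return the findings still awaiting closure."""
    return [f for f in findings if f.status != "closed"]
```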
3. Safety Design Analysis

3.1 Review Software Design Analysis
There are many considerations for analyzing the design with respect to safety. Most of the design analysis used for non-safety projects is still applicable to safety-critical software. So, to begin with, the Software Safety personnel should either review, or ensure that the Software Assurance personnel have reviewed, the set of items listed in Tab 2 - Software Design Analysis Guidance. The first of these is the SADESIGN checklist (previously in Topic 7.18). Another checklist that can be used for safety-critical software is found in this Handbook under the Programming Checklists Topic: 6.1 - Design for Safety Checklist.

3.2 Design Peer Reviews or Design Walkthroughs
Design peer reviews or walkthroughs are recommended for safety-critical components to identify design problems or other issues. One of the most important aspects of a software design for safety-critical software is to design for minimum risk. "Minimum risk" includes the hazard risk, the risk of software defects, the risk of human operator errors, and other types of risk such as programmatic, cost, and schedule risk. When possible, eliminate identified hazards and risks or reduce the associated risk through design. Some of the ways risk can be reduced through design are listed below. This list can be used by attendees of design peer reviews or walkthroughs to help evaluate the design with respect to safety and risk considerations.

Safety Considerations during Design Peer Reviews/Walkthroughs:
A few more safety-specific design considerations are below:
3.3 Other Types of Design Analysis
Other types of design analysis can be done to examine particular aspects of the design. All of these design analyses would be useful to perform, but they require more time and effort, so the safety team should choose those they feel would provide the most value, depending on the areas where risk is highest in the design. Some of the other available design analysis methods are below:

a. Acceptable Level of Safety: Once the design is fairly mature, a design safety analysis can be done to determine whether an acceptable level of safety will be attained by the designed system. This analysis involves analyzing the design of the safety components to ensure that all the safety requirements are specified correctly. The requirements may need to be updated once the design has determined exactly which safety features will be included in the system. Then review the design, looking for the places and conditions that lead to unacceptable hazards. Consider the credible faults or failures that could occur and evaluate their effects on the designed system. Does the designed system produce the desired result with respect to the hazards?

b. Prototyping or Simulating: Prototyping or simulating parts of the design may show where the software can fail. In addition, this can demonstrate whether the software can meet the constraints it might have, such as response time or data conversion speed. It can also be used to gather the operators' input on the user interface. If the prototypes show that a requirement cannot be met, the requirement must be modified as appropriate, or the design may need to be revised.

c. Independence Analysis: To perform this analysis, map the safety-critical functions to the software components, and then map the software components to the hardware hosts and fault containment regions (FCRs). All the inputs and outputs of each safety-critical component should be inspected. Consider global or shared variables as well as directly passed parameters, and consider "side effects" that may occur when a component is run.
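As an illustration of the independence analysis in item c (the component and FCR names are hypothetical), the sketch below flags redundant safety-critical functions whose components all share a single fault containment region:

```python
# Sketch of an independence analysis check (hypothetical data).
# A safety-critical function implemented redundantly should not have all of
# its components hosted in the same fault containment region (FCR).

# Map each safety-critical function to the software components that implement it.
function_to_components = {
    "engine_shutdown": ["primary_ctrl", "backup_ctrl"],
    "pressure_relief": ["relief_monitor"],
}

# Map each software component to the FCR of its hardware host.
component_to_fcr = {
    "primary_ctrl": "FCR-A",
    "backup_ctrl": "FCR-A",   # suspicious: same FCR as its redundant partner
    "relief_monitor": "FCR-B",
}

for function, components in function_to_components.items():
    fcrs = {component_to_fcr[c] for c in components}
    if len(components) > 1 and len(fcrs) == 1:
        print(f"WARNING: redundant components of '{function}' all reside in "
              f"{fcrs.pop()}; a single fault could disable all of them")
```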
d. Design Logic Analysis: The Design Logic Analysis (DLA) evaluates the equations, algorithms, and control logic of the software design. Logic analysis examines the safety-critical areas of a software component. A technique for identifying safety-critical areas is to examine each function performed by the software component; if it responds to, or has the potential to violate, one of the safety requirements, it should be considered critical and undergo logic analysis. A technique for performing logic analysis is to compare design descriptions and logic flows and note discrepancies. The most rigorous form of this analysis can be done using Formal Methods. Less formal DLA involves a human inspector reviewing a relatively small quantity of critical software products (e.g., PDL, prototype code) and manually tracing the logic. Safety-critical logic to be inspected can include failure detection and diagnosis, redundancy management, variable alarm limits, and command inhibit logical preconditions.

e. Design Data Analysis: The Design Data Analysis evaluates the description and intended use of each data item in the software design. Data analysis ensures that the structure and intended use of data will not violate a safety requirement. A technique used in performing design data analysis is to compare the description of each data item to its use in the design logic. Interrupts and their effects on data must receive special attention in safety-critical areas. Analysis should verify that interrupts and interrupt handling routines do not alter critical data items used by other routines. The integrity of each data item should be evaluated with respect to its environment and host. Shared memory and dynamic memory allocation can affect data integrity. Data items should also be protected from being overwritten by unauthorized applications.

f. Design Interface Analysis: The Design Interface Analysis verifies the proper design of a software component's interfaces with other components of the system. The interfaces can be with other software components, with hardware, or with human operators. This analysis verifies that the software component's interfaces, especially the control and data linkages, have been properly designed. Interface requirements specifications (which may be part of the requirements or design documents, or a separate document) are the sources against which the interfaces are evaluated. Interface characteristics to be addressed should include inter-process communication methods, data encoding, error checking, and synchronization. The analysis should consider the validity and effectiveness of checksums, CRCs, and error-correcting codes. The sophistication of the error checking or correction implemented should be appropriate for the predicted bit error rate of the interface. An overall system error rate should be defined and budgeted to each interface.
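To make the error-checking point in item f concrete, the sketch below (an illustrative example, not a Handbook-prescribed method; the message content is hypothetical) appends a CRC-32 to an interface message and verifies it on receipt:

```python
# Sketch of interface error detection using a CRC-32 (illustrative only).
import struct
import zlib

def frame_message(payload: bytes) -> bytes:
    """Append a CRC-32 of the payload so the receiver can detect corruption."""
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return payload + struct.pack(">I", crc)

def check_message(frame: bytes) -> bytes:
    """Verify the trailing CRC-32; raise if the frame arrived corrupted."""
    payload, received_crc = frame[:-4], struct.unpack(">I", frame[-4:])[0]
    if zlib.crc32(payload) & 0xFFFFFFFF != received_crc:
        raise ValueError("CRC mismatch: message corrupted in transit")
    return payload

# Usage: the receiver rejects any frame whose CRC does not match.
frame = frame_message(b"VALVE_CLOSE")
assert check_message(frame) == b"VALVE_CLOSE"
```

Note that a CRC detects corruption but does not correct it; whether a checksum, CRC, or error-correcting code is appropriate depends on the interface's predicted bit error rate, as discussed in item f.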
g. Design Traceability Analysis: This analysis ensures that each safety-critical software requirement is included in the design. Tracing the safety requirements throughout the design (and eventually into the source code and test cases) is vital to making sure that no requirements are lost, that safety is "designed in", that extra care is taken during the coding phase, and that all safety requirements are tested. A safety requirement traceability matrix is one way to implement this analysis.

3.4 Documenting and Reporting of Results of the Design Analysis
Any design analysis done in the interim between status reports or prior to milestone reviews should be reported to management and the rest of the team. When a project has safety-critical software, any analysis done by Software Assurance should be shared with the Software Safety personnel. The results reporting should include:

3.5 Problem/Issue Tracking System
Findings, issues, and concerns from all the different safety design analyses performed should be documented in a problem/issue tracking system.