

SWE-143 - Software Architecture Review

1. Requirements

4.2.4 The project manager shall perform a software architecture review on the following categories of projects: 

a. Category 1 Projects as defined in NPR 7120.5.
b. Category 2 Projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4.

1.1 Notes

NPR 7150.2, NASA Software Engineering Requirements, does not include any notes for this requirement.

1.2 History

SWE-143 - Last used in rev NPR 7150.2D

Rev | SWE Statement

A | (No SWE statement; the requirement first appears in rev B.)

Difference between A and B: NEW

B | 4.2.4 The project manager shall perform a software architecture review on the following categories of projects:

    1. Category 1 Projects as defined in NPR 7120.5.
    2. Category 2 Projects as defined in NPR 7120.5 that have Class A or Class B payload risk classification per NPR 8705.4.

Difference between B and C: No change

C | 4.2.4 The project manager shall perform a software architecture review on the following categories of projects:

    1. Category 1 Projects as defined in NPR 7120.5.
    2. Category 2 Projects as defined in NPR 7120.5 that have Class A or Class B payload risk classification per NPR 8705.4.

Difference between C and D: No change

D | 4.2.4 The project manager shall perform a software architecture review on the following categories of projects:

    a. Category 1 Projects as defined in NPR 7120.5.
    b. Category 2 Projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4.



1.3 Applicability Across Classes

This requirement applies based on the selection criteria defined in this requirement.



2. Rationale

Software architecture deals with the fundamental organization of a system, as embodied in its components and their relationships to each other and the environment. 487 Software architecture reviews are conducted to inspect the quality attributes, design principles, verifiability, and operability of a software system. The software architecture is reviewed to evaluate its effectiveness, efficiency, and robustness relative to the software requirements. The software architecture review helps manage and reduce flight software complexity through improved software architecture, improving mission software reliability and producing cost savings.

3. Guidance

A software architecture review may be a peer review or may occur as part of a major milestone review (e.g., Mission/System Definition Review [MDR/SDR]). In either case, the architecture review conforms to SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking and SWE-089 - Software Peer Reviews and Inspections - Basic Measurements for peer reviews.  See also Topic 7.10 - Peer Review and Inspections Including Checklists. Thus, to conduct an architecture review, the project must:

  1. Identify required participants.
  2. Use a checklist to evaluate the architecture description.
  3. Use established readiness and completion criteria.
  4. Track actions identified in the reviews until they are resolved.
  5. Identify and record measurements for the review (a tracking sketch follows this list).
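
NPR 7150.2 does not prescribe tooling for these five steps; projects typically use their existing peer review and issue-tracking infrastructure. The following minimal Python sketch is one hypothetical way to record participants, the checklist, criteria, tracked actions, and basic measurements; all class and field names are illustrative assumptions, not drawn from the NPRs.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ReviewAction:
    """One action item raised during the architecture review (step 4)."""
    description: str
    opened: date
    closed: Optional[date] = None   # None while the action is unresolved

@dataclass
class ArchitectureReview:
    """Minimal record of a single review; names are illustrative only."""
    participants: list                       # required participants (step 1)
    checklist: list                          # evaluation questions (step 2)
    readiness_criteria_met: bool = False     # entrance criteria (step 3)
    actions: list = field(default_factory=list)

    def completion_criteria_met(self):
        # Completion requires readiness plus every tracked action resolved.
        return self.readiness_criteria_met and all(
            a.closed is not None for a in self.actions)

    def measurements(self):
        # Basic measurements in the spirit of SWE-089 (step 5).
        open_count = sum(1 for a in self.actions if a.closed is None)
        return {"actions_total": len(self.actions), "actions_open": open_count}

review = ArchitectureReview(
    participants=["FSW lead", "GNC engineer", "SQA"],
    checklist=["Is it clear what the architecture needs to do?"])
review.actions.append(ReviewAction("Clarify fault-management layering",
                                   date(2024, 5, 1)))
print(review.completion_criteria_met(), review.measurements())
```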

The Software Architecture Review Board sub-community of practice, accessible to NASA users on NASA Engineering Network (NEN), provides additional information, including guidance, checklists, and examples, for conducting software architecture reviews. Some of this information is summarized in the sections that follow.

3.1 When is an Architecture Review Required

SWE-143 states that an architecture review is required for projects meeting the following criteria:

  1. Category 1 Projects as defined in NPR 7120.5.
  2. Category 2 Projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4.

Here are expanded definitions for the criteria in SWE-143 (a sketch of the resulting decision rule follows this list):

  • NPR 7120.5E, NASA Space Flight Program and Project Management Requirements, defines Category 1 projects as human space flight projects, projects with a life cycle cost exceeding $1B, or projects with significant radioactive material. 082
  • NPR 7120.5E defines Category 2 projects as projects that have life cycle costs greater than $250M and less than $1B, or that have life cycle costs less than $250M with a high priority level based on “the importance of the activity to NASA, the extent of international participation (or a joint effort with other government agencies), the degree of uncertainty surrounding the application of new or untested technologies” and a Class A or Class B payload risk classification. 082
  • NPR 8705.4, Risk Classification for NASA Payloads defines the Class A payload risk classification as payloads with high priority, very low (minimized) risk, very high national significance, very high to high complexity, greater than 5 year mission lifetime, high cost, critical launch constraints, no alternative or re-flight opportunities, and/or payloads where “all practical measures are taken to achieve a minimum risk to mission success. The highest assurance standards are used.”  048
  • NPR 8705.4, Risk Classification for NASA Payloads defines the Class B payload risk classification as payloads with high priority, low risk, high national significance, high to medium complexity, two- to five-year mission lifetime, high to medium cost, medium launch constraints, infeasible or difficult in-flight maintenance, few or no alternative or re-flight opportunities, and/or payloads where “stringent assurance standards [are applied] with only minor compromises in application to maintain a low risk to mission success.” 
  • Per NPR 8705.4, “The importance weighting assigned to each consideration is at the discretion of the responsible Mission Directorate.”
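
Taken together, the criteria reduce to a simple decision rule. A minimal sketch of that rule follows; the function name and parameter encoding are illustrative assumptions, not defined in the NPRs.

```python
def architecture_review_required(project_category, payload_risk_class=None):
    """Apply the SWE-143 selection criteria.

    project_category   -- 1, 2, or 3, per the NPR 7120.5 categorization
    payload_risk_class -- "A" through "D" per NPR 8705.4, or None
    """
    if project_category == 1:
        return True
    return project_category == 2 and payload_risk_class in ("A", "B")

# A Category 2 project with a Class B payload requires a review;
# a Category 3 project does not, regardless of payload class.
assert architecture_review_required(2, "B")
assert not architecture_review_required(3, "A")
```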

3.2 When to Hold an Architecture Review

Software architecture reviews are held during the software architecture formulation phase, before the software architecture is baselined. Generally, a software architecture review should be held between the Mission or System Definition Review (MDR/SDR) and the Preliminary Design Review (PDR). The earlier reviewers are involved in the software architecture development process, the more effective their inputs will be to the architecture development. (See reviews in Topic 7.09 - Entrance and Exit Criteria.)

3.3 Identifying Participants

Projects should select participants representing key stakeholders in the software.  This may include but is not limited to: engineers for the computing hardware running the software, engineers for other systems communicating with the software, software testers, software installers, software users, suppliers of input data, and consumers of output data. SWE-088 - Software Peer Reviews and Inspections - Checklist Criteria and Tracking suggests a minimum of three reviewers.

NPR 7150.2 does not require independent reviewers.  However, if the project seeks an independent review, the NASA Software Architecture Review Board (SARB) is a resource of experienced software architects.  Projects can request a review by the SARB 404.

3.4 Evaluation Criteria for Software Architecture Reviews

3.4.1 Criteria in ISO/IEC 12207

Systems and software engineering - Software life cycle processes

6.4.3.3.2.1 - “The system architecture and the requirements for the items shall be evaluated considering the criteria listed below. The results of the evaluations shall be documented.

a) Traceability to the system requirements.

b) Consistency with the system requirements.

c) Appropriateness of design standards and methods used.

d) Feasibility of the software items fulfilling their allocated requirements.

e) Feasibility of operation and maintenance.

NOTE System architecture traceability to the system requirements should also provide for traceability to the stakeholder requirements baseline.” 224
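
Criterion a), traceability, lends itself to a mechanical check. The sketch below assumes a simple mapping from architecture elements to the system requirements they satisfy; all element names and requirement IDs are invented for illustration.

```python
# Hypothetical trace data: architecture elements -> system requirements.
trace = {
    "CommandHandler": ["SYS-101", "SYS-114"],
    "TelemetryStore": ["SYS-120"],
    "FaultMonitor": [],             # untraced element -> review finding
}
system_requirements = {"SYS-101", "SYS-114", "SYS-120", "SYS-131"}

untraced_elements = [elem for elem, reqs in trace.items() if not reqs]
covered = {req for reqs in trace.values() for req in reqs}
uncovered_requirements = system_requirements - covered

print("Untraced elements:", untraced_elements)            # ['FaultMonitor']
print("Uncovered requirements:", uncovered_requirements)  # {'SYS-131'}
```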

3.4.2 Criteria Defined by the Software Architecture Review Board

The FAQ for the Software Architecture Review Board summarizes the criteria for a good architecture review as follows:

Software architecture should be evaluated concerning the problem it is trying to solve. That's why a project's problem statement is an essential part of review preparation; it describes the problem to be solved and lists success criteria held by various stakeholders. Software architecture should also be evaluated concerning common concerns within its problem domain, e.g., flight software...

During a review, it's important to keep in mind the distinction between software architecture and software architecture description. A system can have great architecture and a poor description, or vice versa, or any of the other two combinations. In a review, it's important to note where each weakness lies: in architecture or description. 323

The SARB provides a detailed checklist on the SARB sub-community of practice, accessible to NASA users on NEN. PAT-023, Preparing for a SARB Checklist, provides advice for software architecture reviews:

PAT-023 - Preparing for a SARB Checklist


The SARB has also authored a detailed "Software Architecture Review Board (SARB) Checklist" 407 and a "Candidate Questions for Reference Architectures" 410 (such as product lines).

See also PAT-029 - Software Architecture Review Board Checklist and PAT-030 - SARB Review Checklist with Guidance.

3.4.3 Criteria Defined by the Office of Safety and Mission Assurance  

The Office of Safety and Mission Assurance compiled the following list of criteria for a software architecture review (edited); a checklist-administration sketch follows the list:

  • Is it clear what the architecture needs to do? 
  • Is the architecture capable of supporting what the system must do? 
  • How does the architecture compare with previously implemented architectures, both successful and unsuccessful?  In particular, does the architecture display superior qualities that should be incorporated on other missions and/or at other Centers?
  • Have all system requirements allocated to software been identified? Has the project identified which of these allocated requirements drive the architecture?
  • Have all the stakeholders been identified to provide the essential design input specifying how the software is to be operated, how it is to execute its functions, and how it is to provide its products?
  • Have all the relevant quality attributes been identified and prioritized, and have they been characterized relative to the nature of the mission?
  • Has the source of all software elements been identified? Is it clear which portions of the system are custom-built, which are heritage/reused, which are commercial off-the-shelf (COTS), and which are open-source software (OSS)?  Does the rationale exist explaining the project’s respective choices? 
  • Is it clear whether the architecture will be compatible with the other system components? 
  • Has the architecture been properly documented for future reference?
  • Does the architecture induce unnecessary complexity when addressing what must be done?
  • Is the proposed fault management approach appropriate for the software and system? 
  • Is the architecture flexible enough to support the maturation of the architects’ understanding of what the software must do?
  • Does the architecture support efficient code development and testing?
  • Have safety-critical requirements been properly allocated to the software architecture?
  • Have safety-critical elements of the architecture been identified properly?
  • Will the architecture support future systems that are natural extensions or expansions of the current system?
  • Does the architecture support performance requirements, including processing rates, data volumes, downlink constraints, etc.? 
  • To enable effective reuse, does the proposed architecture support some degree of scalability? 
  • Have the appropriate hazards been traced to the software architecture components?
  • Are the software architecture rules defined and documented?
    • Patterns, abstractions, algorithms
    • Monitoring and control
    • Data representation and data management
    • Concurrent threads, processes, memory management
    • Real-time execution, throughput
    • Synchronization
    • Inter-process communication
    • Languages, libraries, operating systems
    • Verification and validation
  • Confirm that the documentation of the architectural concept includes at least the minimum information listed below:
    • An assessment of architectural alternatives.
    • A description of the chosen architecture.
    • Adequate description of the subsystem decomposition.
    • Definition of the dependencies between the decomposed subsystems.
    • Methods to measure and verify architectural conformance.
    • Characterization of risks inherent to the chosen architecture.
    • The documented rationale for architectural changes (if made).
    • Evaluation and impact of proposed changes.
  • Is the format of the architecture description acceptable? Is the content clear and understandable?
  • Is the software architecture description more than a high-level design? Does it include quality attributes, rationale, and principles?
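
In practice, a list like this is administered as a structured checklist with a status and notes per criterion, consistent with SWE-088. A minimal sketch of such a record follows; the data structure and field names are invented, and the questions are quoted from the list above.

```python
# Hypothetical checklist record: each criterion carries a status and notes.
checklist = {
    "Is it clear what the architecture needs to do?":
        {"status": "closed", "notes": "Problem statement reviewed."},
    "Is the proposed fault management approach appropriate for the software and system?":
        {"status": "open", "notes": "Awaiting fault-management trade study."},
    # ... one entry per criterion in the list above
}

open_items = [q for q, f in checklist.items() if f["status"] == "open"]
print(f"{len(open_items)} of {len(checklist)} criteria still open")
for q in open_items:
    print(" -", q)
```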

3.5 Record Results and Track Actions to Closure

The results of the software architecture review, including findings, concerns, and best practices, are captured in a report and presented by the review board to management and others who can improve future processes and ensure that identified concerns are addressed. Problems found during the software architecture review should be entered into the project’s closed-loop problem tracking system or corrective action system and tracked to closure.

3.6 Measurements for Software Architecture Reviews

Projects can use the same measurement process for software architecture reviews as for other peer reviews.  See the SWE-089 - Software Peer Reviews and Inspections - Basic Measurements.

3.7 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:

3.8 Center Process Asset Libraries

SPAN - Software Processes Across NASA
SPAN contains links to Center-managed Process Asset Libraries. Consult these Process Asset Libraries (PALs) for Center-specific guidance, including processes, forms, checklists, training, and templates related to software development. See SPAN in the Software Engineering Community of NEN. Available to NASA only. https://nen.nasa.gov/web/software/wiki 197

See the following link(s) in SPAN for process assets from contributing Centers (NASA Only). 

SPAN Links

4. Small Projects

No additional guidance is available for small projects. 

5. Resources

5.1 References

5.2 Tools


Tools to aid in compliance with this SWE, if any, may be found in the Tools Library in the NASA Engineering Network (NEN). 

NASA users find this in the Tools Library in the Software Processes Across NASA (SPAN) site of the Software Engineering Community in NEN. 

The list is informational only and does not represent an “approved tool list”, nor does it represent an endorsement of any particular tool.  The purpose is to provide examples of tools being used across the Agency and to help projects and centers decide what tools to consider.


5.3 Process Asset Templates

SWE-143 Process Asset Templates

6. Lessons Learned

6.1 NASA Lessons Learned

The NASA Lesson Learned database contains the following lessons learned related to software architecture reviews:

  • MER Spirit Flash Memory Anomaly (2004). Lesson Number 1483 557: "Shortly after the commencement of science activities on Mars, an MER rover lost the ability to execute any task that requested memory from the flight computer. The cause was incorrect configuration parameters in two operating system software modules that control the storage of files in system memory and flash memory. Seven recommendations cover enforcing design guidelines for COTS software, verifying assumptions about software behavior, maintaining a list of lower priority action items, testing flight software internal functions, creating a comprehensive suite of tests and automated analysis tools, providing downlinked data on system resources, and avoiding the problematic file system and complex directory structure."
  • NASA Study of Flight Software Complexity.  Lesson Learned 2050 571: “Flight software development problems led NASA to study the factors that have led to the accelerating growth in flight software size and complexity. The March 2009 report on the NASA Study on Flight Software Complexity contains recommendations in the areas of systems engineering, software architecture, testing, and project management." 

    Specifically, the study recommends that projects:
    a. allocate a larger percentage of project funds to up-front architectural analysis in order to achieve project cost savings;
    b. increase the number and authority of software architects; and
    c. create a professional architecture review board to provide constructive early feedback to projects.

6.2 Other Lessons Learned

No other Lessons Learned have currently been identified for this requirement.

7. Software Assurance

SWE-143 - Software Architecture Review
4.2.4 The project manager shall perform a software architecture review on the following categories of projects: 

a. Category 1 Projects as defined in NPR 7120.5.
b. Category 2 Projects as defined in NPR 7120.5, that have Class A or Class B payload risk classification per NPR 8705.4.

7.1 Tasking for Software Assurance

From NASA-STD-8739.8B

1. Assess the results of or participate in software architecture review activities held by the project.

7.2 Software Assurance Products

  • Software assurance assessment of the architecture review
  • Software Assurance Status Reports
  • Assessment of software review results
  • Issues and risks identified relating to the software architecture review


    Objective Evidence

    • Software design analysis results.
    • Software architecture review results and findings (if applicable).

    Objective evidence is an unbiased, documented fact showing that an activity was confirmed or performed by the software assurance/safety person(s). The evidence for confirmation of the activity can take any number of different forms, depending on the activity in the task. Examples are:

    • Observations, findings, issues, or risks found by the SA/safety person, which may be expressed in an audit or checklist record, email, memo, or entry into a tracking system (e.g., Risk Log).
    • Meeting minutes with attendance lists or SA meeting notes or assessments of the activities and recorded in the project repository.
    • Status report, email, or memo containing statements that confirmation has been performed, with the date (a checklist of confirmations could be used to record when each confirmation has been done).
    • Signatures on SA-reviewed or witnessed products or activities, or
    • Status report, email or memo containing a short summary of information gained by performing the activity. Some examples of using a “short summary” as objective evidence of a confirmation are:
      • To confirm that: “IV&V Program Execution exists”, the summary might be: IV&V Plan is in draft state. It is expected to be complete by (some date).
      • To confirm that: “Traceability between software requirements and hazards with SW contributions exists”, the summary might be x% of the hazards with software contributions are traced to the requirements.
    • In addition to the examples above, the specific products listed in the introduction of 8.16 also serve as objective evidence.

7.3 Metrics

  • # of architectural issues identified vs. number closed
  • # of Non-Conformances from reviews (Open vs. Closed; # of days Open)
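
Both metrics can be computed directly from tracked non-conformance records, provided each record carries open and close dates. A minimal sketch under that assumption (all field names and data are invented):

```python
from datetime import date

# Hypothetical non-conformance records from architecture reviews.
nonconformances = [
    {"opened": date(2024, 3, 1), "closed": date(2024, 3, 15)},
    {"opened": date(2024, 4, 2), "closed": None},   # still open
]

closed = [nc for nc in nonconformances if nc["closed"] is not None]
still_open = [nc for nc in nonconformances if nc["closed"] is None]
print(f"identified: {len(nonconformances)}, closed: {len(closed)}, "
      f"open: {len(still_open)}")

for nc in still_open:
    print("days open:", (date.today() - nc["opened"]).days)
```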

See also Topic 8.18 - SA Suggested Metrics

7.4 Guidance

Software assurance and software safety participate in all software architecture review activities held by the project. Software architecture reviews are held during the software architecture formulation phase before the software architecture is finalized.  Generally, a software architecture review should be held before the software preliminary design review (PDR). The earlier the review board is involved in the software architecture development process, the more effective the inputs will be to the architecture development.

A checklist provides advice for software architecture reviews (see sample below).  Confirm that these questions and topics are discussed at the software architecture review. Some of the basic questions to be answered during the review include: 

  • Is it clear what the architecture needs to do? 
  • Is the architecture capable of supporting what the system must do? 
  • How does the architecture compare with previously implemented architectures, both successful and unsuccessful?  In particular, does the architecture display superior qualities that should be incorporated on other missions and/or at other Centers?
  • Have all the requirements been defined? More specifically:
    • Have all system requirements allocated to software been identified?
    • Has the project identified which of these allocated requirements drive the architecture?
  • Have all the stakeholders been identified to provide the essential design input specifying how the software is to be operated, how it is to execute its functions, and how it is to provide its products?
  • Have all the relevant quality attributes been identified and prioritized, and have they been characterized relative to the nature of the mission?
  • Has the source of all software elements been identified? Is it clear which portions of the system are custom-built, which are heritage/reused, and which are commercial off-the-shelf (COTS)?  Does the rationale exist explaining the project’s respective choices? 
  • Is it clear whether the architecture will be compatible with the other system components? 
  • Has the architecture been properly documented for future reference?
  • Does the architecture induce unnecessary complexity when addressing what must be done?
  • Is the proposed fault management approach appropriate for the software and system? 
  • Is the architecture flexible enough to support the maturation of the architects’ understanding of what the software must do?
  • Does the architecture support efficient code development and testing?
  • Have safety-critical elements of the architecture been identified properly?
  • Will the architecture support future systems that are natural extensions or expansions of the current system?
  • Does the architecture support performance requirements, including processing rates, data volumes, downlink constraints, etc.? 
  • To enable effective reuse, does the proposed architecture support some degree of scalability? 
  • Have safety-critical requirements been identified properly?
  • Have the appropriate hazards been traced to the software architecture components?
  • Are the software architecture rules defined and documented?
    • Patterns, abstractions, algorithms
    • Monitoring and control
    • Data representation and data management
    • Concurrent threads, processes, memory management
    • Real-time execution, throughput
    • Synchronization
    • Inter-process communication
    • Languages, libraries, operating systems
    • Verification and validation

Confirm that the software architecture review includes the following activities:

Software Architect Essential Activities

  • Understand what a system must do
  • Define a system concept that will accomplish this
  • Render that concept in a form that allows the work to be shared
  • Communicate the resulting architecture to others
  • Ensure throughout development, implementation, and testing that the design follows the concepts and comes together as envisioned
  • Refine ideas and carry them forward to the next generation of systems

Confirm that the documentation of the architectural concept includes at least the minimum information listed below. The actual format for recording and describing the architectural concept is left to the software project team; as a minimum, it includes the following (a skeleton sketch follows the list):

  • An assessment of architectural alternatives.
  • A description of the chosen architecture.
  • Adequate description of the subsystem decomposition.
  • Definition of the dependencies between the decomposed subsystems.
  • Methods to measure and verify architectural conformance.
  • Characterization of risks inherent to the chosen architecture.
  • The documented rationale for architectural changes (if made).
  • Evaluation and impact of proposed changes.
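
One lightweight way to make this minimum content auditable is a skeleton keyed to the items above. The sketch below is hypothetical; the section keys are invented, and the actual format remains the project team's choice.

```python
# Hypothetical skeleton for an architecture description record; each key
# mirrors one minimum-content item from the list above.
architecture_description = {
    "alternatives_assessment": "",
    "chosen_architecture": "",
    "subsystem_decomposition": "",
    "subsystem_dependencies": "",
    "conformance_verification_methods": "",
    "inherent_risks": "",
    "change_rationale": "",          # needed only if changes were made
    "change_impact_evaluation": "",
}

missing = [name for name, text in architecture_description.items() if not text]
if missing:
    print("Incomplete sections:", ", ".join(missing))
```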

Check that the format is acceptable and the content is clear and understandable.

Confirm that the following attributes have been considered: quality attributes, principles of design, verifiability, and operability. Reviewing these gives team members a clearer understanding of the project.

Confirm that the following attributes and characteristics of the architecture have been considered:

  • Software architecture is not just high-level design; it includes quality attributes, rationale, and principles.
  • Software architecture is not a one-time effort:
    • Make software architecture a driving force throughout the life cycle.
    • Good architectures don’t step aside once development starts.
  • Embrace well-architected software as a response to system complexity. Weak architecture:
    • Can’t be analyzed or validated for correct behavior, except case by case.
    • Can’t be changed with confidence, even to correct errors.
    • Can’t be operated with confidence, other than the way it was tested.
    • Can’t be reused easily or inherited from.
  • Conduct software architecture reviews.

Document any issues or risks that arise during the architecture review and confirm that the project is addressing these. 

7.5 Additional Guidance

Additional guidance related to this requirement may be found in the following materials in this Handbook:
