5.4.5 The project manager shall monitor measures to ensure the software will meet or exceed performance and functionality requirements, including satisfying constraints.
The metrics could include planned and actual use of computer hardware resources (such as processor capacity, memory capacity, input/output device capacity, auxiliary storage device capacity, and communications/network equipment capacity, bus traffic, partition allocation) over time. As part of the verification of the software detailed design, the developer will update the estimation of the technical resource metrics. As part of the verification of the coding, testing, and validation, the technical resource metrics will be updated with the measured values and will be compared to the margins.
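The comparison of measured resource values against margins can be sketched as follows. This is an illustrative example only; the resource names, utilization figures, and the 20% reserve margin are hypothetical, not drawn from any NASA project.

```python
def check_margins(measured, margin=0.20):
    """Return the resources whose measured utilization (as a fraction of
    capacity, 0.0-1.0) exceeds the allowed ceiling, i.e., capacity minus
    the reserve margin."""
    ceiling = 1.0 - margin
    return {name: util for name, util in measured.items() if util > ceiling}

# Hypothetical measured utilization fractions for tracked resources
measured = {
    "processor": 0.85,
    "memory": 0.62,
    "bus_traffic": 0.78,
    "aux_storage": 0.40,
}

over_margin = check_margins(measured, margin=0.20)
# With a 20% margin the ceiling is 0.80, so only "processor" is flagged.
```

Tracking this result over successive milestones shows whether a resource is trending toward its margin before it becomes a constraint violation.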
1.3 Applicability Across Classes
It is very important to consider constraints on resources in the design of a system so that the development effort can make appropriate decisions for both hardware and software components. As development proceeds, it is important to check regularly that the software is meeting the performance and functionality constraints. These results should be reported at major milestone reviews and regularly to the Project Manager.
It is important to remember that functional requirements describe the "what" and nonfunctional requirements the "how." Testing of functional requirements verifies that the software executes its actions as it should, while nonfunctional testing helps verify that customer expectations are being met.
An early metric for gauging the success of meeting functional requirements would be the results of unit testing. The next step would be looking at the number of issues captured while dry-running verifications. Finally, the number of issues found and resolved along with the number of requirements verified during formal verification are good metrics to track.
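The early functional metrics listed above can be summarized with a simple calculation. This is a minimal sketch; the function name and the sample figures are illustrative assumptions, not part of any prescribed measurement set.

```python
def functional_metrics(tests_passed, tests_run, reqs_verified, reqs_total):
    """Summarize two early functional indicators: the unit-test pass rate
    and the percentage of requirements verified so far."""
    return {
        "unit_test_pass_rate": tests_passed / tests_run if tests_run else 0.0,
        "requirements_verified_pct": 100.0 * reqs_verified / reqs_total,
    }

# Hypothetical snapshot: 90 of 100 unit tests passing, 40 of 200
# requirements formally verified.
snapshot = functional_metrics(90, 100, 40, 200)
```

Reporting these figures at each milestone review makes the trend, rather than a single data point, the basis for judging progress.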
Performance requirements can be difficult to determine since many are domain- and project-specific. They tend to depend on the software design as a whole, including how it interacts with the hardware. Performance requirements are usually, but not always, related to the quality attributes of the software system.
Some typical performance requirements involve the following:
At a minimum, these items should be evaluated and reported to the Project at all major milestone reviews as well as any major maintenance upgrades.
4. Small Projects
No additional guidance is available for small projects.
6. Lessons Learned
6.1 NASA Lessons Learned
No Lessons Learned have currently been identified for this requirement.
6.2 Other Lessons Learned
No other Lessons Learned have currently been identified for this requirement.
7. Software Assurance
7.1 Tasking for Software Assurance
- Confirm the project monitors and updates planned measurements to assure the software will meet or exceed performance and functionality requirements, including satisfying constraints.
- Monitor and track any performance or functionality requirements that are not being met or are at risk of not being met.
7.2 Software Assurance Products
- The number of requirements being met, any TBD/TBC/TBR requirements, or any requirements that are not being met.
- Performance and functionality measures (Schedule deviations, closure of corrective actions, product and process audit results, peer review results, requirements volatility, number of requirements satisfactorily tested versus the total number of requirements, etc.)
Task 1: The software assurance personnel will review the software development/software management plan to become familiar with the functional and performance metrics the project has planned to collect. Functional metrics measure the "what" of the project's work; for example, the number of unit tests run and passed is a functional measure. Examples of performance metrics include:
- Peak demand processing, i.e., transactions per second over some prescribed duration
- Sustained processing, i.e., transactions per second over any time span
- Response time, i.e., time to service an interrupt
- Storage capacity/utilization
- Sampling rates
- CPU utilization
- Memory capacity/utilization
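Two of the performance metrics above, response time and sustained processing, could be sampled with a simple timing harness like the following. This is a sketch under stated assumptions: the function names are illustrative, and `handler` stands in for whatever operation a project actually measures.

```python
import time

def measure_response_time(handler, repetitions=1000):
    """Average wall-clock time (seconds) to service one call to handler."""
    start = time.perf_counter()
    for _ in range(repetitions):
        handler()
    return (time.perf_counter() - start) / repetitions

def sustained_throughput(transactions, elapsed_seconds):
    """Transactions per second over the measured span."""
    return transactions / elapsed_seconds

# Hypothetical usage: time a stand-in workload, then compute throughput
# from an observed transaction count.
avg_seconds = measure_response_time(lambda: sum(range(100)))
tps = sustained_throughput(transactions=500, elapsed_seconds=2.0)
```

In practice these measurements would be taken on the target hardware, since host-machine timings rarely transfer to the flight or operational environment.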
Software assurance then confirms that the project is collecting the planned measures and assessing whether it will meet its performance and functionality goals. Software assurance also confirms that the project updates the set of metrics being collected if the initial set chosen is not providing the information needed.
Task 2: Software assurance will do an independent analysis of the performance and functionality measures that the project is collecting to determine whether the project will meet or exceed its functional and performance requirements. It is particularly important to analyze some of these measures to get an early indication of problems. For example, if the number of unit tests run and passed is much lower than expected or planned for this point in the schedule, then it is unlikely the project will deliver on schedule without some correction. Requirements volatility is another early indicator of problems. If there is still a lot of volatility in the requirements going into implementation, then it is likely that implementation will be slower than planned and involve a lot of rework to account for changing requirements.
In this case, software assurance should perform its own independent assessment of the performance and functionality measures the project is collecting to determine whether the project will meet or exceed its functional and performance requirements, rather than simply accepting or relying on the engineering assessment.
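The early-indicator analysis described in Task 2 can be sketched as a planned-versus-actual comparison. The metric names, values, and 10% tolerance below are hypothetical assumptions chosen for illustration only.

```python
def early_warnings(planned, actual, tolerance=0.10):
    """Flag metrics whose actual value lags the planned value by more
    than the given tolerance fraction. Missing actuals count as zero."""
    warnings = []
    for name, plan in planned.items():
        act = actual.get(name, 0)
        if plan and (plan - act) / plan > tolerance:
            warnings.append(name)
    return warnings

# Hypothetical mid-implementation snapshot: unit-test completion is well
# behind plan, while requirements stability is close to plan.
planned = {"unit_tests_passed": 200, "reqs_stable_pct": 95}
actual = {"unit_tests_passed": 150, "reqs_stable_pct": 94}

flagged = early_warnings(planned, actual)
# "unit_tests_passed" lags plan by 25% and is flagged as an early warning.
```

A flagged metric here is a prompt for independent analysis and follow-up with the project, not by itself proof that a requirement will be missed.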