A ‘Performance Test Plan’ documents the approach that will be used to verify that a product or system meets its performance requirements. Below, I have outlined the key elements that a Performance Test Plan should contain.
- System Under Test
Describe the system under test. Include the system functionality, software architecture, hardware, and network. You should also state the type of development: for example, in-house, off-the-shelf, or contracted (i.e. developed by another organisation).
- Test Objectives
Detail the high-level objectives of the performance test. For example, the primary objective might be to validate that the system under test can support the predicted business volumes for 2020 within the specified response times.
- Test Requirements & Scenarios
Test Requirements
Detail what testing needs to be done to achieve the objectives; the requirements should correspond to the test scenarios. For example, the requirements for this project are, firstly, to execute a ‘Load’ test to determine whether the system will perform with acceptable response times at loads of up to 100 concurrent users running a scenario of typical business processes; and secondly, to execute a ‘Stress’ test to determine the maximum number of concurrent users that the system can support, and to provide information about any bottlenecks and limitations found when the system is under stress.
Test Scenarios
In this section, describe the scenarios that will be created to simulate different conditions of load on the system, e.g. each scenario will consist of at least one test script being executed by a specified (or maximum) number of Virtual Users (vusers), as in the sketch below.
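As an illustration, here is a minimal sketch of such a scenario using Locust, an open-source Python load-testing tool. The endpoints, task weights, and think times are hypothetical placeholders for your own business processes:

```python
from locust import HttpUser, task, between

class TypicalBusinessUser(HttpUser):
    """One vuser executing a weighted mix of typical business processes."""
    wait_time = between(1, 5)  # think time between actions, in seconds

    @task(3)  # searches occur three times as often as purchases
    def search(self):
        self.client.get("/search?q=widgets")  # hypothetical endpoint

    @task(1)
    def purchase(self):
        self.client.post("/orders", json={"item": "widget", "qty": 1})  # hypothetical endpoint
```

A 100-vuser scenario could then be run with, for example, `locust -f scenario.py --host https://test.example.com --users 100 --spawn-rate 10`.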
- Measurable Success Criteria
This generally takes the form of target response times for each type of transaction, often expressed as a percentile (e.g. the 90th-percentile response time); a simple way of checking results against such targets is sketched below.
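As a sketch of how such criteria might be checked against collected results, the following Python fragment compares the 90th-percentile response time of each transaction type with its target; the transaction names, targets, and sample values are all hypothetical:

```python
import statistics

# Hypothetical target response times (seconds) per transaction type
TARGETS = {"login": 2.0, "search": 3.0, "checkout": 5.0}

def p90(samples):
    """90th percentile of a list of measured response times."""
    return statistics.quantiles(samples, n=10)[-1]

def check_success_criteria(results):
    """results maps transaction name -> list of response times in seconds."""
    for txn, samples in results.items():
        measured = p90(samples)
        status = "PASS" if measured <= TARGETS[txn] else "FAIL"
        print(f"{txn}: p90 {measured:.2f}s (target {TARGETS[txn]:.1f}s) {status}")

check_success_criteria({"login": [1.2, 1.8, 2.4],
                        "search": [0.9, 1.1, 1.3],
                        "checkout": [3.0, 4.2, 6.1]})
```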
- Stakeholders
Detail the approvers and reviewers of the document.
- Assumptions
Detail any assumptions.
- Scope / Out of Scope
Describe what is in scope and what is not in scope for the performance testing.
- Risks & Mitigation
Describe any project and performance risks that have been identified, and map the performance risks to the test types; this ensures each risk can be mitigated by running a particular test. For example, there is a risk that the system will not be able to handle 100 concurrent users. To mitigate this risk, run a load test with 100 concurrent users to verify that the system can handle the specified load.
- Testing Types
Detail all the types of tests that will be executed (a configuration sketch for a stress ramp-up follows the list), e.g.
- Load
- Stress
- Soak
- Spike
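To illustrate the difference between a steady load test and a ramped stress test, here is a minimal sketch of a stepped stress ramp using Locust's LoadTestShape; the step size and interval are hypothetical, and a plain load test would instead hold a fixed user count:

```python
from locust import LoadTestShape

class StressRamp(LoadTestShape):
    """Add 25 vusers every two minutes until the run is stopped or the system breaks."""
    step_users = 25     # hypothetical step size
    step_seconds = 120  # hypothetical step interval

    def tick(self):
        run_time = self.get_run_time()
        users = self.step_users * (int(run_time // self.step_seconds) + 1)
        return (users, self.step_users)  # (target user count, spawn rate per second)
```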
- Business Processes/Test Scripts
Detail the Process ID, Process Name, and Process Description in this section.
Detail the activities involved in each script, outline its data requirements (e.g. script parameters and static data), and note any dependencies it may have on other scripts.
- Test Approach
Describe the test approach that will be used. For example, state how many test cycles will be performed after the planning and preparation phases have completed successfully.
- Test Data
In this section, specify the data requirements. These can be broken down into three areas:
- Reference data
Data that must exist in the database prior to test execution, as it will be referenced by the transactions executed against the database during testing; for example, broker details and valid banks. Detail the static data that will be required to support the testing.
- Transaction data
The parameterised details of transactions that are to be executed during testing. For example, a purchase transaction would need the buyer’s details and card number. Outline the transaction data that will be required as parameter data for scripts.
- Bulk data
This is data that must exist in the database but is unchanged throughout testing; it is either not involved at all, or only indirectly involved, in the testing, and is often present to provide realistic bulk for searching, sorting, etc. Outline the bulk data that will be required to populate the database to a size typical of production. Generally, this data will not need to be refreshed between tests.
It is important that all of these data types are specified for the required testing. Think about how you are going to create them, and how you will make sure they are in the system and/or scripts for your test (a sketch of generating unique parameter data follows the list). Make specific reference to the following:
- Volumes
- Uniqueness
- Other constraints
- Methods of load and restore, and
- Database refresh requirements.
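As a sketch of how unique transaction (parameter) data might be generated, the following writes a CSV parameter file for a hypothetical purchase script; the column names, row count, and value ranges are all assumptions:

```python
import csv
import random
import uuid

# Generate 1,000 unique parameter rows for a hypothetical purchase script
with open("purchase_params.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["buyer_id", "card_number", "amount"])
    for _ in range(1000):
        writer.writerow([
            uuid.uuid4().hex,                                      # guaranteed-unique buyer reference
            "4111" + "".join(random.choices("0123456789", k=12)),  # test-style card number
            round(random.uniform(5.0, 500.0), 2),                  # purchase amount
        ])
```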
- Test Environment
Describe and outline the test environment, both the hardware and software to be used. For example, consider the following:
- Test environment name, e.g. Pre-Production
- Environment owner e.g. Operations Manager
- Dates available for test
- Whether it is for Shared/Exclusive use
- How representative it is of production
- If it requires a data/code refresh
- If there are constraints
- Backup/restore processes
- Network location & setup
- Test Infrastructure
Describe the machines that are available and will be used for the performance testing, including, for example, IP addresses and network IDs.
- Monitoring
Detail the monitoring that will be used, and the data that will be collected, during the performance testing, covering all possible parts of the system. If you do not have access to monitor a component, or do not have the necessary tools, it is critical to state this as a risk, as it will result in a gap in the test results.
Example areas to consider monitoring are:
- All servers (databases, application servers)
- All network components (network segments, routers, load balancers)
- All parts of test architecture (Controller, load generators)
Detail the monitors you will be using:
- Test tool monitors (e.g. LoadRunner)
- Standard monitors (perfmon, system admin pages)
- System logs
- Other monitors (check with tech support)
- Your own processes (a minimal sketch is shown below)
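As an example of the last category, here is a minimal sketch of a custom monitor using the third-party psutil library to sample server utilisation during a test run; the sample interval, duration, and output file are arbitrary choices:

```python
import csv
import time
import psutil  # third-party: pip install psutil

# Sample CPU, memory, and disk utilisation every 5 seconds for one hour
with open("server_metrics.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_pct", "mem_pct", "disk_pct"])
    for _ in range(720):
        writer.writerow([
            time.strftime("%H:%M:%S"),
            psutil.cpu_percent(interval=5),   # blocks for the 5-second sample window
            psutil.virtual_memory().percent,
            psutil.disk_usage("/").percent,
        ])
        f.flush()  # keep the file current if the test is interrupted
```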
- Test Framework
- Entry & Exit Criteria
Detail the Entry and Exit Criteria for the planning, preparation, and execution phases of the project.
- Suspension / Resumption Criteria
Detail when the performance testing will be suspended (e.g. when the system is unavailable) and when it will be resumed (e.g. when the system becomes available again).
- Reporting
Document the reporting methods to be used throughout the project; for example, weekly status reports.
- Communication
Detail the communication methods to be used throughout the project, for instance daily stand-ups.
- Incident Management
Describe the incident management process to be used within the project.
- Schedule
Detail the planned test schedule (e.g. project plan) including the following:
- Test preparation activities
- Test setup activities
- Test execution activities
- Test evaluation activities
- Test tuning activities
- Test reporting activities
- Test resources
- Roles and Responsibilities
- Key Contacts
List the names and contact numbers of key personnel on the project.
- Support Requirements
Detail the activities required from staff outside the test team. If the required support is not received, this may affect the planned test schedule.
- Deliverables
Describe all the deliverables that will be delivered during the project.
- Performance Test Plan (document containing the above elements)
- Performance Test Completion Report, which will be created once the planned tests are completed, detailing:
1) Summary of tests run, results & system tuning conducted
2) Key findings, conclusions & recommendations
3) RAG (Red/Amber/Green) metrics reporting on the Measurable Success Criteria of each test: were the key test objectives met?
4) Statement of Readiness – concise statement answering the key questions asked during the performance testing. Is the application ready to go live?
- Glossary
Define any terms used in the document that need an explanation.
In conclusion, this is not an exhaustive list, but all of the above elements should be considered when producing a Performance Test Plan.
Contact us at SQA Consulting to see how we can assist in developing your team’s skills and in performance testing your applications.