Laura asked how to get the whole team involved in performance testing.
A technique that worked well for my team recently was this:
- Work with the product owner to agree the goals of the performance testing
- Use those goals to define specific test scenarios (ideally roughly prioritised)
- If you’re completely new to a tool, work out how to script, debug and run one of the easy-to-medium scenarios yourself first, and start to understand what information you’ll want to capture in the actual test run
- Pair with a developer on scripting and debugging a scenario, to share your knowledge
- Divide and conquer scripting the rest of the scenarios, with plenty of opportunities to review progress and course-correct. We did final pull request reviews too, which helped. Identifying potential ‘tricky bits’ and unknowns in the scripting and making a conscious decision about who was going to tackle those also worked well.
- Divide and conquer running the final test scenarios. I created a template for each test run to capture the important information (e.g. start and end time, checklists for test data setup, placeholder tables for capturing key metrics, etc.), so the devs could rattle through running the scenarios. That freed me up to focus on working through the results and digging into the reasons behind the behaviour, to draw conclusions and recommend further work to consider.
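To give a rough idea of what such a run template can look like, here's a minimal example. The fields below are illustrative, not the exact template we used; adapt them to whatever your goals and tooling call for:

```
## Test run: <scenario name>
- Tester / date:
- Start time / end time:
- Test data setup:
  - [ ] Accounts seeded
  - [ ] Caches warmed

| Metric       | Target | Observed |
|--------------|--------|----------|
| Median (ms)  |        |          |
| p95 (ms)     |        |          |
| Error rate   |        |          |

- Observations / follow-ups:
```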
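On the ‘capturing key metrics’ part: as a minimal sketch (Python, standard library only), this is the sort of summary a scenario run might feed into a template's metrics table. `run_scenario` and the stand-in `action` are hypothetical illustrations, not a real load-testing harness — in practice your tool will report these numbers for you:

```python
import statistics
import time

def run_scenario(action, iterations=100):
    """Time repeated calls to `action` and summarise the latency.

    `action` is a hypothetical stand-in for one scripted scenario step
    (an HTTP request, a transaction, etc.).
    """
    timings_ms = []
    for _ in range(iterations):
        start = time.perf_counter()
        action()
        timings_ms.append((time.perf_counter() - start) * 1000)
    # quantiles(n=20) yields 19 cut points at 5% steps:
    # index 9 is the median, index 18 the 95th percentile
    q = statistics.quantiles(timings_ms, n=20)
    return {"median_ms": q[9], "p95_ms": q[18], "max_ms": max(timings_ms)}

# Usage: time a stand-in action 50 times and summarise it
summary = run_scenario(lambda: time.sleep(0.001), iterations=50)
```

The point is less the code and more the shape of the output: agreeing up front which few numbers matter (median, p95, error rate, whatever fits your goals) is what lets the devs run scenarios independently and still hand you comparable results.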
I hope that helps! I’d be interested to hear more about your situation and would be happy to talk in more detail.