As testing approaches evolve, and as testers gain experience, the approach to test planning changes, too.
Test plans have come a long way.
At one end, you’ll find the traditional, many-section document brimming with objectives, timelines, risks, and traceability matrices. At the other, a one-page dashboard that keeps only the lean essentials. Both aim to answer the same questions: What are we testing? Why? When? But they shape conversations, hand-offs, and expectations in very different ways.
Think back over the plans you’ve written, or inherited, and consider how your approach has shifted as teams, tooling, or release cadence changed. Then please reply and let me know:
What you liked about each format
What was clear and helpful
Anything you didn’t like
Key differences you notice between a detailed plan and a lean template
Pros and cons you’ve experienced with both formats
Bullet points, quick sketches, or redacted snippets are all welcome, but please remember to remove any sensitive information first.
Let’s see how test-plan formats are evolving across the MoTaverse and what trade-offs we’re all making along the way.
I can think of 3 formats from my past and present:
The V-model test plan
Years ago I worked in a V-model process where quarterly releases were planned. Each stage required sign-off by all stakeholders: Requirements, Design, System Test and UAT. I can’t quite remember if it was called a System Test Strategy or a System Test Plan.
The main problem was that 80% of the document was the same template every time, so it was treated as an essential deliverable rather than a usable document that everyone benefited from. Only the Test Manager knew about it: they wrote it and never shared it with the team.
On the positive side, all the stakeholders would sign off on what we said we would and wouldn’t do in the testing, and as long as we delivered on that commitment, we were fine. The downside was that there was no flexibility, and of course the document was never shared with the wider QA team. The other elephant in the room: was anyone actually reading it, given that 80% of it was the same document every time?
Nothing
From one extreme to the other. In a start-up company, working agile, there was no need to worry about sign-offs, so we had no documented test plan at all. Stuff came in, we tested it, it went out. That was fine while the customer was working inside these iterations and signing off features by playing with them. But as the product got bigger and more complex, the impacts of changes, not only on other areas of the product but also on other products, became more frequent. Not every customer wanted to work iteratively with us either; they wanted the product thoroughly proven before they got it. We had also become quite reactive to issues that could have been prevented, which slowed down our feature delivery.
The Tactical Test Plan
Now we put together a tactical test plan. We work iteratively, but once we’re clear on what we’re committing to in a software release, we write a brief test plan explaining what our key quality objectives will be for the release, how we’ll cover those objectives, and anything extra we need up front. Anything that isn’t subject to change isn’t documented, so you lose the 80% template: it’s fresh every time. It’s a best-of-both-worlds scenario, the main principle being to use the document to do your systems thinking and gain some confidence in how we’re going about proving quality as efficiently as possible. It’s presented across QA and shared with the engineering squads.
The downside is that it’s very QA-centric, and readers may think “not my problem, I’m not in QA”. What we don’t document isn’t considered as part of the planning, and in some cases the fact that an area doesn’t change doesn’t mean it shouldn’t be. So I certainly think it still needs to evolve; we just need to be careful it doesn’t turn into a “deliverable” and remains a “tool”.