Test Strategy Documentation in an Agile world

Back in “ye olde days”, when I worked in a Waterfall environment and you could be the designated tester on project ABC, there was always a requirement, on a per-project basis, to write a comprehensive test strategy document. As well as detailing the specific tests that would be executed, it would outline things like:

  • overall approach
  • browser coverage
  • what was out of scope (based on test case prioritisation of high, medium, low)

So for project ABC, which was estimated at 3 months, one of the first tasks before starting execution would be to write up this document and send it out for review to the various stakeholders on the project. This could sometimes take a couple of days to write, which wasn’t an issue on such a big project, as there was time to do it.

In an Agile world we work in sprints of 2 weeks, across multiple projects (or user stories, as they are known at our company). These are a lot smaller in size, but if we wrote a test strategy document for each one we’d spend the whole iteration on documentation.

Again, this is something I’ve thought about for a while, and I would appreciate people’s thoughts on some of the questions I have, and maybe some examples/pointers:

  • is a “per project” test strategy document the right approach in an agile environment?
  • would a global test strategy document be the right approach: a more static, company-wide test strategy document, acting more as governance, as opposed to doing this for each project?
  • is a test strategy document even necessary at all?

I’ve had some great feedback during my short time on this site, so I’m looking forward to seeing what people come back with.



What story will you tell with your test strategy document?

I have been wrestling with the same question lately at my new workplace. In the past, they have been working with an IEEE-829-based set of test documentation, which is entirely unsuitable for an Agile environment. A contracted tester is busy with a set of test documentation based on the ISO-29119 (part 3) document, which is slightly better, but still takes forever to write and contains tons of information with no value. But the new standard does have one thing which may be valuable: it recommends that you write a master test strategy for all projects, and a lighter-weight (but still heavy) document for individual sprints/projects.

I don’t like either method. My own vision of a test strategy document is a document which tells the story about how we plan on adding value by testing. As such, our document should have:

  • A list of which functionality should be tested in this iteration (scope)
  • Why some things should not be tested, and the risk of not testing those things (risk analysis)
  • What we need in order to test, in both time and materials (task list)
  • Ideas about how we could test the things (i.e. exploratory, automated, requirements based, etc) (approach)

With this information, the team leaders or managers have enough to make a plan, the team has enough to make test designs (assuming other things), and it shouldn’t take too long to write.

In a previous job, we shortened the test strategy to an hour-long planning session with the team, followed by a short email communicating the results of the meeting.


To quote the Agile Manifesto: “Working software over comprehensive documentation”. Many people take this to mean no documentation, but it really means: only write valuable documentation.

We now need to consider: is the test strategy document valuable and useful to the team? Every team will have a different answer to this.

I certainly don’t like the idea of a global document as the teams should be able to decide what is appropriate for their project.

Every team should have a Definition of Done, and if some of the information from the strategy should go anywhere, it is there. If you do fold strategy information into the DoD, I would recommend not doing it in a prescriptive way, so the team has the flexibility to choose what’s appropriate on a per-user-story basis.


In one team (a project with multiple sprints) I was on, we started the test strategy doc in Sprint 1 and then slowly built it from there. The test strategy doc covered all the things that were equal across sprints:

  • Environments
  • Tools
  • Approach to test automation
  • Definition of Done, Definition of Ready, 3 amigos
  • Delivery pipeline

That worked quite well, and improvements to the doc & test process went onto the backlog along with other work products, until it stopped adding value. The DoR never got implemented, so nothing is perfect…

With that, if you really, really need a (per-sprint/per-release) test plan, its content comes down to Requirement Scope, Test Scope and People. With the right tool this can be pulled automatically.


If you have worked in the “old” Waterfall environment, then you must be familiar with the bulk of heavy documentation that was associated with the testing process.
You surely know the format and content of those documents, and as you say it is a heavy and time-consuming task to develop them all.
I can suggest a very light/lean document that can provide visibility and traceability to the testing process, but it’s not a silver bullet.
You should tailor the document to your business domain, stakeholder requirements and expectations.
Now let’s go through the document content:
- Write a very short introduction: the project name and purpose, the test environment description and requirements (resources), and responsibilities (who is doing what). All of that can be written on one page.
- Before starting to test, list the content of the sprint and state the purpose of the testing (that’s another half page).
- Open a spreadsheet and define the following columns: SW version, test equipment version, version of the automation SW (if any), HW version, user story name & number, device tested, test level (unit, CSCI, integration, system), test purpose, input data, expected results, actual results, pass/fail, and criteria for test failure. For each sprint, update/add the relevant data.
- When you complete sprint testing, issue a test report showing the activity you have accomplished, initiate corrective action where needed, and distribute a list of any tests that failed in the current sprint.
So you have an evolving test document that is updated every sprint.
When you have completed all sprint testing, I suggest running all the test cases accumulated in the spreadsheet again on the final SW version before approving release to production/customer.
Verify that all test cases that failed were retested and now pass.
If some part of the SW code was not tested (for that you must perform code traceability), or some requirements were not tested (I suggest testing all requirements), explain why and describe the risk of not testing.
The testing must still be done, but with minimum documentation effort. If you don’t work in a regulated domain, that might be enough documentation to show what was accomplished during testing.
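The evolving spreadsheet above can also be maintained programmatically rather than by hand. A minimal sketch in Python, assuming the csv module: the column names are taken from the post, while the file name and helper functions are purely illustrative.

```python
import csv
import os

# Column headings as listed in the post above.
COLUMNS = [
    "SW Version", "Test Equipment Version", "Automation SW Version",
    "HW Version", "User Story Name & Number", "Device Tested",
    "Test Level", "Test Purpose", "Input Data", "Expected Results",
    "Actual Results", "Pass/Fail", "Criteria for Test Failure",
]

def append_sprint_results(path, rows):
    """Append one sprint's rows to the evolving CSV, writing the header once."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS, restval="")
        if new_file:
            writer.writeheader()
        writer.writerows(rows)

def failed_tests(path):
    """List the rows that failed, for the end-of-sprint report."""
    with open(path, newline="") as f:
        return [r for r in csv.DictReader(f) if r["Pass/Fail"] == "Fail"]
```

Each sprint appends its rows, and before release the accumulated rows double as the regression list to re-run on the final SW version.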


You do need a test strategy, which is a strategy for your testing on the project/programme (or a strategy for testing in your company)… But how you choose to document that strategy and at what level should be as lean as possible. Try modelling or diagramming your strategy. Or try detailing it on a single slide.

The strategy should describe your test approaches, your test reporting methods, your strategy for managing environments, strategy for reporting bugs, key stakeholders and decision makers, etc (think overall, big picture on describing HOW you will test and report that testing and any discoveries).

This info isn’t sprint specific, but relates to the whole project (ie, cuts across many sprints). You’d be crazy to reproduce a strategy doc every 2 weeks.

Also, along with this strategy document, you might also write a lean test plan (linking your strategy to project timelines, and a prioritisation of product risks/quality criteria). And you might also consider a lightweight test policy (or mission statement).

But again, the key is to keep it lean. Cut out the lengthy text that people won’t read and reduce the bumph down to be the important info in an easily consumable format.


I have adopted the approach of having an overarching Test Strategy for the company which sets out the broad principles of how we test, tooling, guidelines, etc. This is a pretty static document; it does change, but no more than 3 or 4 times a year.

Each scrum team then has a ‘Test Approach’ document that is no more than 3 pages long. It is created in planning and provides just enough detail for the team and stakeholders. It takes no longer than 2 hours to write and is kept updated over the lifetime of the project, but again with very lightweight updates.

I take the approach that any document that adds no value to the delivery should be challenged and hopefully removed. We tend to use wikis as living documents, but with the ‘just enough’ approach to writing them.


A Scrum or feature team will document minimal, lightweight stuff:

  1. Where test results will be stored.
  2. How the team’s testing contributes to the overall code coverage system and test metrics.
  3. How often tests will run, where/what tool is used to schedule them, and what tool is used to track defects.
  4. A risk analysis chapter, including security analysis and a stress testing plan.
  5. A list of out-of-scope stories.
  6. In Agile, feature/component teams own their testing, so no overarching strategy is in place. Each component will often have a different test strategy and different tools; trying to capture specifics that will change during a project helps little, but where the test scripts and test data will be stored is critical.
  7. Where the tests will run, what kit will be used, and which people will be involved.

Basically a test resourcing plan only.
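A resourcing plan this minimal can even be kept as structured data rather than prose, so it stays diff-able and reviewable. A sketch, assuming Python: only the categories come from the list above; every concrete value is an invented placeholder, and item 6 is a principle of ownership rather than a field, so it is left out.

```python
from dataclasses import dataclass, field

@dataclass
class LeanTestPlan:
    """Lightweight per-team test plan covering only the items listed above."""
    results_location: str              # 1. where test results are stored
    coverage_and_metrics: str          # 2. contribution to coverage/metrics
    schedule_and_tools: str            # 3. run cadence, scheduler, defect tracker
    risk_analysis: str                 # 4. security analysis, stress testing
    out_of_scope: list = field(default_factory=list)  # 5. out-of-scope stories
    environments: str = ""             # 7. where tests run, kit, people

# Illustrative placeholder values only.
plan = LeanTestPlan(
    results_location="shared drive, per-sprint folder",
    coverage_and_metrics="feeds the company coverage dashboard",
    schedule_and_tools="nightly CI run; defects in the team tracker",
    risk_analysis="security review per epic; stress test before release",
    out_of_scope=["legacy admin screens"],
    environments="staging cluster; two embedded testers",
)
```

Keeping it as data makes it trivial to render into a wiki page or email each sprint.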


I was dealing with something like this 4 months ago, though we are not in a regulated market with our product.

What was/is important for us is that the stakeholders of the project (an epic at my company), the “receivers” of such a document, expect to learn different things from it, as they do different jobs. So we came up with a document called the “lean test plan”, which is usually done within 1 hour and covers the things that stakeholders and the team itself are interested in for epics (a project or big story).

For us, that is what is going to be tested and what is not; what new equipment/SW we need to buy to test effectively; and, very importantly, what our chances are of automating the testing, as we sometimes hit technical difficulties with test automation or lack support in the testing framework. I also look for team dependencies and possible risks.


I’ve used a similar approach to Alan’s.

  • Have a high level, pure ‘strategy’ document that isn’t updated very often. It’s more of a description of how my team works, and the approaches we use. But it’s not a ‘guarantee’. Since my team touches all of the high priority projects in the company, not everything in the strategy doc is applicable. Which leads to our next doc …
  • Live test summary per project that we’re engaged in. By live, I mean the summary is kept updated on a weekly basis. We perform mostly exploratory testing based on an evolving risk analysis of the product. As sessions are completed and debriefed, I incorporate our findings into the test summary, which we call a Quality Story, as that’s its true purpose.

The Quality Story itself starts with a summary of the current status. 3-4 sentences max. I follow that with an overview of what/how we tested (bullet points). Then a brief highlight of the top quality characteristics that 1) stakeholders are concerned about and 2) other characteristics that may be greatly impacted by issues found. I will briefly describe ‘how’ the characteristic is impacted and reference bugs (links) as necessary.

To be honest, it can be a little rough up front with those who are less comfortable not having a detailed test plan. It takes time to build confidence with your stakeholders, but once gained, things become very smooth sailing as they recognize your value. The Quality Story helped make that happen. For those who display more resistance, we offer the ‘low-tech dashboard’ approach, which is a ‘chart’ version of our quality story. The visuals are magically appeasing to some. But that usually isn’t required after the first engagement.

Using this approach, based on constant communication instead of charts (for the most part), we’ve become a valued stakeholder in the development process, which can be quite difficult when you’re not part of the development team.

To sum up, we really try to write ‘agile’ documentation, but you do have to be on it. Do not let your quality stories grow stale. If your week of work didn’t produce more visible issues, stay in communication with your stakeholders, as they may wonder about your efforts.



Hi guys,
I hope this thread is the correct one. I have also thought about a Test Strategy that I would update infrequently, plus a simple test plan, as Lisa and Janet’s book suggests.
My question is: can I find a valid template for creating a test strategy document? Because I have considered these points:

  1. Scope
  2. Test Approach:
    a. Process of testing
    b. Testing levels
    c. Types of test
    d. Roles & responsibilities
    e. Defect management
  3. Test Environment
  4. Testing Tools:
    a. Automation
    b. Management
  5. Release:
    a. Simplified test plan

I agree, a test strategy is helpful in setting out the framework of testing for a given project or programme.

I believe it should describe when particular types of testing are required (performance, security/pen testing, etc.); the iterative process you are using (e.g. Scrum, or a hybrid, company-defined Agile approach); how stories will be broken down; who is involved; environments; tools; the defect management process (e.g. does a defect get deferred or replace another story, who prioritises, etc.); the release and iteration schedule; and risks and issues.

Not necessarily in that order, or covering everything your test team might do. It should also describe the interactions between the teams (e.g. stand-ups, defect reviews, story elaboration workshops, co-location, etc.).

What I can’t settle on is the format for the information. Word docs feel a bit dry in an agile environment.

I would welcome suggestions to more effective communication of this information.


This probably won’t work for everyone but the method I’m about to describe worked well for my team. What we did in my team was to create a high level quality strategy at the very beginning, updating it as we introduced new browsers/environments/features. This detailed what environments we were using, browser support, definition of done for user stories & releases etc. - All the things that would not change on a user story basis.

Each user story then had a test approach and test specification. Following the Agile Manifesto, we decided the best way to do this was bullet points and a checklist. As a team, we had created a checklist of considerations, for example: does this feature impact performance, security, usability? In sprint planning, we would tick the boxes the feature related to, and while the feature was being developed, we as testers would write a test approach and test specification with check points back to the checklist, which would then be reviewed.

We tried many different ways to do this and finally agreed on bullet points. They conveyed all the information stakeholders needed and were clear and concise. Because it was just bullet points, the test approach and test specification generally only took a few hours to write, so the sprint was still able to stay on schedule. Our quality strategy said that we would test to it, and any testing that deviated from it would be clearly described within the individual user story.

As I say, this approach worked best for the team I was in. It may not work for everyone but we found that it worked well for us. I hope this helps.



I just saw 2 new ideas in this thread, because I used my posting above as a way to document my thinking for my future self. These are the 2 ideas I am going to roughly steal; please read the posts above, because I’m summarising here.

  • Separate the process of test planning in general from testing a feature, and use this to communicate the commonly used plan to everyone. It cuts down on repetition. A great idea from @zeff.morgan
  • Each user story must also have a test approach; as a team, each task needs to be checked against criteria like security, usability and, of course, testability. This lets you checklist any testing activities in the sprint. A good tip from @la1

In practice, where I work, we have 3 teams and everyone does it differently: one team, I don’t know what they do at all; the team near me writes it all up in a big table on the wiki; and I tend to create a short PowerPoint and a Word document. I hope nobody minds this thread necro, but I’m looking for tips like the bullet-point-organised ones brought out higher up by everyone else, which I hope are still relevant.