Test case steps - to write or not to write?

Hi everyone.

We struggle to find the time to write test cases in VSTS (our company tool of choice) with the actual steps, so that in theory anyone could follow them, not just the person (tester) who knows the application.
I wanted to change them to a BDD format to form the basis for the automated regression tests, but in one of the teams there isn't time for the testers to write them.

As we all should do from time to time, I wanted to step back and think about the value of the individual steps, and I am interested in other people's thoughts and experiences. Do you find they add value? If you don't do them, have you noticed any issues at all?
And following on from that, do you take evidence of test passes or just failures? This is another thing I am thinking about in terms of value add.



The problem is that unless your BDD description is really detailed, you run the risk of the steps not being detailed enough for the most novice user to be able to re-execute them.
Can the test scripts not be written whilst the devs are writing the code?


Hi Paul - that's what I'd like to happen, but the team are under time pressure and have been told by the team lead not to do so. Before I insist on something being done, I want to sanity check that it is worthwhile and adds value, and that's why I'm interested in knowing what others think about this.


If you did not write tests in excruciating detail, how would that affect the test execution? Say, for example, you just had a test description written with some detail ("Ask for data from {X} dataset"). Could an experienced tester perform this test? Would the results of the test be valuable to your team?

How many people who aren’t the tester have EVER performed the tests from VSTS?

If a tester who lacks experience should run a test without detailed steps, but WITH someone who could answer questions and help out with some of the details, would that tester learn more or less about the system and how people could (ab)use it?

The time the "detailed" approach takes covers planning out the test steps, writing them down, translating them for the company tool of choice, recovering the written tests from the company tool of choice, confirming that the test is the correct test for that time and place, executing the test, and reporting the results - not only in the company tool of choice but also directly to the people who need them quickly. Could you use this time in a more constructive way? (And don't forget maintaining and updating the test sequences in that time too.)


The value is being able to track test cases. What about your regression suite, or your smoke testing suite?


I'm assuming this is a manual testing scenario only.

In my opinion, there are many benefits to writing concise tests which an executor with enough application knowledge should find easy to comprehend and follow. All executors should have enough application knowledge to test not only what's mentioned in the test but also around it, based on app/domain knowledge.

Humans are bad at following steps mechanically but great at exploring areas. Machines on the other hand are best at following detailed instructions. Hence make the most of human testers.

I find it very painful to read and follow tests that are written in excruciating detail. They take longer to read and comprehend and cost a lot more to maintain (as mentioned by @brian_seg).

You mentioned BDD; however, please note it's a development methodology that is used for better collaboration in the team. If you are not using BDD the way it should be used, then it just becomes an overhead (that could be a separate post by itself).

Note: even if you are thinking of automating tests in future, more concise tests help with faster comprehension, and automation engineers can easily combine and split them as necessary in the framework for better automation.

As for evidence, I have hardly seen anyone using evidence from old tests. I understand that in some industries evidence is necessary for audit purposes, but I'd try to make the evidence-gathering process as easy as possible for a tester. E.g. instead of requesting testers to attach a screenshot for every step, I'd capture an automated screen recording of the application. Attaching separate screenshots can be very time-consuming and can cost a lot.


Thanks Brian, all good points. I need to weigh the value of detailed steps against the potential issues in the future if someone new were to try to run the tests independently and couldn't, because there is no detail.
Food for thought!


Thanks @ychaudhari - yes, we are talking about manual tests.

You picked up on something I hadn't, and that's allowing someone to do exploratory testing. But we have set end-to-end sign-up processes to follow for the process to work, so having steps to follow can be useful in that sense. There's no reason someone can't go 'off-piste' and try something different though.



There are testing tools like Eureqa and Qvidalabs on the market where you can do drag and drop for your web applications, and they will create the automation scripts for you. You don't even have to write the BDD formats specifically. You can get continuous scriptless automation from day one, as well as load tests for the scenarios/features you want tested.

This should not take extra time, as the scripts will be ready as you run your sprints and can be executed any number of times.

The evidence/reports provide a playback of passes as well as failures, saving significant time at a fraction of the cost, as you are charged only for execution time and it is available on the cloud.

Hope that helps.




I can definitely relate to this. My observation is that testers generally tend to be very detail-oriented people. Makes sense, as that's kind of fundamental to what we do, i.e. test execution. But as per @brian_seg's point, is that same level of detail orientation appropriate to how we document testing, i.e. for test cases? I think the answer likely boils down to 'it's horses for courses': if we have the justification/time/resourcing to create very detailed test cases, then that is what we need to do.

As an example of why I don't do that in my current job: it's a 30-person company with just 10 in the IT team, and that's reflected in budgets and time pressures, i.e. we are busy, busy, busy. So when a new tester came on board who was pushing to have more detail in the test cases, I pushed back against it, as the company can't afford the time to do this. Obviously this is glossing over a detailed discussion that we had, but keeping things simple, we adopted a 'just barely good enough' approach, as per agile practices, because moving fast and keeping things lean is more important to us than documenting test cases to the nth degree.

As always, YMMV! I can see how this approach would be utterly inappropriate somewhere that testing is more, and literally, mission critical, e.g. NASA.


A test is used to evaluate a product and learn about it, through questioning, experiments, explorations, inference, modelling, etc.
A test with low information value is a useless test (see Cem Kaner's big books/articles).

For a tester, or a person in the tester role, who doesn't know the application, restricting them from learning anything would be awful, as you give them a recipe for how not to find bugs.
For a tester who knows the app and reuses the cases, it's again awful, as a tester should be on a constant learning path. This tester doesn't grow. They don't expand their skills further; by being limited to some cases, they will not try other techniques, tools, or depths of the app or its logic, and will not think of other test ideas or approaches to a feature, etc.


In a previous role, I wrote down detailed test steps but that didn’t stop me branching off into what I now recognise as ‘exploratory testing’, and indeed, I often then wrote those exploratory tests into the scripts so that their scenarios could become part of the regression suite.

And there are some sectors - finance, medicine and other regulated businesses - where detailed test steps are required for due diligence purposes.

Finally, just bear in mind that although very many apps nowadays are intuitive in their operation, some software that is intended for non-expert end users may need help pages written that can be used to step through a workflow - either as training or to guide new users. A detailed test script makes a very good starting point for such help pages.


So what do you have now in the steps area? Anything? Or just a test title like “reset password” with a broad scenario instructing to reset the password, use various emails - valid, invalid, etc.

Trying to get an idea of what a case looks like for your team, because some things can be intuited and others need to be explicit.


@jcost - it's a mix of cases with a title and steps, and others with just a title and no steps at all.

I don't want 'War and Peace', but I think we need some info so that if anyone new comes in, or someone else covers, there is enough to use as a guide. The steps are not that intuitive, and if the process were simpler then I'd probably be OK with less detail.


Hi Steve, at our company we use test cases and test steps extensively - I find that if the product / feature / use case under test is demanding on the tester having domain knowledge, then having test cases at the smoke test / exploratory level (having some detailed steps for those outside the team to start learning from) coupled with more granular test cases that have less detailed steps (providing a platform for those with less knowledge to collaborate and learn from product QA) have worked out well.

Personally, I've opted for capturing the various steps and use cases in the test case summary area as opposed to having exact test steps (action with expected outcome), as I prefer the QA/person running the test case to come and "chat" - that way I can pass on things to test and look out for, dig deeper into their test approach, and build confidence for the both of us that we're doing a good job. I use screenshots and tables to capture the parameters/setup for the test case, which doesn't work in test steps.

I think for your scenario - if you expect people outside of the team to frequently pitch in and help because the time pressure is always there - maybe detailed test cases with detailed steps can help, especially if the test approach and risks are non-intuitive. For your plan of using test steps as a jump pad toward BDD style tests - my speculation is that unless the test step lexicon is standardized, you might not be able to reuse the steps without additional investment - proceed with caution!

What has worked well for us to engage non-QA helping out with the testing effort was the use of smoke tests that had detailed steps across a wide area of different application areas - we got quick and wide coverage, and passed on a bit of domain knowledge - but it was at the cost of some QA and the developers herding the helpers. It’s not great but it’ll get the job done when you need bodies!

This all depends on what pain you’re trying to solve / alleviate with test steps of course.


Thanks everyone.

I've decided to go with a BDD approach, and if there is any supplemental info that is needed only once or twice (for a newbie) then it is saved as reference data rather than appearing in every test.
We have one 28-step test reduced to 8 steps as a result, and it just reads so much better.
I don't want to go back to the old waterfall days of writing tests: 'Click on this, then click on that, then navigate here', etc.
Moving forwards is the way to go!


It’s interesting that @sjwatsonuk decided to go with BDD. Hope you’ll perhaps kindly keep the thread informed on how things work out down the line…

In case it helps others: in my experience, even in BDD there are some choices to be made at the low-level implementation. It may be useful for folks to be aware of declarative vs imperative styles. This is a useful and clear article that may help folks in terms of 'phrasing' their BDD: https://itsadeliverything.com/declarative-vs-imperative-gherkin-scenarios-for-cucumber
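As a quick illustration of that distinction, here is a hypothetical sign-up scenario (not taken from this thread - the page, field names, and messages are invented): the imperative version spells out every UI interaction, while the declarative version describes the same behaviour in terms of intent and leaves the mechanics to the step definitions.

```gherkin
Feature: Account sign-up

  # Imperative style: every UI interaction is its own step,
  # so the scenario breaks whenever the UI details change.
  Scenario: Sign up with a valid email (imperative)
    Given I navigate to "/signup"
    When I type "alice@example.com" into the "Email" field
    And I type "S3cret-pass" into the "Password" field
    And I click the "Create account" button
    Then I should see the text "Welcome, Alice"

  # Declarative style: the same behaviour stated as intent,
  # leaving the click-by-click detail to the step definitions.
  Scenario: Sign up with a valid email (declarative)
    Given Alice is on the sign-up page
    When she signs up with a valid email address
    Then her account is created and she sees a welcome message
```

The declarative form is also the one that tends to survive UI changes, since only the step definitions need updating, not the scenario text.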

I agree with you that sometimes the task of creating test steps is arduous, but think of it like this: once the test steps are written, that application knowledge can be picked up by anyone needing to execute the scripts. Moreover, if at some point in the future a greater focus on regression or automation is needed, all the info is already there without needing to refactor or add anything.
My advice is to start basic: write the steps in full, then in time convert to other formats as required, or even automate. Only, and this is a long shot, …only if the software is evolving so fast that written scripts are out of date within the next few sprints, AND the QA resources do not rotate to maintain system knowledge, would I say "write only headers and objectives for your tests".

Hope that made sense…