How do you write your test cases?

Hello MOT,

I want to ask a question that is both simple and complex to answer: How do you write your test cases?

Why do I say it is simple?

Well, according to some definitions testers write test cases, and since they do it often it should be easy. Also, there is a ton of information online on how to write a test case and what it should contain, so again: easy and simple.

Why do I say it is complex?
Well, a PM once told me that tests should be written in such a way that a person coming in off the street could run them. That is not impossible to do, but it will take time, a lot of time. One has to find a way to explain concepts that are not really in the day-to-day dictionary (an API, for example).

I think they want something like this: When test steps are really clear - YouTube

What do you think? How do you do it?


Good to ask why? When my manager asked me to write test cases with too much detail, I challenged him: what value would I get from writing all that detail? His answer was the same as your manager's: "someone would execute them in my absence." OK, if that is the case, let me instead create application knowledge documentation covering setup, configuration, use cases, system architecture, and an error booklet with the reasons for those errors. That can be used by anyone: the support team and others who can benefit from it. Then use a mind map tool to document the tests.


TL;DR: We should know better than to require expected results in test case steps.


If anything is worth documenting in such detail, then it should IMO be documented as test automation.
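For instance, here is a minimal sketch of that idea in pytest style (the `login` helper and accounts are invented stand-ins for a real system under test): the detailed manual steps collapse into automated checks whose assertions are the expected results.

```python
# Minimal sketch: a detailed manual test case captured as automation instead.
# The system under test is faked here; in practice you would drive the real UI/API.

def login(username: str, password: str) -> bool:
    """Hypothetical stand-in for the application's login."""
    valid_accounts = {"alice": "s3cret"}
    return valid_accounts.get(username) == password

def test_login_with_valid_credentials():
    # Steps 1-3 of a manual script ("open page, enter username, enter password")
    # collapse into one call; the expected result becomes the assertion.
    assert login("alice", "s3cret") is True

def test_login_with_wrong_password():
    assert login("alice", "wrong") is False
```

Run under pytest, these document the behaviour and verify it on every build.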


I write them at the appropriate level for the audience that’s going to consume them.

If we’re going to be outsourcing our testing, I’ll be writing them closer to the “a person coming from the street should be able to run them” level of detail.

On my current team, which is heavily shifted-left, relies upon automation, etc., we have no explicit test cases.


I really liked the youtube video :stuck_out_tongue:

A lot of people write test cases like 'as an admin I want to be able to do A', but there are a lot of different ways to write them, and you should write them in a way your team understands.

Although I almost never see people write 'As a user I am not allowed to do X or Y', which might be overkill but is a valid scenario in some contexts.


Elaborate enough that a reasonably avid tester can rerun and understand them after an introduction to the test object. Since I am currently in Azure DevOps, I also try to keep test cases as brief as possible.


I’m not doing any at the moment, but I’ve been working on a blog post about test cases - I’ll share it as soon as it’s done.


Hi, great topic…I am documenting one for my team to follow…shall share with the community as well.


First off, tell your PM that they are wrong! As with any documentation, you need to keep it at the level of the reader. If you are actually taking people off the street to perform some task through written instructions, those instructions need to be at a level anyone off the street can follow. If you are writing test cases that will be used by other professional testers, they need to be at that level.

A common problem we face with test cases is that we want to use them for too many things at the same time: to write down our test ideas, to distribute the test problem to multiple people, to generate reports, to help onboard new people, to give to non-testers, to use for automation, and to serve as reproduction steps in a bug report. It is very hard to find the right level, and typically they will carry none of those responsibilities really well. Whereas if you say, "I want senior testers to use these to guide more junior testers on what to cover", a checklist or mind map will do that for you, and you can require the reader to have prior knowledge of both the product and test techniques.

Without understanding the purpose of your test cases it is hard to suggest the best format. But here are a few common important purposes and some inspiration for how to fulfil them.

  • Sharing test ideas. Read about commander's intent: Intent (military) - Wikipedia
    The principle behind this is that if you are too explicit with your instructions, you remove people's ability to use their intelligence and adapt to reality. This is why checklists are preferred to instructions: a checklist you can check off in any order, skipping steps that are not required, whereas instructions do not facilitate that. Testing charters are another example of communicating test ideas.
  • Audit trail, a.k.a. what have we actually tested. Start by documenting what you are doing, not what you intend to do, i.e. post facto. If you are then forced to write that up as test cases, write them after you have tested, not before. This eliminates all the waste of putting effort into something that never happened or became obsolete. It has two major advantages. Firstly, you will typically get a better audit trail: instead of a vague instruction like "Login with a valid user", you can record exactly which user you logged in with, since that is what you did. Secondly, it will have greater coverage, since for every finding you make you can create a test case if you want. But unless you actually need an audit trail for security / certification reasons, it is very expensive to do.
  • Onboarding of new testers. Guided learning is my preferred method: you give the person a mission and encourage exploration instead of having them follow instructions. For example: what are the password requirements for a user? How many reports are there? What are two ways to create a new user? More fun to write, more fun to answer, and it helps new testers get a deeper understanding quicker.
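The post-facto audit-trail idea above can be sketched as a tiny recorder (all names here are invented for illustration): you log what you actually did while testing, then render the session as a test case afterwards.

```python
# Sketch of post-facto test documentation: record actions as you perform them,
# then emit the audit trail as a test case after the session.

class SessionRecorder:
    def __init__(self, charter: str):
        self.charter = charter
        self.steps: list[str] = []

    def did(self, action: str) -> None:
        """Log what actually happened, e.g. the exact user you logged in with."""
        self.steps.append(action)

    def as_test_case(self) -> str:
        lines = [f"Test case: {self.charter}"]
        lines += [f"  {i}. {step}" for i, step in enumerate(self.steps, 1)]
        return "\n".join(lines)

rec = SessionRecorder("Login handling")
rec.did("Logged in with user alice (standard role)")
rec.did("Verified redirect to the inbox")
print(rec.as_test_case())
```

Because the steps are recorded after the fact, they describe real actions ("logged in with user alice") rather than vague intent ("log in with a valid user").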

Here is that blog post, I just finished it today:


So, time to be controversial.

Been doing this job for a while now and everyone has their ‘best’ and ‘common’ approach. Audience is the big one here.

  • For me - long gone are the extensively documented and scripted test cases, with many steps of click this, do that, check that this is displayed, etc. They do not serve as a guide to understanding the product; they only ‘check’ a specific scenario on a specific journey. As a “tester”, you should get stuck in to understand the products and how they behave - explore!

  • Exploratory testing, whilst not scripted, still needs a scenario to work with and boundaries to set (plus traceability so you can repeat what you have done). You will understand the inner workings, and so the system under test, far better, whilst also finding the obscure requirement failures / defects.

  • This gets us to ‘requirement specification’ testing, which I have successfully implemented again in my current role. We primarily do web and API testing of our products. We have no ‘full specification’ of those products (a historic issue), and so use a BDD / Cucumber approach AS THE TEST CASE! Sounds crazy, but trust me, it works.

This works because, when written correctly, everybody in the company understands what is under test (execs, managers, POs, dev and QA). When running a test, you are not following a single journey route (who does? Certainly not any customers). Best of all, maintenance is low, understanding is high, and coverage stays as high as you like!
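The Cucumber idea can be sketched in a few lines of Python (real teams would use behave or pytest-bdd; the step names and scenario here are invented): plain-language steps that everyone can read, bound to code that executes them.

```python
# Minimal sketch of the BDD idea: plain-language steps bound to executable code.
# The scenario text and step functions are illustrative, not a real feature file.

steps = {}

def step(pattern):
    """Register a function as the implementation of a plain-language step."""
    def register(fn):
        steps[pattern] = fn
        return fn
    return register

@step("a registered user")
def given_user(ctx):
    ctx["users"] = {"alice": "s3cret"}  # fake in-memory system under test

@step("they log in with the correct password")
def when_login(ctx):
    ctx["ok"] = ctx["users"].get("alice") == "s3cret"

@step("they are logged in")
def then_logged_in(ctx):
    assert ctx["ok"]

def run_scenario(lines):
    """Execute a scenario written as plain-language step lines."""
    ctx = {}
    for line in lines:
        steps[line](ctx)
    return ctx

scenario = ["a registered user",
            "they log in with the correct password",
            "they are logged in"]
run_scenario(scenario)
```

The scenario list is the readable test case; the step functions carry the automation, so the same text serves the exec and the test runner.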

However you test - it has to be what works best for “The team”… And the level of understanding and experience… Find your own way.


@clittlefair I like how you think. For me, exploratory is the way to go; scripted test cases are not the most economical way for testers to spend their time. This post was intended for junior people who can’t avoid test cases, which is especially true for companies in my area that all work for external clients - usually old-school enterprises who unfortunately equate quality with the number of test cases designed and the number of bugs reported.


Writing test cases varies depending on what the test case is measuring or testing. This is also a situation where sharing test assets across dev and test teams can accelerate software testing. But it all starts with knowing how to write a test case effectively and efficiently.

Test cases have a few integral parts that should always be present as fields. Every test case can be broken down into 8 basic steps.

Step 1: Test Case ID

Test cases should all bear unique IDs to represent them. In most cases, following a convention for this naming ID helps with organization, clarity, and understanding.

Step 2: Test Description

This description should detail what unit, feature, or function is being tested or what is being verified.

Step 3: Assumptions and Pre-Conditions

This entails any conditions to be met before test case execution. One example would be requiring a valid Outlook account for a login.

Step 4: Test Data

This relates to the variables and their values in the test case. In the example of an email login, it would be the username and password for the account.

Step 5: Steps to be Executed

These should be easily repeatable steps as executed from the end user’s perspective. For instance, a test case for logging into an email server might include these steps:

  1. Open the email server web page.
  2. Enter the username.
  3. Enter the password.
  4. Click the “Enter” or “Login” button.

Step 6: Expected Result

This indicates the result expected after the test case step execution. Upon entering the right login information, the expected result would be a successful login.

Step 7: Actual Result and Post-Conditions

By comparing the actual result to the expected result, we can determine the status of the test case. In the case of the email login, the user would either be successfully logged in or not. The post-condition is what happens as a result of the step execution, such as being redirected to the email inbox.

Step 8: Pass/Fail

Determining the pass/fail status depends on how the expected result and the actual result compare to each other.

Same result = Pass
Different results = Fail
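The eight parts above can be sketched as a simple data structure (a minimal illustration; field names and the sample values are invented), with the pass/fail status computed by comparing expected and actual results.

```python
# Sketch: the eight parts of a test case as a data structure.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str              # Step 1: unique test case ID
    description: str          # Step 2: what is being verified
    preconditions: list       # Step 3: assumptions / pre-conditions
    test_data: dict           # Step 4: variables and their values
    steps: list               # Step 5: steps to be executed
    expected_result: str      # Step 6: expected result
    actual_result: str = ""   # Step 7: filled in after execution

    @property
    def status(self) -> str:  # Step 8: pass/fail by comparison
        return "Pass" if self.actual_result == self.expected_result else "Fail"

tc = TestCase(
    case_id="TC-LOGIN-001",
    description="Valid user can log in to the email server",
    preconditions=["A valid Outlook account exists"],
    test_data={"username": "alice@example.com", "password": "s3cret"},
    steps=["Open email server web page", "Enter username",
           "Enter password", "Click the Login button"],
    expected_result="User is redirected to the inbox",
)
tc.actual_result = "User is redirected to the inbox"
print(tc.case_id, tc.status)
```

Same result gives "Pass"; any difference between `actual_result` and `expected_result` gives "Fail".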