How do you usually approach testing when the requirements are vague or incomplete?

When @andres_gr mentioned dealing with vague requirements, it really got me thinking.
Curious to know: do you lean more on intuition, product knowledge, user behavior… or something else entirely?

7 Likes

Great question! I see this far too often when I coach people: there is no documentation and no requirements, and people tend to go by intuition. Sometimes you almost have to…

But I strongly recommend not doing it. At most, note down the questions your intuition raises so you can ask them later, but never use intuition to assume that's how the product works.

I recommend asking questions until you get answers, because without requirements you cannot check or validate the product. There will always be a PO or analyst who knows the answer, and if not, it's their job to note down how it should work. Never assume, "because it will make an ass out of u and me".

So if a requirement is vague, ask questions to clarify.

9 Likes

From my own experience, I would say incomplete requirements will always hinder the successful delivery of a project.
Back in 2022 I was working as a software tester on an application that had a Stripe payment gateway integration with real-time payment status updates for payments and refunds.
The project started on a good note, but later the client realised that the documentation had not been properly maintained. Taking advantage of that, they started requesting so many requirement changes that the subsequent sprints became a complete mess.

Not all clients are the same, but our bad luck was that the project was already lengthy, and on top of that we had a client with that mindset, which left the project in chaos.
The worst part is that if documentation is not maintained and the project turns out to be a failure, the BA or PM is usually the first to face the consequences, followed by the rest of the team.

From my perspective, clarity on requirements is essential, and taking sign-off on them is necessary; otherwise it will be difficult to decide the exit criteria for the sprints.

In some organisations there is no BA or PM and QA has to handle it. Even then the same approach should be preferred: requirements should be clarified, otherwise we cannot visualise the product we are about to test and deliver.

4 Likes

Proceeding without clarification is dangerous, but so is waiting endlessly for clarity. When requirements are vague, the trick is to acknowledge the risk upfront and communicate it effectively: document it as a known risk and then proceed. Agile principles value responding to change over following a plan. Requirements evolve, and stakeholders often don't know what they want until they see something tangible. Waiting for perfect requirements can lead to analysis paralysis, especially for complex systems. Deadlines are real, too; sometimes teams can't afford to wait until everything is perfectly documented, and development often needs to start with partial clarity to meet release timelines. But as QA, we can be the ones who drive clarity by identifying requirement gaps and having those discussions before testing begins.

3 Likes

Let's build more context on this. We can fall into several scenarios:
There are no requirements at all, or maybe there are and we don't know of them.
There are some explicit requirements, or only explicit requirements.
There are implicit requirements we gathered over time (in the business domain, the team, from stakeholders, clients, managers), and others we haven't.
There are multiple sets/versions/changes of requirements in multiple places, some conflicting.
We have things we know about and gathered which might not be called requirements, but are some sort of oracles: comparable products/features, history of the product, other products built internally, demands of stakeholders, tickets, emails, meeting notes, solution designs, claims of different departments like sales, marketing, support.
And surely there are others we don't know; some we're aware we don't know, others we have no idea about.

Learning and questioning what you don't know is a simple way to gather some sort of oracles to help your testing. A tester raises a problem they noticed, and sometimes it turns out to be a gap in their own understanding, or a feature, or a trade-off, or a niche thing not worth mentioning. And this is fine. One should want to reduce unnecessary trouble and loss of credibility.

2 Likes

I start by asking the product owner what their expected behavior is. Then talk to the dev and confirm that was their approach and get additional feedback. I update the story with the details from these conversations and then bring it up in the retrospective to try to avoid this happening again. If there is a discrepancy between the dev and PO, we talk about it in a parking lot after the daily standup where everyone can weigh in.

2 Likes

When there are no requirements, I explore. Even if I have clear requirements, I still explore — it's how I "test" them too. I assume the basics, outline a bunch of questions, and use my findings to start building the requirements collaboratively with stakeholders.

3 Likes

I like @parwalrahul's comparable definitions and examples to help share thoughts on this question:

Note, if you're a MoT Professional Member, you can add your own definitions to the MoT Software Testing Glossary. Always good to get a variety of perspectives.

2 Likes

I can think of a couple of situations where requirements were both implicit and tacit.

Situation 1

I ran a 45-minute time-boxed exploratory testing session on a web app I'd used for the first time. My goal was to explore and discover any useful information I could. I wrote down notes based on what I was thinking and observing. As it was a bit of a sweep, I decided not to focus on discovering problems and instead focused on writing down as many questions as possible. Things like "What should this do?" and "What should I expect to see?" That kind of thing. After a debrief with a developer, I had answers to most of my questions, so things became a little bit clearer.

Situation 2

Running a short, time-boxed exploratory testing session over a requirements document. The document was thin on information, so I would question every sentence I read, asking myself: How might this be interpreted? Does this make sense? What information might be missing here? Are there vague words that could be interpreted in many ways? I'd note down these observations and turn them into questions to ask the author of the document.

2 Likes

Really like the way you broke it down. That "never use intuition to assume the product works like that" really stuck with me. I have definitely leaned on intuition in fast-paced situations, but I am learning that asking the right questions early saves a lot of back and forth later.

Appreciate your perspective, and the reminder!

2 Likes

Thanks for sharing your experience, Ujjwal. That real-life context and the shift in requirements really hit home; it's something many of us can relate to but don't always speak about openly. Totally agree that unclear documentation can derail even the most well-planned sprints.

Your point about getting sign-offs and defining exit criteria early on is a strong takeaway. Definitely bookmarking this.

1 Like

Your point about balancing endless waiting against moving forward with partial clarity resonates a lot. As testers, we often have to be the ones to spark those conversations and bring just enough clarity to keep things moving. Waiting for a perfectly polished spec just isn't practical most of the time.

1 Like

You brought up such a valuable perspective: requirements can exist in so many forms and places, some we don't even realize at first. That idea of learning to recognize oracles beyond traditional documentation is powerful. It's a reminder that testing is not just validation; it's also exploration and uncovering what's missing.

1 Like

Really like this! Involving both the dev and the PO to align expectations early is such a solid practice. Love how you loop back in retros to prevent future misalignment.

1 Like

I liked how you turn the absence of requirements into an opportunity to explore and collaborate. That mindset of using questions to build the requirements is so powerful. It shows that testing is not just reactive but also creative and problem-solving.

2 Likes

The time-boxed exploratory sessions and the focus on formulating questions instead of just finding bugs is such a smart shift, especially with vague or incomplete docs.
The ability to turn ambiguity into clarity is something we can all learn from.
Appreciate the detailed breakdown!

Thank you @parwalrahul for the definitions.

2 Likes

I can do one better! :rofl: I've been teaching this to all my coachees, analysts, and developers.
Whenever you ask a question and they answer with "It should be this or that", or something similar like "Normally it should be X or Y"…

The "should" and "normally" are trigger words for their assumptions and their feelings, which is something you don't want. So challenge them: say, "I'm not happy with 'should'; can you find out for real before I continue?"

This works REALLY well in refinements. At one point during a refinement at my client, I asked a question and he literally said (props to him): "I was going to say 'it should be this', but then I realized that's not good enough", and he noted it down as a follow-up.

I went back to him after the meeting and complimented him, and he said: "Thanks. During my analysis and while writing the stories I said 'should' so many times. This has made my analysis better, because now I can easily eliminate the 'should' and clear the user stories of assumptions for you guys."
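The trigger-word idea can even be turned into a quick automated sweep over user-story text before refinement. Here is a minimal sketch in Python; the function name, word list, and sample story are all illustrative, not from any real tool, and the list should be extended with whatever hedging phrases your own team uses:

```python
import re

# Illustrative list of words that tend to signal assumptions rather than
# confirmed behaviour; extend with your team's own phrasing.
TRIGGER_WORDS = ["should", "normally", "usually", "probably"]

def find_assumptions(story_text):
    """Return (line_number, trigger_word, line) for each hit in the story."""
    hits = []
    for lineno, line in enumerate(story_text.splitlines(), start=1):
        for word in TRIGGER_WORDS:
            if re.search(rf"\b{re.escape(word)}\b", line, re.IGNORECASE):
                hits.append((lineno, word, line.strip()))
    return hits

story = """As a shopper, I can pay by card.
The payment status should update in real time.
Normally a refund completes within minutes."""

for lineno, word, line in find_assumptions(story):
    print(f"line {lineno}: '{word}' -> {line}")
```

Each flagged line is a candidate question for the PO or analyst, not a defect: the point is to surface the assumption, then go and "find out for real".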

2 Likes

Helping to deal with that vagueness is a large part of a tester's job. I would say you are there to point out the vague parts and help clarify them.

It's been years since I last worked with requirements thrown over the wall. In my current team, we meet with the Product Manager once a week. He maintains a list of high-level priorities, and we regularly discuss the details of the items we are currently working on. He has a general idea of the direction we are heading and the problems he wants us to solve, but the specifics are ironed out iteratively over multiple conversations.

Back when I worked with requirements documents, I would read them many times from different angles. What exactly are we being asked for? I would write down the test ideas that came to mind and check whether the requirements doc was an accurate oracle for them. More often than not, that alone would surface underspecified areas. Sometimes it would also surface internal inconsistencies or contradictions.

If a requirement covers a change to an existing system, I would also try to imagine how the new thing fits into existing processes and functionality. These interaction points would often be underspecified.

I don't think I often had time for this, but I would also try to review existing tests in light of the new requirements. Which are obsolete? How do they need to change? Do any of them cover something that seems to go against the new requirement?

2 Likes

These are some good collections of checklists and heuristics that I use to guide me in such situations:

2 Likes

Alright, many things can be done, which include but are not limited to:

  1. Starting off by trying things such as "ways the software is not supposed to work",
    e.g. unhelpful error messages, damaged text
  2. Identifying parts of the system and creating models of how they interact; the model can be refined going forward
  3. If the application under test has been tested before, a previous test report can tell you which test cases passed (features) and which failed (not features), assuming all the results are true positives
  4. Casual interaction with people, whether previously or in the current scenario, about how they use their systems can throw up some insights
1 Like