Testing without requirements?

We’ve all been there at one point in our careers. You’re sitting there thinking about what you’re going to do because you’ve got no specifications, requirements or use cases but you’ve been asked/told/expected to test.

It can be daunting for some. For me, it was when I was very new to the world of testing, so I felt like a deer in headlights, thinking “What do I do? I can’t look idle; there has to be something, anything, I can work on to provide value.”

With the benefit of hindsight that many of us now have, what would you say to someone in this situation? What can they do? Where should they start?


Look through the product documentation.
Is there a build document? If so, you can test that the application has been built/configured correctly.
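If the build document lists expected settings, those checks can be partly automated. Here is a minimal sketch; the config keys and values are hypothetical stand-ins for whatever the build document actually specifies, and in practice the deployed config would be loaded from the real environment rather than inlined:

```python
# Hypothetical expected values, copied from the build/configuration document.
EXPECTED = {
    "log_level": "INFO",
    "max_connections": 100,
    "feature_flags": {"new_checkout": True},
}

def check_config(actual, expected, path=""):
    """Compare a deployed config dict against the documented one; return mismatches."""
    problems = []
    for key, want in expected.items():
        where = f"{path}.{key}" if path else key
        if key not in actual:
            problems.append(f"missing setting: {where}")
        elif isinstance(want, dict) and isinstance(actual[key], dict):
            problems.extend(check_config(actual[key], want, where))
        elif actual[key] != want:
            problems.append(f"{where}: expected {want!r}, got {actual[key]!r}")
    return problems

# In practice 'deployed' would come from the real environment
# (e.g. loaded with json.load); it is inlined here for illustration.
deployed = {"log_level": "DEBUG", "max_connections": 100,
            "feature_flags": {"new_checkout": True}}
print(check_config(deployed, EXPECTED))  # -> ["log_level: expected 'INFO', got 'DEBUG'"]
```

A report of mismatches like this is easy to attach to a bug ticket or share with whoever owns the build document.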

Are there security requirements your organisation follows? Test to see if the application meets those requirements.

Go and talk to the people involved. What can they tell you about it? Does everyone’s understanding match, and does the application do as expected?

  1. Most projects nowadays involve sprints. A good place to start would be the Jira items. There should be acceptance criteria which testers can use to start testing.
  2. In other instances, like an insurance platform, there should be training manuals for onboarding new hires. They may not be up to date or complete, but they’re a good place to start.
  3. In the case of a public-facing web app that only provides information to the general public, it would be good to find out from the developers what technology was used to create the app, e.g. was it Angular? Also, ask the devs about the APIs, which really means asking what the functionalities are. It would help to have some web dev background to understand what’s happening under the hood. Many web apps have the following functionality: search, log in, pull data from the server, put data into the server. Another thing to test would be performance.
  4. Know the difference between user experience and functionality. Some feedback would enhance the user experience but may not necessarily be a bug; a bug is when the app isn’t doing what it’s supposed to do.
  5. Last but not least, explore and provide feedback - the good, the bad, the ugly. I find testing can also simply be feedback about what the product is, what’s nice, what’s great, what’s not nice, and whether you think it’s doing what it’s supposed to do, etc.
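The common web-app functions listed above (search, log in, read data, write data) can be captured as a table-driven smoke checklist even before any requirements exist. A minimal sketch, where the endpoints and expected status codes are assumptions, and `send` stands in for whatever HTTP client the team prefers (a fake responder is used here so the sketch runs offline):

```python
# Hypothetical smoke checks: (name, HTTP method, path, expected status).
CHECKS = [
    ("search returns results", "GET", "/api/search?q=test", 200),
    ("login rejects bad password", "POST", "/api/login", 401),
    ("read an existing record", "GET", "/api/records/1", 200),
    ("write a new record", "POST", "/api/records", 201),
]

def run_checks(send, checks=CHECKS):
    """send(method, path) -> status code. Returns descriptions of failed checks."""
    failures = []
    for name, method, path, expected in checks:
        status = send(method, path)
        if status != expected:
            failures.append(f"{name}: expected {expected}, got {status}")
    return failures

# Stand-in for a real HTTP client (e.g. urllib.request) so the sketch runs offline.
def fake_send(method, path):
    return 401 if path == "/api/login" else (201 if method == "POST" else 200)

print(run_checks(fake_send))  # -> [] when every check matches
```

The point of the table is that it doubles as lightweight documentation: each row records something you learned about the app by asking the devs or exploring it.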

good luck


A long time ago I read an article about guided learning that I took to heart, and it applies well in this situation. The tl;dr: compare what you learn from “sit 4 hours with the product” vs. “find all 5 places where your avatar shows”. The latter is the guided part, and it was shown to be more efficient for learning. So here is a general guide (which is never as good as a specific guide).

Testing tours to the rescue! Start with the Feature tour: your mission is to identify all the features of the product. The second is the Claims tour: what does the product claim it can do, and can it? With your newly acquired knowledge, end your mission with a Data tour: pick some important data and follow it through the product.

Now you should know the major features of the product and some of how different parts are connected.


It might not be ideal, but it happens.

When I have to test without requirements, I usually follow this rather loose process:

  • If I don’t know the app/feature, I start by exploring it: figure out from the application what it does and, if I can access a database, look at what data it saves where.
  • Go through the app/feature and document what it does. Look for anything that doesn’t make sense, obvious errors, etc. If I can’t work out from context what should happen, note what does happen.
  • After the first documentation/exploration/testing pass is done, talk with the developer(s) about what should be happening - usually with questions like “When I do X, Y happens, which seems a bit odd to me. Is that what should be happening?”
  • If I don’t have a clearer idea of what is expected by the second pass through the application, I start checking for things like keyboard navigation - are shortcuts defined, does the tab order make sense, etc. - and layout, spelling, and usability issues.

Generally speaking, if there are no other requirements available, the application UI itself can stand in as the requirement. There’s a lot more detective work than when you do have some form of requirements documentation, but there’s still a lot of information that can be gleaned from the application.
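Treating the current behaviour as the de facto requirement is sometimes called characterization (or golden-master) testing: record what the app does today, then alert on any change. A minimal sketch, where `legacy_discount` is a hypothetical stand-in for a real, undocumented piece of application behaviour:

```python
def legacy_discount(order_total):
    # Hypothetical behaviour discovered by exploration, not taken from a spec.
    if order_total >= 100:
        return round(order_total * 0.9, 2)
    return order_total

# Golden values captured on a first exploratory pass; any that look odd
# become questions for the developers rather than assumed bugs.
GOLDEN = {50: 50, 99.99: 99.99, 100: 90.0, 250: 225.0}

def characterize(fn, golden):
    """Return inputs whose current output no longer matches the recorded one."""
    return {x: (want, fn(x)) for x, want in golden.items() if fn(x) != want}

print(characterize(legacy_discount, GOLDEN))  # -> {} while behaviour is unchanged
```

This doesn’t tell you whether the behaviour is *correct*, only whether it has *changed* - which is often exactly the information you can offer when no requirements exist.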


I always tell my testers in their domain teams not to be afraid to push back and say “I don’t have any requirements to test against.” Otherwise it can drive bad behaviour and lead to poor quality. Quite often you see stories that are very light on information and have no acceptance criteria.


This reminds me of a great anecdote I heard somewhere. Once, during a war, a soldier comes rushing to the officer: “Sir, bad news, the enemy has surrounded us.” The officer looks up and says, “Jolly good, now we can attack in any direction!”

That is to say, when there are no requirements and no specifications, I would rather go exploratory and try to bring the whole thing down…


No requirements? No specs? No use cases? No problem!

What you need in this case is to create a solid Test Plan for your future testing.

Put this strategy into a well-defined document containing all the details you consider important to cover, along with your knowledge of and relationship to the project and the customer - and you have both a blueprint for future testing and an official paper to show your management.

There are many different standards for how to create/fill in a good Software Test Plan document (or you can invent your own). I recommend going through this article, which both describes what should be written in the doc and has some nice templates to use.


Testing without requirements happens more often than people realise… packaged as agile at times…
I’d say ask questions… And then ask some more… The more interaction and feedback you get, the more you can refine your approach…


I guess if a project makes it to the testing stage and the test team complains that there are no requirements, somebody has been sleeping. In any delivery methodology or framework, there’s a moment the QA team needs to get involved, and it shouldn’t be the moment testing starts. There must have been some information used in the stages prior to testing (i.e. development), which may not have been documented or might not carry the label of requirements (see some suggestions above). I suggest the test team gets involved earlier in the process to recognise those points. Otherwise, fall back on experience and skills.

I would ask them if they are afraid to learn or willing to learn, as there are plenty of resources available.
They would have to start by understanding some basics of testing.

  • Would explorers stop exploring if they didn’t have a map, or some specification of where exactly to go and what exactly to find?
  • Would doctors stop investigating and experimenting when people with strange symptoms go to them and they don’t immediately know what’s happening?
  • Would a psychologist stop analyzing someone, and trying to identify and help them identify their problems and solutions, and just say: “I don’t understand you, what you want, what you need…”?
  • Would a food critic stop critiquing a restaurant because they weren’t given a description of everything about the restaurant and its food beforehand?
  • Would a developer stop developing if they don’t have all specifications?

Everyone explores, investigates, analyzes, questions, observes, infers, models, etc., empirically. And so do testers.

Good reads: