Does anyone do any practical interview tests?

We are hiring developers, and we give them practical coding tests so we can see how they approach coding. We then ask questions afterwards to cover other areas, so it's a good mix.

When I do tester interviews, I give them a user story with a number of ‘holes’ in it, and ask how they would test it. I want to see if they read it well enough to ask about functionality that isn’t well outlined, suggest improvements, and explain how they would test the story.

I started to wonder if there was a better way to do this - and I am not thinking about a technical coding test to see if they know C# (I could do that), but I am interested in their approach to testing generally. It could be sitting them in front of a real or made-up web page and asking them to test it.

Does anyone do a test like this? If so, what do you ask the candidate to test, how much info do you give them, how long do they get and do you find it useful?



Hi Steve, have you seen this thread: Technical tests for testers? There are a few people on there talking about technical tests for testers as part of the interview process. There might be something of use.


I give them a simple system, either a simple piece of software or a theoretical idea of a piece of software (usually their choice), and ask them to look for problems in it. I state that I encourage them to ask any questions they like. While they look for problems I ask questions about what they’re doing. I ask “why are you doing that?” or “what are you looking for?”. When I get a reply I follow up with something like “Oh, I see, why does that matter?” or “why would that help you find a problem?”. I try to be nice about it. If they get into a rut I suggest another idea or ask a question that suggests there’s more to look at.

The idea is to find out:

  1. Can the candidate explain what they’re doing? I’m looking for test framing, awareness of their own processes, and ability to make some of the tacit processes in testing explicit in a way I can understand.
  2. Does the candidate ask questions? I’m happy to point out where the semi-hidden log files are and the purpose/customer for the software if asked. If I get none of these questions I’ll prompt them with a suggestion such as “I guess it depends who the user is? I’m happy to answer questions on that.”
  3. Are the candidate’s questions any good? I’ll ask questions about the questions. So “are there any log files?” might be answered with “yeah, do you need log files? Why do you need log files?” or “where is the software used?” might be answered with “do you think it makes a difference?”. I’m trying to make them defend their questions - they can’t just ask any old thing and think I’ll assume they have a good reason for asking.

I’m not interested, really, in how many bugs they find, but I’ll direct them to at least one if they don’t find any, because I want to see how they investigate them. I want to see them remove any unnecessary steps, explore how else it might occur, evaluate risk, and decide how they might report the issue, or even whether it’s worth reporting.

A good tester testing and a poor tester testing look exactly the same, so getting them to test and explain why what they’re doing is professional software testing rather than playing without particular testing skill or knowledge is a good way to see what their internal structure might look like.


Hey Chris,

I like this. From what I can see, that’s how our company does it too. Dice game, Triangle generator game,…
One thing you said triggered something though.

“A good tester testing and a poor tester testing look exactly the same, so getting them to test and explain why what they’re doing is professional software testing”.

I’m an advocate for testers to develop their “explain how you test” and “use the right words for the right things”, hence the TestSphere cards and workshop. What I’ve noticed is that we’re exceptionally bad at this.

  • Either we don’t know how to explain ourselves and we fail at bringing the message across.
  • Or we’re pretty good at it and use words the other person might not understand or misinterpret.
  • or…

I only recently started coaching a few people for interviews, and my view is that it almost always needs case-by-case adjustment to what people will want to hear in a given interview.
Do they expect your answers to be “I have a process-heavy focus”, “I want to bring value quickly”, or “I prevent bugs”?
It’s damn hard for people who aren’t good at interviews to know what language to use, which questions to pose, or how the tasks/games should go.

I realize this has gone rather off topic…
My concluding question would be:
If explainability and interviewing are different skills from testing, but good testing is hard to distinguish from bad:
Could we come up with a technical, practical test that only measures testing skill (at its core)?

My first reflex is that it’s probably impossible to do well, but I’m not sure.


Well first I should add, if it wasn’t obvious, that my expectations have to match the person I’m interviewing. If they use different words but can make me understand them, I’m okay with that, and if they’re more “senior” then I expect them to be able to explain themselves better. I need to be a great interviewer to get the best out of an interviewee especially if they are nervous. I find that people testing at interview get stuck into a focused state, and reminding them to defocus in the wrong way can cause them to be even more nervous, and I have to take all of that into account.

I also have to be careful about what a practical interview is not. I’m looking to see their internal testing structure, what they know about what they know, and that they know about their skills or what information and tools they’re missing. I like to see their approach. I’m not going to find out answers to specific questions without asking specific questions, so if someone doesn’t mention an answer I need I’ll have to ask that later in the interview.

Testing is huge, we know this. So to evaluate it all in an hour or three is impossible. I don’t think this is exclusive to testing. I’ve seen and been on both sides of many interviews, some of which were virtually useless at evaluating a candidate. So we do the best we can with what we have.

To give you some idea of the depth of complexity I’m talking about, here’s Kaner’s writing on recruitment, which is the closest thing I know to required reading before interviewing a tester:


Thanks Aine

I did see this, but it’s a coding test covering automation, and whilst it’s useful, it doesn’t cover the testing approach. We seem to be really good at creating tests to see how someone would automate a scenario, but not as good at evaluating how they create the scenarios in the first place. That’s the missing piece for me. You can’t automate what you haven’t defined. There are some good ideas there to review, though.



Hi all,

As a former recruiter and someone who has recently been through a number of interviews, I have some opinions about this…

  1. The role / seniority of the person you’re looking to hire should definitely be a factor in the kind of interview you conduct - generally, we expect senior colleagues to have more experience and solid knowledge than junior ones
  2. Understanding how the person thinks and approaches testing is more important than the “results” of the task, or if they can put a name to the techniques they use, in my opinion

Having said that, I also think it’s important to ensure that candidates in senior roles have a good foundation of knowledge and understanding to be able to guide and lead junior colleagues, even if you decide that the senior person doesn’t need to be a hard-core tester (perhaps in more of a management or consulting role).

In a recent interview I attended, they didn’t ask me to complete any tasks at all, but I’d viewed their site and tested the service beforehand and found several bugs / issues that I discussed with them in the interview. I also downloaded the app on mobile and read many reviews on the Play Store. They shared that they were unaware of at least two of the bugs I raised, which I hoped would show them what I would add to the team in a practical situation; what risks I might uncover that others hadn’t. An advantage of not asking me to complete any tasks is that I could show them how I took the initiative and what steps I took during my own, self-created, task.

Another interview I attended covered a lot more. They asked me to test their application on site (it involved exchanging money, so I didn’t look at it beforehand) and I made sure to “think out loud” throughout - something I’m really big on when it comes to understanding how someone tests. Again, I identified a number of issues they had not (including a potential security issue) and they also got to see me work out what different things were on the actual system I’d be testing, as opposed to something made up and potentially less relevant. They also asked me to name a few testers I admire, which I really liked. This lets you know if a candidate is interested in testing outside of their immediate working environment, in my opinion, and potentially some of the ideas they subscribe to.

To try and answer your questions, Steve, I would suggest that you still have testers test a user story (maybe do this as a three amigos exercise, which I did at another interview (I get around :P)) as well as actual software, as these are different skills and some testers don’t even think of testing beyond the SUT (shift left is still unheard of to some people). I would always recommend the “think out loud” technique, and so would discourage tasks that candidates complete at home or by themselves / in silence.

In the user story test, it sounds like you’re doing the right thing in terms of the amount of information and “holes”. In the software test, I would give them as little information as possible; probably just the context of the page they’re on / part of the workflow they’re starting in, or letting them know not to click a particular button that goes to the next stage, so as to keep it to a specific page / area / feature. Personally, I’d look for someone who doesn’t need requirements to start testing, which is why I’d keep information to a minimum. Of course, they can still ask questions, but they might not realise that, so I’d look out for how they react when they don’t know about something. Chris has mentioned some great questions he asks during this kind of task.

As to how long they get, I wouldn’t leave them in a room alone to “complete” the task. For me, it’s not about “finishing” it (when is testing ever finished?), so I’d just block out time in the interview for the exercise and get a feeling of when I have enough information to move on to something else.

This response is longer than I intended… I think you’ve inspired me to write a blog about various testing interview techniques :slight_smile: I hope this was useful to you in some way. Feel free to follow up with questions.



Thanks Cassandra

I like the ‘think out loud’ scenario, and I may well ask a candidate to draw up on a whiteboard what they will test and explain as they go. It then covers presentation skills as well as testing ability, so there are dual benefits. Although leaving someone to do a test has worked reasonably well in the past, there’s been a gap where I have to ask them afterwards what they were thinking. This way we can have an ongoing dialogue. Anything is worth trying, especially if it leads to better results :grinning:



You don’t necessarily need to make it either/or - why not do both, if you have the time?

Here’s the blog post I wrote, inspired by this discussion:

It turned out a bit differently to what I expected, but hopefully it’s useful to someone!



When I was hired for my first testing job, I was given a page with some deliberate bugs and typos added in, and asked to write bug reports based on what I found. I think it was quite a good exercise, especially as I managed to find an actual bug that hadn’t been put in deliberately!

When hiring my replacement for the same role, I asked interviewees to spend half an hour testing our recently relaunched company site and write up some bug reports for issues that they found. They were given a selection of browsers and devices that they could use. After the session finished, I did a debrief where they talked me through the bugs they documented (as well as those they didn’t have time to write down), and I also got them to justify their choice of browsers and devices to check that they were thinking critically about prioritising their testing.

Overall, I think both formats were a good way of getting some extra insight about a tester’s capabilities and mindset, alongside more traditional face-to-face interviews.