Does anyone do any practical interview tests?

We are hiring developers, and we give them practical coding tests so we can see how they approach coding. We then ask questions afterwards to cover other areas, so it's a good mix.

When I do tester interviews, I give them a user story with a number of 'holes' in it, and ask how they would test it. I want to see if they read it well enough to ask about functionality which isn't well outlined, offer suggestions for improvement, and explain how they would test the story.

I started to wonder if there was a better way to do this - and I am not thinking about a technical coding test to see if they know C# (I could do that), but I am interested in their approach to testing generally. It could be sitting them in front of a real or made-up web page and asking them to test it.

Does anyone do a test like this? If so, what do you ask the candidate to test, how much info do you give them, how long do they get and do you find it useful?

Thanks!


Hi Steve, have you seen this thread: Technical tests for testers? There are a few people on there talking about technical tests for testers as part of the interview process. There might be something of use.


I give them a simple system, either a simple piece of software or a theoretical idea of a piece of software (usually their choice), and ask them to look for problems in it. I state that I encourage them to ask any questions they like. While they look for problems I ask questions about what they're doing. I ask "why are you doing that?" or "what are you looking for?". When I get a reply I follow up with something like "Oh, I see, why does that matter?" or "why would that help you find a problem?". I try to be nice about it. If they get into a rut I suggest another idea or ask a question that suggests there's more to look at.

The idea is to find out:

  1. Can the candidate explain what they're doing? I'm looking for test framing, awareness of their own processes, and ability to make some of the tacit processes in testing explicit in a way I can understand.
  2. Does the candidate ask questions? I'm happy to point out where the semi-hidden log files are and the purpose/customer for the software if asked. If I get none of these questions I'll prompt them with a suggestion such as "I guess it depends who the user is? I'm happy to answer questions on that."
  3. Are the candidate's questions any good? I'll ask questions about the questions. So "are there any log files?" might be answered with "yeah, do you need log files? Why do you need log files?" or "where is the software used?" might be answered with "do you think it makes a difference?". I'm trying to make them defend their questions; they can't just ask any old thing and think I'll assume they have a good reason for asking.

I'm not interested, really, in how many bugs they find, but I'll direct them to at least one if they don't find any because I want to see how they investigate them. I want to see them remove any unnecessary steps, see how else it might occur, evaluate risk, and decide how they might report the issue, or even if it's worth reporting.

A good tester testing and a poor tester testing look exactly the same, so getting them to test and explain why what they're doing is professional software testing, rather than playing without particular testing skill or knowledge, is a good way to see what their internal structure might look like.
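As a concrete illustration, the "simple system" in exercises like this is often something as small as a triangle classifier (a classic example from Myers; a "triangle generator game" also comes up later in this thread) with one or two deliberately seeded bugs for the candidate to find and investigate. A minimal sketch, entirely my own assumption of what such an exercise could look like:

```python
# A tiny "system under test" for a practical testing interview:
# classify a triangle from three side lengths. One bug is seeded
# deliberately (marked below) for the candidate to discover.

def classify_triangle(a, b, c):
    """Return 'equilateral', 'isosceles', 'scalene', or 'invalid'."""
    sides = sorted([a, b, c])
    # Reject non-positive sides and violations of the triangle inequality.
    # SEEDED BUG: the check uses < instead of <=, so degenerate "triangles"
    # such as (1, 2, 3), whose two short sides only just reach the longest,
    # are wrongly accepted as valid.
    if sides[0] <= 0 or sides[0] + sides[1] < sides[2]:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

if __name__ == "__main__":
    print(classify_triangle(3, 3, 3))   # equilateral
    print(classify_triangle(3, 4, 5))   # scalene
    print(classify_triangle(1, 2, 3))   # degenerate case: exposes the seeded bug
```

The point of the exercise isn't the code itself but the conversation around it: whether the candidate probes boundaries (zeros, negatives, degenerate triangles, non-numeric input) and can explain why each probe matters.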


Hey Chris,

I like this. From what I can see, that's how our company does it too: a dice game, a triangle generator game, and so on.
One thing you said triggered something, though.

"A good tester testing and a poor tester testing look exactly the same, so getting them to test and explain why what they're doing is professional software testing".

I'm an advocate for testers developing their ability to "explain how you test" and to "use the right words for the right things"; hence the TestSphere cards and workshop. What I've noticed is that we're exceptionally bad at this.

  • Either we don't know how to explain ourselves, and we fail to get the message across.
  • Or we're pretty good at it, but use words the other person might not understand or might misinterpret.
  • Or…

I only recently started coaching a few people for interviews, and my view is that it almost always needs case-by-case adjustment to what people will want to hear in the interviews.
Do they expect your answers to be "I have a process-heavy focus", "I want to bring value quickly", or "I prevent bugs"?
It's damn hard for people who aren't good at interviews to know what language to use, which questions to pose, or how the tasks/games should go.

I realize this has gone rather off topic…
My concluding question would be:
If explainability and interviewing are different skills from testing, but good testing is hard to distinguish from bad, could we come up with a technical, practical test that measures only testing skill (at its core)?

My first reflex is that it's probably impossible to do well, but I'm not sure.


Well, first I should add, if it wasn't obvious, that my expectations have to match the person I'm interviewing. If they use different words but can make me understand them, I'm okay with that, and if they're more "senior" then I expect them to be able to explain themselves better. I need to be a great interviewer to get the best out of an interviewee, especially if they are nervous. I find that people testing at interview get stuck into a focused state, and reminding them to defocus in the wrong way can cause them to be even more nervous, and I have to take all of that into account.

I also have to be careful about what a practical interview is not. I'm looking to see their internal testing structure, what they know about what they know, and that they know about their skills or what information and tools they're missing. I like to see their approach. I'm not going to find out answers to specific questions without asking specific questions, so if someone doesn't mention an answer I need, I'll have to ask that later in the interview.

Testing is huge, we know this. So to evaluate it all in an hour or three is impossible. I don't think this is exclusive to testing. I've seen and been on both sides of many interviews, some of which were virtually useless at evaluating a candidate. So we do the best we can with what we have.

To give you some idea of the depth of complexity I'm talking about, here's Kaner's writing on recruitment, which is the closest thing I know to required reading before interviewing a tester: http://www.testingeducation.org/BBST/foundations/Kaner_JobsRev6.pdf


Thanks Aine

I did see this, but it's a coding test to cover automation, and whilst it's useful, it doesn't cover the testing approach. We seem to be really good at creating tests to see how someone would automate a scenario, but not as good at evaluating how they create the scenarios in the first place. This is the missing piece for me. You can't automate what you haven't defined. There are some good ideas here to review, though.

Steve


Hi all,

As a former recruiter and someone who has recently been through a number of interviews, I have some opinions about this…

  1. The role / seniority of the person youā€™re looking to hire should definitely be a factor in the kind of interview you conduct - generally, we expect senior colleagues to have more experience and solid knowledge than junior ones
  2. Understanding how the person thinks and approaches testing is more important than the ā€œresultsā€ of the task, or if they can put a name to the techniques they use, in my opinion

Having said that, I also think itā€™s important to ensure that candidates in senior roles have a good foundation of knowledge and understanding to be able to guide and lead junior colleagues, even if you decide that the senior person doesnā€™t need to be a hard-core tester (perhaps in more of a management or consulting role).

In a recent interview I attended, they didn't ask me to complete any tasks at all, but I'd viewed their site and tested the service beforehand and found several bugs / issues that I discussed with them in the interview. I also downloaded the app on mobile and read many reviews on the Play Store. They shared that they were unaware of at least two of the bugs I raised, which I hoped would show them what I would add to the team in a practical situation; what risks I might uncover that others hadn't. An advantage of not asking me to complete any tasks is that I could show them how I took the initiative and what steps I took during my own, self-created, task.

Another interview I attended covered a lot more. They asked me to test their application on site (it involved exchanging money, so I didn't look at it beforehand) and I made sure to "think out loud" throughout, something I'm really big on when it comes to understanding how someone tests. Again, I identified a number of issues they had not (including a potential security issue), and they also got to see me work out what different things were on the actual system I'd be testing, as opposed to something made up and potentially less relevant. They also asked me to name a few testers I admire, which I really liked. This lets you know if a candidate is interested in testing outside of their immediate working environment, in my opinion, and potentially some of the ideas they subscribe to.

To try and answer your questions, Steve, I would suggest that you still have testers test a user story (maybe do this as a three amigos exercise, which I did at another interview (I get around :P)) as well as actual software, as these are different skills and some testers don't even think of testing beyond the SUT (shift left is still unheard of to some people). I would always recommend the "think out loud" technique, and so would discourage tasks that candidates complete at home or by themselves / in silence.

In the user story test, it sounds like you're doing the right thing in terms of the amount of information and "holes". In the software test, I would give them as little information as possible; probably just the context of the page they're on / the part of the workflow they're starting in, or to let them know not to click a particular button that goes to the next stage so as to keep it to a specific page / area / feature. For me, personally, I'd look for someone who doesn't need requirements to start testing, which is why I'd keep information to a minimum. Of course, they can still ask questions, but they might not realise that, so I'd look out for how they react when they don't know about something. Chris has mentioned some great questions he asks during this kind of task.

As to how long they get, I wouldn't leave them in a room alone to "complete" the task. For me, it's not about "finishing" it (when is testing ever finished?), so I'd just block out time in the interview for the exercise and get a feeling of when I have enough information to move on to something else.

This response is longer than I intended… I think you've inspired me to write a blog post about various testing interview techniques 🙂 I hope this was useful to you in some way. Feel free to follow up with questions.

Cassandra


Thanks Cassandra

I like the "think out loud" scenario, and I may well ask a candidate to draw up on a whiteboard what they will test and explain as they go. It then also covers presentation skills as well as testing ability, so there are dual benefits. Although leaving someone to do a test has worked reasonably well in the past, there's been a gap where I have to ask them to tell me what they were thinking. This way we can have an ongoing dialogue. Anything is worth trying, especially if it leads to better results 😀

Steve


You don't necessarily need to go for either/or: why not do both, if you have the time?

Here's the blog post I wrote, inspired by this discussion: http://www.cassandrahl.com/blog/tester-interviews-techniques-and-tasks/

It turned out a bit differently to what I expected, but hopefully it's useful to someone!

Cassandra


When I was hired for my first testing job, I was given a page with some deliberate bugs and typos added in, and asked to write bug reports based on what I found. I think it was quite a good exercise, especially as I managed to find an actual bug that hadn't been put in deliberately!

When hiring my replacement for the same role, I asked interviewees to spend half an hour testing our recently relaunched company site and write up some bug reports for issues that they found. They were given a selection of browsers and devices that they could use. After the session finished, I did a debrief where they talked me through the bugs they documented (as well as those they didn't have time to write down), and I also got them to justify their choice of browsers and devices to check that they were thinking critically about prioritising their testing.

Overall, I think both formats were a good way of getting some extra insight about a testerā€™s capabilities and mindset, alongside more traditional face-to-face interviews.
