Just a heads up: some candidates, usually of a pragmatic bent, hate being asked this sort of "how would you test an X" question. I've noticed this in hiring, forum discussions, chats, and the fact that I hate being asked that sort of question. The response I now give tends to be "why would I do that?" or "why would you hire me for that?", which I would also respect as an answer, but which newer, more nervous candidates rarely have the confidence to give. Sometimes I will explain back to the interviewer what I think they want out of their question, and then explain my methodology and understanding of context instead, as well as give them a list of answers, just so that I'm not seen to be dodging the question. Also, "how would you test a pen" has been around for a long time now, so seasoned interviewees have their answer ready. Many people spit out creative answers at a rate of knots ("Throw it into the sea! Jab it in my eye!") because they are actually answering "how might you test a pen" rather than "how would you go about getting yourself ready to test a prototype of a particular pen, in a context I haven't explained to you, if you had to for some reason I haven't thought of", which I think is a reasonable thing for a person to do. A tester doesn't necessarily have to know how to explain a tacit context to be good. After all, I imagine the pen industry has a series of standards and legal mandates to follow - they might not even test their own pens. I don't know the first thing about material science. There's a lot of tacit knowledge around the testing of writing implements that might render my other questions pointless!
I do quite like the "what would be a high quality product" question, because it's trying to find out whether the candidate considers quality to be subjective - and therefore sensitive to the context of the business and users - or believes it is intrinsic.
The trap to avoid is that the interviewer is hiding assumptions - contextual information - from the candidate. If the candidate runs with their own tacit assumptions, then they have gotten the question "wrong". Also, if you're going to ask a question with hidden context, you had better have that context prepared - I've embarrassed more than a few interviewers by asking questions about their own fantasy scenario that they couldn't answer, along with reasons why the question matters for establishing the value of the testing. It isn't fair to refuse a candidate work because the interviewer's questions were bad and the candidate called them out on it; that's actually what I want in a candidate.
The thing I always ask candidates to do is a practical exercise: either testing a piece of software, or simulating a kick-off chat about a future piece of software, or both. These are both things a candidate will have to do in their job, and it shows (somewhat) how they'll work after they're hired rather than what they say they would do. The tricky thing is that someone poking software at random and well-structured testing look very similar from the outside, so I also ask questions. I ask why they're doing what they're doing, why they need an answer to the question they're asking, and so on. A candidate having a lot of questions is pointless if the answers don't get used.
I am sure to tell them that I don't expect them to know everything about the software, so I'm happy to answer questions. I do not tell them that there are log files or tools available to help them unless they ask for them.
I also give them the answers, or prod them to do more - if they're obsessing over one type of data, I ask if they'd like to try another. I want to see them at their best, see what they can achieve, and how they go about achieving it.
I also don't care how many problems they find, as there isn't enough time to find everything (and their nervousness makes them more single-minded). I care about their ability to perform test framing, perform risk analysis, ask valid questions to get applicable answers, come up with clever ideas, and explain their actions. I sometimes ask what they'd do if they had more time, and this can be very useful - I occasionally hear ways they might apply tools, use risk lists, build catalogues from their exploration, or run the software on a different platform (again, I ask why they'd do that, so I know that they know why it's valuable). Obviously this scales with seniority - I expect a veteran to be much better at explaining their testing to me.
I find it valuable for the time it takes, and I offer it for what it's worth.