Just a heads up: some candidates, usually of a pragmatic bent, hate being asked this sort of “how would you test an X” question. I’ve noticed this in hiring, forum discussions, chats, and the fact that I hate being asked that sort of question myself. The response I now give tends to be “why would I do that?” or “why would you hire me for that?”, which I would also respect as an answer, but which newer and more nervous candidates don’t tend to have the confidence to come out with. Sometimes I explain back to the interviewer what I think they want out of their question, then describe my methodology and understanding of context instead, as well as give them a list of answers just so that I’m not seen to be dodging the question.

Also, “how would you test a pen” has been around for a long time now, so seasoned interviewees have their answer ready. Many people spit out creative answers at a rate of knots (“Throw it into the sea! Jab it in my eye!”) because they are actually answering “how might you test a pen” rather than “how would you go about getting yourself ready to test a prototype of a particular pen, in a context I haven’t explained to you, if you had to for some reason I haven’t thought of”, which I think is a reasonable thing for a person to do. A tester doesn’t necessarily have to know how to explain a tacit context to be good. After all, I imagine the pen industry has a series of standards and legal mandates to follow; pen makers might not even test their own pens. I don’t know the first thing about materials science. There’s a lot of tacit knowledge around the testing of writing implements that might render my other questions pointless!
I do quite like the “what would be a high quality product” question, because it tries to find out whether the candidate considers quality to be subjective, and therefore sensitive to the context of the business and its users, or believes it to be intrinsic.
The trap to avoid: the interviewer is hiding assumptions, contextual information, from the candidate, and if the candidate runs with their own tacit assumptions then they have gotten the question “wrong”. Also, if you’re going to ask a question with hidden context, you had better have that context prepared. I’ve embarrassed more than a few interviewers by asking questions about their own fantasy creation that they couldn’t answer, along with my reasons why each question mattered for establishing the value of the testing. It isn’t fair for a candidate to be refused work because the interviewer’s questions were bad and the candidate called them out on it; that’s actually what I want in a candidate.
The thing I always ask candidates to do is a practical exercise: either testing a piece of software, or simulating a kick-off chat about a future piece of software, or both. These are both things a candidate will have to do in the job, and they show (somewhat) how the candidate will work after they’re hired rather than what they say they would do. The tricky thing is that someone poking at software at random and well-structured testing can look very similar from the outside, so I also ask questions: why they’re doing what they’re doing, why they need an answer to the question they’re asking, and so on. A candidate having a lot of questions is pointless if the answers don’t get used.
I make sure to tell them that I don’t expect them to know everything about the software, so I’m happy to answer questions. I do not tell them that there are log files or tools available to help them unless it becomes clear they won’t ask for them.
I also give them the answers or prod them to do more - if they’re obsessing over one type of data, I ask if they’d like to try another. I want to see them at their best, to see what they can achieve and how they go about achieving it.
I also don’t care how many problems they find; there isn’t enough time to find everything (and nervousness tends to make candidates more single-minded). I care about their ability to perform test framing and risk analysis, ask valid questions to get applicable answers, come up with clever ideas and explain their actions. I sometimes ask what they’d do if they had more time, and this can be very useful: I occasionally hear how they might apply tools, build risk lists or catalogues from their exploration, or run the software on a different platform (and again, I ask why they’d do that, so I know that they know why it’s valuable). Obviously this scales with seniority - I expect a veteran to be much better at explaining their testing to me.
I find it valuable for the time it takes, and I offer it for what it’s worth.