đŸš© What’s a red flag when hiring a software tester?

How can we better spot software testers who aren’t quite right for what our team needs?

  • Lack of experience
  • No qualifications
  • Not staying on top of trends
  • Something else, please comment
0 voters

Bonus points if you can also tell us of any :green_square: green flags when hiring a software tester. :raised_hands:

2 Likes

A lack of passion.

If someone isn’t interested in finding problems in my software, I don’t want to have to force them to. I always hired testers who wanted to do testing, either because they wanted to learn about it or because they wanted to do it.

If they aren’t interested in learning about testing, I don’t want to make them learn. Testers without at least some passion for it won’t improve themselves or the processes. I want the tester whose desire to learn makes them a burden on the training budget.

There’s no minimum to qualifications or experience that can make a great tester, but there’s definitely a minimum to caring.

8 Likes

‘Lack of experience’ – how will someone get any when they keep getting rejected? We all started as newbies.

‘No qualifications’ – by what is this assessed? Other than that: same as before. Also, certificates are not qualifications.

‘Not staying on top of trends’ – why? Trends come and go. Some are even dangerous (yes, I’m talking about AI – about some usages of AI). When you have good core principles and values of your craft, you will stay up to date forever, no matter the technology.

Other than that, I agree with @kinofrost.

5 Likes

Hm… maybe I missed the meaning of ‘Not staying on top of trends’. I took it more as ‘not being interested in learning something new’.

To me, staying on top of trends also includes not following all of them and making a conscious decision about which ones to follow. Following each and every trend is a waste of time – at least in some cases.
Also, not following any trends isn’t a good idea either: agile development was a trend once, and so was CI/CD, DevOps, TDD, BDD, shift left, and many more.

The art/science/gambling is to select the ones that will turn from trend to good (note how I didn’t say ‘best’) practice.

6 Likes

None of those are red flags for me, if I’ve already laid out the level of experience, qualifications, and domain trends that I require.

The main red flag for me is the CV not matching up with other publicly available information, eg LinkedIn. Minor discrepancies are fine, but don’t add 5 years of experience into your CV if you’ve only got a few months on LinkedIn, or add in a company that I can easily check with or already know (I’m in a relatively niche industry).

A green flag for me is the CV showing a willingness to help out other areas. Main role is tester but you’ve also done a bit of training, support, cutover, etc? Shows that you’re a team player and ready to muck in when we all need to lend a hand.

Edit: and another green flag for me is having a certificate / qualification in something when it was unexpected. We’re currently looking for an administrator for a reasonably well known test tool and one of the candidates has a certificate from the provider’s online academy. It’s not much but it made them stand out from the other candidates.

4 Likes

Red flags include:

  • ISTQB logo in the header of the CV. Those go straight in the bin.
  • Stupidly long CV. I received one that seemed odd because there were about 25 words per line. Turned out they had set the page size to A3 in order to keep the document down to 4 pages. It was more like 10 pages when converted to A4. Another one had set zero left and right margins, so the text went right to the edge of the page and there was no vertical line spacing. Another was in 8pt Arial Narrow.
  • Repetition of mundane experience in bulleted lists. I receive CVs with 20 or 30 bullet points per job role, all of which are the same. Most are context-free trivial things like “Wrote test cases”, “Executed test cases”, “Logged bugs in Jira” etc.
  • Spelling errors in the CV. I can just about tolerate one. More than that and you’re clearly not paying attention to detail.
  • The phrase “had exposure to…”. This usually means you were working on a project where someone else was doing that thing, maybe not even in the same room, building or even country. If you didn’t do it yourself, I don’t want to know.
  • During the practical test, telling me what you would do, but then not doing it. Anyone can blag their way through an interview. The whole point of the practical test is to see if you can do it.
  • During the practical test, guessing things instead of asking me. I get applicants to test the contact form on our website, which just writes the submitted data to a text file. You would not believe how many applicants have said “I assume the data goes to a SQL database…” and proceeded to attempt SQL injection attacks that aren’t going to have any effect.
  • During the practical test, inventing rules to test against. A tester should treat testing as an investigation to find out what the product does. Then they should look for oracles - there are often many they could find. Only then should they decide if the behaviours are correct or at least acceptable, but that’s not so important - the whole point of testing is to find out things you don’t know.
    However, almost all testers are stuck if they don’t have documented requirements to verify, so they invent them, such as name fields mustn’t allow numbers, or UK phone numbers must contain 11 digits.

Green flags include:

  • They know who James Bach and Michael Bolton are and can explain their testing philosophy.
  • Big extra points if they have done the RST course and/or the BBST course (ideally, the tutor-led version run by the AST).
  • During the practical test, doing something that surprises me in a good way.
  • They argue their case persuasively when we disagree.
  • They are not interested in doing test automation, but recognise when it would be useful so they can get someone else to do it. I want people who love the infinite challenge of exploration rather than the mundane task of automating a finite set of (mostly uninteresting) checks.
  • They bring doughnuts.

8 Likes
  • Not having any questions in the interview… While there is a small chance all of the questions have been answered, I think asking questions is a great opportunity to interview a potential employer. (I think that we are actually interviewing each other and sussing each other out; I don’t see it as a one-way street.)

  • Claiming to be an expert in certain areas/tools etc. when, according to your CV, you barely have experience in that area. I want to be able to trust someone. Any claims about being an expert need to line up with achievements/experience.

  • More than a mistake or two in a CV. (I think a tester should have attention to detail and, at the very least, should be using a spell-checker. If their CV is full of mistakes, that makes me think it’s what I could expect if I hired them as well.)

5 Likes

Questions are such a big one for me, too. The sussing each other out is so important. When I’m an interviewee, I sometimes get the vibe that a company forgets that while they’re evaluating me, I’m evaluating them.

Questions are also really important to me as a way to gather information in the face of a testing question – a way to explore what the question is in order to give a better answer. Questions happen all the time in testing, so it’s good to see that a candidate can ask them.

I used to give people something to test, with a scenario including some context, and hold back things that would be useful to them: some of the contextual information, manuals, where the log file is, existing “known” bugs and so on. Then I’d say “ask me any questions you like about the software or anything else you can think of and I’ll try to answer them”. I’d then let them test and query them about what they’re doing, what they’re looking for, why, and so on (exposing the internal structure).

I’d constantly poke them to ask me questions about it. Some would, and I’d shower them with valuable information and watch them use it. Some would not, even while I’m saying “don’t forget I’m here to answer any questions. If you have some kinda question about that. Ooo-eee sure do love me some questions over here. Maybe about this function you’re exploring. My back’s hurting from carrying all this useful information, I sure do hope someone asks me a question, can’t stress that enough”

5 Likes

When I’m interviewing software testers, there are definitely a few red flags that can make my spidey senses tingle. For instance:

  1. No curiosity about culture, collaboration, or how we see quality – If you’re not asking about how we work with developers, or what we mean by quality, I start to worry. It’s a sign that maybe you’re just going through the motions or haven’t thought much about how you’d fit into the team vibe.
  2. No diversity in tools (or even courses on different tools) – Alarms start ringing here, too. It shows either a lack of interest in learning or, worse, that you might not take on new skills even when given the chance. And let’s be real – in quality, being adaptable and ready to broaden your toolkit is the name of the game.
  3. A rigid, “only one way to test” mindset – If you come in and talk like there’s only one right approach to testing, that’s an eyebrow-raiser. Quality often demands flexibility, so if you’re not open to different approaches or perspectives, we might hit roadblocks fast.

5 Likes

This is maybe dependent on the role but I’ve interviewed candidates who exclusively focus on test automation during the interview and almost visibly turn their nose up at the suggestion of any manual testing that might be required :sweat_smile:

Most of my green flags have already been covered in the thread but another one is honesty - if you don’t know something it’s fine to say “I don’t know” as long as it’s followed up with something like “…but I would do [thing] to find out”.

5 Likes

Red flags include:

  1. Mentioning the number of bugs they raised during testing, whether in the CV or during the interview process.
  2. Padding out simple items with descriptive points – for example, instead of just listing JIRA under tools, writing descriptive points about what they do in JIRA.
  3. A weak foundation in the basics of software testing, with more focus on DSA and programming skills. These may be required for automation, but identifying scenarios and test cases comes first.
  4. Writing a resume of more than one page – one-page resumes are preferred, as recruiters have barely 10-15 seconds to review them.
  5. Mentioning manual testing as a skill.
  6. Not being polite during interview calls – we often come across candidates who are rude on calls or boast about themselves, trying to appear superior to the interviewer or the company.
  7. Refusing to turn on the camera or share the screen during interview calls.
  8. Delaying the joining date with various excuses, such as medical emergencies.
5 Likes

It’s interesting that this one can be more of a trigger for those hiring testers than for other roles.

The attention to detail and due diligence is a fair aspect in my view.

It’s less of a trigger for me personally, though: if I were applying in a second language, for example, I might make a couple of mistakes. I’ve also worked with a few people with some level of dyslexia, and they have been great testers.

On a tangent, what does trigger me is managers expecting me, as a tester, to be a language specialist, a proofreader, or the person there to pick up on others’ spelling mistakes.

Also on a tangent: there are quite a few scripted red flags in play here. Amber flags, and taking a more exploratory approach to hiring, are another option.

3 Likes

If someone has dyslexia or a poor writing style, they should be taking measures to mitigate it. One of my best testers was severely dyslexic so we bought some software to help him detect and correct a lot of the mistakes. Buy it yourself if the company won’t.

However, in the case of a one-off document like a CV, errors are inexcusable. If you’re dyslexic or just can’t write well, you should get it checked by someone else. What job applicants often forget is that they are selling themselves to the company, and it’s effectively a £200,000 sale if they expect to earn £50,000 a year for 4 years. What does it tell me about you if you can’t take a £200,000 sale seriously?

2 Likes

It depends on the role I’m hiring for, what skills are needed for the role and what they’ll have to do. Here are some things that I’ve found that might make me think a candidate isn’t suitable though.

  • No eligibility to work in the country of the role.
  • CV doesn’t show skills that the job spec asks for.
  • Unable to show proficiency, talk about or champion skills they say they have.
  • “Exposure to” things and no practical experience.
  • For senior roles: hasn’t led or pushed on things to get them introduced.
  • Seems to have a one-size-fits-all approach to testing.

Again, these are all basically contextual, and when hiring I like to try to give people a way to show me how they’re awesome rather than trying to discount them.

4 Likes

Isn’t that interesting – I’d possibly see that as a red flag. In most organisations I’ve worked in, we need testers to have an opinion on quality, rather than just throw out a measurement and leave it at that.

Usually in modern teams, as testers we need to be more impassioned about quality and what good enough looks like. Just as we ask devs to care and be “product devs”, we need to be “product testers” and care more about what is being put out there.

The RST vibe works better for larger, old school, organisations like banks; where testers have the luxury of detachment. In engineering, tech and start up organisations we need to be more opinionated :slight_smile:

1 Like

Sounds like we have very different testing philosophies. The thing about RST and context-driven testing is that it is adaptable to any development methodology and it works well anywhere.

I am appalled at what passes for testing in the agile teams that I see, and I see the output from a lot of them. Much of this is due to the constraints that are imposed in agile projects, mainly to suit product owners and developers. I would say it’s impossible to do good testing on some projects because an agile methodology should not have been used. But people seem to be afraid not to use agile these days, even when it’s clearly the wrong choice.

I have no idea why you think that RST practitioners don’t have an opinion on quality or that they are not impassioned. Have you ever met Bach or Bolton? Or me? We are not short of opinions or passion. Nor are any of the people who ever worked for me.

And we don’t “just throw out a measurement”. In fact, it’s the precise opposite. We refuse to produce any testing-related metrics because there aren’t any that are statistically valid. Any sensible assessment of product quality can only be a multi-faceted narrative explanation involving risk.

BTW, anyone who thinks that current testing methods work well clearly hasn’t noticed how software quality has gone down the toilet in the last decade. And it wasn’t good back then.

4 Likes

A very good point, there’s no universal red flags because your approach to testing may be context driven and subjective :smiley:

1 Like

I have never understood why testers wouldn’t take context into account when testing. It seems crazy to always do the same thing regardless of the circumstances. But that’s what most people do. It’s what ISTQB teaches you to do. It’s what developers do because they only know how to do one thing (automation). Do you do anything else the same way regardless of the context?

As for subjectivity, we can’t escape it because so little is objective in software development and testing. Risk is subjective, as are quality and value. Requirements are rarely expressed in a manner that is objectively testable - in fact the flexibility of how requirements are expressed is regarded as a benefit of agile. Usability and accessibility are inherently subjective.

2 Likes

Green flags:
Have thought about questions they’d like to ask.
Have researched the company.
Have read the job description in depth and understand what’s involved.

2 Likes

I’ve found it interesting that when some of us (native English speakers) brought up the idea of being more lenient on spelling/grammar issues with EASL applicants, some of the most rigorous objections came from other people on our team who themselves spoke English as a second language :upside_down_face:.

The ones that really get me are when the candidate uploads their resume as a Word document, and Word helpfully highlights all of their mistakes with various colors of squiggles.

2 Likes