If someone isn't interested in finding problems in my software, I don't want to have to force them to. I've always hired testers who wanted to do testing, either because they wanted to learn about it or because they wanted to do it.
If they aren't interested in learning about testing, I don't want to make them learn. Testers without at least some passion for it won't improve themselves or the processes. I want the tester with the desire to be the one who's a burden on the training budget.
There's no minimum of qualifications or experience that makes a great tester, but there's definitely a minimum of caring.
How will someone get experience when they keep getting rejected for lacking it? We all started as newbies.
How is this assessed?
Other than that: same as before.
Also, certificates are not qualifications.
Why? Trends come and go. Some are even dangerous (yes, I'm talking about AI - about some uses of AI).
When you have good core principles and values in your craft, you will stay up to date forever, no matter the technology.
Hm… maybe I missed the meaning of "not staying on top of trends" - I took it more as "not being interested in learning something new".
To me, staying on top of trends also includes not following all of them and making a conscious decision about which ones to follow. Following each and every trend is a waste of time - at least in some cases.
Also, not following any trends isn't a good idea either: agile development was a trend once, and so were CI/CD, DevOps, TDD, BDD, shift left, and many more.
The art/science/gambling is to select the ones that will turn from trend into good (note how I didn't say "best") practice.
None of those are red flags for me if I've already laid out the level of experience, qualifications, and domain trends that I require.
The main red flag for me is the CV not matching up with other publicly available information, e.g. LinkedIn. Minor discrepancies are fine, but don't add 5 years of experience to your CV if you've only got a few months on LinkedIn, or add in a company that I can easily check with or already know (I'm in a relatively niche industry).
A green flag for me is the CV showing a willingness to help out other areas. Main role is tester but you've also done a bit of training, support, cutover, etc.? Shows that you're a team player and ready to muck in when we all need to lend a hand.
Edit: another green flag for me is having a certificate / qualification in something when it was unexpected. We're currently looking for an administrator for a reasonably well-known test tool, and one of the candidates has a certificate from the provider's online academy. It's not much, but it made them stand out from the other candidates.
ISTQB logo in the header of the CV. Those go straight in the bin.
Stupidly long CV. I received one that seemed odd because there were about 25 words per line. Turned out they had set the page size to A3 in order to keep the document down to 4 pages. It was more like 10 pages when converted to A4. Another one had set zero left and right margins, so the text went right to the edge of the page and there was no vertical line spacing. Another was in 8pt Arial Narrow.
Repetition of mundane experience in bulleted lists. I receive CVs with 20 or 30 bullet points per job role, all of which are the same. Most are context-free trivial things like "Wrote test cases", "Executed test cases", "Logged bugs in Jira", etc.
Spelling errors in the CV. I can just about tolerate one. More than that and you're clearly not paying attention to detail.
The phrase "had exposure to…". This usually means you were working on a project where someone else was doing that thing, maybe not even in the same room, building or even country. If you didn't do it yourself, I don't want to know.
During the practical test, telling me what you would do, but then not doing it. Anyone can blag their way through an interview. The whole point of the practical test is to see if you can do it.
During the practical test, guessing things instead of asking me. I get applicants to test the contact form on our website, which just writes the submitted data to a text file. You would not believe how many applicants have said "I assume the data goes to a SQL database…" and proceeded to run SQL injection attacks that aren't going to have any effect.
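To illustrate why those attacks are pointless, here's a minimal sketch of how a form like that might be implemented (a hypothetical Flask handler - the field names and file path are invented for illustration, not how our form is actually built):

```python
# Hypothetical contact-form handler: submissions go straight to a
# text file, so there is no SQL engine for an injection to attack.
from flask import Flask, request

app = Flask(__name__)

@app.route("/contact", methods=["POST"])
def contact():
    name = request.form.get("name", "")
    message = request.form.get("message", "")
    # A payload like "'; DROP TABLE users;--" is just appended
    # to the file as inert text.
    with open("submissions.txt", "a", encoding="utf-8") as f:
        f.write(f"{name}\t{message}\n")
    return "Thanks for your message!"
```

A couple of minutes of asking (or observing) would reveal there's no database in play, which is exactly the point of the exercise.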
During the practical test, inventing rules to test against. A tester should treat testing as an investigation to find out what the product does. Then they should look for oracles - there are often many they could find. Only then should they decide if the behaviours are correct or at least acceptable, but that's not so important - the whole point of testing is to find out things you don't know.
However, almost all testers are stuck if they don't have documented requirements to verify, so they invent them, such as name fields mustn't allow numbers, or UK phone numbers must contain 11 digits.
Green flags include:
They know who James Bach and Michael Bolton are and can explain their testing philosophy.
Big extra points if they have done the RST course and/or the BBST course (ideally, the tutor-led version run by the AST).
During the practical test, doing something that surprises me in a good way.
They argue their case persuasively when we disagree.
They are not interested in doing test automation, but recognise when it would be useful so they can get someone else to do it. I want people who love the infinite challenge of exploration rather than the mundane task of automating a finite set of (mostly uninteresting) checks.
Not having any questions in the interview… While there's a small chance that all of their questions have already been answered, I think asking questions is a great opportunity to interview a potential employer. (I think we're actually interviewing each other and sussing each other out; I don't see it as a one-way street.)
Claiming to be an expert in certain areas/tools, etc., when according to your CV you barely have experience in that area. I want to be able to trust someone. Any claim about being an expert needs to line up with achievements/experience.
More than a mistake or two in a CV. (I think a tester should have attention to detail and, at the very least, should be using a spell-checker. If their CV is full of mistakes, that makes me think it's what I'd expect if I hired them as well.)
Questions are such a big one for me, too. The sussing each other out is so important. I sometimes get the vibe when I'm an interviewee that a company forgets that while they're evaluating me, I'm evaluating them.
Questions are also really important to me as a way to gather information in the face of a testing question, like a way to explore what the question is to get a better answer. Questions happen all the time in testing, so it's good to see that they can ask them.
I used to give people something to test, with a scenario including some context, and hold back things that would be useful to them, like some of the contextual information, manuals, where the log file is, existing "known" bugs and so on. Then I'd say "ask me any questions you like about the software or anything else you can think of and I'll try to answer them". I then let them test and query them about what they're doing and what they're looking for and why and so on (exposing the internal structure). I'd constantly poke them to ask me questions about it. Some would, and I'd shower them with valuable information and watch them use it. Some would not, even while I'm saying "don't forget I'm here to answer any questions. If you have some kinda question about that. Ooo-eee sure do love me some questions over here. Maybe about this function you're exploring. My back's hurting from carrying all this useful information, I sure do hope someone asks me a question, can't stress that enough".
When I'm interviewing software testers, there are definitely a few red flags that can make my spidey senses tingle. For instance:
No curiosity about culture, collaboration, or how we see quality - if you're not asking about how we work with developers, or what we mean by quality, I start to worry. It's a sign that maybe you're just going through the motions or haven't thought much about how you'd fit into the team vibe.
No diversity in tools (or even courses on different tools) - alarms start ringing here, too. It shows either a lack of interest in learning or, worse, that you might not take on new skills even when given the chance. And let's be real - in quality, being adaptable and ready to broaden your toolkit is the name of the game.
A rigid, "only one way to test" mindset - if you come in and talk like there's only one right approach to testing, that's an eyebrow-raiser. Quality often demands flexibility, so if you're not open to different approaches or perspectives, we might hit roadblocks fast.
This is maybe dependent on the role, but I've interviewed candidates who focus exclusively on test automation during the interview and almost visibly turn their nose up at the suggestion of any manual testing that might be required.
Most of my green flags have already been covered in the thread, but another one is honesty - if you don't know something, it's fine to say "I don't know" as long as it's followed up with something like "…but I would do [thing] to find out".
Mentioning the number of bugs they raised during testing, whether in the CV or during the interview process.
Writing descriptive points where a simple entry would do - for example, instead of listing JIRA under tools, people write descriptive points about what they do in JIRA.
A weak foundation in the basics of software testing with more focus on DSA and programming skills. These may be needed in automation, but identifying scenarios and test cases comes first.
Writing a resume of more than one page. One-page resumes are preferred, as recruiters have barely 10-15 seconds to review them.
Mentioning manual testing as a skill.
Not being polite during interview calls. We often come across candidates who are rude on calls or who boast about themselves, trying to appear superior to the interviewer or the company.
Refusing to turn on the camera or share the screen during interview calls.
Delaying the joining date with various excuses, such as medical emergencies.
It's interesting that this one can be more of a trigger for those hiring testers than for other roles.
The attention to detail and due diligence is a fair aspect, in my view.
It's less of a trigger for me personally, though: if I were applying in a second language, for example, I might make a couple of mistakes, and I've also worked with a few people with some level of dyslexia who have been great testers.
On a tangent, what does trigger me is managers expecting me, as a tester, to be a language specialist, a proofreader, or the person there to pick up on others' spelling mistakes.
Also a tangent: quite a few scripted red flags are in play here. Amber flags and taking a more exploratory approach to hiring is another option.
If someone has dyslexia or a poor writing style, they should be taking measures to mitigate it. One of my best testers was severely dyslexic, so we bought some software to help him detect and correct a lot of the mistakes. Buy it yourself if the company won't.
However, in the case of a one-off document like a CV, errors are inexcusable. If you're dyslexic or just can't write well, you should get it checked by someone else. What job applicants often forget is that they are selling themselves to the company, and it's effectively a £200,000 sale if they expect to earn £50,000 a year for 4 years. What does it tell me about you if you can't take a £200,000 sale seriously?
It depends on the role I'm hiring for, what skills are needed for the role and what they'll have to do. Here are some things I've found that might make me think a candidate isn't suitable, though.
No eligibility to work in the country of the role.
CV doesn't show skills that the job spec asks for.
Unable to show proficiency, talk about or champion skills they say they have.
"Exposure to" things and no practical experience.
For senior roles: hasn't led or pushed on things to get them introduced.
Seems to have a one-size-fits-all approach to testing.
Again, these are all basically contextual, and when hiring I like to try to give people a way to show me how they're awesome, rather than trying to discount them.
Isn't that interesting - I'd possibly see that as a red flag. In most organisations I've worked in, we need testers to have an opinion on quality, rather than just throw out a measurement and leave it at that.
Usually in modern teams, as testers we need to be more impassioned about quality and what good enough looks like. Just as we ask devs to care and be "product devs", we need to be "product testers" and care more about what is being put out there.
The RST vibe works better for larger, old-school organisations like banks, where testers have the luxury of detachment. In engineering, tech and start-up organisations we need to be more opinionated.
Sounds like we have very different testing philosophies. The thing about RST and context-driven testing is that it is adaptable to any development methodology and it works well anywhere.
I am appalled at what passes for testing in the agile teams that I see, and I see the output from a lot of them. Much of this is due to the constraints that are imposed in agile projects, mainly to suit product owners and developers. I would say it's impossible to do good testing on some projects because an agile methodology should not have been used. But people seem to be afraid not to use agile these days, even when it's clearly the wrong choice.
I have no idea why you think that RST practitioners don't have an opinion on quality or that they are not impassioned. Have you ever met Bach or Bolton? Or me? We are not short of opinions or passion. Nor are any of the people who ever worked for me.
And we don't "just throw out a measurement". In fact, it's the precise opposite. We refuse to produce any testing-related metrics because there aren't any that are statistically valid. Any sensible assessment of product quality can only be a multi-faceted narrative explanation involving risk.
BTW, anyone who thinks that current testing methods work well clearly hasn't noticed how software quality has gone down the toilet in the last decade. And it wasn't good back then.
I have never understood why testers wouldn't take context into account when testing. It seems crazy to always do the same thing regardless of the circumstances. But that's what most people do. It's what ISTQB teaches you to do. It's what developers do because they only know how to do one thing (automation). Do you do anything else the same way regardless of the context?
As for subjectivity, we can't escape it because so little is objective in software development and testing. Risk is subjective, as are quality and value. Requirements are rarely expressed in a manner that is objectively testable - in fact, the flexibility of how requirements are expressed is regarded as a benefit of agile. Usability and accessibility are inherently subjective.
Green flags:
Have thought about questions they'd like to ask.
Have researched the company.
Have read the job description in depth and understand what's involved.
I've found it interesting that when some of us (native English speakers) brought up the idea of being more lenient on spelling/grammar issues with EASL applicants, some of the most rigorous objections came from people on our team who themselves speak English as a second language.
The ones that really get me are when the candidate uploads their resume as a Word document, and Word helpfully highlights all of their mistakes with various colors of squiggles.