“Feature worked in previous build, no need to test it again.”
I'll generalize, as there isn't one single piece of worst advice.
My problem is with the huge amount of bad and extremely bad advice you get from forums, Slack groups, googling testing, blogs, training, LinkedIn and other sites that offer ‘guidance’, conferences, etc.
This wore me down over time and constantly slowed my progress, the progress of testing as a profession, and the progress of smart testing. It has lowered the status we have as testers, while encouraging a large number of crappy managers to give even crappier advice on testing.
My suggestion to people working in testing is to start thinking critically about what you say, and to be critical of what you read or hear.
“You shouldn’t be concerned with the technical aspects of the system, maybe just focus on the UI and ensure that the UAT is successful”
This came after I requested technical documentation on a specific component that had been changed, and the Product Manager suggested this test approach instead. Luckily I ignored the advice!
With urgent fixes we get told:
- “It’s a really small change, tested locally, shouldn’t need to go through QA, can’t really break anything…”
We go along with it and bypass QA, automation and visual regression checks
–> Breaks production.
The worst part is that automation and visual regression checks would easily have uncovered the issues.
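A visual regression check can be as simple as comparing a fresh screenshot against an approved baseline and failing the pipeline on any difference. A minimal sketch, assuming screenshots already exist on disk (the file names and byte-for-byte comparison are illustrative; real tools usually allow a pixel tolerance):

```python
# Minimal visual regression gate: fail when the current screenshot
# differs from the approved baseline. A byte-for-byte digest comparison
# is the crudest possible check, used here only for illustration.
import hashlib

def file_digest(path):
    """Return the SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def visual_regression_check(baseline_path, current_path):
    """True when the current screenshot matches the baseline exactly."""
    return file_digest(baseline_path) == file_digest(current_path)
```

Even a gate this crude would have flagged the "can't really break anything" change above before it reached production.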
“I wouldn’t worry about doing that – if no-one has asked for it, that means they aren’t missing it or don’t care…” – regarding strategy documents and a risk list for a major new feature. Senior/Lead QA, last year.
“Don’t test it! Nobody will do that in real life!”
An insecure test manager had me test the system after additional software had been installed (which was fine, of course), and then had me test the system before the installation. That was the system that got tested daily.
“This can’t be tested”
Dev: “I think it’s an issue with the testing environment. It should be fine. Let’s test in production instead”
Me: Naïvely goes along with this
Production: Takes down website
Not doing that again!
Dev to QA: “Don’t worry, I have tested it and will take responsibility – making it live.”
Two months later, a customer opens a bug.
Manager: “How could you miss it?”
I suspect we’ve all experienced these:
“It’s fine – it won’t need testing” (how many seconds until I found a bug?)
“You’re not a dev, the technical side is none of your business and it’s fine” (I had been a dev for more years than the person objecting to my technical advice before I moved into testing – and yes, it failed when released)
“It’s your data, it will be fine for customers, we’re not fixing it” (oh no it wasn’t)
“They’re a reputable company, their QA will be better than ours” – said when I questioned whether we needed to acceptance test software from a partner company that we were re-selling. A colleague and I ran a bug challenge and produced pages of bug reports in under an hour, despite never having got our hands on the product before.
“Test everything. We want 100% coverage.”
So much awful ambiguity.
No understanding at all of rational risk/value tradeoffs.
It did give me the opportunity for a conversation about what the business was actually most concerned about.
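One way to make that risk/value conversation concrete is a simple scoring exercise: rate each area by likelihood of failure and business impact, then test the highest-risk areas first. A minimal sketch – the area names and all the numbers are made up for illustration:

```python
# Toy risk-based test prioritization: risk = likelihood x impact.
# Every area name and score here is hypothetical, not from any real project.

areas = [
    # (area, likelihood 1-5, impact 1-5)
    ("payment flow", 4, 5),
    ("login", 2, 5),
    ("marketing banner", 3, 1),
]

def risk_score(likelihood, impact):
    """Crude risk score: higher means test it sooner."""
    return likelihood * impact

# Rank test areas from highest risk to lowest.
ranked = sorted(areas, key=lambda a: risk_score(a[1], a[2]), reverse=True)
for area, likelihood, impact in ranked:
    print(f"{area}: risk {risk_score(likelihood, impact)}")
```

Even a toy model like this reframes "100% coverage" into "what do we test first, and why" – which is the conversation the business actually needs.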
“No-one will ever do that”
Invariably followed by a customer doing ‘that’, sometimes within minutes of receiving the new code.
“We’ll test it in production and rollback if necessary.”
My past manager was a developer who kept teaching QA how to work: “Why should we even run regression – it’s a waste of everybody’s time!”
“There’s no point in doing edge case tests”
Product owner, recently: “Don’t log bugs at the moment – the devs need to fix all the other ones that you have logged, and they might have solved some of them already. Can you keep them somewhere else?” My response: “Without knowing what they are?! Bugs don’t get fixed magically, and creating a separate backlog is a waste of time.”
Of course I ignored the advice and convinced the PO that we needed to log all the issues; he gave in after a short discussion… I believe I can be quite convincing.
A variation I often hear on this is “that’s an edge case; the customers won’t do that”. I usually reply, “I’m just being fat-fingered; if you think I’m bad, the customers will be WORSE – i.e. we definitely SHOULD test the edge cases!” The following usually gets people’s attention (though it is perhaps a tad adversarial): “I’ll skip testing it if you’re happy for us to circulate that you advised testing not to test it.” Developers usually back down at that point; they don’t want to be on the hook if something goes wrong after they said NOT to test it.
Totally relate to this one … such a case of ‘famous last words!’
“Test what the devs think you should test”
Sure, use their opinion as one input into your decisions about what to test, but ordinarily what devs tell you to test just indicates the dev testing they skipped. The thinking that allows a gap or oversight in design or coding is unlikely to be the same thinking that finds it in test… for this reason my team encourages cross-team test reviews.