Automation testers struggle with manual testing

There are a lot of automation “engineers” out there who are the test equivalent of code monkeys. They can copy-paste and modify or extend existing tests, but they often struggle with any actual “engineering” work or testing: they don’t recognize when they should refactor or approach a problem differently, they miss the forest for the trees, and they can only implement test cases that are spelled out clearly in acceptance criteria. They also really struggle to shift left and participate in requirements gathering or design discussions.

My theory is that this is due to all the bootcamps and online courses out there professing to teach test automation. Folks get lured in by glossy sales pitches and dreams of large salaries, but don’t do their homework to realize that these types of training programs essentially just get your foot in the door, and tech is one of those careers where you need to be constantly growing and evolving.

Often, these people have no real interest in or passion for tech or testing, so it becomes really hard for them to push themselves to develop the deeper understanding, context, and skill sets they need to be strong coders and/or testers.


This is an interesting one. I do often think of automation and manual testers as two totally different types of tester. Automation testers usually test at a high level, and their main skill is getting the technical side of the automation working, e.g. finding the best elements to select to avoid flaky tests.
Manual testers, on the other hand, go into deep dives and should leave no stone unturned. The automation testers pick up the easy, repetitive tests, leaving the manual testers to work their magic with their exploratory and other non-functional skills.
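To make the point about element selection and flakiness concrete, here is a minimal sketch (stdlib only, no real browser) of why a stable hook like a `data-testid` attribute survives a redesign while a positional locator breaks. The `data-testid` convention and the page snippets are illustrative assumptions, not from any particular framework:

```python
# Illustrative sketch: stable attributes vs. positional locators.
# Assumes a data-testid convention; the pages below are made up.
from html.parser import HTMLParser

class TestIdFinder(HTMLParser):
    """Collect tags that carry a data-testid attribute."""
    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "data-testid" in attrs:
            self.found[attrs["data-testid"]] = tag

# Two versions of the same page: a redesign nests the button deeper,
# so a positional locator like the XPath /div/button would now miss it.
page_v1 = '<div><button data-testid="submit">Go</button></div>'
page_v2 = ('<div><span>New banner</span>'
           '<div><button data-testid="submit">Go</button></div></div>')

for page in (page_v1, page_v2):
    finder = TestIdFinder()
    finder.feed(page)
    # The stable hook still resolves after the layout change.
    assert finder.found["submit"] == "button"

print("data-testid locator survived the redesign")
```

The same idea is why many teams prefer `data-testid` (or `id`) selectors over long XPath chains tied to page structure.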

That said, I do try to get some of the automation testers to do a bit of manual testing from time to time, at a rough ratio of 95:5, just so they keep some of their testing brain.
