I am lead for my product’s QA team of 4. One person does only automation and the other 3 only perform manual testing. We have been instructed by leadership to increase and focus on automation. All three of the manual testers want to learn and improve their automation skills. I’ve only worked as QA at this company and would like feedback on best practices.
How are QA teams typically structured in your organization?
In your experience, are manual and automation testers usually separate roles, or do QA engineers handle both types of testing?
If they are separate, what does the workflow look like — does the automation team primarily automate existing manual test cases, or do they design their own automated test scenarios based on requirements?
If the roles aren’t separate, what does the workflow look like? How do QA engineers typically balance manual and automation testing within the same role?
A significant factor in your decision is what your testers are doing now, and which model of testing the team generally follows or is biased towards.
A) In some teams the manual testers follow very similar coverage approaches to the automators: test cases, scripted testing, known risks, and a model of testing with more of a bias towards confirmation or verification. If this is close to your starting point, then I’d really suggest some transition, as those manual tester roles have much reduced value these days, but what to transition to remains an open question.
B) A lot of teams these days have their “manual” testers with a very different focus from the automation group: their growth target is hands-on, highly technical, tool-loving, risk-aware testers with a bias towards discovery, investigation and experimentation. Here they complement and extend the automation and developer coverage by embracing the unknown and leaning strongly towards a testing-to-learn or testing-to-discover model.
Now management will often assume you are doing A. The tool vendors tell them so, as it’s in their benefit, and in their heads A maps straight to automation. Within that model it makes sense, but it still misses out on all the other benefits of B. This is the path you’ll see talked about in the usual “manual vs automation” debates, with testers advancing from manual to automation, though it’s generally a myth.
Model B testers in my experience tend to have at least some automation, but it’s just a tool for them that they use when required rather than a full-time role.
B is, though, still deep, broad and technical, and it takes time to build those skills, but it may offer an alternative path some testers never take. You need to get management on board, though.
So ironically I have exactly the same set-up as you and have had the same directive from above: “automate more”. So I’ll tell you how I dealt with that, but my word of caution is that every organisation is different and every person is different, so it may not fit your problem or the relationship you have with your seniors. I was very lucky that I came into the organisation as a respected leader, and I have principles with people that I won’t compromise on, which takes time to build.
So problem no. 1: being pushed by your managers to automate more. I’ve experienced the “lust” of managers throwing in the automation card in complete ignorance because they think that’s how we speed up. Having come from automation myself, I’ve learned how important skilled manual testers are, and it bothers me deeply that they’re not seen as experts. So I feel I must educate them. A couple of questions I push back with:
Where do you think test cases come from? How do you think they’re written?
If I could have 300 automated tests in a framework or 20 automated tests in a framework, which option is better? If they pick either, that’s the problem. Then I ask: what if the 20 tests prevented 10 more bugs than the 300? (I was literally quoting a real scenario.)
I explained to my superiors that testing is risk management. It’s a balance of managing cost, quality and timescale (efficiency); shortcut one and you impair the other two. Automation is part of a solution to a problem. Ask what problem they want solved, because the solution will still be a balance.
Problem 2: how do you bridge the gap between manual and automation testers?
The automation testers respect the manual testers and vice versa. They need each other, take a passionate interest in each other’s work, and use each other’s skills to improve quality.
We need to break the illusion that moving from manual testing to automation is “advancement”. It’s not; it’s just a different role requiring different skills, both technical and the softer skills.
If manual testers want to learn automation, fantastic. But do they want to, or do they feel they have to?
The latter won’t learn automation. They seem to have forgotten how good they are at manual testing and how much they’re valued, and have been swallowed up by the market, the vendors and the commentary telling them they have to learn automation.
For the former, break automation down into execution, maintenance, test development and framework development. If you can get manual testers to start by learning execution, there are huge benefits at that level alone: once you can delegate it, the skilled automator can spend more time developing. Then move to maintenance to keep the tests running, and eventually to developing new tests within the framework. Be prepared, though: some won’t get past execution, and that’s fine. Value what they’re good at.
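To make that ladder concrete, here is a minimal, hypothetical sketch of the kind of small automated check a manual tester might first learn to execute, then maintain, then write themselves. It assumes a pytest-style Python framework; `apply_discount` is a toy stand-in for a real application call, not anything from the thread above.

```python
# Hypothetical example of a small automated check, assuming a
# pytest-style framework. apply_discount() is a stand-in for the
# real system under test.

def apply_discount(price: float, percent: float) -> float:
    """Toy system-under-test function: applies a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_discount_applied():
    # Execution level: a new joiner simply runs the suite (e.g. `pytest`)
    # and learns to read the pass/fail report.
    assert apply_discount(100.0, 20) == 80.0

def test_invalid_discount_rejected():
    # Test-development level: adding a new negative case like this one
    # inside the existing framework is the next step up the ladder.
    try:
        apply_discount(100.0, 150)
    except ValueError:
        return
    raise AssertionError("expected ValueError for an out-of-range discount")
```

The point is not the code itself but the graduation: run it, then keep it green when the product changes, then add cases like the second one.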
Finally, some tips for you as a leader. Don’t be told what to do; be told what to achieve or what problems to solve. That’s the leadership we need, so we can take responsibility and grow our teams. The moment we get instructions like “automate more” and respond with “ok, we’ve automated more”, we just wait for the next instruction and lose our responsibility.
This is something almost every organisation wants. Higher management has the mindset that manual testing takes more time, so automate as much as possible to reduce it. On the other hand, the team members who do manual testing want to upskill and learn automation, but they are so occupied with day-to-day manual tasks that they don’t get time to learn or do anything beyond them.
Here is what I implemented in my previous organisation:
We had separate automation and manual teams, and the ask was to train the manual team to contribute to automation.
The automation team built the framework and automated the smoke and regression suites written by the manual team. There was a regular sync between the two teams to ensure test cases stayed up to date and were written in such a way that they could be picked up for automation.
We added a goal to the manual team’s KRA to contribute to automation. Here the team response was 100% positive: they added the goal happily.
The automation team gave multiple sessions and ongoing support to help the manual team understand automation and try their hand at it. My suggestion is always to start by executing the test cases: check the failures and try to fix them. This gives the best picture and understanding of the framework.
The manual team can start by fixing failures and slowly pick up small new test cases for automation. Here the team response was 50%: these 50% were able to contribute to automation and take responsibility for automating test cases alongside their manual tasks, while the other 50% still struggled with the time crunch. Not everyone in a team is the same; it’s up to each individual whether they want to learn and make time for new things. Some will come forward while some won’t.
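The “check the failures and fix them” step above could look like this in practice. This is a hypothetical sketch, not from the answer: `get_page_title` stands in for a real UI call (such as `driver.title` in Selenium), and the rebrand scenario is invented to illustrate a typical maintenance fix.

```python
# Hypothetical sketch of the failure-triage-and-fix step a manual
# tester learns first. get_page_title() is a stand-in for a real UI
# call; the "rebrand" scenario is invented for illustration.

def get_page_title() -> str:
    """Stand-in for reading the live page title from the app."""
    return "Checkout - MyShop"  # the app was rebranded in a release

def test_checkout_page_title():
    # A new joiner executing the suite would see this fail after the
    # rebrand if it still asserted the old title:
    #   assert get_page_title() == "Checkout - OldShop"
    # Maintenance means confirming with the team that the change is
    # intended (not a bug), then updating the expectation:
    assert get_page_title() == "Checkout - MyShop"
```

Triage like this teaches the framework from the inside: the tester reads the failure, traces it to the app change, and decides whether to fix the test or raise a bug.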
For those who still complained about time, I asked management for a dedicated 3-4 hours per week of automation time for the team members who were genuinely occupied, and this was booked in their calendars. Here the response increased by another 25%, while the last 25% of the team still could not come along.
Overall, 75% of the manual team members started contributing to automation, and they enjoyed it, while 25% of the team still couldn’t make time for it.
Automation is not a replacement for manual testing; it reduces the manual time and effort spent on repetitive tasks, regression and cross-browser testing, which can contribute to faster releases. The manual team is then freed from these things and can focus on the more critical test cases and areas to ensure the quality of the product.