How do you achieve continuous testing in a scenario where the team is sprinting continuously?

This was asked on Slack and I thought it would be useful to share here:

[alg] Hi, can anyone in the community point me in the direction of good material for achieving continuous testing (manual and automated) in a scenario where the team is sprinting continuously, please? Thanks in advance!

[alan_parkinson]
What’s your definition of continuous testing? I’ve seen a few around that are a little different to each other

[alg]
I am probably mixing my terms here (apologies).
Basically, how to do both feature and regression testing within a sprint when there is no other time set aside for regression. When that sprint is completed, the work is releasable and the team moves on to the next sprint. The overall aim is to have done all the testing required so it “could” be released.
Hope that clarifies things!

[Stephen Jones]
Yeah… so in that case you need a Manual QA and an Automation Engineer. Trying to keep the automation in step with active sprints, especially if we are talking about “green-field” environments, is a full-time job. And, depending on the size of the dev team and how fast they are building, maybe even impossible.

[alg]
Thanks. Would that change things in a situation where the product was more mature? i.e. it had more regression automation, but there was still a need for some sort of manual regression? Would targeted regression be a viable strategy?

[Stephen Jones]
Yes. So, the automation should be used to cover what hasn’t changed, to verify the changes didn’t affect other areas of the application. Tests around the changed areas would be expected to fail until the automation is updated. Manual testing is always needed the first time around for changes because the human eye needs to judge them.

[alg]
So a strategy to increase the automation coverage of what is currently there still sounds like it’d be required if there isn’t enough coverage in place.

[Stephen Jones]
Yes, and it would greatly free up the manual resources (people) because they can focus on the changes instead of spending a ton of time regression-testing existing functionality.

[Dimitris Karavias]
Agree with Stephen. One point here is to ensure there’s enough manual capacity to be able to sign off releases. Otherwise you will have to pull the automation engineers away from their job to do manual testing and they will never get enough done. It happens all too often.

[alg]
Thanks all, that makes sense. In my head, my current scenario is a team developing feature “b” where there is already a bit of automation around that area. The team then decides what targeted manual regression to do on top of testing the change itself.
The automation of tests is then targeted to fill the gap between the two areas, and maybe the extra manual regression is minimised to a few essential tests rather than the entire suite.
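
As a side note from me: one way to make that kind of targeted regression concrete is to tag tests by how critical they are, so a small “essential” subset runs on every change while the wider regression pack runs separately. The sketch below is purely illustrative and not from the thread; it assumes pytest, and the markers, feature, and test names are all made up.

```python
# Illustrative sketch only, assuming pytest; all names here are hypothetical.
import pytest


def apply_discount(total, code):
    # Stand-in for the real production code under test.
    return round(total * 0.9, 2) if code == "SPRING10" else total


@pytest.mark.essential      # hypothetical marker: the small in-sprint subset
def test_valid_discount_code_reduces_total():
    assert apply_discount(100.00, "SPRING10") == 90.00


@pytest.mark.regression     # hypothetical marker: the wider regression pack
def test_unknown_discount_code_leaves_total_unchanged():
    assert apply_discount(100.00, "BOGUS") == 100.00

# Targeted in-sprint run:  pytest -m essential
# Full regression run:     pytest -m "essential or regression"
# (In a real project the markers would be registered in pytest.ini.)
```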

[coolcat]
A better plan than a dedicated automation engineer might be to get developers to create their own automated tests as they develop features, and make that part of their definition of done.

[coolcat]
That way they will have a greater sense of ownership over them and have more of an incentive to make them scalable and flexible for the future

[Stephen Jones]
You must have a great deal of trust in your developers

[coolcat]
In my experience when an “automation engineer” works on the app separately you get problems like:

  • The development team changes the app in a way that breaks existing automation, so the engineer has to go back and fix things
  • During this process the automation will show a “fail”, but that fail does not represent any actual user-facing problem in the app
  • As a result, people place less faith in any automation that has been produced
  • To mitigate the above, automated tests get written with very lightweight assertions, with the objective of having them always pass, meaning they are less likely to catch any actual bugs (see the sketch after this list). Often the engineers need to sit and watch them run and look out for problems by eye, rather than having them run in a CI suite or similar
  • A single automation engineer is unlikely to have their automation code subject to the same quality controls as the production code, e.g. peer review. One of the reasons for testing is that we accept that developers will make mistakes when writing their production code; I promise you the same applies to automation testing code
  • As was alluded to earlier, manual testing then remains king in the eyes of the team and management, and the automation engineers will very quickly get pulled off their task to help with it
  • All of the above adds up to a lot of effort being put into writing and maintaining automated tests, but very little actual value being gained from it
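
To illustrate the “lightweight assertions” point with a made-up example of mine (not from the thread, again assuming pytest): the first test below passes as long as anything at all comes back, so it will almost never catch a regression, while the second pins down behaviour a user would actually notice.

```python
# Illustrative sketch only; the function and the values are hypothetical.


def load_order_page(order_id):
    # Stand-in for driving the real app (e.g. via an API call or a UI driver).
    return {"status": 200, "order_id": order_id, "total": "£42.00", "items": 3}


def test_order_page_loads():
    # "Lightweight" assertion: passes for almost any response,
    # so it rarely fails and rarely catches a real bug.
    page = load_order_page(1234)
    assert page is not None


def test_order_page_shows_correct_total_and_item_count():
    # Meaningful assertions: pinned to behaviour a user would notice.
    page = load_order_page(1234)
    assert page["status"] == 200
    assert page["total"] == "£42.00"
    assert page["items"] == 3
```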

[Dimitris Karavias]
Yes, I’d never advocate for a solo automation engineer or even a separate team, but there needs to be someone owning and coordinating the effort and coaching the dev team

[Joe Triccas]
Or maybe have devs update the automation as part of their change?

[alan_parkinson]
I agree with Joe; I always prefer having devs own the automation. The automation code should be treated as production code and written to the same standards. I do recommend having a tester pair with the developers when writing the automation

[alg]
I think the pairing on automation tests fits better for me. It’s something I’d like to understand more, and it also fits with a whole-team approach to testing and quality

To back up what coolcat was saying earlier in the thread about developers taking ownership, I want to reference “Accelerate: The Science of Lean Software and DevOps”. Accelerate is a cross-sectional study of the data from four years’ worth of State of DevOps report surveys (2014-2017). It has academic rigour and has been peer reviewed; it’s rare to get research of this quality in our industry.
The quote: “It’s interesting to note that having automated tests primarily created and maintained either by QA or an out-sourced party is not correlated with IT performance”.
They go on to say that “None of this means that we should be getting rid of testers. Testers serve an essential role in the software delivery lifecycle”.