Revisited: How to Get Automation Included in Your Definition of Done

For our second TestBash Home talk, we’re going back in time to revisit one of the first conference talks delivered by the very talented @angiejones, "How to Get Automation Included in Your Definition of Done".

I’m looking forward to seeing what has and hasn’t changed since Angie first gave the talk!

We’ll be adding all unanswered questions from the talk here, so if we didn’t get to your question, don’t worry, we will find you an answer :wink:

Unanswered Questions:

  • Adrian Gersak: How can we have automation as part of the sprint when the automation engineer is not part of the "team" and we work with external providers or a separate "automation team" that serves multiple Scrum teams?

  • Jamie: Which is the better approach: home-grown automation tools (open source) or a paid tool?

  • Mary Gomes: Could you fully automate a load test?

  • Sam: Have you succeeded in getting automation into DoD in a mixed or Gated Agile project - where automation has a specific COST?

  • Varun: Should you prioritise unit test automation over in-sprint automation? Which should you target first?

  • Ana Milutinovic: Can you please specify the type of automation? Unit, Integration, UI…? Because I think that the approaches are different when we specify that.

  • Melissa Fisher: When you move to automating as you go, you may still have legacy areas of the product to automate. How do you solve this without going back to a siloed approach?

  • Elena Ivan: Time is still a big constraint on adding automation in the same sprint, in the same ticket as the feature. In my example, you would add 1-2 extra days, and some colleagues might say this effort is not needed. How would you overcome this? (Talking about UI tests.)

  • Roman Segador: Do you think removing the difference between developer and automation engineer, merging them into the same role, helps achieve automation in the sprint? Do you see any drawbacks to this approach?

  • Emna Ayadi: What if the application isn’t visually based (e.g. automotive)?

  • Tiffany: How do you know which tests are required and which ones are extraneous?

  • Emna Ayadi: How do you define a checklist to tell that your tests are ready for execution?

  • Lilit: Do you recommend estimating automation effort and including it in user story points to make sure it’s done during the current sprint?

  • Christer Nilsson Ribbing: Do you recommend creating automation for all the different ways you can, for example, add to cart, or is one enough?

  • TJ Newell: How would you recommend one-person QA teams embed themselves in the teams during sprints?

  • Ludmilla Chellemben: Should we not wait for a feature to be stable before we automate? Take the button example: what if we decide to get rid of that button? Won’t it be a waste of time and effort?

  • Laura Kopp: Did I miss something, or did the two Build Incrementally examples contradict each other? Don’t build a framework for future items, but do build tests for future items?

  • Shailin Sheth: What if development teams are divided and the testing team is a SWAT team that needs to support multiple teams at a time?

  • Neil Matillano: On finding the selectors to use in your automation: would the automation engineer just try their best to find the most unique (most stable) one, or would it be better to ask the developers to change the DOM (class names, for example)?

  • Kim Garcia: If nothing has changed since you first gave that talk, what do you wish could or would change, and how would you push that change?

  • Nathan Owen: How would you work to get automation included when working in a waterfall project vs an agile approach?

  • Neha Jain: In an Agile team there are fewer automation QA engineers than developers, so if we include automation in the DoD, how can it be achieved?

  • Carin: You mention not to quantify test coverage. Agreed. But what do you say to people who ask for the quantified percentage of automation coverage? :sweat_smile:

  • Deb Collins: How will this strategy change when sprint cycles are short, like 3 weeks?

  • Christer Nilsson Ribbing: Why is automation such a hype?

  • Chris Hili: What if there is a different code base for the automation framework?

  • Tea Pot: Is there really anything called Manual testing?

  • Mike Ruttenberg: Re-asking (tech glitch at MOT): In your Definition of Done, how do you deal with devs spending most of their time on development and almost no time on testing at the end of the sprint, and then you want automation added on top? How do you deal with overloading and overshooting the sprint deadline? It would work for Lean, but not for Scrum. How do you suggest this is dealt with?

  • Varun: Should manual testers be writing in BDD?

  • Tawakalt: What level of programming do I need to become an automation engineer?

  • Rhys: What coding language is recommended to use whilst learning automation?

  • Simon: What’s a good tool to test services for functional and load testing?

  • Atiqah: Can you suggest the best way to plan when to start automating a feature? Currently our strategy is to test manually first, then select which tests to automate, but sometimes that means we can’t finish in that sprint and can only revisit it later due to other priorities on hand.

  • Karlo Smid: Have you used any AI-powered automation tools?

  • Christer Nilsson Ribbing: Don’t you think Vernon and Angie are a great couple?

  • Sanjay: Automation code is essentially code. How do you apply the same level of quality control to your test code as to your application code, and get buy-in and engagement from your development team?


Our group is very behind and we are still manually testing with scripts in Excel. What would you recommend to start building automation with what we have to work with? Are there any tips, resources, etc.?


The main premise for getting automation into the DoD (Definition of Done) is that only the most important scenarios are automated.

Presuming the other "test conditions" for each feature/user story are validated manually, it’s safe to assume that the automated regression campaign is made up of those selected few scenarios for each feature.

Question 1: Doesn’t that lead to a false sense of security about the automation campaign, given that regression bugs can hide in the non-automated parts?

Question 2: With this "mode of operation", can one know how much of the entire product the automation campaign covers? If so, how do you measure and maintain that number? Do you need to know it in order to decide whether you need larger coverage or not?

Hi there,
thanks for the talk, it was very insightful and touched on some of the problems I encounter in my daily work. Thanks for giving some new approaches!
The question from the previous poster touches on a problem I also have.
We follow that approach in my company: per feature, the teams discuss where the automation should go and at which level. So our DoD actually states what Angie proposed: automation on all levels, depending on the feature.
But how do we assess now test coverage?
We use line coverage for the unit tests; for everything else we manually assess our confidence in the tests per feature, per sprint. So at the end of the sprint we say either yes, we are confident in our test automation, or no, we aren’t.
Is this already enough? Or do you have other tools or ideas on how to measure that?

Thank you!


Awesome talk by Angie Jones! My team has one more question that I don’t see above: For a project that was releasing frequently in waterfall mode and is now moving to agile methods: the regression suite is 60% automated, but the automation testers are still working on the remaining regression automation, and I do not have the option of hiring additional contractors or interns. Should I assign one of them part-time to automating in sprint, or just wait until the regression backlog is complete before trying in-sprint automation? It looks like that will take at least another month.

Hey, not sure if someone already answered your question. I have discussed this with my team at my company as well: which is more important, ensuring that new features work or that old ones already in production still work?
We decided that the first priority is to cover new features and add them to the automation suite. When there is less to do, we take on the "old" features, check their coverage, and add tests to the regression framework. We always start with the most critical processes from a business perspective.
In addition, we check which of the cases need to be in the regression framework and at which level.
I hope that helps! I would be curious what you decided to do!


Thank you for responding! I think we will follow your example. New features are definitely a priority, and we will attack the backlog gradually. My company is also in the middle of onboarding Tosca, and we are hoping it will speed up the process for us.


Hi, if the automation is used for user-interface testing, you should have full knowledge of the product and have your test cases ready. To include automation in testing, the very first step would be to cover the basic and main functionalities. After that, we can automate all of the smoke and then the sanity test cases. Once you have confidence that the build is stable enough, you can add the priority and functionality-specific test cases from the manual team to automation, sprint by sprint.
Only once the priority test cases from the regression suite and all the functionalities are covered should we consider automating the rest of the test cases.
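In tooling terms, that phased rollout boils down to tagging tests by tier and widening the selection as confidence grows. Here is a minimal Python sketch of such a tier-based suite selector; the test names and tiers are purely illustrative, not taken from this thread (most frameworks offer this natively, e.g. via tags or markers):

```python
# Toy sketch: pick which automated checks to run at each phase of the
# rollout described above. Test names and tiers are illustrative only.
TESTS = {
    "login_happy_path": {"smoke"},
    "add_to_cart": {"smoke", "sanity"},
    "checkout_with_coupon": {"sanity"},
    "bulk_order_discount": {"regression"},
}

# Phases are cumulative: as the build proves stable, widen the net.
PHASES = ["smoke", "sanity", "regression"]

def select(phase):
    """Return tests tagged with any tier up to and including `phase`."""
    allowed = set(PHASES[: PHASES.index(phase) + 1])
    return sorted(name for name, tags in TESTS.items() if tags & allowed)

print(select("smoke"))       # → ['add_to_cart', 'login_happy_path']
print(select("regression"))  # all four tests, in name order
```

A CI pipeline can then run the "smoke" subset on every commit and reserve the full regression set for nightly or pre-release runs, which mirrors the sprint-by-sprint widening described above.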