How do you incorporate automation into planning?

Hi there.

We’ve recently undergone a bit of an Agile transformation, upping our agile game! Part of this involved a lot more planning and preparing of tickets before bringing them into a sprint.

We’ve also improved our QA scenario planning. We now have a separate Testing section on all Jira tickets with bullet-pointed test scenarios that the whole team is aware of (helping the whole team test better). However, these still tend to focus on the manual tests that QA will do.

The next step we are busy trialing is putting these test scenarios into a table format (rough example below), with columns for:

  • Automated (new test, update existing test or no automation)
  • Test Type (unit, UI or manual)
  • Who will do the testing (dev or QA)
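
For illustration, a row in the table might look something like this (the scenarios are made up):

| Test scenario | Automated? | Test type | Who tests |
|---|---|---|---|
| Login fails with an invalid password | New test | UI | QA |
| Discount applied for orders over £50 | Update existing test | Unit | Dev |
| Error message wording reads well | No automation | Manual | QA |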

We’re still working out the kinks but so far it seems to be helping.

I was wondering:

  1. How does everyone else go about planning automation and manual testing?
  2. How do you decide if something should be automated or done manually?

Thanks, Robbie!

4 Likes

Hi Robbie,

We are kind of going through the same right now… So here are some thoughts 😉

One metric is the ROI: how many times can I execute this test manually before the investment of automating it pays off?
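
As a rough, back-of-the-envelope sketch of that calculation (the numbers are made up):

```python
# Automation pays off once the manual effort saved outweighs the cost of
# writing and maintaining the automated test.
build_cost_hours = 4.0        # one-off effort to automate the scenario
manual_run_hours = 0.25       # ~15 minutes per manual execution
upkeep_per_run_hours = 0.02   # maintenance effort amortised per run

break_even_runs = build_cost_hours / (manual_run_hours - upkeep_per_run_hours)
print(f"Pays off after roughly {break_even_runs:.0f} runs")  # ~17 runs
```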

I would also look at the business value of the features you are testing. Automate first where there is a high business value.

Change impact - automate first for features of the application which change often or are often impacted by changes. This is, however, a double-edged sword, because it could also mean that your test has to change quite often 😉

Also, I would keep in mind that some tests are not meant to be automated. For instance, if you are interested in testing the user experience, you would want to test that manually.

4 Likes

Thanks @samuelm. Appreciate the insights!

1 Like

I think lots of folks start off with agile this way, still relying heavily upon written documentation. As your team works together more and iterates, hopefully you figure out how to improve and streamline this, finding out which parts you really need and which parts are cruft.

While I don’t think I can pinpoint a particular thing that got us to a much lighter process, I’d say that for my team, our stories are super light. They’re usually just a few sentences, basically covering the hard requirements.

During our grooming, people tend to ask about edge cases, questions they might have about how to implement, etc., which might add a sentence or two before we point. We actually don’t measure velocity, but going through grooming/pointing forces us to have these discussions and get on the same page.

As for testing, nothing is called out explicitly in the stories, but we’ve built a solid quality culture - unit tests are a given, and people think about whether integration or other tests higher up the pyramid are needed, both as they write the code and as they review it.

My role on the team is technically an SDET, but lately, the vast majority of my time has been spent reviewing code or nudging/mentoring our newer team members to have quality mindsets.

For a team that’s new to agile and is just starting to adopt automation, my general advice is:

  • have stories to automate a few high level end to end tests
  • make sure all new work has good unit tests
  • strategize how to add integration tests - this can be a challenge since there’s a lot of tooling that might be needed here, whether it’s figuring out containerization, test runners, CI integrations, what and how to stub/mock dependencies, etc. (see the sketch below)
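
On the stubbing/mocking point, here is a minimal sketch (pytest-style, with invented component names - not any particular team’s setup) of how a dependency can be stubbed so an integration-style test never touches the real external service:

```python
from unittest.mock import Mock


class PriceService:
    """Tiny invented component that depends on an external rates client."""

    def __init__(self, rates_client):
        self.rates_client = rates_client  # injected, so tests can stub it

    def price_in(self, currency, amount_usd):
        rate = self.rates_client.get_rate("USD", currency)
        return round(amount_usd * rate, 2)


def test_price_conversion_uses_stubbed_rate():
    stub_client = Mock()
    stub_client.get_rate.return_value = 0.8  # canned answer, no network call

    service = PriceService(rates_client=stub_client)

    assert service.price_in("EUR", 10.0) == 8.0
    stub_client.get_rate.assert_called_once_with("USD", "EUR")
```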

I generally don’t worry too much about the legacy code, and treat this similar to a strangler fig approach, where the older code will get tests as it gets updated. One clear sign that you’ve built a quality culture on the team will be that when folks have to touch older code and they realize there are no tests, they’ll likely spend some time refactoring and making the code more testable so they have a safety net while they do their new work.

2 Likes

Push some of the work of automating integration, system, and end-to-end tests into planning and into the sprints by creating test-automation tasks, and get the “coders” to do some of those tickets while the “testers” help and do most of the automation.

As @samuelm points out, you won’t know what is worth automating up front, so forcing a certain “bar” to be met in automation needs to be done very softly; you tend to only know which automations are low cost later on. By pushing some automation back into the “coders’” lap, you will be slowing them down and speeding yourself up in two ways. First, they will be doing some of your work and experiencing some of your automation pain, and can maybe help you write those automated scripts (or whatever you automate with) better by giving you some coding and project-organization tips. Secondly, the coders will start to feel the pain of hard-to-automate components and start building “testability” into the product. Much like those little silver areas you used to see on circuit boards in the old days - those are called “test points”, and they were built into the system to make it easier to factory-check and repair the electronics.
Code has these too; we just don’t use them properly or often enough in programming.
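
To make the analogy concrete, here’s a small invented example of a test point in code: the clock is injected rather than hard-coded, which gives a test a seam to control time, much like a probe pad gives the factory somewhere to attach a meter.

```python
from datetime import datetime, timezone


class SessionManager:
    # The injectable clock is the "test point": production uses real time,
    # tests can pass in a frozen clock.
    def __init__(self, now_fn=lambda: datetime.now(timezone.utc)):
        self._now = now_fn

    def is_expired(self, started_at, max_age_seconds=3600):
        return (self._now() - started_at).total_seconds() > max_age_seconds


def test_session_expires_after_an_hour():
    started = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
    frozen_now = lambda: datetime(2024, 1, 1, 13, 30, tzinfo=timezone.utc)
    assert SessionManager(now_fn=frozen_now).is_expired(started)
```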

Automation often lags product code changes, so I prefer to only write “smoke-level” automated tests while the defect-injection rate is still high, allowing me to focus on uncovering defects through exploration while having the smoke tests run on every build for sanity. Realistically, I “plan” to write half of my automation after we get to the first release candidate.
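
If your suite happens to be pytest-based (an assumption - the same idea works in most frameworks), one way to keep that smoke set runnable on every build is a marker:

```python
import pytest

# Register the markers in pytest.ini/pyproject.toml to avoid warnings.


@pytest.mark.smoke
def test_homepage_is_reachable():
    ...  # fast, shallow sanity check, runs on every build


@pytest.mark.regression
def test_discount_rules_for_bulk_orders():
    ...  # deeper check, run less often


# On each build, CI runs only the fast checks:  pytest -m smoke
```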

3 Likes
  2. How do you decide if something should be automated or done manually?

If you know what you want to verify in your software (e.g. an acceptance criterion), then you can use computers to perform this check. If you want to discover how your system behaves, then you use exploratory techniques.

  1. How does everyone else go about planning automation and manual testing?

Following on from the previous answer, you can use the computer-made checks to drive the development of your product - so it becomes a no-brainer, alongside-development activity, done incrementally in the smallest steps, like building itself. For exploratory activities, you can use risk-based testing to uncover scenarios and interesting tests - James Bach talks about it here and Elisabeth Hendrickson talks about it here.
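
A trivial sketch of the first half of that idea (the domain and names are invented): an acceptance criterion like “a new user starts with an empty basket” is something a computer can re-check on every change, whereas “the checkout flow feels confusing” stays in exploratory territory.

```python
class Basket:
    """Minimal invented domain object for the example."""

    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)


def test_new_user_starts_with_an_empty_basket():
    # The acceptance criterion expressed as a check a machine can repeat.
    assert Basket().items == []
```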

1 Like

Here are some steps for incorporating automation into planning:

Analysis and planning: This includes important discussions around understanding what to automate, organizing resources, preparing the team for task distribution, setting up a budget, and more.

Identifying tools and technology: Once the test plan is reviewed and approved, the team members responsible for the design and implementation of the framework should start identifying the right tools and technology for the testing project.

Selection of framework: The test automation framework plays a vital role in the success of any automation project. It serves as a baseline, making it easier to create and maintain automated tests.

Defining the scope of tests: In this phase, it is important to organize the test cases into different groups like smoke and regression.

Implementing the test cases: Organise the test cases into different packages or modules, making sure there is proper logging and documentation for each script so that it’s easy to maintain.
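
For example (file name and scenario invented), each test module can carry its own logger and a short docstring per test, so it is clear what each script covers:

```python
# tests/regression/test_billing.py
import logging

logger = logging.getLogger(__name__)


def test_invoice_total_includes_tax():
    """Regression check: invoice totals must include 25% tax."""
    subtotal, tax_rate = 100.0, 0.25
    total = subtotal * (1 + tax_rate)
    logger.info("computed invoice total=%s", total)
    assert total == 125.0
```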

Reviewing: It is important to review test cases to make sure that they cover all the functionalities. Once the review is done, feedback can be incorporated.

Maintenance: Test maintenance is an effort where all the automated tests are updated regularly to accommodate the functional and UI changes.