What are "bad automation habits"?

In module 17 of the STEC (MoT Software Testing Essentials Certificate | Ministry of Testing), @mirza explains 'What is test automation'. In it, he asks a great question.

What are “bad automation habits”?

He gives a few examples:

  • Trying to verify several responses in one check
  • Tests depending on one another to run
  • Hardcoding parameter values
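Two of those habits can be shown side by side in a minimal pytest sketch. The `discount_price` function below is hypothetical, just a stand-in for whatever is under test; the point is the shape of the tests, which applies in any language or framework.

```python
# Hypothetical example: hardcoded values and bundled checks vs. parametrised,
# independent cases. discount_price is a toy stand-in for the code under test.
import pytest

def discount_price(price, rate):
    """Apply a discount rate to a price and round to 2 decimal places."""
    return round(price * (1 - rate), 2)

# Bad habit: one test with hardcoded values and several checks bundled
# together -- the first failing assert hides everything after it.
def test_discounts_bundled():
    assert discount_price(100, 0.10) == 90.0
    assert discount_price(50, 0.50) == 25.0
    assert discount_price(80, 0.00) == 80.0

# Better: parametrised data, one check per case, so each case runs,
# passes, or fails independently and shows up separately in the report.
@pytest.mark.parametrize("price,rate,expected", [
    (100, 0.10, 90.0),
    (50, 0.50, 25.0),
    (80, 0.00, 80.0),
])
def test_discount(price, rate, expected):
    assert discount_price(price, rate) == expected
```

The parametrised version also removes the hardcoding: new cases are one line of data, not a new copy of the test.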

So community, what bad habits have you seen, or have you overcome?
What are the most common points of debate on ways to do things? E.g. spaces vs tabs for indentation.

Share your best (horror) stories and help those newer to testing learn from you.

5 Likes

Without a doubt, I would add a few:

  • Jumping into automation without a clear strategy and exit criteria.
  • Not doing adequate research on which tools we should or should not use.
  • Jumping in with a "let's just automate everything" mindset and dropping "manual testing", because who needs it now?! :grinning_face:
  • Trying to make tests "green" rather than adding business value.
  • Not having a proper code coverage mechanism in place.
  • Not adding a proper "test reporting" library.

How we overcame, and could potentially mitigate, these risks as a team:

  • Planning, meeting, and gathering ideas on a Miro board.
  • Getting the team on the same page very early on.
  • Assigning duties to an owner.
  • Planning "action items".
  • Monitoring & reporting on those items.

8 Likes

Yikes, so true. And I feel completely seen having done this during my early days of leading automation efforts. :see_no_evil_monkey:

4 Likes

I’d just add this: adding more tests just for the sake of having more tests!

3 Likes

What are “bad automation habits”?
Trying to automate everything with no clarity on the product and its stability.

Underestimating the effort required for maintenance.

Automating only UI tests, which have failed multiple times due to locator changes and long execution times.

Ignoring the failures and not checking how effective the automation results are.

The automation pack becomes bigger as the project grows: new tests are added, but the previous ones are not maintained, due to time constraints or the rush to add more tests.

To overcome these:

A session explaining why 100% automation is not possible, and the maintenance effort required.

Starting API automation alongside the UI automation.

Revisiting failed tests after each release, and creating tasks for fixing those failures and maintaining the previous tests.

Regular communication between the manual and automation teams; as they work closely, this avoids duplication of effort.

1 Like

I appreciate I might not be aware: is this a course or certification syllabus we are establishing? Where can we read it? :slightly_smiling_face:

1 Like

I believe everyone early in their career is guilty of that :see_no_evil_monkey: How much we love seeing a 'green' pipeline! :laughing: Not because we did not want to do it properly, but because we were amateurs! :heart: Thanks to incredible communities like this one, everyone can benefit and learn a lot! :smiling_face_with_three_hearts: :ringed_planet:

1 Like

Hey @aimantirmizi it’s available here: MoT Software Testing Essentials Certificate | Ministry of Testing, just to note it’s paid content.

1 Like

Where to start with bad habits (from experience)

  • “AutoTest Everything” - asked by my manager
  • AutoTest the simple things - a good cop-out, just to say you have 'something'
  • Not using the “DRY” approach
  • Deleting ‘failing’ tests - to give a better report
  • Not ensuring Single responsibility for methods
  • So overly complex that only the author understands it (who tests the tests?)
  • Constantly changing design patterns and automation software (low ROI)
  • Only running when you 'think' you need to - get them into the CI/CD pipeline
  • No plan of attack - for reuse and low maintenance (POM for example)
  • Relying solely on automation - it is not the 'silver bullet'
  • AutoTest just for the sake of it - You need to define a ‘gain’ and ‘purpose’
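The "plan of attack" items above (DRY, single responsibility, POM) can be sketched roughly like this. Everything here is hypothetical: `LoginPage`, the locators, and `FakeDriver` are stand-ins so the sketch runs without a browser; real code would sit on top of a driver library such as Selenium's WebDriver.

```python
# Minimal Page Object Model (POM) sketch with invented names.
class FakeDriver:
    """Tiny stand-in driver so the sketch is runnable without a browser."""
    def __init__(self):
        self.fields = {}   # locator -> text typed into it
        self.clicked = []  # locators clicked, in order
    def type(self, locator, text):
        self.fields[locator] = text
    def click(self, locator):
        self.clicked.append(locator)

class LoginPage:
    """One page, one class: locators live here, not in the tests (DRY),
    and each method does a single job (single responsibility)."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def enter_username(self, name):
        self.driver.type(self.USERNAME, name)

    def enter_password(self, password):
        self.driver.type(self.PASSWORD, password)

    def submit(self):
        self.driver.click(self.SUBMIT)

    def log_in(self, name, password):
        # Composed flow: tests call this one method instead of
        # repeating the three steps (and the locators) everywhere.
        self.enter_username(name)
        self.enter_password(password)
        self.submit()

driver = FakeDriver()
page = LoginPage(driver)
page.log_in("alice", "s3cret")
```

The maintenance payoff is that when a locator changes, you fix one class attribute rather than hunting through every test that touched that field.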

I will leave it at that, but I also ask: "does the test serve any purpose, and if the area under test did fail, would it be an actual problem?" Stick to the critical path and high-risk areas.

1 Like

Automating without testing principles in place.

Just listening to a discussion between Hilary Weaver @g33klady and Suman Bala @sumanbala during Module 17 of STEC.

Hilary has this to share:

Or the one that gets me every time is when a full-stack software engineer, really good at their job, tries to tackle test automation. And then I look at their code and they're like, sorry.

Because they think, oh, it's just code, I can just do it. But you also have to apply testing principles. So that's a myth I see a lot with software engineers: without the testing training, they think they can just automate, you know, whatever.

2 Likes

We had one framework we inherited that had 300 tests in it. They were 5 years old and kept alive. They'd found 2 defects in those 5 years on what was a core product. When I looked into the framework, the tests were running nothing like they would in production. But the fact that there were 300 tests gave those outside QA comfort.
So I dropped the framework and got the team to start building new tests with the mantra "I'd rather have 1 valuable manual test than 300 non-valuable automated tests". That was still difficult to communicate, as some had bought into quantity = quality. In historic conversations like "Are you comfortable it's been tested?", the answer "Well, we've executed 300 automated tests and they've all passed" sounds a lot better to those outside QA than "Well, we've executed 10 valuable manual tests and they've all passed"… but that was the journey we had to start.

5 Likes

Trying to automate everything.

4 Likes

Hiring one contractor to build out a test framework that no one else on the entire engineering team knew how to run and maintain so it got abandoned.

Yep, it was me who hired that contractor hoping it would solve all our automation problems. :see_no_evil_monkey:

Hard lesson learned.

1 Like

Writing automation scripts directly from requirements or assumptions, without actually exploring or testing the application manually first.

1 Like