How do you "Maintain a Framework" - Automation in Testing Curriculum

Hi all,

We’re continuing our work to create an automation curriculum shaped by feedback from the testing community. We’ve already run a series of activities that helped us identify a list of key tasks, which we used to create this Job Profile.

We’re now going through each task in turn, analysing it to identify the core steps we take to achieve it. For this post we’re considering the task:

Maintain a Framework

We’ve run a series of community activities, including social questions and our curriculum review sessions, to identify the steps we need to take to successfully achieve this task, and we’ve listed them below.

Triggers:

  • Code repetition (see the sketch after this list)
  • New major library/framework version
  • New minor library/framework version
  • Moving to a completely new tool/framework
  • Changing the way we create test data
  • New major programming language version
  • Design pattern change for increased maintainability
  • Maintainability review
  • Adding new capabilities
  • Utilizing new features of a framework/library
  • Fixing flakiness issues
  • Seeking execution speed increases
  • New capabilities needed for planned new tests
  • Breaking changes from a 3rd party library/framework
  • Moving from tool A to tool B
  • Deprecated framework
  • New team members bringing fresh perspectives and experience
  • Change of business requirements
  • Domain language change
  • Updating environment variables
  • Insufficient/broken test data
  • Adding a new test to verify a bug fix
  • Gaining direct database access (if one is used) to decrease test run time via scripts
  • Test data creation/clean-up
  • Processing test run results
  • Motivating the whole team, developers as well as automation engineers, to contribute to test automation
  • Lack of documentation
  • Lack of readability
  • Performance improvements
  • Tests written without clear knowledge of the business rules
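
Several of these triggers, such as code repetition and flakiness fixes, come down to small refactorings inside the framework itself. As a minimal sketch, assuming a Python + Selenium stack (the URLs, locators, and the `login` helper below are illustrative, not part of the curriculum), duplicated login steps can be pulled into one shared helper so a UI change is fixed in a single place:

```python
# A maintenance trigger: the same login steps copy-pasted across many tests.
# Extracting them into one helper means a UI change is fixed in one place.
# All locators and URLs here are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


def login(driver, username, password):
    """Shared helper: one place to update if the login page changes."""
    driver.get("https://example.test/login")
    driver.find_element(By.ID, "username").send_keys(username)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "submit").click()
    # Wait for a post-login landmark rather than sleeping: steadier than
    # a fixed time.sleep() and one less source of flakiness.
    WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.ID, "dashboard"))
    )


def test_view_orders():
    driver = webdriver.Chrome()
    try:
        # Previously several duplicated lines at the top of every test:
        login(driver, "standard_user", "s3cret")
        driver.find_element(By.LINK_TEXT, "Orders").click()
        assert "Orders" in driver.title
    finally:
        driver.quit()
```

The explicit wait at the end of the helper also touches the flakiness trigger: waiting for a landmark element is generally more stable than a fixed sleep.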

High-level common steps:

  1. Assess the risk of making the change
  2. Discuss the change with other stakeholders
  3. Plan the change
  4. Implement the change
  5. Test the change
  6. Test to ensure all existing tests and framework functionality still work
  7. Commit the change
  8. Update the documentation
  9. Inform the team of the changes and new capabilities
  10. Monitor whether the change has had a positive impact
  11. Collect stats if fixing flakiness or improving performance (see the sketch after this list)
  12. Get feedback from stakeholders
  13. Repeat from step 3
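
Step 11 is worth making concrete: without before/after numbers it is hard to show that a flakiness or performance fix actually helped. Here is a minimal sketch, assuming you have already pulled run results out of your CI into simple records (the field names and numbers below are made up for illustration; in practice they might be parsed from JUnit XML output):

```python
# Hypothetical before/after stats for a flakiness or performance fix.
from statistics import mean


def summarise(runs):
    """Pass rate and mean duration across a list of run records."""
    pass_rate = sum(r["passed"] for r in runs) / len(runs)
    return pass_rate, mean(r["duration_s"] for r in runs)


before = [
    {"passed": True, "duration_s": 41.2},
    {"passed": False, "duration_s": 44.0},
    {"passed": True, "duration_s": 39.8},
]
after = [
    {"passed": True, "duration_s": 30.1},
    {"passed": True, "duration_s": 29.4},
    {"passed": True, "duration_s": 31.0},
]

for label, runs in (("before", before), ("after", after)):
    rate, duration = summarise(runs)
    print(f"{label}: pass rate {rate:.0%}, mean duration {duration:.1f}s")
```

Comparing the two summaries over a reasonable number of runs gives you the "positive impact" evidence that step 10 asks for.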

Others:

  • Conduct a proof-of-concept

What we would like to know is: what do you think of these steps?
Have we missed anything?
Is there anything in this list that doesn’t make sense?

