We’re continuing our work to create an automation-focused curriculum shaped by feedback from the testing community. We’ve already run a series of activities that helped us identify a list of key tasks, which in turn helped us create this Job Profile.
We’re now going through each task and analysing it to identify the core steps we take to achieve it. For this post we’re considering the task:
Maintain a Framework
We’ve run a series of community activities, including social questions and our curriculum review sessions, to identify the steps we need to take to successfully achieve this task, along with the situations that trigger it. Those triggers are listed below:
- Code repetition
- New major library/framework version
- New minor library/framework version
- Moving to a completely new tool/framework
- Changing the way we create test data
- New major programming language version
- Design pattern change for increased maintainability
- Maintainability review
- Adding new capabilities
- Utilizing new features of a framework/library
- Fixing flakiness issues
- Seeking execution speed increases
- New capabilities needed for planned new tests
- Breaking changes from a 3rd party library/framework
- Moving from tool A to tool B
- Deprecated framework
- New team members bringing fresh perspectives and experiences
- Change of business requirements
- Domain language change
- Updating environment variables
- Insufficient/broken test data
- Adding a new test to verify a bug fix
- Access to the database (if one is used), and reducing test time using scripts
- Creating and clearing test data
- Processing test run results
- Motivation so that the whole team can contribute towards test automation: developers as well as Automation Engineers
- Lack of documentation
- Lack of readability
- Performance improvements
- Tests without a clear knowledge of business rules
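Several of the triggers above, such as code repetition and design pattern changes for maintainability, come down to restructuring shared test code. As a purely illustrative sketch (the names `LoginPage` and `FakeDriver` are hypothetical, not from any real framework), repeated login steps might be extracted into a page object so there is a single place to change when the login flow changes:

```python
class FakeDriver:
    """Stand-in for a real browser driver; simply records actions."""

    def __init__(self):
        self.actions = []

    def fill(self, field, value):
        self.actions.append((field, value))

    def click(self, button):
        self.actions.append(("click", button))


class LoginPage:
    """Page object: the login flow lives in one place instead of being
    copy-pasted into every test that needs a logged-in user."""

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill("username", user)
        self.driver.fill("password", password)
        self.driver.click("submit")


def test_dashboard_loads():
    driver = FakeDriver()
    LoginPage(driver).login("alice", "s3cret")
    # Assertions about the dashboard itself would follow here.
    assert ("click", "submit") in driver.actions
```

The same idea applies to any repeated setup: if a selector or flow changes, a page object means one edit rather than dozens.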
High-level common steps:
1. Assess the risk of making such a change
2. Discuss the change with other stakeholders
3. Plan the change
4. Implement the change
5. Test the change
6. Test to ensure all existing tests and framework functionality still work
7. Commit the change
8. Update documentation
9. Inform the team of the changes and new capabilities
10. Monitor whether the change has had a positive impact
11. Collect stats if fixing flakiness or performance issues
12. Get feedback from stakeholders
13. Repeat from step 3 as needed
14. Conduct a proof-of-concept
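The step about collecting stats when fixing flakiness can be as simple as rerunning a test many times and comparing the pass rate before and after the fix. A minimal sketch (the helper `rerun_stats` and the deliberately flaky test are hypothetical, for illustration only):

```python
import random


def rerun_stats(test_fn, runs=100):
    """Rerun a test repeatedly and return its pass rate, a simple
    stat to compare before and after a flakiness fix."""
    passes = 0
    for _ in range(runs):
        try:
            test_fn()
            passes += 1
        except AssertionError:
            pass
    return passes / runs


def flaky_test():
    # Illustrative flaky test: fails roughly 20% of the time.
    assert random.random() > 0.2


rate = rerun_stats(flaky_test, runs=200)
print(f"pass rate: {rate:.0%}")
```

A pass rate well below 100% before the fix and at (or very near) 100% afterwards gives the team concrete evidence that the change had a positive impact, rather than a gut feeling.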
What we’d like to know is: what do you think of these steps?
Have we missed anything?
Is there anything in this list that doesn’t make sense?
What do you do when an automated test fails?