Morning all - Question mainly to those who do automation.
Do you find yourself having to demonstrate the value you're bringing, continued improvements, or other metrics?
How are you reporting on this?
No, I don't, and I am wondering why you should demonstrate that. Especially the 'you' part. Test automation is part of development. You are developing in a team. Who is asking for that demonstration?
the main goal of test automation is giving developers confidence they haven't messed up, so they can make big changes. the cost is development and maintenance. the only people who can make a quality judgement in this case are the devs.
there are other benefits with other metrics. who is asking and why?
Yes, when we first started implementing AT it seemed to be a concern with middle management. We were taking a Sr Developer away from writing features and moving them to the test team. Our small AT team was asked to hold weekly meetings showing the current test count, number of bugs found by AT, time saved during regression, etc. After about a year the meeting was canceled. We now have 2 teams (out of ~8) that use automation, as the value of having AT is catching on, albeit slowly. Coming back to your question, we would add a tag 'AT' to our work items and bugs so they could be analyzed by whoever was interested.
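If anyone wants to try the same, the tag-based analysis can be a small script over an exported list of work items. Just a sketch: the field names (`type`, `tags`, `severity`) are hypothetical and depend on what your tracker actually exports.

```python
from collections import Counter

def summarize_automation_finds(work_items):
    """Count bugs carrying the (hypothetical) 'AT' tag, grouped by severity."""
    at_bugs = [w for w in work_items
               if w["type"] == "bug" and "AT" in w.get("tags", [])]
    by_severity = Counter(b.get("severity", "unknown") for b in at_bugs)
    return {"total_at_bugs": len(at_bugs), "by_severity": dict(by_severity)}

# Made-up example export:
items = [
    {"type": "bug", "tags": ["AT"], "severity": "high"},
    {"type": "bug", "tags": [], "severity": "low"},
    {"type": "bug", "tags": ["AT", "regression"], "severity": "low"},
    {"type": "feature", "tags": ["AT"]},  # not a bug, so not counted
]
print(summarize_automation_finds(items))
# {'total_at_bugs': 2, 'by_severity': {'high': 1, 'low': 1}}
```

The nice part of the tagging approach is that whoever is interested can slice the data themselves without the AT team producing a report.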
Thanks @charlie_from_cny this response resonates with me so appreciate it.
It's exactly the thing I am considering here - e.g. showing we're developing the tests and adding new tests as the functionality grows, showing we're perhaps making them run more efficiently, or how much time we're spending on rework, etc.
Hi,
You're reading too much into the 'you' part there.
In the context I'm working in, the automation is running as part of the product team, yes, but more at the UI end (so using the Selenium framework).
From the perspective of management, I want to be able to demonstrate that the automation is improving continuously, either in terms of the number of tests (whilst I know this alone isn't useful, the assumption being that new tests are added as new functionality is added), or in efficiency (running tests faster).
If you don't think this is a useful thing to do then I would be interested in why. And in your example, is it just a case of you simply doing it, i.e. it's seen as part of the development process, so you don't necessarily feel you have to 'justify' it?
If you don't think this is a useful thing to do then I would be interested in why?
because you are doing that for management? improving the quality is important for the team, but management shouldn't care about these things. the team should be able to pick their own processes and tools (to a point), and if the team is confident and happy with the test automation that's happening, why would management step in there?
Unless the dev team is quite junior and you can't trust them?
you don't necessarily feel you have to 'justify' it?
no need to justify. as a dev team we have to deliver. and when we conclude that certain things are stopping us from delivering faster and safer, we might decide to invest more in test automation. without sufficient test automation there is no good CI/CD. delivery is something management should care more about. high-quality test automation often goes hand in hand with good DORA metrics
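for anyone curious how DORA-style numbers could be pulled from deploy records, here is a rough sketch. the data shape (pairs of commit time and deploy time as ISO strings) is an assumption, not any particular tool's export format:

```python
from datetime import datetime
from statistics import mean

def dora_summary(deploys):
    """deploys: list of (commit_time, deploy_time) ISO-8601 strings.
    Returns deployment frequency (per week) and mean lead time (hours)."""
    pairs = [(datetime.fromisoformat(c), datetime.fromisoformat(d))
             for c, d in deploys]
    # lead time for changes: commit -> running in production
    lead_hours = [(d - c).total_seconds() / 3600 for c, d in pairs]
    deploy_times = sorted(d for _, d in pairs)
    # span of the observation window, floored at one week to avoid div-by-zero
    weeks = max((deploy_times[-1] - deploy_times[0]).days / 7, 1)
    return {
        "deploys_per_week": round(len(pairs) / weeks, 2),
        "mean_lead_time_hours": round(mean(lead_hours), 1),
    }

deploys = [
    ("2024-03-01T09:00", "2024-03-01T15:00"),  # 6 h lead time
    ("2024-03-05T10:00", "2024-03-06T10:00"),  # 24 h
    ("2024-03-12T08:00", "2024-03-15T08:00"),  # 72 h
]
print(dora_summary(deploys))
```

the point is less the exact script and more that these numbers fall out of data the pipeline already has, so nobody on the team is spending time "justifying" anything.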
We needed to justify our test automation for a long time as a tester had wasted a lot of time on such a project in the past. Devs and PO were originally against it, but the test director required us to do it.
This meant we tried to demonstrate the usefulness of our automation work to reduce the backlash and help them understand why we thought it was so useful.
Helpful metrics (without too much effort) were the number of bugs uncovered through automation, especially highlighting severe ones found early in development that could otherwise have been missed. Our devs now really appreciate the quick feedback we can give them on their work through automation.
For highlighting overall effort and progress with a nice picture for reports, we have a graph measuring covered components, number of test cases, and level of coverage. Comparing snapshots of this graph from different points in time shows our progress.
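If it helps, the snapshot-and-compare approach can be automated with a few lines. This is an illustrative sketch with made-up field names: append each period's metrics to a JSON-lines file, then diff two snapshots for the report.

```python
import json
from datetime import date

def snapshot(path, components_covered, test_cases, coverage_pct):
    """Append today's metrics to a JSON-lines file (path/fields hypothetical)."""
    record = {"date": date.today().isoformat(),
              "components_covered": components_covered,
              "test_cases": test_cases,
              "coverage_pct": coverage_pct}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def progress(older, newer):
    """Delta between two snapshot records, for a 'then vs now' report."""
    return {k: newer[k] - older[k]
            for k in ("components_covered", "test_cases", "coverage_pct")}

print(progress(
    {"components_covered": 12, "test_cases": 180, "coverage_pct": 41.0},
    {"components_covered": 17, "test_cases": 240, "coverage_pct": 55.5},
))
# {'components_covered': 5, 'test_cases': 60, 'coverage_pct': 14.5}
```

Comparing deltas rather than absolute counts keeps the focus on the trend, which is what the graph in our reports shows anyway.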