I was wondering what the most interesting Quality Assurance metrics (top 5, probably) are to track, without taking test metrics into consideration. I know test metrics are part of QA metrics, but I'm thinking more from a process standpoint, one that considers not just testing but the scrum team as a whole. So, any suggestions on the most important QA metrics worth capturing?
What we measure depends on what we want to know, which in turn depends on the product and its context. Most metrics are dangerous, inaccurate, badly communicated, and misused or abused, so I’d say the default position should be to measure known quantities and make only very limited decisions based on them (e.g. the time taken to fully run a tool, if we’re trying to reduce the time to fully run a tool). There’s no top 5 because the concept itself is flawed. I could ask what the top 5 vehicles are… top 5 for what? Fuel efficiency? Cost? How high they can safely fly? Resistance to anti-tank rounds?
Things that are often counted, such as bugs, test cases, or person-hours, are a complete waste of time to track in aggregate, but they can yield more interesting results when categorised. For example, knowing there are “10 bugs” (i.e. 10 bug reports physically written down or typed out) is pointless, but knowing which areas of the product tend to generate a higher proportion of bug reports is less pointless.
Mostly agree with @kinofrost. There are some clear QA metrics to NOT use, such as bugs filed.
However, I think there are also some obvious things that can be tracked to improve the quality of whatever you’re testing. What gets measured gets improved, right?
So some things that I would measure:
Client happiness - this can be measured via a Net Promoter Score (NPS) survey
Number of severe bugs released to production and found by customers
Hot spots of bugs - I’ve found the 80/20 rule to be fairly accurate here. You may find that 80% of the bugs come from 20% of the components.