Making results more visible to the team

So I want to make the results of my testing more visible to the team: Developers and the Product Owner.

We don’t have a test case management system, or manual tests for any work that isn’t automated, so, where I can, I put notes on Jira tickets.

Automation is often run either locally on my machine or in Jenkins, but it doesn’t currently produce fancy HTML reports or anything that persists for very long.
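
For illustration only (assuming a pytest-based suite, which may not match the actual stack): pytest’s built-in `--junitxml` option already writes a machine-readable results file that Jenkins can archive and trend, which is about the most basic form of persistence. The suite name and output path below are placeholders.

```python
import pytest

# Minimal sketch, not a real job definition: run a suite and write JUnit XML
# that Jenkins can archive and trend. "tests/smoke" and the output file name
# are placeholder names, not an existing setup.
exit_code = pytest.main([
    "tests/smoke",
    "--junitxml=smoke-results.xml",
])
raise SystemExit(int(exit_code))
```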

I am really interested to know if anyone else is facing the challenge of making the results of testing more visible, or, if you have solved this problem, what are you doing? How did you get there?


I’ve never reported on automated test cases. Their success in the pipeline should be enough, plus a culture where failing tests are fixed at once. The number of them is a bad metric that excites managers but proves very little.

In terms of exploratory testing, I’ve always made my reports public on Confluence: testing logs for each day. I divided the areas I tested with the help of the low-tech testing dashboard (Low-Tech Testing Dashboard - Satisfice, Inc.). This requires little effort on my end, and I’ve found that people liked the simplicity of it. My audience for this report was non-technical people with stakes in the project. They don’t give a F about the number of test cases; they just want to know blockers and areas of concern. Therefore, I gave them my confidence vote along with this dashboard.

My heuristic is: reports shouldn’t take you too much time; they’re meta-information at best and a waste of effort at worst.


I’ve been in that situation, and at that point we made a strong case for implementing Test Rail to store test cases and produce reports, which were highly configurable. It also plugged into our Jenkins, so test case results could be populated automatically.

Once we had the report formats sorted, it made it easier to make our testing visible.
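
For anyone curious what that plumbing can look like, here is a rough sketch (not our exact integration) of pushing a single automated result into TestRail from a build, using TestRail’s `add_result_for_case` REST endpoint. The instance URL, credentials, run/case IDs and the `report_result` helper are all placeholders.

```python
import requests

# Rough sketch only: attach one pass/fail result to an existing TestRail
# run/case via the v2 REST API. URL, user and key below are placeholders.
TESTRAIL_URL = "https://example.testrail.io"
USER = "tester@example.com"
API_KEY = "your-api-key"

def report_result(run_id: int, case_id: int, passed: bool, comment: str = "") -> None:
    """Post a result; TestRail's default status IDs: 1 = Passed, 5 = Failed."""
    response = requests.post(
        f"{TESTRAIL_URL}/index.php?/api/v2/add_result_for_case/{run_id}/{case_id}",
        json={"status_id": 1 if passed else 5, "comment": comment},
        auth=(USER, API_KEY),
    )
    response.raise_for_status()

# e.g. called from a post-build step once the automation finishes:
# report_result(run_id=42, case_id=101, passed=True, comment="Jenkins build #123")
```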


I have some follow-up questions for you, Ben.

  1. Who is the target audience for these reports?
  2. What are you hoping to achieve with these reports?

+1 on that! Same with code coverage. I worked somewhere where code coverage metrics were always wanted by management but no-one could ever say why… :thinking:


Great questions, @christovskia.

  1. The reports are for me in the first instance, but I would like to be able to use them to communicate with my Squad (Developers and Product Owner).
  2. I often test many smaller components of a bigger system across multiple sprints, and they all add up to an overall solution. I would like a good idea of which tests ran against which lower-level components, so I can build a picture of higher-level coverage without having to do that by hand; a rough sketch of the kind of roll-up I mean is below.
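
Purely illustrative, not something I actually have today: if each suite wrote JUnit XML under a directory named after the component it exercised (e.g. `results/<component>/*.xml`, a made-up layout), a small script could roll the files up into a per-component pass/fail/skip summary without any manual collation.

```python
import glob
import xml.etree.ElementTree as ET
from collections import defaultdict
from pathlib import Path

# Illustrative sketch: aggregate JUnit XML files from several suites into a
# per-component summary. Assumes the parent directory names the component.
def summarise(results_glob: str = "results/*/*.xml") -> dict:
    summary = defaultdict(lambda: {"passed": 0, "failed": 0, "skipped": 0})
    for path in glob.glob(results_glob):
        component = Path(path).parent.name  # assumption: dir name = component
        root = ET.parse(path).getroot()
        # JUnit XML may be a single <testsuite> or a <testsuites> wrapper
        for suite in root.iter("testsuite"):
            for case in suite.iter("testcase"):
                if case.find("failure") is not None or case.find("error") is not None:
                    summary[component]["failed"] += 1
                elif case.find("skipped") is not None:
                    summary[component]["skipped"] += 1
                else:
                    summary[component]["passed"] += 1
    return dict(summary)

if __name__ == "__main__":
    for component, counts in sorted(summarise().items()):
        print(f"{component:25} passed={counts['passed']:4} "
              f"failed={counts['failed']:4} skipped={counts['skipped']:4}")
```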

This is fantastic. I haven’t read this before, and I think it gives me some great advice on what to aim for. Thank you @maaike.brinkhof!


In my world of microservices and endless layers of configuration and versions, I find individual pipelines only get me so far. And many of my integration tests don’t trigger per build, because they require pairs (or more) of components to be built and tested together.

So I am after something that crosses multiple automation suites across multiple builds, over time.
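
To frame the shape of what I’m imagining (purely a sketch; the store, table and column names are invented): each suite, wherever it runs, writes a small summary row into a shared store keyed by component, version and build, and the reporting and trending then happens from that store rather than from any single pipeline.

```python
import sqlite3
import time

# Sketch only: a shared store of run summaries keyed by component/version/build.
# "test_results.db" and the schema are invented for illustration.
DB = "test_results.db"

def record_run(component: str, version: str, build: str,
               passed: int, failed: int, skipped: int) -> None:
    """Append one run's summary; meant to be called at the end of each job."""
    with sqlite3.connect(DB) as conn:
        conn.execute("""
            CREATE TABLE IF NOT EXISTS runs (
                recorded_at REAL, component TEXT, version TEXT, build TEXT,
                passed INTEGER, failed INTEGER, skipped INTEGER
            )
        """)
        conn.execute(
            "INSERT INTO runs VALUES (?, ?, ?, ?, ?, ?, ?)",
            (time.time(), component, version, build, passed, failed, skipped),
        )

def latest_per_component() -> list:
    """Most recent summary per component, for a quick cross-suite view."""
    with sqlite3.connect(DB) as conn:
        return conn.execute("""
            SELECT component, version, build, passed, failed, skipped, MAX(recorded_at)
            FROM runs GROUP BY component
        """).fetchall()

# e.g. record_run("payments-service", "1.4.2", "jenkins-512",
#                 passed=120, failed=2, skipped=5)
```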