Masterclass Further Discussion: Reporting on Testing & Quality

Tonight we were joined by @slucian118 for a masterclass that dived into how to present our work, our results and even our insights in a way that makes that work shine :star:

If we didn’t get to your questions tonight or you’d like to continue the conversation you can ask them here.

Of course, the recording of the masterclass will be available to MoT Pro members in the masterclass section.

What are you going to try from the masterclass in your own context?

A question that Lucian said he will come back to :grin:

  • Are there any examples of reports that follow the author’s advice?

Lucian’s blog

Follow Lucian on Twitter

Questions we didn’t get to

  1. Question for later: What do you think about asking stakeholders to fill out surveys regularly to measure their satisfaction with the delivery? And using these survey values as a foundation for a quality report, displaying the development of quality on an ongoing basis? Do you have good examples of appropriate survey questions for stakeholders?
  2. Thanks Lucian for a great session. What are some of the pitfalls that could diminish the value of test reports?
  3. What specific types of reports have you found to be the most valuable for getting actionable feedback to improve the process? Examples: production bug reports, release/sprint bug reports, severity vs priority, etc.
  4. How would you go about reporting on the effectiveness of testing in your report?
  5. I need a real example of applying risk in a daily status report.
  6. A lot of information is passed on verbally when collaborating in a scrum team, ensuring PO kept up to speed in real time. Would you follow up with reports?
  7. For reporting on automated test results, do you think this can be done using tooling or do you recommend manually reporting on automated test results based on the fact that you mentioned you don’t like using templates?
  8. How do you see the relation between test plan/strategy and the test report?
  9. Is there a good book or resource that you recommend for using effective analogies or metaphors?
  10. What would be the minimum set of info a report should include? If we consider an Excel sheet, what would be some of the columns for example?
  11. When is using a template bad?
  12. Is Microsoft Azure a good reporting tool for daily status? And how would I mention risk using that tool: by severity, or something else?

Nice question, and I have to say from the start that this is my personal view on the topic, so please take it as such.

I believe this is somewhat outside the scope of reporting on quality as I framed it for the session, because my frame was reporting as an output or follow-up of the testing effort. This approach seems complementary to what I covered during the session: it is another way to get stakeholder involvement, but it is not a product of the software testing effort.
Also, as I mentioned, quality can mean so many things that it would be difficult to build relevant surveys for relevant audiences covering the different perspectives of quality. Not to mention that surveys are a tool that only captures the segment of the audience willing to reply and interact, leaving a silent minority (or majority) that does not express itself via survey replies.

One other aspect that I believe needs attention is “survey fatigue”, which will occur at some point, increasing the risk of receiving less useful input.

To sum it up, I believe building and administering surveys is a different approach, one that does not exclude a healthy approach to reporting on quality. I would not do this myself, as administering surveys is not an easy task; it is a different skillset and craft in its own right, not to mention effort intensive. I have not taken this approach and do not have examples of it.

1 Like

Anyone know when the recording will be available? I was kicked out at the beginning of the session :frowning:

It will be going live some time this week Bettina, we’re just waiting for captions :grin:

Sorry to hear you were kicked out. Did your connection drop? Usually a refresh of the page works for people but on rare occasions, that isn’t enough. If it happens again, please let us know at the time and we’ll do our best to get you back in to watch it live :wink:

This is a tricky one, because reporting on quality does not always mean reporting on progress. I myself started to get better at reporting on quality while working on test reports, which I kept enhancing. For example, one aspect that is easy to include is mentioning, at a high level, what risks were explored or uncovered.

Sticking to your question’s context, I believe that reporting is a reality check of the strategy, and the two should be coupled. I see this done by referencing aspects of the test strategy that were touched during the test session or test plan run.

XP (Extreme Programming) is for me the most important reference on metaphors, and I recommend it as a starting point.
As for books, I found books like “Perfect Software and Other Illusions About Testing” by Jerry Weinberg really useful for this approach.

Initially, no template is really bad. The issue with templates is that we often forget that a template is more of a guideline, and when that is forgotten, it becomes something close to a form that is filled in without thinking too much about “why am I writing this?” or “how is this information useful?”. This is, in my view, the main danger of using templates: forgetting to ask the basic questions and running mindlessly on “autopilot”.

Down the line, if a report fits nicely into a template, it is a perfect candidate for automation. When a report is automated, many times only the really bad things stick out, and nuances and minor details are easily lost.

And one last aspect about templates: reporting becomes monotonous, numbing the attention of those who should read the reports and make decisions based on them.
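To make the automation point concrete, here is a minimal sketch (hypothetical data and field names, not anything Lucian showed) of turning templated automated test results into a summary line. Notice how only the counts survive: the story behind each failure, like the note on the flaky test below, never reaches the report unless a person adds it.

```python
from collections import Counter

def summarise(results):
    """Render a one-line status summary from a list of test results.

    Each result is a dict with at least a 'status' key
    ('passed', 'failed', or 'skipped').
    """
    counts = Counter(r["status"] for r in results)
    total = sum(counts.values())
    return (
        f"{total} tests: {counts.get('passed', 0)} passed, "
        f"{counts.get('failed', 0)} failed, {counts.get('skipped', 0)} skipped"
    )

results = [
    {"name": "login", "status": "passed"},
    {"name": "checkout", "status": "failed", "note": "flaky on slow network"},
    {"name": "search", "status": "skipped"},
]
print(summarise(results))  # 3 tests: 1 passed, 1 failed, 1 skipped
```

The `note` field is collected but never rendered, which is exactly the kind of nuance an automated, templated report silently drops.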

@testtrain the recording is now live :tada:

Awesome, thanks Lucian!