Is it useful for your team? Was it the QA specialists’ or the management’s initiative to set them up?
Hey @taniazhydkovaa and welcome to Ministry of Testing!
We use dashboards on my own initiative: I asked for access and set them up myself. Everybody is enjoying them (analysts, security team, testers, devs), and even our UAT team (business) is using them now for follow-up.
Is it useful? That depends on the context, of course. I don’t want it to contain a massive amount of information, but it has links to a few queries that give an overview of a certain set of tickets.
We use Azure DevOps, and personally my co-workers and I don’t use the “Boards” much anymore; we use the dashboards instead, because they contain everything we need in one place.
Many times, team members are not aware of what testers are doing unless we share the reports.
Setting up dashboards would help the team.
We too have set up QA dashboards for the respective projects.
- Test Case status
- Tickets under each QA
- Pipeline run state
- Total test cases
Once you get the hang of it, it’s fun: create queries, add all the necessary details, and share them with the team.
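Since the post above is about Azure DevOps dashboards backed by queries, here is a rough sketch of how such a query might be assembled. This is a hypothetical helper, not anything from the original posts: it just builds a WIQL string using standard Azure DevOps system fields, which you could paste into a query-based widget.

```python
# Hypothetical sketch: assembling a WIQL query for an Azure DevOps
# dashboard widget (e.g. a "tickets under each QA" style overview).
# The field names are standard Azure DevOps system fields; the
# work item type and states are placeholders you would adapt.

def build_wiql(work_item_type: str, states: list[str]) -> str:
    """Build a WIQL query listing work items of a type in the given states."""
    state_list = ", ".join(f"'{s}'" for s in states)
    return (
        "SELECT [System.Id], [System.Title], [System.State], [System.AssignedTo] "
        "FROM WorkItems "
        f"WHERE [System.WorkItemType] = '{work_item_type}' "
        f"AND [System.State] IN ({state_list}) "
        "ORDER BY [System.ChangedDate] DESC"
    )

query = build_wiql("Bug", ["Active", "New"])
print(query)
```

The same pattern works for test-case status or pipeline tickets by swapping the work item type and states.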
Of course, it is a good idea to stick with QA dashboards. They are not only beneficial for clients who need access to routine progress reports; dashboards also allow teams to stay in sync with each other.
Currently our test engineers are embedded within scrum teams, but we still include quality metrics in our dashboards. Back when we were a standalone group, we included a chunk of the typical dashboard info in an email that combined all of the engineering teams’ info.
Given that dashboards can be relatively straightforward to set up, I’d always encourage it. I don’t necessarily care about management’s requirements, but I think it helps all of engineering when we have visibility of the other teams involved in the same product/project.
No QA specific dashboards in my current role.
I think for the use case @kristof mentions, we rely more on Slack bookmarks or Confluence pages.
We don’t have formal test cases, and we’ve shifted heavily left, relying mostly on automation: good unit and integration tests, plus e2e where it makes sense. So our tracking of testing happens primarily through the Jira story for the code changes, what happens in the related code reviews, and the results reported by the CI/CD system.
We have a Dashboard that shows the status of our Jenkins automation jobs. It’s pretty simple but a good overview to spot if something failed.
It was set up by the testers with the idea of making our work more visible. To be honest I don’t know how much it helped within the team. We mostly show it to upper management now. They like colorful pretty things.
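The “did anything fail?” check behind a Jenkins status dashboard like the one described above can be sketched in a few lines. This is an illustrative example, not the poster’s actual setup: in practice you would fetch `GET <jenkins-url>/api/json?tree=jobs[name,color]` and pass in the decoded JSON; here a hard-coded sample payload stands in for that response.

```python
# Hypothetical sketch of a failed-job check against the Jenkins JSON API.
# Jenkins reports a "color" per job: "blue" means success, "red" means
# failure, and an "_anime" suffix means the job is currently running.

def failed_jobs(payload: dict) -> list[str]:
    """Return the names of jobs whose ball colour indicates failure."""
    return [
        job["name"]
        for job in payload.get("jobs", [])
        if job.get("color", "").startswith("red")
    ]

# Sample payload shaped like /api/json?tree=jobs[name,color]
sample = {
    "jobs": [
        {"name": "nightly-e2e", "color": "blue"},
        {"name": "smoke-tests", "color": "red"},
        {"name": "regression", "color": "red_anime"},
    ]
}
print(failed_jobs(sample))  # → ['smoke-tests', 'regression']
```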
I’m an advocate of a testing dashboard that contains the assessment of:
- How well testing is going
- What the current quality of the product is
The important part of this dashboard is that the tester(s) give a green/yellow/red rating on a feature or quality area and then explain why they gave this rating. The dashboard is updated each sprint. This information supplements the tools that show automation or manual test cases passing/failing and percentage of testing “done”. Information from those tools doesn’t need to be repeated in this dashboard. This report is how testers are required to “put their skin in the game” so to speak. Pass/fail lists don’t tell the whole story.
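The per-feature assessment described above can be modelled very simply: each entry pairs a green/yellow/red rating with the tester’s rationale, and the whole set is rendered once per sprint. This is a made-up minimal sketch (the feature names, ratings, and the markdown-table rendering are all assumptions for illustration):

```python
# Hypothetical sketch of a sprint quality-assessment dashboard:
# a rating per feature plus the tester's reasoning, rendered as
# a simple markdown table.

from dataclasses import dataclass

@dataclass
class Assessment:
    feature: str
    rating: str      # "green", "yellow" or "red"
    rationale: str   # why the tester gave this rating

def render_dashboard(entries: list[Assessment]) -> str:
    rows = ["| Feature | Rating | Rationale |", "| --- | --- | --- |"]
    rows += [f"| {e.feature} | {e.rating} | {e.rationale} |" for e in entries]
    return "\n".join(rows)

# Example sprint entries (invented data)
sprint_entries = [
    Assessment("Checkout", "green", "Exploratory and regression passes, no open blockers"),
    Assessment("Search", "yellow", "Edge cases around diacritics still untested"),
]
print(render_dashboard(sprint_entries))
```

The rationale column is the important part: it is where the tester explains the rating, which is exactly the information pass/fail counts leave out.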
I have not used a “QA dashboard” as such, but I have been thinking about how to use a dashboard to show the costs of poor quality, as this may help demonstrate the need to improve quality. I recently wrote a blog post about it: A quality costs scorecard – TestAndAnalysis.
I’ve used the low-tech testing dashboard – my team really appreciated it.
You can read more about it here.