How are y’all measuring product quality?

I’ve mentioned in a number of posts that we have a quality gate before we release to production. The quality gate is meant to be a 15-minute discussion between stakeholders to make sure we all agree that the software is “Good to Go”, driven by a brief checklist.

Some of the checks are evidenced and others are just verbal. This is a summary of the questions the process asks:

  • Are the devs happy with the coverage of their unit testing?
  • Are we happy there aren’t any critical vulnerabilities in the code? (Cyber Essentials+ check)
  • Does the testing cover all the tickets in the release?
  • Are we happy with coverage of the regression testing?
  • Are the release notes accurate and understandable by a customer?
  • Is the deployment process understood, including any new configuration options?
  • Are there any checks the customer success team need to do once in production?
  • Are there any customer change control processes we need to follow?
  • Are there any outstanding backlog bugs that we should have addressed?

It sounds like a lot, but it’s now routine; with evidence and templates where needed, it’s usually pretty quick. The outcomes are Go, Conditional Go (a couple of minor actions still need to be completed), and No Go.
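
To make the outcome logic concrete, here’s a minimal sketch of how a gate record like this could be captured and how the Go / Conditional Go / No Go decision might fall out of it. This isn’t our actual template; the class names, fields, and rule (failed checks with agreed actions still allow a Conditional Go) are all illustrative assumptions.

```python
# Hypothetical sketch of a quality-gate record, not the real template.
from dataclasses import dataclass, field


@dataclass
class Check:
    question: str
    passed: bool
    evidence: str = ""                                  # link or note where the check is evidenced
    actions: list[str] = field(default_factory=list)    # follow-up actions agreed for minor gaps


@dataclass
class QualityGate:
    release: str
    checks: list[Check]

    def outcome(self) -> str:
        failed = [c for c in self.checks if not c.passed]
        if not failed:
            return "Go"
        # Assumed rule: minor failures with agreed actions can still proceed.
        if all(c.actions for c in failed):
            return "Conditional Go"
        return "No Go"


gate = QualityGate(
    release="2024.06",
    checks=[
        Check("Are the devs happy with unit test coverage?", True, evidence="coverage report"),
        Check("No critical vulnerabilities (Cyber Essentials+)?", True, evidence="scan output"),
        Check("Release notes accurate and customer-readable?", False,
              actions=["Reword the section on new configuration options"]),
    ],
)
print(gate.outcome())  # -> "Conditional Go"
```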

None of the items on the checklist should be a surprise or left until we actually hold the quality gate. It should just be a validation of what we already know.