How do you add quality outside of the “testing phase” of the SDLC?

In my experience, quality cannot be added from a single column on a board or in a few weeks at the end of a project, and it can be almost impossible to retrofit. It's built gradually through small, intentional steps, often framed around the board because of its visibility. This approach means a single high-quality gate is broken down into lots of smaller requirements, from when a card is first written until it is completed and released to users. Adding quality at every step encourages whole-team ownership by making these steps more likely to be shared, collaborative activities.

In response to my previous article, When the tester’s away… the team can test anyway!, I was asked how I’d encourage engineers to think like testers, how to educate the team to perform amazing exploratory testing, and how all team members can contribute to system-wide testing, allowing testers to support teams to deliver better products. This article, "Using an Agile definition of done to promote a quality culture", introduces some coaching questions to start the conversation on where testing takes place, and then goes through each column on the board in turn. It goes some way to answering those questions by highlighting testability at each increment and asking open, curious questions about existing practices which could be improved.

Working at a consultancy, I find clients are often unaware of the role of testers and the value we can bring. I’d always recommend starting small, from a place where the impact will be most noticeable. In my example, this was in bug cards - the acceptance criteria simply weren’t up to scratch. I used my “new girl” approach to ask naive questions and demonstrated how each bug could be written better to detail the required steps to reproduce, actual behaviours and expected behaviours. Quietly asking somebody to explain the card to me allowed me to document test cases in front of them and force that first pairing interaction.
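As a rough illustration of the sort of structure this pairing produced (the field names and the scenario are my own invention, not from the original cards), a well-formed bug card might look like:

```text
Title: Export on the reports page fails with a blank screen

Steps to reproduce:
  1. Log in as a standard user
  2. Open the reports page
  3. Click "Export"

Actual behaviour:   a blank page is shown and no file downloads
Expected behaviour: a CSV download starts

Environment: build number, browser/OS, test data used
```

The point is less the exact fields and more that anyone picking up the card can reproduce the problem without a conversation.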

However long it takes, I’ve been using a definition of done to add lots of small steps that improve testing throughout the SDLC.

What you’ll learn:

  • How we use an Agile definition of done
  • What sort of questions to ask your team to understand their perception of the quality of cards and features
  • How an example definition of done is built up through a worked example

After reading, share your thoughts:

  • Does your team have an explicit or an implicit definition of done?
  • Does your team use a different methodology or term? How do you add quality outside of the “testing phase” of the SDLC? Please share your experiences.

Nice article!

In my experience, anything other than explicit will be forgotten by the team within a couple of sprints 😅

One of the most effective teams I worked in pulled up the definition of ready and definition of done each sprint to make sure everyone was still happy with it. This was a great approach and is something I have continued doing in a slightly different way when working with teams that aren’t as mature, usually by suggesting improvements to DoR/DoD in sprint retrospectives.


When possible, it may help to include QA/testers in the early phases of project planning: reviewing the project documents and requirements, attending those early meetings, and looking over the UX mockups. QA/testing isn’t simply at the end. Including the QA aspect early on helps catch things like missing requirements and mistaken assumptions, surfaces potential system integration issues, and lets the QA team do early analysis and planning on test automation (e.g. how to model the workflow for the UI interaction with page objects). The UI doesn’t have to be ready to start planning out the high-level code or pseudocode; you fill in details like element locators down the road when the UI is finally ready. The early involvement of QA just sometimes might catch bugs before they happen.
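The "plan page objects before the UI exists" idea above can be sketched in code. This is a minimal illustration, not anyone's real framework: the class names, locator tuples, and the `type`/`click` driver interface are all assumptions, and the placeholder locators would be filled in once the real UI ships.

```python
# Early page-object planning, drafted before the UI exists.
# Locator values below are placeholders (assumptions) to be updated
# when the real screens land; the workflow methods capture the
# intended user journey so test design can start now.

class DashboardPage:
    """Landing page after a successful login (placeholder)."""
    def __init__(self, driver):
        self.driver = driver


class LoginPage:
    # Placeholder locators -- revise once the UI is ready.
    USERNAME_FIELD = ("id", "username")
    PASSWORD_FIELD = ("id", "password")
    SUBMIT_BUTTON = ("css", "button[type=submit]")

    def __init__(self, driver):
        # `driver` is anything exposing type()/click(); a real suite
        # would wrap Selenium or Playwright here.
        self.driver = driver

    def log_in(self, username, password):
        """Model the user journey, not the widgets."""
        self.driver.type(self.USERNAME_FIELD, username)
        self.driver.type(self.PASSWORD_FIELD, password)
        self.driver.click(self.SUBMIT_BUTTON)
        return DashboardPage(self.driver)


class RecordingDriver:
    """Stand-in driver so workflow code can be exercised pre-UI."""
    def __init__(self):
        self.calls = []

    def type(self, locator, text):
        self.calls.append(("type", locator, text))

    def click(self, locator):
        self.calls.append(("click", locator))


driver = RecordingDriver()
page = LoginPage(driver).log_in("alice", "s3cret")
print(type(page).__name__)   # DashboardPage
print(len(driver.calls))     # 3 recorded interactions
```

Because the driver is just a recording stub, the journey can be reviewed and refined with the team long before there is anything to point a browser at.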

I’m there at all the beginnings asking how the ¤&#¤ this is supposed to work, and when they answer I most often say “but what about this, and what about that?”, and they say “oh, we didn’t think of that”.

A Definition of Ready is also important! I see so many poorly written requirements, design docs etc. which really impact the quality of build and test. Project Managers/Scrum Masters/Release Train Managers, as well as QAs, need to be stronger in saying “not ready yet”.


For the DoD, it is worth having a discussion at the start of a project: what it is, why it’s that, and what value it can bring. Here are a few variations I’ve seen; it’s okay to use different ones on different projects, but essential to have the discussion and align.

  1. Developer has finished coding - not even in dev yet. This one is not great.
  2. Developer coded and feature in build ready for testing. Has its use.
  3. Developer coded and dev automated tests run successfully. Higher value.
  4. Developed, automated test in place and verified by tester.
  5. Developed, automated tests in place, verified, and exploratory testing done, i.e. “done” means it’s fairly safe to go to production - often the common understanding of done. Advanced usage, but makes a lot of sense.
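One way to keep a chosen variant explicit (rather than implicit, as discussed above) is to write it down as a literal checklist a card must satisfy. This is a hypothetical sketch using variant 5; the item wording and the card representation are my own invention:

```python
# Hypothetical: encode the agreed DoD as an explicit, inspectable
# checklist, so "done" is a shared team agreement, not a guess.
# Item wording here is illustrative, not from the thread.

DEFINITION_OF_DONE = [
    "code complete and merged",
    "automated tests written and passing",
    "verified by a tester",
    "exploratory testing performed",
]

def is_done(card):
    """A card is done only when every DoD item is checked off."""
    return all(card.get(item, False) for item in DEFINITION_OF_DONE)

card = {item: True for item in DEFINITION_OF_DONE}
print(is_done(card))  # True
card["exploratory testing performed"] = False
print(is_done(card))  # False
```

Even if nobody runs it as code, writing the list down like this makes the alignment discussion concrete: the team argues about which lines belong in the list, not about what “done” vaguely means.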

On the other element, I do tend to find the concept of an SDLC and the idea of a testing phase more associated with waterfall models, and not so much used in Agile-type models.

There is quite a bit of discussion on holistic testing out there that I have found useful. The discussions vary significantly, and the term is far from meaning the same thing to everyone, but there can be value in applying some elements from them. Here is a visual I have used; it is cyclic even though it looks more SDLC-style, and it’s rare that all of those steps would apply to a single project, but it may have a few ideas of use to others.
