For our first talk of TestBash Home 2021, @deament takes to the stage to talk to us about testing against implicit requirements, how to spot them, and what shapes them.
We'll use this Club thread to share resources mentioned during the talk and answer any questions we don't get to during the live session.
I find there are some people who are keen for QA to act as gatekeepers for a release. How do you find this works with reporting your findings without judgement?
What would you suggest as a way to get testers or devs to "read between the lines" or think about the implicit requirements?
What do you think are usually the telltale signs of implicit requirements still to be uncovered in a project?
Do you have any metric for how we can measure testing effectiveness when applying this to our development cycles?
Do you ever have disagreement between dev and product when you reveal an implicit requirement?
Do you think the risk of implicit requirements is higher for newly formed teams versus teams who have worked together for a long time?
Do you find implicit requirements are discovered/identified more easily when pairing/mobbing on testing, or when getting your head down and focusing on the problem?
Where in the project lifecycle do you take up implicit requirements: at project start-up, specification by example (SBE), sprint start, or during exploratory testing?
Is there a good way to handle clients (or management) when implicit/unexpected requirements appear?
Do you advise acquiring some domain knowledge, or gaining an understanding of similar projects, when the tester senses that the requirements are implicit?
Do you use example mapping and more team meetings to uncover more implicit requirements?
How would you go about explaining implicit requirements to someone who is new to the project and has not been there from the beginning?
How early do you think we need to start implicit requirements testing in the testing life cycle? Do we convert every implicit requirement to an explicit one?
I'll go through roughly half of the questions tonight, then the second half tomorrow.
I find there are some people who are keen for QA to act as gatekeepers for a release. How do you find this works with reporting your findings without judgement?

I find it can be tricky, as you may be forced to give a judgement on behaviour being good or bad even when you just want to present information on how it is. If backed into a corner, I will give my opinion on how risky it could be if we were to go live with something that could be very problematic for end users. But usually, benchmarking my findings or just presenting what I find causes the product owner or the rest of the team to act as the gatekeeper.
What would you suggest as a way to get testers or devs to "read between the lines" or think about the implicit requirements?
* I find starting discussions with some of your assumptions really helps (with how you, yourself, have read between the lines). This then helps trigger other people to discuss other aspects - you may find you first discuss browser or device support, but the conversation will probably take you to other aspects of non-functional requirements etc.
* I think it's also good to have a go-to list of things to start up discussions (HICCUPPS and the non-functional requirements I mentioned in my talk tend to help me get things going)
What do you think are usually the telltale signs of implicit requirements still to be uncovered in a project?

* Lack of written requirements (i.e. either they don't exist or they are very minimal/bare)
* In discussions about the features, finding people have different understandings of a few things - to me, that implies there could be more things people have different understandings of (so it opens a can of worms)
Do you have any metric for how we can measure testing effectiveness when applying this to our development cycles?

I can't think of any good, reliable, measurable metric (as people can act in such a way as to make the metric show good results while testing is still done poorly). This isn't the easiest thing to measure, but I do find that when developers actively seek out the input of testers early, and others in the team try to get testers more involved, that is usually evidence that the tester has been recognised as doing a great job (that's not to say that if these things aren't happening the tester isn't good - just that the team has recognised the value the tester adds).
Do you ever have disagreement between dev and product when you reveal an implicit requirement?

Yes. And that's where the fun starts - we explain our viewpoints and then come to an agreement on what we think the website/app should do. I'm just happy that the disagreement was brought to our attention so we can address it. I've found it can be useful to have allies - in past cases the UX designer has been a good one for this - but ultimately I'm not trying to argue that I'm right, more that there could be an expectation for this implicit requirement that we should investigate.
Do you think the risk of implicit requirements is higher for newly formed teams versus teams who have worked together for a long time?

I think they are a higher risk for newly formed teams because they haven't yet gained a deep understanding of what the product owner or business analyst expects/means when they say/write certain things. Over time, you can learn from past mistakes and have past features, built together, as a point of reference. There is definitely still a risk of implicit requirements for established teams, but I think having that shared history helps minimise the risk.
Do you find implicit requirements are discovered/identified more easily when pairing/mobbing on testing, or when getting your head down and focusing on the problem?

Pairing or mobbing. More specifically: having discussions with someone so you can trigger ideas off each other.
Where in the project lifecycle do you take up implicit requirements: at project start-up, specification by example (SBE), sprint start, or during exploratory testing?

You can find implicit requirements all the way from project start-up right up until exploratory testing the website/app in front of you.