Should there be documentation, as lightweight as possible, about how your company thinks about QA?
And what the minimum deliverables should be?
There should be minimal test documentation (checklists, notes on exploratory testing, etc.).
We encourage TDD so that we get a lot of unit test coverage
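To make the TDD point concrete, here is a minimal red-green sketch using Python's standard `unittest` module. The `slugify` function and its behaviour are invented purely for illustration, not taken from the post:

```python
import unittest

# In TDD you would write TestSlugify first (red), watch it fail,
# then write just enough of slugify to make it pass (green).
def slugify(title):
    """Turn a title into a URL slug (illustrative example)."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_extra_whitespace(self):
        self.assertEqual(slugify("  A   B "), "a-b")

# Run with: python -m unittest this_module
```

The test is deliberately tiny; the point is the loop (failing test first, then implementation), not the function itself.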
We encourage test automation (where it can be useful)
I’m asking this because I’ve seen teams who do every testing activity and report on everything, while in the same company there are teams without a tester acting like delivery cowboys, just pushing everything to production…
There should be an overall test vision, right? So you can refer to it if a team is underperforming, or if a new tester joins the company and asks what the expectations are?
I work in a company where the stack is so deep that one team does not even begin to touch all of it, and it covers many disciplines and tool types. So it would be pointless to force a company-wide process; it would also ignore where different services lie in relation to whether they are providing a service or providing innovation. Ignoring velocity when trying to prescribe process is a recipe for friction. There, I have been as typically un-Dutch as I can be.
What quality means to one part of the company will differ from another if the parts you look at are fundamentally different in their velocity of change and in their source of change. But the language they use definitely must be consistent. Because you need cowboys, but you will find that cowboys (scouts is a better word, actually) use the service as a platform to innovate. Scouts are good at innovating but terrible at providing a mostly static yet reliable service. And this is where someone working in the more rigid part of a company might think that other teams are undisciplined. They are not. They just move at different speeds. Scouts are incredibly good at never stopping moving, and thus not becoming victims to the next Uber-taxi or Covid that drives through your tidy business plan.
When management start to understand Wardley Maps, a lot of the stupid comparing of teams suddenly gives way to mission and goal thinking. This is all explained in this talk (also available on Wikipedia, but the YT clip does it better): An Introduction to Wardley Maps (Simon Wardley) - YouTube. I’m not saying this is your answer, but if your managers grok this, they will stop trying to compare teams unfairly and start building a map of your processes that everyone understands.
Oh, and definitely welcome to the Ministry of Testing. Very good question.
Do have a search around for related threads, because vision and mission are topics that do come up a lot. And communicating these ways of operating, early, to new starters has probably been blogged about by someone already.
Usually it’s about finding a balance; if one team finds a good practice, they find ways of sharing their findings with other teams, who then decide if it’s a good match for their team too.
In my experience it is worth documenting the good practices and letting teams flag why they are choosing not to apply one. That choice gives a balance while empowering teams to make their own decisions.
Take testing, for example. If you are in a large company with multiple products, both old and new, you may find some strong advocates for test cases and scripted testing, primarily on some of the older products. That may be a good practice on those products, but it’s definitely not something I’d want forced on the teams I am working with, who prefer more of a discovery role for testers.
This also applies to the decision on whether a team needs a tester. Let’s say a team has had a good tester before, but that tester coached the whole team to the point where they could move on and help another team; this is a very valid situation with no dedicated tester.
However, you may have another team that has never experienced a good tester and copies that no-tester setup because the first team is getting good results. This is now partially based on false information: they do not know the value a good tester brings to compare with the results they get now. Here, perhaps, lie the cowboys.
Company-level information on multiple testing options, good practices in some contexts, and team development journeys can be useful here, but that last one is fairly rare in my experience, and the importance of the journey often gets lost as people change companies.
It would also depend on your organization’s size and how many products/delivery lines the organization has. Each individual delivery probably needs its own QA/test approach, as it would depend on the market forces and management bets for the business segment.
Perhaps it’s two things: a company-wide understanding of the quality model, and team-level strategies that implement it.
I think there has to be a high level view of what testing & quality mean in your particular organisation and this has to be something that everyone in the company understands.
In our company we have a testing & quality vision, and this vision is in essence a set of statements that guide the teams and their approach to testing and quality. An example of something in the vision would be “a risk-based approach to testing”.
The team then defines their test strategy which in turn should implement the vision.
I don’t mind what the test strategy is, providing it implements the vision. The reason for this is that the teams are quite different in terms of numbers of people and skill-sets; therefore, I like to give them the autonomy to work out what is best for their team. Looking ahead, we plan to define metrics for the vision to see how things are going and to find potential areas for improvement.
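The “risk-based approach to testing” statement above can be made concrete with a simple likelihood × impact scoring sketch. All the feature names, scores, and scales here are illustrative assumptions, not something from the original posts:

```python
# Hypothetical sketch: prioritise areas to test by risk = likelihood x impact.
# Scores use a 1-5 scale; names and numbers are made up for illustration.
features = [
    {"name": "payments", "likelihood": 4, "impact": 5},
    {"name": "profile page", "likelihood": 2, "impact": 2},
    {"name": "search", "likelihood": 3, "impact": 4},
]

def risk_score(feature):
    """Risk = likelihood of failure (1-5) x impact of failure (1-5)."""
    return feature["likelihood"] * feature["impact"]

# Spend testing effort on the riskiest areas first.
prioritised = sorted(features, key=risk_score, reverse=True)
print([f["name"] for f in prioritised])
# -> ['payments', 'search', 'profile page']
```

A team might keep such a table in a spreadsheet rather than code; the value is in agreeing on the scales and revisiting the scores, not in the tooling.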
Welcome to the most excellent and *warmest QA community in the world, David.
I like that “risk-based approach to testing” line you take. The idea that we can test everything without a cost, or that we can truly know how much “unknown” is still to be uncovered at any time, is super problematic. One has to use past experience and changes in the environment to inform the road ahead, along with a mitigation plan. Which reminds me, the mission statement needs dusting off on my end too.
*warmest - only because it’s pretty cold for this time of year still