Our first paired presentation!!! Charlene Granadosin and Charlotte Bersamin close out the last talk of the day. Charlene is another peep from one of my old gigs.
WORD!!! WE HAVE A STAGE PERFORMANCE… live example of “works on my machine”
This talk is about bias. Super important.
- Cognitive Bias
- “Are you sure?”
- “Works on my machine”
- Jumping to conclusions based on one approach or known approaches.
- Some keywords to consider
- Personal experience. Testing ideas tend to be deeply personal and come with baked-in bias. These are all valid tests, but it takes effort to identify the bias in order to challenge the approach.
- Fundamental Attribution Error and Inattentional Blindness
- Discounting evidence, misjudgments, and tunnel vision allow us to miss cues that may point to the need to test more.
Some approaches to help tackle common biases:
- Pair and mob testing. Rotating the navigator and tester roles lets individuals surface their personal biases to be explored and considered by a larger group. More filters through which to view the product.
- Feedback on test approaches and debriefs
- Concept: Testing Bucket. Keep a collection of testing ideas and capture them immediately whenever you think of them, so they can be used later on.
- Asking others about how they are testing and what they are testing… find out why
- Mindmap testing ideas immediately, before you even have all the ACs or requirements, then look at how the breakdown matches what you expected. Breaking the tests down in a mindmap also makes for an easier conversation piece when gaining feedback from others… especially compared to a list of thousands of test cases
- Personal bias
The bandwagon effect… following along with approaches due to trend, pressure, or a desire to conform with the group. Hidebound (new term for me) is the notion that you are already set in your ways. I feel this is closely related to expertise bias. No deeply knowledgeable tester wants to consider their approaches stale, and they will naturally trust their instincts BECAUSE they are an expert, and because of survivorship bias.
One interesting approach: use SWOT analysis on your existing approaches. Get a sense of where you plan to implement a potential improvement, then create a proof of concept for that improvement. Plan a retrospective and the means by which you will revisit the process you intended to improve, and make clear decisions as to whether to continue, stop, or start changes.
Halo effect - positive impressions of a person in one area influence your opinion of them in another area.
- We have seen this in implementing agile approaches.
- If certain meetings have had ideas shut down, it might influence involvement in future meetings.
- This makes it hard to change how you test, what you choose to test, or what you call a bug.
- How to remedy the Halo effect: talk it out. Silence is not golden in a culture of improvement and inclusion. Find ways to draw out anonymous feedback and ideation. Have meetings without direct leadership present. Create a space specifically for the inclusion of new ideas, opinions, and concerns, and build that time into the schedule for planning meetings.
Self Attribution Bias
- Attributing success to individuals rather than systemic or external factors.
- Leads to self-doubt and a lack of confidence in the process or the approach.
- Methods to counter this: Accountability
- Take action on issues that occur. This might include revisiting test approach or automation strategy
- Give credit to everyone involved in the SDLC, and build relationships across the orgs involved
- Invest in capabilities and abilities within the team
They are looking to gather more research on bias in the workplace: https://tinyurl.com/qaBiasMetrics… get online and let's gather this info!