Confirmation bias experience

So I did a thing at work today. I was asked to identify why an account couldn’t be updated. I had an idea and was so convinced it was the cause that I only tested scenarios that would back the idea up. A simple attempt to actually update the account in question would have shown me my idea was wrong, but I somehow chose not to do that. Confirmation bias at its finest, I think. I’m feeling terrible now because I’ve essentially wasted a day chasing the wrong idea.

Has anyone else experienced something similar? Maybe then I won’t feel so bad :slight_smile:

Thanks for your honesty and openness, @testerbere.

Absolutely! And it happens all the time.

I’ve lost count of the number of times I’ve gone down a recreation rabbit hole, keeping my focus narrow in an attempt to prove one thing. On reflection, I could have surfaced the issue sooner in each of these typical instances:

  1. Usually the simplest thing would have recreated it, yet instead I’d go for convoluted steps to make sure I had it exactly “recreatable”
  2. It would turn out to be an environment issue, yet I’d always forget to check that first and end up wasting time going deep
  3. I’d go deep and narrow when I would likely have benefitted from going shallow and wide first

We testing folks can be hard on ourselves. I’d place a heavy bet you’re not alone, Ere. I hope you can make peace with it, and the bad feeling will soon pass.

Thank you very much, Simon. I can really relate to the instances you highlighted. Here’s hoping for fewer occurrences of these in the future!

Yup, had that too. In my case, it happened not as a newbie tester (back then I was keen on exploring what was actually going on), but as a mid-level tester (when the knowledge I’d gained had got into my head: “yeah, I know this system through and through”). After experiencing confirmation bias, I did some introspection and realised what was amiss (I had let presumption take its toll). So after that, I decided to revert to the true exploratory approach, always remembering “it happened before, don’t let it happen again!” Cheers @testerbere, we all did it at some point!

Thanks for sharing, @agw.

That reminds me of a time I fell into the regression testing complacency trap. I wonder if there’s a name for that bias. :thinking:

The sales team rushed into our office to point out that ad revenue had suddenly taken a dip. It turned out my fellow tester and I had signed off a release without realising all the AdSense ads had disappeared. We were both so used to running regression that we no longer used the regression test checklist. We literally didn’t see what wasn’t in front of us: the missing ads. It was a big learning day for us both!

We resolved the issue quickly, and the post-mortem led to a push for automated checks on the essential features and a renewed determination not to take things for granted or become complacent with regression tests. On reflection, it was very likely the start of our move towards a whole-team approach to quality, rather than relying on QA folks to sign off a release.
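For anyone curious, a check like that can be tiny. Here’s a minimal sketch in Python, assuming the ads were embedded via the standard AdSense loader script; the page URLs and the marker string are illustrative assumptions, not details from the actual incident:

```python
import requests

# Hypothetical pages to smoke-check; placeholders, not the real site.
PAGES = [
    "https://example.com/",
    "https://example.com/articles",
]

# Assumption: ads were embedded via the standard AdSense loader script,
# which is served from pagead2.googlesyndication.com.
ADSENSE_MARKER = "pagead2.googlesyndication.com"


def adsense_present(url: str) -> bool:
    """Fetch the page and check whether its HTML references the AdSense loader."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return ADSENSE_MARKER in response.text


if __name__ == "__main__":
    missing = [url for url in PAGES if not adsense_present(url)]
    if missing:
        raise SystemExit(f"AdSense snippet missing on: {missing}")
    print("AdSense snippet present on all checked pages.")
```

This only inspects the server-rendered HTML, so if the ads were injected client-side you’d want a headless browser instead; the principle, though, is the same: assert the essential feature is present on every release instead of assuming it.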
