Tonight we had one of my idols in testing, @lisa.crispin, for an Ask Me Anything about Whole Team Testing.
It was clear from the number of questions we had before starting the session that we wouldn't get to them all! If we didn't get to your question, or you're catching up on the Dojo and have thought of questions you'd like to ask Lisa, please share them here.
The developers I work with already do some testing, particularly when the testers' workload is too high. We always say that the developer should not test anything that they themselves have worked on. Do you agree with this? Is there any other situation where the developer should NOT be testing?
Do you think there is a risk that quality could decline if non-testers do the testing? Is there anything we can do to mitigate these risks when we adopt a whole team approach to testing?
Do you think there is a minimum percentage of testing that should be done by testers?
And now for the unanswered questions. Seriously, people, I think this is a record for the most questions asked!
In general I feel testing is greatly misunderstood by a lot of non-testers. Do you feel this barrier needs to be addressed before a team can do whole team testing?
When you mention whole team testing, do you mean exploratory testing by the team, vs. members of the team contributing to running checks/automated checks?
Why do testers seem to have a number of issues around encouraging a Scrum mentality of the team owning testing?
How do you deal with developers who have too strong an opinion about how to test, how much to test, and where?
How did you first meet Janet Gregory?
Do testers belong to the development team, to the "testing team", or both? Is there anything that brings the testers under a common vision to work together while they also work in the dev teams?
My devs are amazing when we ask them to help out with automated testing, but it's much harder to get them to help out with manual testing. They say they're "not good at it". What's a good response beyond "no one is, at first, and I really need your help"?
Does Extreme Programming still have a place in software development or has it been taken over by newer methods?
How can we carefully explain to developers that their unit tests are not enough?
How can QA collaborate with the developers or the team while preparing the test strategy? What are the various challenges that might occur during test execution? Test reports vary according to the nature of the project and team, but what are the fundamental things that should be addressed in test reports, apart from test cases and bug reports?
How important is the language that the team uses? Terms like "resources" instead of "people", talking about developers and testers rather than the (delivery) team, testing versus checking, being "done with testing", etc.?
Developers think testing is boring. What can I do?
How do you deal with reporting in an agile team? Is it monitoring and comparing trends? Or is there more? What if the team does not ask after reporting?
I am in a team with a "whole team" approach to testing. How do I know? How do I know if we start losing this?
On the AB Testing podcast, Alan and Brent advocate developers testing their own code. What is your opinion on that, and where does it leave the "test team"?
What was it like working on Agile Testing: A Practical Guide for Testers and Agile Teams?
How did you first get into raising donkeys?
Developers, I've found, hate the very concept of UI testing. How can we pitch it to them to make it more palatable?
Is there a newish shift away from devs not testing or not working with testers? Every lead developer I have worked with encourages testing within the dev team and supports the testing resource.
What is the best way to "sell" whole team testing to the "business" to make it more likely to get the go-ahead?
What is your opinion about the future of QAs? Maybe they will start to work on the quality of AI, or transform into developers with other kinds of skills?
We have testing activities (monthly patch testing) that run in parallel to our sprint testing. This is in large part unseen work and takes the testers away from the team for 3-5 during a sprint. How can we bring that testing into the team?
How do you convince developers to involve testers in unit test reviews? Devs mention that there's a language barrier, and also that too much time is wasted pairing up while unit testing.
Before I started, the whole team was already accustomed to regression testing before every release. Because of that, they've come to feel entitled to try to influence what goes into the release and to change features during regression testing. What is the balance between having the team involved in testing and not letting them dictate last-minute changes?
UAT testing - I'm currently in a debate with my PO, as she believes that UAT should be done by the testers, but we already do regression testing and I believe we should set up a structure outside of the team for "fresh eyes". Is either of us right, or is a combination of the two the way forward? It's a mobile app, so access to it is easy.
How do you gauge the maturity of your testing team/practice?
You've moved the world of testing forward considerably. Thank you, Lisa! If you consider creativity as a driver, where do you seek inspiration from outside of the world of product development? How do you map that to the world of testing/product development?
Let's assume that you already have a situation where the whole team is testing. Let's say that it works well. One challenge is to keep it this way. Once you achieve it, what is the next thing? How deep do you dive into testing as a team? If there is no limit, it kind of makes sense that there would be no exact boundaries between testers and developers in such a team in future.
How does offshoring work with this - a lot of the UK banks still see offshore testing as the answer!
Laura asked a question about how to get the whole team involved in performance testing.
A technique that worked well for my team recently was this:
Work with the product owner to agree on the goals of the performance testing
Use this to define particular test scenarios (ideally prioritised a bit)
If you're completely new to a tool, work out how to script, debug and run one of the easy-to-medium scenarios yourself first (a rough sketch of a simple scenario script follows this list), and start to understand what information you'll want to capture in the actual test run
Pair with a developer on scripting and debugging a scenario, to share your knowledge
Divide and conquer scripting the rest of the scenarios, with plenty of opportunities to review progress and course-correct. We did final pull request reviews too, which helped. Identifying potential "tricky bits" and unknowns in the scripting, and making a conscious decision about who was going to tackle those, also worked well.
Divide and conquer running the final test scenarios. I created a template for each test run, for capturing the important information (e.g. start and end time, checklists for test data setup, placeholder tables for capturing key metrics, etc.), so the devs could rattle through running the scenarios. Then I was able to focus on working through the results and digging into the reasons behind the behaviour, to draw conclusions and recommend further work to consider.
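To make the scripting step a little more concrete, here is a rough sketch of what a simple scenario script can look like. I've used Locust, a Python load testing tool, purely for illustration; it isn't necessarily the tool from my project, and the host, endpoints, login details and user behaviour below are all hypothetical.

```python
# A minimal, hypothetical Locust scenario: "a signed-in user browses the catalogue".
# Run with: locust -f browse_catalogue.py --host https://staging.example.com
from locust import HttpUser, task, between


class CatalogueBrowser(HttpUser):
    # Simulated users pause 1-5 seconds between actions,
    # roughly mimicking real browsing behaviour.
    wait_time = between(1, 5)

    def on_start(self):
        # Each simulated user logs in once when it starts (made-up endpoint and credentials).
        self.client.post("/login", json={"username": "test-user", "password": "secret"})

    @task(3)  # weighted: browsing happens three times as often as searching
    def view_product_list(self):
        self.client.get("/products")

    @task(1)
    def search(self):
        self.client.get("/products", params={"q": "gift"})
```

Keeping each scenario this small is part of what made pairing on them, reviewing them in pull requests and dividing them up across the team manageable.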
I hope that helps! I'd be interested to hear more about your situation and would be happy to talk in more detail.
I do think it is good practice for developers not to test their own code. It's hard to distance yourself from your pretty baby. That said, I've seen developers who are pairing and using a small exploratory testing checklist for guidance do a good job of basic testing on their own story. Just as they use unit tests and TDD to ensure code correctness, they can use some manual exploratory testing for this too. Someone else should then test it at the feature level.
Thank you! It was my first time doing performance testing, so it was only when I saw the question this evening that I realised I did have something specific to share from the experience - so there's no blog post or article yet. But the positive feedback is definitely great encouragement for me to make it happen.
Unfortunately, most non-testers have a sketchy understanding of testing, at best. I think we should try to help them learn about it. Execs and managers especially need to understand the value of testing, and how an investment in quality pays off in the long run. Everyone goes around saying they want the best quality; it's like mom and apple pie. But if they aren't willing to let teams have time to integrate testing activities into coding activities (both are equally important parts of software development), that lip service doesn't help.
Work to make the benefits of the testing your own team is doing more visible. Also make the problems caused by lack of testing visible. This can be as simple as highlighting critical bugs in production. Testing isn't the way to fix those; building quality in is. Testers can help the team learn ways to shorten feedback loops and develop high quality code from the start.
One way to help execs understand might be to show the opportunity cost of things like time spent triaging and fixing bugs in production (and communicating with the irate customers) at the expense of time to build new features. And we don't get a lot of chances to show customers the value our product offers; if we blow that chance because we deliver the wrong things or deliver buggy things, it's bad news for the business.
Whole team testing includes all testing activities, and it especially includes baking quality into the product from the get-go. I didn't get into this during the AMA, but testers can play a vital role in helping the delivery team and business stakeholders achieve a shared understanding of the purpose of each new feature, how it should behave, and how we will know it is successful in production. We can add value with lots of different testing activities for various quality attributes: accessibility, security, reliability, usability; the "ilities" go on and on. We invest in regression test automation to free up time for value-add activities like exploratory testing.
If I'm understanding this correctly - I've seen a lot of "agile transitions" where developers got training in technical practices like TDD, product people got ScrumMaster or product owner training, and testers got... ignored. It's natural for them to have a lot of fear about suddenly being stuck on a cross-functional Scrum team that's supposed to take responsibility for quality and testing. This is where we need managers to step in and support testers in many ways: training, time to learn, and making sure they are equally valued members of the delivery team.
I was recently in a software testing job interview where manual testing was pretty much seen by a senior manager as a commodity. Automated testing was more respected, but I'm not sure exactly why they should automatically make that distinction. In my experience, if automation is badly done then it's arguably worse than commoditized manual testing: you're going to rely on the automation, and if it is like a wonky crutch, some day it may give out and you'll come crashing down.
On some level it was like the organization saw pure manual testing as a cost, but automated testing as more of a value. Obviously there are subtleties to the manual vs automated debate, but as it was a job interview it wasn't the time to broach them, so I suggested quality assistance, i.e. somewhat of a flavour of whole-team testing, as something that might offer a halfway house.
They seemed willing to consider this, as to them it meant people with a manual skillset could add value in helping ensure automated testing created by non-testers was up to the mark, while possibly upskilling themselves to be able to work on automated testing at some future point.
To relate to your quote: with a little persuading, they could possibly come to see how investing in testers, and seeing the discipline in a value-adding way, could help them move the business forward. They were in the ironic position of having low confidence in the software they were producing, but because of how they saw manual testing and testing in general, they couldn't really see a way out...
I haven't run into this a lot in my career. In my younger days I expect I'd have locked horns with them! I hope today I would be smart enough to listen to them and see what they have to say. Maybe they have some good ideas. Maybe, as they talk, they will realize they have some areas they aren't so sure about. You may find an opportunity to suggest trying some small experiment. You will always meet resistance. Sometimes we can use that energy and turn it around for good.
Back in 2000, I tried to get Brian Marick to write a book on testing in Extreme Programming with me, but at the time he didn't have experience working on an XP team. He did encourage me to go forward with the book, and I ended up writing it with Tip House. Soon, Brian introduced me to Janet. She was working as a tester on an Extreme Programming team in Calgary. She had the good fortune to work with some Thoughtworkers who were among the pioneers of XP. Janet became the "tester" for our book: we'd send her our chapters, and she would try the techniques with her team and give us feedback on how they worked. It was a huge help to us!
After finishing the book, Janet and I kept corresponding and helping each other, as being a tester on an XP team was still a rare thing. We both attended XP 2002 in Chicago. I can't remember if we first decided to collaborate on a talk or on writing an article, but before long we were doing both together pretty frequently. In 2008, my editor asked if I would write a new book about testing in agile. Tip didn't want to write another book. Luckily for me, I was able to talk Janet into it! We complement each other's experience and skill sets really well. Now we've started the Agile Testing Fellowship, we're still doing tutorials together, and who knows what will be next!
This is a great question. I've experienced benefits both ways. When I was part of the cross-functional development team reporting to the development manager, I truly felt like part of the team. I was fortunate to have managers who valued testers as much as other team members and I was seen as a senior team member and part of the leadership.
In another job, I was part of the testing and support team, reporting to a test/support director. Helping with support was a big benefit: it helped me know what problems customers experienced, and it helped us improve our testing and focus it in the right places. Reporting to a director who had equal rank and authority to the development director was also an advantage there. The company culture did not value testers, though the development management grudgingly agreed they were necessary. Our director made sure that we were equally supported and valued. We were embedded in the development team and worked as part of that team. Because there were so few testers compared to the size of the team, developers did a great deal of testing work. It ended up being a great collaboration. We all learned from each other, and our product was better for it.
In large companies with many delivery teams, I've seen the need to have, at the very least, a testing Community of Practice leader who ensures that testers get together to share experiences, knowledge, tools and such regularly, and makes sure they get all the training and support they need.
Question: My devs are amazing when we ask them to help out with automated testing, but it's much harder to get them to help out with manual testing. They say they're "not good at it". What's a good response beyond "no one is, at first, and I really need your help"?
As I mentioned in the AMA, it is important to share the pain of manual regression testing with everyone on the team. Divide those checklists or scripts up among everyone including developers.
For other types of manual testing, I think a lot of this is just a bit of fear from developers that they don't know how. That's why I did the fun exploratory testing workshop I described, using personas and charters but testing kids' toys and games. Then I followed up with more serious workshops testing our app.
Having testers pair with developers frequently also helps developers learn more testing skills. Even if you pair on writing production code, you'll be writing unit tests and hopefully automating tests at other levels too, so as a tester you can explain how to specify good test cases.
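To give a made-up example of what "specifying good test cases" can look like while pairing: suppose the story involves a loyalty discount calculation. The tester suggests the boundaries and edge cases, and the developer captures them as a table-driven unit test. The function, values and rules below are entirely hypothetical; it's the shape of the conversation that matters.

```python
# Hypothetical pairing outcome: the tester suggests the cases, the developer codes them.
import pytest


def loyalty_discount(years_as_customer: int, order_total: float) -> float:
    """Return the discount amount: 5% after 2 full years, 10% after 5, capped at 50.00."""
    if years_as_customer >= 5:
        rate = 0.10
    elif years_as_customer >= 2:
        rate = 0.05
    else:
        rate = 0.0
    return min(round(order_total * rate, 2), 50.00)


@pytest.mark.parametrize(
    "years, total, expected",
    [
        (0, 100.00, 0.00),     # brand-new customer: no discount
        (1, 100.00, 0.00),     # just below the first boundary
        (2, 100.00, 5.00),     # exactly on the first boundary
        (5, 100.00, 10.00),    # exactly on the second boundary
        (10, 1000.00, 50.00),  # large order: discount is capped
        (10, 0.00, 0.00),      # empty basket
    ],
)
def test_loyalty_discount(years, total, expected):
    assert loyalty_discount(years, total) == expected
```

The value isn't in the code itself; it's in the conversation about boundaries, zero values and caps that the tester brings to the pairing session.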
Once when pairing with a dev on my team we had the idea to put together a short "exploratory testing checklist for devs". We pinned it to our Slack channel. It encouraged developers to remember to try more manual testing before declaring a story done. I also laminated Elisabeth Hendrickson's Testing Heuristics Cheat Sheet and left copies around the work area. I would see it get used occasionally.
Q: Does Extreme Programming still have a place in software development or has it been taken over by newer methods?
XP's creators never intended for a thing called "Extreme Programming" to be around for years and years. I had a conversation with Kent Beck back in 2001 at a testing conference where I asked why they had picked such a terrible name. He said, "Oh, in 10 years people will just be calling this good software development." Sadly, that hasn't really happened. But many of the XP practices, such as TDD, CI, refactoring, and indeed testing, are established development practices today. We see different frameworks for managing projects, such as kanban versus Scrum, but high-performing teams are doing most if not all of the XP practices.
In some contexts, unit tests could be enough! My approach has been to get everyone on the team together and talk about what is going well and what's not going so well: is our code at the level of quality to which we committed? What is our biggest problem? What is a realistic, timely, measurable goal to make that problem smaller? Let's think of an experiment and measure progress towards that goal.
This is one area where I find models like the test automation pyramid helpful. Unit tests are the solid base of the pyramid. We can look at that model and talk about where we are now and where we want to be. I would venture to say that teams doing test-driven development with good coverage at the unit level will have code that is significantly higher quality than teams doing no test automation, and probably higher quality than teams who are doing some automation through the UI level. That doesn't mean it's good enough. We should always be trying to improve.
Since I'm not a coder anymore and I don't write unit tests, I've found it doesn't really work to evangelize about how great it would be to automate tests at all the levels. Look for ways to get the whole team to talk about it. As I mentioned in the AMA, get More Fearless Change by Linda Rising and Mary Lynn Manns and go work at being an agent for change.