A while ago, I looked into test strategies. Creating a good strategy is, apparently, not covered well in software testing blogs and resources. We, the tester community (as a whole, not you specifically), tend to give far more weight to test execution than to developing a strategy.
That, however, is a rant for another time.
What I noticed is that there are at least three different kinds of documents about creating test scenarios and, on further investigation, about testing in general:
- Descriptions full of technical terms, jargon, and external references. Most, but not all, of these documents are related to some kind of certification. I assume they were created with the intent of being confusing. (The presumption being that if they were clear and to the point, it would be harder to sell the course that goes with the certification and explains these documents.)
- Tons of external references, a lot of repeated content, and very little on how to actually do whatever it is you want to do. These documents are mostly "standards", or blogs related to said standards, and reading them leaves me rather angry about the effort it takes to get through even a minor point.
- Engaging documents that frequently use the terms “we” or “you” and actually describe the process of doing whatever they set out to describe. While they almost always include examples, they usually carry context-based disclaimers. (link to example: Note the phrase “In Rapid Software Testing we define Test Strategy as …” I have never taken an RST course, though I really want to, but the use of “we” gives me a feeling that I’m part of the conversation.)
Reading several versions of these three styles of documents got me thinking. Most methods of test reporting that are considered normal in our world leave me with negative emotions. Some documents confuse me, some bore me, and some make me angry to the point that I stop reading (or writing; I’ve written more than a few of these myself).
On one of my teams, the same realization struck me in the context of test development (test plans, test cases, test summaries), which led me to testing stories and reporting through stories. The idea was to create a testing story that other important people would actually want to read. In that respect, at that company, I was very successful. One piece of feedback I will likely never forget came from a programmer who said the system had never been tested so well. My takeaway was not that our product had actually never been tested as well, but that he felt the testing had improved.
In other words, if our communication with our team creates a positive emotional response, then our teams will be more responsive to our feedback.
This is not where my rabbit hole ended. It is, in fact, a very deep hole. But it is the point in my thought process where I want to engage you, so I have the following question for discussion:
How do you (or how could you, if you do not do so now) communicate your testing stories in a way that engages your audience rather than alienating them?