Activity 6.1 - Practising generating test ideas

Time: 1 hour

Purpose: The craft of coming up with test ideas and novel ways to explore systems and ideas is one that requires practice. Whilst initially putting together a select list of heuristics to run against a system can be rewarding, the real value comes from observing the system and selecting the right heuristic at the right time for maximum effect.

This activity, which can be run multiple times if you wish, enables you to practise and reflect on your testing and test ideas to help improve and sharpen them.

Introduction: This activity requires a product to test and utilises the exploratory testing approach, so ensure you have something to test: either a feature at work or a different application of your choice.

Task: Pick an exploratory testing session and run it as you normally would. Start by testing without the use of explicit heuristics like mnemonics, sayings, models, etc. Once you have run out of ideas, then begin to bring in explicit heuristics to help you generate new ideas.

During the session make notes of the different test ideas you come up with and what you discover as you execute each test idea.

Once completed, reflect on and answer these questions:

  1. What test ideas did you come up with?
  2. How did you come up with them? Did they come to you naturally or did you use something like a mnemonic or phrase to create the ideas?
  3. What heuristics did you use and why did you feel they were suitable for the testing?

Share your answers and your testing notes on The Club so we can share ideas.

Here are my story and notes, by the way. The story ended up being relatively small, but using a particular mnemonic did make me think of other ideas that had completely passed me by earlier.


What test ideas did you come up with?


  • Verify that the new button appears on the global nav throughout every UI service
  • The global nav button conforms with the design and function of the other buttons
  • The page matches the wireframes provided
  • The information displayed on the page is accurate to the documents
  • What does the download button do when I press it?
  • What happens if I navigate directly to /applications/pdfs?


  • How are the download links spoken in VoiceOver?
  • How is the rest of the page handled in VoiceOver?
  • What does the Lighthouse audit reveal about accessibility on the page?
  • Does it flag any security issues?

How did you come up with them?

So for the first few, a lot of them were based on knowledge of the story, or pre-existing knowledge of the system. But some of them occurred during testing. For example, it was only by chance that I happened to look at the URL when first landing on the page, and noticed that it was /applications/policies, which struck me as odd. When I then checked a PDF and saw that they were stored at /applications/pdfs, it triggered the question of what would happen if I just navigated directly to the /pdfs/ URL.
One other thing is the Lighthouse audit. I had actually only discovered this the other day during a QA meeting on the Motability team. After noticing Accessibility as one of the terms in the mnemonic, I remembered seeing the accessibility scoring within the audit during that meeting, and that prompted me to run it myself.

What heuristics did you use and why did you feel they were suitable for the testing?

I think the story on the whole isn’t a particularly techy one. We weren’t introducing any actual new components, just attaching an additional component to an existing one. All of the UI elements used already existed in other parts of the application; they were just being rearranged in a different way.
Because of that I decided to focus on the CAN I USE THIS mnemonic, particularly looking at accessibility. Being an organisation whose primary customers are the differently-abled, accessibility is a pillar for us, and we try to maintain a high level of it throughout the website. (So it surprised me that I found what I found, actually; definitely something overlooked there.)

Testing Notes

The first thing I noticed on the homepage is that the global nav bar now has an additional option for website info. It fits in perfectly with the rest of the buttons. Hovering correctly highlights it. Clicking on it still produces a border larger than the button; a ticket has already been raised for this.

Now on the page itself. The button is darker to show that this is the page we’re on, and the text in the button is bold.
The URL is nested under /applications/ rather than having its own unique slug.
Currently only two items on the page, one for website T&Cs and the other for the cookie policy. Both of them show the last updated date, and the download size, as well as including a download button.
Pressing the download button doesn’t actually download anything; it just opens the PDF in a new tab.

When on a PDF, I can see the URL is now /applications/pdfs/pdfname.pdf instead of /applications/policies, so I tried just navigating to /applications/pdfs to see if that loaded anything but it just redirected back to the homepage.

The button on the nav bar can be seen in the following UI pages:

  • Homepage
  • Policies
  • NOT Feedback (this needs raising)
  • Content Management
  • Vehicle Catalogue
  • Search Applications
  • Application
  • Fleet
  • Handback
  • Performance

Aside from Feedback, I’m happy that it shows up everywhere it should.


With VoiceOver enabled on my Mac, I tabbed through to verify that the download links were descriptive, but all VoiceOver announced was that they were downloads, with no reference to the document being downloaded. I need to raise a ticket for this.
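For anyone raising a similar ticket, here is a rough sketch of the kind of fix that usually resolves this. The markup, file names, and label text are hypothetical (I haven’t looked at the actual codebase), but the idea is that each link needs text, visible or via aria-label, that names the document:

```html
<!-- Hypothetical current markup: VoiceOver only announces "Download, link" -->
<a href="/applications/pdfs/cookie-policy.pdf" target="_blank">Download</a>

<!-- More descriptive: an aria-label tells the screen reader which document
     this link opens, while the visible button text can stay the same -->
<a href="/applications/pdfs/cookie-policy.pdf" target="_blank"
   aria-label="Download the cookie policy (PDF)">Download</a>
```

Descriptive visible link text ("Download cookie policy") is generally preferred over aria-label where the design allows, since it helps sighted users too.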

When first navigating onto the page though, it does correctly call out “Website information” at the very start.

Running a Lighthouse audit reveals an accessibility score of 94, though it does highlight an issue: within our global nav bar, there is a small break before the bottom set of buttons (<li> elements), but these are wrapped in a <div> instead of a <ul>, which might prevent some screen readers from announcing the list correctly.
I believe this change could be made whilst still maintaining the same design.
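To illustrate what Lighthouse is flagging here, the fix is likely just swapping the wrapping element. The class names and links below are made up for the example, but the structure matches the issue described:

```html
<!-- Before (simplified): <li> elements without a list parent, so assistive
     technology may not announce this group as a list -->
<div class="nav-bottom">
  <li><a href="/feedback">Feedback</a></li>
  <li><a href="/performance">Performance</a></li>
</div>

<!-- After: same items wrapped in a <ul>, which <li> requires; any spacing
     or layout can still be applied to the <ul> via CSS -->
<ul class="nav-bottom">
  <li><a href="/feedback">Feedback</a></li>
  <li><a href="/performance">Performance</a></li>
</ul>
```

Since <li> is only valid inside <ul>, <ol>, or <menu>, this change should also clear the corresponding HTML validation warning without altering the visual design.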

Another note from the audit mentions that we’re using a library version with a known high-severity security vulnerability.