Your post raises a lot of issues. I have been doing accessibility testing for more than 20 years, so I have wrestled with them for a long time.
Firstly, depending on the website’s functionality, it may not be necessary to test with a screen reader at all when doing a WCAG audit. All the WCAG success criteria are written such that they can (indeed, must) be tested by inspecting the user interface and the source code. In practice, it is good to test a couple of the success criteria (such as SC 4.1.3: Status Messages) with a screen reader, but it is literally just a few, and even then you must still check the code.
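To illustrate why the code inspection is the definitive test, here is a minimal, hypothetical sketch of a status message that satisfies SC 4.1.3. A screen reader announcing the text tells you it works in that one product; only the markup tells you it is coded correctly.

```html
<!-- role="status" carries an implicit aria-live="polite", so supporting
     screen readers announce text injected into it without moving focus.
     The container must exist in the DOM before the message is inserted. -->
<div role="status" id="search-status"></div>
<script>
  // Hypothetical example: announce a result count after a search completes.
  document.getElementById('search-status').textContent = '12 results found';
</script>
```

Whether a given screen reader actually announces it is a separate question of accessibility support, which is covered below.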
Browse mode
One of the reasons screen reader behaviour varies is that some have a “virtual cursor” or “browse” mode and others don’t. JAWS and NVDA do, and you will notice that their behaviour is very similar. VoiceOver on macOS does not have a “browse” mode, but it has a weird concept of left-to-right navigation that most people find incomprehensible. Mobile screen readers also do not have a “browse” mode.
Heuristics
Another major reason is that screen readers use heuristics to improve the user experience when websites are coded badly. JAWS does this a lot, whereas NVDA uses very few heuristics, if any. NVDA therefore gives you a more “true” user experience (which is why we use it for testing), whereas JAWS tends to give a better user experience.
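A classic (hypothetical) example of where heuristics cause this divergence is a fake button built from a div:

```html
<!-- A "button" with no role and no keyboard support. JAWS may
     heuristically announce it as clickable because of the click handler;
     NVDA typically reads it as plain text, exposing the defect. -->
<div class="btn" onclick="save()">Save</div>

<!-- The robust version, which every screen reader treats the same way: -->
<button type="button" onclick="save()">Save</button>
```

The first version can therefore appear to “work” in JAWS while failing in NVDA, which is exactly why NVDA gives the truer picture of the underlying code quality.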
Specifications
Yet another reason is that the HTML, CSS, ARIA and JavaScript specifications change continuously, but the browsers and screen readers don’t all adopt the new features at the same time. Some never adopt certain features. 20 years ago, the HTML specification got updated perhaps once every five years. Now it’s updated every five days.
Accessibility support
Then there’s the difficulty of “accessibility supported technologies”. WCAG says that your conformance claim must only rely on technologies that are accessibility supported. However, WCAG explicitly avoids stating how many assistive technologies must support a particular technology for it to be considered supported. Is it sufficient that a website only works with JAWS and NVDA? There is no way to know.
Despite all this, I am not saying you should not test with screen readers. I am just saying you don’t need to if you are only doing a WCAG audit. If you’ve got time, definitely test with assistive technologies and do user testing with disabled participants.
Licensing
Regarding licensing, we spend thousands of pounds a year. All our testers have licences for JAWS, ZoomText and Dragon in addition to all the free products. We also have a smaller number of licences for products like Read&Write. My view is that if you’re a professional, you do things professionally, so you just pay what it costs. Or tell your management they need to pay - do they really want to be regarded as amateurs?
Automation
Automated accessibility testing is useful, but there are a lot of gotchas. Firstly, let me say unequivocally that the claim that “axe-core can find on average 57% of WCAG issues automatically” is absolute bull. The true figure is anywhere between 0% and 100% depending on what the specific issues are on your website. We test upwards of 100 different websites every year and we use axe as a “safety net” after doing the manual testing. I estimate axe finds 20% to 30% of the issues, but increasingly it only finds the least important issues.
Furthermore, you can’t take the results of most tools at face value because they report false positives. Even when they find genuine issues, they sometimes report the wrong cause and/or make the wrong recommendation for fixing them. Analysing the results can take a long time. I could talk about this all day, but I had better stop here.
Steve Green
Managing Director
Test Partners Ltd