Manual accessibility testing

I know this keeps rearing its head, but that’s a good thing, as accessibility lands on everybody’s radar more frequently.

As automated testing only finds 55% of the issues, I am looking for advice on how to go about manual accessibility testing (for websites). I know it comes in many forms, and I want to put a framework in place so I can execute it confidently in my workplace.

My initial strategy is:

  • Screen reader testing (JAWS or NVDA for desktop, VoiceOver for mobile)
  • Standard keyboard-only testing
  • Colour contrast tools (although these don’t appear to check existing content)
  • Speech recognition (suggestions welcomed)
  • Checking images for missing alt text (a minimal sketch of this check follows below)
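For that last check, a first pass can be scripted before the human review. This sketch (my own illustration, plain browser-console code, nothing assumed beyond the DOM) flags images with no alt attribute at all; note that alt="" is legitimate for decorative images, and judging whether existing alt text is actually meaningful still needs a human:

```typescript
// Minimal sketch: highlight <img> elements that have no alt attribute at all.
// Run in the browser console (or wrap as a bookmarklet). alt="" is deliberately
// not flagged, since empty alt is the correct markup for decorative images.
document.querySelectorAll<HTMLImageElement>("img:not([alt])").forEach((img) => {
  img.style.outline = "4px solid red"; // make the offender visible on the page
  console.warn("Image missing alt attribute:", img.currentSrc || img.src);
});
```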

I know that the above is just scratching the surface, so if you can help me add to it or fill the gaps, I will be eternally grateful.


For a full review I like using Accessibility Insights for Web (a browser extension). It includes automated scans, some other test options and a guided assessment. It makes sure you don’t forget anything while switching between tools, and you can save the report.

About your initial strategy:

  • The WCAG Color Contrast Checker extension can evaluate existing content (the ratio it reports is the standard WCAG calculation, sketched after this list)
  • For speech recognition, Windows comes with some built in. If you are looking for commonly used software, that would be Dragon NaturallySpeaking
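If you want to sanity-check what any contrast tool reports, the ratio is defined by WCAG itself: at level AA, normal text needs at least 4.5:1 and large text 3:1. A small sketch of that calculation:

```typescript
// Sketch of the WCAG 2.x contrast-ratio calculation, handy for sanity-checking
// what a contrast extension reports. Channels are 0-255 sRGB values.
type RGB = [number, number, number];

function relativeLuminance([r, g, b]: RGB): number {
  // Linearise each sRGB channel, then apply the WCAG luminance weights.
  const lin = (c: number): number => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(a: RGB, b: RGB): number {
  const [lighter, darker] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (lighter + 0.05) / (darker + 0.05); // lighter luminance on top, per WCAG
}

// Example: #767676 text on white sits right at the AA limit for normal-size text.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2)); // ≈ "4.54"
```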

There are many great tools out there. Some of them do just one thing; some do more. How helpful they are will depend on your preferences and on the target being tested (some tools are blocked by certain sites).
I love bookmarklets to support my manual accessibility tests. Here are some:
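To show the general pattern (this is a hypothetical illustration of mine, not one of the specific bookmarklets being recommended): a single-purpose bookmarklet is just a small script saved as a javascript: URL. One that exposes the heading structure of a page could look like this; minify it and prefix it with javascript: to install it as a bookmarklet:

```typescript
// Illustrative single-purpose bookmarklet body (hypothetical example): prefix
// every heading with its tag name so the page's heading structure is visible
// during a manual review.
(() => {
  document.querySelectorAll<HTMLHeadingElement>("h1, h2, h3, h4, h5, h6").forEach((h) => {
    const badge = document.createElement("span");
    badge.textContent = `[${h.tagName}] `; // e.g. "[H2] "
    badge.style.cssText = "background:#000;color:#ff0;font:bold 12px monospace;";
    h.prepend(badge);
  });
})();
```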


Before you can answer that question, you need to decide what your objective is, because only then will you know what types of testing to do and which issues to fix. It might be one of many things, such as:

  • Doing the best you can in the available time with the available skills and no budget for professional support.
  • Achieving conformance with a specific WCAG version, usually 2.2 level AA.
  • Achieving conformance with some other accessibility specification, such as EN 301 549.
  • Achieving the best possible user experience for people with a wide range of disabilities.
  • Protecting yourself from the threat of legal action. Realistically, this is only an issue for organisations in the US.
  • Doing no more than meeting your contractual obligations, whatever they are.

If the answer is anything other than the first one, you will need a substantial amount of professional support. A lot of testers will “have a go” at accessibility testing, but doing it properly requires a wide range of skills and knowledge, thousands of hours of experience, and support from senior accessibility consultants.

Experience
To put that in context, we don’t employ anyone with less than 10 years’ full-time accessibility testing experience, which we regard as entry level. People at that level still make a lot of mistakes, and their work needs to be checked and fixed by someone much more senior.

The tests you mention are a start, but I don’t like to see people testing with screen readers and other assistive technologies unless they have a lot of experience of running or observing user testing sessions with disabled participants. Without that experience, you have no idea what will and won’t be a problem.

Learning
Self-learning is a poor strategy. We had no choice 20 years ago, but there are some good training courses now. Take a look at Deque University, the IAAP WAS course and Sara Soueidan’s course at https://practical-accessibility.today/

Tools
FWIW, automated tools don’t find anything like 55% of the issues - it’s closer to 25%. You probably got that figure from a bullshit Deque article that claimed 57% based on a completely invalid analysis.

Accurate WCAG conformance testing can only be done by inspection of the code and user interface. You don’t need any assistive technologies to do it. But we do use a wide range of single-purpose bookmarklets to help with it. However, I am reluctant to recommend any because pretty much every one of them has bugs or unexpected behaviours, so you have to be able to recognise when they are giving the wrong result. That equally applies to larger automated testing tools, all of which are buggy.
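To illustrate what “inspection of the code” can mean in practice (my own sketch, not one of the bookmarklets referred to above), a single-purpose aid might simply surface every role and aria-* attribute on the page so you can judge the ARIA usage yourself instead of trusting an automated pass/fail:

```typescript
// Sketch of a single-purpose inspection aid (illustration only): list every
// element carrying a role or aria-* attribute for manual review.
document.querySelectorAll<HTMLElement>("*").forEach((el) => {
  const ariaBits = Array.from(el.attributes)
    .filter((a) => a.name === "role" || a.name.startsWith("aria-"))
    .map((a) => `${a.name}="${a.value}"`);
  if (ariaBits.length > 0) {
    console.log(el.tagName.toLowerCase(), ariaBits.join(" "), el);
  }
});
```

Even a script this small can mislead (for example, it says nothing about whether a given role is appropriate), which is exactly why you have to be able to recognise a wrong result.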


I appreciate your insight, but I would respectfully point out that, given the volume of software being produced, requiring ten years’ accessibility testing experience is going to severely restrict the amount of testing that is done. That is, instead of raising the standard, it actually does the opposite: zero testing rather than limited testing.
There is also a question in my head about how one gets ten years’ experience if an entry-level position in your company requires ten years’ experience!

My estimate of ten years is based on the quality of work I see. Most of it is very poor. In the last ten years I have only encountered perhaps four people whose work I would allow to be delivered to a client without it being thoroughly checked by someone more senior. All of them had 20+ years’ experience.

For example, a few weeks ago, we gave a fairly straightforward 5-day testing project to a contractor with 8 years’ experience who we have known for a long time. When I reviewed the results, I spent more than a day recording and fixing 45 errors that I fed back so they could learn from them. I fixed a lot more errors, but I couldn’t afford the time to document and count them all. This might sound terrible, but this tester is actually one of the better ones.

I make no apologies for setting a high standard. While some aspects of a WCAG audit are subjective, most of the tests have a “right” answer. The question is how many of those it is OK to get wrong. Someone with only 5 years’ experience will likely get about 50% wrong. Someone with 10 years’ experience will get perhaps 20% wrong.

Some organisations would be happy with those figures. However, as an outsource testing company, our results need to be as close to 100% correct as possible. Internal teams can set the bar lower, and in practice no one will check their work, so they will get away with high error rates.

It’s not just us
If you look at other top-tier accessibility consultancies like Nomensa and Hassell Inclusion, you will find that everyone they employ is highly experienced. I’ve got 22 years’ experience but probably wouldn’t get into Tetralogical, such is the calibre of their team.

Career path
I share your concerns about the career path. We would recruit several more people if we could find candidates at the right level. We do no sales or marketing and we turn away lots of enquiries because we are always at full capacity. I obviously wouldn’t waste those opportunities if I had a choice.

In my view, testers with less than five years’ experience simply shouldn’t be working for outsource testing companies like mine, because they get too much wrong and need too much support. We have tried it and it was a disaster. That was disappointing, not least because we had successfully trained a succession of functional testers straight from university.

Accessibility is very, very different, partially because you need a deep understanding of the WCAG and ARIA specifications, which run to thousands of pages. There’s nothing like it in the functional testing world. You also need strong HTML, CSS and JavaScript knowledge, assistive technology experience and human factors knowledge. There’s a vast amount to learn.

There isn’t really a good answer, but I feel that beginners should start at an organisation where they can add some value, but the inaccurate results don’t matter too much. This might be in the public sector, where the low salary scale means it’s impossible to recruit highly skilled testers. Wherever it is, they need a lot of support from someone (ideally a team) with significantly more experience.


This blog post focuses on screen readers:
https://mindfultester.com/test-idea-number-1-part-1


Many thanks again for an informative post from the frontlines. I’m in the ‘internal teams’ camp - advocating for quality but working with the typical budget and knowledge constraints that, I suspect, apply to the majority of IT projects.


It may not be apparent if you only work for one organisation, but accessibility objectives and practices vary massively. For the last 6 years, a new law (the Public Sector Bodies Accessibility Regulations, PSBAR for short) has meant that public sector organisations must meet WCAG 2.2 AA. Internal teams know it, and it’s written into contracts with external developers. Budgets and time allowances have been increased to make it reasonably viable in most cases. Conformance is monitored by the Government Digital Service and enforced by the EHRC, with the ultimate sanction of prosecution under the Equality Act.

However, there is no such pressure in the private sector. What we invariably hear is “we develop with accessibility in mind”, which basically means they don’t do anything at all. At best, there might be one accessibility advocate with no time or budget and little or no lasting influence.