Do AI tools really support Accessibility Testing?

Read our latest article by @AdyStokes, “AI-assisted accessibility tools: pros and cons,” and explore how AI tools are being used to evaluate and enhance accessibility testing while recognising the crucial role of human expertise in ensuring inclusive outcomes.

From generating alt text to simulating keyboard navigation, Ady examines the capabilities and limitations of AI tools and why they are most effective when combined with human input for comprehensive and accurate accessibility testing.

What you’ll learn:
:mag: The pros and cons of AI-assisted tools like alt text generation, semantic analysis, and contrast testing.
:star2: How AI tools can save time and effort but still require human oversight to address context and nuance.
:arrows_counterclockwise: Why upcoming WCAG updates may leave a gap in AI tool coverage and what this means for accessibility testing.
:bulb: Examples of AI tools currently available and their potential impact on accessibility evaluation.

After reading, share your thoughts:

  • Have you tried AI-assisted accessibility tools? What worked well, and what didn’t?
  • How do you think AI tools and human expertise can best work together to improve accessibility testing?

The article states “It’s clear that artificial intelligence can greatly enhance the automation and speed of accessibility evaluation and testing”, but that should come with a heavy caveat regarding the accuracy and thoroughness of the results. And I still wouldn’t agree with the statement. I have looked at the tools mentioned in the article and I’m pretty unimpressed.

Deque
The “computer vision” feature in Deque’s tool looks useful for some colour contrast measurements, especially where text has a non-uniform background. However, the user still has to decide what window sizes to do the test at and they need to put pages and components into all the relevant states. I don’t know if there’s a way to record all that so you only need to do it once for each page. The other so-called AI features in the video demo look trivial and don’t even look like they would save any time compared with using existing non-AI tools.
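For context on what these contrast features automate: the underlying check is precisely defined in WCAG 2.x (relative luminance and contrast ratio), and the genuinely hard part, which the computer-vision approach targets, is choosing which pixel pairs to sample when text sits on a non-uniform background. A minimal sketch of the standard WCAG formula (function names are mine, not from Deque's tool):

```python
def relative_luminance(rgb):
    # WCAG 2.x relative luminance for an sRGB colour, channels 0-255
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    # Contrast ratio is (L1 + 0.05) / (L2 + 0.05), lighter colour on top
    l1, l2 = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)),
        reverse=True,
    )
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1
print(contrast_ratio((0, 0, 0), (255, 255, 255)))  # 21.0
```

The formula itself is trivial to automate; the testing effort Ady's article and the comment above describe (window sizes, component states) is everything around it.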

Accessibility Desk
Their Self Assessment tool doesn’t even work - I get the message “The reCAPTCHA was invalid”, but there is no reCAPTCHA on the page. The source code shows they are not using the frictionless v3 version but v2, which should display a visible widget. It doesn’t.

Stark
I haven’t looked at Stark’s tools because they aren’t aimed at testers. They look like a crazily expensive, monolithic development environment that only deep-pocketed huge organisations would use, i.e. the kind of organisation almost none of us works for.

AI-powered image recognition
This is of no use at all to an accessibility tester. We can see what the image is, and the text alternative should almost never be a literal description of it. The appropriate text alternative depends entirely on the context in which the image is used.

Improved analysis of semantic structure
This might be useful as a safety net after testing a page properly, which is how we use tools at the moment. But I would never use it first. If the tool has a crawling capability, it might be useful for doing a partial test of a large number of pages. But if you have to run it manually on each page the benefit is far lower.
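As a sense of what a semantic-structure safety net covers: many of these checks, such as skipped heading levels, are deterministic and need no AI at all. A minimal sketch using only Python's standard library (the class name and message format are illustrative, not from any tool mentioned in the article):

```python
from html.parser import HTMLParser

class HeadingOrderChecker(HTMLParser):
    """Flags skipped heading levels (e.g. h2 followed by h4), a common
    semantic-structure issue automated tools can catch reliably."""

    def __init__(self):
        super().__init__()
        self.last_level = 0
        self.problems = []

    def handle_starttag(self, tag, attrs):
        # html.parser lower-cases tag names, so "h1".."h6" match directly
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.last_level and level > self.last_level + 1:
                self.problems.append(f"h{self.last_level} followed by h{level}")
            self.last_level = level

checker = HeadingOrderChecker()
checker.feed("<h1>Title</h1><h2>Section</h2><h4>Oops</h4>")
# checker.problems -> ['h2 followed by h4']
```

Where AI might add value is in judging whether the heading *text* reflects the page structure, which is exactly the part that still needs a human to verify.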

Intelligent simulation of keyboard navigation
I’m deeply sceptical of this one. Again, it might be useful as a safety net.

Automated analysis of content readability and understandability
This could be useful for some people, but it doesn’t help with a WCAG audit unless you’re one of the few who test for level AAA conformance. Even then, it’s not clear whether the tools perform the specific test required at level AAA or just a general readability test.
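For comparison, a generic readability score such as Flesch reading ease is easy to compute and is plausibly what many such tools do under the hood, which is exactly why it isn't the same thing as the level AAA test (SC 3.1.5 Reading Level asks about the education level required after removing proper names, not a raw score). A rough sketch, with a deliberately crude syllable heuristic:

```python
import re

def count_syllables(word):
    # Crude heuristic: count vowel groups; real tools use pronunciation
    # dictionaries and language-specific rules
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    # Higher scores mean easier text; standard Flesch coefficients
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    word_count = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (word_count / sentences)
            - 84.6 * (syllables / word_count))
```

A score like this tells you a page is dense; it can’t tell you whether a supplemental easier version is required or adequate, which is the actual success criterion.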

Who are AI-based tools useful for?
My current assessment is that AI tools are almost entirely useless for professional accessibility testers who know what they’re doing. They might be of some value to non-professionals who are “having a bit of a go” at accessibility testing, but they aren’t going to help those people do anything like an accurate WCAG audit. That may change over time, but not any time soon.

Automated crawling tools
AI can extend the capability of automated testing tools that crawl websites, but that will also result in more manual analysis to do and more false positives to remove. As such, it will probably increase the tester’s workload, not reduce it.

@sarahdeery Thanks for sharing this insightful article, @AdyStokes! :star2: The blend of AI-assisted tools and human expertise in accessibility testing is a game changer. Here’s my take:

Experience with AI-assisted Accessibility Tools:

I’ve experimented with AI tools for alt text generation and contrast testing, and they certainly help speed up the process, especially when dealing with large-scale applications. However, I’ve noticed that AI tools sometimes miss the context of images, particularly when they require nuanced descriptions or when images have multiple elements. Human oversight is crucial in these cases to ensure the descriptions are accurate and meaningful.

AI and Human Expertise Collaboration:

AI tools excel at handling repetitive tasks, like checking contrast ratios or generating simple alt text, which saves a lot of time. However, when it comes to interpreting more complex scenarios—like dynamic content or intricate UI interactions—the human touch is irreplaceable. A hybrid approach, where AI handles the automation and humans provide critical context and judgment, would lead to the most accurate and comprehensive accessibility testing outcomes.

Looking forward to exploring more examples of AI tools and how they’ll evolve alongside WCAG updates!


The heading was a bit misleading even though I know we are in the Testing sphere context.

I recently started an accessibility training course, and right at the start they say to initially move away from audit criteria and consider the actual value of being accessible for everyone, not just those with disabilities. This blindsided me a bit, as I’ve been covering basic accessibility “checks” for years and maybe failed to grasp the much broader understanding of the goals and value.

That may be why I picked up on the AI assisted accessibility tools rather than testing tools for accessibility evaluation.

The former, though, has really piqued my interest: the combination of IT and AI empowering millions across the world to do things they could not have dreamed of a few years ago, things many others, myself included, take for granted as part of our “normal” lives.

So for me, I’ve taken that step back, but shortly I will be looking deeper into the testing tools and that AA audit report with its 55 WCAG 2.2 AA Success Criteria, which is a bit daunting at this point.

If anyone else is using that 55-criteria report, I’d appreciate some feedback in the near future.