Hello all,
I think the word ‘audit’ can put people off, or feel intimidating when teams ask for one or receive the results of an accessibility audit, particularly one tied to WCAG, as that emphasises the legal and formal side of it.
I don’t want to play down the seriousness of accessibility or the importance of fixing issues, but I wonder if there’s a better word that doesn’t make teams/products/services feel like they’re in trouble!
This matters particularly for a topic that usually takes a back seat or gets de-prioritised, so anything that helps people adopt and engage with accessibility would help…
We have used ‘review’ before, for more design-focused/wireframe testing.
Keen to hear others’ experiences.
4 Likes
Ah yeah, I can understand why ‘audit’ can put people off.
The teams I’ve worked with did indeed feel like we were gonna get in trouble as soon as any type of audit was due to reveal something.
How’s about some of these ideas*:
- Accessibility Test Exploration Sessions (ATES)
- Software Testing Accessibility Review (STAR)
- Accessibility Review Tests (ART)
- Accessibility Deep Dive (ADD)
- Accessibility Gap Analysis (AGA)
- Single Time Accessibility Test Explorations (STATE)
- Find Accessibility Issues and Opportunities (FAIAO)
*No GPT tooling was used in the brainstorming of these ideas. Just good old human brain power.
5 Likes
I’d take a step back.
The legal side is now pushing companies, many of whom have never properly considered the needs of a sizeable percentage of their potential users, to get an audit.
If there is not much care for these users, these wonderful human beings, within the company, from owners through management to the team, then that’s exactly what it is: an audit.
I would not change its name if that were the case, but it is likely worth working to change the attitude itself. Once owners understand and see the value in this potentially increased user base, they might switch from seeing accessibility as a cost to seeing it as a lucrative opportunity.
Get some training that portrays the human side of things and the value accessibility brings to your users and to the business, and invite people with accessibility needs to come and use the product with the team so they can see first-hand both the value and the challenges.
I can almost guarantee everyone on the team knows someone who would benefit from improved accessibility.
2 Likes
Slight tangent, but let’s consider some other audits for comparison.
Security: I’ve found most developers actually want this. Often the auditor offers skills the team may not have, and the team values the input and learning from the activity.
GDPR: this was a harder one for me, likely due to the padded guidelines and not everyone embracing the value of it. Maybe, had I given its value more consideration, I would not have seen it as a pure audit activity.
AI usage audits: coming soon. I did a course recently on the EU one; what was very different from GDPR was that it was pretty much all common sense: do no harm, don’t interfere with people’s rights, etc. The audit can generate good ideas and better understanding, so I’m less concerned about this one at this point.
2 Likes
I’ve been using the term benchmarking; it aligns well with a view of “let’s see where we are” rather than “let’s see where we went wrong”.
2 Likes
I think we shouldn’t get distracted for too long by the specific labels we attach to a task, but… the fact that we are discussing them is really positive. It means we are not sleeping on the job, and if it’s possible to do some kind of non-destructive temporary rename, it could engage people afresh. But yes, it’s an audit: a calendarized, time-bound activity.
I do like two of Simon’s suggestions, STAR and STATE; they might work well in PowerPoint slides or in document headings. Great question @ayesha.saeed. And if nobody has welcomed you here yet, please know you are most welcome. Keep these kinds of questions coming.
2 Likes
I like this and the framing it provides.
Thanks all for your thoughts!
I like STAR, @simon_tomes; the acronym itself also sounds more positive. And benchmarking is good in that it’s more neutral too, @cakehurstryan.
And yes, users are the most important and powerful part of the testing, @andrewkelly2555. It’s interesting what you said about other audits being an opportunity to learn; we try to encourage that too. We don’t just dump the audit on teams and leave them to it; we coach them through the results and break the fixes down into manageable pieces. In these follow-ups they often ask really basic questions about accessibility, which adds to their wider learning. Maybe we should push the educational angle more in the wording.
Thanks for the warm welcome, @conrad.braam. My teams and I have done so many audits, and we get mixed responses. Some see it as part of the process, while others are put on the back foot and get quite stressed about the idea of it. Most of what we do in accessibility reaches far beyond (and before) testing, so thinking about the wider processes and language becomes more important: accessible and open language often means more adoption of what you are essentially ‘selling’.
3 Likes
Since the beginning of time, accessibility tests have been called accessibility tests or accessibility audits. To call them anything else just confuses matters. You could call it an assessment if you really want a different word, but I don’t see the point. I strongly advise against using any new words.
As soon as you start using different terminology, it raises the question of how what you’re doing is different from an accessibility test or audit. You just look silly if you have to explain that it’s the same, but you’ve given it a different name that isn’t widely understood.
Another important factor is that a WCAG audit is a very specific thing. It’s a test against all the relevant WCAG success criteria (usually WCAG 2.2 AA). If you leave anything out or add anything extra, it’s no longer a WCAG audit - it’s just an accessibility test. That’s fine, but an “accessibility test” can be anything you want it to be, so you must have a shared understanding of what you are and are not doing and why.
I agree with the use of “review” for wireframes and creative designs - we do the same. You can do very little testing at this stage other than colour contrast, although you can write coding guidelines to hopefully ensure the developers get everything right first time (ha, ha!).
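(For anyone curious what that colour contrast testing boils down to in practice, here’s a minimal sketch of the WCAG 2.x contrast-ratio calculation in Python. The helper names and the example hex colours are mine, just for illustration, not from any particular tool.)

```python
def relative_luminance(hex_colour: str) -> float:
    """WCAG 2.x relative luminance of an sRGB colour like '#1a2b3c'."""
    hex_colour = hex_colour.lstrip("#")
    channels = [int(hex_colour[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearise each sRGB channel per the WCAG definition.
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)


# Illustrative check: WCAG 2.2 AA requires 4.5:1 for normal text, 3:1 for large.
ratio = contrast_ratio("#767676", "#ffffff")
print(f"{ratio:.2f}:1 -> AA normal text: {'pass' if ratio >= 4.5 else 'fail'}")
```

The nice thing is this is the one check that needs nothing but the design file’s colour values, which is why it’s viable at the wireframe stage.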