Accessibility Reports

Hiya,

At my company we are in the process of writing an Accessibility Report. My question is: once these have been created, what do other companies use them for?

For example, beyond its primary purpose, we are using it as a guide to see what percentage of the WCAG Guidelines we meet, and we can feed this back to our customers.

Let me know what your company does as we are looking for other ways to use this information too.

Thanks

5 Likes

To start on a negative note, calculating the percentage of WCAG Guidelines you meet isn’t useful because it’s got no statistical validity. In common with all other testing metrics, WCAG success criteria are not fungible. The only values that have any validity are 0% and 100% - anything in between is meaningless.

Possible uses of the report include identification of training needs and opportunities for process improvement. In pretty much any aspect of life I don’t believe it when people say things like “I will be more careful next time”, “I won’t do that again” or “I will try harder”. Behaviour doesn’t change unless you do something to change it.

I see this all the time, with companies and individuals making the same mistakes again and again. Even if an individual does improve, the improvement doesn’t happen across the team or organisation, and of course there is no way for new team members to avoid making those mistakes.

Only process change can do that, such as building (and actually using) a code library of accessible components, building automated accessibility tests into your CI process, getting an accessibility specialist to write coding guidelines at the wireframe or creative design stage, testing at the component / template stage etc.
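As an illustration of the CI idea above, here is a minimal, hand-rolled sketch (Python, standard library only) of the kind of automated check that could run on every build. It covers just two common failures, missing alt text and unlabelled inputs; a real pipeline would use a mature engine such as axe-core rather than anything this naive.

```python
# Minimal sketch of an automated accessibility smoke check for CI.
# Illustrative only: real projects should use a full engine such as axe-core.
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []        # human-readable findings
        self.label_for = set()  # ids referenced by <label for="...">
        self.input_ids = []     # id (or None) for each text-like input

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and "alt" not in a:
            self.issues.append("img missing alt attribute")
        elif tag == "label" and "for" in a:
            self.label_for.add(a["for"])
        elif tag == "input" and a.get("type", "text") not in ("hidden", "submit"):
            self.input_ids.append(a.get("id"))

    def report(self):
        # An input passes only if a <label for="..."> points at its id
        for input_id in self.input_ids:
            if input_id is None or input_id not in self.label_for:
                self.issues.append("input without associated label")
        return self.issues

html = ('<form><img src="x.png"><label for="name">Name</label>'
        '<input id="name"><input id="email"></form>')
checker = A11yChecker()
checker.feed(html)
issues = checker.report()
print(issues)  # one missing alt, one unlabelled input
```

In a CI job, a non-empty `issues` list would simply fail the build, which is what makes the check a process change rather than a report.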

In the case of multimedia, all the necessary accessibility features can be planned from the start (eliminating failures and reworking later), but it requires both training and process change. I just wrote a training course on the topic.

In the case of content creation, you might decide to create writing guidelines to ensure meaningful link text, front-loading of information, presentation of information as lists rather than long paragraphs etc.

3 Likes

Thank you so much for your answer. Our customer has asked how accessible we are as a percentage, which is where our idea came from to compute it via how many of the WCAG Guidelines we meet. Would you have any other suggestions?

1 Like

Sadly for me and so many others I have rarely been involved in a project where WCAG and Section 508 were priorities.

As Steve pointed out, percentage of compliance isn’t very meaningful except as a general “feel good/bad” metric (0% or 100% compliance being the exceptions, of course). If that 10% non-compliance is in a critical function or affects a particular ability area, that stat becomes effectively 0%.

So you might consider breaking down the report to cover various workflows, features and compliance types. If the Accessibility Test was performed by a consulting company specializing in that work, I would hope their evaluation deliverable would go into much of that detail.

I have always refused to give percentage accessibility levels because it’s statistically invalid. I’ve been asked plenty of times, but it’s not negotiable - I absolutely will never do it. Frankly, I am astonished that anyone would think you could represent such a multi-faceted property as a single number.

For starters, the level of WCAG conformance is not necessarily correlated with the quality of the user experience, so which one do they want a number for? What are they going to do with it?

Secondly, you might fail a particular success criterion in ten different ways. What happens if you fix nine of them? You still fail it, so your numerical score doesn’t change even though the website is now significantly more accessible than before.
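A toy calculation (with made-up numbers) shows why the binary nature of success criteria defeats a percentage score:

```python
# Toy illustration (hypothetical numbers): WCAG conformance per success
# criterion is binary, so fixing most instances of a failure does not move
# a pass/fail "score", even though the experience improves a lot.

def conformance_score(instances_per_criterion):
    """Fraction of success criteria with zero failing instances."""
    passed = sum(1 for n in instances_per_criterion.values() if n == 0)
    return passed / len(instances_per_criterion)

# failing-instance counts per criterion, before and after fixing 9 of 10
before = {"1.1.1": 10, "1.4.3": 0, "2.4.4": 0, "3.3.2": 0}
after  = {"1.1.1": 1,  "1.4.3": 0, "2.4.4": 0, "3.3.2": 0}

print(conformance_score(before))  # 0.75
print(conformance_score(after))   # still 0.75, despite 9 fixes
```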

You don’t have to give your boss, client or other stakeholders everything they ask for. But you do need to be able to explain why you’re not going to do it. And if they ask for any kind of software testing metrics, the answer should almost always be no, I’m not doing that.

I run “a consulting company specialising in that work”. Our guiding principle is that test reports should contain as little information as possible, while meeting the client’s needs. Anything extra is waste that the client is paying for unnecessarily.

The issue of reporting on “various workflows, features and compliance types” rarely arises because we recommend incrementally testing small areas rather than doing one “big bang” test. The client fixes the issues where they were found and applies the fixes throughout the application (ideally by fixing code libraries).

It’s certainly possible to do a detailed impact analysis, showing how accessible each area is to each user group. But what are you going to do with that information? Most of our clients have to become fully WCAG conformant, which means fixing everything. The analysis isn’t even useful for prioritisation because other factors are more important, such as maximising the efficiency of development resources.

1 Like

We use VPATs to generate Accessibility Conformance Reports. They used to be mostly to answer customer questions about the accessibility of our tools. We are working on creating a workflow to turn the filled out VPATs into JIRA tickets.

Emailing the report to teams turned out to be a dead end. The information was often not shared with the whole team, and the content was not easily understandable for team members. While the report probably gives a nice overview for clients, it is completely useless for bug reporting. How do you address partial compliance with, e.g., captioning? Bug reports need the exact pages that have videos with missing or incomplete captions. This information is in the filled-out VPAT.

2 Likes

Yeah, I wouldn’t expect much traction from an emailed report. Unless work items were attached, that is. Creating work items/defects/bugs helps illustrate the scope of the report and its effect on the product. The report alone in an email is a passive thing, which makes it easy to ignore, or for people to wait for it to become their problem. With actual work objects, it demands greater attention.

Hello!

I’ll tell you what we do, with the preface that any non-conformance means that you aren’t compliant…

OK? Let’s move on from that.

To directly answer your question: you should use the report to create an accessibility statement and publish this on your website. It should list all of the issues and a timeline for when they will be fixed.

Beyond that, we have a dashboard for an overview that can be bubbled up.

What we wanted to do was have an obvious and easily digestible idea of the quality of something. The failures can obviously go in a report, with the dashboard as the executive summary.

Each failure is assessed for its individual impact. That impact has a rating from negligible to critical.

From this any bugs are raised with the associated impact, which should inform their importance.

We then create a dashboard with the minimum amount of information needed. This dashboard uses a medal rank calculated based on the impact of the problems found.

There is one medal rank for each of the 4 principles. If a principle has only negligible issues, or no issues at all, it can be Gold. Silver is minor, bronze is moderate. Any critical issues mean no medal and there are big problems.

Splitting the ranking across the 4 principles lets you see easily where the strengths or weaknesses lie.

Say there’s a gold Perceivable and a bronze Understandable: it could mean that all the colours and fonts on the site are great, but none of the labels on your forms are associated with their input fields.
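The medal-rank idea above could be sketched like this; the impact scale and the worst-impact-to-medal mapping are my reading of the description, not the poster’s actual implementation:

```python
# Sketch of a medal-rank dashboard per WCAG principle. The impact scale and
# the mapping from worst impact to medal are assumptions for illustration.

IMPACT_ORDER = ["negligible", "minor", "moderate", "critical"]

def medal(issues):
    """Medal for one WCAG principle, based on its worst open issue."""
    if not issues:
        return "Gold"
    worst = max(issues, key=IMPACT_ORDER.index)
    return {"negligible": "Gold", "minor": "Silver",
            "moderate": "Bronze", "critical": "No medal"}[worst]

# hypothetical findings: impact ratings of open issues per principle
findings = {
    "Perceivable":    ["negligible"],
    "Operable":       ["minor", "negligible"],
    "Understandable": ["moderate", "minor"],
    "Robust":         [],
}
dashboard = {principle: medal(impacts) for principle, impacts in findings.items()}
print(dashboard)
```

Because the rank is driven by the single worst impact, a principle with one critical issue shows "No medal" however many negligible passes sit alongside it, which matches the binary spirit of conformance.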

3 Likes

Most organisations put all the non-conformances into a huge Word document with sections for each WCAG success criterion. I hate that format and so do the clients I have asked. I have seen these reports exceed 250 pages!

We use a spreadsheet instead, with each row containing one success criterion and each column containing the results for one page or component. We use both colour coding and text to convey the status of each cell in addition to the test results and recommendations. This has the advantage that you can look down a column to see everything about a page, or look across a row to see the results for that success criterion across the website. That’s especially useful for identifying common or similar issues on different pages.

There’s much, much more to it than that and we use a lot of VBA to perform special functions. For example, I’m working on a feature that will put all the non-conformances into a CSV that can be imported into Jira or another bug tracker for clients who want that.

One major drawback of the Word report is that it only records non-conformances. If nothing is recorded for a success criterion on a particular page, you don’t know if it was tested and passed, or if it wasn’t tested at all. In our spreadsheet report, we record passes, non-conformances and also if a success criterion is not applicable (and why). If a cell is empty, we know that test has not been done.
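For comparison, here is a rough Python sketch of the same idea as the VBA feature mentioned above: walking a criterion-by-page result matrix and emitting only the non-conformances as a CSV that a bug tracker could import. The statuses and column names are assumptions for illustration.

```python
# Rough sketch: export non-conformances from a criterion-by-page result
# matrix to a bug-tracker-friendly CSV. Statuses and columns are assumed.
import csv
import io

# rows = success criteria, columns = pages; values are "pass", "fail",
# "n/a", or "" (empty means the test has not been done yet)
results = {
    "1.1.1": {"Home": "fail", "Contact": "pass"},
    "2.4.4": {"Home": "pass", "Contact": ""},
    "1.2.2": {"Home": "n/a",  "Contact": "fail"},
}

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["Summary", "Page", "Criterion"])
for criterion, pages in sorted(results.items()):
    for page, status in sorted(pages.items()):
        if status == "fail":  # only non-conformances become tickets
            writer.writerow([f"WCAG {criterion} failure on {page}",
                             page, criterion])

print(buffer.getvalue())
```

Note how the empty cell for 2.4.4 on Contact stays visible in the matrix as "not yet tested" but, correctly, produces no ticket.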

Thanks for sharing, @sles12.

I didn’t know what a VPAT was so I looked it up. I see it stands for “Voluntary Product Accessibility Template”.

A Voluntary Product Accessibility Template (VPAT) is a template containing information regarding how an information and communications technology product or service conforms with Section 508 of the Rehabilitation Act of 1973.

1 Like

I am currently working on our strategy for handling accessibility reports. For context, we are a company that makes websites for clients, and increasingly these clients ask us to fill in a VPAT so they can publish a report. We have actually always done accessibility testing, but these reports are a more recent development.

There are several steps to my process:

  1. I test the product and fill in a spreadsheet I created for that purpose. Each bug/issue gets reported. The focus is on the developers in this stage because we are still fixing issues. So I report the issue, give code samples, etc. I also state the WCAG rule that is being violated.
  2. Devs fix bugs, then I retest.
  3. Update the spreadsheet.
  4. Create the accessibility report: This is still based on the VPAT format. It is a big table where I enter, for each rule, whether the site passes or not. If not, I describe the issue in language that hopefully helps people with access needs understand what issues exist with the site.

Note: Yes, ideally, each issue should be addressed. But there are many reasons why some remain (for a short or longer time). And having these listed and described helps people understand where they can expect difficulties in using the site. This is in my opinion the great benefit of accessibility reports (other than motivating the developers to fix the issues).

2 Likes

This has given me lots of ideas on how to improve our processes. Thanks for sharing! I really like the idea of incremental testing. We are currently changing our processes to do this, and also introducing more automated tests that run long before I am even given a component, page or site to test.

2 Likes

I noticed today that Rachele DiTullio has shared an Accessibility Spreadsheet, which was updated today.

It has some very interesting ideas in it; I very much like the per-component results.

2 Likes

Hi @sles12
I stumbled on the VPAT recently and the fact that it is a checklist immediately caught my attention.
Noting that although it has a WCAG version, it is presumably a little biased towards the US market, i.e. the US Rehabilitation Act of 1973? Would you recommend this as a resource for UK-based companies, or might there be better options?
@AdyStokes - I don’t think I’ve seen you mention this but would you recommend this format please?
Many thanks, Dan

1 Like

Thanks for tagging me Dan, I missed this thread the first time around so interesting to read it now.
To answer your question first: yes, it was created initially for the US, in 2001 I think, as a way for US companies to demonstrate conformance to Section 508 of the Rehabilitation Act. But it has come to be an accepted way of demonstrating conformance to WCAG (Web Content Accessibility Guidelines), which are not focused on a single country’s laws or acts.
When I speak to companies who don’t know where to start, I recommend getting someone in to create a VPAT (Voluntary Product Accessibility Template) as a first step. From there they can create tickets for the most impactful findings, as well as creating a company Accessibility Statement. Both of those can be used to demonstrate where a company is on accessibility. In some sectors they are mandatory for supplying services, for example educational support software.
So in short, yes, I would highly recommend using one, but if you are not used to conducting accessibility testing or reviews, be aware that automated tools cannot fill one in for you. I created a checklist with testing hints for WCAG a while ago. I haven’t reviewed it for some time, but it should give a reasonable idea of how to test each guideline. Please excuse my hosted site as it has quite a few accessibility issues itself. I really need to find the time to migrate it some day.

2 Likes

Thanks very much @AdyStokes - good to hear that this is a useful addition. In terms of ‘getting people in’ though, do you feel that accessibility testing (like fully-fledged penetration testing) is beyond the capability of the average test team (or ideally the IT department as a whole)?
How far would you expect teams to get without formal training is the question, I guess, as that would help set expectations with line managers.

1 Like

To be honest, that’s quite a big question, as it is hard to define an ‘average’ team, but I will say that, just like a penetration or security testing specialist, an accessibility specialist will find things an ‘average’ team wouldn’t. That doesn’t mean the team can’t address, for want of a better expression, the low-hanging fruit.

The top six findings of the WebAIM Million homepage survey are the same every time: low contrast; missing alternative text on images (which is not necessarily an issue); missing form input labels; empty links; empty buttons; and missing document language. None of these are particularly hard to test for or fix.

There’s so much help out there for testers, designers, developers etc. that poor accessibility has no excuse apart from lack of awareness. Once you are aware, not doing accessibility is choosing to exclude people, in my opinion. It might sound harsh, but it is true.

1 Like

Absolutely agree with your point that the motivation for the project team as a whole should be about avoiding exclusion. In terms of ā€˜average team’ and an example of testing a website - there’s a divide between those who test with dev tools open and those who test purely with what’s in front of them. We have more of the former types of tester and I don’t think the latter group would be that useful in this instance, without training.
This is something we want to develop further though and perhaps it would be useful to go through the checklist ourselves as a measure of where we are?
Our developers of course would also need to incorporate these same measures into their code, since it will otherwise get very tedious testing something that was never designed to pass!
Again, thanks for your invaluable insight (and just to mention - I am aware of an A11Y list of tools for various aspects of testing - unhelpfully though, we have conflicting security requirements at work that block many of them!)