How to do Accessibility Testing
Posted by Steve Green on 19 January 2013.
This blog post is paraphrased from a lengthy reply I gave on LinkedIn to a professional tester who had been told to do accessibility testing on a web-based CRM system. He had no experience and neither the time nor the budget for training (unsurprisingly, this was an offshore testing company). His question was basically "What free accessibility testing tool can I use to identify all the accessibility issues?"
Automated tools cannot do the job
Before I go any further, let me state categorically that you CANNOT do this kind of testing properly using just tools. They can help, but the testing must predominantly be done manually. My recommendation is that you outsource this work or engage a specialist accessibility consultant. (Apparently there was no budget for this.)
It takes a long time to learn
If you have not done it before, it is way too complex to learn in a short time. In our company a new consultant would typically pair with an experienced one for many months, undergo a great deal of training (books, conferences, online courses etc.) and observe many user testing sessions with disabled participants. Only then would they be ready to undertake their own projects.
It is essential to have a deep understanding of HTML, CSS and JavaScript, plus any other technologies your website is built with, such as WAI-ARIA. It is also necessary to be able to use a screen reader and tools such as the Accessibility Toolbar and Firebug.
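To give a flavour of what I mean, here is a rough sketch of the kind of check that needs that knowledge. It is purely my own illustration (it has nothing to do with any particular site or tool) and can be pasted into the browser console: it looks for custom controls marked up with role="button" that keyboard users cannot reach.

```javascript
// Rough sketch - run it in the browser dev tools console.
// It flags elements that claim role="button" but cannot receive keyboard focus.
// A real <button> is focusable by default; a <div role="button"> also needs
// tabindex="0" (and key handling), otherwise keyboard users cannot operate it.
var candidates = Array.prototype.slice.call(document.querySelectorAll('[role="button"]'));
var notFocusable = candidates.filter(function (el) {
  return el.tabIndex < 0;
});
notFocusable.forEach(function (el) {
  console.warn('role="button" element is not keyboard focusable:', el);
});
console.log(notFocusable.length + ' of ' + candidates.length +
  ' role="button" elements are not keyboard focusable');
```

Knowing that the fix involves tabindex="0" plus handling the Enter and Space keys - and being able to explain that to the developers - is exactly the sort of thing that takes time to learn.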
OK, where do I start?
The starting point is usually to test for compliance with the Web Content Accessibility Guidelines (WCAG) 2.0. These are a large set of technical tests designed to identify pan-disability issues. They tell you how accessible a website *should* be. At the AA level there are 38 checkpoints, each of which requires a variety of tests. There is no script for doing this - you have to work out the techniques yourself.
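To give an idea of what one of those tests might involve (this is just my own sketch, not an official WCAG technique), take the checkpoints that deal with form labelling, such as 1.3.1 and 3.3.2. Part of the check can be scripted in the console, but the result still needs human judgement:

```javascript
// Rough sketch of one narrow test: form controls with no programmatically
// associated label. It checks <label> association (via the .labels property)
// plus aria-label and aria-labelledby. A real check would also exclude
// buttons, whose accessible name comes from their value - this is naive.
var controls = Array.prototype.slice.call(
  document.querySelectorAll('input:not([type="hidden"]), select, textarea'));
var unlabelled = controls.filter(function (el) {
  var hasLabel = el.labels && el.labels.length > 0;
  var hasAria = el.hasAttribute('aria-label') || el.hasAttribute('aria-labelledby');
  return !hasLabel && !hasAria;
});
console.log(unlabelled.length + ' of ' + controls.length +
  ' form controls have no associated label:', unlabelled);
```

Even if this reports nothing, you still have to read every label and decide whether it genuinely describes the field - and that part can never be scripted.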
And then?
To tell how accessible a website *actually is*, it is necessary to conduct scenario-based user testing with a variety of disabled participants. At Test Partners we have built up a large database of people with all kinds of disability, but it will be hard work to find even 6 or 8 suitable participants if you have not done this before. Of course, you also need to be skilled in moderating user testing sessions and sensitive to the needs of the participants.
An expert review with assistive technologies (screen reader, screen magnifier, voice recognition software etc.) can provide a thorough assessment of a website. This will not identify all the issues that user testing does (especially cognitive issues), but its structured approach will cover areas that user testing misses.
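One simple way to preview part of what a screen reader user experiences is to look at the heading outline of each page, since navigating by headings is one of their main strategies. The rough console sketch below (my own illustration - no substitute for testing with a real screen reader such as JAWS or NVDA) prints that outline:

```javascript
// Rough sketch: print the page's heading outline, which is roughly what a
// screen reader user gets when navigating by headings. Skipped levels or
// meaningless heading text show up straight away - but this is no substitute
// for actually listening to the page with a screen reader.
var headings = document.querySelectorAll('h1, h2, h3, h4, h5, h6');
Array.prototype.forEach.call(headings, function (h) {
  var level = parseInt(h.tagName.charAt(1), 10);
  var text = (h.textContent || '').replace(/\s+/g, ' ').trim();
  console.log(new Array(level).join('  ') + 'H' + level + ': ' + text);
});
```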
You haven't mentioned automated tools
Very true, and there's a good reason. These tools typically spider a website and run a set of tests against every page they find. It sounds like a silver bullet, but there are many drawbacks to this approach. The cheap tools are not very good at all, and even the expensive ones have serious limitations, not least of which is that they are expensive. Other issues include:
- The spidering can be a problem - one tool I evaluated could only find 5,000 pages on a website that contained more than 200,000.
- The tools can only assess about 25% of the WCAG checkpoints - the other 75% are subjective and require human assessment.
- If your website contains forms, you need to set up the tool to enter the necessary data at each step, which many tools cannot do.
- Automated tools work very fast, but the result is a vast quantity of reports - more than you could read in a lifetime. There will be huge numbers of false positives - there always are. With cheap tools all you can do is exclude the offending tests, which reduces your coverage even further. With expensive tools you can tune the parameters and heuristics, but that is time-consuming. And to identify false positives in the first place you need the skills to manually assess every checkpoint - the sketch after this list shows a typical example.
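To make the false positive problem concrete, consider images with an empty alt attribute. alt="" is exactly the right markup for a purely decorative image, but a failure of checkpoint 1.1.1 if the image conveys information. A rough sketch like this one (my own illustration) can list the candidates in seconds, but it cannot make the call:

```javascript
// Rough sketch: images with an empty alt attribute. Each one is either
// correct (the image is purely decorative) or a WCAG 1.1.1 failure (the
// image conveys information). No tool can tell which, so it either
// over-reports or under-reports - a human has to look at every one.
var images = Array.prototype.slice.call(document.querySelectorAll('img'));
var emptyAlt = images.filter(function (img) {
  return img.getAttribute('alt') === '';
});
console.log(emptyAlt.length + ' of ' + images.length +
  ' images have an empty alt attribute:', emptyAlt);
```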
This is not what I wanted to hear
Sorry, but this is how you do the job properly. Undoubtedly some people will recommend using an online tool - they always do - but such tools only do a tiny fraction of the job and without experience it is not possible to make a sensible assessment of the results.