Rebuilding The Web

Articles, advocacy, discussion and debate about the many problems of the Web and the challenges of rebuilding it.

Can checklist accessibility be harmful?

Accessibility checklists based on WCAG or Section 508 guidelines were intended to help make Web sites accessible. These checklists are meant to ensure that the process of accessibility checking is done consistently and comprehensively. So how can accessibility checklists be harmful?

What's wrong with accessibility checklists?

The wrong tool for most users

In the hands of accessibility experts, accessibility checklists are useful. Unfortunately, these checklists have been incorporated into automated accessibility checkers that are used by people with little or no accessibility training. Checking accessibility is a process that does not lend itself to automation: most accessibility issues require manual checks, which means that either training and experience in the field of accessibility is needed, or tools must be tailored to specific accessibility-checking tasks.

Below is a screen shot of an automated accessibility checker from a popular authoring tool showing the large number of checks that need to be performed manually. Without the training necessary to do this, users will simply skip over anything that is optional and that they do not understand, because many users assume that if they don't understand something then it must not apply to them.

List containing Section 508 and WCAG checkpoint descriptions with the word MANUAL next to each item.

Accessibility checkers based on checklists cannot check for many common accessibility problems

Accessibility checkers based on checklists give non-expert users a false sense of accomplishment. For example, the most common accessibility problem is incorrectly written alternate text for images. The mere presence of any alternate text at all will cause most accessibility checkers to pass the checkpoint that requires alternate text for images, even if the alternate text causes comprehension problems, as shown in this example:

  <p>I <img src="heart.gif" alt="Image of a heart." /> you!</p>

And the same accessibility checkers will raise an error if alternate text is left blank, even when it is appropriate to have a blank value for an image, as shown in this example:

  <p><a href="#top"><img src="up.gif" alt="" />Top of page</a></p>
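
Both cases call for human judgment rather than a pass/fail rule. For the first example, one possible correction, assuming the heart image stands in for the word "love", is alternate text that reads naturally as part of the sentence:

  <p>I <img src="heart.gif" alt="love" /> you!</p>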

Checklists are making accessibility a frustrating experience

For many authoring tool vendors, accessibility checkers based on checklists are a convenient alternative to building interfaces that encourage authoring accessible content in the first place. This is because accessibility checkers are easy (i.e. inexpensive) to incorporate into products and allow these vendors to market their tools as being compliant with accessibility standards.

For content authors, the frustration arises because the same authoring tool vendors give them tools that undermine accessibility: the tools generate markup that merely applies formatting to content when what is required is semantic (i.e. accessible) markup, such as headings. Content created this way will therefore fail the accessibility checker.

Prime culprits are color pickers and font selectors as seen below:

Screen shot of a color picker and font face/size selector.

These color pickers and font selectors generate semantically barren, inaccessible content such as the following:

  <span style="color:red;font-size:20px">Breaking News</span>
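
If the text is actually a heading, the semantic alternative would be something along these lines (the heading level and the class name are assumptions made for illustration; the presentation moves into a stylesheet):

  <h2 class="breaking">Breaking News</h2>
  <!-- presentation belongs in the stylesheet: .breaking { color: red; font-size: 20px; } -->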

As a result, authors become resentful towards accessibility checklists and impatient with the principle of accessibility, because they are punished for using color pickers and font selectors that they were initially encouraged to use.

So what is the alternative to accessibility checklists?

The solution is authoring tool interfaces that encourage the creation of accessible content in the first place, and that let authors review what they have written much as an assistive technology would process the same content. Authors can then easily make and review any accessibility adjustments that are necessary.

User interfaces that encourage accessible authoring

The following are just a few examples of how authoring tools can encourage accessible authoring:

Color picker and font selector tools have got to go

Once color pickers and font selectors are removed from authoring tools, content authors are far more likely to use accessible markup such as headings.
Make users select the type of table they are creating

Until an acceptable alternative for layout tables comes along, when a content author creates a table, make them choose between a data table and a layout table.

Screen shot of a toolbar with a button to insert a layout table and a button to insert a data table.
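
The markup the tool generates could then reflect that choice. As a sketch (element content and values are illustrative only), a data table would carry a caption and header cells, while a layout table would carry none of these and could be marked as purely presentational:

  <!-- Data table: caption and header cells convey structure to assistive technology -->
  <table>
    <caption>Browser market share</caption>
    <tr><th scope="col">Browser</th><th scope="col">Share</th></tr>
    <tr><td>Chrome</td><td>7%</td></tr>
  </table>

  <!-- Layout table: no caption or header cells; role="presentation" tells assistive technology to ignore the table structure -->
  <table role="presentation">
    <tr><td>Navigation</td><td>Main content</td></tr>
  </table>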

Use proper labels

The way that fields are labeled can influence what users type into those fields. For example, are you more likely to enter alternate text for an image (correct) or a description of an image (incorrect) in a field labeled "Image description" below?

Screen shot of a dialog box labeled 'Insert/edit image' that contains the following fields: 'Image URL', 'Image List', 'Image Description' and 'Title'.

Make users specify how they intend to use images

Encouraging users to avoid accessibility by letting them turn off accessibility features such as the prompt below is not the answer, and does not help authors create more accessible content:

Screen shot of 'Accessibility Properties' dialog box containing a checkbox with the label 'Show this prompt when inserting images'.

Instead, when inserting an image, the user should be prompted to decide whether the image is decorative. When it is decorative, the UI would prevent alternate text from being entered. When it is not decorative, alternate text would be required before the image can be saved.

Screen shot of 'Image properties' dialog box. Field 'Decorative image' contains the value 'No'. Next to 'Alternate text' field is '(Required)' and the field is enabled.

Screen shot of 'Image properties' dialog box. Field 'Decorative image' contains the value 'Yes'. 'Alternate text' field is disabled.
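
The markup saved in each case might then look something like the following (file names and alternate text are illustrative only):

  <!-- Decorative image: an empty alt attribute tells assistive technology to skip it -->
  <img src="divider.gif" alt="" />

  <!-- Non-decorative image: alternate text must be supplied before the image can be saved -->
  <img src="chart.png" alt="Browser market share over the last three years" />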

Display content for review the way assistive technology may see it

A machine working through a checklist cannot determine if content makes sense. Only a human can do that. So let's give the human the ability to view content the way users of assistive technologies may view content. For example, the following screen shot shows an authoring interface using images and a data table. Both images have alternate text, but is the alternate text likely to be useful or confusing given the way it is written?

Screen shot of an authoring tool. Content consists of a Google Chrome icon followed by the text 'is growing market share rapidly.', an Internet Explorer icon followed by the text 'is losing market share the most.', and a table showing market share for five browsers over a period of 3 years.

The screen shot below has the answer to that question. It displays the same content in a way an assistive technology might see it. The alternate text for both images is displayed in place of the images and it becomes readily apparent that the alternate text is not accessible. Both alternate texts need to be edited if the sentences they appear in are to make sense to an assistive technology user. The data table is also displayed, in linear fashion as an assistive technology may process it, and here the headers are properly associated with the appropriate content cells, ensuring that the table does make sense to a user of assistive technology.

Screen shot of an authoring tool. Content consists of the text 'Small icon.' followed by the text 'is growing market share rapidly.', the text 'Small icon.' followed by the text 'is losing market share the most.', and the table data in sentence form showing market share for five browsers over a period of 3 years.
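
Markup in which header cells are explicitly associated with data cells is what makes this linear reading possible, because assistive technology can announce each value together with its row and column headers. A minimal sketch, with made-up browser names and values purely for illustration:

  <table>
    <caption>Browser market share</caption>
    <tr>
      <th id="browser">Browser</th>
      <th id="year1">Year 1</th>
    </tr>
    <tr>
      <td headers="browser">Chrome</td>
      <td headers="year1">4%</td>
    </tr>
  </table>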

Displaying content the way that an assistive technology might see it is preferable to using an accessibility checker not only because it is far more engaging and empowering for authors; it is also the only way to spot some accessibility issues (such as comprehension problems) that checklists simply cannot catch.

Conclusion

For most users, checklist accessibility is not useful and may even be harmful. Automated accessibility checkers based on checklists can be confusing and can fail to identify common accessibility issues that can affect comprehension of content. A better alternative is to build tools which encourage the authoring of accessible content in the first place, and which reveal the content being created as an assistive technology might see it, allowing authors to really understand what accessible content is, and to make any necessary corrections to achieve it.

Public comments

1. Posted by Richard - web accessibility testing
on Thursday 2010-06-10 at 09:19:51 PST

There is nothing wrong with accessibility checklists and they can't be harmful (other than where there are faults in the checklists or they become subjective). As you rightly point out, automatic checking is very limited. In fact automatic checking can only ever fail a web page; it can't pass one.
You are right to focus on authoring tools, and in particular WYSIWYG editors, which are generally woefully inadequate when it comes to semantic and accessible coding. Users love to be able to copy and paste from e.g. Word (and why shouldn't they), which often results in spaghetti code with little or no semantic structure.

2. Posted by Robert
on Friday 2010-06-11 at 07:27:46 PST

The following list is taken from the screen shot in your article that I recognize from Dreamweaver:

- Non spacer IMG with equivalent ALT
- Non spacer IMG needs LONGDESC
- Color is not essential
- Colors are visible
- GIFs do not cause the screen to flicker
- Use clear language for site's content
- Clarify natural language usage
- Ensure sufficient contrast between foreground and background colors

There is no way a non-accessibility expert will know what to do with any of these checklist items. Even if someone does understand, say, the last point, "ensure sufficient contrast between foreground and background colors", how are they going to know where the problem is and how to fix it in Dreamweaver? I don't see any benefits to checklists!

3. Posted by Cliff Tyllick
on Sunday 2010-06-13 at 12:11:31 PST

Richard, you might be right in pointing out that checklists themselves are not the problem. The real problem is the automated accessibility checker. Not only do these checkers shift the burden from the authoring tool (to properly support creation of accessible content) to the author (to ignore what the authoring tool makes easy to do and figure out how to apply semantic markup), but also they mislead neophytes into thinking that achieving accessibility is harder than it really is (because they flag so many points for rechecking simply because an automated checker cannot analyze them).

Most automated checkers are a waste of time. It takes more time to interpret the checker's output than it takes to personally evaluate the page's accessibility and correct any errors found. To me, automated checking is useful only when you're dealing with a large site with multiple authors. If the report shows a lack of semantic markup in one particular portion of your site, then you know that you need to look more closely at that content and follow up with its authors about any problems you discover.

On the other hand, if the developers of authoring tools were to take Vlad's suggestion seriously and present more content in a way that lets the author see the impact of the code on screen readers, tab order, and other factors, then those tools would better support the creation of accessible content. So more authors would get it right the first time and fewer authors would need their work reviewed.

4. Posted by Régine Lambrecht
on Monday 2010-06-14 at 06:18:40 PST

I agree that the automation of accessibility testing is the problem. Personally, I use automated testing to identify pages with tables, forms, scripts and other potential accessibility issues. But I check the identified pages manually.

It often happens that clients prefer to trust a cheap service offering automated accessibility testing, so you have to anticipate the debate and justify your choice of a semi-automated check (with the help of testing toolbars or tools that detect pages with potentially inaccessible code) by giving examples of what cannot be automated (e.g. the quality of an image's text alternative). I have always been able to convince my clients that fully automatic accessibility testing is a myth.

How would you estimate the split between automatic, semi-automatic and manual checking in a WCAG compliance analysis, in percentages?

5. Posted by bella k.
on Monday 2010-06-14 at 07:26:53 PST

@Richard said "There is nothing wrong with accessibility checklists and they can't be harmful"
OK, there is nothing wrong with making an accessibility checklist, but as soon as you use this checklist (whether manually or in automated testers), it becomes harmful for most people. Kinda like guns: there is nothing wrong with gun ownership, but lots of problems with gun use.

6. Posted by David Coghill, Webdragon
on Monday 2010-06-21 at 23:22:23 PST

While automated accessibility checking will never be perfect, quite often it is better than nothing.

In the example given (I "image of heart" you), the fact that the alt text is not perfect does not negate the fact that it is better than nothing. In fact it could be argued that the cognitive barrier is just as high for sighted users who see the image, since they have to mentally translate the image into part of an ordinary English sentence: is it "heart", is it "love", etc.

At least having these accessibility-focussed elements within the interface gives more exposure to the issue of web accessibility than would otherwise happen. For organisations that really do care, there is expert advice available.

