Industry variation in Accessibility

Written By: Peter Abrahams
Published:
Content Copyright © 2007 Bloor. All Rights Reserved.
Also posted on: Accessibility

SiteMorse industry comparison: local government web sites are, on the whole, better than mobile phone sites. Are there lessons to be learnt from this?

SiteMorse Criteria by Industry Sector (percentage of pages failing each test)

Sector               Functional   Code quality   Metadata   Level A   Level AA   PDF
Finance                       8             81         44        53         98    88
Train                         6             71         49        33         99    88
Mobile Phones                 6             95         29        57         99    92
Retail                        4             86         23        47        100    75
FTSE                          5             73         33        22         94    79
Central Government            5             60         32         8         90    76
Local Government              5             52         20         7         76    76
Banking                       6             75         25        24         93    85

SiteMorse is a provider of automated web site testing tools. They use these tools to run regular surveys of different industry sectors in the UK. I have gathered the summaries of the latest reports together in the attached table to see if I could spot any patterns or trends.

The industries covered are: Finance, Banking, Trains, Mobile Phones, Retail, FTSE 100, Central Government and Local Government.

The figures in the summary are the percentage of pages that fail a particular test, so in all cases low is good and zero is what is desired. There are six tests: the first three are general HTML coding tests and the other three relate specifically to accessibility.
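To make the scoring concrete, here is a minimal sketch of how a per-test failure percentage like those in the table could be computed. This is not SiteMorse's method; the per-page results and the `failure_rate` function are made up purely for illustration:

```python
# Hypothetical per-page pass/fail results for two of the six tests.
pages = [
    {"functional": True,  "level_a": False},
    {"functional": True,  "level_a": True},
    {"functional": False, "level_a": False},
    {"functional": True,  "level_a": False},
]

def failure_rate(results, test):
    """Percentage of pages (rounded) that failed the named test."""
    failed = sum(1 for page in results if not page[test])
    return round(100 * failed / len(results))

print(failure_rate(pages, "functional"))  # 1 of 4 pages failed
print(failure_rate(pages, "level_a"))     # 3 of 4 pages failed
```

With these made-up figures the functional rate is 25 and the Level A rate is 75, which is roughly the shape of the real survey results: functional failures are rare, accessibility failures are common.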

  • Functional covers errors such as broken links and files not found. These are very likely to cause immediate problems for the user, so I was surprised to find that between 4 and 8 percent of pages failed. Of all the tests this is probably the easiest, and the most important, to fix, so I would hope to see the value at, or very close to, zero. I suspect the number is so high because the sites are not aware of the problems, as they do not do regular testing across the site.
  • Code Quality checks pages for coding errors, frequently violations of W3C or IETF standards, which commonly affect the visual display of a site or irritate visitors. Examples include missing attributes or missing close tags; these can cause unexpected results in different browsers. Percentages varied from 52 (local government) to 95 (mobile phones). My feeling is that these numbers reflect the lack of good coding tools for HTML: a good editing tool should be able to ensure that most of these errors do not occur. They are also probably caused by a lot of cut-and-paste activity, so that an error will propagate very rapidly; use of well-tested templates should dramatically reduce these numbers. Although zero would be ideal, the dynamic nature of web sites probably means that some errors will slip through, but single digits should be the aim.
  • Metadata checks for missing titles (an HTML violation), as well as the optional keyword and description tags, plus e-GMS data for government sites. These simplify navigation for users and are also important for search engine optimisation. The results, from 20 (local government) to 49 (trains), reflect how well each sector understands the importance of search optimisation; given the benefits of this information to commercial sites, 49 shows a lack of understanding, and even 23 (retail) is surprising. Local government does better because it has been pressured by central government to implement e-GMS, which leaves no excuse for Central Government (32).
  • Level A does the automatic checks based on the Web Accessibility Initiative (WAI) standard. Here we see a distinct difference between government (Local 7, Central 8) and the rest (ranging from FTSE 22 to Mobile Phones 57). Again this comes from central government pressure, but also from the realisation that government web sites have to be inclusive because they serve the whole population, whereas commercial organisations do not feel so obliged. It also shows that if some effort is put into this area, considerable improvements can be made.
  • Level AA covers the next level of accessibility standards. The results, from 76 (local government) to 100 (retail), suggest that very few organisations have attempted to jump this bar. Local government has done better, probably as a spill-over from the work on Level A.
  • PDF checks the accessibility of any Acrobat files. My suspicion is that no one has done much work in this area, and the files that pass have done so by chance rather than intent. Local government documents (76) tend to have more words than mobile phone documents (92), which are full of pictures, and this may explain the difference.
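As an illustration of the kinds of checks described above, here is a small Python sketch using only the standard library. It is not SiteMorse's tooling; the `PageChecker` class and its rules are hypothetical, covering just three simple cases: an empty title and a missing description tag (Metadata failures) and an image without alt text (a WAI Level A failure):

```python
from html.parser import HTMLParser

class PageChecker(HTMLParser):
    """Illustrative (hypothetical) checker for a few of the survey's
    test categories; real tools check far more than this."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.has_title = False
        self.has_description = False
        self.failures = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_description = True
        elif tag == "img" and not attrs.get("alt"):
            # WAI Level A requires a text alternative for images.
            self.failures.append("Level A: <img> without alt text")

    def handle_data(self, data):
        if self.in_title and data.strip():
            self.has_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def report(self):
        if not self.has_title:
            self.failures.append("Metadata: missing or empty <title>")
        if not self.has_description:
            self.failures.append("Metadata: no description meta tag")
        return self.failures

# A deliberately bad page: empty title, no description, image with no alt.
checker = PageChecker()
checker.feed('<html><head><title></title></head>'
             '<body><img src="logo.png"></body></html>')
print(checker.report())
```

Run across every page of a site and divided by the page count, results like these would give failure percentages of the kind shown in the table.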

The picture is a fairly gloomy one and the only lessons to be learnt are:

  • The local and central government level A results show that if there is a will then things can be done right, or at least much better.
  • Local government is best across the tests, which suggests that thinking about content governance and setting up standards will provide a better solution overall.
  • The authoring tools in use today do not enforce good practice, and do not even give much assistance. My article The Guide Dogs for the Blind Association finds a CMS describes an authoring tool, Rhythmyx, that does assist with and enforce good practice.
  • Automated testing tools, like SiteMorse, and manual testing are both required; see my article Manual website inspection is necessary but not sufficient.

Finally, can I strongly urge you to feed back information to webmasters when you see something wrong on their site. I do this on a regular basis and in most cases have been thanked for the information and given promises that changes will be made; over time the improvements do filter through.