Accessibility Fora

Written By: Peter Abrahams
Published:
Content Copyright © 2005 Bloor. All Rights Reserved.

Over the last few weeks I have been following accessifyforum.com, a set of fora discussing accessibility issues. The contributors are a committed group of practitioners involved in ensuring websites and other IT solutions are accessible. It is an extremely useful place to get an informed set of opinions on the subject.

I have been reading and researching two related issues through the forum. Firstly, how to commission accessible web sites and, secondly, how to use automated testing tools.

I recently wrote about website surveyors, so it was interesting to discover that the Disability Rights Commission (DRC) has commissioned the British Standards Institution (BSI) to develop a Publicly Available Specification (PAS) called ‘PAS 78: A Guide to good practice in commissioning accessible websites’. A PAS is less formal than a BSI standard, which enables it to be produced more quickly; the first version is planned for November. It will be reviewed after two years and may be turned into a full standard at a later stage.

It is interesting to note that the original working title for the specification was ‘Guide to good practice in the design of accessible websites’. The change from design to commissioning was agreed at the first meeting and reflects the same concern that led me to write my article: there are plenty of standards and guides for good design, but little help with ensuring those standards are understood and abided by. A draft of the PAS has been produced and is currently out for review.

According to Julie Howell from the Royal National Institute for the Blind (RNIB), who developed the first draft, it is intended to be a document that commissioners can understand and can discuss with web design project managers. For example, heavy reference is made to the WAI guidelines, usability testing, automated checking tools etc. I hope that the ‘etcetera’ includes the use of tools that automatically generate good code, and also discussion of the tools used for ongoing maintenance of the website: making a website accessible is not an easy task in itself, but keeping it up to scratch is the real challenge.

I look forward to the publication of the PAS and will write an article on it as soon as it is available.

The second issue I have been following is the use, and abuse, of automated testing tools, in particular the decision by the Public Sector Forum (PSF) to stop publishing the SiteMorse league table of local authority websites. The PSF felt that the automated tests were not providing an accurate reflection of the comparative accessibility of the different sites; the tests cannot check for all the accessibility rules and can, in some cases, produce false negatives. The PSF provided a series of examples of problems with the automated testing, and they are all valid.
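To illustrate why an automated test cannot tell the whole story, here is a minimal sketch in Python of the kind of check such tools perform; it is not SiteMorse's code or any particular vendor's tool, and the page snippet is invented for the example. It verifies only that every image has an alt attribute, so a page whose alt text is present but meaningless passes the automated check while still failing a human review.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Minimal automated check: flag <img> tags that have no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.errors = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.errors.append(attrs.get("src", "<unknown image>"))

# This page passes the automated check, yet the alt text tells a
# screen-reader user nothing about the chart it is meant to describe.
page = '<p>Spending by department:</p><img src="chart.png" alt="image123.jpg">'

checker = AltChecker()
checker.feed(page)
print(checker.errors)  # [] -- no errors reported, but the page is not truly accessible
```

The reverse is just as possible: a tool tuned to treat anything suspicious as a failure will mark down pages that a human reviewer would accept, which is exactly the kind of inaccuracy the forum discussion highlights.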

However, they do seem to have thrown the baby out with the bath water, because they have not provided any other way of comparing government websites. It seems to me that it would have been more useful to provide a method for discussing the inaccuracies in the tests. The forum itself could have offered a discussion area where webmasters could explain why they consider their position unfair.

Such a forum would have produced some interesting discussion. My suspicion is that a website near the top of the league would be shown to be meeting most of the accessibility guidelines. My contention is that you cannot develop a high-scoring website unless you are taking accessibility seriously, and if that is the case you will take note of all the guidelines. At the other end of the scale, if a site scores very badly then much of the reason will be real errors that occurred through a lack of accessibility awareness. A significant movement, up or down, is also likely to reflect a real change in accessibility.

Around the middle of the league table, a site's position may be unfair in either direction; there it would be useful to have some manual testing and discussion to reduce the impact of false negatives and to reflect the importance of adherence to guidelines that cannot be tested for automatically. I think that would add some baby lotion to the baby and make the league a useful indicator of which sites are doing well, which sites are trying to improve, and which sites are not pulling their weight.

I will continue to follow the fora and report issues that catch my interest.