Content Copyright © 2007 Bloor. All Rights Reserved.
I am still a relative newcomer to the accessibility world. The advantage this gives me is that I can appreciate the difficulty some people have in understanding what all the fuss is about. This has become particularly clear in the last week or so, while I have been researching IAccessible2 (IA2) and the accessibility of the Second Life (SL) client.
IA2 is the new standard API for Assistive Technology, such as screen readers, to communicate with applications like browsers or editors. It has been developed by IBM and donated to the Free Standards Group. The need for a new standard arose because the existing API, MSAA (Microsoft Active Accessibility), was insufficient for today's needs.
But my first question was: why do you need an API for Assistive Technology (AT) at all? People not immersed in accessibility look at AT, especially screen readers, and marvel at the quality of the voice and how it manages to pronounce words correctly (most of the time); very clever it is. However, the really tricky part is extracting the information from the application. This is not just the text but also relevant contextual information, such as:
- This text is a heading.
- These bits of text are actually part of a table and these are the row and column headings.
- This is a hyperlink and this is the long description.
- This is a pie chart and here are the sizes and descriptions of the slices.
- This text is green and underlined.
The AT got this information either through MSAA, which when called would provide some of this information, or by non-standard methods of going below the covers and looking at other information (e.g., the base HTML). As the applications (browsers, editors and so on) brought out new releases, the non-standard methods tended to break and the AT had to be modified before the disabled person could use the new release. Also, any new application, for example the SL client, would only be supported as far as it supported MSAA, and any extensions would not be supported at all unless the AT vendor added extra code and support.
In fact I have missed out one bit of the jigsaw, and that is the proper creation of the content. For example, if a Word document does not use styles to define headings, then an editor/reader application cannot know it is a heading and cannot pass this information on to the AT. So although it may look like a heading on the screen to a sighted user, it is just another bit of vocalised text to a screen-reader user.
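The point can be made concrete with a small sketch. This is an illustrative toy model only; the class and function names below are invented for this example and are not part of Word, MSAA or IA2.

```python
from dataclasses import dataclass

@dataclass
class Paragraph:
    """A toy model of a paragraph in a word-processor document."""
    text: str
    style: str = "Normal"   # semantic style, e.g. "Heading 1", if the author applied one
    bold: bool = False      # visual formatting carries no semantics
    font_size: int = 11

def vocalise(paragraph: Paragraph) -> str:
    """Render a paragraph roughly the way a screen reader might announce it.

    The AT can only announce 'heading' when the semantic style is set;
    bold 16-point text that merely *looks* like a heading is read out
    as plain text.
    """
    if paragraph.style.startswith("Heading"):
        level = paragraph.style.split()[-1]
        return f"heading level {level}: {paragraph.text}"
    return paragraph.text

styled = Paragraph("Introduction", style="Heading 1")
visual = Paragraph("Introduction", bold=True, font_size=16)

print(vocalise(styled))  # heading level 1: Introduction
print(vocalise(visual))  # Introduction
```

Both paragraphs look identical on screen to a sighted user; only the one built with a proper style gives the AT anything to announce.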
To make AT work effectively requires all three parts to work together and understand each other:
- well-formed content that is understood by
- an application that supports a standard API to
- the assistive technology
The requirement by many public bodies, most notably the Commonwealth of Massachusetts, for accessibility of open standards such as Open Document Format (ODF) meant that there was a real need to improve on MSAA. IA2 is designed to meet that need. With IA2, an ODF editor or viewer can describe to AT any artefact that can be defined using ODF; so the IA2 API includes calls that give descriptive information about tables and their rows, columns and cells. This means that an AT that understands IA2, talking to an editor that supports IA2, can render, or vocalise, anything in the ODF document. This is a significant improvement over MSAA, where the AT would need to go under the covers.
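To see why those table calls matter, here is a toy sketch of the idea. The names are invented for illustration; the real IA2 table support is a set of COM interfaces, not this Python class.

```python
class AccessibleTable:
    """Toy model of a table exposed through an accessibility API,
    loosely inspired by the kind of row/column queries IA2 provides."""

    def __init__(self, column_headers, row_headers, cells):
        self.column_headers = column_headers
        self.row_headers = row_headers
        self.cells = cells  # cells[row][col]

    def describe_cell(self, row: int, col: int) -> str:
        """What an AT could announce for a cell once it knows the headers."""
        return (f"{self.row_headers[row]}, {self.column_headers[col]}: "
                f"{self.cells[row][col]}")

sales = AccessibleTable(
    column_headers=["Q1", "Q2"],
    row_headers=["Widgets", "Gadgets"],
    cells=[[120, 150], [80, 95]],
)

# With text-only extraction the AT would see a bare "150"; with table
# semantics it can announce the cell in context.
print(sales.describe_cell(0, 1))  # Widgets, Q2: 150
```

The cell value alone is meaningless to a listener; the row and column headings supplied through the API are what turn it into information.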
The IA2 specification means that the AT will be able to interface with the application purely through the API and will not need to go under the covers. This is at least the intent; the obvious problem arises if the application includes an artefact that is not defined in the IA2 specification. For example, the SL client includes a whole variety of artefacts—avatars, landscape, objects etc.—that are not defined in IA2 at the moment.
This is a never-ending story; the current release of IA2 provides the basis for consistent content, applications and AT based around ODF. In the immediate future the requirement will be for applications and AT that support this new standard, and we have seen some announcements in this area already.
The next step must be to look at what other artefacts should be defined and included in the IA2 specification. SL and the other multiverses now in existence must be an important area to look into, so that AT does not lag too far behind the rest of technology. The great benefit of IA2, being an open standard, is that it can be extended as the market requires. In the case of multiverses, the specification should not only include the IA2 API for the AT but also a standard for proper ‘tagging’ of the artefacts.