Accessibility and Search Engine Optimisation

Written By: Peter Abrahams
Content Copyright © 2007 Bloor. All Rights Reserved.

In my recent article Massive business case for accessibility I explained that one of the potential benefits of making a site accessible is that its search engine rating goes up. The reason is fairly obvious: search engine crawlers read through pages in much the same way as a screen reader, so if a screen reader cannot see a piece of information, neither can the crawler. I came across a particularly severe example of this recently. On every page of the site the company's name appeared only in a spinning logo; the logo was a gif without any alt text, and the name was not anywhere else on the page. The result was that neither a screen reader nor a crawler could see the company name. Googling the name gave no hits on the website, and I had to ring the company up to find out how to get to their site!
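
A minimal illustration of the fix, using hypothetical company and file names: giving the logo image an alt attribute lets both a screen reader and a search engine crawler read the company name.

```html
<!-- Inaccessible: neither a screen reader nor a crawler can read the company name -->
<img src="spinning-logo.gif">

<!-- Accessible: the alt text carries the company name for screen readers and crawlers alike -->
<img src="spinning-logo.gif" alt="Example Widgets Ltd logo">
```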

I have since realised that there is another relationship between accessibility and search engines. For people with disabilities it is very important that a search engine gets them to a relevant page quickly. In general their impairments mean that it takes longer to surf the net, so every unnecessary click or page read is time-wasting and frustrating.

This means that search engine optimisation is part of making a site more accessible. External search engines should give the user the page that is most relevant to their request. Internal search engines should be tuned to ensure that the most relevant page is presented.

Having got this far, I now have to discuss the arcane art of search engine optimisation (SEO). The search engines all vary in how they rate pages and what information they look at, so there is no easy answer. However, I believe the following broad statements to be true:

  • The words in the title of the page are highly rated.
  • The URL is also highly rated.
  • The text content is analysed: words that are repeated are considered more relevant, and many common words are ignored altogether.
  • Webmasters can add metadata to pages (special tags that the user of the browser does not see but that crawlers can look at if they so wish). The two important ones are description and keywords. Crawlers do look at them but tend to give the information a low rating, because these tags have been misused in the past.

So, when a page is developed, we should think about the words and phrases a user might use when searching. We should make sure that the most important ones are in the title, ensure that the other words are in the text (preferably several times), and finally think about the metadata description and keywords.
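
As a sketch of what this can look like in practice, with a hypothetical company and product names, the key search phrase appears in the title, is repeated in the visible text, and is echoed in the metadata description and keywords:

```html
<head>
  <!-- Words in the title are highly rated, so lead with the key phrase -->
  <title>Accessible widgets - Widget Co product range</title>

  <!-- Metadata the browser user never sees; crawlers may read it, usually at a low rating -->
  <meta name="description" content="The Widget Co range of accessible widgets, designed and tested with screen reader users.">
  <meta name="keywords" content="accessible widgets, screen reader, widget range">
</head>
<body>
  <!-- Repeating the key phrase in headings and body text marks it as relevant -->
  <h1>Accessible widgets</h1>
  <p>Our accessible widgets are designed so that every widget works well with a screen reader.</p>
</body>
```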

A word of warning: there are some standards around metadata. The first is that the content of a metadata tag should not be more than 1024 characters long; anything beyond that will simply be ignored. The second is that keywords, according to the standard, can be separated by either spaces or commas; however, some search engines will assume that a string of words without commas between them is intended to be a phrase and will not deal with the individual words. Best practice is therefore always to use comma separators.
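
To illustrate the separator point with a hypothetical keyword list: without commas a search engine may treat the whole string as a single phrase, whereas commas make each keyword or phrase explicit (and the whole content value should stay under the 1024 character limit).

```html
<!-- Risky: may be read as one long phrase rather than as individual keywords -->
<meta name="keywords" content="accessible widgets screen reader magnifier">

<!-- Safer: commas make each keyword or phrase explicit -->
<meta name="keywords" content="accessible widgets, screen reader, magnifier">
```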

Finally, an idea for consideration. Keywords are not used much by external search engines, but they could be useful for internal searches. Content developers can make sure that the keywords are relevant and valid, and that the really important words for a particular page are in the keyword list. The internal search engine could then use the keywords as a very precise way of getting a user to the right page. This emphasis might alter the list; for example, there would be no point in including the company name in the keyword list, but product names could be vital.
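
As a hedged sketch, again with hypothetical product names, a keyword list tuned for an internal search engine might leave the company name out (every page on the site would match it) and list the product names that should bring a user straight to this page:

```html
<!-- Keyword list tuned for the site's own search engine: product names in, company name out -->
<meta name="keywords" content="WidgetPro 3000, WidgetLite, widget accessories">
```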

The careful use of search engine optimisation techniques should greatly improve the accessibility of the site.