Why not make the iPhone more Accessible?

Written By: Peter Abrahams
Content Copyright © 2008 Bloor. All Rights Reserved.

The iPhone is the must-have technology of the moment—unless you are unable to use it. A user must be able to:

  • See fairly well—not 20-20 vision, but something close. A blind user cannot see the controls on the screen and therefore cannot control the device; a user with a vision impairment may find the controls insufficiently clear, and again the device is of no use.
  • Hold the device steady with one hand whilst controlling it with the fingertips of the other. This rules out use by a tetraplegic user, and it also appears to rule out ladies with long nails (the nails get in the way of the fingertips, and trying to control the device with the pads of the fingers or with the nails does not seem to work well, if at all).

If you search for ‘iphone accessibility’ on the Apple site you will find a list of standard features that can make the iPhone more accessible. It is a useful list, but it leaves a significant number of holes.

It might be argued that the iPhone is a very visual device, so a blind user would not want to use it. This is wrong, because the iPhone has a large number of features that will appeal to blind users and that are not available on other devices. These include the excellent integration with the diary and address book on the Mac, the integration of the iPod function into the phone, and functions relating to GPS (a friend of mine puts a to-do on his iPhone to shop at a particular supermarket and, when he is in the vicinity, it automatically reminds him).

So the question becomes how could the iPhone be made more accessible without a complete redesign?

My suggested answers are based around the other input and output facilities of the iPhone: sound input, sound output, the movement sensor and the vibrator, as well as suggestions for alternative pointing devices and grips.

The iPhone uses a capacitive touchscreen, which recognises the capacitance of a bare fingertip; it does not recognise a fingernail or a plastic stylus because they do not conduct electricity well. It is possible to design stylus pointing devices that do have the right capacitance and so would be recognised. The simplest would be a metal stylus held in the hand, but a general-purpose device that works independently of bare fingers might be more useful. Such a device could be used with a prosthetic hand, at the end of a pointer held in the mouth, or in a variety of other ways. It could also prove useful to apparently able-bodied users who cannot use bare fingertips, including the lady with long nails but also workers in cold or dangerous situations who have to wear gloves. In fact, there are third parties who sell such devices, but they are not mentioned in the Apple documentation.

The typical way to use the iPhone is to hold it in one hand and control it with the fingers of the other. Some people do not have two hands, and others need to use one hand for something else, so a variety of stands, straps, clips and grips should be available. Again, these are available from third parties but are not marketed as accessibility aids.

The iPhone has superb sound quality, which could be used for a screen reader, but at present it is not. Such a function would make the iPhone accessible to a large number of people with vision impairments. People who can see well enough to identify the buttons on the screen, but not well enough to read the text, would benefit from a screen reader that read out the area being pointed at and then activated it if it was pressed again. The Mac operating system, OS X Leopard, already has the VoiceOver technology, so Apple have the basis for developing an iPhone VoiceOver.

If the user cannot see the screen at all then VoiceOver will not be sufficient, and they will need another method to control the device. Voice recognition is the most obvious possibility; again, the Mac already has Speech Command, so building it into the iPhone should not be difficult.

Another possible input that is unique to the iPhone is the motion sensor. Any motion of the device can be detected; the standard phone already uses this to switch between landscape and portrait presentation depending on how the device is being held, and there are games that use it to control an on-screen car. My thought is that it could also be used to move the focus—for example, tilting up, down, left and right could be the equivalent of the cursor keys, whilst rotating right and back could be the equivalent of enter.
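The tilt-to-cursor idea above can be sketched in a few lines. This is purely illustrative—the function names, thresholds and degree-based readings are my assumptions, not any real iPhone API—but it shows the essential logic: a dead zone so that small hand tremors are ignored, and the dominant tilt axis chosen as the cursor direction.

```python
# Illustrative sketch only: maps hypothetical accelerometer pitch/roll
# readings (in degrees) to focus-movement commands. Thresholds and names
# are assumptions for the sake of the example, not a real API.

DEAD_ZONE = 15.0  # degrees of tilt treated as "holding still"

def tilt_to_command(pitch, roll):
    """Return 'up', 'down', 'left', 'right' or None for a tilt reading."""
    # Ignore small tilts so hand tremor does not move the focus.
    if max(abs(pitch), abs(roll)) < DEAD_ZONE:
        return None
    # Pick whichever axis is tilted further.
    if abs(pitch) >= abs(roll):
        return "up" if pitch > 0 else "down"
    return "right" if roll > 0 else "left"

print(tilt_to_command(30.0, 5.0))    # tilted well forward -> "up"
print(tilt_to_command(2.0, -25.0))   # tilted left -> "left"
print(tilt_to_command(3.0, 2.0))     # nearly level -> None
```

A real implementation would also need to debounce the readings (emit one command per deliberate tilt, not one per sample), which is the same problem key-repeat logic solves on a keyboard.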

The vibrate feature can obviously be used for warnings, as it is at the moment. It could also be used to give information: for example, in conjunction with GPS and the motion sensor, the user could rotate the device until it vibrates strongly and then walk in that direction.
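The "vibrating compass" suggestion above amounts to two calculations: the bearing from the user's GPS position to the destination, and a vibration intensity that peaks when the device's heading matches that bearing. A minimal sketch, with the intensity curve and all names being my own assumptions:

```python
import math

# Illustrative sketch of the "vibrating compass" idea: bearing to a
# destination plus a vibration intensity (0.0 to 1.0) that is strongest
# when the device points at the target. The linear fall-off is an
# arbitrary choice for the example.

def bearing_degrees(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def vibration_intensity(heading, bearing):
    """1.0 when facing the target, fading linearly to 0.0 when facing away."""
    error = abs((heading - bearing + 180.0) % 360.0 - 180.0)  # 0..180 degrees
    return max(0.0, 1.0 - error / 180.0)

# Facing due east while the target lies (roughly) due east: near-full strength.
b = bearing_degrees(51.5, -0.1, 51.5, 0.9)
print(round(vibration_intensity(90.0, b), 1))
```

The motion sensor supplies the heading as the user rotates on the spot; the phone would re-run `vibration_intensity` on each reading and drive the vibrator accordingly.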

Whenever I look at devices or features designed to help people with disabilities, my next thought is how they could also help a larger population. The obvious population in this case is car drivers, for whom hands-free, eyes-free control is becoming a must and voice output of SMS is becoming the norm.

It would seem to me that:

  • Apple should do more to provide basic accessibility for the iPhone.
  • Specialists in specific disabilities should provide further functionality via the Apple App Store.
  • Information about all the accessibility features from Apple and third parties should be available in one place.

If you have developed an app or add-on device that could make the iPhone more accessible, or even if you just have an idea for one, please add your comments to this article.