iPhone 4S Accessibility

Written By: Peter Abrahams
Published:
Content Copyright © 2011 Bloor. All Rights Reserved.

The iPhone 4 and iOS 4 combined to create a highly accessible smartphone. The iPhone 4S and iOS 5 add new features that further improve accessibility and ease of use for people with disabilities.

The first indication of this is the iPhone 4S user manual, which is provided in two accessible formats: plain HTML and tagged PDF. All the accessibility features are gathered into a single 15-page chapter covering:

  • VoiceOver
  • Routing the audio of incoming calls
  • Siri
  • Triple-Click Home
  • Zoom
  • Large Text
  • White on Black
  • Speak Selection
  • Speak Auto-text
  • Mono Audio
  • Hearing aid compatibility
  • Custom Vibrations
  • LED Flash for Alerts
  • AssistiveTouch
  • Universal Access in Mac OS X
  • TTY support
  • Minimum font size for mail messages
  • Assignable ringtones
  • Visual voicemail
  • Widescreen keyboards
  • Large phone keypad
  • Voice Control
  • Closed captioning

Most of these were available in previous releases of iOS and already made the iPhone an accessible device for many users with disabilities. In iOS 5 the two big accessibility additions are Siri and AssistiveTouch. Both move assistive technology forward into new areas, and at the same time suggest further improvements that could be made.

Siri is the fun extension, so I will discuss AssistiveTouch first.

Previous versions of the iPhone were not accessible to people with limited dexterity: the volume buttons on the side require significant force to press, and gestures such as pinch can be impossible for people with limited or no hand control. AssistiveTouch enables all the button controls and any gesture to be operated with a single touch, either from a finger or a stick. When AssistiveTouch is switched on, a small semi-transparent circle appears on screen; touching it brings up a series of menus that include mute, volume up and down, pinch and multi-finger gestures, each of which can be selected and operated using just one finger or an equivalent pointing device.

AssistiveTouch and VoiceOver do not work together. Given that AssistiveTouch has a visual interface and VoiceOver is designed for people with visual impairments, this does not initially appear to be an issue. However, it is a limitation for people who use VoiceOver because they prefer having text read to them: people with dyslexia, people who find reading difficult, or people with enough vision to see the buttons but not to read the text.

AssistiveTouch has another possible use: it could allow Apple to remove some of the physical buttons, which would simplify the design and build and probably increase reliability. If that were done, AssistiveTouch would have to work with all functions of the phone, including VoiceOver. AssistiveTouch is a significant new function that gives access to people with a range of disabilities who were not supported previously, and further improvements could support even more.
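
As an aside for app developers, an individual app can already detect when VoiceOver is running and have short messages spoken aloud. The sketch below illustrates this with UIKit's UIAccessibility status and announcement calls; it is written in modern Swift, a later spelling than anything that existed on iOS 5, and the class name and message text are invented for the example.

    import UIKit

    // Illustrative sketch only: UIKit's UIAccessibility status and announcement
    // APIs, shown in their modern Swift spelling (later than the iOS 5 era).
    final class SyncStatusViewController: UIViewController {

        private var voiceOverObserver: NSObjectProtocol?

        override func viewDidLoad() {
            super.viewDidLoad()

            // Re-check whenever the user switches VoiceOver on or off in Settings.
            voiceOverObserver = NotificationCenter.default.addObserver(
                forName: UIAccessibility.voiceOverStatusDidChangeNotification,
                object: nil,
                queue: .main
            ) { [weak self] _ in
                self?.announceStatusIfNeeded()
            }
            announceStatusIfNeeded()
        }

        private func announceStatusIfNeeded() {
            // Do nothing unless VoiceOver is actually running.
            guard UIAccessibility.isVoiceOverRunning else { return }

            // Ask VoiceOver to read a short status message aloud, so the user
            // hears it without having to locate and read the label on screen.
            UIAccessibility.post(notification: .announcement,
                                 argument: "Sync complete. Three new messages.")
        }
    }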

Siri is the voice-activated ‘humble virtual assistant’. Press and hold the home button for a few seconds, or raise the phone to your ear, and Siri starts up and asks ‘What can I help you with?’ You can then make requests such as:

  • ‘What am I doing on Saturday?’
  • ‘Call my wife’
  • ‘How many calories in a bagel?’
  • ‘Remind me to defrost the chicken when I get home’
  • ‘What is the time in Vienna?’
  • ‘Book lunch with my son next Friday at one’
  • ‘Who are you?’ with the response ‘A humble virtual assistant’.

If you are in the USA there are further request types, such as ‘Find me an Italian restaurant in Pasadena’. Siri is in beta and Apple have not yet connected it to suitable information sources outside the States. If the beta is a success, as I am sure it will be, then I assume Apple will extend these sources to other countries.

This is certainly a significant accessibility feature, as voice activation makes it easy to use for people with vision impairments or limited manual dexterity. Not only does it simplify inputting a request, it also includes significant intelligence about how to fulfil it. Many people may not know how to look up the time in Vienna, but Siri can find the information. As Siri improves over time this intelligence will become a major benefit, as users will not have to understand how to search the web or navigate apps to get the results they need. This will be useful for most people, but particularly for newcomers to technology, the elderly (some of whom are technophobic), and people with cognitive impairments. It is a step towards making technology transparent to the user.

Siri highlights an accessibility issue that has not been considered much to date. If someone cannot speak clearly enough for Siri to understand, they will be denied the benefit of its intelligence. This will affect many, but not all, people with a significant hearing impairment, as well as people with any condition that makes clear speech difficult or impossible. This is the first example I have come across of an accessibility issue for people with speech impairments, but I would expect other applications in the future to have similar issues. In a future release Siri should provide an alternative input channel besides speech; the obvious one would be text, but a really exciting one would be sign language.

Overall, the new accessibility features in iOS 5 on the iPhone 4S are impressive and provide a base for further enhancements in future releases.