Humans possess many abilities for interacting. Sometimes these abilities are reduced through environmental factors, injury, disability, or natural degradation from aging. The key to accessing electronic devices is to use the other abilities that we have when the preferred abilities are not available to us.
Seeing enables us to use visual interfaces such as computer screens and VCR programming displays, and to locate buttons: for example, to tell whether a number pad has the "1" key at the top left or the bottom left.
An inability to see anything on an electronic device might come from it being dark, from our eyes being occupied elsewhere (e.g. while driving a car), or from disease or injury which causes blindness. Alternative means of interacting in this case include speech input and output, using lists of interface elements (instead of having to feel around for them), and tactile displays such as Braille.
A difficulty in seeing electronic devices might result from it being too dark, from leaving our glasses behind, from having a color vision deficiency, or from disease or injury, which can create a very wide range of visual abilities (losing half our vision, having patchy or blurred vision, having tunnel vision etc.). Alternative means of interacting in this case include those for an inability to see (speech input and output, lists, and tactile displays) as well as changing the way something looks (enlarging text, changing colors or font types etc.).
Hearing enables us to hold spoken conversations (for example, with another person on the telephone), to listen to music, to watch TV, and to hear devices that beep to tell us when something is finished (like microwave ovens).
An inability to hear the sounds coming from an electronic device might come from working in a noisy office or on a shop floor, from being in a noisy bar, or from disease or injury which causes total hearing loss. Alternative means of interacting in this case include showing visual events for audible events (e.g. a flashing light as well as a beep) and giving written text for spoken text and incidental sounds (e.g. closed captioning).
A difficulty in hearing the sounds coming from an electronic device might come from being in a moderately noisy environment, or in a necessarily quiet environment (e.g. a library), or from disease or injury which causes decreased hearing (e.g. conductive hearing loss) or interrupted hearing (e.g. tinnitus, a constant ringing sound in the ears). Alternative means of interacting in this case include those for an inability to hear (visual events for audible events), changing the way something sounds (for example making it louder or changing the pitch), or connecting the device directly to a hearing aid.
Speaking enables us to hold conversations face-to-face or over the telephone, and to give speech input (for example, to place a collect call using a computerized operator).
A difficulty or inability to speak (and be understood clearly by the listener) might come from being in a noisy bar, or from being in a quiet library, from just returning from dental surgery, from having a strong accent or dialect, or from disease or injury which prevents the mouth functioning in the normal way. Alternative means of interacting with electronic devices in this case include using a keyboard or keypad to enter information instead of or in conjunction with speech input.
Being able to touch and manipulate items with our hands enables us to press buttons, to move switches and dials, to make gestures etc.
An inability to touch and manipulate might come from being too far away (e.g. too high or too far to reach, like a small child at a vending machine), or from disease or injury which causes an inability to reach (e.g. paralysis below the neck). Alternative means of interacting in this case include using remote-control devices, and using speech input.
A difficulty in touching might come from having gloves on in cold weather, or from disease (e.g. cerebral palsy, which might make hand control difficult) or injury (e.g. having a big bandage on one's hand). Alternative means of making input in this case include those for an inability to touch (remote control, speech input) and confirming selections (e.g. giving speech or highlighting feedback, and then requiring confirmation of that selection using another button that is separate and distinct).
Our ability to understand something is determined by the skills and knowledge we possess to interpret and process what we experience. The ability to understand can be reduced by factors such as load (e.g. how many tasks we are doing at one time), stress (e.g. if we are panicking, or under time pressure), or fatigue (e.g. being awake too long or expending too much mental effort). Understanding can also be hindered when something is not communicated at our level (e.g. an engineer might understand a term which a layman does not; a physician might understand a medical term which an engineer does not). Reduced ability to understand can also come from disease or injury which affects mental processes. Alternative means of interacting in this case include changing the way something looks (e.g. colors, text sizes and fonts), changing the level of the language (e.g. simplification), changing the way speech is presented (e.g. making it faster or slower etc.), and using a remote control which simplifies the interface.
Combinations of different needs
In addition, people might have combinations of reduced abilities, for example not being able to see and hear well. These combinations of reduced abilities require us to use our other abilities in sometimes unusual or innovative ways. There may be a need to use a combination of the strategies above, or even different strategies altogether.
Our ability to understand each other comes from having a common language. However, when we move to a foreign country, whether for a short period (a vacation or a business trip) or a long period, we may be unable to understand the local language, or at least have some difficulty in understanding it.
The need for general good design
Note that the alternative means of interacting with electronic devices described here can be augmented by good design: as an example, speech input might be an alternative when something is too far to reach, but if moving the interface elements to within easy reach of everyone is possible, then that is a far better and more universally acceptable solution.
There are many different ways to change the way an electronic device behaves to take account of the varying needs of users. The following describes strategies for addressing the needs of users with a wide variety of abilities and limitations.
If the user cannot see the device, make it say things so they can use their ears
Things that cannot be seen can be said, using synthesized or pre-recorded speech. For example, items on a touchscreen interface can be pressed and their name and contents said aloud; button names on a device can be said aloud so that a user can explore an interface before selecting items. Because the user needs to explore the interface by ear, it is important that touched items are not immediately selected, so an alternative means of selection is necessary: this can be a requirement to confirm by pressing another button (one that is off to the side or edge and can easily be found and used without pressing other buttons), or pressing and holding a button for a short delay time until it is selected.
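As a minimal sketch of this "announce on touch, activate only on confirm or hold" behavior (the class, method names, and delay value here are invented for illustration, not taken from any actual device):

```python
HOLD_DELAY = 1.5  # assumed: seconds to hold before a touch becomes a selection

class ExploreInterface:
    """Hypothetical touchscreen where touching announces, not activates."""
    def __init__(self, items):
        self.items = items    # names of on-screen buttons
        self.focused = None   # last item touched (announced, not activated)
        self.spoken = []      # log of speech output (stand-in for a synthesizer)

    def touch(self, item):
        """Announce the item under the finger without activating it."""
        self.focused = item
        self.spoken.append(item)

    def confirm(self):
        """A separate confirm button activates the last announced item."""
        return self.focused

    def touch_and_hold(self, item, held_for):
        """Holding a touch past the delay also activates the item."""
        self.touch(item)
        return item if held_for >= HOLD_DELAY else None

ui = ExploreInterface(["play", "stop", "eject"])
ui.touch("stop")   # announced only, nothing happens
ui.touch("play")   # user keeps exploring by ear
assert ui.confirm() == "play"                               # confirm activates
assert ui.touch_and_hold("eject", held_for=2.0) == "eject"  # hold also activates
```

The key design point is that exploration is free: any number of touches can occur, and only the explicit confirm (or hold) commits a choice.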
If the user cannot understand things that are said by the device, let them change the way it says it
Synthesized speech is malleable in the same way that text is. It can be made faster or slower, the pitch can be raised or lowered, the basic voice can be altered (e.g. male / female / robotic etc.).
If the user has difficulty seeing the device, let them change the way it looks
Text is malleable depending upon the constraints of the visual interface. For example, fonts can be enlarged, changed between serif and sans serif, made white on black or any other color combination. A visual interface may be constrained in the maximum size of the text, the colors that are available, and the clarity (resolution) possible.
If the user cannot locate the buttons on the device, let them use a list with only 3 buttons
If the user is unfamiliar with or cannot reliably find or remember where all of the buttons are on an interface, an alternative is to put all of the items onto a list. The list has all of the interface items (buttons, switches, text etc.) arranged logically from top to bottom. The list can be shown visually or read aloud. It can be accessed by using 3 buttons (up / down / select), or by sliding a finger along an edge (e.g. of a touchscreen). The list works because it takes a two-dimensional interface and makes it one-dimensional. Although this is a cognitively more complex interface strategy, it does allow access by people who are unable to locate interface elements independently.
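The 3-button list access described above can be sketched as follows (a hypothetical illustration; the names are not from any real device):

```python
class ListAccess:
    """All interface items arranged in a one-dimensional list,
    navigated with just three buttons: up, down, select."""
    def __init__(self, items):
        self.items = items
        self.index = 0  # the highlighted (or spoken) item

    def up(self):
        """Move the highlight up one item, stopping at the top."""
        self.index = max(0, self.index - 1)
        return self.items[self.index]

    def down(self):
        """Move the highlight down one item, stopping at the bottom."""
        self.index = min(len(self.items) - 1, self.index + 1)
        return self.items[self.index]

    def select(self):
        """Activate the currently highlighted item."""
        return self.items[self.index]

nav = ListAccess(["volume", "channel", "power"])
nav.down()                        # highlight moves to "channel"
assert nav.select() == "channel"  # select activates it
```

Whether the highlighted item is shown visually or spoken aloud, the navigation logic is the same, which is what makes the list strategy work across modalities.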
If the user has difficulty hearing the device, let them change the way it sounds
Sound contains properties that can be altered, such as volume (loudness) and pitch. Modifying these can help users who are unable to hear a device operating normally. In addition, it is possible to directly connect hearing aids to sound sources, providing a better listening system (e.g. headphone jack connection or telephone hearing aid T-coil connection).
If the user cannot hear the sounds from the device, show the sounds visually
Any sounds that a device makes can be shown visually, for example by making a display or indicator light flash when a sound is made. Spoken text and sounds can be shown in "caption" form, enabling someone who cannot hear at all to have access to the same information as people who can easily hear.
If the user cannot be sure of pressing the right button, allow them to confirm button presses
If someone cannot reliably press individual buttons (for example because they are too small), it can be easier to confirm button selections. This can be done by highlighting, or saying aloud with synthesized or pre-recorded speech, which button was pressed. For example, when using a cellular phone outside in the winter with large gloves on, you might aim for the "dial" button but miss and press the "cancel" button; because confirmation is required, nothing is actually activated. Instead, you try again until the "dial" button is highlighted, then press the confirm button, which is off to the side of the phone away from all of the other buttons.
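The glove-and-keypad example above can be sketched like this (an illustrative model only; the class and key names are invented):

```python
class ConfirmedKeypad:
    """Hypothetical keypad where a press only highlights a key;
    a separate, physically distinct confirm button commits it."""
    def __init__(self):
        self.highlighted = None

    def press(self, key):
        # A press highlights (and could also speak) the key,
        # but activates nothing yet.
        self.highlighted = key
        return key

    def confirm(self):
        # The confirm button commits whatever is highlighted.
        chosen, self.highlighted = self.highlighted, None
        return chosen

pad = ConfirmedKeypad()
pad.press("cancel")            # missed! but nothing is activated
pad.press("dial")              # try again until the right key is highlighted
assert pad.confirm() == "dial" # only now is the call placed
```

Because every mis-press is recoverable, the user can keep trying until the feedback (highlight or speech) shows the intended key.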
If the user cannot provide speech input, allow them to use buttons instead
If someone cannot speak but an interface uses speech input, an alternative can be to press buttons or keys. For example, at the beginning of a computer controlled operator telephone call, a starting prompt could be "press '1' to use your touch-tone phone to control this call, or say 'OK' now to use speech control".
If the user cannot reach or touch the device, let them give commands by speech
If someone is unable to reach or see an interface, they can control it using speech input together with appropriate output (either audible (speech) or visual). Words spoken by the user are interpreted by the device and used as commands to control the interface.
If the user can see, but can only use one or two switches for input, let them step around the buttons using scanning
If someone can only use one or two switches (for example if they are paralyzed from the neck down), it is possible to control the interface by having each item highlighted (or said aloud) one by one. When the item that the user wants is highlighted, they can select it using a single switch. With two switches, one switch advances the highlight and the other selects. The two-switch method gives more flexibility and control, but not everyone can use two switches, which is why single-switch scanning is available. Note: it is possible to scan using auditory feedback, but a user would more likely use speech output and a list to interact with the device.
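A minimal sketch of two-switch scanning (illustrative names; in single-switch scanning the advance step would be driven by a timer instead of a second switch):

```python
class Scanner:
    """Hypothetical scanner that steps a highlight through the
    interface items; a switch press selects the highlighted one."""
    def __init__(self, items):
        self.items = items
        self.index = -1  # nothing highlighted yet

    def advance(self):
        # Second switch (or an automatic timer, for single-switch use)
        # moves the highlight to the next item, wrapping around.
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

    def select(self):
        # First switch activates the currently highlighted item.
        return self.items[self.index]

scan = Scanner(["yes", "no", "help"])
scan.advance()               # highlight "yes"
scan.advance()               # highlight "no"
assert scan.select() == "no"
```

The wrap-around matters: if the user steps past the item they want, they can simply keep advancing until the highlight comes around again.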
If the user wants to use their own customized type of input and output, let them use Remote Control
If a user cannot reach an interface, they can control it remotely using an infrared link. The infrared link works in a standard way, so one remote control can operate many electronic devices. The user points the remote controller at the device, and the device sends the remote controller the available commands for itself. The commands can be accessed as a list (which enables them to be converted to hand-held speech output, Braille output, large text output etc.), or as a graphical image with buttons similar to those on the device itself. The remote control can be configured to meet the needs of the individual user: different displays and levels of information can be shown on the remote control, making the interface simpler to use.
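The query-and-render exchange described above might be sketched as follows. This is an assumption-laden illustration of the idea, not the actual infrared protocol: the device reports its commands, and the personal remote presents them in whatever form its owner needs.

```python
class Device:
    """A stand-in for any electronic device on the infrared link."""
    def __init__(self, name, commands):
        self.name = name
        self.commands = commands

    def describe(self):
        # On a query from a remote, the device reports what it can do.
        return {"device": self.name, "commands": self.commands}

class PersonalRemote:
    """A remote configured with the user's preferred presentation,
    e.g. speech output, Braille output, or large text."""
    def __init__(self, render):
        self.render = render  # a function that converts a command for display

    def connect(self, device):
        info = device.describe()
        return [self.render(c) for c in info["commands"]]

vcr = Device("VCR", ["play", "stop", "record"])
remote = PersonalRemote(render=str.upper)  # stand-in for large-text rendering
assert remote.connect(vcr) == ["PLAY", "STOP", "RECORD"]
```

The same remote, pointed at a different device, would simply receive and render that device's command list, which is what lets one customized controller serve many appliances.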
If the user cannot use the standard language, let them change the language to one they can understand
All of the above means of interaction are of no use if the standard language of the device is not the user's own and they cannot understand it, or have some difficulty in understanding it. Multi-language support will depend upon the location of the device: for example, in the USA the two most predominant languages are English and Spanish, but if a public information device were placed in a tourist area, then Chinese, Japanese, German, French, Italian etc. would be useful to allow visitors access to the information.
How and what to change on an electronic device to make it usable by someone with reduced abilities: the following quick-reference table summarizes the strategies above. The table is also available as linear text.
| Device Problem | Solutions |
| --- | --- |
| Cannot see the device well | Change the way it looks (enlarge text, change colors and fonts); speech output |
| Cannot see the device at all | Speech output; list access; tactile (Braille) output |
| Can't locate buttons on the device (can't see where they are) | List access with 3 buttons (up / down / select) |
| Can't read the device well | Change the way it looks (enlarge text, change colors and fonts) |
| Can't read the device at all | Speech output |
| Can't hear the device well | Change the way it sounds (volume, pitch); direct hearing aid connection |
| Can't hear the device at all | Show the sounds visually (flashing indicators, captions) |
| Can't hear the device well and can't see the device well | Combine the strategies above (change the way it looks and sounds) |
| Can't reliably press only one button at a time on the device | Confirm button presses |
| Can't touch the device / Can't reach the device | Speech input; remote control |
| Can only use one or two buttons on the device | Scanning (step around the buttons) |
| Can't talk to the device | Buttons or keys instead of speech input |
| Can't understand the device | Change the presentation (looks, speech rate); simplify the language; simplified remote control |
| Can't understand the language | Change the language (multi-language support) |
For more information on this document contact:
Chris Law Trace R&D Center
S-151 Waisman Center
1500 Highland Avenue Madison, WI, 53705-2280
This is a publication of the Trace Research and Development Center which is funded by the National Institute on Disability and Rehabilitation Research of the Department of Education under grant number H133E30012. The opinions contained in this publication are those of the grantee and do not necessarily reflect those of the Department of Education.
© Copyright 1998, Trace Center, University of Wisconsin-Madison, USA.