
Guidelines for the Design of Screen and Web Phones to be Accessible by Visually Disabled Persons

Edited by John Gill
December 1998

With contributions from Lucy Albu, Dominique Burger, Marine Ferré Blanchard, Djamel Hadjadj, Valerie Johnson, Marco Mercinelli, Helen Petrie, Adrian Picton and Ted Pottage


Glossary of Terms

1 Introduction

2 Blind and Partially Sighted Persons

3 Screen and Web Phones

3.1 ADSI Protocol

4 Access Issues for Visually Impaired People

5 Strategies for Adaptation

5.1 Terminal Adaptation

5.2 Adaptation of Services

6 Guidelines

6.1 Displays

6.2 Keys

6.3 Pointing Devices

6.4 Output

6.5 Open Software Architecture

6.6 Services



Appendix - Web Page Design

The VISTEL Consortium

Glossary of Terms

ADSI Analogue Display Services Interface

Adaptation Box In the VISTEL project context, a specific box (hardware and software) developed to manage the interworking between the P100 screen phone terminal and the external adaptation devices.

CPE Customer Premises Equipment

DTMF Dual Tone Multi Frequency

HDML Handheld Devices Mark-up Language

HTML Hyper Text Mark-up Language

LED Light Emitting Diode

Minitel The commercial name of the French screen phone terminal for access to the Télétel services.

Screen Phone An enhanced telephone with a display and some function keys. It is usually based on a programmable microprocessor, RAM and non-volatile memory. It can have a small keyboard and additional peripherals such as a card reader, a printer or an RS232 interface. Most screen phones can act as clients for supplementary or telematic services based on the ADSI protocol.

Screen Reader Software that translates a visual display into a form that is meaningful through non-visual output.

Soft Key A function key with a special position on the keyboard that can be associated with a function label on the screen or with a menu item. The activated function is defined by the application within its context, and can change during the dialogue session.

WAP Wireless Application Protocol

Web Phone A screen phone which also permits access to the world wide web (internet).

WML Wireless Markup Language

1. Introduction

Nowadays on-line services (on the internet, for example) allow people to book a flight, check on a package, get a stock quote or buy a book from their home or office.

Until recently such services could be accessed only using a personal computer (PC): you had to plug in cables, turn on the machine, and install and configure the operating system and all the software needed to communicate. You also had to install additional hardware (a modem) while avoiding conflicts with the hardware already in the PC. All these operations are complex and prone to errors.

This is why the main computer manufacturers are developing network computers: just connect the cables and everything is ready to work. But even such equipment can be considered too much for simply accessing currently available information services, especially if the customer has no other need for a computer: what is enough is a phone with improved output: a screen, plus a keyboard to be used when short sentences need to be written. Such equipment is called a screen phone or, when the built-in software allows the use of standard internet protocols, a web phone. Some years ago these devices were too expensive, but prices are now decreasing rapidly, and this equipment is moving beyond niche markets towards mass usage.

As with all terminals, failure to take into consideration the needs of visually impaired people means excluding them from new services, thereby widening the gap with the rest of the population as well as excluding a remunerative and growing section of the market.

This publication intends to show designers how such terminals can be made accessible to visually impaired people, addressing the needs of both low vision and blind people; many of the recommendations are also applicable to deafblind persons and people with dyslexia. The information is based on the results of the VISTEL project, whose aim has been to adapt screen phones to make them easily accessible for visually impaired people.

2. Blind and Partially Sighted Persons

In most developed countries the prevalence of people whose vision is such that they need to use non-visual methods for operating a telephone is about 1 per thousand. However the number of people with low vision such that they would find it impossible to read the liquid crystal display on a screen phone, such as the Philips P100, is in excess of 13 per thousand.

Most visual impairments are acquired late in life; about two thirds of people with a visual disability are over 75 years old. Macular degeneration accounts for about half the visual impairment in developed countries; typically it results in loss of central vision.

For those under the age of 60, the largest single cause of visual impairment is diabetic retinopathy. This is often correlated with a very poor sense of touch, which makes learning and reading Braille difficult.

In excess of half of the visually impaired population will have an additional impairment; about 35% will have a significant hearing loss.

Braille readership is about 0.02% of the population, and is largely made up of people whose onset of visual impairment was early in their lives.

The most common forms of colour blindness are inherited and are associated with the inability to discriminate red and green wavelengths. Because these defects are inherited as recessive traits, the incidences are much higher in UK males (c. 8.0%), who possess a single X-chromosome, than in females (c. 0.5%), who possess two. Total colour blindness is extremely rare.

3. Screen and Web Phones

A web or screen phone is composed mainly of:

  • a receiver, for both audio input and output;
  • a telephone bell;
  • a keyboard: usually a main numeric keypad plus a standard alphanumeric keyboard (normally hidden inside the body of the phone, ready to pull out when needed);
  • a screen to show information.

Additional components can be a smart card reader (to simplify transactions and to store information) and soft keys (usually near the screen). Some LEDs can also be used to indicate for example that the phone is ringing.

A web/screen phone can be used as a standard telephone: just type in the number to dial and lift the receiver. When it is used to access new information services, it can also communicate using the IP protocol, protecting the information with protocols like Secure Socket Layer 3.0 if needed. The software in the phone can also allow dialogue with the built-in smart card reader, whose card can contain personal data about the user and service-dependent information.

The interface of the service to the user can be constructed using the Java language (the Personal Java runtime environment is integrated in the phone), allowing applications to be easily created, or using HTML; the supported version of HTML includes frames for better presentation.

3.1 The ADSI Protocol

ADSI (Analogue Display Services Interface) is a "screen phone" telephony standard that has been defined by Bellcore in the USA.

ADSI supports two operating modes: Feature Download (FD) and Server Display Control (SDC). Each has a different use. Both modes depend on the Abstract CPE for their operation.

The Abstract CPE is the application developer's view of what the CPE (Customer Premises Equipment) looks like. It allows the same Feature Download script or Server Display Control application to work with any ADSI phone regardless of its display size. Each phone has an area of memory that implements the Abstract CPE.

All information and soft key labels to be displayed are written into the phone's Abstract CPE. The phone then maps the Abstract CPE to its physical display and soft keys. The scroll keys on the phone allow the user to see any information the display is not big enough to show.
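The mapping just described can be illustrated with a small sketch. This is a hypothetical model for illustration only; the class and method names are invented and are not part of the ADSI standard:

```python
# Hypothetical sketch: mapping an Abstract CPE text buffer onto a
# smaller physical display, with scroll keys revealing hidden lines.
class AbstractCPE:
    def __init__(self, lines, display_rows):
        self.lines = lines                 # information written by the server or script
        self.display_rows = display_rows   # rows the physical display can show
        self.top = 0                       # index of the first visible line

    def visible(self):
        """Return the lines currently mapped onto the physical display."""
        return self.lines[self.top:self.top + self.display_rows]

    def scroll_down(self):
        if self.top + self.display_rows < len(self.lines):
            self.top += 1

    def scroll_up(self):
        if self.top > 0:
            self.top -= 1

cpe = AbstractCPE(["Call waiting:", "1 Take call", "2 Ignore call"], display_rows=2)
print(cpe.visible())   # the first two lines fit the display
cpe.scroll_down()
print(cpe.visible())   # a scroll key reveals the hidden third line
```

The same server-side content would thus be usable on a two-row or a ten-row phone; only the mapping changes.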

Feature Download (FD) is used to make normal use of the telephone easier by giving the user a visual interface.

A Feature Download script is a program downloaded to the phone to allow it to be customised. The phone can hold in its memory up to four Feature Download scripts though only one can be active at any one time. When a network event (e.g. reception of a caller display message or detection of dial tone) occurs that is catered for in the active script, the script takes control of the display and presents the user with a number of options. This may be while the phone is on or off hook, connected or not connected to another phone or server.

Thus if the phone has detected that the customer uses the "Call waiting" service and there is a call waiting the user may be given the options to take the call or ignore the call. The user selects the required option by pressing the soft key associated with the label on the display. This causes DTMF (Dual Tone Multi Frequency) to be sent to the network to carry out the option selected. The DTMF is the same as would be sent if the operation was carried out manually. Further options may be presented to the user if appropriate. Thus in the example above if the user took the waiting call the further options might be: "Back to party one" or "Conference the calls".
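The call-waiting example can be sketched as a simple lookup from soft-key labels to DTMF. The labels and digit strings below are invented for illustration; the real codes depend on the network service:

```python
# Hypothetical sketch: a Feature Download script maps soft-key labels
# to the same DTMF the user would have dialled manually.
CALL_WAITING_OPTIONS = {
    "Take call":   "2",   # illustrative digit string, not a real service code
    "Ignore call": "0",
}

def press_soft_key(label, send_dtmf):
    """Selecting a soft key sends the corresponding DTMF to the network."""
    send_dtmf(CALL_WAITING_OPTIONS[label])

sent = []
press_soft_key("Take call", sent.append)
print(sent)  # ['2']
```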

Server Display Control (SDC) is used to display text and soft key options to the user while on-line connected to a server. The addition of a display with soft keys improves the usability of voice/touch-tone services and thus enhances the service possibilities.

An ADSI server is typically an interactive voice response system with the addition of the ADSI signalling capability. The server can send the user a mixture of screens and voice prompts. When the user presses a key an action (e.g. send a DTMF digit) that is associated with that key is performed. For example an ADSI SDC session could allow a user to manage their voice mail via an interface not unlike a normal answering machine, with keys labelled Play, Skip, Erase etc.

The same server can be used for both ADSI and voice / DTMF dialogues. If the phone responds with a DTMF 'A' to an initial short beep of ADSI CPE alert signal (CAS) then the server knows that an ADSI telephone is being used. The server then talks to the phone using the ADSI signalling protocol which defines a way of inserting bursts of 1200 baud FSK data into a normal voice call. The user only hears the alert tone that precedes the data. When the phone detects this tone it mutes its earpiece or speaker until after it has acknowledged correct reception of the data using DTMF. The protocols are very similar to those used for Caller Line Identification and CLASS services.
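The handshake that lets a server distinguish an ADSI phone from an ordinary one can be sketched as simple decision logic. This is an illustration only; in reality CAS, DTMF and FSK are in-band audio signals, not function calls:

```python
# Hypothetical sketch of the server's CAS handshake described above.
# Timings, names and signatures are invented for this illustration.
def start_session(send_cas, wait_for_dtmf):
    send_cas()                            # short CPE alert tone (CAS)
    reply = wait_for_dtmf(timeout_ms=500)
    if reply == "A":
        return "adsi"                     # talk ADSI: FSK bursts, earpiece muted
    return "voice"                        # fall back to a voice / DTMF dialogue

mode = start_session(lambda: None, lambda timeout_ms: "A")
print(mode)  # adsi
```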

Server Display Control is only available when connected to an ADSI server. The server can reside in the Network or can be CPE connected to it. It will probably have connections to external databases.

The Abstract CPE contains the structure of the information and soft key labels to be displayed. How it is displayed depends completely on the mapping rules of the telephone, which are governed by the size and layout of the display.

To handle large characters and graphics a phone would have to be developed that could display them. However a large screen would make the phone very expensive. Larger characters could be displayed on the current screens, but these may require the user to scan and scroll a lot to read the display.

Since the way information is displayed or conveyed is controlled by the phone, it is not necessary to add to the protocol to accommodate blind users. However, the way a particular service is designed and presented can greatly affect its ease of use; this needs to be addressed on a service-by-service basis with the service provider.

It should be borne in mind that ADSI SDC will normally be used to supplement an existing IVR (interactive voice response) service. Also, ADSI FD will mainly be used to aid access to network services that can be accessed using a DTMF telephone.





Display

  • Usually an LCD graphical display (colour or monochrome); minimal resolution is VGA 640x480 pixels; it can be backlit
  • Electroluminescent display

Visual indicators

  • Usually LED indicators, or LED bars

Acoustic output devices

  • Usually just a beeper, but it can be a more complex device
  • Voice synthesiser chip

Keys (keyboard, keypad, buttons)

  • Two separate keyboards: a numeric one on the front of the phone, and a complete foldaway alphanumeric keyboard

Pointing devices

  • Touch screen
  • Pen point

Slots, sockets, external connections

  • Smart card reader
  • PC Card devices (PCMCIA)
  • Sockets for external power supply or connection to other devices such as printers

Voice input

  • Voice input can be used in voice recording equipment

Table 1 Main components of screen-web phones




Network protocols

  • IP v.4
  • HTTP 1.1

Mark-up language

  • HTML 3.2 + frames
  • No cascading style sheets

Java support

  • Personal Java runtime environment
  • Supports the following Java APIs: Telephony API, Smartcard API, Security API, SSL API

Security

  • SSL 3.0

Multimedia support

  • GIF89A, JPEG, AU and WAV files

Various fonts

  • Fixed and proportional, with and without serif

Table 2 Software support of screen-web phones

Since a web/screen phone is a terminal, the usual guidelines about the accessibility of terminals apply. Attention must be paid to the way the user can select objects on the screen: the keyboard should also be usable for this, and web pages should be designed to simplify automatic adaptation.

Telecommunication services provided via screen phones are generally based on some very simple interaction paradigms that can be modelled using a few text-based interaction components.

This has been the case for a great variety of text-oriented services available via VT100 terminals, ADSI screen phones, or the very popular Minitel in France. It will still be the case with services delivered over the internet via HTML interfaces if they adhere to the guidelines for accessibility (see Appendix 1).

Thus these services can be made accessible to visually impaired persons using Braille displays, speech synthesisers or enlarged screen displays. VISTEL has produced these guidelines, which can also be applied to the adaptation or design of a great variety of text-based services and/or terminals, such as banking machines or public information terminals.

4 Access Issues for Visually Impaired People

Before tackling the problems of the user interface, some thought must be given to how blind and visually impaired people are introduced to the technology.

Screen based technologies such as screen phones are often designed to be self taught using hands-on exploration of the user interface, backed up with printed manufacturer's instructions. A great deal of information about what such a piece of technology can do, and how to do it, is communicated implicitly using visual clues.

Both the implicit and the explicit printed information are problematic for blind and visually impaired users. For instance, whilst the concept of separate screens or pages of information is a structure easily understood by someone who has had access to books, there is no corresponding experience for the concept of navigating through a structure of parallel menus. Thus, experience in the VISTEL project has shown that people who already had experience with personal computers quickly grasped concepts such as menus, but many of the others had considerable difficulty in understanding such concepts when presented through a speech synthesiser.

Compared to the young sighted population, visually impaired and older people are likely to be relatively inexperienced at browsing and hands-on exploring as methods of learning unfamiliar technology. It was found that there is some understandable reluctance, even among computer users, to 'play' in the way that computer-literate sighted people will do. Overall they may be more fearful, and require more information and prompting in their learning phase than designers might anticipate.

The quality of audible cues, and particularly the quality of the speech synthesiser, becomes very important, particularly for those users who are hard of hearing (e.g. deafblind and elderly people) and those for whom the language is not their mother tongue. It should be noted that in tests, using the default settings on a medium quality synthesiser, even those people accustomed to synthetic speech needed a training period where the speech was interpreted for them. This was particularly so where an obscure rather than a familiar term had been used. As one subject with some hearing loss pointed out, "it's not perception that is important so much as recognition".

Given the points above, it is not surprising that it was found that some degree of person to person training is almost always required. For those users familiar with computers, the training period will be relatively short, perhaps under an hour. For others the training period would need to be considerably longer.

One important advantage of a training period for those who would use the screen or web phone in their own homes is that it would allow for changes to be made to the audio settings (such as timbre and pitch of speech) to suit the personal requirements of the user. This may be crucial for those with some hearing impairment.

The training period generally needs to include an explanation of what the technology is designed to do, why this might be useful to the user, what the basic operating principles are (i.e. navigating through lists), and a guided hands-on tour of the hardware. Further training should be hands-on and structured around tasks.

Visually impaired users face particular problems with the user interface of the screen or web phones.

Several general categories of activity can usefully be identified as necessary for successful use of the phones. These activities are often largely dependent on interactive visual cues which are unavailable to visually impaired users. In addition, the design of the phones often assumes a degree of manual dexterity and hand-eye coordination which may be reduced or absent, particularly in older users. Visually impaired users may also need longer to complete tasks. A further issue is the question of how visually impaired people can access information about options without actually activating the options themselves.

The design of the user interface will need to address these issues for each of the activities, providing both modified visual information, and non-visual information (e.g. tactile or audible cues and speech).

Some of the activities necessary for successful use of the phones include:

  • Accessing manufacturer's instructions
  • Reading information on the screen
  • Locating / distinguishing between keys
  • Knowing what is the function of a key or dial
  • Using correctly located keys without error
  • Entering and checking text
  • Knowing whether an action has been successful or not
  • Using the handset
  • Using hands-free mode
  • Inserting a smart card
  • Locating oneself in the information structure
  • Making sense of the information structure
  • Knowing what are the available options
  • Knowing how to locate / navigate to an option
  • Knowing how to activate an option
  • Knowing the status of the phone (e.g. waiting for input, processing an instruction)

Good interface design will allow a variety of interaction techniques, and flexible interaction metaphors.

5 Strategies for Adaptation

5.1 Terminal Adaptation

Usually the terminal needs to be adapted by developing and integrating special software modules:

  • An event/interrupt Interceptor intercepting data displayed on the screen and any key pressed
  • A translation module to create a textual representation of the data on the screen
  • A system for enlarging text on the screen, with the ability to scroll if necessary
  • A screen reader that manages user commands and allows navigation on the screen. This would have a "screen reading" mode to allow the user to move around the screen without moving the cursor or performing an operation.
  • Drivers allowing output through larger screen, speech synthesisers and Braille displays
  • A communication module in case other software components are installed on different devices
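The modules listed above can be pictured as a pipeline from screen interception to non-visual output. The sketch below is purely illustrative; the function names are invented, and a real implementation would hook into the phone's operating system and hardware drivers:

```python
# Illustrative pipeline: interceptor -> translation module -> output driver.
def intercept_screen(raw_cells):
    """Stand-in for the event interceptor: yields rows of display cells."""
    return raw_cells

def translate(rows):
    """Translation module: build a linear textual representation of the screen."""
    return [" ".join(cell for cell in row if cell).strip() for row in rows]

def render(text_lines, driver):
    """Route the textual representation to an output driver
    (speech synthesiser, Braille display or enlarged screen)."""
    for line in text_lines:
        driver(line)

spoken = []
rows = [["Phone", "book"], ["1", "Dial"], ["2", "Exit"]]
render(translate(intercept_screen(rows)), spoken.append)
print(spoken)  # ['Phone book', '1 Dial', '2 Exit']
```

The same translated text can be fed to any of the drivers, which is what makes the screen reader independent of the chosen output medium.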

The adaptation modules can be distributed between devices according to the following scenarios.

5.1.1 All the modules installed on the phone to be adapted

This solution has obvious advantages: better integration and a lower cost. Nevertheless the terminal hardware and operating system have to satisfy some basic requirements:

1. The terminal must be open and programmable
2. The operating system has to provide hooks for low level communication with the adaptation modules
3. Internal memory must be sufficient to host the different SW modules
4. One or more external ports have to be available and accessible in order to connect external adaptations:

  • One serial/parallel port to connect the speech synthesiser
  • One serial/parallel port to connect the Braille display
  • One keyboard/serial port to connect the keyboard
  • External screen port to connect an external screen for screen magnification

More than one port should be available if different external adaptations have to be connected at the same time (e.g. speech synthesiser and Braille display).

5.1.2 Different modules installed on two different devices

If some of the previous hardware requirements are not satisfied, or the software is not flexible enough, some of the software modules have to be implemented on an external device (the Adaptation Box).

In any case the screen phone has to provide at least one external port for connecting the two devices, and the phone operating system must allow interception of screen and keyboard events. A connection module has to be implemented on both devices, implementing a common communication protocol.

It is advisable that the adaptation box is manufactured as a docking station for the screen phone, allowing easy connection and acceptability in a home environment.

The adaptation box can host different ports for external adaptations. It is usually based on standard hardware modules and a standard operating system, allowing faster development and easier maintenance.

This solution has the advantage of being independent of the particular screen phone, and so is easier to maintain or upgrade and can be reused to adapt different types of equipment (e.g. cash dispensers).

5.1.3 Soft Key Interpretation

One typical issue in adapting smart phones is the usage of soft keys. These keys are associated with different functions in different services and in different information pages. The current function of each soft key is usually indicated by a label or an icon in a corresponding special area of the screen.

The adaptation solutions can be the following:

  • The user has an operation mode for browsing the descriptions of the soft keys without activating them
  • The first press of a soft key provides a description; the second press activates the key. Pressing any other key cancels the previous selection. The user could be given the option to disable this facility once they become more familiar with the system or services, so that one press of a key would activate the function while simultaneously announcing its name
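The second adaptation (first press describes, second press activates, any other key cancels) can be sketched as a small state machine. The names below are invented for this illustration:

```python
# Sketch of the "first press describes, second press activates" rule
# for soft keys; key names and callbacks are hypothetical.
class SoftKeyInterpreter:
    def __init__(self, labels, describe, activate):
        self.labels = labels        # current label of each soft key
        self.describe = describe    # e.g. send the label to a speech synthesiser
        self.activate = activate    # perform the key's real function
        self.pending = None         # key described but not yet confirmed

    def press(self, key):
        if key == self.pending:
            self.pending = None
            self.activate(key)      # second press of the same key: activate
        else:
            self.pending = key      # any other key cancels the previous selection
            self.describe(self.labels[key])

events = []
sk = SoftKeyInterpreter({"F1": "Help", "F2": "Delete"},
                        describe=lambda text: events.append("say " + text),
                        activate=lambda key: events.append("run " + key))
sk.press("F1")   # described only
sk.press("F2")   # cancels F1, describes F2
sk.press("F2")   # second press activates
print(events)    # ['say Help', 'say Delete', 'run F2']
```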

5.2 Adaptation of Services

5.2.1 Service Adaptation on the Terminal

Services can be adapted at terminal level using script files and a recognition mechanism: a special module recognises a specific service and performs some actions in order to rearrange the output into a suitable format, for example masking unimportant information, describing images and tables, etc.

The advantage of this method is that no intervention by the service provider is needed. The disadvantage is that a script file has to be written for every service to be adapted.

5.2.2 Modification of services

Not all service characteristics can be dealt with at the terminal level. This implies the need to adapt services by directly modifying the way in which a service is provided. The service provider would have to be directly involved in the adaptation activity.

Some service characteristics that can be dealt with only at the service level are the following:

  • Graphical banners
  • Menus presented by graphical means (e.g. icon lists)
  • Textual description of images
  • Frame based presentation of information
  • Short timeouts
  • Status line
  • No information when in input mode

5.2.3 Service Adaptation By a Translation Server on the Network

Some services can be adapted by accessing them through a centralised translation server. The server accesses the requested service and rearranges the information pages "on the fly", presenting them in a suitable format. This method, based on the use of a proxy server, is especially useful for the adaptation of internet services. However, the cost of running the proxy server would have to be paid by some organisation.
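As a minimal illustration of the kind of rewriting such a proxy might perform, the sketch below replaces images that carry a textual description with that text, and drops undescribed images. A real translation server would use a proper HTML parser rather than regular expressions:

```python
import re

# Illustrative "on the fly" simplification step for a proxy server:
# keep the alt text of images, drop images with no description.
def simplify(html):
    # Replace <img ... alt="text" ...> with its alt text.
    html = re.sub(r'<img[^>]*alt="([^"]*)"[^>]*>', r'\1', html)
    # Drop any remaining (undescribed) images.
    html = re.sub(r'<img[^>]*>', '', html)
    return html

page = '<img src="logo.gif" alt="Bank logo"> Welcome <img src="spacer.gif">'
print(simplify(page).strip())  # Bank logo Welcome
```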

6 Guidelines

The following general guidelines should be followed:

6.1 Displays

Good visibility of display is most important both for people with reduced vision and for fully sighted users in unfavourable light conditions:

  • Good contrast with the background should be provided for text and graphics. The display should give a minimum contrast ratio of 3:1 for positive displays, and a maximum of 1:20 for negative displays.
  • Provide for adequate, adjustable illumination of LCD display whenever possible;
  • Use back illumination of LCD screens whenever possible (a simple regulation of contrast or brightness is not sufficient)
  • Anti-glare provision to avoid reflections whenever possible
  • Possibility to tilt the display
  • Any anti-glare measure should not decrease the sharpness of characters nor darken the background too much; blue has been found to be a good background colour
  • Combination of blue, green and violet should be avoided
  • Colour alone should not carry information
  • If a dot matrix display is used a cell of 9x7 is the preferred minimum, with the addition of four rows to accommodate line spacing, lower case ascenders and descenders and accents

Particular attention has to be given to the choice of screen fonts for the contents:

  • Large fonts should be used whenever possible (10 or 12 points minimum, preferably 16), with a minimum character height of 3 mm for a 500 mm viewing distance
  • Light weight, extra bold and condensed typefaces should be avoided, especially when used in reverse mode

e.g. light weight, extra bold, condensed

  • The possibility to choose and enlarge the typeface should be provided
  • Serif fonts (e.g. Times) are less readable than sans serif (e.g. Tiresias Screenfont)
  • Avoid typefaces with numerals whose tails curl up

e.g. 356890 set in a face without curling tails is more readable than the same numerals in a face with them

  • The distance between lines should be at least 20% of the typeface size
  • Upper and lower case type is easier to read than upper case only
  • Do not use more than 3 luminance levels on the screen

All icons, if any, should have a textual description so that they can be read by a voice synthesiser. Do not use similar images whose meanings can be distinguished only by their details.
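The 3 mm at 500 mm figure above corresponds to a fixed visual angle of roughly 20 minutes of arc, so the minimum character height for other viewing distances can be estimated by scaling. The calculation below is a back-of-envelope sketch derived from that figure, not part of the guidelines themselves:

```python
import math

# Assume the guideline's 3 mm at 500 mm corresponds to a fixed visual
# angle (about 0.34 degrees, i.e. roughly 20 minutes of arc).
ANGLE_RAD = 2 * math.atan(1.5 / 500)   # derived from 3 mm at 500 mm

def min_char_height_mm(viewing_distance_mm):
    """Estimated minimum character height for the assumed visual angle."""
    return 2 * viewing_distance_mm * math.tan(ANGLE_RAD / 2)

print(round(min_char_height_mm(500), 1))   # 3.0
print(round(min_char_height_mm(1000), 1))  # 6.0
```

In other words, a wall-mounted terminal read from twice the distance would need characters roughly twice as high.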

6.2 Keys (Keyboard, Keypad, Buttons)

The general goals in designing key shape are the following:

  • The finger should locate the key without hitting other keys
  • Hitting other fingers should be avoided if keys have to be used for multi-finger typing
  • The distribution of pressure should indicate the location of the finger on the key
  • The force of pressing the key should be distributed to the proper portion of the finger

General guidelines concerning the keys are the following:

  • Use clearly recognisable shapes. Choice of round versus square keys may be a cultural variable (e.g. round or oval keys are more popular in Germany than in the USA) whilst keys for scrolling through menus should be shaped to indicate direction
  • Top of keys concave or at least flat
  • Provide tactile identifiers on keys
    Raised dots on number 5 in numeric keypads
    Raised bars on F and J keys on typing alphanumeric keyboards
    Tactile markings on cursor keys
    Consider printing or engraving key labels to give texture
    Provide for clear tactile separation of keys when used in arrays
  • Good contrast between keys and body of the equipment
  • Keys should be at least 2.5mm apart
  • Sharp contrast for labels on keys using readable big fonts, minimum 14 point
  • Automatic repetition of the key function should be avoided, or be user-configurable with letter/word echo
  • Provide clear tactile feedback (snap action)
    "a gradual increase followed by a sharp decrease in force required to actuate the key, and a subsequent increase in force beyond this point for cushioning"
    Force 0.25 to 1.5 N, or 28g to 142g
  • Prefer usage of travel keys
    Membrane or zero travel keys should be avoided
    Travel 1.3 to 6.4 mm
  • Provide audio feedback (click or beep) selectable by users whenever possible
  • Provide for key illumination whenever possible
  • Use anti-glare material for keys
    Rubbery material is often preferred by users
  • Volume and contrast controls should have tactile markings

The functional organisation of keys should adopt the following guidelines:

  • Provide structure of the layout of keys to assist user orientation
  • Function keys should be clearly separated from the numeric and alpha keys
  • Use different visual (e.g. colour) and tactile cues (e.g. shape, dimension) for different groups of keys, function keys or single buttons
  • Most important keys should be larger
  • Use strong colours or bigger dimensions only for a few of the most important keys (e.g. red for "cancel")
  • Avoid multi-function keys whenever possible
  • Avoid requiring simultaneous combinations of keys
  • Position of keys should allow easy usage by right and left handed people. This should include having keys duplicated where necessary.
  • No function should require operation with both hands or with two fingers at once
  • Adopt standard national typing layout (or QWERTY) for alphanumeric input tasks
  • The telephone rather than the calculator layout for numeric keypads should be used
  • Selection and execution keys should be near each other

Some special considerations apply to "soft keys". A soft key is a key with no fixed function associated with it. Its function can change according to the application context and is identified by a function label in a corresponding position on the display or by a menu item. The activated function is defined by the application within its context, and can change during the dialogue session.

Some general guidelines regarding soft keys management are the following:

  • There should be a way to have the meaning of a soft key "vocalised" in order to allow visually impaired people to recognise its function before selection. This feature can also be useful for sighted users in dark conditions or while performing tasks that do not allow looking at the display. One press of a soft key should trigger announcement of the function; two rapid presses should activate it.
  • The vocalisation should match the text on the screen but may also amplify it
  • If a soft key is selected which does not have a function associated with it, appropriate sound feedback should be given.
  • Soft key labels should always appear at the same position on the display
  • Soft keys should preferably be positioned close to the display, in correspondence with the position of the labels
  • Soft keys should be used consistently across applications (e.g. always use the same key for similar functions such as getting help, deleting, confirming, going back and forward)
  • Soft key labels should be self-explanatory; avoid abbreviations where space permits

6.3 Pointing devices

Usually, pointing devices require fine hand-eye coordination. If a device is targeted at a mass market, including elderly and impaired people, there should always be an alternative method for selection and data input.

General guidelines for pointing methods based on directly selecting objects on the screen (touch screens, pen input) are the following:

  • To allow finger pointing of objects on the screen, the selection area should be at least 2.6 sq cm; this gives over 99% selection accuracy
  • The on-screen selection area can be reduced for pen input, but very small selection areas give poorer visual feedback (the target is obscured by the pen point and hand) and can be difficult to select for people with even slightly trembling hands or poor near vision
  • If a touch screen is used for typing on a screen keyboard, the keys should be offset by 0.41 to 0.54 cm; in this case the error rate can be below 1% even with on-screen keys reduced to 2.27 cm
  • The best tilt angle for typing on a screen is 30° to 45° from vertical (typing is easier at 30°, while reading is fastest at 45°)
  • Visual and acoustic feedback should always be provided by the equipment
  • Use recognisable metaphors for on-screen objects (e.g. a light switch or a 3D button)
  • A "last key" selection algorithm improves performance: the system accepts the last valid key touched before the finger is raised; if the finger is raised outside a sensitive area of the screen, no selection is made
  • Coupled with acoustic feedback, a last key algorithm allows visually impaired people to explore the screen with a finger before making a selection
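The last key algorithm can be sketched as follows, assuming a simple rectangular hit-test; the key layout and names here are invented for the example. While the finger moves, each newly touched key is announced; on lift-off, a selection is made only if the finger is still inside a sensitive area.

```python
# name -> bounding box (x1, y1, x2, y2); an assumed two-key layout
KEYS = {"yes": (0, 0, 40, 40), "no": (50, 0, 90, 40)}

def key_at(x, y):
    """Return the key under (x, y), or None outside any sensitive area."""
    for name, (x1, y1, x2, y2) in KEYS.items():
        if x1 <= x <= x2 and y1 <= y <= y2:
            return name
    return None

def track_touch(points):
    """points: finger positions from touch-down to lift-off.
    Returns (announcements made while exploring, final selection)."""
    announcements = []
    last_valid = None
    for x, y in points:
        key = key_at(x, y)
        if key is not None and key != last_valid:
            announcements.append(key)   # beep or speak the key name
            last_valid = key
    # finger raised: select only if it was raised inside a sensitive area
    final = key_at(*points[-1]) if points else None
    return announcements, (last_valid if final is not None else None)
```

For example, a finger sliding from "yes" onto "no" and lifting there announces both keys but selects only "no"; lifting between the keys selects nothing.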

6.4 Output

Textual interaction components fall into four main groups:

1. Texts, which are purely passive chunks of information
For instance a title, a section of an informative text, or a warning message (a warning is simply a text with a higher priority)

2. Buttons, for instance GO BACK, OK, CANCEL and NEXT
Various mechanisms can be used to activate buttons; soft keys or programmable keys, very common in the screen phone context, are one of them. Soft keys, however, are problematic for visually impaired users working with speech adaptations: because the key functions change, these users cannot rely on a memorised stable layout and could easily perform the wrong operation

3. Input buffers, used to enter information
Input buffers make it possible to pass information (a name, a number, etc.) to the server. They are a key element in transactions with a server, so their adaptation has to be very clear and secure

4. Menus, which present the user with a choice among several items


The presentation of data on a Braille display, through a speech synthesiser, or through screen magnification requires the components to be put in a sequence. The order in which these components are presented is important for understanding the global meaning. As the number of components on a screen is generally limited, two basic keys (the up and down arrows) provide a simple mechanism to explore the screen.
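The two-key exploration mechanism amounts to a cursor over the linearised components. The sketch below assumes this linearisation has already been done; the component texts and class name are invented for illustration.

```python
class ScreenExplorer:
    """Step through screen components one at a time with two keys."""

    def __init__(self, components):
        self.components = components   # components in presentation order
        self.index = 0

    def current(self):
        return self.components[self.index]

    def down(self):
        if self.index < len(self.components) - 1:
            self.index += 1
            return self.current()      # sent to Braille display or speech
        return None                    # end of list: give a beep instead

    def up(self):
        if self.index > 0:
            self.index -= 1
            return self.current()
        return None                    # top of list: give a beep instead

screen = ScreenExplorer(["Title: Phone book", "Button: OK", "Input: Name"])
```

Returning `None` at either end is the hook for the non-verbal "no more elements" feedback discussed later in this section.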

Specific Guidance

Since a visually impaired user cannot apprehend the screen as a whole, part of its meaning and clarity is lost. Some extra guidance is therefore useful to compensate for this. It can be given as textual comments placed just before or just after an interaction component. Messages should be informative but concise, and should be presented in small amounts that are easy to retain. This is also helpful for dyslexic and older users.

Non verbal feedback

Non-verbal feedback is extremely useful in non-visual interfaces as a way to convey feedback rapidly. It can confirm events that the user is expecting (such as moving to the next or previous line or element) or give a low-level warning such as "no more elements after this one". Beeps are a simple and low-cost way to achieve this, and they are compatible with the different types of adaptation, whether based on Braille, speech or magnification.

Non-speech sounds should be used for all messages/feedback which are ubiquitous throughout the system. In particular they should be used to indicate:

  • The end of a list
  • Page down
  • New input field
  • A key which has no function at that particular time

A speech explanation, which can be turned off once the user is familiar with the beeps, may also be useful.
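A minimal sketch of this scheme, assuming one distinct sound per ubiquitous event and an optional spoken explanation that can be switched off. The beep descriptions and wording are illustrative assumptions, not prescribed sounds.

```python
# assumed event -> sound mapping; each event gets a distinct, recognisable cue
BEEPS = {
    "end_of_list":     "low double beep",
    "page_down":       "short rising beep",
    "new_input_field": "medium beep",
    "dead_key":        "flat buzz",    # key with no function at this time
}

# spoken explanations for users still learning the beeps
EXPLANATIONS = {
    "end_of_list":     "end of list",
    "page_down":       "next page",
    "new_input_field": "input field",
    "dead_key":        "key not available",
}

def feedback(event, explain=True):
    """Return the sound cue for an event, with optional speech explanation."""
    cue = BEEPS[event]
    if explain:
        return cue + " (" + EXPLANATIONS[event] + ")"
    return cue
```

Once the user turns the explanation off, only the short non-speech cue remains, which is what makes this feedback fast.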

6.5 Open Software Architecture

The Screen Phone should support an adaptation strategy based on a general architecture.

The Communication Module has three main functions:

  • To transfer the information displayed on the screen to the adaptation software as soon as it appears
  • To inform the adaptation software when a full screen has been completed
  • To emulate any of the actions usually performed on the phone to control the dialogue with the service

The Adaptation Software has to optimise the presentation of data, which often implies reconstructing the initial data on the screen (different order, additional comments, rephrasing, etc).

The Non-Visual Interface handles interaction through the specific peripherals (Braille terminal, speech synthesiser, adapted screen, keyboard).

This architecture can lead to different hardware solutions. For instance, the adaptation software and the non-visual interface can be run by an external device or run internally by the screen phone.
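The three modules and their interfaces can be sketched as below. All class and method names are assumptions made for illustration, and the "optimisation" shown (putting the title first) stands in for the much richer reordering, commenting and rephrasing a real adaptation would do.

```python
class NonVisualInterface:
    """Renders the adapted screen; would drive a Braille terminal,
    speech synthesiser or magnified display."""
    def __init__(self):
        self.rendered = []

    def render(self, lines):
        self.rendered = list(lines)

class AdaptationSoftware:
    """Optimises the presentation of data received from the phone."""
    def __init__(self, interface):
        self.interface = interface
        self.buffer = []

    def on_text(self, text):          # called as information appears
        self.buffer.append(text)

    def on_screen_complete(self):     # called when a full screen is built
        # toy reconstruction: present the title before everything else
        titles = [t for t in self.buffer if t.startswith("Title:")]
        rest = [t for t in self.buffer if not t.startswith("Title:")]
        self.interface.render(titles + rest)
        self.buffer = []

class CommunicationModule:
    """Transfers displayed information to the adaptation software and
    signals screen completion; key emulation is omitted from this sketch."""
    def __init__(self, adaptation):
        self.adaptation = adaptation

    def display(self, text):
        self.adaptation.on_text(text)

    def screen_done(self):
        self.adaptation.on_screen_complete()
```

Because the modules only talk through these narrow interfaces, the adaptation software and non-visual interface can equally run in an external box or inside the screen phone, as the text notes.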

6.6 Services

Services which are organised as sequences of independent and stable screens
A mechanism shall be provided to make identification of the screen by the adaptation software easy. For instance, an identification key can be displayed at a fixed location, a message can be sent to the adaptation software, or, in an HTML context, a clear title can be used

Screens which are composed of a few basic textual elements
A mechanism shall be provided to make identification of these elements by the adaptation software easy. For instance, the location on the screen, text attributes and cursors can provide useful indicators if used coherently. HTML tags are obviously an excellent mechanism in the context of the Internet, and the HTML code shall be accessible if available. Graphical elements should always be accompanied by a textual description
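In an HTML context, tags give the adaptation software exactly this identification mechanism. A minimal sketch using the Python standard library parser, pulling out the screen title and checking that every image carries a textual description (the sample page is invented):

```python
from html.parser import HTMLParser

class ScreenScanner(HTMLParser):
    """Extract the page title and count images lacking a description."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.images_without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_without_alt += 1   # graphic with no textual description

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

scanner = ScreenScanner()
scanner.feed('<html><head><title>Phone book</title></head>'
             '<body><img src="logo.gif"></body></html>')
```

Here the title identifies the screen for the adaptation software, and the missing alt attribute flags a graphical element that breaks the guideline above.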





Trace Center
This is the main American website concerning access to public access terminals by people with disabilities.

Appendix 1

Web Page Design

To assist people using non-visual browsers (usually with output to a speech synthesizer or a Braille display), web pages should provide:

  • Provide a text alternative to graphics
  • If text-only pages are provided, they must be kept up to date
  • Provide a link to a description of each sound file
  • For video, use captions, text and sound tracks
  • Image maps should have an alternative text-only page
  • Give the option to switch between the text-only and graphics versions
  • Be aware that forms are often inaccessible to non-visual browsers
  • Minimise the use of tables
  • Use standard HTML formats, tags etc.
  • PDF (portable document format) files are inaccessible to many non-visual browsers
  • Avoid non-standard data structures and viewers
  • Avoid multiple hypertext links in a single line
  • Make sure that text, image and background colours contrast well, and that colour is not the sole means of conveying important information
  • Consider adding keyboard shortcuts to important links
  • Encode list structures and list items properly
  • Make sure that the document structure is supported by the proper use of structural elements
  • Avoid ASCII art; replace it with an image and alternative text
  • Use the ABBR and ACRONYM elements to denote abbreviations and acronyms
  • Avoid Java-based forms, since they can be inaccessible

See Web Accessibility Guidelines


A basic check for accessibility can be done using WebXact, which will indicate aspects of a web page that are likely to cause problems for blind users.

Other Developments

Great emphasis is nowadays placed on the accessibility of web services (e.g. stock exchanges) from lightweight mobile Telematic terminals such as mobile telephones, pagers and personal digital assistants. All these devices are characterised by small textual or semi-graphical displays, and common web services based on HTML are not well suited to them. Incidentally, solutions to this accessibility problem are also very well suited to speech or Braille output through textual browsers for visually impaired people.

The Handheld Device Markup Language (HDML) by Unwired Planet is an alternative language to HTML. It has already been adopted by several research centres, and several HDML servers have already been installed. HDML is designed to offer a user interface model suited to small textual displays and limited keypads, to support integrated phone features such as voice, and to make efficient use of the client device and the network transport. Many web services are of interest both to visually impaired people and to mobile terminal users, and designers of such services should consider HDML as a possible alternative to HTML for the design of information pages.

The new Wireless Application Protocol (WAP) is also being defined. It provides specifications for a content distribution architecture suited to mass-market wireless hand-held devices. Incidentally, this new technology can also be very important in allowing access by the speech or Braille output browsers designed for visually impaired persons.

WAP defines the Wireless Markup Language (WML), which plays a role analogous to HTML and is read by a micro browser in the terminal, analogous to a standard web browser. It uses proxy technology to filter and adapt existing HTML pages on the web on the fly; the translation performed by the proxy should take account of the characteristics of the specific user interface. The WAP approach seems promising, but WAP is still under development and its specifications have not yet produced a concrete implementation.
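A rough sketch of the kind of on-the-fly filtering such a proxy performs: reducing an HTML page to plain text suited to a small display or a speech/Braille browser, substituting images by their textual descriptions. A real HTML-to-WML translation is far more involved; this only illustrates the idea, and the sample page is invented.

```python
import re

def filter_html(html):
    """Crude proxy-style filter: HTML in, compact plain text out."""
    # replace images that have a description by that description
    text = re.sub(r'<img[^>]*\balt="([^"]*)"[^>]*>', r'[\1]', html)
    # mark remaining (undescribed) images, then drop all other tags
    text = re.sub(r'<img[^>]*>', '[image]', text)
    text = re.sub(r'<[^>]+>', '', text)
    # collapse whitespace for a small textual display
    return ' '.join(text.split())

page = '<p>Welcome <img src="logo.gif" alt="RNIB logo"> to the <b>service</b></p>'
```

Note how an image without alt text degrades to an uninformative "[image]" marker, which is why the guidelines above insist that graphical elements always carry a textual description.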


The VISTEL Consortium

Marco Mercinelli
, Agostino Appendino
Centro Studi telecomunicazioni SpA, via Reiss Romoli 274, Torino 10148, Italy.
Tel: +39 011 228 6123, Fax: +39 011 228 6190,

Antonio Vannucci, Franco Abet, Susanna Rauber
Telecom Italia, Via E Gianturco 2, Rome 00196, Italy.
Tel: +39 06 368 82904, Fax: +39 06 368 82944,

Orlando Paladino, Alessandro Locati
Unione Italiana Ciechi, via Borgognona 38, Rome 00187, Italy.
Tel: +39 06 699 41693, Fax: +39 06 678 6815

Beppe Urso, Daniele Monti
Origin Italia Srl, Piazza IV Novembre 3, I-20124 Milano, Italy.
Tel: +39 02 6672 2403, Fax: +39 02 6672 2288,

Dr John Gill, Lucy Albu, Linda Newson
RNIB, 105 Judd Street, London WC1H 9NE, UK.
Tel: +44 20 7391 2244
Fax: +44 20 7391 2318

Dr Helen Petrie, Valerie Johnson
University of Hertfordshire, Sensory Disabilities Research Unit, Psychology Division, Hatfield, Hertfordshire AL10 9AB, England.
Tel: +44 1707 285058, Fax: +44 1707 285059

Guy Minier
Centre National d'Etudes de Télécommunications, Département DSV/SRV, Technopole Anticipia, 2 Avenue Pierre Marzin, Lannion 22301, France.
Tel: +33 2 960 52058, Fax: +33 2 960 51849

Dominique Burger, Marine Ferré Blanchard, Djamel Hadjadj
Institut National de la Santé et de la Recherche Médicale, Boite 23, 9 quai Saint-Bernard, Paris 75252, France.
Tel: +33 1 442 73435, Fax: +33 1 440 71585,

José Ramón Torregrosa
Organización Nacional de Ciegos de España, Ramirez de Arellano 21, Madrid 28043, Spain.
Tel: +34 1 415 0600, Fax: +34 1 415 0558,

Jan Francyk
University of Wroclaw, Instytut Telekomunikaciji Akustyki, Wybrzeze St Wyspiankiego 27, Wroclaw 50370, Poland.
Tel: +48 71 229558, Fax: +48 71 320 3070,

ISBN 1 86048 018 7

© Copyright reserved, 1998.

Published by the Royal National Institute for the Blind on behalf of the VISTEL Consortium.


VISTEL is a project in the Disabled and Elderly sector of the Telematics Applications Programme. The VISTEL project was supported by the European Commission through the TAP-DE programme. For further information contact Egidio Ballabio, rue de la Loi 200 (BU29, 3/20), B-1040 Brussels
Tel: +32 2 299 0232, Fax: +32 2 299 0248,



John Gill Technology Limited Footer