Figure 1 - Emulated Videophone with additional text exchange and picture display/annotation functionality.

In order to study the relative contribution that different media make to the successful exchange of information, and to study the accessibility of different media for people with disabilities, a number of goals were set.

1) To be able to construct interfaces where the users, particularly users with disabilities, would see one single "application" on the screen of the computer. Although each function (i.e. video exchange, text exchange, and picture display and annotation) was handled through a separate window, this was not supposed to be apparent to the user. This was designed to avoid the confusion that arises when information handled by one window is obscured by another window.

2) To develop a system that could be readily altered to move, resize or hide the part of the interface that handled a particular medium.

3) To develop an interface where parts of the display area could be allocated to additional software elements that would provide a user with disabilities with additional assistance, for example an on-screen keyboard.

The flexibility of the resulting system allows the layout of the screen to be altered to reflect the needs of the user and the information to be exchanged. It is possible to restore to each window its conventional boundary features. This allows each window to be resized and repositioned. Once a new layout is approved, it can be frozen and the window boundary features removed. The parameters governing each layout can be stored for later re-use, so task- or user-specific layouts could be recalled, and the same system tailored to the needs of a variety of different users.

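This store-and-recall behaviour is not specified in further detail in the paper, but it can be pictured as a small table of named layouts, each recording the position, size and decoration state of every media window. The following sketch is purely illustrative: the class names, fields and the JSON file format are assumptions made for this example, not a description of the actual system.

```python
# Illustrative sketch only: the paper gives no implementation, so all names
# (WindowGeometry, LayoutStore, freeze, recall, "subject-E") are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class WindowGeometry:
    x: int           # top-left position of the window on screen
    y: int
    width: int
    height: int
    decorated: bool  # True while boundary features (title bar, resize handles) are shown

class LayoutStore:
    """Stores named screen layouts so task- or user-specific layouts can be recalled."""

    def __init__(self, path):
        self.path = path
        try:
            with open(path) as f:
                self.layouts = {name: {w: WindowGeometry(**g) for w, g in windows.items()}
                                for name, windows in json.load(f).items()}
        except FileNotFoundError:
            self.layouts = {}

    def freeze(self, name, windows):
        """Record the current geometry of each window and strip its boundary features."""
        frozen = {w: WindowGeometry(g.x, g.y, g.width, g.height, decorated=False)
                  for w, g in windows.items()}
        self.layouts[name] = frozen
        with open(self.path, "w") as f:
            json.dump({name: {w: asdict(g) for w, g in windows.items()}
                       for name, windows in self.layouts.items()}, f, indent=2)

    def recall(self, name):
        """Return the stored geometry for a named layout."""
        return self.layouts[name]

# Example: freeze a layout for one user and recall it again later.
store = LayoutStore("layouts.json")
store.freeze("subject-E", {
    "video": WindowGeometry(0, 0, 320, 240, decorated=True),
    "text_telephone": WindowGeometry(0, 260, 320, 160, decorated=True),
    "picture": WindowGeometry(340, 0, 300, 300, decorated=True),
})
print(store.recall("subject-E")["video"])
```

Keeping layouts under a name in this way is what would allow an interface prepared for a particular user to be recalled in a matter of minutes, as reported in the results below.
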
This became particularly important when the needs of users with disabilities were considered. Many people have a complex mix of physical and sensory impairments that not only result in difficulty in speaking, but also affect their ability to use a conventional computer keyboard or pointing device. For this reason, a set of devices and software was assembled that could be added to the basic system to adapt the service interface to allow it to be made useable.

Because information could be exchanged in a number of media, and because each mix of impairments and the resulting disabilities is unique to each user, it is impossible to predict in advance exactly how the system should be adapted. For this reason, a structured approach to testing the possibility of providing access to the emulated service was devised. This is described below.

PILOT STUDY

Method
Six attenders at a rehabilitation day care facility agreed to take part in the pilot study to test the service and to adapt the interface to make it accessible to them. In each case, the following procedure was followed:

1) A conversation scenario was constructed that involved the user in a discussion about a visit to New York State in the USA. The scenario was loosely scripted, in that the data to be exchanged and the media to be employed were given, but the users were free to "package" their questions and responses as they wished. A member of the research team took the role of the "travel agent" throughout the study.

2) Each user in the study had previous experience of text processing and drawing and painting applications on the Macintosh (the preferred machine at the rehabilitation centre). A number of the users had taken part in earlier studies involving the use of the text telephone. None had any experience of using a videophone. They were all given hands-on training in the use of the various functions of the system, including a practice run through the scenario script. Then an attempt was made to run through the script as a "real" conversation. Where a user encountered a problem, a "helper" (another member of the research team) handled that part of the conversation. The problems were logged.

3) The experience of using the service was analysed. Each problem encountered was considered, and an adaptation to the hardware or an additional software assistive function was proposed. The user interface layout was re-designed to accommodate any changes that were necessary.

4) The system, complete with adaptations, was tested with the users. If problems remained, the last step was repeated once more.

SUBJECTS
The subjects in this study had the following disabilities that affected their ability to use the emulated service.

1) Subject R: Cerebral Palsy, resulting in severe loss of speaking abilities and no useable control of hand movements. R is able to use head movements to press a switch.

2) Subject E: Cerebral Palsy, resulting in major loss of speaking ability and little useable control of hands. E uses her chin to control a motorised wheelchair through a set of switches.

3) Subject I: Advanced muscular dystrophy, resulting in major loss of strength and severe loss of speaking abilities. I is able to write, but with difficulty.

4) Subject G: Cerebral Palsy, resulting in major loss of speaking abilities and reduced ability to use hands. G employs an assistive speaking device to generate speech and can slowly use a keyboard and mouse.

5) Subject J: Impairment unknown, but has speech that is difficult to understand and some reduction in the ability to use hands.

6) Subject D: Pre-lingually profoundly deaf, with the consequence that speech is very difficult to understand.

Results
In all cases it was found that the user interface could be adapted so that the users could handle every part of the conversation script. This was achieved after a maximum of two attempts to adapt the interface.

The problems encountered and the adaptations made for each subject are listed below.

1) Subject R: Without adaptation, this user was unable to take part in a conversation using any functional part of the service. Adaptation of the service involved adding a switch interface that triggered an on-screen scanning array. The on-screen scanning array scans through objects in the scan window. The scan can be started by a switch press and stopped when the scan marker is over the desired object. In this case, a full set of alphanumeric and punctuation characters was provided, in addition to a set of mouse emulation functions. This allowed the user to type and to perform mouse functions on the screen objects, simply by pressing a switch with a movement of the head. This was so slow that text prediction software was also added to the service interface.

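The scanning behaviour described above reduces to a timer that advances a highlight across the available objects and a single switch that starts the scan and then selects whatever the scan marker is over. The sketch below illustrates only that control logic; the item set, timing and function names are assumptions made for this example, not details taken from the system used in the study.

```python
# Hypothetical sketch of single-switch scanning; not the actual system from the paper.
import string

class ScanningArray:
    """Steps a highlight through on-screen objects; a switch press starts the
    scan and, once it is running, selects the object under the scan marker."""

    def __init__(self, items, on_select):
        self.items = list(items)      # e.g. letters, digits, punctuation, mouse actions
        self.on_select = on_select    # callback invoked with the chosen item
        self.index = -1               # -1 means the scan is not running
        self.scanning = False

    def tick(self):
        """Called on a timer (for example every 600 ms) to move the scan marker."""
        if self.scanning:
            self.index = (self.index + 1) % len(self.items)

    def switch_pressed(self):
        """Single head switch: the first press starts the scan, the next press selects."""
        if not self.scanning:
            self.scanning = True
            self.index = 0
        else:
            self.scanning = False
            self.on_select(self.items[self.index])
            self.index = -1

# The scan array can mix characters with mouse-emulation actions, as in the study.
items = list(string.ascii_uppercase) + list("0123456789.,?") + ["CLICK", "DRAG", "MOVE"]
scanner = ScanningArray(items, on_select=lambda item: print("selected:", item))

scanner.switch_pressed()          # first press: start the scan
for _ in range(7):                # timer ticks while the user waits
    scanner.tick()
scanner.switch_pressed()          # second press: select the highlighted object
```
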
The text prediction software presented the user with a list of the most predictable words that matched the characters typed so far. If the required word was in the list, it was selected; if not, the next letter was typed. Considerable keystroke savings, and hence time savings, can be made using this technique.

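A simple way to picture this behaviour is a frequency-ordered vocabulary filtered by the prefix typed so far, with the saving counted as the letters the user no longer has to type. The vocabulary, ranking and function names below are invented for illustration; the prediction software actually used in the study is not identified in this excerpt.

```python
# Illustrative prefix-based word prediction; the vocabulary and counts are invented.
VOCABULARY = {          # word -> usage frequency (higher = offered first)
    "hello": 120, "help": 90, "helpful": 40, "hell": 15, "helpless": 10,
    "new": 200, "york": 80, "visit": 60, "travel": 70, "ticket": 50,
}

def predict(prefix, max_words=5):
    """Return the most frequent words starting with the typed prefix."""
    matches = [w for w in VOCABULARY if w.startswith(prefix.lower())]
    matches.sort(key=lambda w: VOCABULARY[w], reverse=True)
    return matches[:max_words]

def keystroke_saving(prefix, chosen):
    """Keystrokes saved by selecting a predicted word instead of typing it out."""
    return len(chosen) - len(prefix) - 1   # -1 for the selection action itself

# After the user types "he", the five most predictable words are offered.
print(predict("he"))                       # ['hello', 'help', 'helpful', 'hell', 'helpless']
print(keystroke_saving("he", "helpless"))  # 5 keystrokes saved
```
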
The addition of on-screen assistive techniques meant that the layout of the screen had to be altered to accommodate the additional elements. This was accompanied by a reduction in the size of elements such as the video window and the text telephone window. The effect of this on conversation fluency and effectiveness is a matter for further study.

2) Subject E: This user was unable to type or use a mouse, but could use a rollerball mounted so that it could be operated with movements of the chin. The rollerball was equipped with two buttons. The left-hand one was configured to act as a conventional button and the right-hand one was configured to stay "down" when it was used, allowing screen elements to be dragged. Typing was provided by adding an on-screen keyboard. Items on the keyboard were selected by clicking the pointer on them. In order to improve speed and typing accuracy, text prediction was added. The layout of the interface for this user is shown in figure 2 below.

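The latching behaviour of the right-hand rollerball button can be thought of as a small piece of event remapping: one press sends the pointer-button-down event, and the matching button-up is withheld until the button is pressed again. The following sketch is a hypothetical illustration of that remapping only; the event names and callback are assumptions, not part of the system described.

```python
# Hypothetical sketch of a drag-lock (latching) mouse button.
class DragLockButton:
    """Remaps a physical button so that one press holds the pointer button
    'down' until the next press, letting screen elements be dragged with a
    single press at each end of the drag."""

    def __init__(self, send_event):
        self.send_event = send_event   # forwards 'button-down' / 'button-up' to the GUI
        self.latched = False

    def physical_press(self):
        if not self.latched:
            self.latched = True
            self.send_event("button-down")   # start of the drag
        else:
            self.latched = False
            self.send_event("button-up")     # drop the dragged element

    def physical_release(self):
        pass   # releases of the physical button are ignored while latched

events = []
button = DragLockButton(events.append)
button.physical_press(); button.physical_release()   # element picked up
button.physical_press(); button.physical_release()   # element dropped
print(events)   # ['button-down', 'button-up']
```
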
Figure 2 - Emulated Videophone with adaptations for subject E

3) Subject I: The adaptation employed for this user consisted of an on-screen keyboard and text prediction.

4) Subject G: This user did not need any adaptation to the interface. The siting of the keyboard, mouse and the assistive speaking device was critical. Because of reduced hand movement control, a rollerball was tried, but this proved to require motor control that was too fine for this user.

5) Subject J: The only adaptation made to the system was the addition of the text prediction, added in the space below the picture window.

6) Subject D: Again, the only adaptation made to the system was the addition of the text prediction.

As a result of these adaptations, all the users were able to complete all parts of the conversation without any assistance. It was found that the interface could be configured to suit each user in a matter of minutes by recalling saved layouts.

CONCLUSIONS

This exercise demonstrated the need for an adaptable interface that could be tailored to the needs of different users, and it showed that it is possible for people with quite severe disabilities to participate independently in a multimedia conversation.

Having verified that the basic emulated service tool operated as required and that the interfaces can be adjusted, stored and retrieved when required, a number of aspects of the interface to multimedia conversation services can now be explored using this tool. These include:

1) The usefulness of different media to a conversation or collaborative task, and how this usefulness can be affected by the absolute and relative positions or sizes of the "media spaces" within the interface.

2) The value of being able to customise the appearance of the interface and its elements according to task or personal preference, and being able to recall task- or user-dependent layouts.

3) Interaction between people in a conversation or collaborative task, particularly when one or more participants have difficulty handling an information medium. Issues such as those that govern the suitability of another medium to convey the same information, the effect that this difficulty has on the richness of the information being conveyed, and the types of assistance that the remote partner can give the person with a disability can all be explored.

It is the intention of the authors to utilise this tool to explore some of these issues.