Squeeze Me, Hold Me, Tilt Me!
An Exploration of Manipulative User Interfaces

Beverly L. Harrison, Kenneth P. Fishkin, Anuj Gujar, Carlos Mochon*, Roy Want

Xerox Palo Alto Research Center
3333 Coyote Hill Road, Palo Alto, California, USA 94304
{beverly, fishkin, agujar, want}@parc.xerox.com, carlosm@mit.edu

ABSTRACT
This paper reports on the design and use of tactile user interfaces embedded within or wrapped around the devices that they control. We discuss three different interaction prototypes that we built. These interfaces were embedded onto two handheld devices of dramatically different form factors. We describe the design and implementation challenges, and user feedback and reactions to these prototypes. Implications for future design in the area of manipulative or haptic user interfaces are highlighted.

KEYWORDS: Physical, tactile, and haptic UIs, pressure and tilt sensors, UI design, interaction technology.

INTRODUCTION
Over the past 5 years there has been increasing interest in augmented reality and physically-based user interfaces [4, 6, 7, 8, 10, 12, 15, 16, 17]. A goal of these emerging projects is to seamlessly blend the affordances and strengths of physically manipulatable objects with virtual environments or artifacts, thereby leveraging the particular strengths of each. Typically, this integration exists in the form of physical input devices (e.g., "phicons" [7], "bricks" [4]) virtually linked to electronic graphical objects. Manipulation of the physical objects signals a related operation on the associated electronic objects. This mapping is further reinforced by tightly coupling the placement of the physical objects relative to the electronic objects on a flat, table-like display surface.

Another approach has been to use standard monitors or even stereoscopic monitors with more realistic input devices [6, 8]. In these cases, unique physical input devices are cleverly matched to the requirements of the specific application domain (e.g., MRIs, remote telerobotic control). The affordances of the input devices are well matched to the virtual representation of the object that they represent. Designers of commercial video games have been taking such an approach to user interface manipulation since the invention of the DataGlove™ and, more recently, with such games as flight simulators and car racing, where the UI is controlled by throttles or steering wheels. Again, in these examples a specialized input device controls a separate electronic display.

Permission to make digital/hard copies of all or part of this material for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage, the copyright notice, the title of the publication and its date appear, and notice is given that copyright is by permission of the ACM, Inc. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires specific permission and/or fee.
CHI 98 Los Angeles CA USA
Copyright 1998 ACM 0-89791-975-0/98/4...$5.00

These extensions to graphical user interfaces seem logical in view of the widespread support and acceptance of direct manipulation interfaces [11] and of real-world metaphors, such as trash cans and file folders [12]. We believe that such physical user interface manipulators are a natural step towards making the next UI metaphor the real world itself: real objects having real properties that are linked to or embedded in the virtual artifacts that they control. Furthermore, we conjecture that this metaphor reflects a powerful, largely unexplored user interface paradigm.

We set out to further explore this new area. We have been influenced by several previous research prototypes that reflect elements of an "embedded physicality" approach. Fitzmaurice [3], Rekimoto [9], and Small & Ishii [12] attached sensors to small handheld displays and subsequently used these displays to "scroll" through or view a larger virtual space. Movement of the display is mapped to corresponding movements in the virtual space, such as changing the view perspective [9] or the degree of magnification [12]. These prototypes demonstrated the intuitiveness of this embedded physicality approach. The work we report here incorporates manipulations different from these previous examples to further improve our understanding of the breadth and potential of these new kinds of interactions.

Our work differs from the previous work on "physical handles" in one particularly interesting way. We are investigating situations in which the physical manipulations are directly integrated with the device or artifact, such as a small PDA, that is being controlled. We are not exploring separate input devices, but rather making the physical artifact itself become the input device by means of embedded sensor technologies.

The goal of this paper is to share our experiences in designing, building, and using several prototypes of such user interface techniques and devices, and to reflect on new ways of thinking about this new class of user interface.

* Intern at Xerox PARC from the MIT Department of Physics.

CHOOSING TASKS
We chose three diverse and previously unexamined user tasks. This allows us to explore alternative kinds of manipulations, test different sensing technologies, and more thoroughly probe the research space. Additionally, we selected tasks that represented two different types of interaction: active user interaction, via explicit physical manipulations, and passive user interaction, sensed implicitly. Finally, we selected tasks that were relevant for other PARC research groups who were implementing applications for portable document devices [10]. For this reason, we focused on portable pen-based systems.

By implementing new user tasks, we hope to contribute to the general body of knowledge about physically manipulatable interfaces. We believe that this experience will assist us in formulating a more general framework, design principles, and theoretical foundations for physically afforded interfaces [2].

We chose several simple tasks: navigation within a book or document, navigation through long sequential lists, and document annotation. In the next section, we describe manipulation of real-world, traditional artifacts and outline the task representation, user actions, and feedback for each of our selected tasks. Following this, we describe our three task UI designs in terms of how these real-world manipulations were mapped to the devices we selected. Again, we discuss our designs in terms of task representation, user actions required, and feedback. We then highlight some of the implementation decisions and tradeoffs that impacted the interaction design. Finally, we discuss feedback from the informal evaluations and interviews conducted thus far and the implications for future work.

Navigation within a Book or Document
The task representation assumes that the book or document has a number of predictable properties. These include physically manipulatable page units, a sequential organization, a thickness or "extent", and obvious start/end points. These properties afford page-by-page manipulation and movement through blocks of pages relative to the start/end points of the whole book or document. The user actions we looked at were flicking corners of pages (for page-by-page navigation) and thumbing into a location of the book/document by moving blocks of pages relative to the beginning or ending. Manipulation of these traditional artifacts provides feedback in the form of visual cues (pages move or "animate", the new destination page shows, the new relative location shows), auditory cues (the sound of pages turning), and kinesthetic cues (tactile pressure on finger or thumb, tactile feedback of pages moving or sliding).

Navigation through Sequential Lists
Generally, users conceptualize lists in different ways than books or documents (though similar navigation techniques could be used). We decided to use a Rolodex listing of index cards for this list navigation task. In this case, the task representation assumes physically manipulatable items or cards, a circular sequential organization, and a knob that controls the Rolodex. User actions are manipulation via turning the knob (with a rate parameter) and stopping at a desired location. Visual feedback includes the flipping of items or cards, the rate of flipping, and a new destination item or card. Auditory feedback is the sound of the cards flipping. Kinesthetic cues include finger pressure, extent of rotational movement, and direction of rotation.

Annotation and Handedness Detection
We defined this task as handwritten annotation on a page-by-page basis (i.e., one page at a time), where the original page contains margins and white space within which the annotations are made. User actions are typically bimanual: the non-dominant hand anchors the page while the dominant hand writes the annotations wherever there is room. Visual feedback is the appearance of the annotations. There is minimal auditory feedback. Kinesthetic cues are the pen pressure in the hand, anchoring pressure in the non-dominant hand, and the pen/writing movement and friction.

This task is of particular interest to us in that we introduced new capabilities not available in the real-world annotation task. As such, it represents an opportunity for computationally enhanced interaction. In traditional page annotation, users must fit the annotations into existing, limited white space based on the static layout of the page. During annotation, their writing hand also often obstructs the text that they are commenting on. We decided to optimize annotation by maximizing the amount of white space and its position within the page. We detect the handedness of the user and then dynamically shift the text "away" from the writing hand, thereby maximizing white space directly under that hand (see Figure 4, bottom). We describe this design and its implementation in subsequent sections.

SELECTION OF DEVICES
Our design criteria were that the devices chosen would be handheld, support pen-based input, allow serial port input (for sensor communication), have a development environment for custom applications, and be cost effective (since we anticipated embedded hardware). Ideally, we wanted several devices with different form factors.

We decided to use two different portable devices to test our manipulations. We chose to use a handheld computer for the page turning and handedness detection manipulations (a Casio Cassiopeia™). For the list navigation manipulations, we chose a Palm Pilot™. Clearly, a number of other devices could have been selected; these two were chosen for their ubiquity.

DESIGNING THE INTERACTION
Navigation within a Book or Document
This task was divided into several simple, common subtasks: turning to the next page, turning to the previous page, and moving forward or backwards in large "chunks" relative to the beginning and ending of the document.

Page-by-page navigation
We now had to decide how these tasks would be accomplished by a user; we tried for manipulations similar to those used in traditional artifacts. As the Casio™
displayed individual pages with a sequential ordering, we decided to use a flick on the upper right corner from right to left to indicate "forward one page". A flick on the upper left corner from left to right would indicate "back one page". These actions were highly similar to the real-world actions. Visual feedback was similar; pages changed (without animation), and the new destination page and page number became visible after the user action. After a page-turning manipulation, both the page number and the contents change to reflect either the preceding or next page, depending upon the direction of the stroke. However, we did not implement sound effects in this iteration and some kinesthetic feedback was lost (notably the friction of pages sliding). Figure 1 shows a "real-world" page-turning gesture (top), and the implemented interface to the page-turning command (bottom) on the Casio™.

Figure 1. Page-by-page navigation, real-world (top) and with the prototype (bottom)

This interaction requires that the left and right upper corners detect a finger press, the direction of a stroke, and a release of pressure. Several implementation options are possible. Within each application where document reading occurs, a touch-sensitive display can detect pressure points and their origin, determine if this aligns with a document upper corner, track the path of pressure to determine the stroke direction, and execute the appropriate page turn. Alternatively, the surface of the device can have pressure sensors attached to it, which detect when they are pressed, detect the direction of pressure from a stroke, and have the active application respond appropriately. We decided to try this approach since this allowed us to "retro-fit" pressure-sensing technology onto a normally pressure-insensitive device. Also, we did not need to use valuable screen real estate for the large area graphics needed to display a finger-operated button. Finally, this would provide us with opportunities to later use the sensor technology in other application contexts and across applications.

Navigation by relative position
The extent and start/end points were not obviously represented (the thickness of the Casio™ was invariant and too narrow for relative positioning). Hence moving forward or backward by chunks relative to the beginning or ending of a document was more difficult to represent for virtual documents. We decided to use a grasping manipulation at the top of the device, where the relative position of the grasp determined the relative position within the document. Far left corresponded to page 1 and far right corresponded to the last page. While this was not tightly analogous to known real-world metaphors, it appealed to the well-known GUI metaphor of the scroll bar. A grasp gesture will move to a new location in the document and display the new location's page number and contents.

Figure 2. Navigation by relative position

Based on our chosen representation for moving by "chunks" and the corresponding user action, we again decided to use pressure sensors. To detect a grasp with the thumb and a finger, pressure strips were attached along the front top edge of the device. Grasping any portion of the strip moves to a document position relative to the beginning or ending of the document, where the beginning maps to the far left of the strip and the end maps to the far right of the strip. For example, Figure 2 shows a "grasp" gesture moving the display to roughly 2/3 of the way through the document.
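
To make this mapping concrete, here is a minimal sketch (Python, added for illustration; the strip length, page count, and function name are invented, not taken from the prototype):

    def page_for_grasp(grasp_pos, strip_length, num_pages):
        # Far left of the strip maps to page 1, far right to the last page.
        fraction = grasp_pos / strip_length
        return 1 + round(fraction * (num_pages - 1))

    # Grasping about 2/3 of the way along the strip over a 300-page
    # document jumps to roughly page 200, as in Figure 2.
    assert page_for_grasp(66.7, 100.0, 300) == 200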

Navigation through Sequential Lists
We used a Rolodex metaphor-based technique for list navigation (see Figure 3, top). The circular list is manipulated by turning the Rolodex, while the direction of the turn determines whether the cards flip from A to Z or from Z to A. Our device-embodied representation was similar to the
real world artifact in that we used items with visual tabs arranged in a sequence. Turning the circular list towards the user would begin flipping through from A towards Z (assuming an alphabetized list) and vice-versa. On a physical Rolodex, users turn the knob rotationally (at some rate of speed) (Figure 3, top). On the Palm Pilot™, the user action was in fact a tilt movement away from a neutral resting position and not a rotational turn of a knob (this would be more akin to rotation of the entire Rolodex). Instead of having a rate or speed of turning, we used the extent or degree of tilt (Figure 3, bottom). Turning "harder" (i.e., to a larger extreme) moves faster through the list, similar to Rekimoto [9]. To stop at or select a particular item, the user either ceases to tilt (i.e., maintains the list container in a neutral or vertical position relative to that item), or squeezes the device, mimicking a grasping gesture (akin to grasping the Rolodex card).

Figure 3. List navigation, real-world, knob rotation (top), and with the prototype, device tilt (bottom)

Annotation and Handedness Detection
Finally, we consider the task of optimizing annotation through maximizing "appropriate" white space and maximizing text visibility by detecting handedness. Sensing handedness means that text or graphics would be moved towards the non-dominant hand while screen space would be maximized on the opposite side (next to the hand holding the stylus). This strategy is appropriate for maximizing legibility of the text while holding the stylus and annotating adjacent to the text (Figure 4, top). In general, we were also interested in exploring different unobtrusive mechanisms for determining handedness.

Figure 4. Annotation, real-world (top) and with the prototype (bottom)

For the handedness detection task we needed to understand something about how users hold and operate the intended device. We designed this such that no special manipulation was needed other than picking up the device and/or stylus (i.e., "passive user interaction"). The handedness detection is immediately visible when the user invokes any application that wants to be "handedness-aware". In the specific task we implemented, for a right-handed user, text is immediately left justified and annotation space remains at the right side of the text (Figure 4, bottom). The reverse is true for left-handed users. When both hands are used or the device is set down (i.e., no hands), the text appears centered. Feedback from the annotation remains consistent with the real-world case; ink trails appear as the pen writes.

Finally, we examined sensor options for unobtrusively determining handedness. Several implementation paths were considered. Heat sensors on either side of the device could potentially detect whether contact with a hand occurred on the right or left side of the device (based on heat from the user's hand). However, this detection would be complex since physical extremities such as hands and feet generate heat levels comparable to many other environmental factors, including that of the device itself. Another alternative
was to detect properties of the writing which are unique to left-handed or right-handed writing styles. This is somewhat problematic since these algorithms are complex and the system can only take effect after the user has started writing. We decided to use pressure sensors again, this time to determine the points of contact and detect how the device was being held (if at all). Pressure sensing pads were attached to the back of the device, on the left and right sides, in alignment with positions used for holding the device.

IMPLEMENTATION
We now focus on the implementation details and issues that directly impacted the user interface and interaction design.

Navigation within a Book or Document
The Casio device was augmented with a network of pressure sensors. Two overlaid strips on the top edge detect the page turning manipulations (Figures 1 and 2). The pressure sensor network reports its current values through an interface connected to the RS232 port on the device. A simple communications protocol was devised, where each packet indicates the ID of the reporting sensor and the current value of that sensor. Packets are only sent when the value changes. Absolute values, rather than deltas, are reported, so that we can recover from dropped or damaged packets. The document reading application runs as a multithreaded application under Windows CE: one thread performs user I/O, while the other monitors the sensor stream.
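
The paper leaves the byte-level packet format unspecified; the sketch below (Python, added for illustration) shows one way the described behavior could look. The two-byte {sensor ID, value} layout, port name, and baud rate are assumptions, not details from the prototype. A loop like this would run on the sensor-monitoring thread:

    import serial  # pyserial; the prototype used an RS232 link

    def monitor_sensors(port="COM1", on_change=print):
        link = serial.Serial(port, 9600)    # baud rate assumed
        last = {}                           # last known value per sensor
        while True:
            packet = link.read(2)           # assumed layout: [sensor_id, value]
            sensor_id, value = packet[0], packet[1]
            # Values are absolute, not deltas, so a dropped or damaged
            # packet loses one update at most; the next one resynchronizes.
            if last.get(sensor_id) != value:
                last[sensor_id] = value
                on_change(sensor_id, value)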

To implement the page turning manipulations, two pressure sensors are overlaid at the top edge of the device. One type of sensor strip reports pressure, but not spatial location. The second type reports spatial location, but not pressure. Unfortunately, the spatial sensor tended to have a great deal of jitter. In order to compensate for this, two measurements were made from the spatial sensor: the first measuring the distance from the left end, the second measuring the distance from the right end. The sum of these two values should be a constant; if they differ too much from this constant, the values are rejected. Otherwise, the average of the two values is used. The {location, pressure} values are stored from the moment of pressure-down to pressure-up. If the sum of the inter-location differences is negative, the user is deemed to be stroking from right-to-left. If the sum is positive, the user is deemed to be stroking from left-to-right. If, regardless of this sum, the spatial locations lie in a narrow range, the user is deemed to be pressing at a certain spot (i.e., the "grasp" gesture).
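
This classification logic can be sketched as follows (Python, for illustration; the strip length, jitter tolerance, and grasp range are assumed constants, not the prototype's values):

    STRIP_LENGTH = 100.0     # assumed: left + right distances sum to this
    JITTER_TOLERANCE = 10.0  # assumed: max deviation before rejecting a read
    GRASP_RANGE = 5.0        # assumed: max spatial spread for a "grasp"

    def read_location(dist_left, dist_right):
        # Jitter compensation: the two redundant measurements should sum
        # to a constant; reject the pair if they do not, else average the
        # two position estimates they imply.
        if abs((dist_left + dist_right) - STRIP_LENGTH) > JITTER_TOLERANCE:
            return None                        # inconsistent reading
        return (dist_left + (STRIP_LENGTH - dist_right)) / 2

    def classify_stroke(locations):
        # locations: valid readings from pressure-down to pressure-up.
        if max(locations) - min(locations) < GRASP_RANGE:
            return "grasp"                     # pressing at one spot
        drift = sum(b - a for a, b in zip(locations, locations[1:]))
        # Negative drift = right-to-left stroke = "forward one page".
        return "next page" if drift < 0 else "previous page"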

Navigation through Sequential Lists
In order to implement a tilt detection mechanism for continuous list scrolling on a handheld computer, we investigated a number of sensors. The commercially available tilt-sensor design we chose is based on an electrolyte bordered on two sides by a pair of conductive plates. As the device is angled towards or away from either plate, the amount of electrolyte in contact with the plate varies. The area of fluid in contact with each plate will affect the impedance presented by the contacts of the sensor. By monitoring this impedance and converting its change into a voltage, a simple ADC interface to a microcontroller can capture the data and then process it. In our system the tilt angle is converted into a 4-bit value and transmitted to the Palm Pilot™ across an RS232 link after being prefixed with the 4-bit sensor ID, a total of 8 bits for each tilt sample.
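
Each tilt sample therefore fits in a single byte. A sketch of the packing (Python; placing the sensor ID in the high nibble is an assumption, since the paper states only that the ID prefixes the value):

    def encode_sample(sensor_id, tilt):
        # 4-bit sensor ID prefixed to a 4-bit tilt value: 8 bits per sample.
        assert 0 <= sensor_id < 16 and 0 <= tilt < 16
        return (sensor_id << 4) | tilt

    def decode_sample(byte):
        return byte >> 4, byte & 0x0F          # (sensor_id, tilt)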

By mounting a tilt sensor of this type to the case of a Palm Pilot™, with the sensor plates parallel to the plane of the display, we were able to use the sensor readings as a crude measure of the computer's orientation relative to gravity. We arranged it so that the Pilot generated a neutral reading at the 45-degree point and produced 8 readings forward and backwards from that position, 45 degrees being close to the angle that most people use to read from the display of the Pilot. Even though the range of angles detectable is thus very coarsely defined, we found that it has been adequate to implement and support the Rolodex-like metaphor.
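
The mapping from a reading to a flipping rate might look like the following sketch (Python; the neutral code and the rate scale are invented for illustration, since the paper specifies only a neutral reading at 45 degrees with 8 steps of tilt either way):

    NEUTRAL = 8                    # assumed code for the 45-degree pose

    def scroll_rate(tilt_reading, cards_per_step=2.0):
        # Signed flipping rate: the sign gives the direction (A-to-Z or
        # Z-to-A), and the magnitude grows with the extent of the tilt.
        steps = tilt_reading - NEUTRAL
        return steps * cards_per_step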

In addition to sensing tilt, the system must differentiate between inadvertent tilting, such as when the user is walking with the device, and intentional tilting, when the user wishes to navigate. There are two possible ways of addressing this issue. The first method is to apply higher threshold values to the tilt sensing itself, thereby removing manipulations which are not extreme and hence presumably retaining only deliberate user requests. This was infeasible in our desired application since we wished to use ranges of tilt to indicate the rate of list movement. Another possible solution is to create a second, specific manipulation that indicates user intention. In our case, we decided to use an initial squeeze of the device to indicate the desire to navigate through the list, followed by a second squeeze to "grasp" the desired item, thereby ending the navigation task. To avoid muscle stress, users did not have to maintain the squeezing pressure during navigation. The device was padded with foam to further suggest squeezing capability.
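
This squeeze-to-start, squeeze-again-to-grasp design amounts to a small state machine, sketched below (Python; the class name and the pressure threshold are hypothetical, the prototype's threshold having been derived empirically):

    SQUEEZE_THRESHOLD = 200        # assumed; see the 10-user test below

    class RolodexNavigator:
        def __init__(self, items):
            self.items, self.index = items, 0
            self.tilt_enabled = False          # tilt starts disabled
            self.was_squeezing = False

        def on_pressure(self, pressure):
            squeezing = pressure > SQUEEZE_THRESHOLD
            if squeezing and not self.was_squeezing:
                # Each squeeze onset toggles navigation: the first enables
                # tilt, the second "grasps" the current item and disables
                # it. Users need not hold the squeeze while navigating.
                self.tilt_enabled = not self.tilt_enabled
            self.was_squeezing = squeezing

        def on_tilt_step(self, steps):
            # steps: signed tilt extent; zero in the neutral position.
            if self.tilt_enabled and steps != 0:
                direction = 1 if steps > 0 else -1
                self.index = (self.index + direction) % len(self.items)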

To achieve the squeeze feature, we attached pressure sensors along both sides of the Palm Pilot™ in positions that aligned with the users' fingers and thumb (independent of which hand was holding the device). To differentiate squeezing from holding the device, we tested 10 users and derived an appropriate threshold value for the pressure sensors. (In this case, using higher pressure threshold values to differentiate inadvertent from intentional action was appropriate.) The "squeezing" gesture has several advantages for users of a hand-held device. It does not require that the user reposition either hand or alter their viewing angle or viewing distance to the device; it requires only a momentary increase in the pressure used to hold the device. While the sensors report which finger(s) are exerting this pressure, at present our algorithms make no use of this additional information.

The list navigation task provides two levels of user feedback. Since the device is often moved about, the "tilt feature" is initially disabled. When users wish to navigate through lists they commence movement by squeezing the device. At this point the device is tilt-enabled. At present we have a message displayed indicating this to the users (it says "Go Tilt!"). Clearly, a different message and a different means of conveying tilt-enabled would be better. Independent of this or any message, it is visually obvious when tilt-based navigation is enabled. Tilting works as described and users can see the list flipping through entries at varying rates of speed in the appropriate direction, depending upon the direction and magnitude of the tilt. The display ceases moving when the user either holds the device in the neutral position or again squeezes the device, thereby disabling tilt (this is akin to grabbing the currently displayed item).

Annotation and Handedness Detection
Since these handheld devices are typically gripped by the edge with one hand, while they are used with the other hand, we detected handedness by mounting two pressure-sensitive pads on the back surface. If the pressure pad on the back left is pressed, the sensor thread of the program concludes that the user is holding the device with (at least) the left hand. If the pad on the back right is pressed, the program concludes that the user is holding the device with (at least) the right hand.
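
This rule, combined with the layout behavior described earlier (text shifts away from the writing hand), can be sketched as follows (Python; the function and value names are hypothetical):

    def text_alignment(back_left_pressed, back_right_pressed):
        if back_left_pressed and not back_right_pressed:
            # Held in the left hand: a right-handed writer, so left-justify
            # the text and leave white space under the writing hand.
            return "left"
        if back_right_pressed and not back_left_pressed:
            return "right"   # mirror case for a left-handed writer
        return "center"      # both hands, or device set down (no hands)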

USAGE
A number of design issues arose as we iterated through the design and development of the prototypes. We did a number of in-laboratory, informal user tests to estimate threshold values for sensors. Typically this was done with our immediate research project group and a few other interested people (n=7). Once the sensor values were initially determined, we then carried out informal user testing and interviews on 15 different people outside our research project group. These users were fellow research staff who had little experience with physically manipulatable interfaces. Users were not instructed on how to hold the devices. They were given only brief descriptions of what the sensors would do (e.g., turns pages like you would in a paper document, tilting this moves through the Rolodex list). Following this, we observed them and recorded their comments. We asked them specific questions about what they expected to occur, what problems they encountered, and what they liked most and least.

General Comments and Impressions
In general, our test users found the manipulations "intuitive", "cool", and "pretty obvious in terms of what was going on." Some users needed quick demonstrations to understand that their manipulations would actually be interpreted. Our users had little or no exposure to physically embedded user interfaces and therefore often did not expect interaction with the device to be understood. Undoubtedly, conveying the basic paradigm will be necessary in the same way that users needed to understand the conceptual foundation for direct manipulation interfaces and mice.

Once users understood the basic paradigm, they immediately began to explore the range of interaction. Just as GUI users try to find out what is "clickable" by moving around the screen with the cursor and clicking, our test users tried a variety of manipulations on the prototypes to see what the range of detectable manipulations was. For example, to turn pages they tried long and short strokes, fast and slow strokes, light and hard strokes, and starting the stroke at different points on the device surface.

While the explicit user actions were quickly understood, the passive interaction (handedness) was perceived as "magical." Since no explicit commands or manipulations were needed, users seemed amazed that the device recognized and optimized for handedness. They were unable to tell how this was accomplished without us explaining it. This suggests not only that passive manipulations can be powerful, but that they greatly impact a user's interaction experience when well integrated with the form factor of the device. We clearly need to explore more passive manipulations to see if this is a general property. Additionally, this illustrates an opportunity for computationally augmented task representations that provide more than the real-world analogy (in experientially positive ways).

Navigation within a Book or Document
Several interesting usage observations were made in these tasks. Because of our need to overlay pressure sensors, users now had to exert greater pressure than they expected for the page-turning manipulations. Users try out manipulations based on their expectations from the real world. A page turn in a paperback book, for example, takes very little pressure. All of our users initially attempted exactly the same manipulation on the device, which was too light to be sensed. However, they were able to quickly adjust with practice.

In general, we believe that users will attempt to exactly replicate the analogous real-world manipulation, when those metaphors are used; and they will expect them to work. If we are striving for enriched interaction experiences, the more exactly we can support or match these expectations the better. Making our sensors more sensitive to detect lighter page-turning strokes would clearly be an improvement.

Users had no problem in discovering the manipulation needed for "previous page" once they had tried the "next page" manipulation. Despite slight differences in the pressure required over that of real-world interaction, users relied on extending their understanding of the real-world metaphor to guide their further assumptions about what was possible with the device-embodied interface. As in GUI design, small inconsistencies in metaphor seem to be forgiven.

Users needed to have the "navigation by chunks" mechanism described to them. Almost certainly this was because the device did not resemble a book, nor did the manipulation map directly to the manipulation in the real world. Grasping along a strip to indicate relative position is unique to this interaction. Once described or briefly demonstrated, users had no trouble remembering this manipulation or applying it.

One difficulty arose as a consequence of our implementation strategy. Since page turning strokes and grasping are both done on the same region and using

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket