Touch-Sensing Input Devices

Ken Hinckley and Mike Sinclair
Microsoft Research, One Microsoft Way, Redmond, WA 98052
{kenh, sinclair}@microsoft.com; Tel: +1-425-703-9065

ABSTRACT
We can touch things, and our senses tell us when our hands are touching something. But most computer input devices cannot detect when the user touches or releases the device or some portion of the device. Thus, adding touch sensors to input devices offers many possibilities for novel interaction techniques. We demonstrate the TouchTrackball and the Scrolling TouchMouse, which use unobtrusive capacitance sensors to detect contact from the user's hand without requiring pressure or mechanical actuation of a switch. We further demonstrate how the capabilities of these devices can be matched to an implicit interaction technique, the On-Demand Interface, which uses the passive information captured by touch sensors to fade in or fade out portions of a display depending on what the user is doing; a second technique uses explicit, intentional interaction with touch sensors for enhanced scrolling. We present our new devices in the context of a simple taxonomy of tactile input technologies. Finally, we discuss the properties of touch-sensing as an input channel in general.

Keywords
Input devices, interaction techniques, sensor technologies, haptic input, tactile input, touch-sensing devices.

INTRODUCTION
The sense of touch is an important human sensory channel. In the present context, we use the term touch quite narrowly to refer to the cutaneous sense, or tactile perception [16]. During interaction with physical objects, pets, or other human beings, touch (physical contact) constitutes an extremely significant event. Yet computer input devices, for the most part, are indifferent to human contact, in the sense that making physical contact, maintaining contact, or breaking contact provokes no reaction whatsoever from most software. As such, touch-sensing input devices offer many novel interaction possibilities.

Touch-sensing devices do not include devices that provide active tactile or force feedback [22]; those are output modalities that allow a device to physically respond to user actions by moving, resisting motion, or changing texture under software control. Touch sensing, by contrast, is an input channel: it allows the computer to have greater awareness of what the user is doing with the input device.

Fig. 1 Left: The TouchTrackball (a modified Kensington Expert Mouse) senses when the user touches the ball. Right: The Scrolling TouchMouse (a modified Microsoft IntelliMouse Pro) senses when the user is holding the mouse by detecting touch in the combined palm/thumb areas. It can also sense when the user touches the wheel, the areas immediately above and below the wheel, or the left mouse button.

Of course, certain input devices (such as touchpads, touchscreens, and touch tablets) that require touch as part of their normal operation have been available for many years. In all of these devices, one cannot specify positional data without touching the device, nor can one touch the device without specifying a position; hence touch sensing and position sensing are tightly coupled in these devices. Yet once it is recognized that touch sensing is an orthogonal property of input devices that need not be strictly coupled to position sensing, it becomes clear that there are many unexplored possibilities for input devices such as mice or trackballs that can sense one or more independent bits of touch data (Fig. 1).

We present two examples of interaction techniques that match these new input devices to appropriate tasks. The On-Demand Interface dynamically partitions screen real estate depending on what the user is doing, as sensed by implicit interaction with touch sensors. For example, when the user lets go of the mouse, an application's toolbars are no longer needed, so we fade out the toolbars and maximize the screen real estate of the underlying document, thus presenting a simpler and less cluttered display. By contrast, we use the touch sensors located above and below the wheel on the Scrolling TouchMouse to support explicit, consciously activated interactions: the user can tap on these touch sensors to issue Page Up and Page Down requests. Touch sensors allow this functionality to be supported in very little physical real estate and without imposing undue restrictions on the shape or curvature of the region to be sensed. We conclude by enumerating some general properties of touch sensors that we hope will prove useful to consider in the design of touch-sensing input devices and interaction techniques.

PREVIOUS WORK
Buxton [3] proposes a taxonomy of input devices that draws a distinction between input devices that operate by touch (such as a touchpad) versus input devices that operate via a mechanical intermediary (such as a stylus on a tablet). Card, Mackinlay, and Robertson [5] extend this taxonomy but give no special treatment to devices that operate via touch. These taxonomies do not suggest examples of touch-sensing positioning devices other than the touchpad, touchscreen, and touch tablet. Buxton et al. provide an insightful analysis of touch-sensitive tablet input [4], noting that touch tablets can sense a pair of signals that a traditional mouse cannot: Touch and Release. Our work shows how multiple pairs of such signals, in the form of touch sensors, can be applied to the mouse or other devices.

For the case of the mouse, we have already introduced one version of such a device, called the TouchMouse, in previous work [10]. This particular TouchMouse incorporated a pair of contact sensors, one for the thumb/palm rest area of the mouse, and a second for the left mouse button. This TouchMouse was used in combination with a touchpad (for the nonpreferred hand) to support two-handed input. The present paper demonstrates the TouchTrackball and a new variation of the TouchMouse, matches these devices to new interaction techniques, and discusses the properties of touch-sensing devices in general.

Balakrishnan and Patel describe the PadMouse, which is a touchpad integrated with a mouse [1]. The PadMouse can sense when the user's finger touches the touchpad. The TouchCube [12] is a cube that has touchpads mounted on its faces to allow 3D manipulations. Rouse [21] uses a panel with 4 control pads, surrounding a fifth central pad, to implement a "touch sensitive joystick." Rouse's technique only senses simultaneous contact between the thumb on the central pad and the surrounding directional pads. Fakespace sells Pinch Gloves (derived from ChordGloves [17]), which detect contact between two or more digits of the gloves.

Harrison et al. [7] detect contact with handheld displays using pressure sensors, and demonstrate interaction techniques for scrolling and for automatically detecting the user's handedness. Harrison et al. also draw a distinction between explicit actions that are consciously initiated by the user, versus implicit actions where the computer senses what the user naturally does with the device.

The Haptic Lens and HoloWall do not directly sense touch, but nonetheless achieve a similar effect using cameras. The Haptic Lens [23] senses the depression of an elastomer at multiple points using a camera mounted behind the elastomer. The HoloWall [18] uses an infrared camera to track the position of the user's hands or a physical object held against a projection screen. Only objects close to the projection surface are visible to the camera, and thus the HoloWall can detect when objects enter or leave proximity.

Pickering [20] describes a number of technologies for touchscreens (including capacitive, infrared (IR) detection systems, resistive membrane, and surface acoustic wave detection); any of these technologies could potentially be used to implement touch-sensing input devices. For example, when a user grabs a Microsoft Sidewinder Force Feedback Pro joystick, this triggers an IR beam sensor and enables the joystick's force feedback response.

Looking beyond direct contact sensors, a number of non-contact proximity sensing devices and technologies are available. Sinks in public restrooms activate when the user's hands reflect an IR beam. Burglar alarms and outdoor lights often include motion detectors or light-level sensors. Electric field sensing devices [26][24] can detect the capacitance of the user's hand or body to allow deviceless position or orientation sensing in multiple dimensions. Our touch-sensing input devices also sense capacitance, but by design we use this signal in a contact-sensing role. In principle, an input device could incorporate both contact sensors and proximity sensors based on electric fields or other technologies.

The following taxonomy organizes the various tactile input technologies discussed above. The columns are divided into contact and non-contact technologies, with the contact category subdivided into touch-sensing versus pressure or force sensing technologies. The rows of the table classify these technologies as either discrete (providing an on/off signal only) or continuous if they return a proportional signal (e.g., contact area, pressure, or range to a target). A technology is single-channel if it measures touch, pressure, or proximity at a single point, or multi-channel if it includes multiple sensors or multiple points of contact. The table omits the position and orientation-sensing properties of input devices, as these are handled well by previous taxonomies [3][5]. The table also does not attempt to organize the various technologies listed within each cell.

| | CONTACT: Touch-sensing | CONTACT: Pressure / Force | NON-CONTACT: Proximity |
|---|---|---|---|
| Discrete (on/off) | touchpad; touch tablet; touchscreens (except IR); touch-based switches; PadMouse [1]; TouchMouse [10]; TouchCube [12]; touch-sensitive joystick [21]; Pinch Gloves [17] | push button; membrane switch; Palm Pilot screen (pressure required); supermarket floor mats; car seat weight sensors for airbags; Psychic Space [13] (a grid of floor tiles that can sense which tiles a user is standing on) | motion detectors; light-level sensors; electro-magnetic field sensor [11]; Sidewinder force-feedback joystick (IR beam sensor); IR touchscreens |
| Continuous (proportional) | contact area (e.g., some touchpads & touchscreens) | pressure-sensitive touch tablet [4]; vector input touchscreen [9]; torque sensor; isometric joystick; multi-touch tablet w/ pressure [15]; pressure sensors on handhelds [7]; Haptic Lens (deformation at multiple points) [23] | laser rangefinder; stud finder; HoloWall [18]; field-sensing devices [24][26] |

Table 1: Classification of tactile input technologies.

TOUCH SENSING: HOW IT WORKS
The touch-sensing input devices described in this paper employ the circuitry shown in Fig. 2, which senses contact from the user's hand; no pressure or mechanical actuation of a switch is necessary to trigger the touch sensor. The "touch sensors" are conductive surfaces on the exterior of the device shell that are applied using conductive paint (available from Chemtronics [6]). The conductive paint is then connected internally to the touch sensing circuitry.

The internal circuitry generates a 30 Hz square wave that is present on the conductive paint pad. The parasitic capacitance of the user's hand induces a slight time delay in this square wave. When this time delay passes a critical threshold, a Touch or Release event is generated. A potentiometer (shown in the circuit diagram) allows adjustment of this threshold to accommodate conductive surfaces of various sizes; this only needs to be set once when the circuit is constructed (no calibration step is required for individual users). To provide a good coupling with the tactile feedback that the user feels, the capacitance sensors are set to generate Touch/Release events only and exactly when the user's hand actually makes (or breaks) contact with the surface. Our current prototype sends the touch data to the host PC's parallel port.

[Circuit schematic omitted: the conductive pad feeds a 74HC14 Schmitt-trigger stage and a 74HC74 flip-flop, whose output is routed to pin 15 of the host PC's parallel port.]

Fig. 2 Circuit diagram for a single touch sensor.

When providing multiple touch sensors with the circuit described above, the 30 Hz square wave can pass through the user's body and be picked up by another touch sensor as a false Touch or Release signal. Thus, to avoid interference, all devices that the user may be touching at a given time should be synchronized to the same square wave.

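Since contact detection and thresholding happen entirely in this hardware, the host software's job reduces to watching the parallel port for state changes. The following minimal sketch illustrates one way this could look; the paper does not specify the host-side protocol, so the read_status_byte function and the per-sensor bit assignments below are hypothetical stand-ins.

```python
import time

def read_status_byte() -> int:
    """Hypothetical stand-in for platform-specific parallel port access;
    assumed to return one bit per touch sensor."""
    raise NotImplementedError("platform-specific port access goes here")

# Hypothetical bit assignments for the Scrolling TouchMouse sensors.
SENSOR_BITS = {
    "palm/thumb": 0x01,
    "left button": 0x02,
    "wheel": 0x04,
    "basin above wheel": 0x08,
    "basin below wheel": 0x10,
}

def poll_touch_events(poll_hz: float = 60.0):
    """Yield (sensor, "Touch" | "Release") whenever a sensor bit flips."""
    prev = 0
    while True:
        cur = read_status_byte()
        for name, bit in SENSOR_BITS.items():
            if (cur ^ prev) & bit:  # this sensor's bit changed state
                yield name, ("Touch" if cur & bit else "Release")
        prev = cur
        time.sleep(1.0 / poll_hz)
```
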
Software Emulation
One could attempt to emulate Touch and Release events from software based only on the events provided by a normal mouse. Although this approach may be "good enough" for some interaction techniques, or to support situations in which a touch-sensing device is not available, it suffers from two significant drawbacks. First, one cannot distinguish a user holding the mouse still from a user that has let go of the mouse; this also implies that one cannot know with certainty whether subsequent mouse motion occurs because the user just touched the mouse, or because the user moved the mouse after holding it stationary for some period of time. A second limitation of software emulation is that only a single Touch/Release event pair for the entire input device can be inferred in this way. Without using actual touch sensors, it is impossible to know precisely which part(s) of the input device the user is touching, or to integrate multiple touch-sensitive controls with a device.

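To make the first limitation concrete, here is a minimal sketch (ours, not from the paper) of the obvious emulation strategy: treat any mouse activity as evidence of touch, and guess a Release after a period of inactivity. The release_timeout value is an arbitrary assumption, and the guess is necessarily wrong whenever the user is simply holding the mouse still.

```python
class EmulatedTouch:
    """Guess Touch/Release from ordinary mouse events (single channel).

    Illustrates the drawbacks above: a motionless hand on the mouse is
    indistinguishable from no hand at all, and only one Touch/Release
    pair can be inferred for the whole device.
    """

    def __init__(self, release_timeout: float = 1.0):  # assumed value
        self.release_timeout = release_timeout
        self.last_activity = 0.0
        self.touching = False

    def on_mouse_event(self, now: float) -> str | None:
        """Call on every mouse move, click, or wheel event."""
        self.last_activity = now
        if not self.touching:
            self.touching = True
            return "Touch"  # may fire late: the hand could have been
        return None         # resting on the mouse long before it moved

    def tick(self, now: float) -> str | None:
        """Call periodically; guesses Release after inactivity."""
        if self.touching and now - self.last_activity > self.release_timeout:
            self.touching = False
            return "Release"  # a timeout-based guess, not a sensed event
        return None
```
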
TOUCH-SENSITIVE INTERACTION TECHNIQUES
We now discuss specific interaction techniques that use touch sensors to good advantage. These techniques can be broadly categorized as implicit techniques, which passively sense how the user naturally uses an input device, versus explicit techniques, which require the user to learn a new way of touching or using the input device.

Implicit Actions Based on Touching an Input Device
Touch sensors can provide applications with information about the context of the user's work, at the level of which input devices the user is currently holding. Implicit actions use this information to improve the delivery and timeliness of user interface services, without requiring the user to learn a new way to use the input device. The user can benefit from touch sensing without necessarily even realizing that the device senses when he or she touches it. The following section demonstrates how this style of implicit interaction can be used to support the On-Demand Interface, and presents initial usability testing results for this technique.

The On-Demand Interface
Limited screen real estate is one of the most enduring design constraints of graphical user interfaces. Display resolutions are creeping upward, but quite slowly when compared to advances in memory and processor speed. Current market research data suggest that 66% of PC users are still restricted to a 640x480 pixel display surface [19].

The On-Demand Interface uses touch sensors to derive contextual information that can be used to make decisions about the relative importance of different parts of a graphical interface display. We use the touch sensors provided by the TouchTrackball and the Scrolling TouchMouse to determine changes to the current task context, and thus to dynamically shift the focus of attention between different layers or portions of the display. It may be possible to use traditional input events such as mouse motion or clicks to emulate some aspects of the On-Demand Interface, but given that the signals from the touch sensors are reliable, unambiguous, and require little or no overhead to use, we believe these provide a superior information source upon which to base the technique.

For example, toolbars can make a large number of functions "discoverable" and easy to access for the user, but they have often been criticized because these benefits come at the cost of permanently consuming screen real estate
[14]. Although some toolbars do provide visual indications of state (e.g., the current font and point size), most toolbars display no useful state information when the user is just looking at a document or entering text with the keyboard. In the On-Demand Interface, when the user touches or releases the TouchMouse, the toolbars fade in or fade out on an as-needed basis using smooth alpha-transparency animation¹. Touching the mouse causes the toolbars to fade in quickly, while releasing the mouse causes the toolbars to fade out gradually. The end result is that when the user is not actively using the toolbars, the screen appears simpler and less cluttered, while the display real estate allocated to the document itself is maximized (Fig. 3). In the current prototype, we leave the toolbar slightly transparent even when it is faded in so that the user can maintain awareness of parts of the document that are underneath the toolbar.

We chose to use animations of alpha-transparency rather than animated motion such as sliding or zooming. Motion draws the user's attention, and our design goal is for the interface to change in a manner that is minimally distracting. Fading in takes place quickly (over 0.3 seconds) because the user may be grabbing the mouse with the intent to select an item from the toolbar; fading out takes place more gradually (over 2.0 seconds) because we do not want to draw the user's attention to the withdrawal of the toolbars. The toolbars could appear instantaneously, but we find that instantaneous transitions seem very jarring and unpleasant, especially given that such a transition will occur every time the user grabs the mouse.

Note that although it would be possible to fade out all menus and toolbars, this may not always be appropriate. Menus serve as a reminder for keyboard shortcuts during text entry, and some toolbars do provide visual indications of state. However, one can distinguish the size of the toolbar that is best for interaction with the mouse versus the size of the toolbar that is necessary to visually display the desired state information. As seen in Fig. 3, the On-Demand Interface fades in a compact toolbar, scrollbar, and menu representation while the toolbars fade out. During our usability tests, most users did not notice or comment on this change in appearance, although one user did mention that "I would expect the Bold icon to stay in the same place."

We also use the touch sensors on the wheel and mouse button of the Scrolling TouchMouse to support a reading mode of interaction when the user engages the wheel. Rotating the wheel on a standard IntelliMouse scrolls the document line-by-line, and we have observed that users will often keep their finger perched on the wheel when they pause to read the document. Since the user does not need the toolbars while using the wheel, the On-Demand Interface senses initial contact with the wheel and uses this signal to gradually fade out the toolbars and again maximize the display real estate allocated to the document. In our current design, a faint trace of the toolbars remains visible while in reading mode so that the user can see where the toolbars will appear upon return to normal mouse usage. The interface reverts to normal mouse usage when the user's index finger returns to and touches the mouse button, which quickly fades the toolbars back in. Although accidentally touching the wheel and thus switching to reading mode might seem to be a problem, during our usability tests we found this was not a significant issue. Regarding this point, one test user commented that "I like that it fades back in kind of quick. So if you had accidentally touched [the wheel] it's no big deal."

We use the TouchTrackball to apply the On-Demand Interface concept to the ToolGlass technique [2], which provides the user with a set of movable semi-transparent "click-through" tools that are controlled with a trackball in the nonpreferred hand. When the user touches the trackball, the ToolGlass fades in quickly over 0.3 seconds; if the user is also touching the mouse, the toolbars simultaneously fade out (Fig. 4). When the user releases the trackball, the ToolGlass fades out and, if the user is touching the mouse, the toolbars simultaneously fade back in (over 1.0 second). If the user clicks through a tool to initiate a command with the ToolGlass, it fades out immediately (over 0.2 seconds) and does not fade back in unless the user moves the trackball or releases and touches the trackball again.

¹ We implemented this prototype using a 3D graphics accelerator to provide alpha-blending of texture maps; it is not a fully functional implementation of Microsoft Word.

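The fade behavior described above reduces to a small amount of per-frame animation state. The sketch below captures the asymmetric timings from the text (toolbars fade in over 0.3 s and out over 2.0 s, with an optional faint-trace floor for reading mode); it is our own minimal reconstruction, not the authors' alpha-blended texture-map implementation, and the 0.1 floor value is an assumption.

```python
class Fader:
    """Linear alpha animation with asymmetric fade-in/fade-out durations."""

    def __init__(self, fade_in_s: float, fade_out_s: float):
        self.fade_in_s = fade_in_s
        self.fade_out_s = fade_out_s
        self.alpha = 0.0    # 0.0 = invisible, 1.0 = fully shown
        self.target = 0.0

    def fade_in(self) -> None:
        self.target = 1.0

    def fade_out(self, floor: float = 0.0) -> None:
        # A nonzero floor leaves a faint trace visible (reading mode).
        self.target = floor

    def step(self, dt: float) -> None:
        """Advance the animation by dt seconds (call once per frame)."""
        duration = self.fade_in_s if self.target > self.alpha else self.fade_out_s
        max_step = dt / duration  # fraction of the full range per frame
        delta = self.target - self.alpha
        self.alpha += max(-max_step, min(max_step, delta))

# Timings from the text: toolbars fade in over 0.3 s and out over 2.0 s.
toolbars = Fader(fade_in_s=0.3, fade_out_s=2.0)

def on_mouse_touch():   toolbars.fade_in()
def on_mouse_release(): toolbars.fade_out()
def on_wheel_touch():   toolbars.fade_out(floor=0.1)  # assumed trace level
```
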
Informal Usability Evaluation
We conducted informal usability tests of the On-Demand Interface, which were intended to explore user acceptance of this technique and to identify usability problems with our current implementation. We recruited 11 users from an internal pool of administrative assistants for this study. All users were familiar with Microsoft Word, but none had seen or tried our touch-sensing input devices before.

For the testing, we implemented the On-Demand Interface technique in a prototype that fully supported the various fade in / fade out transitions in response to interacting with the input devices, but only supported limited interaction with the document itself (users could click and drag with the mouse to circle regions of text) and limited keyboard text entry (as the user typed, text actually appeared in a small separate box below the main window). Nonetheless, we feel that this functionality was sufficient to test the utility of the On-Demand Interface concept.

In particular, since we felt that transitions between the different task contexts recognized by the On-Demand Interface might result in usability problems, we tried to test interleaving of the various task contexts as much as possible. For example, we asked users to highlight a word with the mouse; then type in some text to replace this; then click on the Bold icon in the toolbar; then switch back to typing again, and so on. After performing several structured tasks of this sort, users were also encouraged to play around with the interface to get a more thorough feel for what they did or did not like.

Test users were quite enthusiastic about the ability to see more of the screen during typing and scrolling tasks, while at the same time having the toolbar available on short notice. One user explained that "I like that [the toolbar] comes up quickly when you need it and you can control how long it stays up" and that "all the extra stuff isn't there when I don't need it." Subjective questionnaire ratings on a 1 (disagree) to 5 (agree) scale confirmed these comments: users reported that the TouchMouse was easy to use and that they liked seeing more of the document at once (average rating 4.5 for both questions).

Most users also liked the fading animations that transitioned between screen layouts. Two users did feel that the transition from the toolbars to a "clean screen" for text entry was too slow. One user wanted the toolbar to slide into place instead of fading. However, it was clear that transitions between the toolbars and the "clean screen" were well accepted overall and were not the source of any significant usability problems; when asked if "switching between the keyboard and mouse is disconcerting," users clearly disagreed (average rating 1.9). Users also felt that the touch sensors provided an appropriate way to control these transitions, offering comments such as "I really like the touch-sensitive - I really like that a lot."

As noted above, in this prototype we experimented with leaving the toolbars slightly transparent even when they were fully faded in, to allow some awareness of the occluded portions of the document. We felt this was a useful feature, but all 11 users reported that they disliked the slightly transparent toolbar, and often in no uncertain terms: one user described it as looking "like a wet newspaper" while another simply stated, "I hate that!" Users clearly felt the display should always transition to fully opaque or fully invisible states. In retrospect, we realized that this dissatisfaction with semi-transparent toolbars on top of a text editing application perhaps should have been expected, given that studies of transparent interfaces have shown text backgrounds lead to relatively poor performance [8], and we may not have chosen the icon colors, styles, or transparency levels with sufficient care.

With regard to the TouchTrackball and ToolGlass, users also generally liked that the ToolGlass faded in when they touched the trackball: "That's cool when the ball takes over the hand commands." One user did comment that the appearance of the ToolGlass, and simultaneous disappearance of the toolbars, was the only transition where "I felt like too much was going on." Perhaps the toolbars should stay put or fade out more slowly in this case. Interestingly, in contrast to the strongly negative response to the slightly see-through toolbars, most users had no problem with the semi-transparency of the ToolGlass; it was probably easier to visually separate the foreground and background layers in this case because the user can move the ToolGlass. For example, one user mentioned that "It's good to see where an action would be and what it would look like." A couple of users commented that using two hands "would definitely take some getting used to," but in general users seemed to agree that "using the trackball was easy" (average 4.3).

The On-Demand Interface demonstrates a novel application of touch sensors that dynamically adjusts screen real estate to get unnecessary portions of the interface out of the user's face. Since the initial user feedback has been encouraging, we plan to add these capabilities to a more fully functional application and perform further studies of the technique to determine if additional issues might arise with long-term use. We are also investigating the appropriateness of the technique for other interface components such as floating tool palettes or dialog boxes.

Explicit Actions Based on Touch Sensors
A second general class of interaction techniques uses touch sensors to allow an input device to express an enhanced vocabulary of explicit actions, but the user must learn these new ways of touching or using the input device to fully benefit from them. Clearly, such actions should have minimal impact on the way one would normally use the device, so that the new capabilities do not interfere with the user's existing skills for controlling the input device.

The Scrolling TouchMouse
The Scrolling TouchMouse (Fig. 1, right) is a modified Microsoft IntelliMouse Pro mouse. This mouse includes a wheel that can be used for scrolling, and an oblong plastic basin that surrounds the wheel. The wheel can also be clicked for use as a middle mouse button.

In the previous section, we described how several of the touch sensors on the Scrolling TouchMouse could be used for implicit sensing of the user's task context. In this section, we describe the use of two touch sensors that we have added to the basin, one above and one below the wheel. In addition to the usual line-by-line scrolling supported by rolling the wheel, these touch sensors enhance scrolling actions with several new behaviors (a minimal sketch of the dispatch logic appears after the list):

• Tapping: Tapping the top part of the basin triggers a Page Up command; tapping the bottom of the basin triggers a Page Down. The wheel is good for short-range scrolling, but is less effective for long range scrolling [25]; the tapping gesture provides an effective means for discrete scrolling at a larger scale of motion.

• Roll-and-hold: This extends the gesture of rolling the wheel to support smooth scrolling. Rolling the wheel until the finger contacts the top touch sensor or the bottom touch sensor initiates continuous up scrolling or continuous down scrolling, respectively. The scrolling starts after a brief delay (0.15 seconds) to prevent accidental activation from briefly brushing the sensor.

• Reading sensor: We already use the wheel touch sensor in the On-Demand Interface to sense when the user begins a scrolling interaction. Since IntelliMouse users often leave their finger perched on the wheel while reading, an intriguing possibility is that dwell time on the wheel may prove useful as a predictor of how much time the user has spent reading content on a web page, for example. We have not yet tested the wheel sensor in this role.

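The sketch below shows one plausible dispatch for a single basin sensor. It folds tapping and roll-and-hold into one timer and ignores the wheel-rolling precondition for brevity, so only the 0.15-second hold delay comes from the text; the page and scroll_continuously callbacks are hypothetical hooks into the host's scrolling machinery.

```python
HOLD_DELAY_S = 0.15  # delay before continuous scrolling (from the text)

def page(direction: int) -> None: ...                 # hypothetical hooks
def scroll_continuously(direction: int) -> None: ...  # into the host UI

class BasinSensor:
    """One touch sensor above (+1) or below (-1) the wheel."""

    def __init__(self, direction: int):
        self.direction = direction
        self.touch_time: float | None = None
        self.scrolling = False

    def on_touch(self, now: float) -> None:
        self.touch_time = now
        self.scrolling = False

    def tick(self, now: float) -> None:
        """Call once per frame while contact persists."""
        if self.touch_time is not None and not self.scrolling:
            if now - self.touch_time >= HOLD_DELAY_S:
                self.scrolling = True  # brief brushes never get this far
        if self.scrolling:
            scroll_continuously(self.direction)

    def on_release(self, now: float) -> None:
        # Contact that ends before continuous scrolling engages is a tap:
        # issue a discrete Page Up / Page Down command.
        if self.touch_time is not None and not self.scrolling:
            page(self.direction)
        self.touch_time = None
        self.scrolling = False
```
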
We performed informal evaluations with ten test users recruited from the Microsoft Usability pool; 3 of the 10 users had previously used a mouse including a scrolling wheel. Test users were asked to scroll to various points within a long web page containing approximately 10 pages of content. For this informal study, we did not control the distances to the various scrolling targets, nor did we test the interleaving of scrolling with other common mouse tasks; a future study should address these issues. Our main goals were to observe user responses to the device, discover some potential usability problems, and see if touch sensors were effective for these kinds of interactions.

Users found the tapping feature extremely appealing. When asked to respond to the statement "Paging up and down with the TouchMouse was easier than paging with my current mouse," user responses averaged a 4.6 (again on a 1-5 scale). One user commented "I really like this, it's pretty cool... just tap, tap, tap, done!" while another commented that "I didn't really see a reason for the wheel. Just touching the gold [sensor] was easy enough." One user did feel that "the tap surface should be larger."

Several users expected the tapping sensors to support an additional gesture that we currently have not implemented, the tap-and-hold: tapping and then holding one's finger would trigger a paging command followed by more rapid continuous up or down scrolling. One potential problem with the tap-and-hold is that simply resting one's finger on the basin after tapping would now trigger an action. We plan to experiment with a tap-and-hold gesture to see whether or not it is genuinely useful.

Problems with the device related to the wheel itself and the roll-and-hold behavior. When asked to respond to "I liked the way the wheel on the TouchMouse felt while scrolling," user responses averaged a 3.2 (with 3 = neither agree nor disagree). Several difficulties led to this lukewarm response. Our touch-sensing modifications to the wheel made it slightly slippery and harder to turn; this problem also made it more likely that users would click the wheel by mistake, and due to a technical glitch, the roll-and-hold did not work correctly when this happened. Also, the "continuous" scrolling implemented in our prototype was jerky and moved too slowly. Users did not like this. Fortunately, these are not inherent problems and will be improved in our next design iteration.

Despite the problems with the roll-and-hold mentioned above, users felt that overall "The TouchMouse was easy to use for scrolling" (resp
