Ambient Touch: Designing Tactile Interfaces for Handheld Devices
Ivan Poupyrev¹, Shigeaki Maruyama² and Jun Rekimoto¹
¹ Interaction Lab, Sony CSL, 3-14-13 Higashigotanda, Shinagawa, Tokyo 141-0022, Japan
² Micro Device Center, Sony EMCS, 2-15-3 Konan, Minato-ku, Tokyo 108-6201, Japan
{poup, rekimoto}@csl.sony.co.jp, shigeaki.maruyama@jp.sony.com
http://www.csl.sony.co.jp/IL
`
`
`
`
`
`
`
`
ABSTRACT
This paper investigates the sense of touch as a channel for communicating with miniature handheld devices. We embedded a PDA with a TouchEngine™ – a thin, miniature, low-power tactile actuator that we designed specifically for use in mobile interfaces (Figure 1). Unlike previous tactile actuators, the TouchEngine is a universal tactile display that can produce a wide variety of tactile feelings, from simple clicks to complex vibrotactile patterns. Using the TouchEngine, we began exploring the design space of interactive tactile feedback for handheld computers. Here, we investigated only a subset of this space: using touch as the ambient, background channel of interaction. We propose a general approach to designing such tactile interfaces and describe several implemented prototypes. Finally, our user studies demonstrated 22% faster task completion when we enhanced handheld tilting interfaces with tactile feedback.
Keywords: tactile feedback, mobile devices and interfaces

INTRODUCTION
Miniature handheld computing devices such as mobile phones, PDAs, digital cameras, and music players are rapidly permeating our lives. In the near future, we will perhaps spend more time interacting with these devices than with conventional desktop computers equipped with familiar graphical user interfaces (GUIs). The interface design for small handheld computing devices is therefore an important and exciting challenge for interface researchers and designers.
There are many limitations and difficulties to overcome, well documented in the interface literature [25, 29, 33, 36]. Many of these limitations, however, also reflect assumptions and expectations inherited from the traditional GUI that fail in handheld devices, particularly the heavy reliance on rich visual feedback and on the user's undivided attention. In this work we attempt to extend beyond familiar visual interfaces and employ a relatively unexplored channel for interaction with handheld devices: the sense of touch.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
UIST'02, October 27-30, 2002, Paris, FRANCE.
Copyright 2002 ACM 1-58113-488-6/02/0010…$5.00.

Figure 1: TouchEngine™ actuator – a new vibrotactile actuator for designing tactile interfaces for small handheld computing devices; it is only 0.5 mm thick

Volume 4, Issue 2
51
SCEA Ex. 1019 Page 1

There are many possibilities for using tactile feedback in mobile interfaces. Here, we are particularly interested in a small subset of this design space: using touch as the ambient, background channel of interaction. Our basic hypothesis is that in a mobile setting, where the user's attention is fixed not on the computer but on real-world tasks, such ambient tactile interfaces become a necessary and important counterpart to traditional visual interaction.
The tactile, or cutaneous, sense is defined as a combination of various sensations evoked by stimulating the skin [11]. In combination with kinesthesia, tactile feedback is often referred to as haptics [4] and is crucial for interacting with our physical environment. The importance of tactile feedback has been recognized in many fields, from virtual reality [8] to the design of consumer electronics [1], and it is natural to extend its applications to mobile computers.
Indeed, mobile devices are naturally in close proximity to our skin, whether held in the hand or tucked away in a pocket. Skin is the largest human sensory organ (~1.8 m² [11]) and, with the exception of water and heat regulation, most of it is unused. Mobile interfaces can utilize this.
Tactile feedback also provides superior temporal discrimination: when rapidly successive data needs to be resolved, the sense of touch is about five times faster than vision [19]. Hence, it allows for precise and fast motor control: When we roll a pencil in our fingers, we can quickly
`and precisely re-adjust the 3D positions and grasping forces
`of our fingers by relying entirely on touch [3]. Furthermore,
`these complex motor operations produce little cognitive
`load and can be performed in parallel with other activities,
`such as reading a newspaper. This is because large areas of
`the sensory cortex are devoted to processing stimuli from
`the skin. Moreover, a large amount of the processing occurs
`in the lower level of the spinal cord, where sensory and
`motor neuron fibers intersect [11]1. By re-directing some of
`the information processing from the visual channel to
`touch, we can take advantage of this ability to reduce the
`cognitive load and make it easier to operate mobile devices.
`The skin also acts as a powerful information pick-up chan-
`nel. In the real world, it allows us to correctly judge object
properties when vision fails: e.g., textures and surface variations can be accurately detected by touch [9]. Geldard in 1957 [18] developed a vibrotactile language called "Vibratese" and demonstrated that trained subjects were able to receive complex messages at up to 38 words per minute. This
`and later studies [38] show that with proper encoding, mes-
`sages can be transmitted through the skin. We can take ad-
`vantage of this when designing mobile interfaces.
`The message, however, does not necessarily need to be
symbolic: touch has a strong emotional impact. Running a finger onto a splinter, touching a cat's fur, or dipping a hand into some unknown sticky substance all evoke intense, though very different, emotional responses. Hence, touch is a very strong "break-in" sense: cutaneous sensations, especially if aroused in unusual patterns, are highly attention-demanding [17]. This has already been explored in some
`mobile interfaces, but there is still more room for research.
`To conclude, these properties make touch an ideal channel
`of interaction with handheld devices: it is fast, needs little
`conscious control, allows for information encoding, and
`produces strong emotional responses. This paper investi-
`gates some of the implications of using tactile feedback in
`mobile interfaces.
We begin with a review of related work. We then describe the design and implementation of novel tactile displays based on the TouchEngine, a miniature low-power
`actuator that we created specifically for small handheld
`devices. Unlike other previously reported actuators, the
`TouchEngine is a versatile device that can be used to gen-
`erate a wide spectrum of tactile feelings. We continue by
`exploring the design space for tactile displays in mobile
`devices and investigate several applications using touch as
the ambient, background channel for mobile communication. Finally, we report the results of experimental studies, which demonstrated that tactile feedback resulted in, on average, 22% faster task completion when used in combination with tilting interfaces in a 1D-scrolling task.
`RELATED WORK
`The importance of tactile feedback in human-machine in-
`teraction is well recognized [1, 7]. Much effort has been
`
¹ In a striking example of this integration, it has been shown that a cat's walking may be controlled entirely within its spinal cord [9].
`
`
`spent on simulating realistic tactile properties of our physi-
`cal environment, such as roughness of surfaces. This has
`been achieved by either developing new, special purpose
`haptic devices [9, 32] or by adding tactile display to usual
`desktop devices, e.g. a mouse or touchpad [2, 26, 30]. In
`passive tactile feedback, alternatively, the user can “feel”
`virtual objects via manipulating physical objects with simi-
`lar haptic properties, e.g. shape or surface texture. For ex-
`ample, in one system, a user rotates a doll’s head to rotate a
virtual 3D model of a human brain [24]. This approach is also being explored in the tangible and graspable interface research areas [13, 27, 35].
`Tactile interfaces were intended to enhance the user experi-
`ence by making interfaces more realistic, intuitive, and easy
`to use. Another direction of research investigated how in-
`formation can be encoded and transmitted to the user
`through stimulation of the skin [17, 18, 38]. Results from
`this work were used to develop devices to assist blind or
`deaf people, e.g. Optacon [38]. Most of these devices rely
`on vibrotactile arrays to create a “tactile picture” that can
`be recognized through touch [e.g. 10].
`Unfortunately, little work has been done to add tactile
`feedback to mobile handheld devices. To effectively utilize
`tactile feedback, we must initially overcome two major
challenges. First, there are feasibility limitations in the current actuator technology [20]. The fundamental requirements for mobile actuators are: 1) small, miniature size; 2) light weight; 3) low voltage (~5 V) and low power consumption; and 4) ease of customization to allow retrofitting to devices of various sizes and forms.
`While technical characteristics of displays are important,
`we should not forget that they should also feel good. Satis-
`fying only feasibility requirements would not produce ex-
`pressive tactile displays. Although touch may not be as rich
`as vision, it is by no means a one-dimensional sense: we
have an amazing range of tactile sensations, and tactile displays should be able to evoke them. Depending on the design, the difference in feeling between tactile displays can be profound; perhaps as profound as the difference between today's high-resolution color monitors and the black-and-white vector displays of the 1960s.
`Hence, the second obstacle is human factors limitations.
`We still have limited understanding of the sense of touch,
`but we can suggest that the optimal mobile tactile display
`should satisfy the following requirements [9, 10, 17, 35]:
`1. Fast response. The minimal time in which humans can
`detect two consecutive tactile stimuli is about 5 ms [11].
Note that this is about five times faster than vision. Hence,
`we can approximate the minimum required latency of a
`tactile actuator to be ~5ms. A large lag would significantly
`reduce the quality of feeling and would not allow users to
`effectively regulate their motor movements or perceive
`complex tactile patterns. For example, vibration motors in
`mobile phones have significant latency and, therefore, can-
`not be used in interactive applications.
`2. Variable intensity. Humans can discriminate a vast range
`of intensities of tactile stimulus. Given appropriate condi-
`tions, the palm of a hand can feel vibratory stimulus with
`
`
`
`
`
`
`an amplitude as low as 0.2 microns [21]. On the other hand,
`for a comfortable tactile feeling, a much larger displace-
`ment should also be provided. Hence, the actuator should
`be able to generate variable skin displacements.
`3. Wide frequency bandwidth. The vibration frequency has
`a profound effect on tactile perception. For example, single
`tactile pulses, or taps, are perceived differently from the
sinusoidal vibration, because different cutaneous receptors
`respond to these patterns [11]. The ideal actuator should
`provide variable frequencies from 1 to 1000 Hz.
`4. Multitude of different wave shapes. Humans are able to
`distinguish a wide range of tactile wave shapes. Gault in
`1924 [17] converted speech into vibration and found that
`trained subjects were able to distinguish one out of 10 short
`sentences with 75% accuracy. Hence, to be effective, tactile
`actuators should be able to produce numerous wave shapes.
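As a rough illustration, the four requirements above can be collected into a simple spec check. The thresholds come from the text; the candidate spec and its field names are hypothetical, chosen only for illustration:

```python
# Check a candidate actuator spec against the four requirements above.
# Thresholds follow the text; spec values and key names are hypothetical.
requirements = {
    "fast response":       lambda s: s["latency_ms"] <= 5,
    "variable intensity":  lambda s: s["min_amplitude_um"] <= 0.2,
    "frequency bandwidth": lambda s: s["f_lo_hz"] <= 1 and s["f_hi_hz"] >= 1000,
    "wave shapes":         lambda s: s["arbitrary_waveforms"],
}

candidate = {"latency_ms": 5, "min_amplitude_um": 0.2,
             "f_lo_hz": 1, "f_hi_hz": 1000, "arbitrary_waveforms": True}

failed = [name for name, check in requirements.items() if not check(candidate)]
print("meets all requirements" if not failed else f"fails: {failed}")
```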
`We are not aware of tactile actuators that satisfy these crite-
`ria: most were designed to create a single type of tactile
feeling for a narrow application. Vibration motors, for example, rotate eccentrically weighted shafts that vibrate at about 130 Hz. Because their purpose is to alert the user, it does not matter that they have significant latency and can display only sinusoidal patterns with a limited amplitude and range of frequencies. These limitations, however, make it impossible to use vibration motors to encode any complex tactile feelings or to use them in interactive applications where small latency is crucial.
`Voice coils and speakers can also be used for tactile stimu-
`lation [20], but the displacement and force they provide is
low. Recently, Fukumoto [16] embedded a voice coil type
`actuator that provided sufficient force when driven at the
`resonant frequency. However, only a single frequency and
`amplitude of vibration can be produced; therefore, complex
`patterns of vibration cannot be encoded.
`Other interesting technologies are tactile matrix arrays (e.g.
`[12]) and direct electrical stimulation [28]. Matrix arrays
`are usually large and heavy, and require a considerable
`amount of voltage and power, e.g. 350V in the case of [12],
making them impossible to use in mobile devices. Electrocutaneous devices can be very small and efficient, but
`their feel is quite different from the familiar tactile feeling.
As with any display technology, the properties of tactile displays determine their interface applications. Currently, few applications of mobile tactile displays are available. Vibration motors have long been used as silent alarm devices,
`however, because of their inherent limitations, they provide
`only one bit of information: an occurrence of an event, e.g.
`a phone call. Probably, the first interactive tactile feedback
`for handheld devices was implemented in the Active Click
by Fukumoto [16]. Active Click was designed to provide a
`tactile “click” feeling when users touch the graphical but-
`tons on a touch screen. The coil actuators that he used have
`very low latency and the illusion of physical button pushing
`was strong. However, this is only a narrow application–due
`to the limitations we discussed above, it might be difficult
`to further expand the Active Click into other applications.
`Tactile displays have been proposed for wearable com-
`puters [20, 39] to be used, for example, as a navigation aid.
In this work, however, we are more interested in handheld devices and in more interactive applications than the display of navigation information. We were inspired by designs that
`attempted to deviate from the traditional GUI paradigm to
`create interfaces explicitly tailored for mobile devices. Ex-
`amples include context-aware and ambient interfaces, em-
`bodied and gesture interfaces, augmented reality, and oth-
`ers [6, 15, 25, 36]. Most of these interfaces still center on
`visual interaction and assume focused attention by the user.
`By augmenting them with tactile displays, we want to ex-
`pand these concepts leading to new mobile interfaces.
`DESIGNING A HANDHELD TACTILE DISPLAY
`Development of any haptic display involves: 1) Choosing
`or developing a haptic actuator, a transducer that converts
`electrical signals into mechanical motion; electrical motors
`are typical actuators used in haptic research. 2) Designing a
`tactile display that converts mechanical motion produced
`by the actuator into force communicated to the user. The
`same actuator can lead to various haptic displays with very
`different interactive properties. Indeed, motors have been
`used in haptic displays ranging from exoskeletons to small
vibrators in mobile phones. 3) Developing control hardware and software. Effective operation of a haptic display requires an in-depth understanding of its properties.
This section describes the design of the TouchEngine – a new tactile technology comprising an actuator, a tactile display, and control hardware and software – that we have developed to overcome the deficiencies of current tactile displays.
`
`
`
Figure 2 Left: The bending motor: the top layers contract and the bottom layers expand, bending the entire actuator; Middle: A microscopic view: 18 layers of piezo and 19 layers of electrodes; Right: Indirect haptic display.
`
`
`
`
`TouchEngine actuator
`Basic structure
The TouchEngine actuator is constructed as a sandwich of thin (~28 µm) piezoceramic films with printed adhesive electrodes in between, forming an extremely thin (less than 0.5 mm) beam (Figure 1 and Figure 2). The piezoceramic
`material works as a solid state “muscle” by either shrinking
`or expanding, depending on the polarity of the applied volt-
`age. The material on the top has an opposite polarity to that
`on the bottom, so when a signal is applied the entire struc-
`ture bends (Figure 2, left). This configuration is often
`called a “bending motor” actuator.
Bending motors that were previously used in tactile displays consisted of only two layers (bimorphs) and required extremely high voltage (e.g. 350V in [12]), making
`them unsuitable for mobile devices. By sandwiching multi-
`ple layers of very thin piezo film with the adhesive elec-
`trode (Figure 2, middle), we can reduce the voltage re-
`quired for maximum displacement to ±8-10V. Indeed, for
`the voltage V and the thickness of the piezoelectric layer T,
`the displacement D and force F for the serially connected
`bending motors are as follows [40]:
`
`
    D = (3 · d31 · l² · V) / (2 · T²)    and    F = (3 · w · T · V) / (4 · g31 · l)        (1)

where l and w are beam dimensions, and d31 and g31 are piezoelectric constants. We can see that by decreasing the thickness T, we can achieve the same displacement D with a lower voltage. This, however, also decreases the force F; we compensate for this by layering multiple thin piezoceramic layers together (Figure 2, middle).
The resulting actuator has unique properties. It is extremely thin and small, can be battery-operated, and can be manufactured in various sizes and numbers of layers. It is also fast, allowing us to control both the amplitude and the frequency of vibration at the same time to create a variety of tactile waveforms – this is not possible with any other actuator.
Electromechanical properties
This section discusses the TouchEngine's electromechanical properties. Note that this discussion applies to any piezoceramic bending motor actuator.
The relation between the signal voltage, the actuator displacement, and the force is linear (equation 1). We, however, are more concerned with the actuator dynamics, particularly how fast an actuator can reach the required displacement: the greater the acceleration of the actuator, the stronger the impulse force becomes (see equation 4) and the sharper the tactile impulses that can be detected by the user.
The actuator latency depends on its electrical properties. Electrically, the piezoceramic actuator behaves as a capacitor. The typical capacitances for the TouchEngine are ~3 µF for a 7-layer and ~4 µF for an 11-layer actuator. Consequently, the voltage across the actuator – and the displacement – changes according to the capacitor charging and discharging waveforms (Figure 3). Because a capacitor reaches over 99% of its charge in 5RC seconds [22], we can estimate the latency of an ideal actuator as:

    τ = 5RC,        (2)

where C is the actuator capacitance and R is the serial resistance [22]. Hence, the latency of an actuator is constant and does not depend on the voltage; an actuator reaches any target displacement in the same amount of time.
For a real actuator, internal resistance and inductance, material damping, physical mounting, and limitations in the current supply cause further delays in the displacement. Figure 4 shows the actual mechanical response of an 11-layer actuator measured using a laser displacement sensor. A latency of approximately 5 ms can be clearly seen, which is still near the optimal latency for a tactile actuator. Increasing the latency by increasing the capacitance and serial resistance can produce a softer and gentler feeling, which may be desirable in some applications.
The current requirement is another very important issue, because the current supply significantly affects the actuator response time, while in mobile devices the current supply is limited. Current flows through the piezo actuator only when it bends; the actuator then stays in the bent state without draining any additional current. The current equation is:

    i = C · (dV/dt),        (3)

where C is the capacitance and V is the signal voltage. The current is a sinusoid for a sinusoidal signal and a narrow spike for a square wave [22]. This peak current can be fairly high: up to 450 mA for a 20 V peak-to-peak signal. If the battery
`Figure 3: Black: input signal; grey:
`actuator voltage: a capacitive charg-
`ing/discharging curve.
`
Figure 4: Black: input signal; grey: actuator displacement (jagged because of the limited sensor resolution).
`
`Figure 5: Control board that drives Touch-
`Engine actuator.
`
`
`
`
`
`
the time derivative of the momentum; therefore, a higher acceleration of the actuator results in a stronger force:

    F = dp_d/dt = −dp_a/dt = −m · dv_a/dt = −m · a_a        (4)
`Here, the entire handheld device acts as a tactile display.
`Because our actuator moves very fast, we can create very
`sharp, distinct force impulses. Also, the actuator can be
`attached anywhere inside the device making construction of
`a tactile display easy, even for very small devices.
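The reasoning above can be sketched numerically. A minimal back-of-the-envelope estimate of the reaction force, assuming an illustrative 1 g attached mass and constant acceleration over the actuator's stroke (the values are assumptions, not measurements from our device):

```python
# Estimate the impulse force of the indirect display via equation (4):
# the device feels the reaction F = -m * a to accelerating the attached mass.
# All values below are illustrative assumptions, not measured figures.
m = 0.001    # attached mass, kg (1 g, hypothetical)
d = 0.1e-3   # actuator stroke, m (total displacement is below 0.1 mm)
t = 5e-3     # time to traverse the stroke, s (~5 ms response)

a = 2 * d / t**2   # constant acceleration covering d in time t (d = a*t^2/2)
F = m * a          # magnitude of the reaction force on the device body, N
print(f"a = {a:.1f} m/s^2, F = {F * 1e3:.1f} mN")
```

Halving the rise time quadruples the acceleration, which is why a fast actuator produces a much sharper perceived impulse.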
`Implementing a mobile tactile display
`We embedded a TouchEngine tactile display into a Sony
`Clié PEG-N700C Palm OS based PDA. A custom board
`that controls the actuator is inserted into the PDA
`(Figure 5). We freed the space inside the PDA by removing
`the Memory Stick slot, Jog Dial, and pen compartment.
`A basic diagram of the board is presented in Figure 6. An
`Atmel AVR 8-bit RISC microprocessor creates different
`waveforms and sends them to the actuator using a 12-bit
digital-to-analog converter (DAC). A power regulator (a DC-DC converter) is then used to boost the voltage from the 3.2 V of the PDA battery to ±12 V, and the current is further amplified using 250 mA unity-gain buffers. Caution must be taken to prevent amplifier oscillations, which often occur when driving large capacitive loads.
`Applications running on the Clié’s Palm OS communicate
`with the AVR chip via a serial port. We can easily repro-
`gram the microprocessor without removing the board from
`the PDA using a programming socket, so various tactile
feedback applications can be prototyped on the same platform. The board could be optimized further, e.g. by using much smaller and more efficient power supplies.
`Nevertheless, we are very satisfied with this platform as it
`fits entirely within the PDA, invisible to the user; provides
`high quality tactile feedback; and is easy to use for proto-
`typing and evaluating mobile tactile interfaces.
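As an illustration of the waveform-synthesis step, here is a sketch of what the microprocessor firmware might compute before streaming samples to the 12-bit DAC. The sample rate and function are assumptions for illustration, not details of the actual board:

```python
def square_wave(freq_hz, amplitude, duration_s, sample_rate=10_000):
    """Return 12-bit DAC codes (0..4095) for a symmetric square wave.

    amplitude is a 0..1 fraction of the full voltage swing; mid-scale
    code 2048 corresponds to the actuator's rest position."""
    samples = []
    period = sample_rate / freq_hz          # samples per vibration cycle
    for n in range(int(duration_s * sample_rate)):
        high = (n % period) < period / 2    # first half of each cycle is high
        level = amplitude if high else -amplitude
        samples.append(int(2048 + level * 2047))
    return samples

buf = square_wave(freq_hz=250, amplitude=0.5, duration_s=0.02)
print(len(buf), min(buf), max(buf))   # 200 samples, all within 0..4095
```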
`MOBILE TACTILE INTERFACES
`This section analyzes the design space of applications for
`mobile tactile interfaces. We do not attempt to suggest the
`ultimate taxonomy of mobile tactile interfaces: there are,
`perhaps, more applications than we can imagine at this
`point; other taxonomies can also be used (e.g. [31]). In-
`stead, this taxonomy is only meant to be a rough outline
`that can guide designers and developers.
`Ambient displays and touch
`In our analysis of tactile mobile interfaces, we were in-
`spired by previous research on ambient and peripheral
awareness interfaces [10, 27, 34, 37]. These interfaces attempt to utilize the human ability to receive and process information on the
`periphery of attention, without shifting focus from the pri-
`mary task. For example, it is relatively easy for someone to
`note that a meeting is about to start in the next office or that
`the weather is about to change [10].
The premise of ambient or peripheral awareness displays is that they would, first, allow users to perceive and process incoming information with little conscious effort and without the need to interrupt current activities. Second, they would
`provide unobtrusive notification techniques to shift the
`
`
`
`
`
`Figure 6: Basic diagram of the driving board.
`
`
`
cannot supply the maximum required current, the actuator latency will increase; e.g., by integrating equation (3):

    v(t) = C⁻¹ · ∫ i dt,

where t is time. Therefore, in the case of a limited current supply the actuator voltage (and displacement) changes not as the capacitive curve of Figure 3, but as a linear function of time: the more current that can be supplied, the faster the actuator bends. Thus, current amplifiers are essential for achieving low latency. We have found that 250 mA unity-gain buffers were sufficient to achieve a latency of 5 ms.
While the peak current may be large, the average current is low and dependent on the signal frequency and maximum voltage. We have found that the actuator requires approximately 3 mW of power for a single 20 V peak-to-peak square pulse, which can easily be supplied in handheld devices.
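The latency and current figures above can be reproduced with a short calculation. The series resistance value is an assumption chosen to match the reported ~5 ms latency:

```python
# Electrical model of the actuator as a capacitor (equations 2 and 3).
C = 4e-6    # capacitance of an 11-layer actuator, F (~4 uF, from the text)
R = 250.0   # series resistance, ohms (hypothetical, chosen so tau = 5 ms)

tau = 5 * R * C   # latency: >99% of full charge after 5*R*C seconds
print(f"latency ~ {tau * 1e3:.1f} ms")

# With a current-limited buffer, i = C*dV/dt caps the slew rate, so the
# voltage ramps linearly instead of following the RC curve.
i_max = 0.25   # 250 mA unity-gain buffer
swing = 20.0   # 20 V peak-to-peak square pulse
t_ramp = swing * C / i_max
print(f"current-limited ramp ~ {t_ramp * 1e3:.2f} ms")
```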
`TouchEngine display
The actuator that we designed bends in response to the applied signal. The next step is to convert this mechanical motion
`into a detectable force. The major challenge in doing this is
the very small total displacement of the actuator, less than
`0.1 mm. Two strategies have been developed to utilize our
`actuator: direct tactile display and indirect tactile display.
`In direct tactile display the actuator moves a part of the
`device, e.g. a single button or an entire PDA screen. The
`basic interaction metaphor is direct touch: when the user
`touches a part augmented with tactile feedback, various
`tactile patterns communicate back. Different elements of
the device can be augmented with distinct tactile feedback; however, it may be difficult and expensive to retrofit every element of a device with its own tactile actuator.
`Indirect tactile display is based on the tool metaphor: When
`we hit a nail with a hammer, the force is not communicated
`directly to our hand; we feel the effect of the impact
`through the handle. To achieve a similar effect, the actuator
`is placed anywhere within the device with a small weight
attached to it (Figure 2, right). Force is generated using conservation of momentum: for an isolated system with no external force, the total momentum remains zero. Thus, when the actuator bends, the mass moves up or down with momentum pa and the entire device moves with equal and opposite momentum pd. The force F felt by the user is
`
`
`
`
`proposed that a signal should be simplified to retain only
`the most significant bits of information and then re-mapped
into a different abstract representation suitable for a particular ambient display and application. This idea provides an approach to an important problem in ambient interfaces: how to communicate "enough" information to make the display useful while at the same time producing low cognitive load and an easy-to-use interface.
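The simplify-then-remap idea above can be sketched in a few lines. The quantization scheme, level names, and function names are hypothetical, chosen only to illustrate the approach:

```python
# Sketch of the simplify-then-remap idea: keep only the most significant
# bits of an incoming signal, then map the reduced value onto an abstract
# tactile representation. All names and mappings are hypothetical.
def simplify(value, bits=2, in_max=255):
    """Quantize a 0..in_max signal down to 2**bits levels."""
    levels = 2 ** bits
    return min(int(value / (in_max + 1) * levels), levels - 1)

TACTILE_LEVELS = ["off", "faint pulse", "steady pulse", "strong pulse"]

def remap(level):
    """Re-map a simplified level onto an abstract tactile representation."""
    return TACTILE_LEVELS[level]

print(remap(simplify(200)))   # strong pulse
```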
The lowest level of abstraction in tactile displays is when
`we directly simulate the tactile feeling of real-world ob-
`jects. This was investigated in VR and telepresence appli-
`cations [8]. A tactile interface in this case does not usually
`require much additional attention from the user. The excep-
`tion is when the user has to perform precise identification
`or control only through touch, e.g. blind control.
`A tactile signal can be abstracted to communicate only par-
`ticular tactile properties, e.g. button presses can be simpli-
`fied to a simple sinusoidal pulse to create the button “click”
`feeling. The entire simulation of the button physics may not
`be needed [16]. In this case the tactile signal is in a sense a
`metaphor of the button; other tactile metaphors can be de-
`signed to communicate various feelings, e.g. scratching,
`tapping, breaking, and so on. By associating these tactile
`signals with interface events we can communicate the state
`of the mobile interface through touch. We can further in-
`crease abstractness of the tactile signal by using vibrotactile
languages, such as Vibratese, that do not resemble any realistic tactile events but can carry a complex message [38].
`Using them, however, drastically increases cognitive load.
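The event-to-signal association described above might be organized as a simple lookup table. The event names and waveform parameters here are hypothetical, chosen only to illustrate the idea:

```python
# Map interface events to tactile "metaphor" signals, each described by
# waveform parameters an actuator driver could render. All names and
# values are illustrative assumptions.
TACTILE_METAPHORS = {
    "button_click": {"shape": "sine",   "freq_hz": 250, "cycles": 1,  "amp": 0.8},
    "scratch":      {"shape": "noise",  "freq_hz": 120, "cycles": 12, "amp": 0.4},
    "tap":          {"shape": "square", "freq_hz": 50,  "cycles": 1,  "amp": 1.0},
}

def pattern_for(event):
    """Return the vibrotactile pattern for an interface event, if any."""
    return TACTILE_METAPHORS.get(event)

print(pattern_for("tap")["shape"])   # square
```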
Here, we are interested in the top-left corner of the design space shown in Figure 7, which we refer to as Ambient Touch.
`We investigate some of its applications in the next section.
`AMBIENT TOUCH INTERFACES
`This section describes several application scenarios to
`evaluate mobile tactile displays. Although the TouchEngine
`can produce a great variety of vibrotactile patterns, we used
`only simple square waveforms of different frequencies and
intensities (e.g. Figure 3). We still lack an in-depth understanding of how to design vibration patterns to evoke
`specific user feelings; hence, for the initial investigation,
`we have chosen very simple patterns in which the feeling
`can be easily predicted and used in interface design.
We conducted a formal user study for one of the applications: tactile feedback for tilting interfaces. We tested the remaining applications informally, using our colleagues as subjects; the goal was not to collect data but to evaluate the feasibility of our ideas.
`Notification through touch
`Touch is an excellent “break-in” sense and has been used to
`alert people since the 1960s [19]. Today, vibrotactile dis-
`plays are commonly utilized for user notification in mobile
`phones and, recently, in handheld computers. The common
`problem, however, is that current tactile displays convey
`only a single bit of information: the simple occurrence of
`an event, e.g. a phone call. Because no related information
`is provided, such as who is calling or call urgency, the user
`is forced to interrupt their current activity, interact with the
`device, access the relevant information, and make the fol-
`
`
`
Figure 7: Design space of mobile tactile interfaces, organized along two axes: Abstraction (low to high) and Cognitive load (low to high). Examples placed in this space: simulation of real-world tactile feeling; tactile feedback for touch screens (e.g. Active Click); tactile feedback for tilting interfaces; tactile notification; tactile progress bar; precise blind control; and vibrotactile languages (e.g. Vibratese). Ambient Touch occupies the high-abstraction, low-cognitive-load corner.
user's attention when an interruption is needed. A number of ambient displays have been explored [27, 37]; however, all of them relied on visual or aural channels to communicate information to users.
We believe that touch is a perfect ambient information channel, in many cases more powerful than visuals or sound. We already heavily rely on the tactile sense to receive information in the background: we feel the ground
`ceive information in the background: we feel the ground
`with our feet and unconsciously adjust how we walk when
we step from the pavement to the grassy lawn; the movement of our clothes informs us of changes in the wind direction; and slight vibrations of a pen help us regulate the
`amount of pressure applied to the paper. Tactile feedback is
`an extraordinary attention management technique. We may
`not hear someone calling our name, but we certainly react
`when someone touches our shoulder. Moreover, because of
`the reasons discussed in the introduction, most of this is
`done with little focused attention, almost unconsciously.
Ambient tactile interfaces may be particularly useful in mobile applications. Indeed, since mobile users are often preoccupied with real-world tasks, tactile displays that are always close to the body allow information to be received without interrupting the current activity to look at the visual display [37]. Even when a user is focused on a mobile device, constant distractions and noise do not allow sustained attention on the device; hence, touch may substitute for vision during short distractions. Finally, due to our superior temporal processing abilities, we may be able to transfer some of the information processing from the eye to the finger.
`Design space of mobile tactile interfaces
`We have organized the design space of mobile tactile inte