(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2006/0284849 A1
    Grant et al.                              (43) Pub. Date: Dec. 21, 2006

(54) METHODS AND SYSTEMS FOR PROVIDING A VIRTUAL TOUCH HAPTIC EFFECT TO HANDHELD COMMUNICATION DEVICES

(76) Inventors: Danny A. Grant, Quebec (CA); Jeffrey Eid, Danville, CA (US); Shoichi Endo, Cupertino, CA (US); Erik J. Shahoian, San Ramon, CA (US); Dean C. Chang, Gaithersburg, MD (US)

Correspondence Address:
IMMERSION - THELEN REID & PRIEST LLP
P.O. BOX 640640
SAN JOSE, CA 95164-0640 (US)

(21) Appl. No.: 10/538,161
(22) PCT Filed: Dec. 8, 2003
(86) PCT No.: PCT/US03/38862

Related U.S. Application Data
(60) Provisional application No. 60/431,662, filed on Dec. 8, 2002.

Publication Classification
(51) Int. Cl.: G09G 5/00 (2006.01)
(52) U.S. Cl.: 345/173

(57) ABSTRACT

Embodiments of the invention relate to methods and systems (100) for providing customized "haptic messaging" to users of handheld communication devices in a variety of applications. In one embodiment, a method of providing virtual touch to a handheld communication device includes: receiving an input signal associated with a virtual touch; outputting a request relating to a contact with a user-interface member coupled to a handheld communication device; and providing a control signal associated with the contact to an actuator coupled to the handheld communication device, the control signal being configured to cause the actuator to output a haptic effect associated with the virtual touch.

[Representative drawing (flowchart 600): 610 Receiving a virtual touch indicator; 620 Performing an initialization; 625 Requesting a contact with a user-interface member; 630 Receiving a virtual touch signal; 640 Providing a haptic effect based on the virtual touch signal.]


[Sheet 1 of 9 - FIG. 1: block diagram of handheld communication device 100; drawing not reproduced in this text extraction.]


[Sheet 2 of 9 - FIG. 2, flowchart 200: 210 Receiving an input signal associated with an event; 220 Determining a source of the event and selecting the control signal based on the determination; 230 Outputting a control signal to an actuator coupled to a handheld communication device; 240 Providing a collection of haptic effects, each associated with a distinct control signal; 250 Receiving a mapping between an event of interest and one of the haptic effects; 260 Compiling the mapping into a haptic lookup table.]


[Sheet 3 of 9 - FIG. 3, flowchart 300; drawing text not recoverable from this extraction.]


[Sheet 4 of 9 - FIG. 4, flowchart 400; drawing text not recoverable from this extraction.]


[Sheet 5 of 9 - FIG. 5, flowchart 500; drawing text not recoverable from this extraction.]


[Sheet 6 of 9 - FIG. 6, flowchart 600; drawing text not recoverable from this extraction.]


[Sheet 7 of 9 - FIG. 7, flowchart 700; drawing text not recoverable from this extraction.]


[Sheet 8 of 9 - FIG. 8, flowchart 800; drawing text not recoverable from this extraction.]


[Sheet 9 of 9 - FIG. 9, flowchart 900; drawing text not recoverable from this extraction.]


METHODS AND SYSTEMS FOR PROVIDING A VIRTUAL TOUCH HAPTIC EFFECT TO HANDHELD COMMUNICATION DEVICES

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to U.S. Provisional Patent Application No. 60/431,662, filed on Dec. 8, 2002, the entire disclosure of which is incorporated herein by reference.

FIELD OF THE INVENTION

[0002] This invention relates generally to haptic-feedback systems. More specifically, embodiments of the present invention relate to using customized haptic effects in a variety of applications to convey information to users of handheld communication devices.

BACKGROUND

[0003] As handheld communication devices become part of everyday life, device manufacturers and service providers strive to enhance the versatility and performance of such devices.

[0004] Handheld communication devices in the art (e.g., mobile phones, pagers, personal digital assistants (PDAs), etc.) typically use auditory and visual cues to alert a user when incoming messages, such as voice calls and emails, are received. Such auditory and visual alerts, however, have the disadvantages of being distracting in some situations (e.g., during driving), or annoying in others (e.g., during a meeting or a concert). Although vibratory alerts are made available in some communication devices such as cellular phones, such vibratory effects cannot be customized or personalized according to applications, thus conveying little information to the user. A need, therefore, exists in the art for a new sensory modality that delivers information to users of handheld communication devices in a personalized fashion.

SUMMARY

[0005] Embodiments of the invention relate to methods and systems for providing customized "haptic messaging" to users of handheld communication devices in a variety of applications.

[0006] In one embodiment, a method of providing virtual touch to a handheld communication device includes: receiving an input signal associated with a virtual touch; outputting a request relating to a contact with a user-interface member coupled to a handheld communication device; and providing a control signal associated with the contact to an actuator coupled to the handheld communication device, the control signal being configured to cause the actuator to output a haptic effect associated with the virtual touch.

[0007] In another embodiment, a method of providing virtual touch to a handheld communication device includes: receiving a virtual touch indicator; performing an initialization responsive to the virtual touch indicator on a handheld communication device; receiving a virtual touch signal associated with the initialization; and outputting a control signal associated with the virtual touch signal to an actuator coupled to the handheld communication device.

[0008] Further details and advantages of embodiments of the invention are set forth below.

BRIEF DESCRIPTION OF THE FIGURES

[0009] These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:

[0010] FIG. 1 depicts a block diagram of a haptic handheld communication device according to an embodiment of the present invention;

[0011] FIG. 2 shows a flowchart depicting a method of using customized haptic effects to convey information to users of handheld communication devices, according to an embodiment of the invention;

[0012] FIG. 3 shows a flowchart depicting a method of using haptic logos to relate information to users of handheld communication devices, according to an embodiment of the invention;

[0013] FIG. 4 shows a flowchart depicting a method of haptically encoding communication signals, according to an embodiment of the invention;

[0014] FIG. 5 shows a flowchart depicting a method of providing haptic messaging to users of handheld communication devices, according to a further embodiment of the invention;

[0015] FIG. 6 shows a flowchart illustrating a method of providing an interactive virtual touch in one embodiment of the present invention;

[0016] FIG. 7 depicts a flowchart illustrating a method of carrying out a chat session using handheld communication devices, according to an embodiment of the invention;

[0017] FIG. 8 shows a flowchart depicting a method of using haptic effects to relate navigation information, according to an embodiment of the invention; and

[0018] FIG. 9 shows a flowchart illustrating a method for providing haptic effects to a remote control in one embodiment of the present invention.

DETAILED DESCRIPTION

[0019] Embodiments described in the following description are provided by way of example to illustrate some general principles of the invention, and should not be construed as limiting the scope of the invention in any manner. One skilled in the art would also recognize that various changes and modifications can be made herein, without departing from the principles and scope of the invention.

[0020] FIG. 1 depicts a block diagram of a handheld communication device 100 according to an embodiment of the invention. It will be appreciated that various elements are shown in schematic form for illustrative purposes and are not drawn to scale. It will also be appreciated that many alternative ways of practicing the present invention exist. Accordingly, various changes and modifications may be made herein, without departing from the principles and scope of the invention.

[0021] Device 100 includes a device body including a housing 110 and a user-interface 112; a processor 120; at least one actuator 130 in communication with processor 120; and a memory 140 in communication with processor 120. Device 100 also includes an antenna 150 and a transceiver 160, in communication with processor 120. Device 100 additionally includes a display module 170 and an audio module 180, in communication with processor 120. Display module 170 may include, for example, a liquid crystal device. Audio module 180 may include, for example, a speaker, a microphone, and the like.

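For illustration only, the arrangement of components described in paragraph [0021] could be loosely modeled as follows (a minimal Python sketch; the class and attribute names are hypothetical and not part of the disclosure):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Actuator:
    """Stand-in for actuator 130: renders a control signal, here a waveform."""
    def output(self, waveform: List[float]) -> None:
        print(f"driving actuator with {len(waveform)}-sample waveform")

@dataclass
class HandheldDevice:
    """Loose model of device 100: user-interface 112, actuator 130, memory 140."""
    user_interface_members: List[str] = field(
        default_factory=lambda: ["key_1", "key_5", "direction_pad"])
    actuator: Actuator = field(default_factory=Actuator)
    memory: Dict[str, object] = field(default_factory=dict)  # program code, lookup tables

device = HandheldDevice()
device.actuator.output([0.2, 0.6, 0.2])   # a haptic effect rendered by the actuator
```
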
[0022] For purpose of illustration in the embodiment of FIG. 1, processor 120, actuator 130, and memory 140 are shown to be enclosed within and coupled to the device body. Such an illustration, however, should not be construed as limiting the scope of the invention in any manner. In alternative embodiments, actuator 130 may, for example, be coupled to the outside of housing 110, or embedded in housing 110 via a suitable mechanism. Further, user-interface 112 may include one or more user-interface members. As used herein, a user-interface member includes, without limitation, a key pad having one or more keys, one or more buttons, a touch screen or touch pad, a scroll wheel, a direction pad, a trackball, a knob, a miniature joystick, or other user-interface means known in the art.

[0023] Device 100 further includes an API (Application Program Interface) 190, working in conjunction with an operating system 195. A device driver (not shown) may optionally provide an interface between operating system 195 and processor 120.

[0024] Memory 140 of device 100 stores a program code that includes instructions to cause processor 120 to perform various tasks. The following description provides some examples.

[0025] FIG. 2 shows a flowchart 200 depicting a method of using customized haptic effects to convey information to users of handheld communication devices, according to an embodiment of the invention. At step 210, an input signal associated with an event is received. At step 220, a source of the event is determined and a control signal is selected based on the determination. At step 230, a control signal is output to an actuator coupled to a handheld communication device (see FIG. 1 for an embodiment of such device). The control signal is configured to cause the actuator to output a haptic effect associated with the event.

[0026] Furthermore, at step 240, a collection of haptic effects is provided, each haptic effect being associated with a control signal. For example, memory 140 of FIG. 1 can store a program code that includes instructions to generate the control signals (e.g., each characterized by a distinct waveform) for rendering the corresponding haptic effects. Haptic effects (along with associated control signals) may also be downloaded or transmitted from a remote source, such as a service provider, a network resource, a Web server, a remote handheld communication device or computer. Such downloaded or transmitted haptic effects can be further edited or modified. At step 250, a mapping between an event of interest and one of the stored haptic effects is received. By way of example, memory 140 of FIG. 1 may also store a program code that enables a user to map an event of interest to one of the haptic effects as provided, e.g., via user-interface 112 through API 190, where the event may be identified by its source. At step 260, the one-to-one mappings made between various events of interest and the corresponding haptic effects are compiled into a haptic lookup table, which can, for example, be stored in memory 140 of FIG. 1.

[0027] In the embodiment of FIG. 2, the term "selecting" includes, without limitation, looking up a predetermined mapping between the event of interest and a corresponding haptic effect based on the source determination, and selecting/generating a control signal that is configured to render the desired haptic effect associated with the event (e.g., upon being applied to an actuator). Selection can be made based upon the aforementioned haptic lookup table, for example.

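To picture how steps 240 through 260 and the "selecting" of paragraph [0027] might fit together, here is a minimal Python sketch (all names, waveforms, and the fallback behavior are hypothetical, not taken from the specification):

```python
from typing import Dict, List

# Hypothetical control signals, each characterized by a distinct waveform (step 240).
HAPTIC_EFFECTS: Dict[str, List[float]] = {
    "gentle_pulse": [0.2, 0.4, 0.2, 0.0],
    "strong_jolt":  [1.0, 0.0, 1.0, 0.0],
    "heartbeat":    [0.6, 0.1, 0.8, 0.1],
}

# Steps 250/260: user-supplied mappings compiled into a haptic lookup table,
# keyed by the "source" that identifies the event (caller number, e-mail address, ...).
haptic_lookup_table: Dict[str, str] = {}

def map_event_source(source: str, effect_name: str) -> None:
    if effect_name not in HAPTIC_EFFECTS:
        raise ValueError(f"unknown haptic effect: {effect_name}")
    haptic_lookup_table[source] = effect_name

def select_control_signal(source: str) -> List[float]:
    # "Selecting" (paragraph [0027]): look up the predetermined mapping for the
    # determined source; here an arbitrary default covers unknown senders.
    effect_name = haptic_lookup_table.get(source, "strong_jolt")
    return HAPTIC_EFFECTS[effect_name]

# Example: map the spouse's number to a gentle effect, then handle an incoming call.
map_event_source("+1-555-0100", "gentle_pulse")
waveform = select_control_signal("+1-555-0100")   # -> gentle_pulse waveform
```
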
[0028] In one embodiment, the input signal may include a communication signal associated with a call event, such as a voice call, an e-mail, or a message in text or multimedia form, which may be received via antenna 150 and transceiver 160 of FIG. 1, for example. The "source" of a call event may be related to a characteristic that distinctly identifies or characterizes the call event, such as the caller's phone number, the sender's e-mail address, a graphical feature or an icon associated with the incoming message, etc.

[0029] In another embodiment, the input signal may be associated with a reminder event, which may be a self-generated message on the handheld communication device serving as a reminder for a pre-scheduled activity (e.g., an appointment or a meeting). The source in this scenario may be associated with the type of a pre-scheduled activity (e.g., a business meeting vs. a restaurant reservation), or the time at which the pre-scheduled activity takes place.

[0030] In yet another embodiment, the input signal may include a communication signal associated with a status event, for example, received via antenna 150 and transceiver 160 of FIG. 1. Examples of a status event include, but are not limited to: an advertisement (e.g., sale) event, a one-to-one marketing event, a business-transaction event, a stock-trading event, a weather-forecast event, a sports (or game) event, an entertainment event, and an emergency (e.g., 911) event. In this scenario, the source may be associated with a characteristic that distinctly identifies the sender and/or the nature of a status event, such as the phone number of the handheld user's stockbroker, the e-mail address of the user's favorite store, the logo associated with the user's favorite TV or radio station, and so on.

[0031] In one embodiment, an event of interest can be accompanied by a distinct haptic effect, or overlapping haptic effects, conveying to the user customized information such as "who is calling," "what is happening," and so on. The user can also be allowed to update the haptic lookup table, e.g., to include new events, and/or to modify the mappings between the existing events of interest and the corresponding haptic effects.

[0032] Moreover, a specific haptic effect can be assigned to any incoming signal event whose source is unknown, so as to alert the user that the incoming message is from an unidentifiable sender.

[0033] As used herein, the term "handheld communication device" includes, without limitation, a mobile phone such as a cellular phone or a satellite phone, a personal digital assistant (PDA), a cordless telephone, a pager, a two-way radio, a handheld or portable computer, a game console controller, a personal gaming device, an MP3 player, or other personal electronic devices known in the art that are equipped with communication or networking capabilities.

[0034] In one embodiment, the aforementioned haptic effects can be used as haptic ringers (e.g., counterparts to auditory ring tones) that are customized or personalized to convey information to the user about various events of interest. By way of example, a haptic ringer associated with a call from a loved one (e.g., the user's spouse) may comprise low-amplitude and high-frequency vibrations that impart gentle sensations to the user. In contrast, a haptic ringer associated with an emergency event (such as a 911 call) may comprise jolt-like pulses that impart pounding sensations to the user.

[0035] In contrast with conventional auditory ring tones, the aforementioned haptic effects (e.g., haptic ringers) are more desirable in an environment where extraneous auditory signals are prohibited (e.g., during a meeting or a concert), and/or where it is difficult to distinguish auditory signals (e.g., in a loud environment such as an airport). The haptic ringers are also more suitable in distracting situations such as driving, so that the user of a handheld communication device can keep eyes on the road without having to look at the device. Moreover, such haptic ringers convey customized information to the user, so that the user is aware of "who is calling," "what is happening," and so on, as the following examples further illustrate.

[0036] A handheld communication device such as a mobile phone may be configured to allow a user to include haptic information or a haptic code in an outgoing communication signal, e.g., carrying a voice call, an e-mail, or a message. The encoding of a communication signal with haptic information may be based on an established scheme or protocol, and/or on a per-system basis. The haptic code is configured to cause a haptic effect to be output when the communication signal is delivered to another handheld communication device. In one embodiment, businesses and organizations may each be associated with a distinct haptic logo (e.g., a particular vibration pattern) and include their haptic logos in various messages sent to the handheld communication devices of their customers. Such haptic logos can serve as counterparts to conventional logos known in the art, for example. Various status events mentioned above may also be transmitted in this manner. By way of example, a merchant may include its haptic logo in various advertisement events and business-transaction events to be transmitted to the handheld communication devices of its customers. Stock brokers (or brokerage firms), TV or radio stations, and marketing/advertising agencies may likewise include their haptic logos in various stock-trading events, weather-forecast events, sports events, entertainment events, and one-to-one marketing events to be transmitted to the handheld users.

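As one way to visualize embedding a haptic logo in an outgoing communication signal, the following Python sketch assumes a hypothetical JSON-based, per-system scheme (the field names and logo code are invented for illustration):

```python
import json

def attach_haptic_logo(message: dict, haptic_code: str) -> str:
    """Return the outgoing signal as JSON with the haptic code embedded."""
    payload = dict(message)
    payload["haptic_code"] = haptic_code   # e.g. the sender's registered haptic logo
    return json.dumps(payload)

outgoing = attach_haptic_logo(
    {"to": "customer-123", "body": "Weekend sale starts Friday!"},
    haptic_code="LOGO-ACME-001",
)
print(outgoing)
```
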
[0037] FIG. 3 shows a flowchart 300 depicting a method of using haptic logos to relate information to users of handheld communication devices, according to an embodiment of the invention. A handheld communication device receives an input signal at step 310, the input signal being associated with a status event. The handheld communication device extracts a haptic code from the input signal at step 320, where the haptic code is associated with a haptic logo. At step 330, the handheld communication device provides a haptic effect associated with the haptic logo. Step 330 may include providing a control signal to an actuator coupled to the handheld communication device, where the control signal is based at least in part on the haptic code and configured to cause the actuator to output the haptic effect.

[0038] In one embodiment, the extracted haptic code may be directly applied to the actuator for rendering the desired haptic effect. In another embodiment, the haptic code may be configured according to a predetermined scheme or protocol that includes, for example, a table of haptic codes (some of which may be associated with one or more haptic logos) versus control signals for rendering the corresponding haptic effects. In this way, a processor in the handheld communication device can look up the corresponding control signal from the table based on the extracted haptic code, and output the selected control signal to the actuator for rendering the desired haptic effect.

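A minimal Python sketch of the lookup described in paragraph [0038], assuming the same hypothetical JSON carrier and an invented code-to-waveform table:

```python
import json
from typing import Dict, List, Optional

# Hypothetical table of haptic codes versus control signals (waveforms).
CODE_TO_CONTROL_SIGNAL: Dict[str, List[float]] = {
    "LOGO-ACME-001": [0.5, 0.0, 0.5, 0.0, 0.9],
    "HUG":           [0.3, 0.6, 0.9, 0.6, 0.3],
}

def extract_haptic_code(raw_signal: str) -> Optional[str]:
    # Step 320: pull the haptic code out of the incoming signal.
    return json.loads(raw_signal).get("haptic_code")

def render_haptic_logo(raw_signal: str, drive_actuator) -> None:
    # Step 330: look up the control signal for the code and send it to the actuator.
    code = extract_haptic_code(raw_signal)
    if code in CODE_TO_CONTROL_SIGNAL:
        drive_actuator(CODE_TO_CONTROL_SIGNAL[code])

render_haptic_logo('{"haptic_code": "LOGO-ACME-001"}', drive_actuator=print)
```
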
[0039] In the embodiments of FIG. 2 or 3, the handheld communication device (or the haptic code) may be programmed such that the haptic effect is output immediately, or at a prescribed time after receiving the input signal, as desired in applications. The haptic effects can also be triggered by, or synchronized with, other occurrences.

[0040] A handheld communication device may be further configured such that some of its user-interface members (such as those described above) are each associated with a haptic code, e.g., according to a predetermined scheme or protocol. In one embodiment, some of these haptic codes may be associated with haptic effects that emulate expressions or behaviors, such as "laugh," "giggle," "hug," "high-five," "heartbeat," "pet purring," etc. This allows haptic effects to be transmitted and experienced, e.g., in an interactive conversation or a chat session, by pressing or manipulating such members.

[0041] By way of example, suppose that user A (termed "Alice" herein) is engaged in a chat session with user B (termed "Bob" herein) via their respective mobile phones. In one embodiment, when Bob tells Alice a joke, Alice can respond by sending a "laugh" sensation to Bob, e.g., by pressing a key on her mobile phone that is assigned with a haptic code corresponding to a laugh sensation. This causes a signal to be transmitted from Alice's phone to Bob's phone, and a corresponding haptic effect to be output to Bob's phone (and thereby experienced by Bob). In alternative embodiments, Alice can include a haptic code in an outgoing message (which may also contain a video image such as a picture taken by her mobile phone, and/or a graphical feature such as an emoticon emulating a smiley face) to be transmitted to Bob, e.g., by pressing the corresponding user-interface member. The haptic code causes a haptic effect to be output when the message is delivered to a remote device such as Bob's mobile phone. In one embodiment, the haptic effect may be correlated or synchronized with the displaying of a video image contained in the message. In another embodiment, the generation of the haptic effect based on the haptic code may be carried out in a manner similar to that described above with respect to the embodiment of FIG. 3.

[0042] FIG. 4 depicts a flowchart 400 illustrating a method of haptically encoding communication signals, according to an embodiment of the invention. At step 410, an input signal associated with an actuation of a user-interface member is received. By way of example, the input signal may be associated with Alice's pressing or manipulating a particular user-interface member. At step 420, a haptic code associated with the actuation is determined. At step 430, the haptic code is included in an output signal, and the output signal is sent to a remote handheld communication device. As described above, the output signal may also include a message, a video image, and/or a graphical feature.

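For illustration, the encoding of flowchart 400 might look like the following Python sketch (the member-to-code assignments and field names are hypothetical):

```python
from typing import Dict, Optional

# Hypothetical assignment of haptic codes to user-interface members (step 420).
MEMBER_TO_HAPTIC_CODE: Dict[str, str] = {
    "key_1": "LAUGH",
    "key_2": "HUG",
    "key_5": "HIGH_FIVE",
}

def encode_outgoing_signal(pressed_member: str, text: str,
                           image: Optional[bytes] = None) -> dict:
    # Step 410: actuation of a user-interface member arrives as an input signal.
    # Steps 420/430: look up the haptic code and include it in the output signal
    # alongside the message (and, optionally, a video image or graphical feature).
    return {
        "haptic_code": MEMBER_TO_HAPTIC_CODE.get(pressed_member),
        "body": text,
        "image": image,
    }

signal = encode_outgoing_signal("key_1", "That was hilarious!")  # carries "LAUGH"
```
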
[0043] A handheld communication device may also be configured such that a haptic effect, along with a message, is output upon a contact with a user-interface member being made (e.g., by a user or an input device). FIG. 5 depicts a flowchart 500 illustrating a method of haptic messaging that can be associated with this situation, according to an embodiment of the invention. At step 510 of the flowchart 500, a handheld communication device receives an input signal. At step 520, the handheld communication device outputs a request for a contact with a user-interface member coupled to the handheld communication device. At step 530, the handheld communication device provides a control signal associated with the contact to an actuator coupled to the handheld communication device. The control signal is configured to cause the actuator to output a haptic effect associated with the input signal. Step 520 may include having a visual effect displayed, an auditory effect played, and/or a distinctive haptic ringer output, which requests a contact with the user-interface member being made.

[0044] In one embodiment, the input signal in FIG. 5 may include a haptic code, along with a message, a video image, and/or a graphical feature, etc. For example, the haptic code may be configured to cause a "hug" sensation to be output when the video image contained in the input signal is displayed. The input signal may also contain a provision or protocol that specifies that the incoming message, along with the corresponding haptic effect, is output upon a contact with a particular user-interface member (e.g., the #5 key) being made. Alternatively, the handheld communication device may determine the user-interface member to be contacted, before outputting the incoming message along with the corresponding haptic effect.

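One possible reading of this gating behavior, sketched in Python with hypothetical names (the callbacks stand in for whatever display and actuator interfaces the device provides):

```python
from typing import Callable, List

def deliver_on_contact(incoming: dict,
                       wait_for_contact: Callable[[str], bool],
                       show_message: Callable[[str], None],
                       drive_actuator: Callable[[List[float]], None]) -> None:
    # The incoming signal names the user-interface member that must be touched
    # (e.g. the "#5" key) before the message and its haptic effect are output.
    required_member = incoming.get("contact_member", "key_5")
    if wait_for_contact(required_member):           # blocks until the member is contacted
        show_message(incoming["body"])              # then output the message ...
        drive_actuator(incoming["control_signal"])  # ... together with its haptic effect

# Example wiring with trivial stand-ins:
deliver_on_contact(
    {"contact_member": "key_5", "body": "Thinking of you!",
     "control_signal": [0.3, 0.6, 0.9]},
    wait_for_contact=lambda member: True,
    show_message=print,
    drive_actuator=lambda w: print("haptic:", w),
)
```
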
[0045] In another embodiment, the input signal of FIG. 5 may be associated with a "virtual touch," e.g., to mimic a handshake, a "high-five," a pat on the back, a pulse or heartbeat sensation, a pet purring sensation, or other touch sensations associated with human (and/or human-animal) interactions. In one scenario, the input signal at step 510 may include a "virtual touch indicator," based on which the request for a contact with a particular user-interface member is made. The virtual touch indicator may be in the form of a haptic code, a message, or other informative means. The control signal at step 530 may be generated, e.g., based on the virtual touch indicator, a haptic code associated with the user-interface member at play, or other predetermined scheme. The input signal at step 510 may also include a virtual touch indicator along with a virtual touch signal for rendering the desired haptic effect. In this case, the control signal at step 530 may be based on the virtual touch signal.

[0046] Referring back to the chat session between Alice and Bob, by way of example, at the end of their chat session Alice may wish to send Bob a "high-five." She sends to Bob's mobile phone a signal including a virtual touch indicator, which in turn prompts a request that Bob be in contact with a user-interface member coupled to his phone, such as a direction pad (e.g., by putting his fingers on the individual keys of the direction pad), a key pad, a touch screen, a trackball, a joystick, or the like. The control signal for rendering a haptic effect that emulates a "high-five" may be based on the haptic code associated with the user-interface member, transmitted with the input signal from Alice, and/or other predetermined scheme.

[0047] Interactive virtual touch can also be engaged between users of handheld communication devices, where the manipulation of a user-interface member on one handheld communication device is transmitted, possibly in substantially real-time, to another handheld device and experienced by its user, and vice versa. FIG. 6 depicts a flowchart 600 illustrating a method of providing interactive virtual touch in one embodiment of the present invention. In the embodiment shown, a handheld communication device first receives an input signal including a virtual touch indicator at step 610. A distinctive haptic ringer may, for example, accompany the arrival of the virtual touch indicator, identifying the sender and the nature of the input signal. The handheld communication device may then perform any necessary initialization to enable the communication at step 620, which may also include requesting a contact with a particular user-interface member coupled to the handheld communication device at step 625. The handheld communication device subsequently receives a virtual touch signal in the communication associated with the desired haptic effect at step 630. The handheld communication device provides the haptic effect at step 640, e.g., by applying the virtual touch signal to an actuator coupled to the user-interface member.

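A loose Python sketch of flowchart 600 follows; the callables and field names are hypothetical stand-ins for the device's receive, prompt, and actuator interfaces:

```python
from typing import Callable, Dict, List

def interactive_virtual_touch(receive: Callable[[], Dict],
                              request_contact: Callable[[str], None],
                              drive_actuator: Callable[[List[float]], None]) -> None:
    indicator = receive()                                       # step 610: virtual touch indicator arrives
    request_contact(indicator.get("member", "direction_pad"))   # steps 620/625: initialize, request contact
    touch = receive()                                           # step 630: virtual touch signal
    drive_actuator(touch["waveform"])                           # step 640: render the haptic effect

# Demo with scripted inputs:
inbox = iter([{"member": "direction_pad"}, {"waveform": [0.8, 0.2, 0.8]}])
interactive_virtual_touch(
    receive=lambda: next(inbox),
    request_contact=lambda m: print(f"please touch the {m}"),
    drive_actuator=lambda w: print("haptic:", w),
)
```
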
[0048] In one embodiment, the virtual touch signal may be associated with the manipulation of a user-interface member on a remote handheld device and transmitted in substantially real-time. The user on the receiving end may respond by acting in a similar fashion, so as to emulate an interactive touch. Any scheme for delivering virtual touch to users of handheld communication devices may be used.

[0049] Haptic effects can also be used to enhance and complement the information content communicated between handheld communication devices. In one embodiment, a plurality of handheld communication users may be engaged in a chat session via their handheld communication devices. The users may each have a graphical representation or avatar displayed on other handheld communication devices. Such avatars can also be haptically enabled, for example, whereby their expressions and/or behaviors are accompanied and enhanced by corresponding haptic effects. FIG. 7 is a flowchart 700 depicting a method of carrying out a chat session using handheld communication devices, according to an embodiment of the invention. In the embodiment shown, a handheld communication device receives an input signal associated with a chat message at step 710. The handheld communication device displays an avatar associated with the chat message at step 720. The avatar may be shown on display 170 of FIG. 1, in one embodiment. At step 730, the handheld communication device provides a haptic effect associated with the chat message. Step 730 may include outputting a control signal to an actuator coupled to the handheld communication device, where the control signal is configured to cause the actuator to output the haptic effect. In one embodiment, the haptic effect may be correlated with an expression or behavior of the avatar, such as a laugh or giggle, a cry, a pet purring, or the like.

[0050] Handheld communication devices are increasingly equipped with navigation capability, for example, in communication with the Global Positioning System (GPS) or other navigation systems. Haptic effects can also be used to convey navigation information, such as positional and/or directional information, to handheld users. By way of example, FIG. 8 shows a flowchart 800 depicting a method of haptic navigation, according to an embodiment of the present invention. The flowchart 800 discloses: receiving an input signal associated with a position of a handheld communication device at step 810; determining the position of the handheld communication device relative to a predetermined location at step 820; and providing a haptic effect associated with the determination at step 830. Step 830 may include outputting a control signal associated with the determination to an actuator coupled to the handheld communication device, the control signal being configured to cause the actuator to output the haptic effect. Further, the input signal at step 810 may be received from GPS, a digital compass, or other navigation systems known in the art.

[0051] In one embodiment, the haptic effect may be associated with a distance between the position of the handheld communication device and a predetermined location (termed "destination" herein). For example, the haptic effect may include a vibration having a magnitude and a frequency, where at least one of the magnitude and the frequency decreases as the distance from the destination diminishes. Additionally, the haptic effect may be conf

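The paragraph above relates vibration strength to distance from the destination. For illustration only, a minimal Python sketch of one such falloff (all constants and names are hypothetical, not from the specification):

```python
from typing import Tuple

def navigation_vibration(distance_m: float,
                         max_magnitude: float = 1.0,
                         max_frequency_hz: float = 250.0,
                         range_m: float = 500.0) -> Tuple[float, float]:
    # Scale factor is 1.0 far from the destination and approaches 0 as the
    # device closes in, so both magnitude and frequency diminish with distance.
    scale = min(distance_m, range_m) / range_m
    return max_magnitude * scale, max_frequency_hz * scale

print(navigation_vibration(450.0))   # strong, fast vibration while far away
print(navigation_vibration(50.0))    # weak, slow vibration near the destination
```
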
