UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

APPLE INC.,
Petitioner

v.

IMMERSION CORPORATION,
Patent Owner

U.S. Patent No. 8,659,571
Filing Date: February 21, 2013
Issue Date: February 25, 2014
Title: Interactivity Model for Shared Feedback on Mobile Devices

Inter Partes Review No.: (Unassigned)

PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 8,659,571
UNDER 35 U.S.C. §§ 311-319 AND 37 C.F.R. §§ 42.1-100, ET SEQ.

`

TABLE OF CONTENTS

I.    COMPLIANCE WITH FORMAL REQUIREMENTS ............................................... 1
      A.  Mandatory Notices Under 37 C.F.R. §§ 42.8(b)(1)-(4) .............................. 1
          1.   Real Party-In-Interest .......................................................................... 1
          2.   Related Matters .................................................................................... 1
          3.   Lead and Backup Counsel ................................................................... 1
          4.   Service Information ............................................................................. 2
      B.  Proof of Service on the Patent Owner ......................................................... 2
      C.  Power of Attorney ........................................................................................ 2
      D.  Standing ....................................................................................................... 2
      E.  Fees .............................................................................................................. 2
II.   STATEMENT OF PRECISE RELIEF REQUESTED ............................................. 2
III.  FULL STATEMENT OF REASONS FOR REQUESTED RELIEF ....................... 3
      A.  Summary of the ’571 Patent ........................................................................ 3
      B.  Person of Ordinary Skill in the Art .............................................................. 4
      C.  Claim Construction ...................................................................................... 4
          1.   “gesture signal” .................................................................................... 5
          2.   “dynamic interaction parameter” ......................................................... 5
          3.   “vector signal” ...................................................................................... 5
          4.   “on screen signal” ................................................................................ 5
          5.   “generating a dynamic interaction parameter using… a physical
               model” .................................................................................................. 5
          6.   “generating a dynamic interaction parameter using… an
               animation” ............................................................................................ 6
      D.  Ground 1: Claims 1-4, 7, 23-26 and 29 are Obvious Under 35 U.S.C.
          § 103(a) (pre-AIA) in View of Poupyrev ..................................................... 6
          1.   Limitation 1.pre: “A method of producing a haptic effect
               comprising:” ......................................................................................... 9
          2.   Claim 2: “The method of claim 1 wherein the first or second
               gesture signal comprises a vector signal.” ......................................... 23
          3.   Claim 3: “The method of claim 1 wherein the first or second
               gesture signal comprises an on-screen signal.” ................................. 25
          4.   Claim 4: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter from a difference between the first gesture signal and
               the second gesture signal.” ................................................................ 25
          5.   Claim 7: “The method of claim 1 further comprising: receiving a
               first device sensor signal; receiving a second device sensor
               signal; and wherein generating a dynamic interaction parameter
               comprises generating a dynamic interaction parameter using the
               first gesture signal and the second gesture signal and the first
               device sensor signal and the second device sensor signal.” ............. 26
          6.   Claim 23.pre: “A non-transitory computer readable medium
               having instructions stored thereon that, when executed by a
               processor, causes the processor to produce a haptic effect, the
               instructions comprising:” .................................................................. 30
          7.   Claim 23.a: “receiving a first gesture signal;” ................................... 31
          8.   Claim 23.b: “receiving a second gesture signal;” .............................. 31
          9.   Claim 23.c: “generating a dynamic interaction parameter using the
               first gesture signal and the second gesture signal; and” ................... 31
          10.  Claim 23.d: “applying a drive signal to a haptic output device
               according to the dynamic interaction parameter.” ............................ 31
      E.  Ground 2: Claims 5 and 27 Are Obvious Under 35 U.S.C. § 103(a)
          (pre-AIA) in View of Poupyrev and Primer .............................................. 31
          1.   Claim 5: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter using the first gesture signal and the second gesture
               signal and a physical model.” ........................................................... 31
          2.   Claim 27 ............................................................................................ 34
      F.  Ground 3: Claims 6 and 28 Are Obvious Under 35 U.S.C. § 103(a)
          (pre-AIA) in View of Poupyrev and Tecot ................................................ 34
          1.   Claim 6: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter using the first gesture signal and the second gesture
               signal and an animation.” .................................................................. 34
          2.   Claim 28 ............................................................................................ 39
      G.  Ground 4: Claims 1, 2, 4-6, 23, 24, and 26-29 are Obvious Under 35
          U.S.C. § 103(a) (pre-AIA) in Light of Rosenberg ’373 ............................. 39
          1.   Limitation 1.pre: “A method of producing a haptic effect
               comprising:” ....................................................................................... 42
          2.   Claim 2: “The method of claim 1 wherein the first or second
               gesture signal comprises a vector signal.” ......................................... 55
          3.   Claim 4: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter from a difference between the first gesture signal and
               the second gesture signal.” ................................................................ 57
          4.   Claim 5: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter using the first gesture signal and the second gesture
               signal and a physical model.” ........................................................... 58
          5.   Claim 6: “The method of claim 1 wherein generating a dynamic
               interaction parameter comprises generating a dynamic interaction
               parameter using the first gesture signal and the second gesture
               signal and an animation.” .................................................................. 59
          6.   Limitation 7.a: “The method of claim 1 further comprising:
               receiving a first device sensor signal;” .............................................. 61
      H.  Ground 5: Claims 3 and 25 Are Obvious Under 35 U.S.C. § 103(a)
          (pre-AIA) in Light of Rosenberg ’373 and Rosenberg ’846 ...................... 67
          1.   Claim 3: “The method of claim 1 wherein the first or second
               gesture signal comprises an on-screen signal.” ................................. 68
          2.   Claim 25 ............................................................................................ 70
      I.  This Petition Is Proper Under 35 U.S.C. § 325(d) ..................................... 70
IV.   CONCLUSION ......................................................................................................... 70

`

EXHIBIT LIST

Exhibit No.   Description

1001    U.S. Patent No. 8,659,571.
1002    Declaration of expert Dr. V. Michael Bove Jr. (“Bove Decl.”).
1003    File history of U.S. Patent No. 8,659,571.
1004    U.S. Patent No. 5,734,373 to Rosenberg et al. (“Rosenberg ’373”).
1005    Reserved.
1006    U.S. Patent No. 6,429,846 to Rosenberg et al. (“Rosenberg ’846”).
1007    File history of U.S. Patent App. No. 13/472,698 (the “’698 application”).
1008    Excerpts from Barron’s Dictionary of Mathematics Terms, 3rd ed. (2009).
1009    Excerpts from The American Heritage Dictionary of the English Language, 5th ed. (2011).
1010    Reserved.
1011    Reserved.
1012    Reserved.
1013    U.S. Patent No. 7,952,566 to Poupyrev et al. (“Poupyrev”).
1014    U.S. Patent No. 6,281,651 to Haanpaa et al.
1015    Canadian Pat. App. 2,059,893 to Tecot (“Tecot”).
1016    Excerpt from Canadian Patent Office Record (Vol. 127, No. 18, May 1999).
1017    A FORCE FEEDBACK PROGRAMMING PRIMER, Louis Rosenberg (1997).
1018    “Synaptics TouchPad Interfacing Guide” (Document No. 510-000080-A; Rev. 2.5) (“Synaptics”).
1019    Internet Archive Affidavit for Synaptics Web Page, http://www.synaptics.com/support/dev_support.cfm.
`
`

`

`
`
`Apple Inc. (“Apple” or “Petitioner”) hereby petitions for inter partes review
`
`of U.S. Patent No. 8,659,571 (Ex. 1001, the “’571 patent”).
`
`I.
`
`COMPLIANCE WITH FORMAL REQUIREMENTS
`A. Mandatory Notices Under 37 C.F.R. §§ 42.8(b)(1)-(4)
1.    Real Party-In-Interest
`Apple is the real party-in-interest.
`
2.    Related Matters
`The ’571 patent is subject to the following actions: 1) Certain Mobile
`
`Electronic Devices Incorporating Haptics (Including Smartphones and
`
`Smartwatches) and Components Thereof, U.S. International Trade Commission
`
`Investigation No. 337-TA-990; and 2) Immersion Corporation v. Apple Inc., et al.,
`
`Case No. 1:16-cv-00077 (D. Del.). The ’571 patent is also the subject of IPR2016-
`
`01372, in which Apple is the petitioner.
`
3.    Lead and Backup Counsel
`Lead counsel is James M. Heintz, Reg. No. 41,828, of DLA Piper LLP (US),
`
`11911 Freedom Drive, Suite 300; Reston, VA 20190; Jim.heintz@dlapiper.com,
`
`703-773-4148 (phone), 703-773-5200 (fax). Backup counsel is Robert Buergi,
`
`Reg. No. 58,125, of DLA Piper LLP (US); 2000 University Ave; East Palo Alto,
`
`CA 94330; robert.buergi@dlapiper.com, 650-833-2407 (phone), 650-687-1144
`
`(fax).
`
`
`1
`
`

`

`
`
4.    Service Information
`Please address all correspondence to the lead and back-up counsel as shown
`
`above. Apple consents to electronic service to lead and back-up counsel and to
`
`Apple-Immersion-IPRs@dlapiper.com.
`
B.    Proof of Service on the Patent Owner
`As identified in the attached Certificate of Service, a copy of this Petition in
`
its entirety is being served on the Patent Owner’s attorney of record at the address
`
`listed in the USPTO’s records by overnight courier pursuant to 37 C.F.R. § 42.6.
`
C.    Power of Attorney
`Powers of attorney are being filed with designation of counsel in accordance
`
with 37 C.F.R. § 42.10(b).
`
D.    Standing
In accordance with 37 C.F.R. § 42.104(a), Petitioner certifies that the ’571
`
`patent is available for inter partes review and that Petitioner is not barred or
`
`estopped from requesting an inter partes review challenging the patent claims on
`
`the grounds identified in this Petition.
`
E.    Fees
`The undersigned authorizes the Director to charge the fee specified by 37
`
`C.F.R. § 42.15(a) and any additional fees that might be due in connection with this
`
`Petition to Deposit Account No. 50-1442.
`
`
`2
`
`

`

II.   STATEMENT OF PRECISE RELIEF REQUESTED

In accordance with 35 U.S.C. § 311, Petitioner requests cancelation of

claims 1-7 and 23-29 of the ’571 patent in view of the following grounds:
`
`A. Claims 1-4, 7, 23-26 and 29 are obvious under 35 U.S.C. § 103(a)
`
`(pre-AIA) in light of U.S. Patent No. 7,952,566 to Poupyrev et al. (“Poupyrev”).
`
`B.
`
`Claims 5 and 27 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in
`
`light of Poupyrev and Rosenberg, A Force Feedback Programming Primer (1997)
`
`(“Primer”).
`
`C.
`
`Claims 6 and 28 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in
`
`light of Poupyrev and Canadian Pat. App. 2,059,893 to Tecot (“Tecot”).
`
`D. Claims 1, 2, 4-7, 23, 24, and 26-29 are obvious under 35 U.S.C. §
`
`103(a) (pre-AIA) in light of U.S. Patent No. 5,734,373 to Rosenberg et al.
`
`(“Rosenberg ’373”).
`
`E.
`
`Claims 3 and 25 are obvious under 35 U.S.C. § 103(a) (pre-AIA) in
`
`light of Rosenberg ’373 and U.S. Patent No. 6,429,846 to Rosenberg et al.
`
`(“Rosenberg ’846”).
`
`III. FULL STATEMENT OF REASONS FOR REQUESTED RELIEF
A.    Summary of the ’571 Patent
`The ’571 patent is titled “Interactivity Model For Shared Feedback On
`
`Mobile Devices.” Ex. 1001, cover. The ’571 patent states that “[t]raditional
`
`architectures that provide haptic feedback only with triggered effects are
`3
`
`
`

`

`
`
`available,” and they “must be carefully designed to make sure the timing of the
`
`haptic feedback is correlated to user initiated gestures or system animations.” Id.
`
`at 1:49-52. “However, because these user gestures and system animations have
`
`variable timing, the correlation to haptic feedback may be static and inconsistent
`
`and therefore less compelling to the user.” Id. at 1:53-56. “Further, device sensor
`
information is typically not used in combination with gestures to produce haptic
`
`feedback.” Id. at 1:56-57. Therefore, “there is a need for an improved system of
`
`providing a dynamic haptic effect that includes multiple gesture signals and device
`
`sensor signals.” Id. at 1:58-60. To solve these problems, the ’571 patent discloses
`
`providing “dynamic” haptic effects based upon gesture signals and/or device
`
`sensor signals. Id. at 1:66-2:5. A “dynamic haptic effect refers to a haptic effect
`
`that evolves over time as it responds to one or more input parameters.” Id. at 2:65-
`
`67.
`
B.    Person of Ordinary Skill in the Art
`A person of ordinary skill in the art at the time of the alleged invention of
`
`the ’571 patent (a “POSITA”) is evidenced by the prior art of record and the type
`
of problems and solutions described in the ’571 patent, and includes experience in
`
haptic response technology in multi-touch or multi-gesture systems. (IPR2016-
`
`01372, Paper 7, at 24-25); see also Ex. 1002 ¶60.
`
`C. Claim Construction
`
`
`4
`
`

`

`
`
`In accordance with 37 C.F.R. § 42.104(b)(3), Petitioner provides the
`
`following statement regarding construction of the ’571 patent claims.
`
1.    “gesture signal”
`In IPR2016-01372, the Board construed gesture signal to mean “a signal
`
`indicating a movement of the body that conveys meaning or user intent.”
`
`IPR2016-01372, Paper 7 at 12. Petitioner will apply that construction.
`
2.    “dynamic interaction parameter”
`In IPR2016-01372, the Board construed this term to mean “a parameter that
`
`changes over time or reacts in real time based on a user’s interaction with a
`
`device.” Petitioner will apply that construction.
`
3.    “vector signal”
`Immersion agrees with Apple that this term (claims 2, 24) should be
`
construed as “a signal that includes both a magnitude and direction.” IPR2016-01372,
`
`Paper 6, at 15.
`
4.    “on screen signal”
`Immersion agrees with Apple that this term (claims 3, 25) should be
`
`construed as “signal generated based on interactions with a touch screen.” Id.
`
`5.
`
`“generating a dynamic interaction parameter using… a
`physical model”
`
`The ’571 patent describes a “physical model” as a “mathematical model
`
`related to a real-world physical effect such as gravity, acceleration, friction or
`
`
`5
`
`

`

`
`
`inertia.” Ex. 1001 at 12:38-44. Thus, this limitation should encompass generating
`
a dynamic interaction parameter based on such a mathematical model.
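
For illustration only, such a construction can be pictured with the following minimal sketch (a hypothetical example, not drawn from the ’571 patent or the record; all names and constants are illustrative), in which a dynamic interaction parameter is generated from gesture input using a mathematical model of real-world effects such as friction and inertia:

    # Illustrative sketch only: a "physical model" in the sense quoted above
    # (Ex. 1001 at 12:38-44) -- a mathematical model of real-world effects such
    # as friction and inertia -- used to generate a dynamic interaction
    # parameter from gesture input. All names and constants are hypothetical.

    def dynamic_parameter_from_physical_model(finger_velocity, prior_velocity,
                                              dt=0.01, mass=0.05, friction=0.3):
        """Return a haptic magnitude from an inertia (F = m*a) plus friction model."""
        acceleration = (finger_velocity - prior_velocity) / dt
        inertial_force = mass * acceleration
        friction_force = friction * finger_velocity
        return abs(inertial_force + friction_force)

    # Example: a quick flick (large change in velocity) yields a larger parameter.
    print(dynamic_parameter_from_physical_model(finger_velocity=0.4, prior_velocity=0.1))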
`
`6.
`
`“generating a dynamic interaction parameter using… an
`animation”
`
`For the purposes of this Petition, Apple will apply Immersion’s construction
`
`of “generating a dynamic interaction parameter by incorporating information from
`
`an animation or coordinating the dynamic interaction parameter to an animation.”
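
To make that construction concrete, the following brief sketch (hypothetical, not taken from the ’571 patent or Immersion’s papers; all names are illustrative) coordinates a dynamic interaction parameter to an animation by keying it to the animation’s progress:

    # Illustrative sketch: coordinating a dynamic interaction parameter to an
    # animation by deriving the parameter from the animation's current progress.
    # Hypothetical names and values only.

    def parameter_from_animation(frame_index, total_frames, peak_amplitude=1.0):
        """Haptic amplitude rises and falls with the animation (triangular envelope)."""
        progress = frame_index / max(total_frames - 1, 1)   # 0.0 .. 1.0
        envelope = 1.0 - abs(2.0 * progress - 1.0)          # 0 -> 1 -> 0
        return peak_amplitude * envelope

    for frame in range(5):
        print(frame, round(parameter_from_animation(frame, total_frames=5), 2))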
`
`D. Ground 1: Claims 1-4, 7, 23-26 and 29 are Obvious
`Under 35 U.S.C. § 103(a) (pre-AIA) in View of Poupyrev.
`
`Claims 1, 3-4, 7, 23, 25-26 and 29 are rendered obvious by Poupyrev. The
`
`application for Poupyrev was filed on 7/31/07, more than five years before the
`
`earliest possible priority date of the ’571 patent (8/23/12), and is therefore prior art
`
to the ’571 patent under 35 U.S.C. § 102(e) (pre-AIA). Ex. 1013, cover.
`
`Poupyrev discloses various methods and systems for providing tactile
`
`feedback on a device with a pressure sensitive touchscreen.1 Poupyrev discloses
`
`that in “the embodiments of the present invention, the form of the haptic feedback
`
`is determined depending on the touch position, the pressure applied by the user and
`
`the current logical state of the graphical user interface object. Accordingly, various
`
`forms of the haptic feedback may be provided for different logical states of the
`
`graphical user interface object, making it easy for the user to know the current state
`
`
`1 Poupyrev, Abstract.
`
`
`6
`
`

`

`
`
`of the graphical user interface object.”2
`
`An exemplary system for implementing the methods disclosed by Poupyrev
`
`is illustrated in Figure 1 (reproduced below).
`
`Poupyrev discloses that the system includes a 2D position sensing unit
`
`which “starts tracking and keeps track of the user’s finger ... position on the
`
`
`
`
`2Poupyrev 3:23-30.
`
`
`7
`
`

`

`
`
`screen,”3 and a pressure sensing unit 105, which “monitors the pressure of the
`
`user’s finger ... applied to the screen.”4
`
`Poupyrev discloses a variety of methods for providing haptic feedback based
`
`on the position and/or pressure detected on the touch screen. In one embodiment,
`
`Poupyrev discloses providing dynamic haptic effects in which parameters of the
`
`haptic effect are determined as a function of the pressure applied by the user:
`
`A dynamic tactile wave shape where tactile feedback parameters (e.g.
`frequency, amplitude, intensity, etc.) are a function of the pressure
`applied by the user on the GUI object; The dependency can be
`
`(a) A step function, e.g. when the user 2 presses a button-type GUI
`object, such that tactile feedback changes in discreet steps; or
`
`(b) The continuous dependency between feedback and pressure
`applied, e.g. the stronger the user presses the button the higher the
`vibration frequency can be or the amplitude of the vibrations. In the
`simplest case, the intensity of tactile feedback increases as the user
`presses stronger.5
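
For illustration only, the two dependencies quoted above can be sketched as follows (a hypothetical example, not drawn from Poupyrev; the thresholds and values are illustrative): a step function that changes the feedback in discrete steps, and a continuous function in which amplitude and frequency grow with the applied pressure.

    # Illustrative sketch of the two pressure-to-feedback dependencies quoted
    # above. Hypothetical names and constants; not drawn from Poupyrev itself.

    def step_feedback(pressure, threshold=0.5):
        """(a) Step function: feedback changes in discrete steps at a press threshold."""
        if pressure >= threshold:
            return {"amplitude": 1.0, "frequency_hz": 250}
        return {"amplitude": 0.2, "frequency_hz": 100}

    def continuous_feedback(pressure, max_pressure=1.0):
        """(b) Continuous dependency: the harder the press, the stronger and faster the vibration."""
        scale = min(pressure / max_pressure, 1.0)
        return {"amplitude": scale, "frequency_hz": 100 + 200 * scale}

    for p in (0.2, 0.6, 0.9):
        print(p, step_feedback(p), continuous_feedback(p))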
`
`Poupyrev further discloses a signal generating unit 106 that “generates and
`
`provides a signal to the tactile feedback generating unit 102 for driving tactile
`
`
`3Id. 8:25-27.
`
`4Id. 8:32-36.
`
`5Id. 10:51-62; see also 5:8-10; Ex. 1002, ¶¶168-170.
`
`
`8
`
`

`

`
`
`feedback generating elements or the piezoelectric actuators.”6 These signals are
`
`provided to a tactile feedback generating unit 102, which “receives the input signal
`
`and converts the input signal into force patterns that are transmitted to the user 2
`
`via a mechanical assembly that combines the screen with the tactile feedback
`
`generating elements or the piezoelectric actuators,” such that “the user 2 can feel
`
`the force patterns when the user 2 is touching the screen.”7
`
`Poupyrev renders obvious each of asserted claims 1-4, 7, 23-26 and 29 for
`
`the reasons set forth below. Ex. 1002 ¶¶166-172.
`
`1.
`
`Limitation 1.pre: “A method of producing a haptic effect
`comprising:”
`
`The preamble of claim 1 is not a limitation of the claim. Nonetheless,
`
`Poupyrev discloses the preamble. For example, Poupyrev discloses that the
`
`“present invention relates to a method of a user interface utilizing a touch screen
`
`and tactile feedback, and an apparatus that employs such a user interface method.”8
`
`a.
`
`Limitation 1.a: “receiving a first gesture signal” and
`Limitation 1.b: “receiving a second gesture signal”
`
`Poupyrev discloses receiving at least two types of gesture signals: signals
`
`indicating a pressure associated with movements of a user’s body that convey
`
`
`6Id. 6:43-46.
`
7Poupyrev 6:55-60.
`
8Poupyrev 1:9-11; Ex. 1002 ¶¶173-176.
`
`
`9
`
`

`

`
`
`meaning or user intent, and signals indicating a position associated with
`
`movements of a user’s body that convey meaning or user intent. For example,
`
`Poupyrev discloses a method that includes “detecting a touch position on the touch
`
`screen, at which a user’s finger or a pointing device is touching; detecting pressure
`
`applied on the touch screen when the touch position is detected.”9 Both types of
`
`gesture signals are discussed below.
`
`Poupyrev discloses a position sensing unit 104 that tracks the (x,y) position
`
`of the user’s finger or pen-type device: “Upon detecting the user’s touch, the 2D
`
`position sensing unit 104 starts tracking and keeps track of the user’s finger or pen-
`
`type device position on the screen.”10 Poupyrev also discloses a pressure sensing
`
`unit 105, which “monitors the pressure of the user’s finger or pen-type device
`
`applied to the screen.”11
`
`Poupyrev further discloses that signals generated by the position sensing unit
`
`104 and the pressure sensing unit 105 are received by controller 20. Specifically,
`
`Poupyrev discloses that the “controller section 20 includes … a two dimensional
`
`(2D) position sensing unit controller 108, a pressure sensing unit controller 109 . . .
`
`
9Id. 3:9-15; Ex. 1002 ¶¶178-179.
`
`10Poupyrev 8:25-27.
`
`11Id. 8:32-36; Ex. 1002 ¶¶180-181.
`
`
`10
`
`

`

`
`
`a graphical user interface controller (GUI) 112 ” and various other components.12
`
`The “pressure sensing unit controller 109 determines the value of pressure applied
`
`when the user is touching on the screen.”13
`
`Poupyrev further discloses that the position sensing unit controller 108
`
“determines the position where the user is touching on the screen.” Ex. 1013 at 7:3-
`
`4.
`
`Poupyrev also discloses that the pressures from the pressure sensing unit
`
`controller 109 and the positions from the position sensing unit controller 108 are
`
`“communicated to the GUI controller 112.” Ex. 1013 at 7:4-5.
`
`The pressures output by the pressure sensing unit 105 and the pressure
`
`sensing unit controller 109, and the positions output by the position sensing unit
`
`104 and the position sensing unit controller 108, are all “signals” as that term is
`
`used in the ’571 patent. Poupyrev explicitly refers to them as such: “when the user
`
`presses the screen and at the same time tactile feedback is provided to the user, the
`
`pressure signal will have a component from the tactile feedback signal.” Ex. 1013
`
`at 7:6-8. Moreover, the ’571 patent uses the term “signal” broadly to refer to
`
`information or data, as well as a physical embodiment of such information or data
`
`in either analog and digital form, such as a voltage. This can be seen, for example,
`
`12Poupyrev 6:38-42; see also Fig. 1.
`
13Id. 7:1-5; Ex. 1002 ¶¶182-183.
`
`
`11
`
`

`

`
`
with respect to the ’571 patent’s references to “virtual sensor signal” in claim 11
`
`(Ex. 1001 at 16:48-50) and the discussion of step 1507 (“read the gesture or sensor
`
`signal from the data file on the second device”) in Ex. 1001 at 15:63-65. Ex. 1002
`
`¶¶185-186.
`
`To the extent that “signal” is construed to refer only to physical
`
`embodiments of information/data (e.g., an electrical sine wave signal), and to the
`
extent that Poupyrev does not expressly state that the pressures and positions in its
`
`system are all “signals” under such a construction, implementing Poupyrev’s
`
`system in a manner that utilizes such signals would have been obvious. Poupyrev
`
`discloses at 6:26-30 that the “[v]arious functions performed by sub-sections of the
`
`control section 20 and the application section 30 may be realized by . . . adding
`
`dedicated circuitry or hardware to the computer.” When utilizing such dedicated
`
`circuits, a POSITA would have understood and found obvious that electrical
`
`signals representing pressures and positions associated with gestures (which could
`
`be in either analog or digital form) would be communicated between the dedicated
`
`circuits because circuits communicate by exchanging signals. Ex. 1002 ¶ 187.
`
`It should also be understood that each instance of a pressure or a position
`
`(i.e., a pressure or a position at a particular point in time) exchanged between the
`
`units 105, 109, 104, 108 and 112 is a gesture signal. For example, each pressure at
`
`a particular point in time communicated between the pressure sensing unit
`
`
`12
`
`

`

`
`
`controller 109 and the GUI controller 112 (Ex. 1013 at 7:1-5) in response to a user
`
`movement that conveys meaning or user intent is a gesture signal. Thus, when a
`
`user presses over an area of the touchscreen corresponding to a GUI object at a
`
`first pressure at a first time, and then presses with a second, greater pressure at a
`
`second point in time, the pressure sensing unit controller 109 transmits first and
`
`second gesture signals representing the first and second pressures. Ex. 1002 ¶188.
`
`The pressure and position signals exchanged between units 105, 109, 104,
`
`108 and 112 are all gesture signals because they are associated with movements of
`
`a user’s body that convey meaning or user intent, and Poupyrev discloses
`
`interpreting these signals to discern the user’s meaning/intent.
`
`For example, Poupyrev discloses that, using the pressure and position data
`
`from the user’s touch, the “GUI controller 112 determines which GUI object the
`
`user is intending to interact with.” Ex. 1013 at 7:1-15 (emphasis added).
`
`Poupyrev further discloses that pressure sensing unit 105 “monitors the pressure of
`
`the user’s finger or pen-type device applied to the screen. The pressing event is
`
`recognized, for example, if the pressure more than a predetermined value is
`
`detected.”14 Such recognition is a determination of a user’s intent. Ex. 1002 ¶190.
`
`Poupyrev further discloses that the detected pressure can be used to implement a
`
`variety of functions, such as, for example, actuating a displayed button on the
`
`14Id. 8:32-36 (emphasis added).
`
`
`13
`
`

`

`
`
`graphic user interface (GUI): “In case (iii), the GUI object 310 is not actuated but
`
`the apparatus 1 is tracking the pressure value and attempt[s] to recognize an
`
`actuation event, i.e. some gesture that allows the user to specify that the GUI object
`
`310 should be actuated.”15
`
`A POSITA would understand that the pressing gestures described by
`
`Poupyrev are a composite of multiple simple gestures, including for example
`
`individual finger down gestures and finger motion gestures. For example,
`
`Poupyrev describes a “typical touch screen interaction” as being comprised of
`
`simple gestures, such as “touch-down,” “drag” and “lift off”:
`
`“Fig. 2 presents an example of a typical touch-screen interaction of related
`art. The interaction starts when the user 2 touches the screen (touch down
`event T1). The user 2 can then either drag a finger across the input space
`(drag or slide state T2) or hold it steady in one place (hold state T3). Further
`the user 2 can lift the finger off the screen, which can happen either from
`inside of the GUI object (lift off event T4) or from outside of the GUI object
`(lift off out event T5).”16
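
As a purely illustrative picture of how such an interaction decomposes into simple gesture events (this sketch is hypothetical and not taken from Poupyrev; the event names and threshold are illustrative), a sequence of touch samples can be labeled with event types of the kind quoted above:

    # Illustrative sketch: decomposing a touch interaction into simple gesture
    # events of the kind quoted above (touch down, drag, hold, lift off).
    # Hypothetical only.

    def classify_events(samples, move_threshold=5):
        """samples: list of (x, y) positions, or None when no finger is present."""
        events, prev = [], None
        for pos in samples:
            if prev is None and pos is not None:
                events.append("touch_down")          # T1
            elif prev is not None and pos is None:
                events.append("lift_off")            # T4/T5
            elif prev is not None and pos is not None:
                dx, dy = pos[0] - prev[0], pos[1] - prev[1]
                events.append("drag" if abs(dx) + abs(dy) > move_threshold else "hold")
            prev = pos
        return events

    print(classify_events([None, (10, 10), (12, 11), (30, 25), (30, 25), None]))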
`
`Poupyrev similarly discloses detecting multiple “pressure” gestures over
`
`time as the pressure applied by the user’s finger varies. For example, Poupyrev
`
`discloses that “tactile feedback may be provided when the user 2 changes pressure
`
`inside of the GUI object. In this variation, when the user 2 places the finger inside
`
`of the GUI object and presses on it, the tactile feedback is provided for each
`
`15Id. 10:3-6 (emphasis added).
`
`16Id. 7:47-54.
`
`
`14
`
`

`

`
`
`incremental change in pressure applied.”17 Thus, a POSITA would understand that
`
`the pressure gestures disclosed by Poupyrev comprise multiple simple gestures,
`
`recognized as, for example, touch down, move, or incremental change in pressure
`
`events.18 Accordingly, Poupyrev discloses that signals indicative of the location or
`
`pressure of the user’s finger are recognized as gestures, and such signals are
`
`therefore gesture signals under the Board’s construction.
`
`Additionally, it would have been obvious to a POSITA that each signal
`
`received from the touch screen could be recognized as a gesture signal. A POSITA
`
`would appreciate that the Poupyrev press gestures could be implemented using
`
`multiple simple gestures, such as finger down gestures when each finger touches
`
`the touch screen, finger move gestures when the position of the finger changes or
`
`press gestures when the incremental pressure applied to the touchscreen changes.
`
`Ex. 1002 ¶195.
`
`b.
`
`Limitation 1.c: “generating a dynamic interaction
`parameter using the first gesture signal and the
`second gesture signal; and”
`
`Poupyrev discloses that the “controller section determines a form of the
`
`haptic feedback to be generated depending on (i) the detected touch position, (ii)
`
`the detected pressure value and (iii) the determined current logical state of the
`
`17Id. 11:1-5.
`
18See, e.g., id. 3:35-56; 10:39-55; Ex. 1002 ¶¶189-194.
`
`
`15
`
`

`

`
`
`graphical user interface object.”19
`
`As discussed above, Poupyrev discloses various embodiments in which
`
`haptic effects are generated in response to these position and pressure inputs. For
`
`example, Poupyrev discloses one embodiment in which “a dynamic tactile wave
`
`shape where tactile feedback parameters (e.g. frequency, amplitude, intensity, etc.)
`
`are a function of the pressure applied by the user on the GUI object.”20 Poupyrev
`
`discloses that the dependency between the tactile feedback parameters and applied
`
`pressure can be:
`
`(a) A step function, e.g. when the user 2 presses a button-type GUI
`object, such that tactile feedback changes in discreet steps; or
`(b) The continuous dependency between feedback and pressure
`applied, e.g. the stronger the user presses the button the higher the
`vibration frequency can be or the amplitude of the vibrations. In the
`simplest case, the intensity of tactile feedback increases as the user
`presses stronger.”21
`
`In this embodiment, Poupyrev discloses a dynamic haptic effect (i.e.
`
`“dynamic tactile waveshape”) where the parameters that define the haptic effect
`
`(i.e. “tactile feedback parameters”) are a function of the pressure applied by the
`
19Poupyrev 2:10-14; Ex. 1002 ¶¶196-197.
`
`20Id. 10:51-53.
`
`21Id. 10:51-62.
`
`
`16
`
`

`

`
`
`user. Poupyrev further discloses a “continuous dependency” between the pressure
`
`applied to the touchscreen and the corresponding tactile feedback, by varying one
`
`or more parameters, such as the “vibration frequency” or the “amplitude of the
`
vibrations” that define the corresponding haptic effect.22 In this regard, Poupyrev
`
`notes that the “tactile feedback may also be generated in accordance with any other
`
`parameter or multiple parameters that define the tactile waveshape.”23 Inasmuch as
`
`the tactile feedback parameters are generated as a function of the pressure applied
`
`by the user over time, the tactile feedback parameters are generated based on the
`
`first and second gesture signals, i.e. signals indicative of the pressure applied to the
`
`device at two points in time. Ex. 1002 ¶¶198-199.
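
For illustration only, the relationship described above can be sketched as follows (a hypothetical example, not part of the record; the function and values are illustrative): two pressure readings taken at successive times serve as the first and second gesture signals, and a tactile feedback parameter is recomputed from them so that it changes as the user’s press evolves.

    # Illustrative sketch: a tactile feedback parameter generated from first and
    # second pressure gesture signals (pressure readings at two points in time),
    # so that the parameter changes as the press evolves. Hypothetical only.

    def update_feedback_amplitude(first_pressure, second_pressure, gain=0.8):
        """Continuous dependency: amplitude follows the two most recent pressure signals."""
        return gain * (0.5 * first_pressure + 0.5 * second_pressure)

    pressures = [0.1, 0.3, 0.6, 0.9]          # successive pressure gesture signals
    for first, second in zip(pressures, pressures[1:]):
        amplitude = update_feedback_amplitude(first, second)
        print(f"p1={first:.1f} p2={second:.1f} -> drive amplitude {amplitude:.2f}")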
`
`The tactile feedback parameters are “dynamic interaction parameters” (i.e.
`
`“parameter that changes over time or reacts in real time based on a user’s
`
`interaction with a device”), because they change over time based on the user’s
`
`interaction with the device (i.e. pressure applied to the device). In this regard,
`
`Poupyrev explains that there is a “continuous dependenc
